Technology and workflow
From the technical perspective, our goal is to use many modern computer graphics techniques:
* The platform specifications for the Imagine Cup 2005 Rendering Invitational aren't defined yet (30 Mar 2005), so we made our own 'reasonable' estimates and hope for the best.
Demo creation process
Our content creation process consists of the following steps:
This workflow is pretty similar to the way modern games or machinima movies are made. The steps don't strictly follow the listed order; most of them are performed in an iterative way (for example, a model may require some tweaking after the animation is made). Some steps may be performed concurrently by different people (e.g. texturing and animation are mostly independent once the models are made).
Our goal was a lot of detail on the architecture of the outer walls - the example on the left shows the detail in a 25 x 25 centimeter region. The main room is roughly 8 x 10 meters in size and 4 meters high, and at that level of detail we'd reach 20-30 million polygons for the outer walls alone! Therefore we decided to use fairly low-polygon models in the demo, and use normal maps to simulate surface details.
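The 20-30 million figure comes from simple back-of-the-envelope arithmetic; a sketch of it below, where the triangle density of the sample region is an assumed round number, not a measured one:

```python
# Back-of-the-envelope check of the polygon count for the outer walls.
room_perimeter_m = 2 * (8 + 10)     # main room is roughly 8 x 10 meters
wall_height_m = 4
wall_area_m2 = room_perimeter_m * wall_height_m       # 144 m^2 of wall

patch_side_m = 0.25                 # the 25 x 25 cm sample region
patches = wall_area_m2 / (patch_side_m ** 2)          # 2304 such patches

tris_per_patch = 10_000             # assumed density of the sample region
total_tris = patches * tris_per_patch                 # ~23 million triangles
```

With roughly ten thousand triangles per sample patch, the whole wall area lands squarely in the 20-30 million range - clearly too much for realtime rendering.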
A normal map is a regular texture, but instead of storing colors, it stores surface orientations. Each pixel in the texture is interpreted as a 3D vector that is used in lighting
calculations. Below is one piece of the outer walls: the low-polygon version that is used for rendering (240 triangles), the normal map,
the ambient occlusion map (see below) and finally how it looks with normal and ambient occlusion maps, rendered in realtime (without texture
and with texture). The high-polygon version (56,000 triangles) is not used for realtime rendering at all, only for normal map computation.
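The lighting calculation itself is simple; here is a minimal Python sketch. The [0,255] to [-1,1] channel encoding shown is the common convention, assumed here rather than taken from the demo's actual shaders:

```python
import math

def decode_normal(r, g, b):
    """Turn an 8-bit normal map texel (channels in 0..255) into a unit
    3D vector in [-1,1]^3 -- the usual encoding, assumed here."""
    n = (r / 255.0 * 2.0 - 1.0,
         g / 255.0 * 2.0 - 1.0,
         b / 255.0 * 2.0 - 1.0)
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)

def diffuse(normal, light_dir):
    """Standard Lambertian diffuse term: N.L clamped to zero."""
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
```

A "flat" texel like (128, 128, 255) decodes to a normal pointing almost straight out of the surface, so a light shining along that direction gives a diffuse term of nearly 1.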
Another goal was soft "natural" lighting on the scenery: for example, some areas of the surface aren't exposed to light and hence should be darker. Ambient occlusion gives us exactly this: for each point on the surface, we calculate how "open" it is (i.e. how much of its surroundings is not occluded by nearby surfaces). In the end, for each distinct object in the scene we have a special "ambient occlusion" texture that we use at rendering time in the lighting calculations.
The example on the left shows our main character without ambient occlusion (left half) and with ambient occlusion (right half) applied. On the right is an example piece from the outer walls, rendered with normal and ambient occlusion maps.
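The "openness" of a surface point can be estimated by sampling directions over the hemisphere above it and counting the fraction that escape the scene. A minimal Python sketch, with the scene query reduced to a caller-supplied predicate (a stand-in for the real ray caster):

```python
import math, random

def ambient_occlusion(occluded, samples=1000, seed=1):
    """Monte-Carlo estimate of how 'open' a surface point is: sample
    uniform directions on the hemisphere around the surface normal
    (taken as (0,0,1)) and count the fraction that are not blocked.
    'occluded(direction)' stands in for a real ray-scene query."""
    rng = random.Random(seed)
    open_count = 0
    for _ in range(samples):
        # uniform z gives uniform area on the sphere, hence a uniform
        # hemisphere direction
        z = rng.random()
        phi = rng.random() * 2.0 * math.pi
        r = math.sqrt(max(0.0, 1.0 - z * z))
        d = (r * math.cos(phi), r * math.sin(phi), z)
        if not occluded(d):
            open_count += 1
    return open_count / samples

# toy occluder: a wall that blocks every direction leaning towards +x,
# closing roughly half the hemisphere
ao = ambient_occlusion(lambda d: d[0] > 0.0)
```

For a point next to such a wall the estimate comes out near 0.5, i.e. the point is about half as "open" as a fully exposed one; baking this value per texel gives the ambient occlusion map.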
For character shadows, we developed a custom algorithm that can generate shadow penumbra regions very efficiently when projected shadows are cast on nearly planar surfaces. A paper describing our algorithm has already been accepted for the upcoming ShaderX4 book.
On the right is an example of our soft shadow algorithm (left half) versus the standard projected shadow algorithm (right half). The soft penumbra regions make the shadow look much more realistic.
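The penumbra algorithm itself is described in the ShaderX4 article, so it isn't reproduced here; but the "standard projected shadows" baseline in the comparison boils down to projecting each vertex onto the receiving plane from the light position. A sketch, assuming a point light and the floor plane y = 0:

```python
def project_onto_floor(p, light):
    """Project point p onto the plane y = 0 along the ray from a point
    light through p -- the hard projected shadow we compare against.
    Assumes the light is above the point (light_y > p_y > 0)."""
    lx, ly, lz = light
    px, py, pz = p
    t = ly / (ly - py)   # ray parameter where y reaches 0
    return (lx + t * (px - lx), 0.0, lz + t * (pz - lz))
```

Projecting every vertex of the caster this way and drawing the result in a dark color yields the familiar hard-edged shadow; the penumbra algorithm additionally softens its boundary.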
We have many complex models in the outer walls, and at first we thought it would be a problem to nicely texture all of them. The main difficulty is that most of the models are highly curved (i.e. not flat), and laying a flat texture onto a 3D model in such a way that no "seams" are visible is very hard.
Most of the time the models had to be textured with stone or stone-like textures. We thought about writing procedural "stone shaders", but rejected that idea - writing a program that generates realistic and interesting 3D stone textures is very hard.
In one eureka moment we found a nice and easy way to seamlessly apply regular stone textures to the 3D models. We had already authored unique surface-to-plane UV mappings (for the normal and ambient occlusion maps) for all our models, so our idea was this: project tileable stone textures onto the model's surface from several angles, and blend the results weighted by the orientation of the surface (its normal).
We wrote a tool to do that (screenshot on the right), and the results turned out to be very good - given an example texture, we only need to set a few parameters and the tool generates the texture for any model in a few seconds. The tool can also generate "gloss" textures (where brightness indicates how "shiny" the surface is at that point) in a similar way.
The generated textures can be tweaked in a bitmap editor like regular textures if desired, but having the tool generate a nice and seamless "base version" saved a lot of time.
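The projection-and-blend idea can be sketched with three axis-aligned projections weighted by the surface normal; the exact weighting our tool uses is an assumption here (normalized absolute normal components are a common choice):

```python
def triplanar_weights(normal):
    """Blend weights for projecting a tileable texture along each axis,
    weighted by how much the surface faces that axis. One plausible
    weighting -- the tool's actual formula may differ."""
    ax = [abs(c) for c in normal]
    s = sum(ax)
    return tuple(a / s for a in ax)

def triplanar_sample(sample_x, sample_y, sample_z, pos, normal):
    """Sample the tileable texture from three orthogonal projections
    and blend by surface orientation."""
    wx, wy, wz = triplanar_weights(normal)
    x, y, z = pos
    return (wx * sample_x(y, z) +   # projection along X uses (y, z)
            wy * sample_y(x, z) +   # projection along Y uses (x, z)
            wz * sample_z(x, y))    # projection along Z uses (x, y)
```

A surface facing straight along one axis gets that projection alone (no stretching), while curved regions smoothly mix the projections, which is what hides the seams. Baking the blended result into the existing unique UV mapping turns it into an ordinary texture.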
Texturing of each object was done in the following steps:
Wall fracture and physics
According to the scenario, the walls fracture when hit by "the powers" of the character, and the pieces realistically fall to the floor; some bounce a bit and come to rest. Additionally, in the interactive part of the demo the user can hit any wall anywhere, and it realistically fractures and the pieces fall out. This rules out any precalculated animation for the fracture and physics - both must actually be calculated and performed in realtime.
For fracture calculation we use algorithmically generated "fracture patterns", organized into a hierarchy of patterns that supports both small and large fracture details. The hierarchical organization also helps with simulation speed, as we can efficiently fracture out only some region of the whole wall. Because our walls are flat, the fracture is essentially a two-dimensional simulation and can be calculated very fast.
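Our actual pattern generator and its hierarchy aren't detailed here, but the flavor of a 2D fracture pattern can be shown with a simple Voronoi-style partition around the impact point - purely an illustrative stand-in:

```python
import random

def fracture_pieces(impact, radius, cells=8, grid=16, seed=2):
    """Partition the square region around a 2D impact point into pieces
    by nearest randomly scattered 'fracture seed' (a Voronoi-style
    pattern). Illustrative sketch only -- not the demo's generator.
    Returns a map from grid cell to piece index."""
    rng = random.Random(seed)
    ix, iy = impact
    seeds = [(ix + rng.uniform(-radius, radius),
              iy + rng.uniform(-radius, radius)) for _ in range(cells)]
    piece_of = {}
    step = 2 * radius / grid
    for gx in range(grid):
        for gy in range(grid):
            # center of this grid cell in wall coordinates
            p = (ix - radius + (gx + 0.5) * step,
                 iy - radius + (gy + 0.5) * step)
            piece_of[(gx, gy)] = min(
                range(cells),
                key=lambda i: (seeds[i][0] - p[0]) ** 2
                            + (seeds[i][1] - p[1]) ** 2)
    return piece_of
```

Because the pattern is 2D, partitioning even a fine grid is cheap; each resulting piece is then extruded to the wall's thickness and handed to the physics simulation.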
Once the fractured out pieces of the wall are determined, we apply initial forces to them according to the blast curve model that's widely used in structural engineering research.
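The blast curve model itself isn't spelled out here, so as a stand-in the sketch below uses a simple impulse directed away from the blast point with inverse-square distance falloff - only the overall shape of the idea, not the engineering model we actually use:

```python
import math

def blast_impulse(piece_center, blast_center, strength):
    """Initial impulse on a fractured-out piece: directed away from the
    blast point, falling off with squared distance. A simple stand-in
    for the blast curve model; assumes the piece is not exactly at the
    blast center."""
    d = [p - b for p, b in zip(piece_center, blast_center)]
    dist = math.sqrt(sum(c * c for c in d))
    magnitude = strength / max(dist * dist, 1e-6)
    return tuple(c / dist * magnitude for c in d)
```

Pieces near the impact are thrown out hard, while distant pieces barely move before gravity takes over, which is what makes the fracture read as a blast rather than a uniform push.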
The motion of the falling pieces is simulated using standard rigid body dynamics methods. We don't try to reinvent the wheel here and use one of the freely available rigid body dynamics and collision detection libraries. Currently under evaluation are ODE (completely free, source code available, and we have some experience with it), NovodeX (very fast, free for non-commercial use) and Newton Game Dynamics (free, but doesn't come with source code). Note: we ended up using a slightly modified ODE physics library.
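In the demo all of this is handled by the (slightly modified) ODE library; the toy sketch below only shows the kind of simulation such a library performs - semi-implicit Euler integration plus a crude floor bounce, with rotation and piece-vs-piece collision omitted:

```python
def step(pos, vel, dt, gravity=-9.81, restitution=0.3):
    """One integration step for a falling piece: apply gravity to the
    velocity, advance the position, and bounce off the floor at y = 0
    with some energy loss. Illustrative only -- ODE does the real work."""
    vy = vel[1] + gravity * dt
    vel = (vel[0], vy, vel[2])
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    if pos[1] < 0.0:                      # hit the floor: bounce and damp
        pos = (pos[0], 0.0, pos[2])
        vel = (vel[0], -vel[1] * restitution, vel[2])
    return pos, vel
```

Running this on a piece dropped from one meter shows the behavior described above: it falls, bounces a few ever-smaller times, and comes to rest on the floor.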