Effects, Meta-Effects and Renderables

Meta-Effects and Renderables

As shown earlier, the rendering of the scene is cut into steps, such as pre-processing, reflection and refraction, opaque, transparency, post-processing, and a final step in which the GUI is drawn. All Effects registered to a step are rendered in sequence before the next step is processed, which allows the user to make sure some things are rendered before others: a shadow map, whose Effect is registered to the pre-process RenderingStep, is rendered to before a Mesh tries to sample it during the opaque step. Effects may also be sorted by their identifier, so that the skybox Effect, for example, runs last in the opaque step.
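To make the mechanism concrete, here is a minimal C++ sketch of what the step registration and the per-step rendering loop could look like. All names (RenderingStep, Renderer, registerEffect) are illustrative assumptions, not FlExtEngine's actual API.

    #include <map>
    #include <vector>

    // Hypothetical names throughout: a minimal sketch of step-based rendering.
    enum class RenderingStep { PreProcess, Reflection, Opaque, Transparency, PostProcess, GUI };

    struct Effect {
        virtual ~Effect() = default;
        virtual void render() = 0;
    };

    class Renderer {
    public:
        // Effects are registered to a step; within a step they could further be
        // sorted by identifier (omitted here) so e.g. a skybox Effect runs last.
        void registerEffect(RenderingStep step, Effect* effect) {
            m_effects[step].push_back(effect);
        }

        void renderFrame() {
            // The map iterates steps in declaration order, and each step is fully
            // processed before the next begins, so a shadow map rendered during
            // PreProcess is ready to be sampled during Opaque.
            for (auto& [step, effects] : m_effects)
                for (Effect* effect : effects)
                    effect->render();
        }

    private:
        std::map<RenderingStep, std::vector<Effect*>> m_effects;
    };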

Both the Mesh and the Light objects are considered Renderables. The reason the Light is one is that the engine uses shadow maps, which are essentially a property of a Light. A Mesh or Light is linked to a Meta-Effect, which contains a list of EffectIDs. To the artist, the Meta-Effect is the effect applied to render the element; to the programmer, it is the sum of several EffectIDs.
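In code, that relationship might be sketched as below; the type and member names are guesses based on the text, not the engine's real declarations.

    #include <string>
    #include <vector>

    // Illustrative sketch: to the programmer, a Meta-Effect is just a list of EffectIDs.
    using EffectID = std::string;            // e.g. "Normal Mapping"

    struct MetaEffect {
        std::string name;                    // what the artist sees and assigns
        std::vector<EffectID> effectIDs;     // what the programmer composes
    };

    struct Renderable {
        MetaEffect* metaEffect = nullptr;    // both Mesh and Light carry one
    };

    struct Mesh  : Renderable { /* geometry, textures... */ };
    struct Light : Renderable { /* the shadow map lives here */ };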

An EffectID is resolved to an Effect at runtime by the engine, given the hardware capabilities. That allows existing, less demanding Effects to be reused as fallbacks. A river and a car can both use the “Reflection CubeMap” EffectID to handle their own reflections. To see how EffectID-to-Effect resolution works, consider the following example:

  • A "Single Texturing" Effect, requiring only texture mapping support.
  • A "DOT3 Bump Mapping" Effect, requiring DOT3 hardware support.
  • A "Shader Normal Mapping" Effect, requiring shader support.
  • A "Simple Texturing" Meta-Effect, which render a textured object.
  • A "Normal Mapping" Meta-Effect, which render a bumpy object using a normal map.

The "Single Texturing" Effect, will declare being compatible with the "Simple Texturing" EffectID, at a priority 10, but also being compatible with the "Normal Mapping" EffectID, at priority 5. For the Effect writer, declaring that "Single Texturing" Effect supports "Normal Mapping" EffectID at priority 5 mean that it's intended to be used as a fallback. The "DOT3 Bump Mapping" Effect will declare being compatible with the "Normal Mapping" EffectID at priority 10. The "Shader Normal Mapping" Effect will declare being compatible with the "Normal Mapping" EffectID at priority 20.

When the engine meets the "Normal Mapping" Meta-Effect, which contains a single EffectID of the same name not yet linked to an Effect, it will find all the Effects that declare supporting it, then try to initialise each of them, in descending priority order, until one works. In our case, the engine will start with "Shader Normal Mapping"; if the hardware supports shaders, the initialisation will succeed, and the engine will link it to the EffectID. If the engine is running on a GeForce 2, that Effect won't initialise, and the "DOT3 Bump Mapping" Effect will be initialised successfully and linked to the EffectID.
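The resolution loop itself is short. Here is a sketch assuming each Effect exposes a tryInitialise() that probes the hardware and fails when unsupported; the name is hypothetical.

    #include <algorithm>
    #include <vector>

    struct Effect {
        virtual ~Effect() = default;
        // Assumed to return false when the hardware can't run this Effect.
        virtual bool tryInitialise() = 0;
    };

    struct Candidate {
        Effect* effect;
        int     priority;
    };

    // Try candidates in descending priority order; the first one whose
    // initialisation succeeds gets linked to the EffectID.
    Effect* resolve(std::vector<Candidate> candidates) {
        std::sort(candidates.begin(), candidates.end(),
                  [](const Candidate& a, const Candidate& b) {
                      return a.priority > b.priority;
                  });
        for (const Candidate& c : candidates)
            if (c.effect->tryInitialise())
                return c.effect;   // e.g. "Shader Normal Mapping" on shader hardware
        return nullptr;            // no compatible Effect on this machine
    }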

Last, if the hardware is even older, such as a TNT, the previous Effect will fail to initialise, and the "Single Texturing" Effect will be initialised successfully and linked to the EffectID. As I said earlier, the Meta-Effect exists because an object might need more than a single rendering operation to be performed to render itself. For example, the "Reflecting Normal Mapped" Meta-Effect requires two EffectIDs: "Reflection CubeMap" and "Reflecting Normal Mapping". The first is registered to the 'Reflection' RenderingStep, and the second to the 'Opaque' RenderingStep.
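The two-pass case could be declared along these lines. Pairing each EffectID with a RenderingStep inside the Meta-Effect is an assumption made for the sketch; the text says the step registration belongs to the Effects themselves.

    #include <string>
    #include <vector>

    enum class RenderingStep { PreProcess, Reflection, Opaque, Transparency, PostProcess, GUI };

    struct EffectIDRef {
        std::string   id;
        RenderingStep step;     // where the resolved Effect ends up registered
    };

    struct MetaEffect {
        std::string              name;
        std::vector<EffectIDRef> effectIDs;
    };

    // The example from the text: one pass fills the cube map during the
    // Reflection step, the other samples it during the Opaque step.
    MetaEffect reflectingNormalMapped {
        "Reflecting Normal Mapped",
        {
            { "Reflection CubeMap",        RenderingStep::Reflection },
            { "Reflecting Normal Mapping", RenderingStep::Opaque     },
        }
    };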

Effects

An Effect contains rendering code that may be a part of, or the whole of, the process of getting the Renderable on screen. In the OctoPort version, with the patched RenderGraph, both the Light and Mesh were Renderables, and there was a single Effect interface. In FlExtEngine, there is a Renderable interface, which is the base of the Mesh interface, and there is a base class for Effects, which is then specialised as MeshEffect and GenericEffect. The reason there are two kinds of Effect is that GenericEffects operate on Renderables, whereas MeshEffects operate on Meshes.
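The split might be sketched as the following hierarchy; the class names follow the text, but the method signatures are guesses.

    // Renderable is the base of Mesh; Effect is specialised per operand type.
    struct Renderable {
        virtual ~Renderable() = default;
    };

    struct Mesh : Renderable { /* vertex data, textures... */ };

    struct Effect {
        virtual ~Effect() = default;
    };

    // Operates on any Renderable (a Light's shadow map pass, for instance).
    struct GenericEffect : Effect {
        virtual void render(Renderable& renderable) = 0;
    };

    // Operates specifically on Meshes and their vertex data.
    struct MeshEffect : Effect {
        virtual void render(Mesh& mesh) = 0;
    };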

Effects declare the data they will need: a GenericEffect declares framebuffers (size, targets, type...), and a MeshEffect declares a vertex declaration (vertex attributes) and textures. When a Renderable is bound to a Meta-Effect, the engine runs through the declarations of all the resolved Effects so as to load only the subset of data required. The exact set of data required can only be known at runtime, since it is only at that moment that the compatible Effects to be used are known.
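Computing that subset could amount to taking the union of the resolved Effects' declarations, as in this hedged sketch; the declaration type is invented for illustration.

    #include <set>
    #include <string>
    #include <vector>

    // Illustrative only: each resolved MeshEffect declares what it needs,
    // and the engine loads just the union of those declarations.
    struct MeshEffectDecl {
        std::vector<std::string> vertexAttributes;  // e.g. "position", "tangent"
        std::vector<std::string> textures;          // e.g. "diffuse", "normalMap"
    };

    std::set<std::string> requiredAttributes(const std::vector<MeshEffectDecl>& decls) {
        std::set<std::string> required;
        for (const MeshEffectDecl& d : decls)
            required.insert(d.vertexAttributes.begin(), d.vertexAttributes.end());
        return required;  // only this subset of the model's data gets loaded
    }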

That solution obviously has an effect on the model's data. Either the exporter must loop through all the Effects and their fallbacks to build up a list of required data (such as the vertex attributes to export), or a rule must be made that all fallback Effects only use a subset of the most demanding Effect's data. To avoid having to compute that too often, it might be a good idea to store that data list inside the Meta-Effect, since the Meta-Effect is what an artist sets on the model.

Editor's Note

Next time, Roderic will talk about resource management in the game engine, be it geometry, materials, or anything else the engine has to consume in order to draw the right pixels. We hope to publish that in a little over a week, rather than the bazillion months we took to get this part out of the door after part 1.