Perturbed Environment Bump Mapping

This bump mapping technique is based on the principles of environment mapping and is supported by Microsoft in DirectX 6.0. It can be implemented in software without problems, but hardware implementations on existing or announced hardware seem to be unavailable.

I will use two pictures to explain this technique. The first picture shows normal environment mapping:

[Picture 1: normal environment mapping]

The central sphere is our object; it could be any shape, but a sphere is the easiest to explain. The object is surrounded by a much larger sphere, and the environment map is mapped onto that outer sphere. To do environment mapping, the hardware must be able to find the correct texel in the environment map for each point on the object. This is done with mathematical equations and delivers the texels along the arrows as shown. With a sphere as the object, the outer sphere is simply projected onto the object sphere; similar principles apply to other objects, it's just more complex math. Now we want to do bump mapping, so we want lighting effects. For this reason we use an environment map that contains a light map. Here the environment map has a bright light at the right: imagine the big sphere being bright yellow on the right side and dark on the left. At this point the environment mapping simply simulates normal lighting. So let's now introduce some bumps:
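The texel search described above can be sketched in code. This is an illustrative software sketch, not the DirectX API: it assumes the classic sphere-mapping formula for turning a reflection vector into environment-map coordinates, and the function names (`sphere_map_uv`, `env_lookup`) are hypothetical.

```python
import math

def sphere_map_uv(reflect):
    """Map a unit reflection vector to (u, v) coordinates in a sphere
    environment map, using the classic sphere-mapping formula."""
    rx, ry, rz = reflect
    m = 2.0 * math.sqrt(rx * rx + ry * ry + (rz + 1.0) ** 2)
    return rx / m + 0.5, ry / m + 0.5

def env_lookup(env_map, reflect):
    """Fetch the environment texel for a reflection vector
    (nearest-neighbour sampling, coordinates clamped to the map)."""
    h = len(env_map)
    w = len(env_map[0])
    u, v = sphere_map_uv(reflect)
    x = min(w - 1, max(0, int(u * w)))
    y = min(h - 1, max(0, int(v * h)))
    return env_map[y][x]
```

A reflection vector pointing straight back at the viewer, (0, 0, 1), lands exactly in the centre of the map, which is why the brightest highlight is usually painted there.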

[Picture 2: environment mapping with bumps]

Of course the object doesn't get real bumps, but the picture shows the idea: the bumps change which texel is selected from the environment map, and that is how the light and dark effects are generated. This means that when looking up texels in the environment map, a perturbation derived from a bump map is applied. So basically you change the math that finds the correct environment texel.
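The perturbation step can be sketched as follows, again as an assumed software model rather than the hardware's exact pipeline: the bump map is treated as a height map, its local slope gives a (du, dv) offset, and that offset shifts the environment-map coordinates before the texel is fetched.

```python
def bump_gradients(bump, x, y):
    """Forward-difference slope of the bump (height) map at (x, y),
    returned as a (du, dv) perturbation pair."""
    h = len(bump)
    w = len(bump[0])
    du = bump[y][min(x + 1, w - 1)] - bump[y][x]
    dv = bump[min(y + 1, h - 1)][x] - bump[y][x]
    return du, dv

def perturbed_env_lookup(env_map, bump, x, y, u, v, scale=1.0):
    """Dependent texture read: perturb the environment coordinates
    (u, v) by the bump slope at (x, y), then fetch the light texel."""
    du, dv = bump_gradients(bump, x, y)
    eh = len(env_map)
    ew = len(env_map[0])
    px = min(ew - 1, max(0, int((u + scale * du) * ew)))
    py = min(eh - 1, max(0, int((v + scale * dv) * eh)))
    return env_map[py][px]
```

With a flat bump map the offsets are zero and the lookup is ordinary environment mapping; a sloped bump shifts the lookup toward or away from the highlight, which is exactly where the light/dark shading comes from.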

This technique is very advanced because it can support several lights in one pass (just place more than one highlight in the environment map). You can also make reflective bump-mapped objects, a combination of bump mapping and environment mapping. Microsoft has given us a screenshot of that technique:
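To make the "several lights in one pass" point concrete, here is a hypothetical helper that paints more than one highlight into a single environment light map; the falloff model (quadratic, clamped at zero) is an assumption purely for illustration.

```python
def make_lightmap(size, lights):
    """Build a square environment light map with one bright spot per
    light. `lights` is a list of ((cx, cy), radius) in texel units;
    overlapping spots keep the brighter value."""
    img = [[0.0] * size for _ in range(size)]
    for y in range(size):
        for x in range(size):
            for (cx, cy), r in lights:
                d2 = (x - cx) ** 2 + (y - cy) ** 2
                # Quadratic falloff from 1.0 at the centre to 0.0 at radius r.
                img[y][x] = max(img[y][x], max(0.0, 1.0 - d2 / (r * r)))
    return img
```

Because all the highlights live in one texture, the perturbed lookup picks up every light in the same single pass, with no per-light geometry cost.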


[Screenshot: combined bump mapping and environment mapping]

The problem with this technique is that it needs two texture maps at the same time: you read the perturbation from the first map and use it when looking up the correct texel in the second. Doing this on existing hardware is impossible; future hardware will have to include extra gates to support it.