Filtering Quality - FilterTest

We begin our look at filtering quality with the current driver's output in FilterTest, a popular D3D9 AF testing tool, at each of the available AF levels. In Quality mode (the driver's out-of-the-box setting), NVIDIA's long-used trilinear filtering optimisation is enabled by default, whereas in High Quality mode all optimisations are claimed to be off.
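Tools like FilterTest make level selection visible by assigning each mip level of a test texture its own solid colour, so the transitions between levels (and the shape of any blend between them) show up as coloured bands on the rendered surface. We're not working from FilterTest's actual source here; what follows is a minimal C++ sketch of that general technique, with all names our own.

```cpp
#include <cstdint>
#include <vector>

// Build a mip chain where every level is a single high-contrast colour.
// When a surface is textured with this chain, the hardware's mip selection
// and the blend between adjacent levels become directly visible; with AF,
// angle-dependent level selection shows up as bands that shift with the
// surface's orientation.
struct MipLevel {
    int size;                      // width == height, in texels
    std::vector<uint32_t> texels;  // ARGB, one solid colour per level
};

std::vector<MipLevel> makeColouredMipChain(int baseSize)
{
    const uint32_t colours[] = { 0xFFFF0000, 0xFF00FF00, 0xFF0000FF,
                                 0xFFFFFF00, 0xFFFF00FF, 0xFF00FFFF };
    std::vector<MipLevel> chain;
    int level = 0;
    for (int size = baseSize; size >= 1; size /= 2, ++level) {
        chain.push_back({ size, std::vector<uint32_t>(
            size_t(size) * size, colours[level % 6]) });
    }
    return chain;
}
```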

Quality Mode - 0x, 2x, 4x, 8x, 16x

[Five FilterTest screenshots: Quality mode at 0x, 2x, 4x, 8x and 16x AF]

High Quality - 0x, 2x, 4x, 8x, 16x

[Five FilterTest screenshots: High Quality mode at 0x, 2x, 4x, 8x and 16x AF]

It's clear from what FilterTest measures that the default quality is a big leap over what's available on GeForce 6- and 7-series products (and let's not forget GeForce FX, too). While not truly angle-invariant (which we'll show you on the next page!), it's the best AF has ever been on consumer 3D hardware.
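As for what invariance means here: ideal AF derives the degree of anisotropy purely from the pixel's footprint in texture space, independent of the surface's on-screen angle. A minimal sketch of that computation, following the formulation in the EXT_texture_filter_anisotropic specification (function and parameter names are ours):

```cpp
#include <algorithm>
#include <cmath>

// Degree of anisotropy for one fragment, per the EXT spec's 2D formulation.
// dudx/dvdx/dudy/dvdy are the texture coordinate derivatives across screen
// space; maxAniso is the application's requested cap (2.0f .. 16.0f here).
float anisotropyDegree(float dudx, float dvdx, float dudy, float dvdy,
                       float maxAniso)
{
    // Footprint extents along the two screen axes.
    float px = std::sqrt(dudx * dudx + dvdx * dvdx);
    float py = std::sqrt(dudy * dudy + dvdy * dvdy);

    float pMax = std::max(px, py);
    float pMin = std::min(px, py);

    // Long-axis to short-axis ratio, clamped to the requested maximum.
    // Angle-variant hardware effectively lowers this value for footprints
    // at certain orientations; angle-invariant hardware does not.
    return std::min(std::ceil(pMax / std::max(pMin, 1e-6f)), maxAniso);
}
```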

While NVIDIA can certainly use per-application detection to pare that back, returning to the older angle-variant level selection (the hardware still offers that choice), we hope it's not one made for many applications given the filtering horsepower G80 has at its disposal.

We'll also pipe up again about the driver turning off all filtering optimisations out of the box, at least for SKUs like GeForce 8800 GTX and GTS (we said the same of certain GeForce 7-series products), where the chip's configuration is performant enough that we urge users to do so themselves. Quality mode still applies the age-old trilinear optimisation, which we'd like to see disappear: make High Quality the default driver mode and let the user apply the tradeoffs, not the IHV.
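To illustrate the optimisation we're complaining about: full trilinear blends two bilinear samples across the whole fractional LOD range, while the trilinear opt ("brilinear") narrows that blend to a band around each mip transition and samples just one level elsewhere, saving texel fetches at the cost of visible mip banding. A hypothetical sketch of the idea (the band parameter and names are ours, not NVIDIA's):

```cpp
#include <algorithm>

// Blend weight between two adjacent mip levels for a given fractional LOD.
// bandHalfWidth == 0.5f reproduces full trilinear (weight == lodFrac
// everywhere); smaller values snap the weight to 0 or 1 outside a narrow
// band around the transition, so only one level is sampled there.
float trilinearBlendWeight(float lodFrac, float bandHalfWidth)
{
    float lo = 0.5f - bandHalfWidth;
    float hi = 0.5f + bandHalfWidth;
    float span = std::max(hi - lo, 1e-6f);  // guard against a zero band
    return std::clamp((lodFrac - lo) / span, 0.0f, 1.0f);
}
```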