Conclusion

It should be clear from our Architecture and GPU Analysis piece and this article that G80 is a mightily capable GPU in terms of final pixel-level image quality. Defined by its D3D10 compatibility, G80 extends most of that defined capability to D3D9-class applications, too.

With largely orthogonal filtering and ROP hardware that doesn't care what it's sampling from or filtering in order to commit final pixels, we can't take issue with what it's capable of in that respect. On top of that, the driver doesn't seem to be playing tricks when it comes to setting up the hardware to do requested work, such that the filtering issues prevalent in prior NVIDIA hardware appear to have disappeared (and that should be the case for the lifetime of all G8x-based products, we think).

CSAA, however, throws up some interesting questions. It exists outside of D3D, and thus without NVIDIA remapping regular multisample modes to CSAA modes (with or without the user or something like WHQL being in on it), it becomes a product of the driver control panel interface to set and use effectively.

Add to that the fact it's not infallible in its method, so you won't always get the quality improvement per-pixel, along with the fact that 16xQ CSAA costs about as much in performance (if not silicon) as a true 16x+ MSAA mode would because of the depth test costs, and on the surface it can seem slightly curious why NVIDIA would spend engineering effort and transistors on the feature. The answer, we suspect, is that the silicon costs for CSAA are sufficiently cheap that NVIDIA can offer its users a genuine and usable IQ enhancement, outside of D3D, should they choose to enable it.

Therefore we can only take issue with CSAA and its implementation, and NVIDIA's insistence that filtering optimisations still need to be applied as a default on hardware like the GeForce 8800 GTS and 8800 GTX. The rest of the hardware seems to do a sterling job from our investigations, and, as we said in the Architecture and GPU Analysis piece, if there's a glaring fault in its IQ (even with early drivers) that isn't an application-specific issue, we are yet to find it.

There's a whole section of image quality discussion that concerns non-3D applications of G80, mostly motion video, which relies on a driver not yet released at the time of writing; that will form another satellite piece (time permitting). We should mention, though, that the latest released driver for G80 apparently generates a score of 128 in the HQV standard definition DVD video benchmark.

And so before we finish here, leaving performance to discuss in the third part of our G80 investigation, we're left saying that image quality is nothing without the rendering horsepower to realise it at usable framerates; such is the entire tenet of real-time rendering. It's probably clear to all that the first two G80 SKUs are pretty performant in modern games and applications, to the point of seriously walking over all current competition in many cases. We'll check that reality out soon.

Comments

Want to comment on the article? Follow on to the discussion thread on our forums.