Product: GeForce FX 5800 Ultra (Reference Sample)
Chipset: NVIDIA GeForce FX
 

We've had allusions to it at the launch of GeForce4 last February, we've discussed every rumour, we've studied financial releases and conference calls for expected release dates, we've been handed titbits of information, we've studied developer details, we've attended the launch, we've interviewed product managers for details and we've anticipated the product's arrival. Now, finally, we get to see what NV30, formally known as GeForce FX, one of the most anticipated graphics cards of 2002 (and early 2003), can actually do.

The picture painted above suggests the road to GeForce FX has been a rocky one. Despite NVIDIA management's insistence that GeForce FX would be ready for Fall 2002, it soon became clear that it wouldn't be, and at the launch the official line was that volume product availability wouldn't come about until early 2003. Exactly what caused the delays changes depending on who you talk to; some say they were due to a reconfiguration of the chip to increase its power in the face of some unexpected competition, while others say it all comes down to the early adoption of TSMC's .13µ process and it not being ready when NVIDIA had expected. The latter talk puts the delays down to NVIDIA first designing the part for TSMC's low-k dielectric .13µ process, which wasn't ready at the time NVIDIA needed it, forcing them to move from the low-k process to the standard .13µ process.

 

A Little History

Ever since the delayed launch of the VSA-100 chip from 3dfx (NVIDIA's primary competition at the time), designed to power the Voodoo4 and 5 series, NVIDIA have been in the driving seat for 3D graphics. GeForce256 arrived, touting T&L amongst other features, while 3dfx were 'paper launching' VSA-100 at Comdex. Then GeForce2 GTS appeared well in time for Voodoo5's eventual release, out-powering and out-featuring 3dfx's product. The demise of 3dfx followed later that year, with NVIDIA reaping the technology and much of their sought-after engineering talent. NVIDIA's next high-end 3D product, GeForce3, was released in a virtual competitive vacuum, with features and performance that clearly outclassed everything else within a little while of its release. Some argue that GeForce3 was in fact late, but even if this was the case it was inconsequential since there was little competition at the time, and they had a handy refresh from the prior year in the form of GeForce2 Ultra.

While 3dfx, a company that held the hearts and minds of many gamers, met its demise, NVIDIA already knew that their real competition lay with ATI. While perhaps not holding the mind share of the gamers, ATI still had the ear of many OEMs, and this, as NVIDIA knows well, is where the real money is found. During GeForce2's reign, ATI showed that it still had a hand in the 3D market. The original Radeon had what it took in terms of features and innovation, but its configuration didn't let it shine in gaming titles of the era. ATI's next release, Radeon 8500, showed they were on the ball in terms of features, but with the part appearing six months after GeForce3, and with what looked to have been a slightly rushed release, it remained in the mid-range performance bracket as NVIDIA had their GeForce3 Ti 500 and Ti 200 refreshes ready. NVIDIA further hammered their point home in February 2002 with the release of their GeForce4 line.

Despite NVIDIA having products to answer ATI at every turn, it wasn't quite enough to push ATI to the same fate as 3dfx. NVIDIA had, however, instilled in the minds of the OEMs that they were the company that delivers, and ATI's market share had been falling. ATI appear to have realised that in the face of NVIDIA's fierce competition it would be a case of reform or die. To this end, ATI took their own measures: first by purchasing ARTX, an IP company formed by a core group of ex-SGI engineers and known to hold the license on the graphics element of Nintendo's Gamecube console, for $400 million (a sum that still causes much discussion among ATI's investors), and then by ATI's CEO, K.Y. Ho, stepping aside from the operational level and letting Dave Orton (inherited from ARTX) take over the reins. Inevitably it takes some time for such changes to shake out into products, but when they finally did, it couldn't have happened at a more annoying time for NVIDIA.

While the rumour mill around ATI's latest release churned away, R300, the first tangible fruit of the union of ATI and ARTX, was eventually announced and managed to raise many eyebrows. Putting NVIDIA's GeForce4 Ti 4600 in the shade with its 325MHz clock speed, 8 pixel pipelines and 256-bit memory bus, the Radeon 9700 squarely put ATI back on the map and raised their profile not just with the OEMs, but with gamers and developers alike. Although the expectation was that NV30, with its advanced manufacturing process, would easily be more than a match for the Radeon 9700 PRO, it wasn't on hand; instead we were hearing a steady stream of rumour and delay.

It's now been about six months since the release of the Radeon 9700 PRO, which is six months in which NVIDIA haven't had the highest-performance product -- almost unheard of since GeForce256, and arguably before. However, GeForce FX is starting to break cover and make its presence known, and it's time to see exactly what it offers in terms of performance and features we can use now...