The gap between high end and low end graphics boards appears to be growing ever larger, and enthusiasts are enabling an ever greater range of image quality features by default in their applications. Are we likely to see a change in the default settings, such as higher resolutions with FSAA and/or Anisotropic Filtering, or do the differing IHV implementations make this difficult? Will we see a return to high and low detail benchmarks for the high end and low end boards?

Nick: As much as we would like to have Anisotropic Filtering and Anti-Aliasing on by default, it most probably won’t happen, because it would jeopardize apples-to-apples comparisons. The problem is that even though 3DMark would ask for Anti-Aliasing or Anisotropic Filtering, the driver has the final say in what actually gets rendered. If the driver tells the application “yeah, AA is on”, it could still render the frame without any AA. That is a problem, and one that is very difficult to solve. This is not to say that we distrust what drivers do, but the risk is too high that some driver version would have broken AA/AF, making the results incomparable. We would love to have 4xAA and 8xAF on by default, but until we come up with a 100% reliable way to ensure that they really are on during the default run, they may remain in the benchmark only as options. The default resolution is still not quite decided, so that’s something I can’t comment on.
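The verification problem Nick describes can be sketched in a few lines. One heuristic (purely an illustration, not Futuremark's method) is to read back the framebuffer after drawing a high-contrast diagonal edge and look for intermediate pixel values, which only appear along the edge when it is sampled more than once per pixel. Here the "rendering" is simulated in software; all names and the detection rule are hypothetical:

```python
# Illustrative sketch only: a readback heuristic for checking whether
# anti-aliasing was really applied. We "render" a black/white diagonal edge
# (simulated in software here) and look for intermediate grey values, which
# only occur when the edge is sampled more than once per pixel.

def render_edge(size, samples_per_axis):
    """Rasterize the half-plane y < x into a size x size greyscale image,
    averaging samples_per_axis^2 sub-samples per pixel (1 = no AA)."""
    image = []
    for y in range(size):
        row = []
        for x in range(size):
            covered = 0
            n = samples_per_axis
            for sy in range(n):
                for sx in range(n):
                    # Sub-sample position inside the pixel.
                    px = x + (sx + 0.5) / n
                    py = y + (sy + 0.5) / n
                    if py < px:
                        covered += 1
            row.append(covered / (n * n))  # coverage in [0.0, 1.0]
        image.append(row)
    return image

def looks_antialiased(image):
    """True if any pixel holds a partial-coverage (intermediate) value."""
    return any(0.0 < v < 1.0 for row in image for v in row)

aliased = render_edge(16, 1)  # point sampling: only pure black/white pixels
aa_4x = render_edge(16, 2)    # 2x2 = 4 samples per pixel

print(looks_antialiased(aliased))  # False
print(looks_antialiased(aa_4x))    # True
```

A real implementation would face exactly the difficulty Nick mentions: a driver could special-case a known test pattern, which is why a "100% reliable" check is so hard to build.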

The next 3DMark will continue along the same route we took with 3DMark03. It won’t have any low and high detail scenes; only a single detail level.

Testing showed that 3DMark03's game tests, bar game test 1, are fairly insensitive to CPU performance, putting the onus squarely on graphics performance. We've been talking to Tim Sweeney about UnrealEngine3, and he has often remarked how graphics bound the engine is - do you believe there is a trend in this direction, or do you feel that your focus solely on graphics performance is not in line with games and might need to change in the future? (Note: for Beyond3D's purposes we like the fact that it is remarkably graphics bound, as we want to be testing graphics card performance)

Nick: 3DMark mainly measures the graphics performance of the PC. For full system benchmarking we strongly suggest using PCMark04. In 3DMark2001 SE the game tests were actually GPU/VPU bound when it was released, but over time they have become more and more bottlenecked by the rest of the system (CPU, memory speed etc.). 3DMark03 has also reached a point where the CPU has more impact on the result. The latest generation of graphics cards is incredibly fast, which in many cases leaves the GPU/VPU waiting for the CPU to catch up.

Patric: We always design a new 3DMark version to scale with the graphics hardware, so that it can efficiently compare the performance of the very latest and greatest graphics hardware, at least at launch time. In time each 3DMark version becomes more or less CPU bound anyway, as graphics hardware performance grows faster than CPU performance. So if we launched an already CPU bound benchmark, it would only become more so over time, and we would never really measure graphics performance. Many game benchmarks out there were CPU bound already at launch, became even more so over time, and therefore offer less interesting graphics performance measurements.
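Patric's point about benchmarks drifting from GPU bound to CPU bound can be captured with a toy model (an assumption for illustration, not Futuremark's methodology): per-frame time is roughly the maximum of the CPU and GPU work per frame, and GPU throughput has historically grown faster than CPU throughput. The starting costs and yearly speedup factors below are invented:

```python
# Toy model of why a benchmark that is GPU bound at launch drifts toward
# being CPU bound over time. All numbers are hypothetical.

def frame_ms(cpu_ms, gpu_ms):
    # CPU and GPU work largely in parallel; the slower side sets the pace.
    return max(cpu_ms, gpu_ms)

cpu_work_ms = 10.0  # hypothetical per-frame CPU cost at launch
gpu_work_ms = 30.0  # hypothetical per-frame GPU cost at launch

for year in range(5):
    fps = 1000.0 / frame_ms(cpu_work_ms, gpu_work_ms)
    bound = "GPU" if gpu_work_ms > cpu_work_ms else "CPU"
    print(f"year {year}: {fps:5.1f} fps, {bound} bound")
    cpu_work_ms /= 1.3  # assumed yearly CPU speedup
    gpu_work_ms /= 1.8  # assumed faster yearly GPU speedup
```

With these assumed growth rates the benchmark starts GPU bound and, after a few hardware generations, the CPU side sets the frame rate - which is why launching a benchmark that is already CPU bound would mean never really measuring the graphics card at all.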

Mr. Sweeney may be right. Shader technology allows the use of light, DX-wrapper type graphics engines, like the one we used in 3DMark03 and the one we’ll use in the next 3DMark. These need less CPU power to get something drawn on screen, which leaves more system resources for other tasks. You can of course program your engine to use the CPU for some rendering as well, but the graphics hardware of today is so powerful that only a few drawing optimizations on the CPU end up boosting performance in any reasonable way. Most of them have an even more efficient implementation on the GPU/VPU. I guess CPU optimizations for rendering make more sense in code paths intended for legacy or value hardware.