In your opinion, what is currently the biggest hurdle to overcome in 3D hardware technology? We've seen shaders go beyond simple texturing and lighting, and we're witnessing faster bus technology… what specifics would you like to see addressed?

Now is a great time because 3D hardware is coming out of the dark ages of being toy game acceleration technology, and morphing into highly-parallel general computing technology in its own right. The last hurdle is that the GPU vendors need to get out of the mindset of "how many shader instructions should we limit our cards to?" and aim to create true Turing-complete computing devices.

We're already almost there. You just need to stop treating the 1024-instruction shader limit as hardcoded, and redefine it as a 1024-instruction cache of a program stored in main memory. Then my 1023-instruction shaders will run at full performance, and my 5000-instruction shaders might run much more slowly, but at least they will run rather than erroring out or corrupting rendering data. You need to stop looking at video memory as a fixed-size resource, and integrate it seamlessly into the virtual-memory page hierarchy that has existed in the computing world for more than 30 years. The GPU vendors need to overcome some hard technical problems, and also some mental blocks.
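
A minimal C sketch of the cache idea Sweeney describes (the names, such as icache_fetch, and the direct-mapped layout are invented for illustration, not any vendor's actual interface): the fixed on-chip instruction store becomes a cache over a full program kept in main memory, so a shader that fits runs at cache speed while a longer one merely misses more often.

```c
#include <stdint.h>

#define ICACHE_SLOTS 1024             /* on-chip capacity: 1024 instructions */

typedef struct {
    uint64_t tag[ICACHE_SLOTS];       /* which program counter owns each slot */
    uint64_t data[ICACHE_SLOTS];      /* cached instruction words */
    int      valid[ICACHE_SLOTS];
} ICache;

/* The full shader program lives in main memory and may be any length. */
static uint64_t fetch_from_main_memory(const uint64_t *program, uint64_t pc) {
    return program[pc];               /* slow path: off-chip access */
}

/* Fetch the instruction at 'pc'. A shader that fits in 1024 instructions
 * always hits after warm-up and runs at full speed; a 5000-instruction
 * shader merely misses more often -- it still runs correctly rather than
 * failing with a hard instruction-count error. */
uint64_t icache_fetch(ICache *c, const uint64_t *program, uint64_t pc) {
    uint64_t slot = pc % ICACHE_SLOTS;        /* direct-mapped placement */
    if (!c->valid[slot] || c->tag[slot] != pc) {
        c->data[slot]  = fetch_from_main_memory(program, pc);  /* refill */
        c->tag[slot]   = pc;
        c->valid[slot] = 1;
    }
    return c->data[slot];                     /* on-chip speed on a hit */
}
```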

In the long run, what will define a GPU -- as distinct from a CPU -- is its ability to process a large number of independent data streams (be they pixels or vertices or something completely arbitrary) in parallel, given guarantees that all input data (such as textures or vertex streams) are constant for the duration of their processing, and thus free of the kind of data hazards that force CPU algorithms to single-thread. There will also be a very different set of assumptions about GPU performance -- that floating-point math is probably much faster than on a CPU, that mispredicted branches are probably much slower, and that cache misses are probably much more expensive.
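
A minimal C sketch of what that hazard-free guarantee buys (the function and its parameters are invented for illustration; OpenMP on the CPU stands in for the GPU's hardware-wide parallelism):

```c
#include <stddef.h>

/* Process a large number of independent data streams -- pixels, vertices,
 * or something completely arbitrary. Because 'input' is guaranteed constant
 * for the duration of the pass, no iteration can observe another's writes:
 * there are no data hazards, so every element may be computed in parallel
 * without locks or ordering constraints. */
void process_streams(const float *input, float *output, size_t n, float gain) {
    #pragma omp parallel for
    for (ptrdiff_t i = 0; i < (ptrdiff_t)n; i++) {
        output[i] = input[i] * gain;   /* depends only on read-only inputs */
    }
}
```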

When can we expect to see games featuring UnrealEngine3? Has there been any interest in this engine from potential licensees?

We're not going to be announcing anything along these lines publicly for a while, because UnrealEngine3 projects are quite early in development and are largely tied to platforms and launch timeframes that haven't been announced. But we will be showing UnrealEngine3 behind closed doors at GDC to select development teams.

Is UnrealEngine3 being created with Longhorn (the codename for Microsoft's next operating system) in mind?

We expect to ship 32-bit and 64-bit executables on-disc, likely with the highest level of graphical detail our game supports on PC available only on 64-bit CPUs running the codename Longhorn OS. We certainly won't require Longhorn to run the game, but there is a lot we can do based on its key architectural improvements, the larger address space among them.
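
A hypothetical C sketch of gating detail on address space at compile time (the detail-level knob is invented for illustration, not Epic's actual code):

```c
#include <stdio.h>

/* Detect the build's pointer size at compile time: one codebase, two
 * executables on the disc. _WIN64 covers MSVC; __SIZEOF_POINTER__ covers
 * GCC and Clang. */
#if defined(_WIN64) || (defined(__SIZEOF_POINTER__) && __SIZEOF_POINTER__ == 8)
#  define LARGE_ADDRESS_SPACE 1   /* 64-bit build: room for top-detail assets */
#else
#  define LARGE_ADDRESS_SPACE 0   /* 32-bit build: cap content detail */
#endif

int main(void) {
    /* max_detail_level is an invented knob for this sketch. */
    int max_detail_level = LARGE_ADDRESS_SPACE ? 5 : 3;
    printf("Max graphical detail level: %d\n", max_detail_level);
    return 0;
}
```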

With consoles being more attractive than the PC in terms of profits, how does this affect the design of game engines that simultaneously target both platforms?

Next-generation consoles weigh very heavily on our minds for the third-generation Unreal Engine, and they are going to be a major focus of ours, both from a game point of view and from an engine point of view. Can't say more yet, though.

"Now is a great time because 3D hardware is coming out of the dark ages of being toy game acceleration technology, and morphing into highly-parallel general computing technology in its own right."

— Tim Sweeney, Epic Games

What are your thoughts on PCI-Express in terms of its potential effect on game engine design decisions, as discussed by Ubisoft's Dany Lepage on his developer page at our site?

Fundamental architectural improvements due to PCI-Express will likely have to wait for the next major Microsoft OS release to be widely utilized by games and by video drivers. But in the meantime, it's a great platform improvement, allowing another decade of significant performance scaling just as AGP reaches its limits. I'm also looking forward to it as a geek toy: being able to buy high-end PCs with two PCI-Express graphics slots and plug in two high-end video cards.