CEDEC 2007: Capcom on Lost Planet Part II

Sunday 30th September 2007, 09:00:00 PM, written by Stefan Salzl

As promised, the second part of our summary of the Capcom presentation on Lost Planet at CEDEC 2007 focuses on the PC version of the game and the improvements it gains from DirectX 10. Part I can be found here. Since the features of the Xbox 360 version have already been discussed in Nishikawa’s article (translated by our forum member one; a technical summary of the GPU effects and of the engine’s approach to parallelism), we will concentrate on the new features of the latest update. We have also supplemented the GameWatch summary with information from a 4gamers.net article on Lost Planet to clarify several points made in the talk.

According to Takami Taki, technical manager of Capcom’s Production Studio 2, the PC version was first proposed at the end of 2006. In only five months the game was ported to the PC and released in June, and an update in August added a few new features.

One of the first improvements added to the PC version was ambient occlusion mapping (also supported in DX9 mode), a feature missing in the X360 version (Slide - Comparison).

According to Capcom, the DirectX 10 mode offers not only a performance boost but also an increase in visual quality. The performance gains stem from new DirectX 10 features such as geometry shaders, stream output, increased depth buffer resolution (no MSAA trickery) and comparative sampling of shadow maps (PCF). The biggest leaps in visual fidelity come from new effects, most of which benefit from geometry shaders: motion blur, depth of field and fur shading have all been improved in the latest Lost Planet update (Slide).

Motion Blur

The DirectX 10 motion blur extracts a velocity map from the source image, creates line geometry from it via the geometry shaders and combines the lines in an accumulation buffer. The objects are then blurred accordingly and composited with the source image (Schematic).
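The core idea of smearing each pixel along its screen-space velocity can be illustrated with a minimal CPU sketch. This is a toy approximation of the pipeline described above, not Capcom's actual geometry-shader implementation; the function name and sample count are illustrative.

```python
def smear_pixel(accum, x, y, vx, vy, color, steps=8):
    """Deposit a pixel's color along its screen-space velocity vector,
    approximating the line geometry a geometry shader would emit into
    the accumulation buffer."""
    w = color / steps  # each sample carries an equal share of the energy
    for i in range(steps):
        t = i / steps
        px, py = int(round(x + vx * t)), int(round(y + vy * t))
        if 0 <= py < len(accum) and 0 <= px < len(accum[0]):
            accum[py][px] += w
    return accum

# A single bright pixel moving 4 pixels to the right leaves a trail.
buf = [[0.0] * 8 for _ in range(3)]
smear_pixel(buf, 1, 1, 4.0, 0.0, 1.0)
```

The total energy of the pixel is preserved while being distributed along the motion path, which is why fast-moving objects blur without the ghosting of a purely image-space approach.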

The new implementation reduces both the occlusion artifacts and the artifacts caused by fast movement that are present in the DirectX 9 version. (DirectX 9 - DirectX 10)

Fur Shading

Similarly, fur shading benefits from DirectX 10’s geometry shaders. Instead of splitting the texture into layers and drawing them one after another, a method that causes visual artifacts when viewed at an angle, the updated fur shading uses the geometry shaders to create actual geometry. After the extraction of a fur map, which encodes the length and direction of the hair, line geometry is generated from it with geometry shaders and stored in an accumulation buffer, from which the final image is drawn. “We're literally growing hairs,” remarked Sawada, a programmer at Production Studio 2. (Schematic - DirectX 9 - DirectX 10 - Comparison Slide)
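The "growing hairs" step can be sketched as follows: for each surface sample, the fur map's length and direction are turned into a line segment from root to tip. This is a simplified 2D sketch of the concept; the data layout and function name are assumptions, and the real implementation emits the segments on the GPU.

```python
import math

def grow_hairs(fur_map):
    """For each surface sample, emit a line segment from the root to the
    tip, mimicking a geometry shader that 'grows' hairs from a fur map
    storing length and direction (an angle here, in 2D for brevity)."""
    segments = []
    for (x, y), (length, angle) in fur_map.items():
        tip = (x + length * math.cos(angle), y + length * math.sin(angle))
        segments.append(((x, y), tip))
    return segments

# Two samples: one hair pointing straight up, one at 45 degrees.
fur = {(0.0, 0.0): (1.0, math.pi / 2), (1.0, 0.0): (0.5, math.pi / 4)}
hairs = grow_hairs(fur)
```

Because the segments are generated from the fur map every frame rather than baked into layered textures, parameters like hair length can be changed on the fly, which is exactly what the real-time demo mentioned below exploited.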

The CEDEC presentation also included a real-time demo in which the length of the fur could be altered on the fly, demonstrating the flexibility of the method compared to solutions prepared in advance in 3D modeling programs.

Depth of Field

The old version of the depth of field effect blurred the image in a few gradations, which were then blended based on their distance from the viewpoint. The new method in Lost Planet enlarges points in out-of-focus areas based on their distance from the camera, similar to real-life photography.

Based on a depth map, the scene is divided into four viewports (Multiple Viewport Rendering), taking occlusion into account: an “In Focus” viewport, a “Close but Out of Focus” viewport, a “Wide and Out of Focus” viewport and a separate viewport for objects with large movement. For performance reasons, both out-of-focus viewports are rendered at a quarter of the final resolution, since they are also the viewports with the largest blur.

In each viewport, the circle of confusion (which, in Capcom’s algorithm, takes the shape of a hexagonal iris) is then calculated for each pixel based solely on its z-depth. The hexagonal iris is drawn into a triangular texture that circumscribes it. All the triangle textures are collected in an accumulation buffer and rendered to the viewport to give the final blurred viewport. Combining all four viewports then yields the final image with the improved depth of field effect (Schematic - DirectX 9 - DirectX 10).
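The article only says the iris size is derived from each pixel's z-depth; a standard way to do that is the photographic thin-lens circle-of-confusion formula, sketched below. The exact math Capcom uses is not documented, so treat this as an illustrative assumption.

```python
def coc_diameter(z, focus_dist, focal_len, aperture):
    """Thin-lens circle-of-confusion diameter for a point at depth z
    (all distances in the same units, e.g. meters). A point exactly in
    the focal plane has zero blur; blur grows away from it."""
    return aperture * focal_len * abs(z - focus_dist) / (z * (focus_dist - focal_len))

# Camera focused at 5 m, 50 mm lens, 20 mm aperture.
in_focus = coc_diameter(5.0, 5.0, 0.05, 0.02)   # on the focal plane
near     = coc_diameter(2.0, 5.0, 0.05, 0.02)   # closer than focus
far      = coc_diameter(20.0, 5.0, 0.05, 0.02)  # beyond focus
```

Note that the near point blurs more strongly than the far point at the same absolute distance from the focal plane, which matches the article's split into separate "close" and "wide" out-of-focus viewports.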

At first glance, collecting the iris in triangular textures seems inefficient, but Satoshi Ishida, also a programmer at Capcom’s Production Studio 2, notes in the 4gamer article: “Even on a GeForce 8800, if you create more than 3 vertices via geometry shaders, the performance drops dramatically. So the geometry shaders, the way we are using them now, generally emit 2 or 3 vertices. The efficiency would be better with rectangles, but if you actually measure the performance, using triangles is 1.5 times faster than rectangles.”

Moreover, the latest Lost Planet update also adds a new high-quality shadow setting (16 samples) for DirectX 9 users, in addition to the nine-sample medium setting previously available. The original 32-sample shadow setting, only available on DirectX 10 cards, has been renamed “DirectX 10 shadows”.
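The multi-sample shadow settings refer to percentage-closer filtering (PCF), mentioned earlier: instead of one binary depth comparison per pixel, several nearby shadow-map texels are compared and the results averaged, softening shadow edges. The sketch below uses a simple square tap kernel; the actual kernels and sample patterns Capcom uses are not documented here.

```python
def pcf_shadow(shadow_map, x, y, depth, radius=1):
    """Percentage-closer filtering: compare the receiver depth against
    several nearby shadow-map texels and average the binary results.
    A (2*radius+1)^2 tap kernel; radius=1 gives 9 taps, matching the
    medium setting's sample count."""
    taps, lit = 0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            sx, sy = x + dx, y + dy
            if 0 <= sy < len(shadow_map) and 0 <= sx < len(shadow_map[0]):
                taps += 1
                if depth <= shadow_map[sy][sx]:
                    lit += 1
    return lit / taps  # fraction of taps that see the light

# A receiver straddling a shadow boundary gets a partial (soft) result.
smap = [[0.4, 0.4, 0.9],
        [0.4, 0.4, 0.9],
        [0.4, 0.4, 0.9]]
soft = pcf_shadow(smap, 1, 1, 0.5)
```

Averaging the comparison results (rather than the depths themselves) is what makes PCF correct; DirectX 10's comparison samplers perform this per-tap compare in hardware, which is part of the performance gain cited above.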

According to Sawada, the updated DirectX 10 version offers a 10-20% performance gain on a GeForce 8800 GTX over its DirectX 9 performance, while improving the visual experience for gamers (Slide).

The talk ended with the presentation of a Resident Evil 5 trailer, another title based on Capcom's MT Framework, alongside Devil May Cry 4.



Latest Thread Comments (5 total)
Posted by Ether_Snake on Wednesday, 29-Apr-09 04:05:37 UTC
Sorry for bumping this very old article, but with the newly released Lost Planet 2 videos, I looked up motion blur articles and stumbled on this one.

The part I'm really interested in is the way they do depth of field as described in the article. While this was implemented on PC for DX10 supported graphics cards, I am wondering if this might also be on LP2 on 360?

Basically, I'd like to suggest using this at work on a future project, or something similar, but I'm wondering if this is too costly on PS3/360?

And just wondering to make sure I understand; they basically use an alpha hexagonal texture mapped to triangles that are covering the screen (like a plane subdivided into many triangles), and the rendered images are projected onto the proper triangles, (and then everything is added to an accumulation buffer to get the blurred results)?

I'm just fuzzy on how those triangles are laid out, and how the multiple viewport rendering results are applied to them (it sorts which triangle correspond to which depth?).

Thanks for the time! This was quite an interesting article!

Posted by ultragpu on Wednesday, 29-Apr-09 13:31:03 UTC
Quoting Ether_Snake
Sorry for bumping this very old article but with the newly released Lost Planet 2 videos, I looked up motion blur articles and stumbled on this one.

The part I'm really interested in is the way they do depth of field as described in the article. While this was implemented on PC for DX10 supported graphics cards, I am wondering if this might also be on LP2 on 360?

Basically, I'd like to suggest using this at work on a future project, or something similar, but I'm wondering if this is too costly on PS3/360?

And just wondering to make sure I understand; they basically use an alpha hexagonal texture mapped to triangles that are covering the screen (like a plane subdivided into many triangles), and the rendered images are projected onto the proper triangles, (and then everything is added to an accumulation buffer to get the blurred results)?

I'm just fuzzy on how those triangles are laid out, and how the multiple viewport rendering results are applied to them (it sorts which triangle correspond to which depth?).

Thanks for the time! This was quite an interesting article!
It might be slightly off topic, but why would depth of field be too much for consoles, at least Uncharted 2 has demonstrated such an effect.

Posted by Ether_Snake on Thursday, 30-Apr-09 03:36:52 UTC
Look at the article, it shows how they did it. It's not typical depth of field. And from their description, it would only work with DX10+.

Posted by ultragpu on Thursday, 30-Apr-09 05:14:11 UTC
Quoting Ether_Snake
Look at the article, it shows how they did it. It's not typical depth of field. And from their description, it would only work with DX10+.
I did read the article when I replied and last time I heard Cell can replicate some of the DX10 effects too, I think it was Nao32 who gave the insight. But hey, I'm simply speculating at this point.

Posted by Neb on Thursday, 30-Apr-09 10:38:51 UTC
Quoting ultragpu
I did read the article when I replied and last time I heard Cell can replicate some of the DX10 effects too, I think it was Nao32 who gave the insight. But hey, I'm simply speculating at this point.
So can a Pentium 3. But you need to have the processing power to do it in realtime together with other stuff.


