Introduction

Since its launch last November, Sony's latest gaming platform has given early adopters trouble when attempting to play certain titles in 1080i/p. The console did not automatically upscale its video output to the desired resolution; that job fell either to the game software, which had to support these resolutions natively, or to the internal scalers of users' HDTVs. This led many people, developers and owners alike, to question the very existence of scaling hardware in the PlayStation 3.

Worse, it left a sour taste in the mouths of owners of older CRT-based HDTV sets, many of which cannot accept a 720p signal at all and are thus limited to displaying 480i/p and 1080i video signals. If these disgruntled owners wanted to run their games in HD resolution, the solution until now was to hope that developers would release their games with 1080i/p support -- not a walk in the park for the developer -- or, simply, to buy a new HDTV. As one can imagine, the latter was not the most well-received solution in the history of CE devices.

The key words in that last paragraph would be “until now,” because with the latest PlayStation 3 software development kit (SDK) update, Sony Computer Entertainment Inc. (SCEI) has finally exposed part of the built-in hardware scaler to developers.

Will this mean that most, if not all, future games will support output at 1080i/p resolutions? Moreover -- and this is the question that owners of 1080i-only CRT HDTVs crave to see answered -- does this mean that current PS3 games may eventually support the native HD resolution of their televisions? Well, the answer requires some good old-fashioned explanation, so let’s start already!

The issue and its solution until now

For openers, let's talk about the way the PlayStation 3 outputs modern videogames. First, the console renders into a frame buffer -- called the back buffer -- in which the drawing passes are executed. Once those passes are complete, the result is placed in a second frame buffer, the front buffer. The front buffer holds the final result of all the rendering operations that took place in the back buffer, and thus it is the picture that one sees on the screen. This makes the front buffer the optimal point in the display process for a scaling solution to operate.
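
To picture the process, here is a minimal double-buffering sketch. All function names are hypothetical stand-ins rather than actual PS3 SDK calls, and pixels are assumed to be 32-bit RGBA:

```c
#include <stdint.h>

#define WIDTH  1280
#define HEIGHT 720

static uint32_t buf_a[WIDTH * HEIGHT];
static uint32_t buf_b[WIDTH * HEIGHT];

/* Hypothetical stand-ins for the real drawing and display machinery. */
static void render_scene(uint32_t *back)   { (void)back;  /* drawing passes   */ }
static void wait_for_vblank(void)          { /* sync with display refresh     */ }
static void present(const uint32_t *front) { (void)front; /* hand to scan-out */ }

int main(void)
{
    uint32_t *back  = buf_a;   /* where the drawing passes execute */
    uint32_t *front = buf_b;   /* the finished picture on screen   */

    for (int frame = 0; frame < 3; frame++) {
        render_scene(back);
        wait_for_vblank();

        /* Flip: the completed frame becomes the new front buffer. */
        uint32_t *tmp = front;
        front = back;
        back  = tmp;

        present(front);
    }
    return 0;
}
```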

Before the latest SDK update, developers working on PS3 who wished to support 1080i/p in their 720p-native games had very few options available to them. For 1080i, a theoretical possibility (since it’s not supported on PS3) would be “field rendering” -- a technique that consists of software scaling the 1280x720 back buffer into a 1920x540 front buffer that successively holds the odd and even interlaced lines of each frame. The problems with this technique are that it can only output 1080 interlaced (not progressive), that it has image quality issues, and, even more limiting, that it requires the game renderer to refresh at 60Hz (60 frames per second) at all times. If the renderer misses a single frame, the output image quality will be terrible. Clearly, field rendering is not an optimal solution for the video output issues of the PlayStation 3.
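
To make the mechanics concrete, here is a sketch of how a single 1920x540 field could be built from a 1280x720 image. Nearest-neighbour sampling is used purely for brevity, and this is illustrative code, not anything from the SDK:

```c
#include <stdint.h>

#define SRC_W   1280
#define SRC_H   720
#define DST_W   1920
#define FIELD_H 540   /* each interlaced field carries half of the 1080 lines */

/* 'field' is 0 for the even scanlines of the frame, 1 for the odd ones. */
void render_field(const uint32_t *src, uint32_t *dst, int field)
{
    for (int y = 0; y < FIELD_H; y++) {
        /* The line of the notional 1080-line frame this field row represents... */
        int full_y = 2 * y + field;
        /* ...and the 720-line source row it maps to (1080 -> 720 is a 2/3 scale). */
        int src_y = full_y * SRC_H / 1080;
        for (int x = 0; x < DST_W; x++) {
            int src_x = x * SRC_W / DST_W;   /* 1920 -> 1280, also 2/3 */
            dst[y * DST_W + x] = src[src_y * SRC_W + src_x];
        }
    }
}
```

Note that a fresh field has to be delivered every 1/60th of a second; miss one, and the two fields interleaved on screen no longer belong to consecutive frames.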

The only real option available to developers, then, was to upscale the front buffer in software to 1920x1080. This avoids the image quality issues associated with field rendering, and it is capable of outputting 1080p as well as 1080i signals. Additionally, it is perfectly compatible with games that render at 30Hz, or with unstable framerates (yeah, we know that there are a few of those out there.)
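
For illustration, such a whole-frame upscale might look like the following, using bilinear filtering and assuming packed 8-bit RGBA pixels. Consider it a sketch of the technique; on the console the equivalent work would run as a GPU pass, not as C code on a CPU:

```c
#include <stdint.h>

#define SRC_W 1280
#define SRC_H 720
#define DST_W 1920
#define DST_H 1080

/* Blend two RGBA pixels channel by channel; t is a 0..255 blend weight. */
static uint32_t lerp_rgba(uint32_t a, uint32_t b, uint32_t t)
{
    uint32_t out = 0;
    for (int shift = 0; shift < 32; shift += 8) {
        uint32_t ca = (a >> shift) & 0xff;
        uint32_t cb = (b >> shift) & 0xff;
        out |= (((ca * (255 - t) + cb * t) / 255) & 0xff) << shift;
    }
    return out;
}

void upscale_bilinear(const uint32_t *src, uint32_t *dst)
{
    for (int y = 0; y < DST_H; y++) {
        /* Map the destination row into source space, in 24.8 fixed point. */
        uint32_t fy = (uint32_t)y * (SRC_H - 1) * 256 / (DST_H - 1);
        int y0 = fy >> 8, ty = fy & 0xff;
        int y1 = (y0 + 1 < SRC_H) ? y0 + 1 : y0;

        for (int x = 0; x < DST_W; x++) {
            uint32_t fx = (uint32_t)x * (SRC_W - 1) * 256 / (DST_W - 1);
            int x0 = fx >> 8, tx = fx & 0xff;
            int x1 = (x0 + 1 < SRC_W) ? x0 + 1 : x0;

            /* Blend horizontally along the two source rows, then vertically. */
            uint32_t top = lerp_rgba(src[y0 * SRC_W + x0], src[y0 * SRC_W + x1], tx);
            uint32_t bot = lerp_rgba(src[y1 * SRC_W + x0], src[y1 * SRC_W + x1], tx);
            dst[y * DST_W + x] = lerp_rgba(top, bot, ty);
        }
    }
}
```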

So, you may say, why don’t developers just upscale the front buffer to 1080p in software and be done with it? As simple as this sounds, the techniques described above are not free: they have a price in both computational resources and RAM. The computational issue -- in the form of a more-than-doubled fillrate cost -- is a problem on its own, since fillrate is not one of RSX's fortes. But it’s not as stressful as it sounds, since the front buffer is written in a single pass, unlike the back buffer, which requires numerous passes and is therefore far more sensitive to resolution increases. The real deal breaker in this scenario is the larger footprint in RAM occupied by these upscaled buffers. It's a price that not all development houses can pay if they plan on their game fitting into the PlayStation 3’s memory.
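
The fillrate arithmetic is quick to check; this little program just counts the extra pixels the upscale pass has to write:

```c
#include <stdio.h>

int main(void)
{
    const long p720  = 1280L * 720;    /*   921,600 pixels */
    const long p1080 = 1920L * 1080;   /* 2,073,600 pixels */

    /* The upscale pass writes a full 1080p image on top of the
     * 720p front buffer the game already produced. */
    printf("extra pixels written per frame: %ld\n", p1080);
    printf("extra pixels per second at 60 fps: %ld\n", p1080 * 60);
    printf("1080p/720p pixel ratio: %.2fx\n", (double)p1080 / p720);
    return 0;
}
```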

To illustrate how serious the memory problem would be, let's look at the increased requirements that a software upscaler would impose. There are three main resolutions that comprise the common HDTV standards: 720p (1280x720 progressive), 1080i (1920x1080 interlaced) and 1080p (1920x1080 progressive). In order to output at 720p, a PlayStation 3 game needs to render a 1280x720 front buffer; in the case of 1080i/p, it needs a 1920x1080 pixel front buffer. As the quick calculation below shows, these buffers have very different sizes.
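
The sizes here assume a typical 32-bit (4 bytes per pixel) RGBA front buffer format:

```c
#include <stdio.h>

int main(void)
{
    const double bpp = 4.0;              /* bytes per pixel    */
    const double mb  = 1024.0 * 1024.0;  /* bytes per megabyte */

    double mb_720p  = 1280 * 720  * bpp / mb;   /* ~3.5 MB */
    double mb_1080p = 1920 * 1080 * bpp / mb;   /* ~7.9 MB */

    printf("720p  front buffer: %.2f MB\n", mb_720p);
    printf("1080p front buffer: %.2f MB\n", mb_1080p);
    printf("extra RAM required: %.2f MB\n", mb_1080p - mb_720p);
    return 0;
}
```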

As the numbers above illustrate, a 1920x1080 front buffer is two and a quarter times the size of a 1280x720 buffer. With the difference in megabytes measured in the single digits, you could argue that in the grand scheme of things, the difference is not that large. But in reality, when dealing with any closed system, game developers are already trying to shoehorn their games into the available RAM. The PlayStation 3 is no exception, and many developers simply cannot afford to spend more RAM on an upscaled front buffer... and as the games already on the market plainly show, many of them didn’t.

In the best-case scenario, a developer would wait for RSX to reach vblank (the vertical blanking interval -- in other words, the moment when the frame is finished and only the front buffer needs to be resident in memory) and then use RSX to upscale. Developers who want RSX to start rendering the next frame in the back buffer as soon as possible instead have to store their front buffer as a 720p image and upscale it later, when the GPU is available. In that case, at a single point in time, a 720p back buffer, a 720p front buffer, and an upscaled 1080p frame buffer are all resident in the PlayStation 3's video memory.
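
That worst-case residency looks something like this sketch (again with hypothetical function names, and the three buffers statically allocated to make the footprint obvious):

```c
#include <stdint.h>

static uint32_t buf_a[1280 * 720];      /* 720p buffer,  ~3.5 MB */
static uint32_t buf_b[1280 * 720];      /* 720p buffer,  ~3.5 MB */
static uint32_t scanout[1920 * 1080];   /* 1080p buffer, ~7.9 MB */

/* Hypothetical stand-ins for the real rendering and display machinery. */
static void render_scene(uint32_t *back) { (void)back; }
static void gpu_upscale(const uint32_t *src, uint32_t *dst)
{
    (void)src; (void)dst;  /* e.g. the bilinear pass sketched earlier */
}
static void present(const uint32_t *buf) { (void)buf; }

int main(void)
{
    uint32_t *back = buf_a, *front = buf_b;

    for (int frame = 0; frame < 3; frame++) {
        /* Upscale last frame's 720p front buffer to 1080p and show it,
         * leaving RSX free to draw the next frame at 720p right away. */
        gpu_upscale(front, scanout);
        present(scanout);
        render_scene(back);

        /* Flip: the freshly drawn frame becomes the next front buffer. */
        uint32_t *tmp = front; front = back; back = tmp;
    }
    return 0;
}
```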

Fast forward to the day the new SDK update arrived, and with it a new solution for developers.