Let's start by looking at the influence of this filtering on smooth color gradients. To test the impact, I created a test image with a smooth color gradient running from red to black. I then used ordered dithering (as used by most 3D accelerators) to reduce the color depth to 16 bits. This dithered image is then upsampled by various filters: the first is a simple 2 by 2 filter, the next two are 3 by 3 filters (with different parameters). A rough code sketch of the dithering step follows the images below. So let's look at the results:

[Images, from top to bottom: the original gradient, the dithered version, and the three filtered results.]

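Before going on, here is a rough sketch in C of the ordered-dithering step used to create the images above. It is only an illustration of the general technique, not the exact algorithm of any particular 3D accelerator: I assume the standard 4 by 4 Bayer threshold matrix and a 5-bit channel (as used for red in 16-bit color), and the function names are my own.

#include <stdint.h>

/* Classic 4 by 4 Bayer threshold matrix, values 0..15. */
static const int bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

/* Reduce one 8-bit channel to 'bits' bits with ordered dithering and
   expand the result back to 8 bits so it can be displayed. */
static uint8_t dither_channel(uint8_t value, int bits, int x, int y)
{
    int levels = (1 << bits) - 1;          /* 31 levels for a 5-bit channel  */
    double step = 255.0 / levels;          /* size of one quantization step  */
    /* Bayer threshold mapped to the range -0.5 .. +0.5 of one step. */
    double noise = (bayer4[y & 3][x & 3] + 0.5) / 16.0 - 0.5;
    double v = value + noise * step;
    if (v < 0.0)   v = 0.0;
    if (v > 255.0) v = 255.0;
    int q = (int)(v / step + 0.5);         /* nearest quantization level     */
    return (uint8_t)(q * 255 / levels);
}

/* Build the test image: a horizontal red-to-black gradient (width > 1),
   dithered down to 5 bits in the red channel. */
void make_dithered_gradient(uint8_t *red, int width, int height)
{
    for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++)
            red[y * width + x] =
                dither_channel((uint8_t)(255 - 255 * x / (width - 1)), 5, x, y);
}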
Now all these images might look identical at first, but that is probably because you are viewing them at a very high resolution. In games you are often limited to 640x480 or 800x600, and at those resolutions the dither patterns are much more visible. So to really see the difference, here are some zoomed-in versions of the same images:

[The same images again, zoomed in so the dither patterns become visible.]

You can clearly see the typical dithering pattern in the second normal image from the top. The 3 images below that one use a filter that tries to bring the 16-bit dithered image back to an output closer to the original 24-bit input. As you can see, the dithering is reduced and more shades of red are created, but the dither patterns aren't removed completely. So it does indeed seem true that a higher effective color depth is achieved by this method, but I can't tell whether the bottom 3 images are truly 22-bit, 21-bit or 19-bit.
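To show where those extra shades come from, here is a minimal sketch of the reconstruction side: a plain 2 by 2 box filter over the 8-bit channel produced by the dithering sketch above. The real filter kernels (and the 3 by 3 variants) are a detail of the hardware and are not reproduced here, but the principle is the same: averaging neighbouring pixels that alternate between two adjacent 5-bit levels produces in-between shades that a plain 16-bit image cannot hold.

#include <stdint.h>

/* A minimal 2 by 2 box filter: each output pixel is the rounded average of
   the pixel and its right, lower and lower-right neighbours (clamped at the
   image edges).  Where the dither pattern alternates between two adjacent
   5-bit levels, the average falls between them, which is what creates the
   impression of a higher color depth. */
void filter2x2(const uint8_t *src, uint8_t *dst, int width, int height)
{
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int x1 = (x + 1 < width)  ? x + 1 : x;
            int y1 = (y + 1 < height) ? y + 1 : y;
            int sum = src[y  * width + x ] + src[y  * width + x1]
                    + src[y1 * width + x ] + src[y1 * width + x1];
            dst[y * width + x] = (uint8_t)((sum + 2) / 4);  /* rounded mean */
        }
    }
}

At best, averaging four samples of a 5-bit channel gives quarter-step precision, i.e. roughly two extra bits per channel, which is about where a figure like 22 bits for a filtered 5-6-5 image comes from; how many of those bits actually survive in practice is exactly what the images above can't settle.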