Conclusion
Ray tracing isn't the holy grail of real-time rendering: like rasterisation, it has serious problems of its own to conquer before it can offer acceptable real-time image quality. Despite what much of the current press seems to suggest, though, the two techniques aren't mutually exclusive in the first place. A hybrid approach will probably offer the best of both worlds: many ray tracers already replace primary rays with rasterisation and only compute secondary rays through traditional ray tracing. The ability to spawn rays in the GPU's shader core would solve many hard problems that rasterisation engines have to face. For a large portion of the common realistic scenes we want to render, there is certainly no major image-quality advantage to moving to a full ray tracer.
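To make the hybrid idea concrete, here is a minimal toy sketch of the primary-raster / secondary-trace split. Everything in it is hypothetical: the "rasterisation" pass is stood in for by analytically filling a G-buffer with the position and normal of a flat floor, and the secondary pass spawns one reflection ray per pixel against a single sphere. A real engine would of course rasterise arbitrary geometry on the GPU and trace against an acceleration structure.

```python
import math

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def rasterize_primary(width, height):
    # Stand-in for the raster pass: it writes a G-buffer holding the
    # visible surface's position and normal per pixel. In this toy the
    # primary-visible "scene" is just a flat floor plane at y = 0.
    gbuffer = []
    for py in range(height):
        row = []
        for px in range(width):
            pos = (px - width / 2, 0.0, py + 1.0)  # point on the floor
            normal = (0.0, 1.0, 0.0)
            row.append((pos, normal))
        gbuffer.append(row)
    return gbuffer

def reflect(d, n):
    # Mirror direction d about normal n: d - 2(d.n)n
    k = 2.0 * dot(d, n)
    return (d[0] - k*n[0], d[1] - k*n[1], d[2] - k*n[2])

def trace_secondary(origin, direction, center, radius):
    # Classic ray/sphere intersection for the single secondary bounce.
    oc = (origin[0]-center[0], origin[1]-center[1], origin[2]-center[2])
    b = dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b*b - c
    if disc < 0.0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 0.0 else None

def render(width=8, height=8):
    # Hypothetical scene: a mirror-like floor under one sphere.
    sphere_center, sphere_radius = (0.0, 5.0, 3.0), 1.0
    view_dir = (0.0, -1.0, 0.0)  # toy top-down camera
    image = []
    for row in rasterize_primary(width, height):
        out_row = []
        for pos, normal in row:
            # Spawn the secondary ray from the rasterised hit point.
            r = reflect(view_dir, normal)
            hit = trace_secondary(pos, r, sphere_center, sphere_radius)
            out_row.append("sphere" if hit is not None else "sky")
        image.append(out_row)
    return image
```

Pixels whose reflection rays strike the sphere shade differently from those that miss; only the second pass ever walks the scene as a ray tracer would, which is exactly the division of labour the hybrid argument rests on.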
In my opinion, research in real-time ray tracing is chasing the wrong thing. While it's nice to have a system capable of handling simple refraction and reflection, that doesn't address the bigger problem: solving Kajiya's rendering equation. Ray tracing is likely to occupy a position in real-time rendering similar to the one it holds in the off-line world: a useful technique for a few effects, but not the basis of the main renderer.
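For reference, the rendering equation in question (in its standard hemispherical form, with the symbols as usually written) states that outgoing radiance is emitted radiance plus all incoming radiance weighted by the surface's BRDF:

$$
L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, d\omega_i
$$

The recursive integral over all incoming directions $\omega_i$ is what makes the problem hard: tracing a handful of mirror-reflection rays samples only a vanishingly small part of it.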
Instead, solving the indirect lighting problem is the real holy grail of real-time rendering. Of course, at this point, no one even has a serious framework to do so that doesn't require massive amounts of precomputed data, so there's still a lot of work to be done regardless of approach.