Ray tracing to kill off the GPU? Intel thinks yes

Posted by Intel.com Research

A blog entry at Intel.com Research has popped up which discusses the possibility of the GPU being killed off in favor of the multi-core CPU.  The reason?  The CPU is much better suited to ray tracing, that graphical holy grail which takes the perspective of the eye and traces each light ray back to its source.  Ray tracing places a very demanding compute load on a system: real-time demos of Quake IV have been shown, but an 8-core system was required to reach decent frame rates.  Still, Intel and AMD have promised eight-core chips within a few years.  Double the price of today's quad-core chips and you quickly realize that such a machine, even today, would not cost much more than a high-end GPU.  In fact, a card that provided dedicated ray tracing outside of the main CPU would probably sell like hotcakes to gamers.
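To give a rough idea of what "tracing the light ray back to its source" looks like in code, here is a minimal, self-contained C++ sketch of our own (an illustrative example, not Intel's demo code): one sphere, one light, and a ray fired from the eye through each pixel, shaded by how directly the surface faces the light.

```cpp
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
static Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec norm(Vec a) { double l = std::sqrt(dot(a, a)); return {a.x / l, a.y / l, a.z / l}; }

// Distance along the ray to the nearest hit on a sphere, or -1 if it misses.
static double hitSphere(Vec origin, Vec dir, Vec center, double radius) {
    Vec oc = sub(origin, center);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * c;          // dir is unit length, so a == 1
    if (disc < 0.0) return -1.0;
    return (-b - std::sqrt(disc)) / 2.0;
}

int main() {
    const int W = 40, H = 20;
    Vec eye    = {0, 0, 0};
    Vec sphere = {0, 0, -3};                // one sphere, 3 units in front of the eye
    Vec light  = norm({1, 1, 1});           // direction toward a single light

    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // Build the primary ray from the eye through this pixel.
            Vec dir = norm({(x - W / 2.0) / W, (H / 2.0 - y) / H, -1.0});
            double t = hitSphere(eye, dir, sphere, 1.0);
            if (t < 0.0) { std::putchar(' '); continue; }
            // Shade by how directly the surface at the hit point faces the light.
            Vec hit = {eye.x + t * dir.x, eye.y + t * dir.y, eye.z + t * dir.z};
            Vec n = norm(sub(hit, sphere));
            double lum = dot(n, light);
            std::putchar(lum > 0.6 ? '#' : lum > 0.2 ? '+' : '.');
        }
        std::putchar('\n');
    }
    return 0;
}
```

Even this toy version makes the cost obvious: every pixel needs its own ray-scene intersection, which is exactly the kind of branchy, per-pixel work that scales with CPU cores.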

Here you can see the significant difference between GPU rendering through rasterization and ray tracing the same scene.  The rasterization process has no way to accommodate the true surfaces, reflections, textures, etc., that ray tracing can.  Look at the detail in the teapot, the spoon's reflection on the cup, the spoon itself and even the table.  It's all much more realistic with ray tracing, and it's something CPUs would be better suited to than GPUs.  BTW, for a better quality image go to the link above.
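The reason those reflections come "for free" is that at every hit point a ray tracer can compute the mirror direction and trace it again as a secondary ray, which is how the spoon ends up in the cup; a rasterizer has no such per-pixel secondary ray.  Here is a small, self-contained sketch of just that reflection step (the vectors are made-up example values, not anything from the Intel article):

```cpp
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Mirror an incoming direction about a surface normal (both unit length):
// r = d - 2(d.n)n
static Vec reflect(Vec d, Vec n) {
    double k = 2.0 * dot(d, n);
    return {d.x - k * n.x, d.y - k * n.y, d.z - k * n.z};
}

int main() {
    // Primary ray hits a horizontal tabletop (normal pointing straight up).
    Vec incoming = {0.0, -std::sqrt(0.5), -std::sqrt(0.5)};
    Vec normal   = {0.0, 1.0, 0.0};
    Vec bounce   = reflect(incoming, normal);
    // In a full tracer this bounced ray would be traced again to find what is
    // reflected (e.g. the spoon), and that color blended into the tabletop.
    std::printf("secondary ray direction: (%.3f, %.3f, %.3f)\n",
                bounce.x, bounce.y, bounce.z);
    return 0;
}
```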

So, will the GPU be killed off in favor of the CPU, provided that CPU sits on a plug-in card which is also capable of basic graphics?  TG Daily believes such a card could work as an alternative, or as an add-on installed alongside a real GPU card, but we don't believe it will ever serve as a replacement.

What's your take?