NVIDIA Comments on Ray Tracing and Rasterization Debate
Another take on ray tracing
The debate over real-time ray tracing for games is not new by any stretch, but it has seen a strong renewal of interest thanks to some articles from Daniel Pohl that we published recently. The first, published in December of 2006, was titled "Ray Tracing and Gaming - Quake 4: Ray Traced Project". To quote:
"…technology on desktop computers. The number of CPU cores is increasing and special purpose ray tracing-hardware-prototypes (http://www.saarcor.de/) show impressive results in speed improvement. It is still a long way to playing computer games in graphics like the Lord of the Ring-movies, but we are getting closer to it."
In September of 2007 I wrote an article called "Rendering Games with Raytracing Will Revolutionize Graphics" that looked at ray tracing performance improvements since Daniel's first article, the role Intel and its Larrabee project are playing in ray traced graphics, and other benefits ray tracing might offer, such as scaling down to other devices.
Sample of ray tracing work by Intel - from our "Ray Tracing and Gaming - One Year Later" article
Daniel's second article, "Ray Tracing and Gaming - One Year Later", was released in January of this year and offers up examples from Intel of how the benefits of ray tracing can directly affect gamers and game engines. If you haven't read all of these incredibly interesting pieces, I suggest you do so.
There are many people who don't see ray tracing as the holy grail of gaming graphics, however. A corporation like NVIDIA, which has a vested interest in graphics beyond the scale of any other organization today, has to take a more pragmatic look at rendering technologies, including both rasterization and ray tracing. Unlike Intel, NVIDIA has decades of high-end rasterization research behind it and sees the future of 3D graphics remaining with that technology rather than switching to something new like ray tracing.
I was recently able to spend some time with Dr. David Kirk, NVIDIA's Chief Scientist, and ask him some questions regarding the rasterization and ray tracing debate. I think you'll find his answers quite interesting in light of all the hype and excitement about Intel's developments.
PC Perspective: Ray tracing obviously has some advantages when it comes to high levels of geometry in a scene, but what are you doing to offset that advantage in traditional raster renderers?
High poly count screenshot from Oblivion - courtesy of WaitingforOblivion.com
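For a bit of context on why ray tracing gets credit for handling heavy geometry: with an acceleration structure such as a bounding volume hierarchy (BVH), each ray finds its hit by visiting roughly log2(N) nodes, while a straightforward rasterizer has to process every triangle submitted to it. The back-of-the-envelope model below is my own illustration of that scaling, not a measurement of any real renderer:

    #include <cmath>
    #include <cstdio>

    // Back-of-the-envelope cost model, not a renderer. With a BVH, each
    // ray visits roughly log2(N) nodes regardless of scene size; a naive
    // rasterizer processes every triangle it is fed.
    int main() {
        for (long long tris = 1000; tris <= 1000000000LL; tris *= 1000) {
            double bvhSteps = std::log2((double)tris);  // approx. node visits per ray
            printf("%10lld triangles: ~%2.0f BVH steps per ray, "
                   "%lld triangles through the raster pipeline\n",
                   tris, bvhSteps, tris);
        }
        return 0;
    }

At a billion triangles the per-ray cost in this idealized model has only tripled over the thousand-triangle case, which is the heart of the geometry argument for ray tracing.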
PC Perspective: Antialiasing is somewhat problematic for ray tracing, since the rays being cast either hit something or they don't, and post-processing effects might be problematic as a result. Are there other limitations of ray tracing that you are aware of?
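To make the antialiasing problem in the question concrete: because each ray returns a binary hit-or-miss answer, a ray tracer smooths an edge by casting several jittered rays per pixel and averaging them, which multiplies the ray count. Here is a minimal CUDA sketch of that supersampling pattern; the one-disc "scene" and every name in it are hypothetical stand-ins I made up for illustration:

    #include <cstdio>
    #include <cuda_runtime.h>

    // Hit-or-miss "scene": a single disc, so the example is self-contained.
    // A real tracer would intersect actual geometry here.
    __device__ float traceRay(float u, float v) {
        float dx = u - 0.5f, dy = v - 0.5f;
        return (dx * dx + dy * dy < 0.16f) ? 1.0f : 0.0f;  // binary result
    }

    // One thread per pixel: average several jittered rays per pixel.
    // With samples = 1 each pixel is a pure 0/1 sample and edges alias;
    // smoothing them means casting (and paying for) proportionally more rays.
    __global__ void renderAA(float* img, int w, int h, int samples) {
        int px = blockIdx.x * blockDim.x + threadIdx.x;
        int py = blockIdx.y * blockDim.y + threadIdx.y;
        if (px >= w || py >= h) return;

        float sum = 0.0f;
        for (int s = 0; s < samples; ++s) {
            // Fixed 2x2 jitter grid; real tracers use stratified/random samples.
            float jx = 0.25f + 0.5f * (s % 2);
            float jy = 0.25f + 0.5f * ((s / 2) % 2);
            sum += traceRay((px + jx) / w, (py + jy) / h);
        }
        img[py * w + px] = sum / samples;  // fractional edge coverage
    }

    int main() {
        const int w = 8, h = 8;
        float *dImg, img[w * h];
        cudaMalloc(&dImg, sizeof(img));

        renderAA<<<dim3(1, 1), dim3(8, 8)>>>(dImg, w, h, 4);
        cudaMemcpy(img, dImg, sizeof(img), cudaMemcpyDeviceToHost);
        cudaFree(dImg);

        // Edge pixels now carry fractional coverage instead of pure 0 or 1.
        printf("pixel (6,6) coverage: %.2f\n", img[6 * w + 6]);  // expect 0.25
        return 0;
    }

Four samples per pixel means four times the primary rays, which is why antialiasing tends to be far more expensive for a ray tracer than for a rasterizer with hardware multisampling.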
PC Perspective: While the benefits of ray tracing do look compelling, why have NVIDIA and AMD/ATI concentrated on traditional rasterization architectures rather than moving to ray tracing?
PC Perspective: Does either ray tracing or rasterization hold an advantage for typical pixel shader effects? Or do many of these effects work identically regardless of the renderer?
Rasterization shaders at work in Crysis - courtesy Crysis-game.ru
PC Perspective: Do you see a convergence between ray tracing and rasterization? Or do the disadvantages of the two rendering methods make such a combination unpalatable?
PC Perspective: In terms of die size, which approach makes more efficient use of silicon?
PC Perspective: Because GPUs are becoming more general processing devices, do you think the next generation (or the generation after that) will be able to handle some ray tracing routines? Would there even be a need for them to handle those routines?
Early GPU ray tracing work - courtesy of graphics.cs.uni-sb.de
PC Perspective: What are your thoughts on the ability of ray tracing to scale easily across multiple platforms by decreasing the number of rays (and thus the resolution) to adapt the application to different hardware, such as mobile gaming devices or cell phones? How does this compare to how rasterization engines scale?
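The scaling argument in that question is easy to make concrete: if frame cost is roughly proportional to rays cast, a device's sustainable resolution falls straight out of its ray throughput. The sketch below works through that arithmetic; the throughput figures are invented for illustration and are not benchmarks of any real hardware:

    #include <cmath>
    #include <cstdio>

    // If frame cost ~ rays per frame, a device's sustainable resolution
    // follows directly from its ray throughput. The numbers below are
    // made up solely to illustrate the scaling; they are not measurements.
    struct Device { const char* name; double raysPerSec; };

    int main() {
        const Device devices[] = {
            {"desktop multi-core CPU", 100e6},
            {"handheld console",        10e6},
            {"cell phone",               1e6},
        };
        const double fps = 30.0;     // target frame rate
        const int raysPerPixel = 1;  // primary rays only; AA/shadows multiply this

        for (const Device& d : devices) {
            double pixels = d.raysPerSec / (fps * raysPerPixel);
            int width  = (int)std::sqrt(pixels * 4.0 / 3.0);  // assume 4:3
            int height = (int)(pixels / width);
            printf("%-24s ~%4dx%-4d at %.0f fps\n", d.name, width, height, fps);
        }
        return 0;
    }

Halving the resolution quarters the ray count, so in principle the same engine spans a desktop and a phone by adjusting one parameter, whereas a rasterization engine typically scales by swapping assets and shader complexity as well.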
Dr. Kirk obviously has a different opinion on the future of ray tracing than Intel's researchers do - and that is to be expected. NVIDIA (and AMD/ATI, for that matter) has a financial stake in keeping its computer graphics research practical, and thus its decisions are going to be based less on "pie-in-the-sky" graphics theory and more on what real-world applications of any such technology can actually accomplish.
NVIDIA obviously disagrees with the statements Intel has made that combining rasterization and ray tracing renderers is not feasible. Dr. Kirk says the two will be able to work together to provide better image quality without dramatically hindering performance, though the details of such implementations aren't yet available.
Also interesting is Dr. Kirk's mention of the CUDA platform, which allows programmers to write applications for GPUs in much the same way they write them for CPUs. I am very curious to see how a completely independent party would judge the performance of each option: Intel's CPU-based ray tracers of today or NVIDIA's CUDA-based ray tracing option. Which would be faster and easier to work with? These are questions we can hope to see answered as ray tracing evolves and more interested parties at universities and game developers get involved.
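To give a flavor of what "writing for the GPU the way you write for the CPU" means in practice, here is a minimal CUDA sketch of the one-thread-per-primary-ray pattern a GPU ray tracer builds on. The one-sphere scene, camera numbers, and function names are my own stand-ins, not anything from NVIDIA's ray tracing work:

    #include <cstdio>
    #include <cuda_runtime.h>

    // Ray/sphere intersection written as ordinary C code; __device__ simply
    // targets it at the GPU. The same body could compile for a CPU tracer.
    __device__ bool hitSphere(float ox, float oy, float oz,
                              float dx, float dy, float dz, float r) {
        // Sphere centered at the origin: solve |o + t*d|^2 = r^2 for t,
        // assuming d is normalized (so the quadratic's 'a' term is 1).
        float b = 2.0f * (ox * dx + oy * dy + oz * dz);
        float c = ox * ox + oy * oy + oz * oz - r * r;
        float disc = b * b - 4.0f * c;
        return disc >= 0.0f && (-b - sqrtf(disc)) > 0.0f;  // nearest hit in front
    }

    // One thread per pixel, one primary ray per thread.
    __global__ void primaryRays(unsigned char* img, int w, int h) {
        int px = blockIdx.x * blockDim.x + threadIdx.x;
        int py = blockIdx.y * blockDim.y + threadIdx.y;
        if (px >= w || py >= h) return;

        // Camera at (0,0,-3) looking toward +z; film plane spans [-1,1]^2.
        float dx = 2.0f * px / w - 1.0f;
        float dy = 2.0f * py / h - 1.0f;
        float dz = 1.5f;
        float len = sqrtf(dx * dx + dy * dy + dz * dz);
        bool hit = hitSphere(0.0f, 0.0f, -3.0f,
                             dx / len, dy / len, dz / len, 1.0f);
        img[py * w + px] = hit ? 255 : 0;
    }

    int main() {
        const int w = 64, h = 64;
        unsigned char *dImg, img[w * h];
        cudaMalloc(&dImg, sizeof(img));

        // The launch reads like a function call with a grid size attached --
        // the "program the GPU like a CPU" part of NVIDIA's pitch.
        dim3 block(16, 16), grid(w / 16, h / 16);
        primaryRays<<<grid, block>>>(dImg, w, h);

        cudaMemcpy(img, dImg, sizeof(img), cudaMemcpyDeviceToHost);
        cudaFree(dImg);
        printf("center pixel: %d\n", img[(h / 2) * w + w / 2]);  // expect 255
        return 0;
    }

The intersection routine is plain C; only the kernel launch syntax betrays that it runs on the GPU, which is exactly the accessibility point being made about CUDA.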
It is also apparent that Intel has a LOT of work ahead of it if it is actually going to try to convince game developers to adopt ray tracing as their primary rendering option. (We should clarify that we don't actually know that is happening, but all signs point in that direction.) I recently wrote a news piece about Intel's purchases of Project Offset, a game engine developer, and Havok, a physics API, and theorized that Intel might be building its own game engine to give away or sell to developers in order to push ray tracing adoption.
If that's the case, NVIDIA and AMD will have a battle on their hands, but luckily it is one they are used to fighting. They just aren't used to fighting an 800 lb gorilla like Intel in the discrete GPU market, and that might require a very different strategy. NVIDIA's stance is that rasterization is not inherently worse for gaming than ray tracing, if only because of all the years of work and research that have gone into it up to today. The company seems willing to support ray tracing on its cards in terms of programmability with CUDA and let developers decide which option will be right for the industry.
My thanks go out to Dr. David Kirk for taking the time to answer our questions and amuse us with theories on the future of gaming and graphics. Thanks also go to our own Josh Walrath for stepping up with detailed questions for our interview. Stay tuned to PC Perspective for more information and research in the world of ray tracing and rasterization!
There is a good discussion going on in our graphics card forum here; jump in and give us your thoughts on this whole rasterization and ray tracing debate!