Crytek's Cevat Yerli Speaks on Rasterization and Ray Tracing
More questions and closing thoughts
PCPER: How well do you think current programmable GPU technology can handle ray tracing? Do you think dedicated ray tracing logic is going to be required for this problem?
The trend is towards more generic, programmable, massively parallel solutions, so I don't think we'll require dedicated ray tracing hardware.
PCPER: Do you see the possibility of combining rasterization and ray tracing in future rendering engines?
A hybrid solution of rasterization and ray casting is most likely the way to go, though I think there is at least one more generation with almost pure rasterization; very clearly, any proposed graphics hardware architecture will need to perform well at pure rasterization.
PCPER: If you have considered any of Intel's upcoming Larrabee hardware, do you see a future for something that is currently described as a pseudo-x86, 80-core processor?
parallelism. There are performance advantages to keeping much of the current rendering pipeline in hardware, so it will be interesting to see whether a completely programmable solution will compete in the near future. If it can, there are some interesting possibilities.
PCPER: Do you expect changes to DX or OpenGL that might include ray tracing? Or do you see an entirely new API as required for ray tracing to catch on for mainstream graphics?
Instead, we'll see APIs for general-purpose computation on GPU-like hardware, which can be used for ray tracing.
PCPER: Have you attempted any simple ray tracing routines on current console hardware, namely the Xbox 360 and PS3? Which would be the more attractive platform for ray tracing: the PS3 with its Cell processor and seven SPEs, or the triple-core PowerPC on the Xbox 360?
[Image: Crysis, 2007; courtesy gamespot.com]
rasterization, for a real-world case. Hybrid solutions, yes; one can afford this to some degree.
PCPER: What are you personally comfortable with? Pixels that "look good enough" or are "truly accurate"?
can be seen, pixels that look good are pixels that are good. So "looks good enough" pixels are just fine, considering the intensity and interactivity of games.
PCPER: Intel buys Havok and Project Offset? What are your thoughts on what Intel might be doing with these acquisitions?
[Image: Crysis, 2007; courtesy gamespot.com]
that show the public how they want developers to push the boundaries of real-time graphics through great practical examples. As we develop our own physics and graphics engines within CryEngine, these developments don't directly affect us or our licensees.
Cevat has given us some more food for thought with his answers to our ray tracing questions. On the positive side for ray tracing development, he seems confident that if ray tracing catches on as a general rendering method for games, developers would devise algorithms and optimizations that improve its performance and efficiency dramatically, just as they have done for rasterization over the last 20 years. Anti-aliasing is one such example: raster engines have developed significant efficiency improvements there, and ray tracing would likely do the same over time.
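To make the anti-aliasing point concrete, here is a minimal sketch of one classic efficiency trick available to ray tracers: adaptive supersampling, which spends extra rays only where neighboring samples disagree. The scene and function names are hypothetical illustrations, not anything from CryEngine or a real renderer.

```python
# Toy "scene": a hard-edged disc, the worst case for aliasing.
def trace(x, y):
    return 1.0 if x * x + y * y < 1.0 else 0.0

def adaptive_sample(x0, y0, x1, y1, depth=0, max_depth=4):
    """Adaptively supersample a pixel footprint: sample the four
    corners, and subdivide only where they disagree. Uniform regions
    cost four rays; only pixels crossing an edge pay for more."""
    corners = [trace(x0, y0), trace(x1, y0), trace(x0, y1), trace(x1, y1)]
    if depth >= max_depth or max(corners) - min(corners) < 0.05:
        return sum(corners) / 4.0
    mx, my = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    quads = [(x0, y0, mx, my), (mx, y0, x1, my),
             (x0, my, mx, y1), (mx, my, x1, y1)]
    return sum(adaptive_sample(*q, depth + 1, max_depth) for q in quads) / 4.0
```

A pixel fully inside or outside the disc resolves with four rays, while a pixel straddling the edge recursively refines toward a smooth coverage value; this is the kind of algorithmic maturity rasterization accumulated over two decades.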
Cevat does see a future for ray tracing, but more in the form of a mixed rendering design that probably won't be implemented for several years: five or more. By his count, for the next three years or so rasterization will continue to be the dominant rendering method for games, and thus any potential graphics hardware for this market will need to be compatible with rasterization and perform well at it. Cevat thinks that in a span of three to five years we might begin to see some implementation of ray tracing in games, but not pure, classical ray tracing. Instead we will likely see the hybrid rendering techniques we have discussed several times in previous interviews: ray tracing for shadows, certain reflective objects, and so on.
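The hybrid idea can be illustrated with a minimal sketch, under entirely hypothetical names and a toy scene: primary visibility comes from a conventional rasterizer pass, and only the shadow term is resolved by casting a ray from the shaded surface point toward the light.

```python
import math

# Hypothetical minimal scene: spheres as (center, radius), one point light.
SPHERES = [((0.0, 1.0, 0.0), 1.0)]
LIGHT = (5.0, 5.0, 0.0)

def sphere_hit_t(origin, direction, center, radius):
    """Nearest intersection t along the ray origin + t*direction,
    or None on a miss (origin assumed outside the sphere)."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # unit-length direction, so a == 1
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def in_shadow(point):
    """Shadow pass of a hybrid renderer: the surface point comes from
    the raster pass; one ray toward the light decides occlusion."""
    to_light = tuple(l - p for l, p in zip(LIGHT, point))
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = tuple(v / dist for v in to_light)
    for center, radius in SPHERES:
        t = sphere_hit_t(point, direction, center, radius)
        if t is not None and t < dist:  # occluder between point and light
            return True
    return False
```

A ground point behind the sphere relative to the light reports shadowed, while a point with a clear line to the light reports lit; the appeal of the hybrid is that only these secondary queries pay ray tracing's cost.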
The good news for NVIDIA and AMD is that, with the increasing programmability of their GPU designs and further progression down that path, Cevat thinks GPU hardware will likely be able to handle the kinds of ray tracing that game designers would actually implement. Yes, Intel's Larrabee design is going to be completely programmable, but the necessary flexibility might be matched by upcoming GPU designs from NVIDIA and AMD, limiting whatever advantage Larrabee might have. It's also possible, as Cevat notes, that the partially fixed-function way in which current-generation GPUs execute enables more efficient graphics rendering, and that the completely programmable design Intel is pursuing might not compete in terms of performance. But that same programmability could also open up a lot of new design ideas and new rendering techniques.
And the fact remains that pretty much everyone we talk to outside of Intel is confident that rasterization is going to have at least one more dedicated generation in gaming. That means that if Intel intends to compete in the graphics world in the next two to three years, it will have to take on NVIDIA and AMD/ATI on current GPU terms: rasterization, DirectX and OpenGL. By Intel's own admission it will be compatible with current rasterization models (the company said exactly that at the recent IDF show in Shanghai), but how well it competes will only really be answered when the hardware is available to developers and reviewers.
Many people seem to be calling for NVIDIA's and AMD's graphics departments to be beheaded once Intel enters the graphics arena, but the truth of the matter is that Intel has a lot to prove before developers and gamers will take its word on faith. The debacle that has been Intel's integrated graphics, though the work of a very different design team, has definitely soured a lot of people's view of Intel and graphics. Just look at how the Microsoft Vista-capable issue is panning out, or how developers blame Intel for PC gaming's current downturn. To take Intel just at its word, that Larrabee is something completely different and will work and perform as claimed, seems naive. Is Intel capable of turning out a great graphics technology? Absolutely: they are Intel, the largest semiconductor design company in the world, with the engineers and finances to do just about anything. But there have been plenty of other expensive and advanced "failures" out there too, and some pretty hefty ones: the i740 graphics chip, the Pentium 4, the original Itanium.
But Intel has an uphill battle; I am eager to follow it, document it and see what the future holds for ALL of these players in the PC landscape.
I want to personally thank Cevat Yerli, Doug Binks and Zyad Tikanouine at Crytek for taking the time to talk with me and answer these questions.
Please join us in the forums to discuss this incredibly interesting information!
More Reading on Gaming and Ray Tracing:
- John Carmack on id Tech 6, Ray Tracing, Consoles, Physics and more
- NVIDIA Comments on Ray Tracing and Rasterization Debate
- Ray Tracing and Gaming - One Year Later
- Rendering Games with Raytracing Will Revolutionize Graphics
- Ray Tracing and Gaming - Quake 4: Ray Traced Project
- Intel buys Project Offset, makers of the Offset engine
- Intel demonstrates ray tracing on ultra-mobile PCs
- Playstation 3 Runs Real-time Ray tracing
- His future's so bright ... he's gotta ray trace shade