Larrabee team member speaks on rasterization for the future

Subject: Graphics Cards | April 25, 2008 - 12:00 PM |

We have talked many, many times about Larrabee over the last several years and we consider ourselves to be pretty knowledgeable on the subject.  It's always interesting to see new comments on the topic though, and it turns out one of Larrabee's developers at Intel posted a bit on his blog on the subject of rasterization on the new GPU architecture.

Let's page through his comments:

I've been trying to keep quiet, but I need to get one thing very clear. Larrabee is going to render DirectX and OpenGL games through rasterisation, not through raytracing.


I'm not sure how the message got so muddled. I think in our quest to just keep our heads down and get on with it, we've possibly been a bit too quiet. So some comments about exciting new rendering tech got misinterpreted as our one and only plan. Larrabee's tech enables many fascinating possibilities, and we're excited by all of them. But this current confusion has got a lot of developers worried about us breaking their games and forcing them to change the way they do things. That's not the case, and I apologise for any panic.

I'd have to agree with him - a large portion of Intel's push into the graphics market with Larrabee seemed to be centered on ray tracing, and only in recent months has the discussion of rasterization really taken a front seat.  We first heard it from John Carmack directly - he was the first to really verify that Larrabee would rasterize, and in fact had to.

There's only one way to render the huge range of DirectX and OpenGL games out there, and that's the way they were designed to run - the conventional rasterisation pipeline. That has been the goal for the Larrabee team from day one, and it continues to be the primary focus of the hardware and software teams. We take triangles, we rasterise them, we do Z tests, we do pixel shading, we write to a framebuffer. There's plenty of room within that pipeline for innovation to last us for many years to come. It's done very nicely for over a quarter of a century, and there's plenty of life in the old thing yet.


Larrabee up close

Obviously this is something we have come to realize ourselves - rasterization isn't going anywhere.  There isn't any mention of how well the Larrabee architecture actually performs on this type of pipeline, but the fact that he claims the team has focused on it from day one is at least a good sign.
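For readers less familiar with what "take triangles, rasterise them, do Z tests, do pixel shading, write to a framebuffer" actually means in practice, here is a minimal, hypothetical software-rasterizer sketch of those same stages - this is purely illustrative and is in no way Intel's code; the edge-function approach and all names are our own assumptions:

```python
# Hypothetical sketch of the conventional rasterization pipeline:
# triangles in -> coverage test -> Z test -> "shade" -> framebuffer out.
# Screen convention: y grows downward; vertices wound counter-clockwise
# in screen space so the edge functions come out positive inside.

WIDTH, HEIGHT = 8, 8

def edge(ax, ay, bx, by, px, py):
    # Signed area of (a->b, a->p); positive when p is on the inside
    # of the directed edge a->b for our winding convention.
    return (px - ax) * (by - ay) - (py - ay) * (bx - ax)

def rasterize(triangles, width=WIDTH, height=HEIGHT):
    # Each triangle is ((v0, v1, v2), color), each vertex (x, y, z).
    framebuffer = [[(0, 0, 0)] * width for _ in range(height)]
    zbuffer = [[float("inf")] * width for _ in range(height)]
    for (v0, v1, v2), color in triangles:
        for y in range(height):
            for x in range(width):
                px, py = x + 0.5, y + 0.5  # sample at the pixel centre
                w0 = edge(v1[0], v1[1], v2[0], v2[1], px, py)
                w1 = edge(v2[0], v2[1], v0[0], v0[1], px, py)
                w2 = edge(v0[0], v0[1], v1[0], v1[1], px, py)
                if w0 >= 0 and w1 >= 0 and w2 >= 0:  # rasterise: inside?
                    area = w0 + w1 + w2
                    if area == 0:
                        continue  # degenerate triangle, no coverage
                    # Interpolate depth barycentrically, then Z test.
                    z = (w0 * v0[2] + w1 * v1[2] + w2 * v2[2]) / area
                    if z < zbuffer[y][x]:
                        zbuffer[y][x] = z
                        framebuffer[y][x] = color  # "pixel shading"
    return framebuffer
```

A real pipeline adds vertex transforms, clipping, perspective-correct interpolation and programmable shaders, but the skeleton - coverage, Z test, write - is the one the quote describes.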

There's no doubt Larrabee is going to be the world's most awesome raytracer. It's going to be the world's most awesome chip at a lot of heavy computing tasks - that's the joy of total programmability combined with serious number-crunching power. But that is cool stuff for those that want to play with wacky tech. We're not assuming everybody in the world will do this, we're not forcing anyone to do so, and we certainly can't just do it behind their backs and expect things to work - that would be absurd. Raytracing on Larrabee is a fascinating research project, it's an exciting new way of thinking about rendering scenes, just like splatting or voxels or any number of neat ideas, but it is absolutely not the focus of Larrabee's primary rendering capabilities, and never has been - not even for a moment.

And here is the crux: while the Larrabee architecture will be GOOD at ray tracing according to this post, Intel will not be asking people to move to that type of rendering anytime soon.  If that's the case, then many of our concerns about ray tracing versus rasterization can be put at ease.  However, we might still be hearing only one side of an internal debate, and the possibility remains that Larrabee will perform poorly enough at rasterization that ray tracing becomes the way to make the product stand out.  Obviously this is something we just don't know yet.

We are totally focussed on making the existing (and future) DX and OGL pipelines go fast using far more conventional methods. When we talk about the rendering pipeline changing beyond what people currently know, we're talking about using something a lot less radical than raytracing. Still very exciting for game developers - stuff they've been asking for for ages that other IHVs have completely failed to deliver on. There's some very exciting changes to the existing graphics pipelines coming up if developers choose to enable the extra quality and consistency that Larrabee can offer. But these are incremental changes, and they will remain completely under game developers' control - if they don't want to use them, we will look like any other fast video card. We would not and could not change the rendering behaviour of the existing APIs.

This is kind of interesting, and if I were to hazard a guess I'd say he is referring to a more robust data-sharing cache structure and nearly complete programmability compared to even the latest G80 or RV670 architectures from NVIDIA and AMD.

We can surely hope to see more of this kind of information from Intel and the team working on Larrabee.  It is important for the media, developers and gamers to see what Intel has in store for PC gaming if we are to be truly excited about, and accepting of, Intel's first entry into discrete graphics in quite some time.

More soon!
