G92 Details – The NVIDIA 8800 GT

The NVIDIA 8800 GT looks to be the best deal in a graphics card in many years, offering performance on par with $400 GPUs for about half the price! Come see how the XFX model we reviewed stands up to the competition.
Introduction

I was one of the people who would have said consoles had the edge in the gaming world if you’d asked me at any point before the middle of October this year.  With games like Halo 3, Guitar Hero, Gears of War and more available only on one of the three major consoles, it’s hard to argue with a list like that.  But fear not, PC gamers!  The month of November is here to save us all.

From now until the end of the year, hardware and software releases are going to combine to make the PC once again the most impressive gaming platform available.  Just reading about games like Crysis, Call of Duty 4, Unreal Tournament 3, the Supreme Commander and FEAR expansions, Alan Wake and even the new Hellgate: London makes our mouths water with anticipation.  And to go along with those titles, hardware companies like NVIDIA and AMD have new GPUs set to break price/performance records, while both Intel and AMD are launching new processors that promise a great gaming experience.

The first iteration of these hardware changes comes today with the NVIDIA GeForce 8800 GT, previously known as the G92 chip. 

G92 GPU Details

The G92 architecture is mostly the same as the G80 architecture that we first saw revealed last November with the 8800 GTX launch. But there are some key changes that differentiate it from other GPUs in the G80 family, like those used on the 8800 GTS and 8600 series of cards.

The magic all starts with a new process technology from TSMC; the 8800 GT GPU is built on a 65nm process and is made up of 754 million transistors on a 325 mm² die.  For those of you with really good memories, you’ll note that the original G80 GPU was built on a 90nm process and was made up of 680 million transistors.  The fact that the new G92 architecture packs MORE transistors onto a smaller die leads us down all kinds of interesting lines of thought, but we’ll stick to the specifications on the GPU as we know it right now.
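For a rough sense of what the 65nm shrink buys in transistor density, here is a quick back-of-the-envelope sketch.  Note that the roughly 480 mm² die area we plug in for the original G80 is our own estimate, not an official NVIDIA figure.

```python
# Rough transistor-density comparison between G80 and G92.
# The G80 die area (~480 mm^2) is an estimate, not an official figure.
g80_transistors, g80_die_mm2 = 680e6, 480   # 90nm process, estimated die area
g92_transistors, g92_die_mm2 = 754e6, 325   # 65nm process

g80_density = g80_transistors / g80_die_mm2 / 1e6   # million transistors per mm^2
g92_density = g92_transistors / g92_die_mm2 / 1e6

print(f"G80 density: {g80_density:.2f} M/mm^2")           # ~1.42
print(f"G92 density: {g92_density:.2f} M/mm^2")           # ~2.32
print(f"Density gain: {g92_density / g80_density:.2f}x")  # ~1.64x
```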

The G92 core – not quite as big as G80

The new G92 sports 112 shader processors, just 16 short of the number in G80, and they have been slightly upgraded for better texture operations in the same way the 8600/8400 series of GPUs were.  The 8800 GTS GPU based on the original G80 had only 96 shader processors, so we can see right away that the 8800 GT is potentially a very powerful GPU.  The G92 shaders are clocked at 1500 MHz while the GTS shaders came clocked at 1200 MHz, giving the new 8800 GT yet another potential statistical advantage.

Even the main core clock on the 8800 GT is faster: 600 MHz versus 500 MHz on the GTS and 575 MHz on the 8800 GTX flagship.  The much-maligned 8800 Ultra was clocked at 612 MHz, so the 8800 GT isn’t far behind that either.
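If you want to put a rough number on that shader advantage, a quick sketch is below.  It assumes NVIDIA’s usual counting of three floating-point operations (MAD + MUL) per shader per clock, so treat it as theoretical peak throughput rather than real-world performance.

```python
# Back-of-the-envelope peak shader throughput, assuming 3 FP ops (MAD + MUL)
# per shader processor per clock.
def shader_gflops(num_sps, shader_clock_mhz, flops_per_clock=3):
    return num_sps * shader_clock_mhz * flops_per_clock / 1000.0

gt_8800  = shader_gflops(112, 1500)   # new 8800 GT
gts_8800 = shader_gflops(96, 1200)    # original G80-based 8800 GTS

print(f"8800 GT : {gt_8800:.1f} GFLOPS")        # 504.0
print(f"8800 GTS: {gts_8800:.1f} GFLOPS")       # 345.6
print(f"Advantage: {gt_8800 / gts_8800:.2f}x")  # ~1.46x
```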

The memory configuration has changed somewhat, however; the 8800 GT will be running on a 256-bit memory bus whereas both the 8800 Ultra and GTX cards used 384-bit buses and the GTS cards used a 320-bit bus.  The GDDR3 memory, of which there will be 512MB, is clocked at 900 MHz on the reference design of the 8800 GT.
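That narrower bus is the one place the 8800 GT clearly gives ground on paper.  Here is a quick peak-bandwidth calculation; the 900 MHz memory clock we use for the 8800 GTX comparison is our assumption based on that card’s reference specs.

```python
# Peak memory bandwidth: GDDR3 is double-data-rate, so a 900 MHz clock
# moves data at an effective 1800 MT/s.
def peak_bandwidth_gb_s(bus_width_bits, memory_clock_mhz):
    effective_mts = memory_clock_mhz * 2        # two transfers per clock
    return bus_width_bits / 8 * effective_mts / 1000.0

print(f"8800 GT  (256-bit @ 900 MHz): {peak_bandwidth_gb_s(256, 900):.1f} GB/s")  # 57.6
print(f"8800 GTX (384-bit @ 900 MHz): {peak_bandwidth_gb_s(384, 900):.1f} GB/s")  # 86.4
```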

Specification comparison chart of current NVIDIA and AMD GPUs

Of course, if numbers and multiplication were all that was necessary to evaluate hardware, then a calculator could do it.  However, as you can see from our specification comparison chart above, the numbers comparing the various NVIDIA and AMD hardware leave a lot of ambiguity, especially when you start comparing shader counts from one architecture to another.

At the NVIDIA editor’s day last month, it was mentioned that some other minor improvements were made in the G92 architecture, including updated SLI AA support (which we are told will be brought back to Vista drivers soon) to improve performance at resolutions like 1600×1200 and 1920×1200 on cards like the 8800 GT.  For AA in general, improved efficiency at the high end of resolutions will help reduce the performance hit from enabling 4xAA at 1920×1200 and higher.  NVIDIA engineers also told us that instruction issue and the macro architecture have been improved, and that power management capabilities have been improved for better idle power consumption.

New Features

Besides the architectural changes we mentioned, there are some new features in the 8800 GT chip as well.  First, users will be glad to hear that the updated VP2 hardware decode engine, which was put into the 8600 series of cards but was too late to be included in the 8800 GTX/Ultra/GTS cards, is in fact included in full on the 8800 GT.  That means the full HD decode process can be offloaded onto the GPU, as I detailed in my initial look at the PureVideo HD technology.

PCI Express 2.0 makes its way to the NVIDIA 8800 GT as well though the benefits of it are going to be somewhat limited in my view if only because PCIe 1.0 was never really a bottleneck.  Moving to 2.0 doubles the amount of bandwidth available to the GPU and is an evolutionary step that is both backwards and forwards compatible — you’ll be able to use this PCIe 2.0 card with your PCIe 1.0 motherboard and vice versa.
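To put rough numbers on that doubling (our own quick sketch, not figures from NVIDIA’s materials), here is the per-direction bandwidth of a x16 slot under both generations, accounting for the 8b/10b encoding both use on the wire.

```python
# Per-direction bandwidth of a x16 slot.  PCIe 1.x signals at 2.5 GT/s per
# lane and PCIe 2.0 at 5.0 GT/s; both use 8b/10b encoding (8 data bits per
# 10 bits transferred).
def pcie_x16_gb_s(gt_per_s, lanes=16):
    return gt_per_s * lanes * (8 / 10) / 8   # GT/s -> usable GB/s

print(f"PCIe 1.x x16: {pcie_x16_gb_s(2.5):.1f} GB/s per direction")  # 4.0
print(f"PCIe 2.0 x16: {pcie_x16_gb_s(5.0):.1f} GB/s per direction")  # 8.0
```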

Another addition is the display chip, which has moved from a separate chip on the card’s PCB to being integrated on the GPU die itself.  The G92 now natively supports two dual-link DVI outputs with HDCP.  Interestingly, NVIDIA also said that HDMI output was going to be supported with a DVI-to-HDMI adapter, in much the same way AMD’s 2000 series of GPUs does it, though the first round of cards wasn’t going to have it integrated yet.  If that is important to you, it might be prudent to wait it out.

What’s the secret?

The fact that the transistor count increased from the G80 core to the G92 core, even accounting for the inclusion of the VP2 processor and the integrated display chip, leads me to believe that the G92 is simply a cut-down version of something faster coming down the line.  It’s entirely possible that NVIDIA’s new chip really has 128 or more SPs on it that are simply disabled in order to sell a cheaper product, much the same way the 8800 GTS GPU was a cut-down version of the G80 core used in the 8800 GTX and Ultra cards.

We’ll just have to wait and see what secrets NVIDIA has yet to divulge.  For now, let’s see how impressive the new architecture really is.