NVIDIA GeForce GTX 660 2GB Review - Kepler GK106 at $229

Author: Ryan Shrout
Manufacturer: NVIDIA

GPU Testbed - Sandy Bridge-E, X79, Games

We decided it was high time we replaced our somewhat-dated Nehalem-based infrastructure (even though, honestly, it was fast enough) with something a bit more current.  Obviously, that meant going with the new Intel Sandy Bridge-E processor and X79 motherboard. With support for 40 PCI Express lanes and room for three to four full-size graphics cards, the platform makes for a perfect GPU testbed.

Our reviews will be based around the following system:

  • Intel Core i7-3960X CPU
  • ASUS P9X79 Pro motherboard
  • Corsair DDR3-1600 4 x 4GB Vengeance memory
  • 600GB Western Digital VelociRaptor HDD
  • 1200 watt Corsair Professional Series power supply
  • Windows 7 SP1 x64

The ASUS P9X79 Pro

The Intel Core i7-3960X gives us the fastest consumer-level CPU on the market to help eliminate the possibility of any processor-based bottlenecks in our testing (whenever possible).  There are still going to be some games that could use more speed (Skyrim comes to mind), but for our purposes this is as good as it gets without resorting to overclocked settings.  The ASUS P9X79 Pro motherboard has enough space for three dual-slot graphics cards when the time comes for testing 3-Way SLI (and CrossFire), and its 8 DIMM slots leave room to grow beyond our current 16GB of Corsair Vengeance memory.

I chose to stick with the 600GB VelociRaptor hard drive rather than an SSD, as our total installation size with Windows 7 SP1 x64 and 6+ games was already hitting the 115GB range.  Finally, the 1200 watt power supply from Corsair offers more than enough juice for three power-hungry graphics cards while running quietly enough that it won't drastically throw off our noise testing.

Speaking of noise, we are re-introducing our sound level testing thanks to the Extech 407738 Sound Level Meter, which is capable of monitoring decibel ratings as low as 20 dB.  This allows me to accurately report the noise levels generated by the graphics cards that make their way in-house at PC Perspective.

Along with the new hardware configuration comes a host of new games.  For this review we will be using the following benchmarks and games for performance evaluation:

  • Battlefield 3
  • Elder Scrolls V: Skyrim
  • DiRT 3
  • Batman: Arkham City
  • Metro 2033
  • Deus Ex: Human Revolution
  • 3DMark11
  • Unigine Heaven v2.5

This collection of games is current and spans several different genres - first-person role playing, third-person action, racing, first-person shooting, and so on.  3DMark11 and Unigine Heaven give us a way to see how the cards stack up in a more synthetic environment, while the real-world gameplay testing provided by the six games completes the performance picture.


GeForce GTX 660 2GB Reference Specs

For our review we have quite a few interesting comparisons to make.  First, our reference GTX 660 card will go up against the GTX 660 Ti to see how much performance that extra $70 buys.  On the AMD side, the new GK106 GPU will be pitted against the Radeon HD 7870 GHz Edition and the Radeon HD 7850.

Our driver revisions were Catalyst 12.8 for the AMD cards and a new GTX 660-ready 306.23 beta for NVIDIA.

We will, of course, take a look at the two retail cards from EVGA and MSI to see how their overclocked settings affect gaming performance.

And finally, we'll wrap up with a comparison of the GeForce GTX 660 2GB to the GeForce 9800 GT 1GB, the GeForce GTX 460 1GB and the GeForce GTX 560 Ti 1GB.  It's a section of the review that users looking for upgrade information won't want to miss!

September 13, 2012 | 01:55 PM - Posted by Thedarklord

This is the true sweet spot gamers have been looking for; even NVIDIA in its own documents compares it to other greats (9800 GT, etc.).

I also really like that the review includes legacy cards; it gives a really nice comparison rather than just comparing to what is currently available on the market.

September 13, 2012 | 02:54 PM - Posted by Tim Verry

Yup, I noticed that comparison as well. I don't remember exactly how the 9800GT compares to the 8800GT (were they exactly the same, just a rebrand?), but the latter was a card that sold like hotcakes :). 

September 13, 2012 | 05:45 PM - Posted by Thedarklord

Yeah, the 9800 GT was a rebranded 8800 GT; most of the time all that was done was a slight clock increase, and later versions included a die shrink (65nm -> 55nm).

The 8800 GT and its bigger brother, the 8800 GTS 512 MB (which was rebranded to be the 9800 GTX), really should never have been. I think NVIDIA launched them just in time for the holiday season that year; they were only on the market for about 3 months before they were rebranded. >.<

But nevertheless, the 8800 GT/9800 GT was an insanely popular gaming card, and rightfully so I think.

September 13, 2012 | 07:05 PM - Posted by aparsh335i (not verified)

I loved my 8800GTS 512! Nvidia did some really confusing stuff back then! Remember this?
8800GTS 640MB, 8800GTS 320MB, 8800GTS 512MB (G92).
The 512MB was way faster than the other two.
Then the 8800GTS 512MB was rebranded as the 9800GTX (as mentioned above).
Then rebranded as the 9800GTX+ (was it 45nm then? can't remember).
Then rebranded as the GTS 250.
What a trip!

September 13, 2012 | 08:27 PM - Posted by Thedarklord

Haha, yes, I do remember the fun with GeForce 8 branding/marketing back then.

Yup, there is a lot I could go into about those cards (I've been a computer hardware geek since the GeForce 3 days, and video cards are my favorite parts, lol).

Yes, the 9800 GTX was rebranded to be the 9800 GTX+, which was the die shrink (65nm -> 55nm).

45nm was skipped for video cards/GPUs, mainly due to area density and manufacturing timing issues.

So that same G92 core went from 8800 GTS 512 MB -> 9800 GTX -> 9800 GTX+ -> GTS 250! How time flies ^.^

September 13, 2012 | 04:27 PM - Posted by Anonymous (not verified)

Thanks for the trip down memory lane with the 9800GT. Identical to the earlier 8800GT except for a BIOS flash for that fancy 9 Series number.

September 13, 2012 | 05:25 PM - Posted by PapaDragon

Best GTX 660 review, period. Thank you for your hard work.

September 13, 2012 | 08:36 PM - Posted by Ryan Shrout

Thanks for your feedback everyone.  Let me know if you have any questions or suggestions!

September 13, 2012 | 11:29 PM - Posted by Thedarklord

At the risk of asking you guys to do more work ^.^ - in these video card reviews, can you include a few more video cards for comparison's sake?

Rather than only each card's immediate competition or siblings?

Say, in this case, adding a GTX 670, GTX 680, HD 7970, HD 7950, GTX 580, GTX 570, etc. Seeing how cards compete across and within generations is really nice/helpful. And, unless the testbed has changed, you could almost import the benchmarks from previous reviews (though a "quick" re-run of the cards with updated drivers would be great).

I'd be happy to help ;), lol.

September 13, 2012 | 10:03 PM - Posted by your reviews are fucked (not verified)

No way in hell would Skyrim bring a GPU to a 27 fps minimum at 1680.

Fix your fucking test rigs, damnit.

September 14, 2012 | 07:21 AM - Posted by Ryan Shrout

It happens during a scene transition - going from a town to the overworld. 

September 15, 2012 | 02:42 AM - Posted by your reviews are fucked (not verified)

then your "Minimum" result is fundamentally WRONG.

September 15, 2012 | 05:33 PM - Posted by Ryan Shrout

They aren't wrong; they represent the system as a whole anyway. 

September 15, 2012 | 08:25 AM - Posted by Anonymous (not verified)

Are you guys using an older version of Skyrim for the tests?

Skyrim until patch 1.4 (I think) was much worse in terms of CPU performance; since all cards are hitting a 27 fps minimum, this can only be caused by something else.

September 15, 2012 | 05:32 PM - Posted by Ryan Shrout

Latest Steam updates...

September 15, 2012 | 09:29 AM - Posted by KNN Foddr (not verified)

The transition scenes appear to be capped at 30fps and aren't part of gameplay. Including them in a benchmark for a video card doesn't seem very illuminating.

September 20, 2012 | 09:33 PM - Posted by Buck-O (not verified)

It will be refreshing when people learn how FRAPS works, and why there are dips in the charting as a result of cut scenes. It will do it on any system. And while... no, it isn't particularly illuminating as benchmark data... it is a necessity that all cards are shown on the same playthrough, with the same cut scenes, and the same corresponding dips. Please get off your high horse.

One thing that I will take issue with, however, is the anti-aliasing settings. I am discouraged that nVidia reviews consistently limit the AA to a paltry 4x, and usually it's MSAA, not FSAA. It is pretty common knowledge that high levels of AA on nVidia cards, particularly at higher resolutions, are a weak point of their design, and they typically show a steeper performance drop-off as the AA and resolution go higher than ATi cards would under similar conditions.

For the sake of fair and balanced testing, the true maximum settings should be run in the game, as a typical user would select from the start - not cherry-picked AA levels that favor one vendor's hardware and greatly skew expected real-world performance.

If you are going to bother with the new WQHD resolutions, the least you could do is run 8x FSAA.
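
To put the minimum-FPS debate above in concrete terms: a reported minimum is only as meaningful as the frames it is computed over, and a 30 fps-capped scene transition will drag it down on any card. Below is a minimal sketch (an assumed workflow, not this review's actual benchmarking pipeline) of how a FRAPS-style frametimes log could be reduced to average and minimum FPS, with an optional window excluded to show how much a capped transition affects the minimum. The file name and the window boundaries are hypothetical.

# Minimal sketch: reduce a FRAPS-style "frametimes" CSV to average and minimum FPS.
# Assumed workflow, not this review's actual pipeline; file name and transition
# window below are hypothetical.
import csv

def load_frame_durations(path):
    """Return per-frame durations in ms from a FRAPS frametimes CSV
    (columns: Frame, Time (ms), where Time is cumulative since capture start)."""
    stamps = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                          # skip the header row
        for row in reader:
            stamps.append(float(row[1]))      # cumulative milliseconds
    # Convert cumulative timestamps into per-frame durations.
    return [b - a for a, b in zip(stamps, stamps[1:])]

def fps_stats(durations_ms, exclude=None):
    """Average FPS and worst single-frame FPS. 'exclude' is an optional
    (start_s, end_s) window to drop, e.g. a 30 fps-capped scene transition.
    A per-second minimum (as FRAPS itself logs) is less sensitive to a single
    slow frame but shows the same effect."""
    kept, elapsed_s = [], 0.0
    for d in durations_ms:
        start = elapsed_s
        elapsed_s += d / 1000.0
        if exclude and exclude[0] <= start <= exclude[1]:
            continue                          # drop frames inside the window
        kept.append(d)
    avg_fps = 1000.0 * len(kept) / sum(kept)
    min_fps = 1000.0 / max(kept)              # slowest single frame
    return avg_fps, min_fps

if __name__ == "__main__":
    frames = load_frame_durations("skyrim_1680x1050_frametimes.csv")  # hypothetical log
    print("whole run:           avg %.1f fps, min %.1f fps" % fps_stats(frames))
    print("transition excluded: avg %.1f fps, min %.1f fps"
          % fps_stats(frames, exclude=(42.0, 45.0)))                  # hypothetical window

Whether that window should be excluded is exactly the disagreement in the thread: keeping it means every card is measured over the identical playthrough, cut scenes and all, while dropping it reports only gameplay frames.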
