
NVIDIA GeForce GTX 660 2GB Review - Kepler GK106 at $229

Author: Ryan Shrout
Manufacturer: NVIDIA

Analysis and Conclusions

Performance

We may sound like a broken record, but the new GeForce GTX 660 2GB card is another great graphics card release from NVIDIA using the Kepler architecture.  Based on GK106, this is the first "real" card built on a GPU other than GK104 (we don't count the GT 640, sorry), and it shows a solid set of performance results in our games.  While we did test at 1680x1050, 1920x1080 and 2560x1600, the most important resolution is 1080p, as that is where I imagine most users with this class of card will be targeting their displays. 

At 1080p, the GTX 660 was able to outperform the Radeon HD 7870 GHz Edition in Battlefield 3 and Batman: Arkham City while matching it in performance on Metro 2033.  In our other games, Skyrim, DiRT 3 and Deus Ex, the GTX 660 fell between the HD 7870 GHz Edition and HD 7850 cards. 


Considering that the GTX 660 2GB card will carry an MSRP about $30 less than the HD 7870 and about $30 more than the Radeon HD 7850, the new GTX 660 comes out looking quite good.  It is not quite the runaway win that the early GTX 600-series cards were, but as the cost of these graphics cards decreases, the margins get tighter and the overlap between them increases dramatically.

Interestingly, the vanilla GTX 660 competes with the GTX 660 Ti quite well given the $70 price difference.  In fact, in Skyrim, Batman and Metro, the gap is small enough to be negligible.  If you are going to be gaming at 1080p only for the foreseeable future, it just makes sense to go with the GTX 660 over the GTX 660 Ti.

Pricing and Availability

The GeForce GTX 660 2GB cards from most of NVIDIA's partners will be available today starting at $229 and going up for custom models.

As I just mentioned, the GTX 660 makes a good case for itself against ALL of the other cards in this list, including the GTX 660 Ti.  The $70 savings could go toward a larger SSD or anything else; though if you were going to buy Borderlands 2 anyway, you might as well get the Ti version that comes with a free promotional copy...


The Radeon HD 7870 GHz Edition comes close to being the better choice over the GTX 660, but just can't quite cut it with its 50/50 performance results and more expensive price tag.  That said, with the newly added copy of Sleeping Dogs that is included with the HD 7870 GHz Edition (but not the HD 7850), you could definitely argue that the value is there. 

Final Thoughts

What can we say?  The GK106-based GeForce GTX 660 2GB card from NVIDIA is another winning combination of performance, features and efficiency.  At $229, this new graphics card launches at the exact same MSRP as the GTX 460 1GB did, and it plays a similar role as a disrupter.  We have been waiting for NVIDIA's Kepler to hit lower price points, and the GTX 660 does it in impressive fashion. 

September 13, 2012 | 01:55 PM - Posted by Thedarklord

This is the true sweet spot gamers have been looking for; even NVIDIA in its own documents compares it to other greats (9800 GT, etc).

I also really do like that the review includes legacy cards; it gives a really nice comparison, and not just against what is currently available on the market.

September 13, 2012 | 02:54 PM - Posted by Tim Verry

Yup, I noticed that comparison as well. I don't remember exactly how the 9800 GT compares to the 8800 GT (were they exactly the same, just a rebrand?), but the latter was a card that sold like hotcakes :). 

September 13, 2012 | 05:45 PM - Posted by Thedarklord

Yeah, the 9800 GT was a rebranded 8800 GT, most of the time all that was done was a slight clock increase, later versions included a die shrink (65nm -> 55nm).

The 8800 GT and its bigger brother, the 8800 GTS 512 MB (which was rebranded as the 9800 GTX), really should never have existed. I think NVIDIA launched them just in time for the holiday season that year; they were only on the market for about 3 months before they were rebranded. >.<

But nevertheless, the 8800 GT/9800 GT was an insanely popular gaming card, and rightfully so, I think.

September 13, 2012 | 07:05 PM - Posted by aparsh335i (not verified)

I loved my 8800GTS 512! NVIDIA did some really confusing stuff back then! Remember this?
8800GTS 640MB, 8800GTS 320MB, 8800GTS 512MB (G92).
The 512MB was way faster than the other two.
Then the 8800GTS 512MB was rebranded as the 9800GTX.
Then rebranded as the 9800GTX+ (was it 45nm then? Can't remember).
Then rebranded as the GTS 250.
What a trip!

September 13, 2012 | 08:27 PM - Posted by Thedarklord

Haha, yes I do remember the fun with GeForce 8 branding/marketing back then.

Yup, there is a lot I could go into about those cards (I've been a computer hardware geek since the GeForce 3 days, and video cards are my fav parts, lol).

Yes the 9800 GTX was rebranded to be the 9800 GTX+ which was the die shrink from (65nm -> 55nm).

45nm was skipped for video cards/GPUs mainly due to area density/manufacturing timing issues.

So that same G92 core went from 8800 GTS 512 MB -> 9800 GTX -> 9800 GTX+ -> GTS 250! How time flies ^.^

September 13, 2012 | 04:27 PM - Posted by Anonymous (not verified)

Thanks for the trip down memory lane with the 9800GT. Identical to the earlier 8800GT except for a BIOS flash for that fancy 9 Series number.

September 13, 2012 | 05:25 PM - Posted by PapaDragon

Best GTX 660 review period, thank you for your hard effort.

September 13, 2012 | 08:36 PM - Posted by Ryan Shrout

Thanks for your feedback everyone.  Let me know if you have any questions or suggestions!

September 13, 2012 | 11:29 PM - Posted by Thedarklord

At the risk of asking you guys to do more work ^.^, in these video card reviews, can you include a few more video cards for comparison's sake?

Rather than only the respective card's immediate competition or siblings?

Say in this case, adding a GTX 670, GTX 680, HD 7970, HD 7950, GTX 580, GTX 570, etc. Seeing how cards compete across and inside generations is really nice/helpful. And, unless the testbed has changed, you could almost input the benchmarks from previous reviews (though a "quick" re-run of the cards with updated drivers would be great).

I'd be happy to help ;), lol.

September 13, 2012 | 10:03 PM - Posted by your reviews are fucked (not verified)

No way in hell would Skyrim bring a GPU to 27 fps minimum at 1680.

fix your fucking test rigs damnit.

September 14, 2012 | 07:21 AM - Posted by Ryan Shrout

It happens during a scene transition - going from a town to the overworld. 

September 15, 2012 | 02:42 AM - Posted by your reviews are fucked (not verified)

then your "Minimum" result is fundamentally WRONG.

September 15, 2012 | 05:33 PM - Posted by Ryan Shrout

They aren't wrong, they represent the system as a whole anyway. 

September 15, 2012 | 08:25 AM - Posted by Anonymous (not verified)

Are you guys using an older version of Skyrim for the tests?

Skyrim, until patch 1.4 (I think), was much worse in terms of CPU performance; since all cards are hitting a 27 fps minimum, this can only be caused by something else.

September 15, 2012 | 05:32 PM - Posted by Ryan Shrout

Latest Steam updates...

September 15, 2012 | 09:29 AM - Posted by KNN Foddr (not verified)

The transition scenes appear to be capped at 30fps, and aren't part of gameplay. Including them in a benchmark for a video card doesn't seem very illuminating.

September 20, 2012 | 09:33 PM - Posted by Buck-O (not verified)

It will be refreshing when people learn how FRAPS works, and why there are dips in the charting as a result of cut scenes. It will do it on any system. And while... no, it isn't particularly illuminating as benchmark data... it is a necessity that all cards are shown on the same playthrough, with the same cut scenes, and the same corresponding dips. Please get off your high horse.

One thing that I will take issue with, however, is the anti-aliasing settings. I am discouraged that NVIDIA reviews consistently limit the AA to a paltry 4x. And usually it's MSAA, not FSAA. It is pretty common knowledge that high levels of AA on NVIDIA cards, particularly at higher resolutions, are a weak point of their design, and typically show a steeper performance drop-off as the AA and resolution go higher than ATI cards would under similar conditions.

For the sake of fair and balanced testing, games should be run at their true maximum settings, as a typical user would select from the start; not at cherry-picked AA levels that favor one vendor's hardware and greatly skew expected real-world performance.

If you are going to bother with the new WQHD resolutions, the least you could do is run 8x FSAA.
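The minimum-FPS debate in the comments above comes down to how a frame-time log is reduced to a single number. As a minimal sketch (with invented frame times, not PC Perspective's actual capture data), this shows how a brief 30 fps-capped transition can set the reported "minimum" even when nearly every frame runs at 60 fps, and how a percentile-based floor discounts such one-off dips:

```python
# Sketch of reducing a FRAPS-style frame-time log to a minimum-FPS figure.
# The frame times below are invented for illustration: a steady ~60 fps run
# with two ~30 fps frames during a capped scene transition.
frame_times_ms = [16.7] * 200 + [33.4] * 2 + [16.7] * 200

fps_per_frame = [1000.0 / t for t in frame_times_ms]

# Classic "minimum FPS": a single outlier frame sets the reported floor.
min_fps = min(fps_per_frame)

# A percentile floor (here the 1st percentile) ignores isolated outliers.
ranked = sorted(fps_per_frame)
p1_fps = ranked[int(len(ranked) * 0.01)]

print(round(min_fps), round(p1_fps))  # prints 30 60
```

Both cards tested on the same playthrough will show the same dip, so the comparison stays fair either way; the question is only which summary statistic best reflects gameplay.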
