Galaxy GeForce GTX 680 2GB Graphics Card Review

Author: Ryan Shrout
Manufacturer: Galaxy

Conclusions and Final Thoughts


I think I said it best in the original GTX 680 review: "I expected the GeForce GTX 680 to be faster than the Radeon HD 7970 from AMD but I honestly didn't expect it to perform THIS WELL right out of the gate."  The same obviously holds true for the Galaxy version of the GTX 680, since both cards are built on the same reference design.  "In my testing, the GTX 680 2GB card was able to beat AMD's flagship single-GPU card in Battlefield 3, Skyrim, DiRT 3 and Batman: Arkham City. The two offerings were pretty much a toss-up in Deus Ex and the AMD card came out ahead of the NVIDIA card in Metro 2033."


There have been questions and concerns about the effect of GPU Boost technology and the variability of performance from one GTX 680 to the next.  While I have only tested four GPUs (two reference cards, the Galaxy, and one other for a pending review), the results have thus far fallen well within our standard error rates of 1-3%.  That doesn't mean we won't see bigger variances at the chip-to-chip level, just that with our small sample size we haven't seen any indication of it yet. 


There are a host of new features in Kepler, starting with support for more than two displays. Yes, AMD cards can still drive six outputs if you can find one of those elusive DisplayPort hubs, but I think the four outputs NVIDIA has included are probably enough for most users. I still wish NVIDIA hadn't been 2+ years behind on this feature - but we have it now, so NVIDIA fans can stop being pestered by the AMD camp.

Interestingly, Galaxy was the only company to push forward on display outputs with Fermi, as we saw in our looks at the various Galaxy MDT graphics cards (GTX 570 and GT 520).  While we gave the company lots of credit for stepping outside the box and helping to push the NVIDIA market ahead, all of that work is likely lost now that NVIDIA has introduced the ability to natively support four displays.  


GPU Boost is the other big contributor to the success of Kepler, as it enables the GPU clock to scale so the card performs optimally in EACH game. In my testing the feature works, and works rather well, yet is still flexible enough to let gamers overclock their new graphics cards with some easy-to-manipulate software.  I am a big fan of the new Adaptive VSync and Frame Rate Target options as well, because they give users a level of flexibility we haven't seen before. The eternal debate of vsync on versus vsync off hasn't been put completely to rest, but with the capability to scale smoothly below 60 FPS now an option on the GTX 680, I can see enabling it more and more in my own gaming.

Pricing and Availability

As of this writing, there are NO GTX 680s in stock - but you should check for yourself as you are reading this.  The Galaxy model we are reviewing is also one of those missing in action, and it carries a current price tag of $509.  You can also check Best Buy and Amazon for stock...but good luck.


NVIDIA claims that "demand is just insane right now" and that users looking to grab a GTX 680 should set stock notifications at online retailers.  And while that is a grand idea for those of you with frequent access to your email and the Internet, it stings for just about everyone who wants to buy a product that isn't available.  Are we seeing limitations of the 28nm process technology from TSMC, yield issues on Kepler, or a combination of both?  Chances are there are a lot of reasons for the lack of cards on the market, but I will give NVIDIA credit for sticking to the $499 price tag when they could have obviously charged $599 and still sold through their stock.  


Final Thoughts

The Galaxy GeForce GTX 680 2GB is exactly what gamers are looking for today - a card based on the new Kepler architecture that includes all of the features introduced with the GPU, including GPU Boost, Frame Rate Target and Adaptive VSync.  If you can find one of these cards for sale, you had best grab it quickly, as it likely won't be there long. 


April 17, 2012 | 02:47 PM - Posted by jewie27 (not verified)

Time and time again, reviewers fail to test every feature of Nvidia GPUs. When will they ever benchmark 3D Vision performance? One of Nvidia's high-end features. Sigh...

I was hoping to see Battlefield 3 benchmarks for 3D Vision in single and SLI mode.

April 17, 2012 | 03:03 PM - Posted by Ryan Shrout

We have another piece looking at that, including Surround, coming up next!

April 18, 2012 | 03:47 PM - Posted by jewie27 (not verified)

awesome, thanks :)

April 18, 2012 | 11:26 PM - Posted by Dane Clark (not verified)

Sweet. As a 3D Vision owner I have to say, there are not many sites that measure features like this, and it is always great to see sites covering 3D performance.

April 17, 2012 | 05:12 PM - Posted by rick (not verified)

In every game in this review there is a picture of in-game options at 1920x1200 resolution, but in the test graphs it's 1920x1080.
What's up with that?

April 18, 2012 | 10:59 AM - Posted by Ryan Shrout

Game testing screenshots were taken before we moved from 1920x1080 testing. Nothing being hidden.


April 18, 2012 | 03:07 AM - Posted by cc (not verified)

still used GPU-Z 0.5.9? lol

April 18, 2012 | 10:57 AM - Posted by Ryan Shrout

Yeah, testing was done a while back. :)

April 18, 2012 | 09:49 AM - Posted by edr (not verified)

Nice review Ryan. Wish I could afford a better card. I'll stick with my 560ti for another year at least :).

April 18, 2012 | 05:05 PM - Posted by mtrush (not verified)

I'm not impressed with the performance the GTX 680 has over the GTX 580.
The minimum-FPS performance doesn't warrant buying a GTX 680!
I'd only buy it for the power consumption.

April 18, 2012 | 07:26 PM - Posted by David (not verified)

Right, which is what they are going for. Why does somebody need more than a 580? No game maxes that card out yet, so why put something on the market which is an even more unnecessary leap in performance? Why spend a lot more money to see a number on the screen hop up a few pegs?

April 20, 2012 | 04:59 PM - Posted by AParsh335i (not verified)

What about when the GTX 680 hits $399.99, offers 3+ monitor support out of the box, consumes less power, runs cooler, and outperforms your GTX 580? Also, why did you think buying the GTX 580 was a good deal in the first place? I know I didn't. I bought an HD 6950 2GB and unlocked it to an HD 6970 for $300 or less, and it gets close to GTX 580 performance.
The main thing to remember (or learn if you didn't know) is that the GTX 680 is based on the GK104 GPU, not the GK110 GPU. The GK110 is a bigger, badder, faster GPU but is not available right this second.

Here's an assumption - if you were Nvidia and you had something maybe 25-40%+ faster than the GTX 680, but the GTX 680 is already about 10-20%+ faster than the competition at a lower price, would you release the big one? If you want to make money, which is what businesses do, you would want to hold back the badass one and make huge margins on the GK104-based card as long as you can.

Check out the comparison of GK104 Vs GK110.

April 20, 2012 | 05:07 PM - Posted by AParsh335i (not verified)

The 580 does not max every game out, plain and simple. There are a decent number of gamers doing 3D, three monitors, or both at the same time. I'm a big fan of Eyefinity/Nvidia Surround myself, and it was a bummer that the GTX 580 didn't support it without two cards. The GTX 680 fixes that and adds more memory - perfect.

April 20, 2012 | 04:51 PM - Posted by AParsh335i (not verified)

Where iz thee SLI performance resultz?!?

April 26, 2012 | 02:24 AM - Posted by train_wreck (not verified)

and i was thinking i was kicking ass with my watercooled GTX 580. some things never change.

