Galaxy GeForce GTX 680 2GB Graphics Card Review

Author: Ryan Shrout
Manufacturer: Galaxy

Retail Ready

When the NVIDIA GeForce GTX 680 launched in March we were incredibly impressed with the performance and technology the GPU was able to offer while also being power efficient.  Fast forward nearly a month and we are still having problems finding the card in stock - a HUGE negative for the perception of the card and the company at this point.

Still, NVIDIA and its partners promise us that more cards will soon be on shelves, so we continue to look at the various configurations and prepare articles and reviews for you.  Today we are taking a look at the Galaxy GeForce GTX 680 2GB card - the company's most basic model, based on the reference design.

If you haven't done all the proper reading about the GeForce GTX 680 and the Kepler GPU, you should definitely check out my article from March that goes into a lot more detail on that subject before diving into our review of the Galaxy card.

The Card, In Pictures


The Galaxy GTX 680 is essentially identical to the reference design with the addition of some branding along the front and top of the card.  The card is still a dual-slot design, still requires a pair of 6-pin power connections and still uses a fan that is very quiet compared to the competition from AMD.



The back of the card doesn't reveal anything out of the ordinary.


A pair of 6-pin power connections provides all the power the GK104 chip needs, and we see the new stacked connector layout NVIDIA created for the GTX 680.
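As a rough sanity check on that power budget (my back-of-the-envelope numbers, not Galaxy's): the PCI Express x16 slot can supply up to 75 watts and each 6-pin connector is rated for another 75 watts, giving the card roughly 225 watts to work with - comfortably above the GTX 680's 195 watt TDP, and the reason no 8-pin connector is required here.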


If you are an eager gamer you can run not just this GTX 680 but as many as four of them in a single system thanks to the robust SLI support provided by these dual connectors.


The display output configuration includes a pair of dual-link DVI connections, a full-size HDMI and a full-size DisplayPort connection.  Keep in mind that with the GTX 680 you can utilize all four outputs simultaneously, so a single card can drive three displays for Surround gaming (plus a fourth accessory display) without the need for any kind of active adapter.

April 17, 2012 | 11:47 AM - Posted by jewie27 (not verified)

Time and time again, reviewers fail to test every feature of Nvidia GPUs. When will they ever benchmark 3D Vision performance? One of Nvidia's high-end features. Sigh...

I was hoping to see Battlefield 3 benchmarks for 3D Vision in single and SLI mode.

April 17, 2012 | 12:03 PM - Posted by Ryan Shrout

We have another piece looking at that, including Surround, coming up next!

April 18, 2012 | 12:47 PM - Posted by jewie27 (not verified)

awesome, thanks :)

April 18, 2012 | 08:26 PM - Posted by Dane Clark (not verified)

Sweet. As a 3D Vision owner I have to say, there are not many sites that measure features like this, and it is always great to see sites covering 3D performance.

April 17, 2012 | 02:12 PM - Posted by rick (not verified)

In every game in this review there is a picture of the in-game options at 1920x1200 resolution, but in the test graphs it's 1920x1080.
What's up with that?

April 18, 2012 | 07:59 AM - Posted by Ryan Shrout

Game testing screenshots were taken before we moved to 1920x1080 testing. Nothing being hidden.

:D

April 18, 2012 | 12:07 AM - Posted by cc (not verified)

still used GPU-Z 0.5.9? lol

http://www.techpowerup.com/downloads/2120/TechPowerUp_GPU-Z_v0.6.0.html

April 18, 2012 | 07:57 AM - Posted by Ryan Shrout

Yeah, testing was done a while back. :)

April 18, 2012 | 06:49 AM - Posted by edr (not verified)

Nice review Ryan. Wish I could afford a better card. I'll stick with my 560ti for another year at least :).

April 18, 2012 | 02:05 PM - Posted by mtrush (not verified)

I'm not impressed with the performance advantage the GTX 680 has over the GTX 580.
The minimum-FPS performance doesn't warrant buying a GTX 680!
I'd only buy it for the power consumption.

April 18, 2012 | 04:26 PM - Posted by David (not verified)

Right, which is what they are going for. Why does somebody need more than a 580? No game maxes that card out yet, so why put something on the market which is an even more unnecessary leap in performance? Why spend a lot more money to see a number on the screen hop up a few pegs?

April 20, 2012 | 01:59 PM - Posted by AParsh335i (not verified)

What about when the GTX 680 hits $399.99... offers 3+ monitor support out of the box, consumes less power, runs cooler, and outperforms your GTX 580? Also, why did you think buying the GTX 580 was a good deal in the first place? I know I didn't. I bought an HD 6950 2GB and unlocked it to an HD 6970 for $300 or less, and it gets close to GTX 580 performance.
The main thing to remember (or learn if you didn't know) is that the GTX 680 is based on the GK104 GPU, not the GK110 GPU. The GK110 is a bigger, badder, faster GPU but is not available right this second.

Here's an assumption - if you were Nvidia and you had something maybe 25-40%+ faster than the GTX 680, but the GTX 680 is already about 10-20%+ faster than the competition at a lower price, would you release the big one? If you want to make money, which is what businesses do, you would want to hold back the badass one and make a huge margin on the GK104-based card for as long as you can.

Check out the comparison of GK104 Vs GK110.
http://www.legitreviews.com/news/12803/

April 20, 2012 | 02:07 PM - Posted by AParsh335i (not verified)

The 580 does not max every game out, plain and simple. There are a decent number of gamers doing 3D, three monitors, or both at the same time. I'm a big fan of Eyefinity/Nvidia Surround myself, and it was a bummer that the GTX 580 did not support it without 2 cards. The GTX 680 fixes that and adds more memory - perfect.

April 20, 2012 | 01:51 PM - Posted by AParsh335i (not verified)

Where iz thee SLI performance resultz?!?
http://i.imgur.com/XZkiq.jpg

April 25, 2012 | 11:24 PM - Posted by train_wreck (not verified)

And I was thinking I was kicking ass with my watercooled GTX 580. Some things never change.

December 18, 2012 | 04:55 PM - Posted by Junita Feld (not verified)

I went over this website and I conceive you have a lot of good info, saved to bookmarks (:.
