
Galaxy GeForce GTX 680 2GB Graphics Card Review

Author: Ryan Shrout
Manufacturer: Galaxy

Unigine Heaven v2.5

Heaven Benchmark (DirectX 11)

Heaven Benchmark is a DirectX 11 GPU benchmark based on the advanced Unigine™ engine from Unigine Corp. It reveals the enchanting magic of floating islands with a tiny village hidden in the cloudy skies. The interactive mode provides an immersive experience of exploring an intricate steampunk world.

  • Support of DirectX 9, DirectX 10, DirectX 11 and OpenGL 4.0
  • Comprehensive use of tessellation technology
  • Advanced SSAO (screen-space ambient occlusion)
  • Volumetric cumulonimbus clouds generated by a physically accurate algorithm
  • Simulation of changing light conditions
  • Dynamic sky with light scattering
  • Interactive experience with fly/walk-through modes
  • Stereo 3D modes:
    • Anaglyph
    • Separate images
    • 3D Vision
    • iZ3D
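Heaven itself is configured through its launcher, but for repeatable comparisons it helps to script several passes and average the reported FPS. Below is a minimal Python sketch of that idea; the `heaven_cli` executable, its flags, and the single-value score log are hypothetical placeholders rather than Unigine's actual interface.

```python
# Sketch: run a benchmark executable several times and average its reported
# FPS to smooth out run-to-run variance before comparing cards.
# NOTE: "heaven_cli", its flags, and the plain-text score log are hypothetical
# placeholders -- substitute whatever launcher/output your benchmark provides.
import statistics
import subprocess

RUNS = 3

def run_once() -> float:
    """Launch one benchmark pass and return the FPS value it logs."""
    subprocess.run(
        ["heaven_cli", "--api", "dx11", "--resolution", "1920x1080",
         "--tessellation", "extreme", "--log", "score.txt"],
        check=True,
    )
    with open("score.txt") as f:
        # Assume the log holds a single average-FPS number.
        return float(f.read().strip())

if __name__ == "__main__":
    scores = [run_once() for _ in range(RUNS)]
    print(f"runs: {scores}")
    print(f"mean FPS: {statistics.mean(scores):.1f} "
          f"(stdev {statistics.stdev(scores):.1f})")
```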

Unigine Heaven Test Settings


Very similar results once again - the GTX 680 is still proving to be a faster card than AMD's Radeon HD 7970.

April 17, 2012 | 11:47 AM - Posted by jewie27 (not verified)

Time and time again, reviewers fail to test every feature of Nvidia GPUs. When will they ever benchmark 3D Vision performance? It's one of Nvidia's high-end features. Sigh...

I was hoping to see Battlefield 3 benchmarks for 3D Vision in single and SLI mode.

April 17, 2012 | 12:03 PM - Posted by Ryan Shrout

We have another piece looking at that, including Surround, coming up next!

April 18, 2012 | 12:47 PM - Posted by jewie27 (not verified)

awesome, thanks :)

April 18, 2012 | 08:26 PM - Posted by Dane Clark (not verified)

Sweet. As a 3D Vision owner I have to say, there are not many sites that measure features like this, and it is always great to see sites covering 3D performance.

April 17, 2012 | 02:12 PM - Posted by rick (not verified)

In every game in this review there is a screenshot of the in-game options at 1920x1200, but in the test graphs it's 1920x1080.
What's up with that?

April 18, 2012 | 07:59 AM - Posted by Ryan Shrout

Game testing screenshots were taken before we moved to 1920x1080 testing. Nothing is being hidden.

:D

April 18, 2012 | 12:07 AM - Posted by cc (not verified)

Still using GPU-Z 0.5.9? lol

http://www.techpowerup.com/downloads/2120/TechPowerUp_GPU-Z_v0.6.0.html

April 18, 2012 | 07:57 AM - Posted by Ryan Shrout

Yeah, testing was done a while back. :)

April 18, 2012 | 06:49 AM - Posted by edr (not verified)

Nice review Ryan. Wish I could afford a better card. I'll stick with my 560ti for another year at least :).

April 18, 2012 | 02:05 PM - Posted by mtrush (not verified)

I'm not impressed with the performance advantage the GTX 680 has over the GTX 580.
The minimum-FPS numbers don't warrant buying a GTX 680; power consumption is the only reason I'd buy one.

April 18, 2012 | 04:26 PM - Posted by David (not verified)

Right, which is what they are going for. Why does somebody need more than a 580? No game maxes that card out yet, so why put something on the market which is an even more unnecessary leap in performance? Why spend a lot more money to see a number on the screen hop up a few pegs?

April 20, 2012 | 01:59 PM - Posted by AParsh335i (not verified)

What about when the GTX 680 hits $399.99? It offers 3+ monitor support out of the box, consumes less power, runs cooler, and outperforms your GTX 580. Also, why did you think buying the GTX 580 was a good deal in the first place? I know I didn't. I bought an HD 6950 2GB and unlocked it to an HD 6970 for $300 or less, and it gets close to GTX 580 performance.
The main thing to remember (or learn if you didn't know) is that the GTX 680 is based on the GK104 GPU, not the GK110. The GK110 is a bigger, badder, faster GPU, but it is not available right this second.

Here's an assumption - if you were Nvidia and you had something maybe 25-40% faster than the GTX 680, but the GTX 680 is already about 10-20% faster than the competition at a lower price, would you release the big one? If you want to make money, which is what businesses do, you would want to hide the badass one and make a huge margin on the GK104-based card for as long as you can.

Check out the comparison of GK104 Vs GK110.
http://www.legitreviews.com/news/12803/
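A quick note on the arithmetic in the comment above: relative-performance figures compound multiplicatively, so a chip 25-40% faster than a GTX 680 that is itself 10-20% faster than its rival would land roughly 38-68% ahead of that rival. The sketch below simply multiplies out the commenter's speculative ranges; none of these numbers are measured results.

```python
# Multiply out the speculative performance leads quoted in the comment above.
# These ranges are the commenter's guesses, not benchmark data.
gk110_over_gtx680 = (1.25, 1.40)   # "25-40%+ faster than the GTX 680"
gtx680_over_rival = (1.10, 1.20)   # "10-20%+ faster than the competition"

for a in gk110_over_gtx680:
    for b in gtx680_over_rival:
        combined = a * b
        print(f"{a:.2f} x {b:.2f} = {combined:.2f} "
              f"(~{(combined - 1) * 100:.0f}% over the competition)")
```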

April 20, 2012 | 02:07 PM - Posted by AParsh335i (not verified)

The 580 does not max every game out, plain and simple. There are a decent number of gamers doing 3D, three monitors, or both at the same time. I'm a big fan of Eyefinity/Nvidia Surround myself, and it was a bummer that the GTX 580 didn't support it without two cards. The GTX 680 fixes that and adds more memory - perfect.

April 20, 2012 | 01:51 PM - Posted by AParsh335i (not verified)

Where iz thee SLI performance resultz?!?
http://i.imgur.com/XZkiq.jpg

April 25, 2012 | 11:24 PM - Posted by train_wreck (not verified)

And I was thinking I was kicking ass with my watercooled GTX 580. Some things never change.

December 18, 2012 | 04:55 PM - Posted by Junita Feld (not verified)

I went over this website and I conceive you have a lot of good info, saved to bookmarks (:.
