
NVIDIA GeForce GTX 690 Review - Dual GK104 Kepler Greatness

Author: Ryan Shrout
Manufacturer: NVIDIA

GTX 690 Specifications

On Thursday May the 3rd at 10am PDT / 1pm EDT, stop by the PC Perspective Live page for an NVIDIA and PC Perspective hosted event surrounding the GeForce GTX 690 graphics card. Ryan Shrout and Tom Petersen will be on hand to talk about the technology, the performance characteristics as well as answer questions from the community from the chat room, twitter, etc. Be sure to catch it all at http://pcper.com/live

Okay, so it's not a surprise to you at all - or if it is, you haven't been paying attention.  Today is the first on-sale date and review release for the new NVIDIA GeForce GTX 690 4GB dual-GPU Kepler graphics card that we first announced in late April.  This is the dream card for any PC gamer out there, combining a pair of GTX 680 GK104 GPUs on a single PCB and running them in a single-card SLI configuration, and it is easily the fastest single card we have ever tested.  It is also the most expensive reference card we have ever seen, with a hefty $999 price tag. 


So how does it perform?  How about efficiency and power consumption - does the GTX 690 suffer the same problems the GTX 590 did?  Can AMD hope to compete with a dual-GPU HD 7990 card in the future?  All that and more in our review!

Kepler Architecture Overview

For those of you who may have missed the boat on the GTX 680 launch - the first card to use NVIDIA's new Kepler GPU architecture - you should definitely head over and read my review and analysis of that card before heading into the deep dive on the GTX 690 here today.


Kepler (GK104) is a 3.54 billion transistor GPU with 1536 CUDA cores / stream processors, and even in a single-GPU configuration it is able to produce some impressive PC gaming performance results.  The new SMX-based design has some notable differences from Fermi, the most dramatic of which is the removal of the "hot clock" - the scheme that ran the shaders at twice the clock speed of the rest of the GPU.  Now the entire chip runs at one speed, higher than 1 GHz on the GTX 680.

Each SMX on Kepler now includes 192 CUDA cores as opposed to the 32 cores found in each SM on Fermi - a change that has increased efficiency and performance per watt quite dramatically.  

As I said above, there are a lot more details on these changes in our GeForce GTX 680 review.

The GeForce GTX 690 Specifications

Many of the details surrounding the GTX 690 were already revealed by NVIDIA's CEO Jen-Hsun Huang during a GeForce LAN event in China last week.  The card is going to be fast, expensive and is built out of components and materials we haven't seen on any graphics card before.


Despite the high performance level of the card, the GTX 690 isn't much heavier or much longer than the reference GTX 680.  We'll go over the details surrounding the materials, cooler and output configuration on the next page, but let's take some time first to look at and debate the performance specifications.

Continue reading our review of the NVIDIA GeForce GTX 690 dual-Kepler graphics card!!


NVIDIA markets the GTX 690 as having 3072 CUDA cores and 16 SMX units.  This is correct, but keep in mind that we are looking at two separate GPUs, each with 1536 cores and 8 SMX units, to reach that total.  Each GPU also sports 128 texture units and 32 ROPs.  That is a LOT of processing power on a single graphics card, but I have always maintained that getting your performance from multiple GPUs is never as ideal as getting the same performance from a single GPU, for some pretty obvious reasons.  Obviously you can't get a single GPU with this much horsepower, so the debate is somewhat altered, but don't expect to see 2x the performance of a single GTX 680 with this card - standard SLI scaling rules apply.
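The headline figures are simply two GK104s summed together; a quick sketch (per-GPU numbers taken from the GTX 680 specifications) makes the accounting explicit:

```python
# Per-GPU GK104 figures, as on the GTX 680.
GPUS = 2
SMX_PER_GPU = 8
CORES_PER_SMX = 192          # vs. 32 cores per SM on Fermi
TEXTURE_UNITS_PER_GPU = 128
ROPS_PER_GPU = 32

cores_per_gpu = SMX_PER_GPU * CORES_PER_SMX   # 1536

# The GTX 690 totals NVIDIA quotes are just two of everything.
print(cores_per_gpu * GPUS)          # 3072 CUDA cores
print(SMX_PER_GPU * GPUS)            # 16 SMX units
print(TEXTURE_UNITS_PER_GPU * GPUS)  # 256 texture units
print(ROPS_PER_GPU * GPUS)           # 64 ROPs
```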

The base clock speed is 915 MHz, a drop of 91 MHz (about 9%) compared to the 1006 MHz base clock on the single-GPU GTX 680.  Back when the GTX 590 launched at 607 MHz, that was a drop of roughly 21% from the 772 MHz clock of the GTX 580, so the GTX 690 should come much closer to being "two 680s" than the GTX 590 ever came to being "two 580s". 

Better still, the Boost clock of the GTX 690 is 1019 MHz, only 39 MHz (4%) lower than the GTX 680's 1058 MHz.  So, in a typical game in a typical scenario, we expect the GTX 690 to run within about 4% of the performance of a pair of standard GTX 680 cards running in SLI.  NVIDIA was able to accomplish this by binning GK104 GPUs to find those with the best performance efficiency and the least leakage - chips able to clock higher while using less power.
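The relative clock deficits are easy to verify; a short sketch comparing both dual-GPU cards against their single-GPU siblings (GTX 680 clocks of 1006/1058 MHz and GTX 580 clock of 772 MHz from their respective specifications):

```python
def pct_drop(single_gpu_clock_mhz, dual_gpu_clock_mhz):
    """Percentage clock deficit of the dual-GPU card vs. the single-GPU card."""
    return 100 * (single_gpu_clock_mhz - dual_gpu_clock_mhz) / single_gpu_clock_mhz

# GTX 690 vs. GTX 680
print(round(pct_drop(1006, 915), 1))   # base clock: 9.0 (% drop)
print(round(pct_drop(1058, 1019), 1))  # boost clock: 3.7 (% drop)

# GTX 590 vs. GTX 580 - a far larger sacrifice last generation
print(round(pct_drop(772, 607), 1))    # 21.4 (% drop)
```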


Each of the GK104 GPUs has access to its own 2GB frame buffer running at an impressive 1500 MHz / 6 Gbps data rate, so again we don't expect to see any performance drop from this implementation. 
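That 6 Gbps per-pin data rate translates into the same per-GPU memory bandwidth as the GTX 680, assuming the same 256-bit memory bus on each GK104; a back-of-the-envelope sketch:

```python
# Per-GPU memory bandwidth from the quoted data rate.
data_rate_gbps = 6       # Gbps per pin (6 Gbps effective GDDR5 data rate)
bus_width_bits = 256     # GK104 memory bus width, as on the GTX 680

# bits per second across the bus, divided by 8 to get bytes
bandwidth_gb_per_s = data_rate_gbps * bus_width_bits / 8
print(bandwidth_gb_per_s)   # 192.0 GB/s per GPU
```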

The output configuration is noted here as well and includes a set of three dual-link DVI connections and a single mini-DisplayPort.  If you are running a Surround configuration you can use all three dual-link DVI connections, but if you are only interested in standard multi-monitor mode, one of the DVIs is disabled - more on that later. 

Let's take a closer look at the GTX 690 reference card!!

May 4, 2012 | 12:17 PM - Posted by AParsh335i (not verified)

Wow. I'm surprised no one mentioned this yet - Newegg is already sold out of GTX 690 and they marked it up $200! I've posted a few times in response to people complaining about $999.99 price tag on this GTX 690 and that I think it's a great price considering that GTX 680s are currently selling for $550-650 retail. Of course, now newegg goes at $1199.99 with the 690. Thus is supply & demand.

May 5, 2012 | 12:40 PM - Posted by Ryan Shrout

Yeah, the price hikes are a pain in the ass and kind of hit NVIDIA in the face, but there is really nothing they can do about it without getting into some other legal issues.

May 5, 2012 | 01:36 AM - Posted by nabokovfan87

Why on earth is there a 6990 results, but not single 7970 results? Why is there 7970 CF results, but not a single 7970?

Very strange Ryan.

May 5, 2012 | 12:49 PM - Posted by Ryan Shrout

For our single card comparison graphs, there are only four spots. The GTX 680 is the faster of the two current generation single-GPU solutions, so it seemed more relevant to include IT rather than the HD 7970. If you are still curious about how the HD 7970 compares to the GTX 680, I recommend checking out this article: http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-680-2GB-G...

Or even this: http://www.pcper.com/reviews/Graphics-Cards/Galaxy-GeForce-GTX-680-2GB-G...

May 6, 2012 | 07:49 PM - Posted by jewie27 (not verified)

Finally, thanks Ryan for posting 3D Vision performance.

May 6, 2012 | 07:53 PM - Posted by jewie27 (not verified)

The only problem is, how does it show the 3D FPS over 60 fps? 3D Vision has a FPS cap of 60.

May 6, 2012 | 11:25 PM - Posted by Thedarklord

Great review PCPer!

One thing I'd like to see though: with NVIDIA's Adaptive V-Sync and/or other Kepler/driver tricks, does the GTX 690 exhibit micro-stutter?

BTW I am SO happy that NVIDIA is really taking on the V-Sync problems, (Yay Kepler!), cause thats one of the most annoying artifacts, that and micro-stutter for SLI setups. ^.^

May 7, 2012 | 05:46 AM - Posted by ThorAxe

Whether I get two GTX 680s or a GTX 690 will solely depend on the price. If the price is the same I will most likely go for the 690 as it is beautiful and quiet.

May 12, 2012 | 08:41 AM - Posted by Pete (not verified)

Don't forget about your mobo. I've read you need to be running PCI-E 3.0 x16 to get the full benefit of the 690.

Thanks for the awesome benchmarks PCPer! Been looking for these everywhere. I just bought x3 Asus 27in 3d monitors. I have one 680 and another waiting on order......

Thinking about upgrading to 670 tri-sli, but that would mean mobo upgrade:(

May 28, 2012 | 10:04 AM - Posted by Anonymous (not verified)

Well, the fact that it is overpriced is still a fact. But, if you can handle money and you can save properly you would be able to actually afford the card. Don't spend money on crap, and learn to save !

June 23, 2012 | 09:53 PM - Posted by Anonymous (not verified)

if you use one card and have a 3 monitor 3d display would it have a very low fps?

July 14, 2012 | 08:59 AM - Posted by Knowsmorethanyou (not verified)

Look, nobody at all. NOBODY needs to buy the top of the line cards. You know why? i've got a 6790 AMD Radeon. and i will have that card for 4 years. After those 4 years. Yes, it might down in performance a little bit. But it's the exact reason Cross-fire was invented. then you just get ANOTHER 6790 which in 4 years will be half of what you purchased it for.

I know by personal studies that ALL cards top of the line over 900 frigging dollars is 150% pointless to buy. there are NO games out there TODAY in the 21st century that NEED that type of card. therefore YES, it (MIGHT) outperform cards IF there were games that required that card. BUT seeing as the only highly intense game at the moment that requires Rendering at a maximum for it's physics and a glowing picture for best performance and gameplay experience.... Is battlefield 3.

Now many can whine and moan. Offf, it's crysis 2. It's not. I played crysis on high setting with my 6 year old 5450Radeon. and i still have it to this day. Don't tell me its the most stressful game. It hasn't been proven. Since battlefield3 came out it has gone under countless awards and rewards. Many awards have been given due to their Intense realistic gameplay. Destruction 2.0 as many like to call it.. Or, it's "Physics 2.0" truthfully named. I'll tell you now, try playing on the office maps in Close quarters and shoot a .50cal down the cubicles and tell me those physics doesn't require rendering up the ass.

Spending 900 dollars on a graphics card that will be useless in 10 years and useful in 50 when battlefield 4000 comes out or whatever might. is Absolutely stupid. Until gamers get the knowledge in their head correctly. They'll never learn, and big companies will continue to make cheap products for big bucks that people can brag about and become smug.

Sorry to all the people buying 6990's and 690GTX cards. But my AMDRadeon6790 can outperform any of those cards

Remember... It matter what MODEL, it matter what YEAR, and it matter if you give a shit to clean it. Take care of something and you won't need to buy new crap for your computer every 3 years because "it got screwed up with dust residue" or the motor died out. Or the circuit board fried. Or i didn't wear an anti-static to fix my computer. or my computer keeps shutting down from overheating. No. your computer shuts down because you don't have enough power distribution cycling through your computer. Multiple rails is bad people.... Do the calculations of how many Watts your computer needs. as well as amps. you'll find it's not too hard. and you really don't need to buy super power house power supplies 1000watts.

August 17, 2012 | 04:20 AM - Posted by Anonymous (not verified)

i get black screen with this card on deus exhr and it forces me to reboot PC! same with saints 3.

August 11, 2013 | 09:54 PM - Posted by cybervang (not verified)

I bought a GEFORCE GTX 650 and it was the worse video card I've ever owned. It crashed left and right with frame errors and NOT DETECTED boots. My last was a RADEON with 0 issues and I always had to resort back to it when my GTX wanted to take a break.

My next purchase will be another Radeon UPgrade. I'm done with Nvidia.

Fast but BUGGY.

March 16, 2014 | 01:21 PM - Posted by Anonymous (not verified)

another thing is games, not all the games out there will let you play with 2 graphics cards, the 690 is one graphic card with 2 gpu's thus faking out the games and allowing you to play them, which is the main reason i bought one with 3 monitors set to 1920 x 1080
