NVIDIA GeForce GTX 690 Review - Dual GK104 Kepler Greatness

Author: Ryan Shrout
Manufacturer: NVIDIA

Kepler Features: Adaptive Vsync and Frame Rate Target

The new GTX 690 is going to get the same feature set as the GTX 680 that launched last month, so I thought it was pertinent to recap some of those features here.  This information is taken from our original GTX 680 review.

Adaptive VSync

Adaptive VSync is a feature NVIDIA introduced with the GTX 680 launch that attempts to finally provide the right solution for gamers who don't want stuttering or visual tearing in their games.


VSync (vertical sync) is a technology that has been around forever; it limits the number of frames being rendered to match the refresh rate of the display (usually 60 Hz).  While this does provide a gamer with tear-free gaming, it also means the frame rate has to "jump" between levels rather than transition smoothly.  If your frame rate dips below 60 FPS then the card has to start outputting 30 FPS, the next even divisor of the 60 Hz refresh rate.  If it dips even further you would jump to 20 FPS, and so on.

The problem is that those jumps can appear in the game as stuttering and the difference between running a game at 30 FPS and 55 FPS is pretty dramatic. 
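The quantization described above can be sketched in a few lines of Python. This is purely an illustrative model of traditional VSync behavior, not anything the driver actually computes: each frame is held until a refresh boundary, so the displayed rate snaps to the refresh rate divided by a whole number.

```python
import math

def vsync_effective_fps(render_fps: float, refresh_hz: int = 60) -> float:
    """Model of traditional VSync: each frame occupies a whole number of
    refresh intervals, so output snaps to refresh_hz / n for integer n."""
    if render_fps >= refresh_hz:
        return float(refresh_hz)
    # How many refresh intervals each frame needs, rounded up.
    intervals = math.ceil(refresh_hz / render_fps)
    return refresh_hz / intervals

print(vsync_effective_fps(55))  # renders at 55 FPS but displays at only 30
print(vsync_effective_fps(28))  # drops another step, to 20 FPS
```

This is why a card hovering around 55 FPS feels so much worse with VSync on than off: the display only ever shows 30.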


With VSync disabled completely, though, you will see visual tearing along horizontal lines in games. It is likely that every gamer has experienced this scenario.


NVIDIA's Adaptive VSync technology disables VSync when the frame rate drops below the 60 FPS level so the game can smoothly transition to 58, 57, 45, etc. frame rates without stuttering, and without dropping all the way to 30 FPS.  This technology is possible because NVIDIA's driver can tell how many frames it is rendering and recognize when the transition is about to happen, disabling VSync at that point.  You can enable this in the control panel in the same drop-down box where you would normally find the VSync options.
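The decision rule as described reduces to a simple threshold. This toy model is my own sketch of the behavior, not NVIDIA's driver logic, but it shows what the displayed rate does in each regime:

```python
def adaptive_vsync_fps(render_fps: float, refresh_hz: int = 60) -> float:
    """At or above the refresh rate, VSync stays on and caps output (no
    tearing); below it, VSync is dropped so the rate varies smoothly
    instead of snapping to 30 or 20 FPS."""
    if render_fps >= refresh_hz:
        return float(refresh_hz)  # VSync on: capped at the refresh rate
    return float(render_fps)      # VSync off: 58, 57, 45... all possible

for fps in (90, 60, 58, 45):
    print(fps, "->", adaptive_vsync_fps(fps))
```

The trade-off is that tearing can reappear below 60 FPS, which NVIDIA bets is less objectionable than the 60-to-30 stutter.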

Obviously we wanted to see this at work for ourselves so I ran through some Metro 2033 with the GTX 680.  You will see three collections of data: standard VSync Off, standard VSync On and Adaptive VSync.  


With VSync off we averaged just over 63 FPS with a maximum frame rate of 107 FPS.  With standard VSync enabled you'll see the average fall all the way to 42 FPS, but more importantly look at the red line in the top graph: the frame rate dips from 60 FPS to 30 FPS and back very quickly, and when that happens you will often see a stutter as the transition takes place.

With Adaptive VSync enabled (the green line), the performance closely matches the VSync disabled line (blue) when under 60 FPS but caps there as expected with VSync technology, preventing tearing.

Frame Rate Target

Yet another feature that NVIDIA is bringing to the GTX 680, via software applications like EVGA Precision X (and quite a few others), is frame rate targeting.  This feature allows gamers to set their own maximum frame rate, essentially telling the GPU not to bother pushing above it.


In this screenshot from EVGA's software you'll see I set the frame rate target to 80 FPS, indicating that I would like to cap the frame rate there since I personally don't see any advantage above it.  You can set it to just about anything; it's up to you.  If the game won't run that fast (for example, if it runs closer to 60 FPS), this option simply does nothing.
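Mechanically, a frame rate target is just a per-frame time budget: finish the frame, then idle away whatever is left. Here is a minimal sketch of that loop (a hypothetical helper, not how Precision X or the driver implements it):

```python
import time

def frame_limited_loop(render_frame, target_fps: float = 80, num_frames: int = 40) -> float:
    """Run num_frames calls of render_frame, sleeping off whatever remains
    of each 1/target_fps budget. Returns the achieved FPS."""
    budget = 1.0 / target_fps
    start = time.perf_counter()
    for _ in range(num_frames):
        frame_start = time.perf_counter()
        render_frame()  # the game's actual rendering work goes here
        remaining = budget - (time.perf_counter() - frame_start)
        if remaining > 0:
            time.sleep(remaining)  # the GPU idles here, which is what cuts power draw
    return num_frames / (time.perf_counter() - start)

# A no-op "frame" that would otherwise run uncapped:
print(round(frame_limited_loop(lambda: None)))  # lands near the 80 FPS target
```

Note that if the game can't reach the target, `remaining` is never positive and the limiter does nothing, matching the behavior described above.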


Running Deus Ex: Human Revolution with this setting off and on produces the intended result.  But it also produces another interesting one thanks to technology NVIDIA has included to limit power consumption.


On the left side of this graph is the power (by percentage) that the GPU is using, and you can see that it is clearly running near its 100% level.  On the right side, however, is the same run of DE:HR with the frame limit set to 80 FPS; it operates well below maximum power.


And the result is much lower power consumption: 45 watts less, to be exact!  This saves on power usage, temperature, and fan noise, as well as producing a better overall experience for the user in older games or games that simply run at much higher than necessary frame rates.  For PC gamers who play CS:S in addition to BF3, this could be a really cool (no pun intended) feature.

May 3, 2012 | 09:45 AM - Posted by Asmodyus (not verified)

Am I the only one out there who thinks this card is too expensive? This is what's not making sense to me: the GTX 580 was $599 when it came out, the GTX 590 was $699-799 when it came out, the GTX 680 is $499, and the GTX 690 is $1000.

I am sorry, but that cost is way too high for a single video card, regardless of whether it's 2 chips. I hope NVIDIA falls flat on their face. But as they say, there are a lot of stupid people who will buy this card at that price.

May 3, 2012 | 10:44 AM - Posted by Bro (not verified)

Blame AMD for it. They are late with HD 7990.

May 3, 2012 | 08:18 PM - Posted by Pellervo (not verified)

The reason the price is so high is because IT IS what they say it is: the fastest card on the planet. Let me put it simply: 680 = $500K Ferrari. 690 = million-dollar F1. The 690 is for those enthusiasts with the wallet.

May 8, 2012 | 12:42 AM - Posted by Mr King

I think with the 580 and 590, the 590's performance didn't match that of two 580s in SLI while the 690 does match two 680s in SLI. I think from that perspective the price makes some sense. I'm not happy to see the return of the thousand dollar video card, but there are other options too.

The Mars II, though, was considerably more card than two 580s, yet it wasn't twice the price. Then there is the whole issue of it being practically impossible to buy a Mars card.

May 9, 2012 | 01:35 AM - Posted by Branthog

I don't see what's wrong with a dual-GPU card being $1,000 if it's okay for the single-GPU card to be $500. Does that mean you think two single-GPU cards for $500 each is okay, but stuffing them into a single $1,000 card is just "too much"?

Yeah, it's over-priced, but exactly what kind of price point would you plan on hitting with the different offerings they have right now, without severely undercutting the competition unnecessarily?

Besides, chances are prices will be lower in a few months.

June 25, 2012 | 12:37 AM - Posted by Anonymous (not verified)

No, you're the only one out there who is publicly complaining about the price tag of this card and how broke you are! Lol... calling people idiots who can afford to burn their $$ on this card? Really? I have this card and it smashes every game I load on it with all the settings set to ultra, and while my eyes can only pick up 30 fps, believe you me the gameplay is excellent. Crysis, all settings set to maximum, and I'm not getting any tears or stutter; very smooth gameplay, and in my book well worth the buy. I'll be looking at buying a second card next month. Why? Because I run x3 monitors and I will be playing 3D Surround, but the other reason is because I can, not because I'm an idiot.

December 21, 2012 | 11:23 AM - Posted by Anonymous (not verified)

Very awesome.

But every time messages like these pop up my first thought is: 'you wasted a bucketload of cash'.

Of course it's entirely up to you, but the funny thing is that in the price segment of $500 and up, you are wasting tons of money on percentage points of performance gain.

Don't kid yourself.

May 3, 2012 | 10:49 AM - Posted by D1RTYD1Z619

After seeing this review I'd rather wait for GTX 680 prices to come down and SLI them. I have one already and am not having issues running any of my games at their highest settings on my current monitor's (1600x1200) resolution.

May 3, 2012 | 07:54 PM - Posted by AParsh335i (not verified)

You should honestly get a new monitor; you aren't taking advantage of the 680.

May 5, 2012 | 02:37 PM - Posted by D1RTYD1Z619

But keeping this resolution will extend the life of my card because it will have to push fewer pixels.

May 6, 2012 | 02:41 PM - Posted by Ewe (not verified)

If you wait any longer, you will likely never get a 690 GTX. I highly doubt they will make many of these cards. You can't get a 590 anymore, that's for sure.

May 20, 2013 | 09:42 PM - Posted by battlewuss (not verified)

I got my GTX 590 two weeks ago :p


May 17, 2012 | 07:33 PM - Posted by pat (not verified)

You all seem to be forgetting that the 690 also has twice the VRAM of a single 680. This will matter if you plan on playing with 3 screens. I have 3 GTX 570 2.5 gig cards, and Battlefield 3 is using just under 2 gig of VRAM with the game not even maxed out. FYI, 2 680s with 2 gig of VRAM do not add up to 4 gig of VRAM in SLI; the system can only utilize the VRAM on one card. So there you have it: the 690, in my opinion, is actually worth more than 2 680s, due to the 4 gigs of VRAM.

June 12, 2012 | 01:30 AM - Posted by Anonymous (not verified)

It's the same deal on the 690. In normal SLI the textures need to be loaded into the memory of both cards, so the memory is not added together. The same is true for the 690: it has 2GB per core, and those same textures need to be loaded into both pools, so the memory per core is exactly the same. In the case of these dual-GPU cards you always halve the memory listed on the box to get an accurate reading of the amount of memory that will actually be used.
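The mirroring this comment describes is easy to state precisely. A one-line sketch (hypothetical helper, purely for illustration): with alternate-frame rendering, every GPU holds its own copy of the working set, so usable memory never grows with GPU count.

```python
def usable_vram_gb(vram_per_gpu_gb: float, num_gpus: int) -> float:
    # Textures are duplicated into each GPU's pool, so the game sees
    # only one GPU's worth of memory no matter how many GPUs render.
    return vram_per_gpu_gb

print(usable_vram_gb(2.0, 2))  # a "4GB" GTX 690 still exposes 2GB to the game
```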

May 3, 2012 | 11:36 AM - Posted by Anonymous (not verified)

Would there be any difference between SLI 680s and a single 690? I would think that although the 680s will take more space, the dedicated slot will improve bandwidth capability (or maybe not, because of the SLI adapter?).

May 3, 2012 | 11:47 AM - Posted by Tim K (not verified)

I am also very curious about the actual pros and cons, performance-wise, when comparing this card to 2 680s in SLI. I am not savvy on multi-card setups, but on the surface I see the benefits of only using a 2-slot form factor and less power with the 690, but worry about a single point of failure with a very complex and hot-running component. Would love to know if there are any performance advantages gained by having it on one PCB vs 2 cards in SLI.

May 3, 2012 | 01:38 PM - Posted by Jeremy Hellstrom

I thought someone might ask that -

May 4, 2012 | 01:21 AM - Posted by Bill (not verified)

The GTX 690 does make better sense than a 680 SLI setup if you are buying right now. If I were in the market though, I'd simply buy a 680, wait a year for the prices to come down, and then buy another. Yeah, I know, you're not getting the insane frame rates right now, but seriously, if you need something faster than a 680 at this time...let's just say I'd love to see the monitor setup!

May 9, 2012 | 01:36 AM - Posted by Branthog

Unfortunately, I have seen benchmarking that suggests quad SLI (dual 690s) actually performs worse than triple SLI or possibly even dual SLI.

The benching was not entirely conclusive as to whether this was a driver limitation, application (game) limitation, or a combination of the two.

December 21, 2012 | 11:28 AM - Posted by Anonymous (not verified)

Tri and quad SLI are well-known to scale badly, so this is nothing new.

The best scaling is achieved with Dual SLI. Above that, consider upgrading your card a notch instead of adding a third or fourth.

Let alone the issues concerning heat dissipation and power supply. For these last two, the 690 offers a (partial) solution but for the primary argument, the 690 changes nothing about the scalability of quad SLI.

May 3, 2012 | 02:48 PM - Posted by flatland (not verified)

Bjorn3D did a comparison between two GTX 680s and the GTX 690.

May 5, 2012 | 12:52 PM - Posted by Ryan Shrout

did we...?

May 3, 2012 | 06:46 PM - Posted by SirPauly (not verified)


In your 3D Vision performance investigations with the GTX 690, how did you get those high averages when V-sync is enabled with 3D Vision?

May 4, 2012 | 01:48 AM - Posted by Anonymous (not verified)

Always go with the single-GPU option, for anyone thinking of going SLI. I have SLI 570s right now and it's beautiful when it works, but a LOT of games don't properly support it and never will. Until the technology gets better, go for a better single GPU, IMO. Maybe not one that costs $1k, but still.

December 21, 2012 | 11:31 AM - Posted by Anonymous (not verified)

I have personally yet to come across a game that would actually benefit from SLI but doesn't support SLI setups.

Sure, your retro game collection isn't supported, but who needs an SLI setup for that?

Far Cry 3 was playable in SLI from day one, and I had zero issues with the 310.70/64 drivers. 80-110 fps steady with all possible candy on the screen. And this is with dual GTX 660s, even.

In the case of my dual GTX 660 setup, I have a GTX 680 equivalent for 120 euros less than the cost of a 680. Done deal.

May 4, 2012 | 11:39 AM - Posted by Anonymous (not verified)

Resolution is the key to what you purchase in graphics cards. If you do not play at 2560 or higher then you are wasting your money on this card. As I play at 5760x1080, I am very interested, as my current 2 x 580 Classifieds actually draw enough power to trip my breaker if my wife turns on the bathroom lights. I DO wish they had more VRAM on the cards, but for now it works just fine.

May 4, 2012 | 11:46 AM - Posted by I'm Hit! (not verified)

Compared to a 2-way SLI 680 setup: the two-way setup beats the 690 by about 2-4%, but the 690 runs quieter and cooler and draws about 30-50 watts less power.

You will not notice that minor percentage loss in games. Get one from a reputable board partner and just RMA your 690 if one or both GPUs on the card fail.

I am going from a 580 Classified to the 690 for my 30-inch monitor (2560x1600 resolution).

May 5, 2012 | 12:50 PM - Posted by Ryan Shrout

You are a graphics monster sir!

December 21, 2012 | 11:34 AM - Posted by Anonymous (not verified)

I would also reckon that the 690 has fewer noticeable frame drops and less micro-stutter than dual 680s, no?

Even though currently that is hardly an issue, it still doesn't look nice in a bench :)

May 4, 2012 | 12:17 PM - Posted by AParsh335i (not verified)

Wow. I'm surprised no one has mentioned this yet: Newegg is already sold out of the GTX 690 and has marked it up $200! I've posted a few times in response to people complaining about the $999.99 price tag on this GTX 690, saying I think it's a great price considering that GTX 680s are currently selling for $550-650 retail. Of course, now Newegg lists the 690 at $1199.99. Such is supply & demand.

May 5, 2012 | 12:40 PM - Posted by Ryan Shrout

Yeah, the price hikes are a pain in the ass and kind of hit NVIDIA in the face, but there is really nothing they can do about it without getting into some other legal issues.

May 5, 2012 | 01:36 AM - Posted by nabokovfan87

Why on earth are there 6990 results, but not single 7970 results? Why are there 7970 CF results, but not a single 7970?

Very strange Ryan.

May 5, 2012 | 12:49 PM - Posted by Ryan Shrout

For our single card comparison graphs, there are only four spots. The GTX 680 is the faster of the two current generation single-GPU solutions, so it seemed more relevant to include IT rather than the HD 7970. If you are still curious about how the HD 7970 compares to the GTX 680, I recommend checking out this article:

Or even this:

May 6, 2012 | 07:49 PM - Posted by jewie27 (not verified)

Finally, thanks Ryan for posting 3D Vision performance.

May 6, 2012 | 07:53 PM - Posted by jewie27 (not verified)

The only problem is, how does it show the 3D FPS over 60 FPS? 3D Vision has an FPS cap of 60.

May 6, 2012 | 11:25 PM - Posted by Thedarklord

Great review PCPer!

One thing I'd like to see though: with NVIDIA's Adaptive V-Sync and/or other Kepler/driver tricks, does the GTX 690 exhibit micro-stutter?

BTW, I am SO happy that NVIDIA is really taking on the V-Sync problems (Yay Kepler!), 'cause that's one of the most annoying artifacts, that and micro-stutter for SLI setups. ^.^

May 7, 2012 | 05:46 AM - Posted by ThorAxe

Whether I get two GTX 680s or a GTX 690 will solely depend on the price. If the price is the same I will most likely go for the 690 as it is beautiful and quiet.

May 12, 2012 | 08:41 AM - Posted by Pete (not verified)

Don't forget about your mobo. I've read you need to be running PCI-E 3.0 x16 to get the full benefit of the 690.

Thanks for the awesome benchmarks, PCPer! Been looking for these everywhere. I just bought x3 Asus 27in 3D monitors. I have one 680 and another waiting on order......

Thinking about upgrading to 670 tri-SLI, but that would mean a mobo upgrade :(

May 28, 2012 | 10:04 AM - Posted by Anonymous (not verified)

Well, the fact that it is overpriced is still a fact. But if you can handle money and save properly, you would actually be able to afford the card. Don't spend money on crap, and learn to save!

June 23, 2012 | 09:53 PM - Posted by Anonymous (not verified)

If you use one card with a 3-monitor 3D display, would it have a very low FPS?

July 14, 2012 | 08:59 AM - Posted by Knowsmorethanyou (not verified)

Look, nobody at all, NOBODY, needs to buy the top-of-the-line cards. You know why? I've got an AMD Radeon 6790, and I will have that card for 4 years. After those 4 years, yes, it might go down in performance a little bit. But that's the exact reason CrossFire was invented: then you just get ANOTHER 6790, which in 4 years will be half of what you purchased it for.

I know from personal study that ALL top-of-the-line cards over 900 frigging dollars are 150% pointless to buy. There are NO games out there TODAY in the 21st century that NEED that type of card. Therefore, YES, it (MIGHT) outperform other cards IF there were games that required that card. But as it stands, the only highly intense game at the moment that requires maximum rendering for its physics and a glowing picture for the best performance and gameplay experience... is Battlefield 3.

Now many can whine and moan: oh, it's Crysis 2. It's not. I played Crysis on high settings with my 6-year-old Radeon 5450, and I still have it to this day. Don't tell me it's the most stressful game; it hasn't been proven. Since Battlefield 3 came out it has won countless awards, many given for its intensely realistic gameplay: Destruction 2.0, as many like to call it, or "Physics 2.0," truthfully named. I'll tell you now, try playing on the office maps in Close Quarters and shoot a .50 cal down the cubicles, then tell me those physics don't require rendering up the ass.

Spending 900 dollars on a graphics card that will be useless in 10 years and useful in 50, when Battlefield 4000 or whatever comes out, is absolutely stupid. Until gamers get the knowledge in their heads correctly, they'll never learn, and big companies will continue to make cheap products for big bucks that people can brag about and become smug over.

Sorry to all the people buying 6990s and GTX 690 cards, but my AMD Radeon 6790 can outperform any of those cards.

Remember... it matters what MODEL, it matters what YEAR, and it matters if you give a shit to clean it. Take care of something and you won't need to buy new crap for your computer every 3 years because "it got screwed up with dust residue," or the fan motor died, or the circuit board fried, or "I didn't wear an anti-static strap to fix my computer," or "my computer keeps shutting down from overheating." No, your computer shuts down because you don't have enough power distribution cycling through your computer. Multiple rails are bad, people... Do the calculations for how many watts your computer needs, as well as amps. You'll find it's not too hard, and you really don't need to buy super powerhouse 1000-watt power supplies.

August 17, 2012 | 04:20 AM - Posted by Anonymous (not verified)

I get a black screen with this card in Deus Ex: HR and it forces me to reboot my PC! Same with Saints Row 3.

August 11, 2013 | 09:54 PM - Posted by cybervang (not verified)

I bought a GeForce GTX 650 and it was the worst video card I've ever owned. It crashed left and right with frame errors and NOT DETECTED boots. My last card was a Radeon with 0 issues, and I always had to resort back to it when my GTX wanted to take a break.

My next purchase will be another Radeon upgrade. I'm done with NVIDIA.

Fast but BUGGY.

March 16, 2014 | 01:21 PM - Posted by Anonymous (not verified)

Another thing is games: not all the games out there will let you play with 2 graphics cards. The 690 is one graphics card with 2 GPUs, thus faking out the games and allowing you to play them, which is the main reason I bought one, with 3 monitors set to 1920x1080.

