
NVIDIA GeForce GTX 690 Review - Dual GK104 Kepler Greatness

Author: Ryan Shrout
Manufacturer: NVIDIA

Kepler Features: GPU Boost

The new GTX 690 is going to get the same feature set as the GTX 680 that launched last month, so I thought it was pertinent to post about some of them here.  This information is taken from our original GTX 680 review.

One of the most interesting, and likely to be the most controversial, new features on the GeForce GTX 680 and the Kepler GPU is called GPU Boost.  You should think of GPU Boost as the GPU equivalent of Intel's Turbo Boost technology as it allows the clock speeds of the GTX 680 to increase when there is thermal headroom for it.

Before going over what GPU Boost does, it is important to understand how GPU clock speeds are selected today.  In general, GPU clocks have been set to the worst-case scenario so that a vendor or manufacturer could guarantee stability of the part in all games.  If one game or application (like 3DMark 11 in NVIDIA's example) uses 160 watts of power at 1000 MHz, and that was generally considered the worst case by the GPU designer, then the clock would be set there.

Last year both AMD and NVIDIA started down the path of changing this by taking applications like Furmark and OCCT out of the equation, limiting the amount of power the GPU could draw in them.  AMD did this in hardware and NVIDIA in a driver hack, but both vendors saw the benefit of removing the outlier case of Furmark from their decision process when setting clock speeds.

Games use less power than Furmark, OCCT, and even 3DMark 11, and some games use much less power than others.  NVIDIA's example uses a theoretical clock speed of 1.0 GHz, a voltage of 1.0v, and power consumption numbers for three applications: 3DMark 11, Crysis 2, and Battlefield 3.  Using 160 watts as the target power consumption for a specific card, the worst-case application would dictate the clock speed NVIDIA set without GPU Boost technology.

With a clock speed of 1.05 GHz and a voltage of 1.1v, power consumption changes (as would performance).  3DMark 11 is now outside the 160 watt limit NVIDIA wanted to set, though Crysis 2 and BF3 are still within safe ranges.

At 1.1 GHz and 1.2v, this imaginary card is now outside the 160 watt limit in both Crysis 2 and 3DMark 11 but is still within spec on BF3. 

The goal of GPU Boost is to allow the GPU's clock speed to scale with the application being run, stay within the power consumption tolerance specified by the manufacturer, and remain stable without user intervention.
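
To make the idea concrete, here is a minimal sketch (in Python) of the difference between the traditional worst-case approach and per-application boosting under a fixed power target.  Only the 160 watt target and the 3DMark 11 draw at 1.0 GHz/1.0v follow NVIDIA's example above; every other wattage figure is made up purely for illustration.

# Hypothetical per-application power draw (watts) at three clock/voltage points.
# Only the 160 W target and the 3DMark 11 draw at 1.0 GHz follow the example
# above; the other numbers are invented so the comparison can run.
POWER_DRAW = {
    (1000, 1.0): {"3DMark 11": 160, "Crysis 2": 145, "Battlefield 3": 130},
    (1050, 1.1): {"3DMark 11": 175, "Crysis 2": 155, "Battlefield 3": 140},
    (1100, 1.2): {"3DMark 11": 190, "Crysis 2": 170, "Battlefield 3": 150},
}
POWER_TARGET_W = 160

def fixed_worst_case_clock():
    # Traditional approach: one clock that keeps EVERY application under the target.
    safe = [point for point, draws in POWER_DRAW.items()
            if all(watts <= POWER_TARGET_W for watts in draws.values())]
    return max(safe)

def boosted_clock(app):
    # GPU Boost idea: the highest clock/voltage point this particular app allows.
    safe = [point for point, draws in POWER_DRAW.items()
            if draws[app] <= POWER_TARGET_W]
    return max(safe)

print("Fixed clock for all games:", fixed_worst_case_clock())   # (1000, 1.0)
for app in ("3DMark 11", "Crysis 2", "Battlefield 3"):
    print(app, "boosts to:", boosted_clock(app))                 # up to (1100, 1.2) for BF3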

To accomplish this, the Kepler GPU and graphics cards need to be able to monitor power consumption, temperature, and GPU utilization, and then use that data to alter clock speeds and voltages on the fly.  NVIDIA is keeping very quiet about the specific algorithm at work here, but the idea should be pretty simple for users to understand: if there is thermal headroom and the GPU is nearly fully utilized, increase the clock and voltage until you are close to the specified limits.

This all occurs fairly quickly, with near-instantaneous polling and a 100ms response time on the clocks and voltages.
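
Since NVIDIA is not disclosing the actual algorithm, the following is only a rough, hypothetical sketch of the behavior described above: poll the sensors, and if there is power and thermal headroom while the GPU is heavily loaded, step the clock up; otherwise step it back toward the base clock.  The sensor readings are simulated, and the thresholds, step size, and 1110 MHz ceiling are illustrative values, not NVIDIA's.

import random
import time

POWER_LIMIT_W = 160        # board power target
TEMP_LIMIT_C = 98          # thermal ceiling
UTIL_THRESHOLD = 0.95      # "nearly fully utilized"
CLOCK_STEP_MHZ = 13        # size of one boost step (illustrative)
RESPONSE_TIME_S = 0.100    # ~100 ms clock/voltage response time

def read_sensors():
    # Stand-in for real telemetry: (power in watts, temperature in C, utilization 0-1).
    return random.uniform(120, 175), random.uniform(60, 85), random.uniform(0.8, 1.0)

def boost_step(clock_mhz, base_mhz, max_boost_mhz):
    # Step the clock up when there is headroom and load, back down when there isn't.
    power, temp, util = read_sensors()
    headroom = power < POWER_LIMIT_W and temp < TEMP_LIMIT_C
    if headroom and util >= UTIL_THRESHOLD:
        return min(clock_mhz + CLOCK_STEP_MHZ, max_boost_mhz)
    if not headroom:
        return max(clock_mhz - CLOCK_STEP_MHZ, base_mhz)
    return clock_mhz

clock = base = 1006            # GTX 680 base clock; the ceiling below is hypothetical
for _ in range(20):
    clock = boost_step(clock, base, max_boost_mhz=1110)
    print("clock:", clock, "MHz")
    time.sleep(RESPONSE_TIME_S)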

Here is a different way to look at the GPU Boost feature.  With power on the Y axis and a series of unnamed games on the X axis, the single game that utilizes the GPU almost completely and consumes the most power typically sets the base clock speed for the GPU.  All the way on the left, however, is another game that falls well under the power limit and could be running at higher speeds.

Each of these titles has a different amount of power headroom to expand into, but the goal of GPU Boost is to get as close to the thermal and power limits of the card as possible to enable the best GPU performance.

With the Y axis now showing clock speed rather than power, you can see that the GPU clock on the lighter games increases some amount over the base clock set by the worst-case game on the far right; a capability made possible by GPU Boost's ability to raise clocks on the fly.

There are lots of questions that go along with this new technology, including whether users will have the ability to disable the feature.  That is something NVIDIA has decided not to allow, simply stating that "this is the way the card works now."  The obvious fear is that reviewers and gamers would turn the technology off to get an "apples to apples" comparison of the GTX 680 to the HD 7970.  NVIDIA's stance is that GPU Boost is a part of the GPU and should be included regardless.  We are going to keep looking for a way to measure the actual benefits of GPU Boost on our own, although it will take some time.

Another question you surely have is "how much will the clock boost?"  The GTX 680 is marketed with two clock speeds: the "base clock" and the "boost clock," with the base clock being exactly what you are used to.  The base clock is the speed at which the GPU will run in the worst case: a game or application that utilizes the GPU to its fullest.  The boost clock is the GPU clock while running a "typical game," though what counts as a typical game isn't really defined.  Because there is such a wide array of PC games on the market, and because the clock can change from scene to scene within the same game, it will be hard to say with any certainty what speed you will be running at without checking for yourself.
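
For a rough sense of scale, the GTX 680's advertised clocks are a 1006 MHz base and a 1058 MHz boost clock, which works out to about 5% of advertised headroom (observed clocks can run higher than the boost clock).  A trivial calculation:

BASE_CLOCK_MHZ = 1006    # advertised GTX 680 base clock
BOOST_CLOCK_MHZ = 1058   # advertised GTX 680 boost clock
print(round((BOOST_CLOCK_MHZ / BASE_CLOCK_MHZ - 1) * 100, 1), "%")   # -> 5.2 %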

This also means that performance will vary from user to user in several ways.  If your GPU is in an open test bed versus a poorly cooled chassis, for example, your GPU clock could swing one way or the other.  Also, due to manufacturing tolerances, two different GTX 680s could perform differently in the exact same environment.  While we have only been able to test a single GPU ourselves, and thus can't really judge the accuracy of the claim, NVIDIA says there will only be a 1-2% variation from card to card.  Will consumers complain if their specific card runs slower than their friends'?  We aren't sure yet, but we will be finding out very soon.

More examples of NVIDIA's GPU Boost at work are available on our overclocking page.

May 3, 2012 | 09:45 AM - Posted by Asmodyus (not verified)

Am I the only one out there who thinks this card is too expensive? This is what's not making sense to me: the GTX 580 was $599 when it came out, the GTX 590 was $699-799 when it came out, the GTX 680 is $499, and the GTX 690 is $1000.

I am sorry, but that cost is way too high for a single video card, regardless of whether it's two chips. I hope NVIDIA falls flat on their face. But as they say, there are a lot of stupid people who will buy this card at that price.

May 3, 2012 | 10:44 AM - Posted by Bro (not verified)

Blame AMD for it. They are late with HD 7990.

May 3, 2012 | 08:18 PM - Posted by Pellervo (not verified)

The reason the price is so high is because it IS what they say it is: the fastest card on the planet. Let me put it simply: the 680 is a $500K Ferrari; the 690 is a million-dollar F1 car. The 690 is for those enthusiasts with the wallet for it.

May 8, 2012 | 12:42 AM - Posted by Mr King

I think with the 580 and 590, the 590's performance didn't match that of two 580s in SLI while the 690 does match two 680s in SLI. I think from that perspective the price makes some sense. I'm not happy to see the return of the thousand dollar video card, but there are other options too.

The Mars 2, though, was considerably more card than two 580s, yet it wasn't twice the price. Then there is the whole issue of it being impossible to buy a Mars card.

May 9, 2012 | 01:35 AM - Posted by Branthog

I don't see what's wrong with a dual-GPU card being $1,000 if it's okay for a single-GPU card to be $500. Does that mean you think two single-GPU cards for $500 each is okay, but stuffing them onto a single $1,000 card is just "too much"?

Yeah, it's overpriced, but exactly what kind of price point would you expect them to hit with the different offerings they have right now, without severely undercutting the competition unnecessarily?

Besides, chances are prices will be lower in a few months.

June 25, 2012 | 12:37 AM - Posted by Anonymous (not verified)

No, you're the only one out there that is publicly complaining about the price tag of this card and how broke you are! Lol... calling people idiots that can afford to burn their $$ on this card? Really? I have this card and it smashes every game I load on it with all the settings set to ultra, and while my eyes can only pick up 30 fps, believe you me, the gameplay is excellent. Crysis with all settings set to maximum and I'm not getting any tearing or stutter; very smooth gameplay and, in my book, well worth the buy. I'll be looking at buying a second card next month. Why? Because I run 3 monitors and I will be playing 3D Surround, but the other reason is because I can, not because I'm an idiot.

December 21, 2012 | 11:23 AM - Posted by Anonymous (not verified)

Very awesome.

But every time messages like these pop up, my first thought is: "you wasted a bucketload of cash."

Of course it's entirely up to you, but the funny thing is that in the price segment of $500 and up, you are spending tons of money on a few percentage points of performance gain.

Don't kid yourself.

May 3, 2012 | 10:49 AM - Posted by D1RTYD1Z619

After seeing this review I'd rather wait for GTX 680 prices to come down and then SLI them. I have one already and am not having issues running any of my games at their highest settings at my current monitor's resolution (1600x1200).

May 3, 2012 | 07:54 PM - Posted by AParsh335i (not verified)

You should honestly get a new monitor; you aren't taking advantage of the 680.

May 5, 2012 | 02:37 PM - Posted by D1RTYD1Z619

But keeping this resolution will extend the life of my card because it will have to push fewer pixels.

May 6, 2012 | 02:41 PM - Posted by Ewe (not verified)

If you wait any longer, you will likely never get a 690 GTX. I highly doubt they will make many of these cards. You can't get a 590 anymore, that's for sure.

May 20, 2013 | 09:42 PM - Posted by battlewuss (not verified)

I got my GTX 590 two weeks ago :p

May 17, 2012 | 07:33 PM - Posted by pat (not verified)

You all seem to be forgetting that the 690 also has twice the VRAM of a single 680. This will matter if you plan on playing with 3 screens. I have three GTX 570 2.5GB cards, and Battlefield 3 is using just under 2GB of VRAM with the game not even maxed out. FYI, two 680s with 2GB of VRAM do not add up to 4GB of VRAM in SLI; the system can only utilize the VRAM on one card. So there you have it: the 690, in my opinion, is actually worth more than two 680s due to the 4GB of VRAM.

June 12, 2012 | 01:30 AM - Posted by Anonymous (not verified)

It's the same deal on the 690. In the case of normal SLI, the textures need to be loaded into the memory of both cards, and thus the memory is not added together. The same is true for the 690: it has 2GB per core, and those same textures need to be loaded into both, so the memory per core is exactly the same. In the case of these dual-GPU cards, you always halve the memory it says on the box to get an accurate reading of the amount of memory that can actually be used.

May 3, 2012 | 11:36 AM - Posted by Anonymous (not verified)

Would there be any difference between SLI 680s and a single 690? I would think that although the 680s will take more space, the dedicated slots would improve bandwidth capability (or maybe not, because of the SLI adapter?).

May 3, 2012 | 11:47 AM - Posted by Tim K (not verified)

I am also very curious about the actual pros and cons, performance-wise, when comparing this card to two 680s in SLI. I am not savvy on multi-card setups, but on the surface I see the benefits of only using a two-slot form factor and less power with the 690, though I worry about a single point of failure in a very complex and hot-running component. I would love to know if there are any performance advantages gained by having it all on one PCB vs. two cards in SLI.

May 3, 2012 | 01:38 PM - Posted by Jeremy Hellstrom

I thought someone might ask that - http://www.pcper.com/news/Graphics-Cards/Why-would-you-want-buy-GTX690

May 4, 2012 | 01:21 AM - Posted by Bill (not verified)

The GTX 690 does make better sense than a 680 SLI setup if you are buying right now. If I were in the market though, I'd simply buy a 680, wait a year for the prices to come down, and then buy another. Yeah, I know, you're not getting the insane frame rates right now, but seriously, if you need something faster than a 680 at this time...let's just say I'd love to see the monitor setup!

May 9, 2012 | 01:36 AM - Posted by Branthog

Unfortunately, I have seen benchmarking that suggests quad SLI (dual 690s) actually performs worse than triple SLI or possibly even dual SLI.

The benching was not entirely conclusive as to whether this was a driver limitation, application (game) limitation, or a combination of the two.

December 21, 2012 | 11:28 AM - Posted by Anonymous (not verified)

Tri and quad SLI are well known to scale badly, so this is nothing new.

The best scaling is achieved with dual SLI. Above that, consider upgrading your card a notch instead of adding a third or fourth.

That's before even getting into the issues concerning heat dissipation and power supply. For those last two the 690 offers a (partial) solution, but on the primary point, the 690 changes nothing about the scalability of quad SLI.

May 3, 2012 | 02:48 PM - Posted by flatland (not verified)

Bjorn3D did a comparison between two GTX 680s and the GTX 690.

May 5, 2012 | 12:52 PM - Posted by Ryan Shrout

Well...so did we...?

May 3, 2012 | 06:46 PM - Posted by SirPauly (not verified)

Ryan,

In your 3D Vision performance investigations with the GTX 690, how did you get those high averages when V-sync is enabled with 3D Vision?

May 4, 2012 | 01:48 AM - Posted by Anonymous (not verified)

For anyone thinking of going SLI: always go with the single-GPU option. I have SLI 570s right now and it's beautiful when it works, but a LOT of games don't properly support it and never will. Until the technology gets better, go for a better single GPU, IMO. Maybe not one that costs $1K, but still.

December 21, 2012 | 11:31 AM - Posted by Anonymous (not verified)

I have personally yet to discover a game that would actually benefit from SLI but doesn't support SLI setups.

Sure, your retro game collection isn't supported, but who needs an SLI setup for that?

Far Cry 3 was playable in SLI from day one, and I had zero issues with the 310.70/64 drivers. 80-110 fps steady with all possible candy on the screen. And this is with dual GTX 660s, even.

In the case of my dual GTX 660 setup, I have a GTX 680 equivalent for 120 euros less than the cost of a 680. Done deal.

May 4, 2012 | 11:39 AM - Posted by Anonymous (not verified)

Resolution is the key to what you purchase for graphics cards. If you do not play at 2560 or higher, then you are wasting your money on this card. As I play at 5760x1080, I am very interested, as my current 2 x 580 Classifieds actually draw enough power to trip my breaker if my wife turns on the bathroom lights. I DO wish they had more VRAM on the cards, but for now it works just fine.

May 4, 2012 | 11:46 AM - Posted by I'm Hit! (not verified)

Compared to two-way SLI 680s, the two-way setup beats the 690 by about 2-4%. But the 690 runs quieter and cooler and draws about 30-50 watts less power.

You will not notice that minor percentage loss in games. Get one from a reputable board partner and just RMA your 690 if one or both GPUs on the card fail.

I am going from a 580 Classified to the 690 for my 30-inch monitor at 2560 x 1600 resolution.

May 5, 2012 | 12:50 PM - Posted by Ryan Shrout

You are a graphics monster sir!

December 21, 2012 | 11:34 AM - Posted by Anonymous (not verified)

I would also reckon that the 690 has less noticeable frame drops or micro-stutter than dual 680s, no?

Even though that is currently hardly an issue, it still doesn't look nice in benchmarks :)
