
NVIDIA GeForce GTX 690 Review - Dual GK104 Kepler Greatness

Author: Ryan Shrout
Manufacturer: NVIDIA

Kepler Features: GPU Boost

The new GTX 690 gets the same feature set as the GTX 680 that launched last month, so it seemed pertinent to revisit some of those features here.  This information is taken from our original GTX 680 review.

One of the most interesting, and likely to be the most controversial, new features on the GeForce GTX 680 and the Kepler GPU is called GPU Boost.  You should think of GPU Boost as the GPU equivalent of Intel's Turbo Boost technology as it allows the clock speeds of the GTX 680 to increase when there is thermal headroom for it.


Before going over what GPU Boost does, it is important to have some background on how clock speeds are selected on GPUs today.  In general, GPU clocks have been set to the worst-case scenario so that a vendor or manufacturer could guarantee stability of the part in all games.  If one game or application (like the 3DMark 11 example in NVIDIA's slides) uses 160 watts of power at 1000 MHz, and that was generally considered the worst case by the GPU designer, then the clock would be set there.

Last year both AMD and NVIDIA started to change this by limiting the amount of power the GPU could draw, effectively taking applications like Furmark and OCCT out of the mix.  AMD did this in hardware and NVIDIA with a driver-level cap, but both vendors saw the benefit of removing the Furmark outlier from their decision process when setting clock speeds.

Games use less power than Furmark, OCCT and even 3DMark 11, and some games use much less power than others.  NVIDIA's slide shows a theoretical clock speed of 1.0 GHz, a voltage of 1.0v and power consumption numbers for three applications: 3DMark 11, Crysis 2 and Battlefield 3.  Using 160 watts as the target power consumption for a specific card, these would be the clock speeds set by NVIDIA without GPU Boost technology.


With a clock speed of 1.05 GHz and a voltage of 1.1v, you can see that power consumption changes (as would performance).  3DMark 11 is now outside the 160 watt limit NVIDIA wanted to set, though Crysis 2 and BF3 are still within safe ranges.


At 1.1 GHz and 1.2v, this imaginary card is now outside the 160 watt limit in both Crysis 2 and 3DMark 11 but is still within spec on BF3. 
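To make the arithmetic behind these slides concrete, here is a minimal sketch in Python.  The power-scales-with-frequency-times-voltage-squared rule and the per-application baseline wattages are my own illustrative assumptions, chosen so the numbers track the slide narrative; they are not NVIDIA's model or measured figures.

```python
# Illustrative only: assumes P ~ f * V^2 and made-up per-app baselines at
# 1.0 GHz / 1.0 V. Not NVIDIA's actual power model or measured data.

POWER_TARGET_W = 160.0  # the power budget used in the example above

# Hypothetical power draw of each application at 1.0 GHz and 1.0 V
BASELINE_W = {"3DMark 11": 160.0, "Crysis 2": 120.0, "Battlefield 3": 95.0}

def estimated_power(base_watts, clock_ghz, volts):
    """Rough CMOS-style scaling: linear in clock, quadratic in voltage."""
    return base_watts * (clock_ghz / 1.0) * (volts / 1.0) ** 2

for clock, volts in [(1.0, 1.0), (1.05, 1.1), (1.1, 1.2)]:
    print(f"{clock:.2f} GHz @ {volts:.2f} V")
    for app, base in BASELINE_W.items():
        watts = estimated_power(base, clock, volts)
        status = "within" if watts <= POWER_TARGET_W else "over"
        print(f"  {app:<14} ~{watts:5.1f} W  ({status} the {POWER_TARGET_W:.0f} W target)")
```

Running it shows the heavy benchmark blowing past the 160 watt target as soon as clock and voltage rise, while Battlefield 3 stays under budget even at 1.1 GHz and 1.2v; that unused headroom is exactly what GPU Boost is designed to exploit.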

The goal of GPU Boost is to let the GPU's clock speed scale based on the application being used, stay within the power consumption tolerance specified by the manufacturer, and run stably without user intervention.


To accomplish this, the Kepler GPU and graphics card need to monitor power consumption, temperature and GPU utilization, and then use that data to alter clock speeds and voltages on the fly.  NVIDIA is keeping very quiet about the specific algorithm at work here, but the idea is simple enough for users to understand: if there is thermal headroom and the GPU is nearly fully utilized, increase clock and voltage until you are close to the specified limits.

This all occurs fairly quickly, with effectively instantaneous polling and a roughly 100 ms response time on the clocks and voltages.
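As a rough illustration of that kind of feedback loop, here is a simplified, hypothetical sketch.  NVIDIA has not published the real algorithm; the sensor function, thresholds, power limit, boost ceiling and step size below are placeholders for illustration (only the 1006 MHz GTX 680 base clock is a real spec), so treat this as a conceptual outline rather than the actual implementation.

```python
import time

# Hypothetical boost loop: sample power, temperature and utilization, then
# nudge the clock up while there is headroom and back down near a limit.
POWER_LIMIT_W = 170.0        # assumed board power target
TEMP_LIMIT_C = 98.0          # assumed thermal ceiling
BASE_CLOCK_MHZ = 1006.0      # GTX 680 base clock
BOOST_CEILING_MHZ = 1110.0   # arbitrary upper bound for the sketch
CLOCK_STEP_MHZ = 13.0        # one clock step per adjustment (assumed)

def read_sensors():
    """Placeholder: a real implementation would read board telemetry."""
    return {"power_w": 150.0, "temp_c": 70.0, "gpu_util": 0.97}

clock_mhz = BASE_CLOCK_MHZ
for _ in range(50):                          # bounded loop for the demo
    s = read_sensors()
    headroom = (s["power_w"] < POWER_LIMIT_W * 0.95 and
                s["temp_c"] < TEMP_LIMIT_C - 5)
    if headroom and s["gpu_util"] > 0.90 and clock_mhz < BOOST_CEILING_MHZ:
        clock_mhz = min(BOOST_CEILING_MHZ, clock_mhz + CLOCK_STEP_MHZ)   # boost
    elif s["power_w"] > POWER_LIMIT_W or s["temp_c"] > TEMP_LIMIT_C:
        clock_mhz = max(BASE_CLOCK_MHZ, clock_mhz - CLOCK_STEP_MHZ)      # back off
    time.sleep(0.1)                          # mirrors the ~100 ms response time

print(f"settled at ~{clock_mhz:.0f} MHz")
```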


Here is a different view of the GPU Boost feature.  With power on the Y axis and a series of unnamed games on the X axis, you can see that the single game that utilizes the GPU most fully, and thus consumes the most power, is typically what sets the base clock speed.  All the way on the left, however, is another game that falls well under the power limit and could be running at higher speeds.


Each of these titles has a different amount of power headroom to expand into, but the goal of GPU Boost is to get as close to the thermal limits of the card as possible to enable the best GPU performance.


With the Y axis now showing clock speed rather than power, you can see that in each game the GPU clock rises some amount over the base clock set by the game on the far right, a capability made possible by GPU Boost's ability to raise clocks on the fly.


There are lots of questions that go along with this new technology, including whether users will have the ability to disable the feature.  NVIDIA has decided not to allow it, simply stating that "this is the way the card works now."  The obvious fear is that reviewers and gamers would turn the technology off to get an "apples to apples" comparison of the GTX 680 to the HD 7970.  NVIDIA's stance is that GPU Boost is part of the GPU and should be included regardless.  We are going to keep looking for a way to measure the actual benefits of GPU Boost on our own, although that will take some time.

Another question you surely have is "how much will the clock boost?"  The GTX 680 will be marketed with two clock speeds: the "base clock" and the "boost clock," with the base clock being exactly what you are used to.  The base clock is the speed at which the GPU will run in the worst-case scenario, a game or application that utilizes the GPU to its fullest.  The boost clock is the GPU clock while running a "typical game," though what a typical game is isn't really defined.  Because there is such a wide array of PC games on the market, and because the clock can change even from scene to scene within the same game, it will be hard to say with any certainty what speed you will be running at without looking for yourself.
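One way to look for yourself is to poll the driver while a game is running.  The sketch below shells out to nvidia-smi, whose query fields (clocks.gr, power.draw, temperature.gpu, utilization.gpu) exist in current drivers; whether a 2012-era driver exposes all of them on a GTX 680 or 690 is an assumption on my part, so treat this as a starting point rather than a guaranteed recipe.

```python
import subprocess
import time

# Poll the current graphics clock, power draw, temperature and utilization
# once per second while a game runs, to watch GPU Boost move the clock around.
QUERY = "clocks.gr,power.draw,temperature.gpu,utilization.gpu"

for _ in range(30):  # sample for roughly 30 seconds
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)       # example line: "1058 MHz, 152.30 W, 71, 99 %"
    time.sleep(1.0)
```

Watching the graphics clock wander as scenes change is the most direct way to see GPU Boost, and the user-to-user variance described below, in action.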

This also means that performance will vary from user to user in several ways.  If your GPU is in an open test bed versus a poorly cooled chassis, for example, your GPU clock could swing one way or the other.  Also, because of manufacturing tolerances, two different GTX 680s could perform differently in exactly the same environment.  While we have only been able to test a single GPU ourselves, and thus can't really judge the accuracy of the claim, NVIDIA says there will only be a 1-2% variation from user to user.  Will consumers complain if their specific card runs slower than their friends'?  We aren't sure yet, but I know we will be finding out very soon.

More examples of NVIDIA's GPU Boost at work are available on our overclocking page.

May 4, 2012 | 12:17 PM - Posted by AParsh335i (not verified)

Wow. I'm surprised no one mentioned this yet - Newegg is already sold out of the GTX 690 and they marked it up $200! I've posted a few times in response to people complaining about the $999.99 price tag on this GTX 690, saying I think it's a great price considering that GTX 680s are currently selling for $550-650 retail. Of course, now Newegg is going at $1199.99 with the 690. Such is supply & demand.

May 5, 2012 | 12:40 PM - Posted by Ryan Shrout

Yeah, the price hikes are a pain in the ass and kind of hit NVIDIA in the face, but there is really nothing they can do about it without getting into some other legal issues.

May 5, 2012 | 01:36 AM - Posted by nabokovfan87

Why on earth are there 6990 results, but no single 7970 results? Why are there 7970 CF results, but not a single 7970?

Very strange Ryan.

May 5, 2012 | 12:49 PM - Posted by Ryan Shrout

For our single card comparison graphs, there are only four spots. The GTX 680 is the faster of the two current generation single-GPU solutions, so it seemed more relevant to include IT rather than the HD 7970. If you are still curious about how the HD 7970 compares to the GTX 680, I recommend checking out this article: http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-680-2GB-G...

Or even this: http://www.pcper.com/reviews/Graphics-Cards/Galaxy-GeForce-GTX-680-2GB-G...

May 6, 2012 | 07:49 PM - Posted by jewie27 (not verified)

Finally, thanks Ryan for posting 3D Vision performance.

May 6, 2012 | 07:53 PM - Posted by jewie27 (not verified)

The only problem is, how does it show 3D FPS over 60 fps? 3D Vision has an FPS cap of 60.

May 6, 2012 | 11:25 PM - Posted by Thedarklord

Great review PCPer!

One thing I'd like to see though: with NVIDIA's Adaptive V-Sync and/or other Kepler/driver tricks, does the GTX 690 exhibit micro-stutter?

BTW I am SO happy that NVIDIA is really taking on the V-Sync problems (Yay Kepler!), 'cause that's one of the most annoying artifacts, that and micro-stutter for SLI setups. ^.^

May 7, 2012 | 05:46 AM - Posted by ThorAxe

Whether I get two GTX 680s or a GTX 690 will solely depend on the price. If the price is the same I will most likely go for the 690 as it is beautiful and quiet.

May 12, 2012 | 08:41 AM - Posted by Pete (not verified)

Don't forget about your mobo. I've read you need to be running PCI-E 3.0 x16 to get the full benefit of the 690.

Thanks for the awesome benchmarks PCPer! Been looking for these everywhere. I just bought x3 Asus 27in 3d monitors. I have one 680 and another waiting on order......

Thinking about upgrading to 670 tri-sli, but that would mean mobo upgrade:(

May 28, 2012 | 10:04 AM - Posted by Anonymous (not verified)

Well, the fact that it is overpriced is still a fact. But if you can handle money and save properly, you can actually afford the card. Don't spend money on crap, and learn to save!

June 23, 2012 | 09:53 PM - Posted by Anonymous (not verified)

If you use one card and have a 3-monitor 3D display, would it have very low FPS?

July 14, 2012 | 08:59 AM - Posted by Knowsmorethanyou (not verified)

Look, nobody at all. NOBODY needs to buy the top of the line cards. You know why? I've got an AMD Radeon 6790, and I will have that card for 4 years. After those 4 years, yes, it might go down in performance a little bit. But that's the exact reason CrossFire was invented: then you just get ANOTHER 6790, which in 4 years will be half of what you purchased it for.

I know from personal studies that ALL top of the line cards over 900 frigging dollars are 150% pointless to buy. There are NO games out there TODAY in the 21st century that NEED that type of card. Therefore YES, it (MIGHT) outperform cards IF there were games that required that card. BUT the only highly intense game at the moment that requires maximum rendering for its physics and a glowing picture for the best performance and gameplay experience... is Battlefield 3.

Now many can whine and moan: "Pff, it's Crysis 2." It's not. I played Crysis on high settings with my 6-year-old Radeon 5450, and I still have it to this day. Don't tell me it's the most stressful game; it hasn't been proven. Since Battlefield 3 came out it has won countless awards, many given for its intense, realistic gameplay: Destruction 2.0 as many like to call it, or "Physics 2.0" more truthfully. I'll tell you now, try playing the office maps in Close Quarters and shoot a .50 cal down the cubicles, and tell me those physics don't require rendering up the ass.

Spending 900 dollars on a graphics card that will be useless in 10 years and useful in 50 when Battlefield 4000 or whatever comes out is absolutely stupid. Until gamers get the knowledge in their heads correctly, they'll never learn, and big companies will continue to make cheap products for big bucks that people can brag about and become smug over.

Sorry to all the people buying 6990s and GTX 690 cards, but my AMD Radeon 6790 can outperform any of those cards.

Remember... it matters what MODEL, it matters what YEAR, and it matters if you give a shit to clean it. Take care of something and you won't need to buy new crap for your computer every 3 years because "it got screwed up with dust residue," or the fan motor died, or the circuit board fried, or you didn't wear an anti-static strap to fix your computer, or your computer keeps shutting down from overheating. No, your computer shuts down because you don't have enough power distribution cycling through your computer. Multiple rails are bad, people... Do the calculations of how many watts your computer needs, as well as amps. You'll find it's not too hard, and you really don't need to buy super powerhouse 1000 watt power supplies.

August 17, 2012 | 04:20 AM - Posted by Anonymous (not verified)

I get a black screen with this card on Deus Ex: HR and it forces me to reboot my PC! Same with Saints Row 3.

August 11, 2013 | 09:54 PM - Posted by cybervang (not verified)

I bought a GeForce GTX 650 and it was the worst video card I've ever owned. It crashed left and right with frame errors and NOT DETECTED boots. My last card was a Radeon with 0 issues, and I always had to resort back to it when my GTX wanted to take a break.

My next purchase will be another Radeon upgrade. I'm done with Nvidia.

Fast but BUGGY.

March 16, 2014 | 01:21 PM - Posted by Anonymous (not verified)

Another thing is games: not all games out there will let you play with 2 graphics cards. The 690 is one graphics card with 2 GPUs, thus faking out the games and allowing you to play them, which is the main reason I bought one with 3 monitors set to 1920 x 1080.

June 7, 2014 | 02:13 AM - Posted by Spinelli (not verified)

THERE IS NO 50% PERFORMANCE HIT HERE. THERE IS HARDLY ANY PERFORMANCE HIT. FRAMES-PER-SECOND ARE REPORTED AS PER EYE WHILE IN 3D MODE.

YOU NEED TO DOUBLE THE REPORTED FRAMERATE WHEN IN 3D MODE, AND ONLY THEN COMPARE IT TO THE 2D FRAMERATE. 3D VISION IS NOT MUCH OF A FRAMERATE KILLER, IF AT ALL.

FOR EXAMPLE, IF YOUR GAME RAN AT 120 FPS IN 2D MODE, BUT THEN REPORTS 60 FPS IN 3D MODE, THEN THERE IS NO PERFORMANCE HIT AT ALL BECAUSE IT IS 60 FPS PER EYE WHICH EQUALS TO 120 FPS. THE GAME IS STILL ACTUALLY RUNNING AT 120 FPS, AND THAT IS WHY THE FLUIDITY OF 60 FPS IN 3D MODE IS IDENTICAL TO THE FLUIDITY OF THE GRAPHICS AT 120 FPS IN 2D MODE.

YEARS LATER AND SO MANY PEOPLE, INCLUDING "PROFESSIONAL REVIEW SITES", CONTINUE NOT UNDERSTANDING THIS, AND THEY THEREFORE KEEP GIVING NVIDIA 3D VISION SUCH A BAD NAME AS THEY FALSELY/INCORRECTLY CLAIM THAT NVIDIA 3D VISION KILLS/HALVES/SERIOUSLY DROPS FRAMERATES, WHICH IS SIMPLY NOT TRUE AND COMPLETELY SHOWS THEIR MISUNDERSTANDING OF THE TECHNOLOGY.
