NVIDIA GeForce GTX TITAN Preview - GK110, GPU Boost 2.0, Overclocking and GPGPU

Author: Ryan Shrout
Manufacturer: NVIDIA

GPU Boost 2.0

When the GTX 680 launched with the first version of GPU Boost, most people saw it as a great feature that let the clock speed of the GPU scale up when power headroom allowed.  It was elegant in action but very difficult to explain without seeing it demonstrated.  For that reason, we had NVIDIA's Tom Petersen come out to our offices to do a live stream and explain it.  We repeated this for the GTX 690 and a few other launches as well.  (Note: we are doing this same live event on the launch day of the GTX TITAN - Thursday, February 21st - so be sure to join us!)

With TITAN, NVIDIA is releasing an updated version of GPU Boost that it claims will enable even higher clocks on the GPU than the original technology allowed.  This time the key measurement isn't power but temperature.

GPU Boost 1.0

In the first version of GPU Boost there was (and still is) a maximum clock set in the firmware of the card that no amount of tweaking will let you breach.  While this number isn't advertised or easily found on a spec sheet, it is the maximum level NVIDIA will let the chip run at to avoid endangering the silicon with a combination of high voltage and high temperature.

GPU Boost 2.0

This updated version of GPU Boost can raise the maximum clock rate because the voltage level is governed by a known, easily measured data point: temperature.  By preventing the combination of high voltage and high temperature that might break a chip, NVIDIA can raise the voltage on a chip-by-chip basis and improve the overall performance of the card in most cases.
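
To make that behavior concrete, here is a rough sketch of what a temperature-target boost loop could look like.  This is purely illustrative, not NVIDIA's actual algorithm, and every constant in it is made up for the example:

    # Purely illustrative sketch of a temperature-target boost loop.
    # GPU Boost 2.0's real control logic is NVIDIA's and is not public;
    # all constants below are invented for this example.

    BASE_CLOCK_MHZ = 800      # made-up base clock
    MAX_BOOST_MHZ = 1000      # made-up firmware ceiling that cannot be exceeded
    TEMP_TARGET_C = 80        # the default target described in this preview
    STEP_MHZ = 13             # one hypothetical "boost bin"

    def next_clock(current_mhz, gpu_temp_c, temp_target_c=TEMP_TARGET_C):
        """Step the clock up while under the temperature target, back off above it."""
        if gpu_temp_c < temp_target_c and current_mhz + STEP_MHZ <= MAX_BOOST_MHZ:
            return current_mhz + STEP_MHZ
        if gpu_temp_c > temp_target_c and current_mhz - STEP_MHZ >= BASE_CLOCK_MHZ:
            return current_mhz - STEP_MHZ
        return current_mhz

Raising temp_target_c in a sketch like this simply gives the loop more headroom before it starts stepping clocks back down, which is exactly the knob NVIDIA now exposes to the user.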

Even better: NVIDIA is once again going to allow users to exceed the base voltage, up to another unpublished maximum.  This marks the return of overvolting (at least on the TITAN), but when asked, NVIDIA said this feature would not be re-enabled on the GTX 600-series.

For those who are curious, NVIDIA gave us some more information on how overvolting will work.  It must be disabled by default, and the user has to acknowledge a warning message before enabling it in software.  Once enabled, the setting can persist across reboots of the PC and the maximum voltage of the GPU can then be adjusted by the end user.  The feature is completely optional, though, and it will be up to the card vendors to enable it.  I would think that finding a GTX TITAN without overvolting support will be nearly impossible, but as future cards ship with GPU Boost 2.0 technology it is something we'll keep an eye on.
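
As a rough way to picture that opt-in flow, here is a sketch - none of these names map to a real NVIDIA or board-partner API, and the voltage cap is invented:

    # Hypothetical illustration of the opt-in overvolting flow described above;
    # this is not a real NVIDIA or vendor API.

    class OvervoltControl:
        def __init__(self, vendor_enabled):
            self.vendor_enabled = vendor_enabled    # board partner must opt in
            self.user_acknowledged = False          # disabled by default
            self.persist_across_reboots = False

        def acknowledge_warning(self, accepted, persist=False):
            """The user must explicitly accept the risk warning before anything unlocks."""
            self.user_acknowledged = accepted
            self.persist_across_reboots = persist   # can optionally survive a reboot

        def set_max_voltage_mv(self, requested_mv, firmware_cap_mv=1200):
            if not (self.vendor_enabled and self.user_acknowledged):
                raise PermissionError("Overvolting stays locked until the vendor enables "
                                      "it and the user accepts the warning.")
            return min(requested_mv, firmware_cap_mv)   # still clamped to a hard maximum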

With these changes NVIDIA has shifted the bell curve of the frequency distribution toward higher clocks.  By overvolting the chip you can push the maximum clock a bit further to the right still.

The shift in focus to GPU temperature with version 2.0 of this feature means the GPU is far more likely to sit right at the target temperature than it was before.  The above graph shows GPU Boost 2.0 temperatures in yellow and GPU Boost 1.0 temperatures in grey.  You can clearly see that the curve has tightened around the 80C mark (the default setting) and that the GPU spends much less time both BELOW and ABOVE 80C.

Just like you could adjust the power target to overclock your GTX 600-series card, you can now adjust the temperature target with GTX TITAN to allow it to increase clocks and voltages. 

What does a 10C bump in your target temperature mean to your clock speeds?  That. 

NVIDIA is very conscious of TITAN's noise levels and wants to make sure they don't get out of hand the way they can on some HD 7970 cards.  In our testing TITAN runs nearly silently until it gets near the 80C mark, at which point the fan speed begins to ramp up.

Adjusting the temperature offset doesn't automatically change the fan schedules but...

...you can manually offset the fan curve as well to match any other setting changes.
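
One way to think about that manual offset is as sliding a simple temperature-to-fan-speed table to the right.  The sketch below is only an illustration with invented points, not TITAN's actual fan table:

    # Illustrative fan curve with a user-applied offset; the points are invented.

    FAN_CURVE = [(30, 20), (60, 30), (80, 45), (90, 85)]   # (temp C, fan duty %)

    def fan_duty(temp_c, temp_offset_c=0):
        """Shift the curve right by temp_offset_c so the fan ramps later,
        matching a raised temperature target."""
        shifted = [(t + temp_offset_c, duty) for t, duty in FAN_CURVE]
        for (t0, d0), (t1, d1) in zip(shifted, shifted[1:]):
            if t0 <= temp_c <= t1:
                return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)   # linear interpolation
        return shifted[0][1] if temp_c < shifted[0][0] else shifted[-1][1]

    print(fan_duty(80))      # default curve
    print(fan_duty(80, 10))  # same temperature, curve shifted by +10C ramps the fan less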

This new version of GPU Boost definitely seems more in line with the original goals of the technology, but there are some interesting caveats.  First, you'll quickly find that TITAN's clock speeds start out higher on a "cold" GPU and then ramp down as the temperature of the die increases.  This means that quick performance checks using 3DMark or short game launches will produce measurements that are higher than they would be after 5-10 minutes of gaming.  As a result, our testing of TITAN required us to "warm up" the GPU for a few minutes before every benchmark run.
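
For anyone replicating that methodology, the idea is simply to run the workload untimed until the die reaches its steady-state temperature and only then record results.  A minimal sketch, where run_benchmark_pass and read_gpu_temp are stand-ins for whatever benchmark and monitoring tools you actually use:

    import time

    WARM_UP_SECONDS = 300   # roughly the 5-10 minute window noted above

    def benchmark_with_warm_up(run_benchmark_pass, read_gpu_temp):
        """Run untimed passes until the GPU reaches steady-state temperature,
        then record the pass you actually report. Both callables are
        placeholders for real benchmark and monitoring tools."""
        start = time.time()
        while time.time() - start < WARM_UP_SECONDS:
            run_benchmark_pass()                  # discard these results
        steady_temp = read_gpu_temp()
        result = run_benchmark_pass()             # the measured pass
        return result, steady_temp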

Also, defaulting the GPU to an 80C target seems more than a bit low for our tastes; buyers of a $999 graphics card are likely more interested in performance than in slightly less noise from the cooler's fan.  It seems like NVIDIA is straddling the line between wanting a higher-performing GPU and being able to say it has the quietest GPU by X percent.  If NVIDIA were willing to ship a slightly louder configuration out of the box with GK110, how much higher would the average clocks be?

As I spend more time with this hardware and the new GPU Boost 2.0, I am sure we'll find out more.

February 19, 2013 | 06:57 AM - Posted by Kyle (not verified)

Yeah, but will it run Crysis?

February 19, 2013 | 07:01 AM - Posted by Ryan Shrout

We'll find out on Thursday.  :)

February 20, 2013 | 12:49 PM - Posted by Anonymous (not verified)

no

February 22, 2013 | 04:07 AM - Posted by Anonymous (not verified)

Actually, the better question is: will it blend?

February 19, 2013 | 07:03 AM - Posted by I don't have a name. (not verified)

Crysis 3, more like. :) Can't wait to see the performance results!

February 19, 2013 | 07:05 AM - Posted by I don't have a name. (not verified)

Might the GTX Titan review be the first to use your new frame-rating performance metric? Can't wait! :)

February 19, 2013 | 07:06 AM - Posted by Ryan Shrout

I can tell you right now that it won't be the full version of our Frame Rating system, but we'll have some new inclusions in it for sure.

February 19, 2013 | 07:21 AM - Posted by stephenC (not verified)

I am really looking forward to seeing how this card will perform with crysis 3

February 19, 2013 | 07:40 AM - Posted by Anonymous (not verified)

I think I'll get one. I have no need for it, but it looks cool. So, from someone who knows very little about computers or gaming, I take it this would be "overkill" to have my buddy build me a new system around this?

February 19, 2013 | 07:43 AM - Posted by Prodeous (not verified)

Hope to see some GPGPU testing and comparison to the 680. I can assist with Blender 3D as it supports GPGPU rendering on CUDA cards if you guys want. Need to find out if I need to start putting coins away for another purchase ;)

February 19, 2013 | 08:31 AM - Posted by Ryan Shrout

Point me in the right direction!

February 19, 2013 | 07:16 PM - Posted by Riley (not verified)

+1 for blender render times! Go to blender.org to download blender (looks like it is down as I post this, probably for v2.66 release).

blendswap.com could be used to find stuff to render. That way the files would be publicly available to anyone who wanted to compare your times to their own. Just make sure the files use the cycles rendering engine (not blender internal).

Instructions at http://www.blenderguru.com/4-easy-ways-to-speed-up-cycles/ can be followed to swap between CPU and GPU rendering.

February 20, 2013 | 02:22 PM - Posted by Gadgety

I'm looking to replace three GTX 580 3GB cards, mainly for rendering. With 6GB of memory and the massive number of CUDA cores I'm really looking forward to Blender Cycles speeds, but also Octane and V-Ray RT. This must be one of the usage scenarios for the Titan, rather than pure gaming.

February 21, 2013 | 02:49 PM - Posted by Prodeous (not verified)

Download blender from:
http://www.blender.org/download/get-blender/

Windows 64bit install is the simplest.

The most common test for GPGPU rendering is to use the link below. "The unofficial Blender Cycles benchmark"
http://blenderartists.org/forum/showthread.php?239480-2-61-Cycles-render...

And then use the information provided by Riley:

http://www.blenderguru.com/4-easy-ways-to-speed-up-cycles/
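
For example, a timed render can be kicked off from the command line with something along these lines (the scene filename is a placeholder for whichever benchmark .blend you download; only Blender's standard -b/-o/-f flags are assumed):

    import subprocess, time

    BLENDER = "blender"            # path to the Blender executable
    SCENE = "benchmark.blend"      # placeholder for the downloaded benchmark scene

    # -b renders headless, -o sets the output prefix, -f 1 renders frame 1.
    start = time.time()
    subprocess.run([BLENDER, "-b", SCENE, "-o", "//render_", "-f", "1"], check=True)
    print("Render time: %.1f seconds" % (time.time() - start))

    # To compare CPU vs GPU, switch the Cycles compute device in the scene
    # (or via a -P setup script) between timed runs, per the guide linked above.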

February 19, 2013 | 08:12 AM - Posted by Jumangi (not verified)

*shrug* I guess its fun for the reviewer to get something new in the office but this might as well still be a myth as 99.99% of users will never even think about getting something this expensive.

Give us something new that regular people can actually afford...

February 19, 2013 | 08:18 AM - Posted by Rick (not verified)

i agree with above. one of these or two 670? i hope we see that comparison in your review.

February 19, 2013 | 08:32 AM - Posted by Ryan Shrout

670s are too cheap - you'll see two GTX 680s though.

February 19, 2013 | 10:12 AM - Posted by mAxius

ah the nvidiots will gladly put out $1000.00 for a graphics card that is nothing more than a 680 on roid rage... i know im not touching this card unless i win the lotto and get hit in the head with a brick making me mentally incapacitated.

February 19, 2013 | 10:38 AM - Posted by Oskars Apša (not verified)

If PcPer'll have the time and blessing of Nvidias highers and Geekbox Gods, make a review with a system comprised of these cards in sli and dual Intel Xeons and/or AMD Opterons.

February 19, 2013 | 10:52 AM - Posted by Anonymous (not verified)

Sweet mother of GOD!

I want! I can't wait to see the benches. They outta make Titan version the norm for future generations to bring down the x80 series lol.

February 19, 2013 | 11:23 AM - Posted by Anonymous (not verified)

It would be nice if someone would test this card with Blender 3D to see how it rates for rendering against the Quadro GPUs that cost way more! Does anyone know of websites that benchmark open-source software using GPUs? Most of the benchmarking sites use Maya and other high-priced software, and little free open-source software, in their tests.

February 19, 2013 | 11:30 AM - Posted by PapaDragon

ALL that POWER from one Card, shame about the price. This is my Adobe Premiere Pro and After Effects dream card, plus gaming! Too bad for the price.

Anyhow, a lot of great features, cant wait for the review, great informative video Ryan!

February 19, 2013 | 12:23 PM - Posted by David (not verified)

I wish NVIDIA had been a little more aggressive on the price but I still couldn't have afforded it. Just upgraded both my and my wife's system to 7970 GHzs because the bundle deal was too good to pass up.

That said, for all the people who say cards like this are overkill, my current Skyrim install would give this card a run for its money even at 1080p60.

February 19, 2013 | 12:32 PM - Posted by Anonymous (not verified)

Can I mount a 4-way Titan SLI?

February 19, 2013 | 03:44 PM - Posted by Josh Walrath

I don't know, that sounds uncomfortable.

February 19, 2013 | 02:53 PM - Posted by ezjohny

AMD here is your competition, What do you have in mind.
"NEXT GENERATION" gameplay AMD!
This graphic card is going to pack a punch people!!!

February 19, 2013 | 02:59 PM - Posted by Evan (not verified)

What is this? Lucky i got a ps3 last week! it looks better and pre ordered crysis 3 and it has blu ray so its better graphics!

February 19, 2013 | 04:32 PM - Posted by Scott Michaud

Lol... the sad part is there are some people who actually believe that.

February 19, 2013 | 05:20 PM - Posted by Anonymous (not verified)

The sad part is people who go from SD to HD and see no difference.

February 19, 2013 | 03:18 PM - Posted by papajohn

i will take out a loan to buy this card! :)
