GeForce GTX Titan Z Overclocking Testing

Subject: Graphics Cards | June 12, 2014 - 06:17 PM |
Tagged: overclocking, nvidia, gtx titan z, geforce

Earlier this week I posted a review of the NVIDIA GeForce GTX Titan Z graphics card, a dual-GPU Kepler GK110 part that currently sells for $3000. If you missed that article you should read it first and catch up, but the basic summary was that, for PC gamers, it is both slower than and twice the price of AMD's Radeon R9 295X2.

In that article I also mentioned that the Titan Z had more variable clock speeds than any other GeForce card I had tested. At the time I didn't go any further than that, since the card's performance already demonstrated the deficit it faced against the R9 295X2. However, several readers asked me to dive into overclocking the Titan Z, and with that came the need to show how its clock speeds change.

My overclocking was done through EVGA's PrecisionX software, and we measured clock speeds with GPU-Z. The first step in overclocking an NVIDIA GPU is simply to raise the Power Target slider and see what happens. This tells the card that it is allowed to consume more power than it normally would, and then, thanks to GPU Boost technology, the clock speed should scale up naturally.
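
The Power Target slider is just a percentage multiplier on the card's rated board power. As a rough illustration, assuming the Titan Z's published 375 W board power rating, the slider positions used in this test translate to the following absolute limits (the helper function here is ours, not part of PrecisionX):

```python
# Illustrative sketch: Power Target as a percentage of rated board power.
# 375 W is NVIDIA's published board power figure for the GTX Titan Z.
RATED_BOARD_POWER_W = 375

def power_limit(target_percent: float) -> float:
    """Absolute power limit implied by a given Power Target setting."""
    return RATED_BOARD_POWER_W * target_percent / 100.0

for target in (100, 112, 120):
    print(f"{target:3d}% -> {power_limit(target):.0f} W")
```

At the 120% maximum, the card is allowed roughly 450 W before GPU Boost starts pulling clocks back down.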


And that is exactly what happened. I ran through 30 minutes of looped testing with Metro: Last Light at stock settings, with the Power Target at 112%, with the Power Target at 120% (the maximum setting) and then again with the Power Target at 120% and the GPU clock offset set to +75 MHz. 

That 75 MHz offset was the highest setting we could get to run stably on the Titan Z, which brings the base clock up to 781 MHz and the boost clock to 951 MHz. Though, as you'll see in our frequency graphs below, the card was still reaching well above that.


This graph shows the clock rates of the GK110 GPUs on the Titan Z over the course of 25 minutes of looped Metro: Last Light gaming. The green line is the stock performance of the card, without any changes to the power settings or clock speeds. While it starts out well enough, hitting clock rates of around 1000 MHz, it quickly dives, and by 300 seconds of gaming it is often running at or under the 800 MHz mark. That pattern is consistent throughout the entire tested time, and we get an average clock speed of 894 MHz.
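
The averages quoted throughout this piece come from polled clock-speed logs like the one graphed above. A minimal sketch of that reduction, using hypothetical stand-in samples rather than our actual GPU-Z export:

```python
# Sketch: reducing a polled clock-speed log to an average.
# These samples are hypothetical stand-ins, not our measured data.
samples_mhz = [1000, 993, 941, 850, 823, 810, 797]  # one reading per poll

average_mhz = sum(samples_mhz) / len(samples_mhz)
band_mhz = (min(samples_mhz), max(samples_mhz))

print(f"average clock: {average_mhz:.0f} MHz, band: {band_mhz}")
```

The "band" is what tightens up visibly in the 112% and 120% runs: fewer deep dips, so the average climbs even before any offset is applied.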

Next up is the blue line, generated by simply moving the power target from 100% to 112%, giving the GPUs a little more power headroom to play with. The results are impressive, with a much more consistent clock speed. The yellow line, for the power target at 120%, is even better, with a tighter band of clock rates and a higher average clock.

Finally, the red line represents the 120% power target with a +75 MHz offset in PrecisionX. There we see a clock speed consistency matching the yellow line but offset up a bit, as we have been taught to expect with NVIDIA's recent GPUs. 


The result of all this data comes together in the bar graph here, which lists the average clock rates over the entire 25-minute test runs. At stock settings, the Titan Z was able to hit 894 MHz, just over NVIDIA's advertised "typical" boost clock of 876 MHz. That's good news for NVIDIA! Even though there is a lot more clock speed variance than I would like to see with the Titan Z, the clock speeds are within the expectations NVIDIA set out of the gate.

Bumping up that power target, though, will help gamers who do invest in the Titan Z quite a bit. Just going to 112% results in an average clock speed of 993 MHz, a roughly 100 MHz jump worth about 11% overall. When we push the power target up even further and add the frequency offset, we actually get an average clock rate of 1074 MHz, 20% faster than stock. This does mean that our Titan Z is pulling more power and generating more noise (quite a bit more, actually), with fan speeds going from around 2000 to 2700 RPM.
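
Those uplift percentages fall straight out of the averaged clocks, recomputed here as a quick sanity check:

```python
# Recomputing the quoted uplift figures from the measured average clocks.
stock_mhz = 894
overclocked = {"112% power target": 993, "120% + 75 MHz offset": 1074}

for label, mhz in overclocked.items():
    uplift_pct = (mhz - stock_mhz) / stock_mhz * 100
    print(f"{label}: {mhz} MHz ({uplift_pct:.0f}% over stock)")
```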


At both 2560x1440 and 3840x2160 in the Metro: Last Light benchmark we ran, the added performance puts the overclocked Titan Z at the same level as the Radeon R9 295X2. Of course, it goes without saying that we could also overclock the 295X2 a bit further to improve ITS performance, but this is an exercise in education.


Does it change my stance or recommendation on the Titan Z? Not really; I still think it is overpriced compared to the performance you get from AMD's offerings and from NVIDIA's own lower-priced GTX cards. However, it does lead me to believe that the Titan Z could have offered performance at least on par with the R9 295X2 had NVIDIA been willing to break PCIe power specs and increase noise.

UPDATE (6/13/14): Some of our readers seem to be pretty confused about things, so I felt the need to post an update to the main story here. One commenter below mentioned that I was one of "many reviewers that pounded the R290X for the 'throttling issue' on reference coolers" and thinks I am going easy on NVIDIA with this story. However, there is one major difference that he seems to overlook: the NVIDIA results here are well within the rated specs.

When I published one of our stories looking at the clock speed variance of the Hawaii GPU in the form of the R9 290X and R9 290, our results showed that the clock speeds of these cards were dropping well below the rated clock speed of 1000 MHz. I saw clock speeds that reached as low as 747 MHz and stayed near the 800 MHz mark. The problem was in how AMD advertised and sold the cards, using only the phrase "up to 1.0 GHz" in its marketing. I recommended that AMD begin selling the cards with a rated base clock and a typical boost clock instead of labeling them only with the, at the time, totally incomplete "up to" rating. In fact, here is the exact quote from that story: "AMD needs to define a "base" clock and a "typical" clock that users can expect." Ta da.

The GeForce GTX Titan Z, though, as we look at the results above, is rated and advertised with a base clock of 705 MHz and a boost clock of 876 MHz. The clock speed comparison graph at the top of the story shows the green line (the card at stock) never dropping to that 705 MHz base clock while averaging 894 MHz. That average is ABOVE the rated boost clock of the card. So even though the GPU is changing frequencies more often than I would like, the clock speeds are within the bounds set by NVIDIA. That was clearly NOT THE CASE when AMD launched the R9 290X and R9 290. If NVIDIA had sold the Titan Z with only a specification of "up to 1006 MHz" or something like it, then the same complaint could be made. But it cannot.
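
The distinction being argued here reduces to a simple check: under a rated base clock, "throttling" only means dropping below that base figure, while movement between boost states does not count. A small hypothetical helper (the sample values are stand-ins, not our logged data):

```python
# A card only "throttles", in the sense argued above, if its clock falls
# below the rated base clock; moving between boost states is not throttling.
def below_rated_base(samples_mhz, base_clock_mhz):
    """Return the clock samples that violate the rated base clock, if any."""
    return [s for s in samples_mhz if s < base_clock_mhz]

# Titan Z rated base clock: 705 MHz. Hypothetical stock-run samples,
# all of which sit above it -> no violation.
titan_z_samples = [1000, 941, 823, 810, 797]
print(below_rated_base(titan_z_samples, 705))
```

Applied to the reference R9 290X as it shipped, there was no rated base clock to check against at all, only the "up to 1.0 GHz" figure, which is exactly the marketing problem described above.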

The card isn't "throttling" at all, in fact, as someone points out below. That term insinuates that the card is falling below a rated performance level. It is acting in accordance with the GPU Boost technology that NVIDIA designed.

Some users seem concerned about temperature: the Titan Z hit 80-83C in my testing, both stock and overclocked, and simply scales the fan speed to compensate. Yes, overclocked, the Titan Z gets quite a bit louder, but I don't have sound level tests to show that. It's louder than the R9 295X2 for sure, but definitely not as loud as the R9 290 in its original, reference state.

Finally, some of you seem concerned that I was restricted by NVIDIA in what we could test and talk about on the Titan Z. Surprise, surprise: NVIDIA didn't send us this card to test at all! In fact, they were kind of miffed when I did the whole review and didn't get into showing CUDA benchmarks. So, there's that.


June 12, 2014 | 06:31 PM - Posted by Anonymous (not verified)

What up with your reviews as of late. They seam very incomplete with just side mentions of temperatures and noise levels.

You said in your video review the 295x2 flaw was its temperature but yet haven't provided any number to compare it.

Seams at stock the Titan Z is throttling like heck. It would be nice if you provided similar results from 295x2 and 780 Ti SLI.

June 12, 2014 | 06:38 PM - Posted by Ryan Shrout

Temps on the Titan Z never get above 83C. Fans just keep getting faster to compensate.

June 12, 2014 | 06:49 PM - Posted by Ryan Shrout

Also please tell me where I said that temperature was the problem for the 295X2 because that seems unlikely

June 12, 2014 | 07:44 PM - Posted by Anonymous (not verified)


In the video overview

There is no mention on the initial review of the 295x2 on temps or any comparisons of temps in the Titan Z review yet you allude to it as a flaw in comparison but didn't provide any data to back it up in the video or in the written review.

"AMD released this and it is the highest performing single graphics card in the world. Its not perfect its go its flaws. Power consumption noise, heat and all that."

June 12, 2014 | 08:09 PM - Posted by Ryan Shrout

I say that in the video but not in the article? If so that was a mistake. Power and noise are definitely correct. AMD moved the target temp of the GPUs down to 80C on the 295X2, down from 95C on the 290X.

June 13, 2014 | 02:31 AM - Posted by LtMatt

Ryan i thought the temp limit of the 295x2 was 75c? That's what i read in another review. Maybe the throttle limit is actually 80c though. What do you think, is it definitely 80c?

June 13, 2014 | 02:47 AM - Posted by Anonymous (not verified)

Whether AMD moved it from 95C to 80C is relevant if the operating temperate reaches such. Most websites are reporting 295x2 average temp between 55C-65C.

Even the co-review PCPer did with HardOcp on the 295x2s in crossfire. HardOcp included temperatures with screenshots and PCPer didn't.

Comparative data between the three would be more helpful like most reputable review sites. These reviews of late just seam like previews with limited games tested and specifics just glanced over, ignored or just left out completely.

June 12, 2014 | 08:09 PM - Posted by Airbrushkid (not verified)

I think that he was refering to this

June 12, 2014 | 07:16 PM - Posted by arbiter

About only unanswered bit would be what does the targets do for power usage compared to stock settings.

June 12, 2014 | 07:17 PM - Posted by pdjblum

Wow, all us gamers need is a graphics card costing between $1500 and $3000 to get more than 40fps 75% of the time on a 4K monitor. As hard as you are pushing 4k gaming, I must be the only mofo that is a regular on this site that can't imagine spending that much on a card, much less one that is going to be obsolete in a year or two. If I am the only one, then continue. If I am not, then why don't you focus more on resolutions up to 2560 x 1440. Even the latter res is hard to push with the best of these cards. In that you get them for free and Allyn seems to have all the money in the world, you seem to be oblivious to your readers, unless of course, as I said, I am the only one who can't afford this shit.

June 12, 2014 | 07:41 PM - Posted by eddie (not verified)

Your not the only one who would never consider buying this card. I think the general consecutive on the internet is that this card is stupid. With all that being said it is still topical and as a reviewer it is still their job to at least do something with the card.

June 13, 2014 | 09:40 AM - Posted by Lord Binky (not verified)

If he doesn't do 4K, then manufacturers will just point to the review saying that there is no need to push for 4k gaming performance.

I for one would like a $300 graphics card that does 4k @ 60fps next year, so keep at it!

June 12, 2014 | 09:02 PM - Posted by Rick (not verified)

What was the noise level of the card at overclocked speed?

June 12, 2014 | 11:18 PM - Posted by Anonymous (not verified)

"overclock the 295X2 a bit further to improve IT'S performance"

When using possessive, it's spelled "its", not "it's".

June 12, 2014 | 11:35 PM - Posted by snook

its fine asshole.

June 13, 2014 | 01:03 AM - Posted by ImmenseBrick (not verified)

What is up with all the hate on Pcper and Ryan? I for one was curious of the Titan Z, and while I will never own one it is nice that Ryan managed to cover this beast. For the record, Ryan and Pcper are not pushing 4k, they are catering to what the growing enthusiast crowd will eventually move toward. Trolls be damned, go home and hide in your cave. Thank you Ryan and crew for the excellent coverage as always. Keep up the excellent coverage.

June 13, 2014 | 01:34 AM - Posted by Anonymous (not verified)

Come on Ryan, get consistent. Weren't you one of the many reviewers that pounded the R290X for the 'throttling issue' on reference coolers? This TitanicZ is one of the worst throttling cards I've seen. How about that noise level when the fan 'spins up to compensate for the added temp'? Or what about the power consumption when this thing is running at it's maximum? Etc. etc.? Where's the brouhaha?

So let me guess, power consumption sky rockets, noise goes through the roof and it's still slower than the R9 295X2?

What's the deal? Did nv place restrictions on testing and what you can and can't show? Or did you buy it and need to show as good of results as possible in hopes that you can resell it? Come on, call it like it is. It even gets trashed in GPGPU as shown by hardware info tests.

At the end of the video you claimed that if it were priced the same as the R9 295X2, there would be all kinds of reasons to recommend it over AMD's card. What reasons are those exactly?

June 13, 2014 | 07:51 AM - Posted by Relayer (not verified)

Agreed +1


June 13, 2014 | 07:18 AM - Posted by Mountainlifter


" When we push that power target up even further, and overclock the frequency offset a bit, we actually get an average frame rate of 1074 MHz, 20% faster than the stock settings."

Didn't you mean average clock of 1074MHz?

June 13, 2014 | 09:18 AM - Posted by Ryan Shrout

Yup, thanks!

June 13, 2014 | 08:09 AM - Posted by Anonymous (not verified)

So, very unstable frequencies, drops under advertised speed, bad frame latency, noise ...

I remember more articles pointing each of those one by one when it was a 500$ 290X outperforming a 450$ GTX780 by 15%.

Are those coming for the nVidia card now ?
Or, strangely, well you know .... not coming ?

June 13, 2014 | 09:19 AM - Posted by Ryan Shrout

To all the commenters above, please read the update I added to the story. It should help address your issues.

June 13, 2014 | 02:51 PM - Posted by Anonymous (not verified)

Some of our readers seem to be pretty confused about things so I felt the need to post an update to the main story here. One commenter below mentioned that I was one of "many reviewers that pounded the R290X for the 'throttling issue' on reference coolers" and thinks I am going easy on NVIDIA with this story. However, there is one major difference that he seems to overlook: the NVIDIA results here are well within the rated specs.

The card isn't "throttling" at all, in fact, as someone specifies below. That term insinuates that it is going below a rated performance rating. It is acting in accordance with the GPU Boost technology that NVIDIA designed.

Are you joking? Really?

This is why so many think your full of it and bias. You seam to impose your own definition of throttling. A card could throttle no matter if its with-in spec or not. Both chips have the tech to impose throttling when the criteria is met.

to regulate and especially to reduce the speed of (as an engine) by such means.

Try to retain some of your dignity and for god sake be consistant when doing reviews.

June 13, 2014 | 03:25 PM - Posted by Anony Mouse (not verified)


If you look at it from Ryans point of view. It's okay to impose a artificial throttle aslong as its above advertise clocks.

The referance 290X had a reason for throttling. Reaching its thermal spec.

What is Titan Z reason for throttling, fan speed regulation? In an effort to win on the noise front they imposed a artificial throttle.

June 13, 2014 | 08:54 PM - Posted by Ryan Shrout

Well, they set a thermal limit on the GPU of 83C in this case with the Titan Z. The clock rates adjust accordingly. It's the same idea that AMD does essentially, but NVIDIA simply advertises it in a more honest fashion in my view.

June 13, 2014 | 09:18 PM - Posted by Anonymous (not verified)

So your admitting to being bias toward Nvidia due to disliking AMD marking of PowerTune technology ?

Truth sha'll set you free.

No wonder we will never see the same treatment you gave AMDs PowerTune be given to Nvidia GPU Boost 2.0.

June 13, 2014 | 08:53 PM - Posted by Ryan Shrout

You guys slay me. :) But we love our readers!

Let me ask you this: do you consider the clock rate changes that the Intel Core i7-4770K goes through between the 3.5 GHz base clock and 3.9 GHz turbo clock to be "throttling"? I would not.

June 14, 2014 | 04:04 PM - Posted by Anonymous (not verified)

A CPU can throttle too if its hitting its thermal limit. You already establish the Titan Z was doing just that?

Someone should fire this guy. Seriously who ever runs this site should think twice of having this guy on the pay roll.

These readers deserve better and frankly probably would get better from a amature blog site at this point.

June 15, 2014 | 08:23 AM - Posted by snook

if you even made a point im not sure. but, he would have to fire himself.....genius.

June 13, 2014 | 11:01 AM - Posted by Anonymous (not verified)

Power consumption results would be nice :)

June 13, 2014 | 12:21 PM - Posted by Anonymous (not verified)

Damn the butt hurt fanboys, Wah! Wah! Wah!, hay the World Cup match, that's where the fanboyism needs to go! all this over a GPU card that was marketed to gamers as an afterthought, a card meant for non graphics professional, scientific/finance computation, and priced accordingly(still less than a Quadro, scientific and finance users don't want or need the Quadro level graphics drivers$$$)! So Nvidia decides to market it to some Trust Fund enabled gamers with silver spoons in their mouths, so what, if you can afford it, spend the chump change, if you can not, just wait for the next Nvidia generation to be released and the Titan Z will come down in price.

Did you not notice the double precision capability on the Titan Z, and not see that it was meant for a niche market of scientific/financial uses, without the need for the uber expensive professional graphics drivers! Bleeding edge Nvidia GPUs have always bled the wallet dry, and Nvidia is more so in the GPGPU/HPC/supercomputer accelerator business, than just in gaming alone, that HPC/GPGPU/Supercomputer business makes the money that pays for the R&D, and with IBM using Nvidia next generation GPUs on a mezzanine modules as GPGPU accelerators, for the Power8 server CPUs, expect Nvidia to not be just all about low priced gaming even more than before.

June 13, 2014 | 12:35 PM - Posted by Anonymous (not verified)

This above post was not meant as a reply to a particular post is was posted as a reply by mistake.

June 13, 2014 | 08:54 PM - Posted by Ryan Shrout


June 14, 2014 | 03:08 PM - Posted by Anonymous (not verified)

Still waiting.

June 14, 2014 | 03:09 PM - Posted by Anonymous (not verified)

Still waiting.

June 14, 2014 | 06:55 PM - Posted by agentkhiem (not verified)

I understand that you need to defend the integrity of your article, but attacking a reader in that style is not how I remember this AMD site became popular.


June 15, 2014 | 10:55 AM - Posted by Anonymous (not verified)

The Titan Z doesn't throttle. To say it's throttling would imply it's held back and under-performing. It's running faster than the quoted Nvidia 'Base Clock' and as per the graphs it doesn't take much to get a decent overclock out of it.

The fact is the (reference) 290X's quoted clock speed was 'Up to 1000Mhz'. How often did it hit those clocks? Only for the first ~5mins under load, after which temps became too hot, so the GPU reduced voltages hence slowing clocks (by ~200Mhz or more (throttling itself)).

The issue is more that AMD exaggerated Reference 290X's clocks (I guess technically they were 'Up To 1000Mhz'). If AMD defined a 'Base Clock' instead they would have avoided a lot of the drama this caused around launch.

All it doesn't matter too much now the AiB solutions are available.

June 15, 2014 | 02:01 PM - Posted by Anonymous (not verified)

That's Ryans issue with the marketing and he seams butt hurt over it. He cant seam to get passed it because he doesnt like it.

Just visit the AMD website You will see all R9 series cards are marketed as "Up to ? MHz"

Very childish to be against a brand because you can't comprehend one word. That's what dictionaries are for.

UP TO preposition

Definition of UP TO

—used as a function word to indicate extension as far as a specified place

—used as a function word to indicate a limit or boundary

June 15, 2014 | 07:32 PM - Posted by Anonymous (not verified)

This is what a real review should look like. Nvidia GeForce GTX Titan-Z SLI review incl. Tones TIZAir system

June 16, 2014 | 01:10 AM - Posted by ThorAxe

So many AMD trolls in here trying to justify their flawed 290X marketing it's almost laughable.

June 16, 2014 | 04:41 PM - Posted by 290x owner (not verified)

While Ryan Does seem to favor Nvidia I do not think his points are without basis. Up to 1Ghz does imply that you actually can hit that. I really think AMD should specify base clocks. Raising power limits allows the Titan-Z to hit higher clocks just like adjusting power and thermal limits lets a 290x hit higher clocks. No big surprise there.
The Titan-Z is within specs the whole time... It just takes overclocking it to get it on par with a stock 295x2.

There is a reason why no one recommends the Titan-Z.

It is ok to have a preference so long as it does not slant your analysis of the facts. I think Ryan has done a decent job of this.

Throttling vs Turbo. While throttling has a more negative inference it is probably still the best term to talk about dropping to a lower turbo state.

June 16, 2014 | 07:00 PM - Posted by snook

a better launch blower and none of this silliness is possible. AMD messed that up. the "up to" hasn't been a problem since aftermarket coolers. Nvidia always sets the lowest possible clock ratings for boost and base. then it's WOW they say 850Mhz boost mine gets 950Mhz...equally stupid BS from them.

now, GPUboost 2.0 is freaking awesome. those graphs don't show throttling, they show GPUboost doing what it does, keeping performance (fps) as stable as possible. AMD drivers are up to par now basically, but gpu software control is Nvidia's territory, for the time being.

June 16, 2014 | 07:27 PM - Posted by Anonymous (not verified)

Why do people think you have to "like AMD and Nvidias products equally"?

Thats such a stupid mentality. One of the companies makes better products. The same is true of cars and pretty much everything else.

The Titan Z costs twice as much, doesnt require a wiring diagram or exceeding the specifications of the power connectors, a water loop and a radiator, and it has about 2x the double precision performance(which is the ONLY reason to buy a Titan over a 780ti).

Two Titan Blacks are a much better value though.

All of the AMD fanboys need to stop.

June 16, 2014 | 08:18 PM - Posted by Anonymous (not verified)

The link a few post above that was posted has double precision performance test. Titan Z beats the 295x2 only 2 out of 6 test.

One of the companies does make a better product just not for $3000 dollars.
