GeForce GTX Titan Z Overclocking Testing

Subject: Graphics Cards | June 12, 2014 - 03:17 PM
Tagged: overclocking, nvidia, gtx titan z, geforce

Earlier this week I posted a review of the NVIDIA GeForce GTX Titan Z graphics card, a dual-GPU Kepler GK110 part that currently sells for $3000. If you missed that article, you should read it first and catch up, but the basic summary was that, for PC gamers, it's slower than, and twice the price of, AMD's Radeon R9 295X2.

In that article, though, I mentioned that the Titan Z had more variable clock speeds than any other GeForce card I had tested. At the time I didn't go any further than that, since the card's performance already pointed out the deficit it had going up against the R9 295X2. However, several readers asked me to dive into overclocking with the Titan Z, and with that came the need to show clock speed changes.

My overclocking was done through EVGA's PrecisionX software, and clock speeds were measured with GPU-Z. The first step in overclocking an NVIDIA GPU is to simply move up the Power Target sliders and see what happens. This tells the card that it is allowed to consume more power than its default limit, and then, thanks to GPU Boost technology, the clock speed should scale up naturally.
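For the curious, here is a minimal sketch of how that kind of per-second clock logging could be scripted with NVML's Python bindings. The pynvml package and the log format are assumptions on my part; the actual measurements in this article came from GPU-Z.

```python
# Minimal sketch of per-second GPU telemetry logging via NVML's Python
# bindings (pynvml). An illustration only; the numbers in this article
# were captured with GPU-Z, not this script.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first of the Titan Z's two GPUs

with open("clock_log.csv", "w") as log:  # hypothetical log file
    log.write("seconds,clock_mhz,temp_c,fan_pct\n")
    start = time.time()
    while time.time() - start < 25 * 60:  # length of one looped test run
        clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        fan = pynvml.nvmlDeviceGetFanSpeed(gpu)  # fan duty cycle in percent
        log.write("%.0f,%d,%d,%d\n" % (time.time() - start, clock, temp, fan))
        time.sleep(1)

pynvml.nvmlShutdown()
```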


And that is exactly what happened. I ran through 30 minutes of looped testing with Metro: Last Light at stock settings, with the Power Target at 112%, with the Power Target at 120% (the maximum setting), and then again with the Power Target at 120% and the GPU clock offset set to +75 MHz.

That 75 MHz offset was the highest setting we could get to run stable on the Titan Z, which brings the Base clock up to 781 MHz and the Boost clock to 951 MHz. Though, as you'll see in our frequency graphs below, the card was still reaching well above that.


This graph shows the clock rates of the GK110 GPUs on the Titan Z over the course of 25 minutes of looped Metro: Last Light gaming. The green line is the stock performance of the card without any changes to the power settings or clock speeds. While it starts out well enough, hitting clock rates of around 1000 MHz, it quickly dives, and by 300 seconds of gaming it is often running at or under the 800 MHz mark. That pattern is consistent throughout the entire tested time, resulting in an average clock speed of 894 MHz.
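Averages like that are easy to reproduce from a per-second log such as the hypothetical one sketched above; a few lines of Python suffice.

```python
# Minimal sketch: summarize per-second clock samples from a test run.
# "clock_log.csv" refers to the hypothetical log format sketched earlier.
import csv

with open("clock_log.csv") as f:
    clocks = [int(row["clock_mhz"]) for row in csv.DictReader(f)]

print("samples:", len(clocks))
print("average clock: %.0f MHz" % (sum(clocks) / len(clocks)))
print("min / max: %d / %d MHz" % (min(clocks), max(clocks)))
```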

Next up is the blue line, generated by simply moving the power target from 100% to 112%, giving the GPUs a little more power headroom to play with. The results are impressive, with a much more consistent clock speed. The yellow line, for the power target at 120%, is even better, with a tighter band of clock rates and a higher average clock.

Finally, the red line represents the 120% power target with a +75 MHz offset in PrecisionX. There we see a clock speed consistency matching the yellow line but offset up a bit, as we have been taught to expect with NVIDIA's recent GPUs. 


All of this data comes together in the bar graph here, which lists the average clock rates over the entire 25-minute test runs. At stock settings, the Titan Z was able to hit 894 MHz, just over the "typical" boost clock of 876 MHz advertised by NVIDIA. That's good news for NVIDIA! Even though there is a lot more clock speed variance than I would like to see with the Titan Z, the clock speeds are within the expectations set by NVIDIA out of the gate.

Bumping up that power target, though, will help gamers who do invest in the Titan Z quite a bit. Just going to 112% results in an average clock speed of 993 MHz, a roughly 100 MHz jump worth about 11% overall. When we push that power target up even further, and raise the frequency offset a bit, we actually get an average clock rate of 1074 MHz, 20% faster than the stock settings. This does mean that our Titan Z is pulling more power and generating more noise (quite a bit more, actually), with fan speeds going from around 2000 to 2700 RPM.
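For the record, the math behind those percentages works out from the bar graph averages as follows.

```python
# Sanity check of the overclocking gains quoted above (averages in MHz).
stock = 894
runs = {"112% power target": 993, "120% target + 75 MHz offset": 1074}

for label, clock in runs.items():
    gain_pct = (clock - stock) / stock * 100
    print("%s: %d MHz (+%.0f%% over stock)" % (label, clock, gain_pct))
# 112% power target: 993 MHz (+11% over stock)
# 120% target + 75 MHz offset: 1074 MHz (+20% over stock)
```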


At both 2560x1440 and 3840x2160, in the Metro: Last Light benchmark we ran, the added performance puts the Titan Z at the same level as the Radeon R9 295X2. Of course, it goes without saying that we could also overclock the 295X2 a bit further to improve ITS performance, but this is an exercise in education.


Does it change my stance or recommendation for the Titan Z? Not really; I still think it is overpriced compared to the performance you get from AMD's offerings and from NVIDIA's own lower priced GTX cards. However, it does lead me to believe that the Titan Z could have been fixed, offering performance at least on par with the R9 295X2, had NVIDIA been willing to break PCIe power specs and accept more noise.

UPDATE (6/13/14): Some of our readers seem to be pretty confused about things, so I felt the need to post an update to the main story here. One commenter below mentioned that I was one of "many reviewers that pounded the R290X for the 'throttling issue' on reference coolers" and thinks I am going easy on NVIDIA with this story. However, there is one major difference that he seems to overlook: the NVIDIA results here are well within the rated specs.

When I published one of our stories looking at clock speed variance of the Hawaii GPU in the form of the R9 290X and R9 290, our results showed that the clock speeds of these cards were dropping well below the rated clock speed of 1000 MHz. Instead I saw clock speeds that reached as low as 747 MHz and stayed near the 800 MHz mark. The problem was in how AMD advertised and sold the cards, using only the phrase "up to 1.0 GHz" in its marketing. I recommended that AMD begin selling the cards with a rated base clock and a typical boost clock instead of labeling them only with the, at the time, totally incomplete "up to" rating. In fact, here is the exact quote from that story: "AMD needs to define a "base" clock and a "typical" clock that users can expect." Ta da.

The GeForce GTX Titan Z though, as we look at the results above, is rated and advertised with a base clock of 705 MHz and a boost clock of 876 MHz. The clock speed comparison graph at the top of the story shows the green line (the card at stock) never dropping to that 705 MHz base clock while averaging 894 MHz. That average is ABOVE the rated boost clock of the card. So even though the GPU is changing frequencies more often than I would like, the clock speeds are within the bounds set by NVIDIA. That was clearly NOT THE CASE when AMD launched the R9 290X and R9 290. If NVIDIA had sold the Titan Z with only a specification of "up to 1006 MHz" or something like it, then the same complaint would apply. But it is not sold that way.
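To make the distinction concrete, here is a minimal sketch of the check I am applying, assuming per-second clock samples like the hypothetical log sketched earlier.

```python
# Minimal sketch: check logged clock samples against an advertised spec.
# Base/boost defaults are the Titan Z's advertised 705/876 MHz figures.
def within_spec(clocks_mhz, base=705, typical_boost=876):
    """True if the card never falls below its base clock and its
    average meets or exceeds the advertised typical boost clock."""
    average = sum(clocks_mhz) / len(clocks_mhz)
    return min(clocks_mhz) >= base and average >= typical_boost

# Our stock run averaged 894 MHz without dropping to 705 MHz, so it
# passes; a card rated only "up to 1.0 GHz" offers no floor to test.
```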

The card isn't "throttling" at all, in fact, despite what someone claims below. That term implies that the card is dropping below a rated performance level. It is acting in accordance with the GPU Boost technology that NVIDIA designed.

Some users seem concerned about temperature: the Titan Z hit 80-83C in my testing, both stock and overclocked, and it simply scales the fan speed to compensate. Yes, overclocked, the Titan Z gets quite a bit louder, though I don't have sound level tests to show that. It's louder than the R9 295X2 for sure, but definitely not as loud as the R9 290 in its original, reference state.

Finally, some of you seem concerned that I was restricted by NVIDIA in what we could test and talk about on the Titan Z. Surprise, surprise: NVIDIA didn't send us this card to test at all! In fact, they were kind of miffed when I did the whole review and didn't get into showing CUDA benchmarks. So, there's that.

June 12, 2014 | 03:31 PM - Posted by Anonymous (not verified)

What's up with your reviews as of late? They seem very incomplete, with just side mentions of temperatures and noise levels.

You said in your video review that the 295X2's flaw was its temperature, yet you haven't provided any numbers to compare it.

Seems at stock the Titan Z is throttling like heck. It would be nice if you provided similar results from the 295X2 and 780 Ti SLI.

June 12, 2014 | 03:38 PM - Posted by Ryan Shrout

Temps on the Titan Z never get above 83C. Fans just keep getting faster to compensate.

June 12, 2014 | 03:49 PM - Posted by Ryan Shrout

Also, please tell me where I said that temperature was the problem for the 295X2, because that seems unlikely.

June 12, 2014 | 04:44 PM - Posted by Anonymous (not verified)

Certainly.

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-TITAN-Z-R...

In the video overview
http://www.youtube.com/watch?v=A4tseWkngtY&feature=player_embedded#t=664

There is no mention of temps in the initial review of the 295X2, nor any comparison of temps in the Titan Z review, yet you allude to it as a flaw in comparison without providing any data to back that up in the video or in the written review.

"AMD released this and it is the highest performing single graphics card in the world. Its not perfect its go its flaws. Power consumption noise, heat and all that."

June 12, 2014 | 05:09 PM - Posted by Ryan Shrout

I say that in the video but not in the article? If so, that was a mistake. Power and noise are definitely correct. AMD moved the target temp of the GPUs down to 80C on the 295X2, down from 95C on the 290X.

June 12, 2014 | 11:31 PM - Posted by LtMatt

Ryan, I thought the temp limit of the 295X2 was 75C? That's what I read in another review. Maybe the throttle limit is actually 80C, though. What do you think, is it definitely 80C?

June 12, 2014 | 11:47 PM - Posted by Anonymous (not verified)

Whether AMD moved it from 95C to 80C is only relevant if the operating temperature actually reaches that point. Most websites are reporting 295X2 average temps between 55C and 65C.

Even in the co-review PCPer did with HardOCP on the 295X2s in CrossFire, HardOCP included temperatures with screenshots and PCPer didn't.

Comparative data between the three would be more helpful, like most reputable review sites provide. These reviews of late just seem like previews, with limited games tested and specifics glanced over, ignored, or left out completely.

June 12, 2014 | 05:09 PM - Posted by Airbrushkid (not verified)

I think that he was referring to this: https://www.youtube.com/watch?v=A4tseWkngtY&feature=player_detailpage#t=676

June 12, 2014 | 04:16 PM - Posted by arbiter

About the only unanswered bit is what the power targets do for power usage compared to stock settings.

June 12, 2014 | 04:17 PM - Posted by pdjblum

Wow, all us gamers need is a graphics card costing between $1500 and $3000 to get more than 40fps 75% of the time on a 4K monitor. As hard as you are pushing 4K gaming, I must be the only mofo that is a regular on this site who can't imagine spending that much on a card, much less one that is going to be obsolete in a year or two. If I am the only one, then continue. If I am not, then why don't you focus more on resolutions up to 2560x1440? Even the latter res is hard to push with the best of these cards. Given that you get them for free and Allyn seems to have all the money in the world, you seem to be oblivious to your readers, unless of course, as I said, I am the only one who can't afford this shit.

June 12, 2014 | 04:41 PM - Posted by eddie (not verified)

You're not the only one who would never consider buying this card. I think the general consensus on the internet is that this card is stupid. With all that being said, it is still topical, and as a reviewer it is still his job to at least do something with the card.

June 13, 2014 | 06:40 AM - Posted by Lord Binky (not verified)

If he doesn't do 4K, then manufacturers will just point to the review saying that there is no need to push for 4K gaming performance.

I for one would like a $300 graphics card that does 4K @ 60fps next year, so keep at it!

June 12, 2014 | 06:02 PM - Posted by Rick (not verified)

What was the noise level of the card at overclocked speed?

June 12, 2014 | 08:18 PM - Posted by Anonymous (not verified)

"overclock the 295X2 a bit further to improve IT'S performance"

When using possessive, it's spelled "its", not "it's".

June 12, 2014 | 08:35 PM - Posted by snook

its fine asshole.

June 12, 2014 | 10:03 PM - Posted by ImmenseBrick (not verified)

What is up with all the hate on PCPer and Ryan? I for one was curious about the Titan Z, and while I will never own one, it is nice that Ryan managed to cover this beast. For the record, Ryan and PCPer are not pushing 4K; they are catering to what the growing enthusiast crowd will eventually move toward. Trolls be damned, go home and hide in your cave. Thank you Ryan and crew for the excellent coverage as always. Keep up the great work.

June 12, 2014 | 10:34 PM - Posted by Anonymous (not verified)

Come on Ryan, get consistent. Weren't you one of the many reviewers that pounded the R290X for the 'throttling issue' on reference coolers? This TitanicZ is one of the worst throttling cards I've seen. How about that noise level when the fan 'spins up to compensate for the added temp'? Or what about the power consumption when this thing is running at its maximum? Etc., etc.? Where's the brouhaha?

So let me guess: power consumption skyrockets, noise goes through the roof, and it's still slower than the R9 295X2?

What's the deal? Did NV place restrictions on testing and what you can and can't show? Or did you buy it and need to show results as good as possible in hopes that you can resell it? Come on, call it like it is. It even gets trashed in GPGPU, as shown by hardware info tests.

At the end of the video you claimed that if it were priced the same as the R9 295X2, there would be all kinds of reasons to recommend it over AMD's card. What reasons are those exactly?

June 13, 2014 | 04:51 AM - Posted by Relayer (not verified)

Agreed +1

June 13, 2014 | 08:41 PM - Posted by Anonymous (not verified)

+1

June 13, 2014 | 04:18 AM - Posted by Mountainlifter

Typo?

" When we push that power target up even further, and overclock the frequency offset a bit, we actually get an average frame rate of 1074 MHz, 20% faster than the stock settings."

Didn't you mean average clock of 1074MHz?

June 13, 2014 | 06:18 AM - Posted by Ryan Shrout

Yup, thanks!

June 13, 2014 | 05:09 AM - Posted by Anonymous (not verified)

So, very unstable frequencies, drops under advertised speed, bad frame latency, noise...

I remember multiple articles pointing out each of those, one by one, when it was a $500 290X outperforming a $450 GTX 780 by 15%.

Are those coming for the nVidia card now?
Or, strangely, well you know... not coming?

June 13, 2014 | 06:19 AM - Posted by Ryan Shrout

To all the commenters above, please read the update I added to the story. It should help address your issues.

June 13, 2014 | 11:51 AM - Posted by Anonymous (not verified)


Some of our readers seem to be pretty confused about things, so I felt the need to post an update to the main story here. One commenter below mentioned that I was one of "many reviewers that pounded the R290X for the 'throttling issue' on reference coolers" and thinks I am going easy on NVIDIA with this story. However, there is one major difference that he seems to overlook: the NVIDIA results here are well within the rated specs.

The card isn't "throttling" at all, in fact, despite what someone claims below. That term implies that the card is dropping below a rated performance level. It is acting in accordance with the GPU Boost technology that NVIDIA designed.

Are you joking? Really?

This is why so many think you're full of it and biased. You seem to impose your own definition of throttling. A card can throttle whether it's within spec or not. Both chips have the tech to impose throttling when the criteria are met.

throt·tle
to regulate and especially to reduce the speed of (as an engine) by such means.

Try to retain some of your dignity and, for god's sake, be consistent when doing reviews.

June 13, 2014 | 12:25 PM - Posted by Anony Mouse (not verified)

+1

If you look at it from Ryan's point of view, it's okay to impose an artificial throttle as long as it's above advertised clocks.

The reference 290X had a reason for throttling: reaching its thermal spec.

What is the Titan Z's reason for throttling, fan speed regulation? In an effort to win on the noise front, they imposed an artificial throttle.

June 13, 2014 | 05:54 PM - Posted by Ryan Shrout

Well, they set a thermal limit on the GPU, 83C in this case with the Titan Z, and the clock rates adjust accordingly. It's essentially the same idea as what AMD does, but NVIDIA simply advertises it in a more honest fashion, in my view.
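If it helps, the behavior is roughly like this toy model. My illustration only, not NVIDIA's actual GPU Boost algorithm:

```python
# Toy model of a thermal-limit clock governor. An illustration only,
# NOT NVIDIA's actual GPU Boost algorithm; 13 MHz approximates a
# Kepler boost bin, and 1000 MHz is roughly the top clock we observed.
def next_clock(clock_mhz, temp_c, base=705, top=1000, target_c=83, step=13):
    if temp_c >= target_c:
        # Over the thermal target: step down, but never below base clock.
        return max(base, clock_mhz - step)
    # Thermal headroom available: step back up toward the top boost bin.
    return min(top, clock_mhz + step)
```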

June 13, 2014 | 06:18 PM - Posted by Anonymous (not verified)

So you're admitting to being biased toward NVIDIA due to disliking AMD's marketing of its PowerTune technology?

The truth shall set you free.

No wonder we will never see the same treatment you gave AMD's PowerTune given to NVIDIA's GPU Boost 2.0.

June 13, 2014 | 05:53 PM - Posted by Ryan Shrout

You guys slay me. :) But we love our readers!

Let me ask you this: do you consider the clock rate changes that the Intel Core i7-4770K goes through between its 3.5 GHz base clock and 3.9 GHz turbo clock to be "throttling"? I would think not.

June 14, 2014 | 01:04 PM - Posted by Anonymous (not verified)

A CPU can throttle too if it's hitting its thermal limit. You already established the Titan Z was doing just that?

http://www.pcper.com/image/view/42883?return=node%2F60538

Someone should fire this guy. Seriously, whoever runs this site should think twice about having this guy on the payroll.

These readers deserve better, and frankly would probably get better from an amateur blog site at this point.

June 15, 2014 | 05:23 AM - Posted by snook

If you even made a point, I'm not sure. But he would have to fire himself... genius.
