AMD Radeon R9 290X Hawaii Review - Taking on the TITANs

Author: Ryan Shrout
Manufacturer: AMD

PowerTune and Variable Clock Rates

Probably the biggest change to the Hawaii GPU, other than the shader count and raw performance, is how AMD PowerTune is integrated and how AMD is managing power, voltage, temperature, and clock speeds. 


Gone are the days of the deterministic algorithm found in Tahiti; Hawaii uses a "fuzzy math" system to actively monitor and set nearly all aspects of GPU power and performance.  Using a combination of hardware monitors, accumulators, and serial interfaces, along with power estimation tools and weighted functions, AMD is able to shift clock, voltage, and fan speeds at a very fast pace.
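To make the idea of "weighted functions" concrete, here is a minimal sketch of how a power estimate can be built from activity counters. AMD has not published its actual counters or weights, so every name and number below is hypothetical.

```python
# Illustrative sketch only: estimate GPU power as a weighted sum of
# per-block activity counters. Block names, weights, and idle power
# are invented for this example, not AMD's real values.
def estimate_power_w(counter_activity, weights, idle_power_w):
    """Estimate total GPU power in watts.

    counter_activity: dict of block name -> activity fraction (0.0-1.0)
    weights: dict of block name -> watts contributed at full activity
    idle_power_w: static power drawn even at zero activity
    """
    dynamic = sum(weights[block] * activity
                  for block, activity in counter_activity.items())
    return idle_power_w + dynamic

# Example: shader array 75% busy, memory controller 50% busy
activity = {"shader": 0.75, "memory": 0.5}
weights = {"shader": 150.0, "memory": 50.0}  # hypothetical watts per block
print(estimate_power_w(activity, weights, 40.0))  # -> 177.5
```

A real controller would feed an estimate like this, refreshed many times per second, into the clock/voltage decision loop rather than measuring power directly at every rail.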


This new power logic is built into the 290X as well as the 260X / 7790 and the FM2 APUs, showing the flexibility of the design.  The controller can switch voltages with roughly 10 microseconds of delay and 6.25 mV granularity.  By doing this it can adjust the performance of the R9 290X based on temperature and the environment in which the card is placed.
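The 6.25 mV granularity means the controller can only land on discrete voltage steps. A quick sketch of that quantization (the requested voltage here is a made-up example, not a measured 290X value):

```python
# Sketch: snap a requested voltage down to the nearest 6.25 mV step,
# matching the granularity described in the text. The example request
# of 1.2031 V is hypothetical.
STEP_V = 0.00625  # 6.25 mV per step

def quantize_voltage(requested_v):
    """Round a requested voltage down to the nearest 6.25 mV step."""
    steps = int(requested_v / STEP_V)  # whole steps that fit
    return steps * STEP_V

print(quantize_voltage(1.2031))  # -> ~1.2 V (192 whole steps)
```

With steps this fine and a ~10 microsecond switching delay, the controller can track rapidly changing workloads far more closely than a scheme with a handful of fixed voltage points.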


In many ways, the new AMD PowerTune technology is similar to what NVIDIA introduced with its Kepler GPUs.  The graphics card now determines on the fly, rather than having them set by firmware or the vendor, what clock speeds are safe based on the power levels and voltages required to reach them.  Fan speed is really the heart of this algorithm, though, and AMD says it started there when setting the default parameters.


The idea is simple enough: vary the clock speed of the 290X based on the temperature it is currently running at, any headroom it may have left in terms of voltage and power, and the maximum fan speed set by the user (or firmware).  Managing it all can be difficult, though, and it can lead to some interesting side effects.
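The loop described above can be sketched as a simple state machine over the card's DPM clock states. This is an illustration of the concept, not AMD's actual algorithm; the state list matches clocks mentioned elsewhere in this article and the comments, but the thresholds are invented.

```python
# Illustrative control loop, not AMD's real logic: step the clock down
# when the card is at its thermal target with no fan headroom left, and
# step back up when thermal margin returns. Thresholds are hypothetical.
DPM_STATES_MHZ = [300, 500, 727, 800, 900, 1000]
TEMP_TARGET_C = 95

def next_state(state_idx, temp_c, fan_pct, fan_max_pct):
    """Return the next DPM state index given current conditions."""
    if temp_c >= TEMP_TARGET_C and fan_pct >= fan_max_pct:
        # At the thermal limit and out of fan headroom: step the clock down.
        return max(0, state_idx - 1)
    if temp_c < TEMP_TARGET_C - 3:
        # Comfortable margin below the target: step back up toward 1000 MHz.
        return min(len(DPM_STATES_MHZ) - 1, state_idx + 1)
    return state_idx  # hold the current clock

# Hot card, fan pinned at its 55% cap -> clock steps down from 1000 MHz
idx = next_state(5, 95, 55, 55)
print(DPM_STATES_MHZ[idx])  # -> 900
```

Run at high frequency, a loop like this naturally produces the oscillating, environment-dependent clocks shown in the graphs below rather than a single fixed frequency.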


Notice in this graph that the operating temperature of the GPU approaches 95C.  That is quite high in my book, despite AMD's assurances that this temperature is safe for the ASIC to run at for its entire life.


And while this graph isn't given a scale on either the clock speed or the time axis, we have some of our own data on how the 290X actually performs in real-world conditions.


Variable Clock Rates

Remember on the last page when we mentioned the "up to" 1.0 GHz clock rate of the R9 290X?  As it turns out, that rating isn't very useful for guessing the clock speed at any given time after the first 2-3 minutes of gameplay spin-up.


What you are looking at here is the clock rate on a per-second basis, as reported by a new version of GPU-Z supplied by AMD, during a test run of Crysis 3.  You can see that while the card holds 1000 MHz for a short period of time, the clock then drops below that and wanders into the 900 MHz and even 800 MHz ranges.

In my experience with the R9 290X, the card was unable to maintain a 1000 MHz clock speed in any game run for more than 5 minutes.  Even "easy" titles like Skyrim dropped well below the "up to" 1.0 GHz rate.

To showcase the effect this can have on clock speeds and performance, I took a 60-second snippet from our GPU-Z recordings after 5-8 minutes of gameplay, then averaged the clock speed of those 60 data points to make the graph below.
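The averaging step above is simple enough to sketch. The snippet below assumes a hypothetical two-column CSV export (timestamp, core clock in MHz); GPU-Z's actual log format has more columns, so the column index would need adjusting for a real file.

```python
# Sketch of the 60-sample averaging described in the text, assuming a
# hypothetical CSV log with a header row and the core clock in column 1.
import csv

def average_clock(csv_path, start_row, n_rows=60):
    """Average n_rows one-second clock samples starting at start_row."""
    with open(csv_path, newline="") as f:
        rows = list(csv.reader(f))[1:]          # skip the header row
    window = rows[start_row:start_row + n_rows]  # e.g. a 60 s slice
    clocks = [float(row[1]) for row in window]   # core clock column
    return sum(clocks) / len(clocks)
```

For example, `average_clock("gpuz_log.csv", 300)` would average the 60 seconds of samples starting five minutes into the recording (file name hypothetical).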


This should represent the clock speeds of long-term gaming sessions in a case and environment with good air flow and decent ambient temperatures.  Notice that not one of the games averaged over 900 MHz (though Skyrim got very close), and Bioshock Infinite was the slowest at 821 MHz.  That is an 18% drop in frequency from the "up to" rating that AMD provided to us and to add-in card vendors.
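As a quick sanity check on that 18% figure (plain arithmetic, nothing from AMD's tooling):

```python
# The drop from the advertised "up to" clock to the measured
# Bioshock Infinite average reported above.
advertised_mhz = 1000
measured_mhz = 821
drop_pct = (advertised_mhz - measured_mhz) / advertised_mhz * 100
print(f"{drop_pct:.1f}% below the rated clock")  # 17.9% below the rated clock
```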

But does this really matter?  I think AMD needs to come up with a frequency rating that is more useful than "up to" for the R9 290X, as it is misleading at best.  NVIDIA adopted a "typical Boost clock" for its GPUs that seems to have worked, and it also specifies a "base clock" that it guarantees the GPU will never drop below during normal games.  R9 290X users should want the same information.

In terms of performance testing, though, it won't change much of how we operate.  All this means is that we now need to "warm up" the GPU each time we are ready to benchmark it.  I tended to sit in the game for at least 5 minutes before running our normal test run, and I think that is plenty of time to get the GPU up to its 95C operating temperature and push clocks to realistic levels.  Be wary of benchmark results that DO NOT take this into account, as they could be 10%+ faster than real-world results would indicate.
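That warm-up rule is easy to apply to logged data as well: discard the samples taken before the card reaches steady state and average the rest. A minimal sketch, assuming per-second samples and the ~5 minute warm-up window mentioned above:

```python
# Sketch of warm-up-aware averaging: skip the first WARMUP_S seconds of
# per-second clock samples, then average what remains. The 300 s window
# matches the ~5 minutes of warm-up described in the text.
WARMUP_S = 300

def steady_state_average(samples_mhz, sample_interval_s=1):
    """Average clock samples, excluding the warm-up period."""
    skip = WARMUP_S // sample_interval_s
    steady = samples_mhz[skip:]
    if not steady:
        raise ValueError("run is shorter than the warm-up window")
    return sum(steady) / len(steady)

# 5 minutes at 1000 MHz cold, then 60 s throttled to 850 MHz
samples = [1000] * 300 + [850] * 60
print(steady_state_average(samples))  # -> 850.0
```

A cold-card average over the same log would blend in the 1000 MHz warm-up samples and overstate sustained performance, which is exactly the 10%+ inflation warned about above.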


In both Battlefield 3 and GRID 2, these differences produced results that you should be wary of.  While the deltas are not earth-shattering, in battles that are often won by single-digit percentages, reviewers and buyers can't be too careful.

October 24, 2013 | 12:19 AM - Posted by Mnemonicman

Can't wait to see how this card performs under watercooling.

October 24, 2013 | 01:10 AM - Posted by Indeed (not verified)

Watercooling is probably called for...the higher power density makes it harder to cool.

October 24, 2013 | 12:27 AM - Posted by arbiter

Given all the hype this card got over the last few months about how it would be a Titan killer, etc., looking at most games it's pretty much in an even group with the Titan and almost on par with the 780. Still a question on the 95C temp: is that where it would get to in games? If given the same temp limit, which side would win, since the 780/Titan cards do have some overclocking headroom? 95C is what it got to in gaming tests on an open-air bench, and surely that would affect what real-world users would see inside an enclosed case. If those prices are correct then it does look to be a good deal.

October 24, 2013 | 01:14 AM - Posted by Fergdog (not verified)

I don't see why there'd be a lot of hype for this card. It's on the same 28nm process, and it's the same architecture as the 7790, which added just very minor compute changes. It does give good competition to Nvidia's ridiculously priced high-end cards though, which is very good.

October 24, 2013 | 12:51 AM - Posted by PapaDragon

The first page I went to was the Battlefield 3 one. Great performance all around!!! I'm surprised that it costs $550 (good), but that heat output/noise and temp... it's OK. Can't wait to see the Lightning/DirectCU versions. Also can't wait for the price drops on the GTX 780 too.

@Ryan, in the BF3 section, the screenshot of the BF3 settings puts it at 1920x1200, but the graphs show it at 1080p??

2. The Sound levels of the 280x are missing!

Thanks 4 the review!!

October 24, 2013 | 10:42 AM - Posted by Ryan Shrout

Trust the settings, not the resolution on that screen.  :)

October 24, 2013 | 01:18 AM - Posted by Scyy (not verified)

I see what you meant about watching for other reviews not allowing the cards to throttle. Everywhere else is showing huge leads that likely diminish somewhat after playing for a short while once the card throttles.

October 24, 2013 | 01:38 AM - Posted by Anonymous (not verified)

The card looks like a toy compared to the Titan visually, and the quality of the plastics etc. probably contributes to its thermal problems too. But wow on the price!! I dunno, I guess the budget-limited would go for this, but the higher quality card is the Titan. If you've got a high-end PC you're still going to get the Titan; what's a few hundred bucks when you've got 5 or 7k in it?

October 24, 2013 | 02:19 AM - Posted by MArc-Andre B (not verified)

flip uber mode + fan slider to 100% (will run up to 100% but not always) and watch the magic! (maybe save yourself warming up the card 5 min too!)

October 24, 2013 | 10:42 AM - Posted by Ryan Shrout

Watch the magic and listen to the wind!! :D

October 24, 2013 | 02:40 AM - Posted by 63jax

i'm very curious about the Toxic version of this, can't wait!

October 24, 2013 | 03:44 AM - Posted by Klimax (not verified)

Simply only brute force. Nothing smart about it.

Also what is highest tessellation setting in Metro LL and if it is not normal, why wasn't it used? (I guess, because AMD still sucks in tessellation)

October 24, 2013 | 03:47 AM - Posted by Matt (not verified)

"The AMD Radeon R9 290X delivers here as well, coming in not only $350 less expensive than the GeForce GTX Titan, but $100 less than the GeForce GTX 780 as well."

Titan: $999
R290x: $549

Price difference: $450

October 24, 2013 | 10:43 AM - Posted by Ryan Shrout

Doh, math is hard.

Fixed, thanks!

October 24, 2013 | 05:00 AM - Posted by nasha thedog (not verified)

The conclusion and overall results are very different from several other tests I've seen so far. Personally I don't think 4K results really matter; 99% of gamers won't have it for a year or two, and this gen will have been replaced by then. AMD are silly for demanding reference-only coverage on release; it looks cheaply made for a top-end release. I want to see it compared at relevant resolutions, i.e. from 1920x1080 to 2560x1600, and I want to see, for example, the MSI 290X Gaming against the MSI 780 Gaming, both out of the box and overclocked, before I make any final purchases. As it stands, reference vs. reference, even with your results leaning to AMD, I'd still go with a 780 reference over a 290X reference, as it seems it would be faster, cooler, and quieter when both are overclocked. But like I said, that's neither here nor there; bring on the non-reference models.

October 24, 2013 | 06:49 AM - Posted by symmetrical (not verified)

Ryan, shouldn't it be $450 less than a Titan...?

October 24, 2013 | 10:43 AM - Posted by Ryan Shrout

Yup, sorry!

October 24, 2013 | 07:12 AM - Posted by HeavyG (not verified)

As a dual Titan owner, I am very impressed with these scores. I am not sure if I would trade off the gain on performance for the heat and noise penalty, but either way, if my cash balance was the determining factor of which card to buy, it would clearly be the R9 290X.

Luckily my cash balance doesn't make all of the final decisions. Anyways, I can't wait to see how well this card does when some non-reference cooled versions are tested. It might make a big difference compared to that crappy AMD cooler.

October 24, 2013 | 08:34 AM - Posted by Anonymous (not verified)

"coming in not only $350 less expensive than the GeForce GTX Titan" 999-549=450

October 24, 2013 | 10:44 AM - Posted by Ryan Shrout


October 24, 2013 | 09:30 AM - Posted by Mac (not verified)

I'm not seeing the issue with the "up to 1GHz" clocks thing.
This thing has DPM states anywhere from 300 to 1000 MHz.
We've seen as low as 727 MHz in FurMark.

October 24, 2013 | 09:48 AM - Posted by Anonymous (not verified)

Has there been any testing on multi-display setups? I would like to see if the 4K performance also helps with multi-display gaming, like Battlefield on 3 screens.

October 24, 2013 | 10:44 AM - Posted by Ryan Shrout

4K on tiled displays and Eyefinity should be very similar - but I'll be doing testing this week.

October 24, 2013 | 10:05 AM - Posted by Remon (not verified)

Did you warm up the Nvidia cards too?

October 24, 2013 | 10:45 AM - Posted by Ryan Shrout


Though to be honest the performance deltas we've seen on NVIDIA's cards are very minimal.

October 24, 2013 | 10:59 AM - Posted by nabokovfan87

I know it sounds trivial, Ryan, but I'd love to see the breakdown of average clock vs. boosted/advertised clock, like you did for the AMD cards, going into all reviews, if not a follow-up article with the Titan and the 290X.

As a consumer I am extremely hesitant to trust boost speeds on CPUs and GPUs. If it can only run at that speed at one point on all cores, then that's the max (in my opinion). There is also the matter of this all being temperature dependent. As someone suggested above, for the sake of proving out their internal temp/boost logic, perhaps do the same average clock speed tests after warm-up with and without the fan at 100%.

I live down in SoCal, and my room is ALWAYS 85-90 F when running my PC, and during the summer it's possibly worse. So if I spend this amount of money on a card and it says it will perform at those temps, then I'm likely going to hit them, and with boost being the new fad I get the vibe that it will always downclock and run horribly.

When you were discussing the clocks with Josh and you showed average clocks, I think that's half the story. Is it because of temp? Is it because the GPU doesn't NEED to be at 1 GHz for that sequence of the game? Or is it a time-limited boost? Hard to say, but when you have it at ~850 MHz vs. 1 GHz and it still kills the competition, I think it's pretty hard to be that upset. With driver updates it will get even faster too.

Lastly, I started watching your thoughts and conversation with Josh via the YouTube video. I know you don't like some of the things AMD is saying vs. doing, and I respect you pointing those out, but when you review their parts and every sentence is finished with "but the Titan does X" or "but NVIDIA has X," it seems like you are an advertiser more than an impartial observer of facts and information.

October 24, 2013 | 01:54 PM - Posted by Ryan Shrout

Thanks for the feedback.  I'll address one by one.

1. I can do that in the coming weeks, yes.  Note that we did do some of this when Kepler launched and found the "typical" boost clocks that NVIDIA advertised to be pretty damn accurate.  Also, no fan runs at 100%, thank goodness.  :)

2. An ambient temp that high will definitely have adverse effects on your graphics card, NVIDIA or AMD.  I hope you have good cooling in there and use some noise-canceling headphones!

3. There is never a time when the GPU decides it doesn't "need to be" at 1.0 GHz for AMD.  The lower clocks are simply a result of what the GPU can run at while not crossing the 95C temperature level.

4. Keep in mind that my video on the R9 290X is not an ad for the AMD R9 290X.  It is meant to be an editorial commentary on the product which I believe requires direct comparison to the previous generation and the current competition.  That means I will talk about what the TITAN and 780 do better, and what they do worse.  The fact that I comment on that throughout should be indicative of that and nothing more.

October 24, 2013 | 10:42 PM - Posted by nabokovfan87

Thanks for the reply back, Ryan. Can you elaborate on number 3? If temp doesn't cause it, then why not just have a base clock and avoid boost altogether?

I agree with you that you have to compare and contrast, but perhaps a better format is simply stating the results and then doing the "but the competition does X" commentary afterwards. Yes, you have to compare by the very nature of using charts and graphs, but as someone who ultimately wants to make their own judgment based on valid and reliable data, the constant interruption of "but it's loud vs. the competitor", compared to simply saying "but it's loud compared to last gen / our usability standards", was very off-putting, as an example.

October 24, 2013 | 10:35 AM - Posted by Truth Seeker (not verified)

As the previous poster said, if you are not warming up the NVIDIA cards, then you have unfairly biased the review against AMD.

The vast majority of other reviews conclude the AMD R9 290X wins on performance (it cruises past the Titan at 4K) and is the clear winner in price per performance.

Coupled with water blocks (as EK indicates at the Overclock link), the R9 290X hits 1200/1600 easily and becomes an overclocked rocket.

October 24, 2013 | 10:45 AM - Posted by Ryan Shrout

That is basically the same conclusion that I made...?
