
AMD Radeon R9 290X Hawaii Review - Taking on the TITANs

Author: Ryan Shrout
Manufacturer: AMD

PowerTune and Variable Clock Rates

Probably the biggest change to the Hawaii GPU, other than the shader count and raw performance, is how AMD PowerTune is integrated and how AMD is managing power, voltage, temperature, and clock speeds. 


Gone are the days of the deterministic algorithm found in Tahiti; Hawaii uses a "fuzzy math" system to actively monitor and set nearly every aspect of GPU power and performance.  Using a combination of on-chip hardware monitors, accumulators, and serial interfaces, paired with power estimation tools and weighted functions, AMD can shift clock speeds, voltages, and fan speeds very quickly.


This new power logic is built into the 290X as well as the 260X / 7790 and the FM2 APUs, showing the flexibility of the design.  The controller can switch voltages with roughly 10 microseconds of latency and 6.25 mV granularity.  That lets it adjust the performance of the R9 290X based on its temperature and the environment it is placed in.
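As a rough illustration of what that granularity means in practice, here is a minimal sketch in Python; the two constants come from AMD's stated specs, but the function itself is a hypothetical stand-in for controller logic that is not public:

```python
# Hypothetical sketch: snap a requested voltage to the 6.25 mV steps
# AMD describes for Hawaii's voltage controller.
VOLTAGE_STEP_V = 0.00625    # 6.25 mV granularity
SWITCH_LATENCY_S = 10e-6    # ~10 microsecond switching delay

def quantize_voltage(requested_v):
    """Round a requested voltage down to the nearest controller step."""
    steps = int(requested_v / VOLTAGE_STEP_V)
    return steps * VOLTAGE_STEP_V

print(f"{quantize_voltage(1.212):.5f} V")   # -> 1.20625 V (193 steps)
```

With steps that fine arriving every 10 microseconds or so, the controller can walk voltage up or down far faster than any temperature change it is reacting to.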


In many ways, the new AMD PowerTune technology is similar to what NVIDIA introduced with its Kepler GPUs.  The graphics card now determines on the fly, rather than having it set by firmware or the vendor, what clock speeds and voltages are safe based on the power levels required to reach them.  Fan speed is really the heart of the algorithm, though, and AMD says it started there when setting the default parameters. 


The idea is simple enough: vary the clock speed of the 290X based on the temperature it is currently running at, any headroom it has left in terms of voltage and power, and the maximum fan speed set by the user (or firmware).  Managing it all can be difficult, though, and it can lead to some interesting side effects.
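To make that interaction concrete, here is a deliberately toy version of such a control loop in Python; the thresholds and step sizes are assumptions for illustration, not AMD's actual (hardware-based) implementation:

```python
# Illustrative only: a toy temperature- and fan-capped boost loop.
TEMP_TARGET_C = 95       # Hawaii's stated operating target
CLOCK_MAX_MHZ = 1000     # the advertised "up to" clock
CLOCK_MIN_MHZ = 300      # lowest DPM state mentioned in the comments below
CLOCK_STEP_MHZ = 10      # assumed adjustment step

def next_clock(clock_mhz, temp_c, fan_pct, fan_cap_pct):
    # At the temperature target with the fan already at its user-set cap,
    # the only remaining lever is the clock (and, with it, voltage).
    if temp_c >= TEMP_TARGET_C and fan_pct >= fan_cap_pct:
        return max(clock_mhz - CLOCK_STEP_MHZ, CLOCK_MIN_MHZ)
    # Otherwise there is headroom, so creep back toward the maximum.
    return min(clock_mhz + CLOCK_STEP_MHZ, CLOCK_MAX_MHZ)
```

Run at a high rate, a loop like this produces exactly the behavior described below: full clocks while the card is cool, then a slow slide downward as it saturates its thermal and fan limits.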


Notice in this graph that the operating temperature of the GPU approaches 95C.  That is quite high in my book, despite AMD's assurances that the ASIC can safely run at this temperature for its entire life. 


And while this graph isn't given a scale for either clock speed or time, we have our own data on how the 290X actually performs in real-world conditions.

 

Variable Clock Rates

Remember on the last page when we mentioned the "up to" 1.0 GHz clock rate of the R9 290X?  As it turns out, that rating isn't very useful for guessing the clock speed at any given moment once the first 2-3 minutes of gameplay have spun the card up. 


What you are looking at here is the clock rate on a per-second basis, as reported by a new version of GPU-Z supplied by AMD, during a test run of Crysis 3.  You can see that while we sit at 1000 MHz for a short period of time, the clock drops below that and wanders into the 900 MHz and even 800 MHz range.

In my experience with the R9 290X, maintaining a 1000 MHz clock speed was impossible in any game running for more than 5 minutes.  Even "easy" titles like Skyrim dropped well below the "up to" 1.0 GHz rate.

To showcase the effect this can have on clock speeds and performance, I took a 60 second snippet from our GPU-Z recordings after 5-8 minutes of gameplay, then averaged the clock speed across those 60 data points to make the graph below.
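The math here is simple; below is a sketch of the kind of script involved, assuming the GPU-Z log is exported as a CSV with a "GPU Clock [MHz]" column (the exact column name varies by version):

```python
import csv

# Average a 60-sample window of GPU-Z clock readings and compare it
# to the advertised "up to" clock.
with open("gpuz_log.csv", newline="") as f:
    clocks = [float(row["GPU Clock [MHz]"]) for row in csv.DictReader(f)]

window = clocks[:60]                     # 60 seconds at one sample per second
avg = sum(window) / len(window)
drop_pct = (1000 - avg) / 1000 * 100     # vs. the 1000 MHz rating
print(f"average: {avg:.0f} MHz ({drop_pct:.0f}% below the rated clock)")
```

For the Bioshock Infinite window below, for example, an 821 MHz average works out to (1000 - 821) / 1000 ≈ 18% below the rated clock.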


This should represent the clock speeds of long-term gaming sessions in a case and environment with good airflow and decent ambient temperatures.  Notice that not one of the games actually averaged over 900 MHz (though Skyrim got very close) and that Bioshock Infinite was actually the slowest at 821 MHz.  That is an 18% drop in frequency from the "up to" rating that AMD provided to us and to add-in card vendors. 

But does this really matter?  I think AMD needs to come up with a frequency rating more useful than "up to" for the R9 290X, as it is misleading at best.  NVIDIA adopted a "typical Boost clock" for its GPUs that seems to have worked, and it also specifies a "base clock" that it guarantees the GPU will never drop below during normal gaming.  R9 290X users should want the same information.

In terms of performance, though, this won't change much about how we test.  All it means is that we now need to "warm up" the GPU each time we are ready to benchmark it.  I tended to sit in the game for at least 5 minutes before running our normal test pass, which I think is plenty of time to get the GPU up to its 95C operating temperature and push clocks to realistic levels.  Be wary of benchmark results that DO NOT take this into account, as they could be 10%+ faster than real-world results would indicate. 
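For anyone replicating this, here is a minimal sketch of that warm-up discipline as a test-harness step; the sensor-polling function is a placeholder, and the five-minute timeout simply mirrors the procedure described above:

```python
import time

def warm_up(read_gpu_temp_c, target_c=95, timeout_s=300):
    """Idle in-game until the GPU reaches steady-state temperature.

    read_gpu_temp_c is whatever sensor poll the harness uses (e.g.
    tailing a GPU-Z log); returns True once the target is reached.
    """
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        if read_gpu_temp_c() >= target_c:
            return True      # steady state; clocks are now realistic
        time.sleep(1)
    return False             # never hit target; benchmark with caution
```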


In both Battlefield 3 and GRID 2, these differences produced results that you should be wary of!  They are not earth-shattering differences in performance, but in battles that are often won by single-digit percentages, reviewers and buyers can't be too careful.


October 24, 2013 | 12:19 AM - Posted by Mnemonicman

Can't wait to see how this card performs under watercooling.

October 24, 2013 | 01:10 AM - Posted by Indeed (not verified)

Watercooling is probably called for...the higher power density makes it harder to cool.

October 24, 2013 | 12:27 AM - Posted by arbiter

Given all the hype this card got over the last few months, how it would be a Titan killer, etc., looking at most games it's pretty much in an even group with the Titan and almost on par with the 780. There's still a question about that 95C temp: is that where it would get to in games? And if given the same temp limit, which side would win, since the 780/Titan cards do have some overclocking headroom? 95C is what it hit in these gaming tests on an open-air bench, which surely understates what real-world users would see inside an enclosed case. If those prices are correct then it does look to be a good deal.

October 24, 2013 | 01:14 AM - Posted by Fergdog (not verified)

I don't see why there'd be a lot of hype for this card. It's on the same 28nm process, and it's the same architecture as the 7790, which brought only very minor compute changes. It does give good competition to Nvidia's ridiculously priced high-end cards though, which is very good.

October 24, 2013 | 12:51 AM - Posted by PapaDragon

The first page I went to was the Battlefield 3 page..lol. Great performance all around!!! I'm surprised that it costs $550 (good), but that heat output/noise and temp...it's OK. Can't wait to see the Lightning/DirectCU versions. Also can't wait for the price drops on the GTX 780 too.

@Ryan, in the BF3 section, the screenshot of the BF3 settings puts it at 1920x1200, but the graphs show it at 1080p??

2. The Sound levels of the 280x are missing!

Thanks 4 the review!!

October 24, 2013 | 10:42 AM - Posted by Ryan Shrout

Trust the settings, not the resolution on that screen.  :)

October 24, 2013 | 01:18 AM - Posted by Scyy (not verified)

I see what you meant about watching for other reviews that don't allow the cards to throttle. Everywhere else is showing huge leads that likely diminish somewhat after playing for a short while once the card throttles.

October 24, 2013 | 01:38 AM - Posted by Anonymous (not verified)

The card looks like a toy compared to the Titan visually, which probably also contributes to its poor thermals, i.e. the quality of the plastics etc. But wow on that price!! I dunno, I guess the budget-limited would go for this, but the higher quality card is the Titan. If you've got a high-end PC you're still gonna get the Titan; what's a few hundred bucks when you've got 5 or 7k in it?

October 24, 2013 | 02:19 AM - Posted by MArc-Andre B (not verified)

Flip on uber mode + the fan slider to 100% (it will run up to 100%, but not always) and watch the magic! (Maybe save yourself the 5 minutes of warming up the card too!)

October 24, 2013 | 10:42 AM - Posted by Ryan Shrout

Watch the magic and listen to the wind!! :D

October 24, 2013 | 02:40 AM - Posted by 63jax

I'm very curious about the Toxic version of this, can't wait!

October 24, 2013 | 03:44 AM - Posted by Klimax (not verified)

Simply brute force. Nothing smart about it.

Also, what is the highest tessellation setting in Metro LL, and if that's not the normal one, why wasn't it used? (I guess because AMD still sucks at tessellation.)

October 24, 2013 | 03:47 AM - Posted by Matt (not verified)

"The AMD Radeon R9 290X delivers here as well, coming in not only $350 less expensive than the GeForce GTX Titan, but $100 less than the GeForce GTX 780 as well."

Titan: $999
R290x: $549

Price difference: $450

October 24, 2013 | 10:43 AM - Posted by Ryan Shrout

Doh, math is hard.

Fixed, thanks!

October 24, 2013 | 05:00 AM - Posted by nasha thedog (not verified)

The conclusion and overall results are very different from several other tests I've seen so far. Personally I don't think 4K results really matter; 99% of gamers won't have it for a year or two and this generation will have been replaced by then. AMD are silly for demanding reference-only on release; it looks cheaply made for a top-end release. I want to see it compared at relevant resolutions, i.e. from 1920x1080 to 2560x1600, and I want to see, for example, the MSI 290X Gaming against the MSI 780 Gaming, both out of the box and overclocked, before I make any final purchases. As it stands, reference vs. reference, even with your results leaning to AMD, I'd still go with a 780 reference over a 290X reference, as it seems it'd be faster, cooler, and quieter when both are overclocked. But like I said, that's neither here nor there; bring on the non-reference models.

October 24, 2013 | 06:49 AM - Posted by symmetrical (not verified)

Ryan, shouldn't it be $450 less than a Titan...?

October 24, 2013 | 10:43 AM - Posted by Ryan Shrout

Yup, sorry!

October 24, 2013 | 07:12 AM - Posted by HeavyG (not verified)

As a dual Titan owner, I am very impressed with these scores. I am not sure I would trade the performance gain for the heat and noise penalty, but either way, if my cash balance were the determining factor in which card to buy, it would clearly be the R9 290X.

Luckily my cash balance doesn't make all of the final decisions. Anyways, I can't wait to see how well this card does when some non-reference cooled versions are tested. It might make a big difference compared to that crappy AMD cooler.

October 24, 2013 | 08:34 AM - Posted by Anonymous (not verified)

"coming in not only $350 less expensive than the GeForce GTX Titan" 999-549=450

October 24, 2013 | 10:44 AM - Posted by Ryan Shrout

Fixed!

October 24, 2013 | 09:30 AM - Posted by Mac (not verified)

I'm not seeing the issue with the "up to 1GHz" clocks thing.
This thing has DPM states anywhere from 300 to 1000 MHz.
We've seen as low as 727 MHz in FurMark.

October 24, 2013 | 09:48 AM - Posted by Anonymous (not verified)

Has there been any testing on multi-display setups? I would like to see if the 4K performance also helps with multi-display gaming, like Battlefield on 3 screens.

October 24, 2013 | 10:44 AM - Posted by Ryan Shrout

4K on tiled displays and Eyefinity should be very similar - but I'll be doing testing this week.

October 24, 2013 | 10:05 AM - Posted by Remon (not verified)

Did you warm up the Nvidia cards too?

October 24, 2013 | 10:45 AM - Posted by Ryan Shrout

Yup!

Though to be honest the performance deltas we've seen on NVIDIA's cards are very minimal.

October 24, 2013 | 10:59 AM - Posted by nabokovfan87

I know it sounds trivial, Ryan, but I'd love to see the breakdown of average clock vs. boosted/advertised clock, like you did for the AMD cards, going into ALL reviews, if not just a follow-up article with the Titan and the 290X.

As a consumer I am extremely hesitant to trust boost speeds on CPUs and GPUs. If it can run at a speed at one point on all cores, then that's the max (in my opinion). There is also the matter of all of this being temperature dependent. As someone suggested above, for the sake of proving out their internal temp/boost logic, perhaps do the same average clock speed tests after warm-up with and without the fan at 100%.

I live down in SoCal, and my room is ALWAYS 85-90 F when running my PC, and during the summer it's possibly worse. So if I spend this amount of money on a card and it says it will perform at those temps, then I'm likely going to hit them, and with boost being the new fad I get the vibe that it will always downclock and run horribly.

When you were discussing the clocks with Josh and you showed average clocks, I think that's half the story. Is it because of temp, is it because the GPU doesn't NEED to be at 1 GHz for that sequence of the game, or is it a time-limited boost? Hard to say, but when it sits at ~850 MHz vs. 1 GHz and still kills the 1000 MHz card, I think it's pretty hard to be that upset by it. With updates it will get even faster, too.

Lastly, I started watching your thoughts and conversation with Josh via the YouTube video. I know you don't like some of the things AMD is saying vs. doing, and I respect you pointing those out, but when you go to review their parts and every sentence is finished with "but the Titan does X" or "but NVIDIA has X", it seems like you are an advertiser more than a bipartisan observer of facts and information.

October 24, 2013 | 01:54 PM - Posted by Ryan Shrout

Thanks for the feedback.  I'll address one by one.

1. I can do that in the coming weeks, yes.  Note that we did do some of this when Kepler launched and found the "typical" boost clocks that NVIDIA advertised to be pretty damn accurate.  Also, no fan runs at 100%, thank goodness.  :)

2. An ambient temp that high will definitely have adverse effects on your graphics card, NVIDIA or AMD.  I hope you have good cooling in there and use some noise-canceling headphones!

3. There is never a time when the GPU decides it doesn't "need to be" at 1.0 GHz for AMD.  The lower clocks are simply a result of what the GPU can run at while not crossing the 95C temperature level.

4. Keep in mind that my video on the R9 290X is not an ad for the AMD R9 290X.  It is meant to be an editorial commentary on the product which I believe requires direct comparison to the previous generation and the current competition.  That means I will talk about what the TITAN and 780 do better, and what they do worse.  The fact that I comment on that throughout should be indicative of that and nothing more.

October 24, 2013 | 10:42 PM - Posted by nabokovfan87

Thanks for the reply, Ryan. Can you elaborate on number 3? If temp doesn't cause it, then why not just have a base clock and avoid boost altogether?

I agree with you that you have to compare/contrast, but perhaps a better format is simply stating the results and then doing the "but the competition does X" bit afterwards. Yes, you have to compare by the very nature of using charts and graphs, but as someone who ultimately wants to make their own judgment based on valid and reliable data, the constant interruption of "but it's loud vs. the competitor", compared to simply saying "but it's loud compared to last gen / our usability standards", was very off-putting, as an example.

October 24, 2013 | 10:35 AM - Posted by Truth Seeker (not verified)

As the previous poster said: if you are not warming up the NVIDIA cards, then you have unfairly biased the review against AMD.

The vast majority of other reviews conclude that the AMD R9 290X wins on performance (it cruises past the Titan at 4K) and is the clear winner in price per performance.

Coupled with water blocks (as EK indicates at the Overclock.net link), the R9 290X hits 1200/1600 easily and becomes an overclocked rocket.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/...

October 24, 2013 | 10:45 AM - Posted by Ryan Shrout

That is basically the same conclusion that I made...?

October 28, 2013 | 08:29 AM - Posted by Panta

The R9 290X hits 1200/1600 easily? I doubt it.
I assume that was a cherry-picked card.

October 24, 2013 | 11:58 AM - Posted by Anonymous (not verified)

Almost considered getting one of these until I saw the 95C temps.

Go multi-GPU and you end up overheating all your other expensive components and driving down your PSU's efficiency.

C'mon AMD! I want to buy from you and support you guys but get your sh!t together.

That's too damn hot. I wonder how many RMA's due to overheated parts are coming down the line.

October 24, 2013 | 12:51 PM - Posted by Chloiber (not verified)

It's a great card, but the reference cooling is, as always, unusable. I had hoped that AMD would improve this time. It's really a sad story: they should invest way more in their coolers (and raise production costs a bit; they can easily go up a bit in price).
I know that custom designs will appear which remedy this problem quite a bit, but it still leaves a very big disadvantage, and everyone who reads the reviews gets a bad impression of the card. I think AMD will never learn.

October 25, 2013 | 12:11 PM - Posted by TwinShadowx (not verified)

Can't please everyone. I'd rather have a cheap-looking card that runs hot and loud and costs $550 than a card that's overpriced; at the end of the day they both get the job done. Also, AMD could have spent the time and money on a real cooler and charged $1000 too.

October 24, 2013 | 01:00 PM - Posted by Anonymous (not verified)

This price is excellent. We won't have to pay a huge premium for the modified versions that Asus, MSI, and others are going to make of the stock 290X. Imagine an Asus Ares 290X version with a silent cooler. Energy consumption can't change much, but noise is something I won't tolerate.

Happy happy happy.

October 24, 2013 | 01:16 PM - Posted by Anonymous (not verified)

Looks like the Titan kicked its arse.
NVIDIA FTW!

October 24, 2013 | 10:43 PM - Posted by nabokovfan87

are you mad?

October 24, 2013 | 04:04 PM - Posted by ltguy005 (not verified)

I would like to see the R9 290X and the GTX Titan benchmarked together with the fans at the same RPM and the temperature throttle point set to the same value (like 70C), so that each card can boost up as much as possible while staying within that noise/temp envelope.

October 24, 2013 | 06:31 PM - Posted by jon (not verified)

Great review, Ryan.
I really like your frame time graphs, especially @1440p.
It was also shocking to see how low the 290X's average clocks really are. And still it competes!

It would be great if you could add the real average clock speeds for a particular bench, instead of stating the boost clock/base clock behind the name of each card (in future reviews).

That would make for an easier comparison, since clock speeds seem to be all over the place on both the 290X and NVIDIA cards.

October 25, 2013 | 07:06 AM - Posted by Johnny Rook (not verified)

The gaming experience I can get from a reference R9 290X would be better than the reference GTX 780 gaming experience if:

a) I use noise-canceling headphones;

b) I use a water block;

c) I am willing to pay a bit extra on my power bills.

But wouldn't that make the R9 290X "experience" cost me ~$750, or more?

No, the R9 290X is not worthy of a Gold Award.
Sorry Ryan, but I think you are not being consistent with yourself; the GTX 780, which already performs better than a TITAN, didn't even get a Brass Award in your review!

And I don't even want to keep writing about how the "gaming experience" a card delivers matters more to me than its "raw performance" numbers, because that is a personal preference.

Anyway, for those who only care about "raw performance": real owners, on real-world gaming rigs, have already started posting R9 290X results in forums, and those results are no better than the results posted previously by GTX 780 owners.

October 27, 2013 | 08:13 AM - Posted by Daniel Nielsen (not verified)

It's a review, it's his opinion, and it's just as valid as yours.


December 11, 2014 | 12:38 AM - Posted by Anonymous (not verified)

I'm a long-time gamer and tester, and far from biased; I have loved NVIDIA and ATI for years and like both. The reference card is a hotter-running card, but with a little common know-how the R9 290X reference can have its heat controlled for free and its noise limited if set right; just use MSI Afterburner. My GTX 780 under load in FurMark hits 87C at 1280x720, open-air, room temp 75F. The R9 290X hit 87 to 90 at the same settings. The NVIDIA card actually adjusts the fan speed like it should; the R9 does not. So I basically set my R9 fan speed to the same temperature curve as the 780, and, magic, the fan works perfectly: my temps are now 78 under full load with the fan only slightly louder, nothing to complain about.

If you have never used FurMark, go get it from Geeks3D; this program loads a card like no other. This card at its price point is a great deal for the performance. It's not that AMD put on a cheap cooler/heatsink; it's actually a great one, but the reference BIOS is not set right for the fan. And honestly, the fan in an AMD card is a much higher-RPM unit, so these things can put off some serious noise, but you can pick up a reference card fairly cheaply because of these issues. Trust me: get MSI Afterburner, set it to start with Windows, and adjust these settings. If you need my settings, reply and I'll get back to you and help. There's an easy, great fix and a very fast card for a lot less cash. Like I said, I like both, but with new games being developed and streamlined for AMD for the next few years, I see AMD cards being a bit on top, unlike the past, when NVIDIA had the mainstream connection with game developers.

October 26, 2013 | 08:21 PM - Posted by wujj123456

International Stutter Units... LMAO


November 9, 2013 | 04:21 PM - Posted by Anonymous (not verified)

Hi, page 1 of this article states that the GTX Titan/780/280X/7970 GHz have a 320-bit bus when in reality they are 384-bit.

