AMD Radeon R9 290X Hawaii - The Configurable GPU?

Author: Ryan Shrout
Manufacturer: AMD

Power Consumption, Sound Levels and NVIDIA Variances

Performance and clocks are not the only things that change when you adjust the maximum fan speed on the R9 290X.  Power consumption and noise levels increase as a result as well.


Power consumption was measured during Run 2 of this scenario, when the GPU was at its hottest.  Clearly, as the GPU sustains higher clocks, the power draw rises as well.  At the 20% fan speed setting, which kept the GPU clock at 727 MHz most of the time, the R9 290X system was drawing 317 watts.  That is 37 watts lower than the power draw at the "reference" 40% fan speed setting and 119 watts lower (!!) than the power draw of the card at the 60% fan speed setting. 

It is also worth noting that the maximum performance setting of 60% fan speed results in a 23% increase in whole-system power consumption compared to reference; a difference of 82 watts is not insignificant.
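The whole-system figures above can be cross-checked with some quick arithmetic. A minimal sketch (the wattages are the ones reported in this article; the variable names are mine):

```python
# Whole-system power draw reported above: 317 W at the 20% fan setting,
# with the 40% and 60% settings drawing 37 W and 119 W more, respectively.
draw_20 = 317
draw_40 = draw_20 + 37    # "reference" 40% fan speed setting
draw_60 = draw_20 + 119   # maximum performance 60% setting

delta = draw_60 - draw_40               # gap between 60% and reference
pct_increase = delta / draw_40 * 100    # increase relative to reference

print(draw_40, draw_60, delta, round(pct_increase, 1))
# 354 436 82 23.2
```

The 82 W gap and the ~23% increase quoted in the text both fall out of the same three measurements.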


As fan speed increases, so does the noise level from the R9 290X's blower-style cooler.  At the reference setting we measured 42.6 dBA, which jumps to 53.4 dBA in the maximum performance mode of our Hawaii GPU.  Make no mistake, this card can get loud. Even at its default settings, the R9 290X is louder than the current GeForce GTX options on the market.
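To put the 42.6-to-53.4 dBA jump in perspective, decibels are logarithmic, so a rough conversion sketch (standard acoustics formulas, not additional measurements from this review):

```python
ref_dba, max_dba = 42.6, 53.4   # noise levels measured above
delta = max_dba - ref_dba       # about 10.8 dB

power_ratio = 10 ** (delta / 10)     # ratio of acoustic power
loudness_ratio = 2 ** (delta / 10)   # rule of thumb: +10 dB sounds twice as loud

print(round(delta, 1), round(power_ratio, 1), round(loudness_ratio, 2))
# 10.8 12.0 2.11
```

In other words, the 60% fan setting emits roughly twelve times the acoustic power of the reference setting and sounds a bit more than twice as loud.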

NVIDIA GeForce Comparison

Although this article really focuses on the claims that AMD is making about Hawaii, I know that readers would like to see NVIDIA Kepler included in some way.  NVIDIA introduced GPU Boost with Kepler in early 2012 and was actually criticized by many at the outset for offering too much variability in clock speed.  They did not specifically give a clock rate that the GPU would run at for all applications.  This was a very new concept for us at the time: GPUs that did not run at a single clock rate were confusing and scary.  Obviously both parties involved now see the benefits this option can offer.



These graphs compare a high-end GeForce GTX graphics card to the R9 290X at the 40% fan setting in terms of clock rates.  Performance isn't measured directly here, although these are comparable parts. 

During the Run 1 results, both GPUs are able to maintain near-maximum clock speeds, as you would expect based on previous data.  The Run 2 graph shows the same 40% fan speed results for the 290X seen on the previous pages, but it also shows how the GeForce GTX GPU reacts.  The NVIDIA card also drops in clock speed (from about 1006 MHz to 966 MHz, a 4% drop) but it maintains that level throughout.  That frequency is also above the advertised base clock. 

This is only a single game, and a single level, being tested, but my experience with both AMD's and NVIDIA's variable clock implementations shows this is a typical scenario.


Closing Thoughts

There are two key issues worth looking at from this story.  The first issue is the notion that the R9 290X is an enthusiast configurable graphics card and the second is the high variability in the clocks of the 290X (even at default settings).

Let me first comment on the clock variability.  In my initial 290X review I wrote:

Notice that not one of them (average clock speeds we reported) actually averaged over 900 MHz (though Skyrim got very close) and Bioshock Infinite actually was the slowest at 821 MHz.  That is an 18% drop in frequency from the "up to" rating that AMD provided us and provided add-in card vendors.


I think AMD needs to come up with a frequency rating that is more useful than "up to" for the R9 290X, as the current one is misleading at best.  NVIDIA adopted a "typical Boost clock" for its GPUs that seems to have worked, and it also advertises a "base clock" that it guarantees the GPU will never drop below during normal gaming.  R9 290X users should want the same information.

After this further research, I firmly believe that AMD needs to take charge of the situation and present relevant performance data to consumers.  Imagine the variance we'll see once retail cards with better (or worse) coolers start showing up in the market.  Will AMD allow its partners to claim wildly higher "up to" clock rates in order to catch the enthusiast who did not come across the results we have presented here?  The R9 290X, empirically, has a base clock of 727 MHz with a "typical" clock rate in the 850-880 MHz range.  "Up to 1000 MHz" has very little relevance.
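The gap between the marketing number and observed behavior is easy to quantify. A small sketch using the clocks cited in this article (the labels are mine):

```python
# Clocks cited in this article, versus AMD's "up to 1000 MHz" rating.
up_to = 1000  # MHz
observed = {
    "empirical base": 727,              # sustained floor seen in testing
    "typical low": 850,                 # bottom of the "typical" range
    "typical high": 880,                # top of the "typical" range
    "Bioshock Infinite average": 821,   # from the initial review quoted above
}

for name, mhz in observed.items():
    drop = (up_to - mhz) / up_to * 100
    print(f"{name}: {mhz} MHz, {drop:.0f}% below the rated clock")
```

The Bioshock Infinite line reproduces the 18% figure quoted earlier, and the empirical base clock sits a full 27% below the advertised maximum.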


AMD's claim that the R9 290X is an "enthusiast configurable GPU" does have some merit, though.  I would agree that most users purchasing a $500+ graphics card will have a basic knowledge of the control panel and its settings.  Gamers will be able to get some extra performance out of the R9 290X by simply increasing the maximum fan speed, as long as they are comfortable with the increased sound levels. 

What you cannot do, though, is set the maximum fan speed below 40% to get a quieter-than-stock solution.  At the 20% and 30% settings, the fan ignored my preference in order to maintain 95C at the 727 MHz base clock.  Hawaii appears to be pegged at that base level until AMD's partners come up with quieter and more efficient cooling solutions.

Have questions?  Thoughts?  Leave them in the comments below!!

November 4, 2013 | 11:13 PM - Posted by Ryan Shrout

While the sound of rain drops or a low urban setting (another comparison I have seen for 40 dB) doesn't sound like much, just consider that this is sitting on your desk, or under it, the entire time you are gaming.  I think noise level is something that more people care about than they realize.

November 4, 2013 | 02:02 PM - Posted by Easycure

Perhaps a comparison at 80 degrees would be a good one.
Nvidia Titan and 780 at 80 degrees vs 290X at 80 degrees.

Have not seen a performance comparison at 80C anywhere.

November 4, 2013 | 11:14 PM - Posted by Ryan Shrout

While interesting from a scientific view, it wouldn't really be fair to AMD.  They made the decision to allow 95C temps, so we'll go with that for most testing.

November 6, 2013 | 10:53 PM - Posted by Principle (not verified)

How about we do the 95C comparison. Oh wait, Nvidia's GPU won't live at that temp. That's a good thing from AMD; it's a robust design that will benefit overclockers who use good cooling solutions.

November 7, 2013 | 03:25 AM - Posted by arbiter

You are assuming the reference cooler is just that bad and that AMD GPUs don't put out that much heat. I bet an Nvidia GPU could handle 90C+; they just know people don't want to turn their computer case into an oven.

November 4, 2013 | 02:14 PM - Posted by rezes

I would like to congratulate you.
Again, you hit the bullseye.
First the FCAT problem, now the temperature problem.
Thanks for putting the facts out.

November 4, 2013 | 03:01 PM - Posted by bchiker

Thanks for the overall article and I believe that the overall point is spot on. An operating clock speed range would be more honest than the "Up To" advertising.
Luckily most reviews that I have read have recognized this and run their tests while the card was already hot.

One thing I would like to see is examples of what the sound levels are like. Reading "40.8 dB" or "53.4 dB" does not tell me what it sounds like.
Looking at a handy chart from Industrial Noise Control helps put this in perspective. Forty dB is like a "library, bird calls (44 dB); lowest limit of urban ambient sound" and 50 dB is a "quiet suburb, conversation at home; large electrical transformers at 100 ft". They describe 60 dB as restaurant conversation. This puts the noise levels into perspective and gives a better idea of what they compare to.

It appears that this is one card to wait on the custom cooler designs and see who comes up with the best design.

November 4, 2013 | 11:15 PM - Posted by Ryan Shrout

We might be able to do some recordings.  Thanks for the idea!

November 4, 2013 | 03:19 PM - Posted by Lithium (not verified)

Custom coolers, right
And all that heat into the case, right

November 4, 2013 | 04:19 PM - Posted by MasterT (not verified)

Are there any custom coolers on any of the high-end cards that don't dump the heat into the case?

November 4, 2013 | 05:30 PM - Posted by Anonymous (not verified)

Which "GeForce GTX card" was used? And where is its performance variance data?

November 4, 2013 | 06:53 PM - Posted by Anonymous (not verified)

"During the Run 1 results, both GPUs are able to maintain near-maximum clock speeds, as you would expect based on previous data. The Run 2 graph shows the same 40% fan speed results for the 290X seen on the previous pages, but it also shows how the GeForce GTX GPU reacts. The NVIDIA card also drops in clock speed (from about 1006 MHz to 966 MHz, a 4% drop) but it maintains that level throughout. That frequency is also above the advertised base clock."

Double standard.

He seems to have a gripe with AMD. It's perfectly fine for Nvidia to drop MHz from boost toward base, but not okay if AMD drops from "up to 1 GHz".
It's not AMD's fault if you fail to comprehend it.

He's also comparing the 40% Silent Mode to Nvidia. He's glossing over the fact that in Uber Mode the drop is almost non-existent, since the fan will be at 55%.

Probably since Nvidia doesn't have an Uber Mode switch yet, he felt the need not to compare it.

What it really says is that his benchmarking method has been faulty, since it's only a benchmark run and not a game-environment test. Nothing common sense wouldn't have solved.

Thanks followers for helping us make you dumber.

November 4, 2013 | 11:17 PM - Posted by Ryan Shrout

40% is the default, out-of-box setting; that's why it was tested.  We clearly show the drop in clocks for NVIDIA as well, but it's only a 5% drop, compared to the 20-35% drop for the AMD card here.  Again, at the default setting.  

Also, NVIDIA clearly advertises the clock variance while AMD does not.

November 4, 2013 | 11:32 PM - Posted by Anonymous (not verified)

not the same Anonymous.

I asked which NVIDIA card was used, and whether you have any of NVIDIA's performance data related to that run.

November 5, 2013 | 12:20 AM - Posted by Anonymous (not verified)

Pathetic explanation.

"default setting"

You tested non-default settings at 20, 30, 40, 50 & 60.

At 50% the graph shows it returning to 1 GHz, unlike the Nvidia card, which dips 20-35% and stays there.

Uber Mode is a simple flip of a switch, which will give you a 55% fan speed with almost no clock drops. After all, these graphics cards don't install themselves.

Unless these buyers are utterly clueless, as you seem to indicate your readers are, I might be inclined to agree with you.

November 5, 2013 | 01:31 AM - Posted by serpico (not verified)

I'm not sure what statements you disagree with. Does AMD advertise base clocks and boost clocks and give the buyer an idea of what to expect?

Does NVIDIA not advertise base clocks and boost clocks, as Ryan stated? What are you disagreeing with?

"You tested non-default settings at 20, 30, 40, 50 & 60."
Yeah, he tested these out of scientific curiosity. They show some interesting results. They are not strictly necessary for the key points of this article.

"Uber Mode is a simple flip of a switch, which will give you a 55% fan speed with almost no clock drops. After all, these graphics cards don't install themselves."
Sure, but if everyone should switch to 55%, why does the card even ship at a 40% default? Maybe you should contact AMD about this problem.

Are you asking why Ryan didn't have the same scientific curiosity when he compared to Nvidia? I don't know; why don't you simply ask him nicely to test at 55% against Nvidia?

It's not like he's posting anti-AMD articles all day long. He already gave the PCPer Gold Award to the R9 290X and showed that the CrossFire results were very good.

November 4, 2013 | 06:32 PM - Posted by Brad (not verified)

So can we assume that a good custom cooler will keep the card at or close to 1 GHz all the time? Plus overclocking headroom.

November 4, 2013 | 07:37 PM - Posted by Anonymous (not verified)

I can smell a little fanboy reviewer. How much did Nvidia pay you for this?

November 4, 2013 | 11:18 PM - Posted by Ryan Shrout

Like, 100 million dollars man.

Wanna see my gold Lambo?

November 5, 2013 | 12:25 AM - Posted by Anonymous (not verified)

It's not a gold Lambo.

It has to have room for him and Tom Peterson to continue their bromance.

I'm thinking a spacious minivan.

November 4, 2013 | 08:58 PM - Posted by Anonymous (not verified)

Ryan, honestly now, who is making you write these articles, Nvidia or AMD, or both? It might be just me, but I have a feeling you are getting some Nvidia reply to AMD reviews ready as we speak; you are probably working on it. I get it: AMD used to sponsor your site, then they didn't, then Tommy showed up in your office, then Nvidia sponsored you (or maybe the other way around). You probably own an Nvidia card at home yourself (because the wife probably doesn't mind the noise that it doesn't make, and she would probably divorce or kill you, or even worse, if you bought a 4K ASUS PQ321Q), so you are limited to a 2560 x 1600 experience on your Apple 30", which Nvidia SLI may be great at so far. But once you get into multi-monitor and 4K territory, which Nvidia lacks so far, I think I could have a better experience with an AMD Radeon R9 290X, maybe even better with XDMA. I think that's something SLI was not designed for, or is capping somewhere, versus XDMA, plus the 3 GB cap; you know, the more GBs the better.
Let's say I'm divorced at the time and can hypothetically afford some crazy things, like 3x R9 290 + 1x R9 290X (XDMA; I think it might work, since AMD is not so uptight about having everything the same down to the manufacturer and model of card).
Will they get hot? Will I still heat my house if I water cool it, for the sake of silence? From the money saved per card versus 4x 780 Ti SLI ($2,800 vs. 3x <$550 + $550 = less than $2,200, which is what I paid for my four Yamakasi monitors: 3 portrait, 1 landscape center-top for browsing and other crap), I would be able to afford a good water loop for around $600, plus the extra work and everything else with it. Do you think I would have a better experience as an extreme enthusiast, based on your 4K review and these guys' recent CrossFire review? Maybe throw in some 4-way RAID 0 SSDs, then RAID 1 with 4 other SSDs, for Allyn's sake.
If I do any of this crazy talk at some point in my busy life, would you consider posting about it on your site?

November 4, 2013 | 11:19 PM - Posted by Ryan Shrout

I don't...  Who said...  What...?


November 5, 2013 | 01:45 PM - Posted by Jeremy Hellstrom

Hope that guy wasn't one of the Indiegogo backers that gets to come on the show.

November 6, 2013 | 09:26 AM - Posted by Anonymous (not verified)

Keep hoping, you might be surprised. I was joking, ironically, more or less; I guess you guys don't like those kinds of jokes.
I like Ryan's response though; he's playing along and acting like nothing happened. After all that writing, I still like your honesty and the podcasts you people do. I am the one pushing you to the edge so you can be a better you. Or not.

November 6, 2013 | 08:47 PM - Posted by TY City (not verified)

Thanks for the review. What do you think of Tom's review, and the fact that they bought two retail R9 290Xs and got drastically different results from the review samples?

Would love for you guys to buy one (a 290X) anonymously and test it out.

November 7, 2013 | 03:29 AM - Posted by arbiter

Ryan did say they bought a few retail cards to test that. As for what was on Tom's Hardware, they asked AMD about it and AMD said it was a bad card, which is possible, since a 15-20% drop is pretty bad. It could well be. Most reviews use an open-air bench, which helps all cards reach lower temps, but when a card like this has no problem getting to 95C, it makes you wonder how much the card will clock back inside an enclosed case.

November 5, 2013 | 06:51 PM - Posted by Anonymous (not verified)

Why don't you crawl back into your AMD fanboy cave retard & GTFO!

Good review, Ryan. I wanted to know this as well before buying an R9 290 card. But I will wait for some aftermarket coolers, and for reviews of those to see if the throttling is better. If not, I think I might go Nvidia.

November 6, 2013 | 09:31 AM - Posted by Anonymous (not verified)

I was joking, but I am still waiting for the 780 Ti reviews too, and then I'll wait some more, and so forth. But I gotta say I like your Nvy Dia fanboy reaction. It's a discussion, not a battle, but I'll take you up on both; name the time and place. And yes, Jeremy, I am one of Ryan's backers, and a pretty big one too.

November 5, 2013 | 09:14 AM - Posted by jon (not verified)

Nice info there, Ryan.

What Nvidia card is that?
If it's not the GTX 780, it would be nice if you could add it in.
I would love to know the difference in perf/clock of Hawaii vs. GK110-based cards.

Maybe you could do a clock-for-clock review?
Sadly, no review site offers such info :(
