
AMD Radeon R9 290X Hawaii - The Configurable GPU?

Author: Ryan Shrout
Manufacturer: AMD

Power Consumption, Sound Levels and NVIDIA Variances

Performance and clock speed are not the only things that change when you adjust the maximum fan speed on the R9 290X.  Power consumption and noise levels increase right along with it.


Power consumption was measured during Run 2 of this scenario, when the GPU was at its hottest.  Clearly, as the GPU sustains higher clocks, the power draw rises as well.  At the 20% fan speed setting, which kept the GPU clock at 727 MHz most of the time, the R9 290X system was drawing 317 watts.  That is 37 watts lower than the power draw at the "reference" 40% fan speed setting and 119 watts lower (!!) than the power draw of the card at the 60% fan speed setting.

It is also worth noting that the maximum performance setting of 60% fan speed results in a 23% increase in whole-system power consumption compared to reference; a difference of 82 watts is not insignificant.
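To make those deltas concrete, here is the arithmetic behind the figures above, using the whole-system wattages as measured:

```python
# Whole-system power draw during Run 2 at each maximum fan speed setting,
# using the wattages measured above.
draw = {"20% fan": 317, "40% fan (reference)": 354, "60% fan": 436}

reference = draw["40% fan (reference)"]
for setting, watts in draw.items():
    delta = watts - reference
    print(f"{setting}: {watts} W ({delta:+d} W, {delta / reference:+.1%} vs. reference)")
```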


As fan speed increases, so does the noise level from the R9 290X's blower-style cooler.  At the reference setting we measured 42.6 dBA, which jumps to 53.4 dBA in the maximum performance mode of our Hawaii GPU; an increase of nearly 11 dBA that the ear perceives as roughly twice as loud.  Make no mistake, this card can get loud.  Compared to the current GeForce GTX options on the market, the R9 290X is louder even at its default settings.

NVIDIA GeForce Comparison

Although this article really focuses on the claims that AMD is making about Hawaii, I know that readers would like to see NVIDIA Kepler included in some way.  NVIDIA introduced GPU Boost with Kepler in early 2012 and was actually criticized by many at the outset for offering too much variability in clock speed.  They did not specifically give a clock rate that the GPU would run at for all applications.  This was a very new concept for us at the time: GPUs that did not run at a single clock rate were confusing and scary.  Obviously both parties involved now see the benefits this option can offer.


These graphs compare the clock rates of a high-end GeForce GTX graphics card and the R9 290X at the 40% fan setting.  Performance isn't measured here directly, although these are comparable parts.

During the Run 1 results, both GPUs are able to maintain near-maximum clock speeds, as you would expect based on the previous data.  The Run 2 graph shows the same 40% fan speed results for the 290X from the previous pages, but it also shows how the GeForce GTX GPU reacts.  The NVIDIA card also drops in clock speed (from about 1006 MHz to 966 MHz, a 4% decrease) but it maintains that level throughout.  That frequency is also above the advertised base clock.

This is only a single game and a single level being tested, but my experience with both AMD's and NVIDIA's variable clock implementations shows this is a typical scenario.

 

Closing Thoughts

There are two key issues worth looking at from this story.  The first issue is the notion that the R9 290X is an enthusiast configurable graphics card and the second is the high variability in the clocks of the 290X (even at default settings).

Let me first comment on the clock variability.  As I wrote in my initial 290X review:

Notice that not one of them (average clock speeds we reported) actually averaged over 900 MHz (though Skyrim got very close) and Bioshock Infinite actually was the slowest at 821 MHz.  That is an 18% drop in frequency from the "up to" rating that AMD provided us and provided add-in card vendors.

...

I think AMD needs to come up with a frequency rating that is more useful than "up to" for the R9 290X, as it is misleading at best.  NVIDIA adopted a "typical Boost clock" for its GPUs that seems to have worked, and it also indicates a "base clock" that it guarantees the GPU will never go below during normal games.  R9 290X users should want the same information.

After this further testing, I firmly believe that AMD needs to take charge of the situation and help present relevant performance data to consumers.  Imagine the variance we'll see once retail cards with better (or worse) coolers start showing up in the market.  Will AMD allow its partners to simply claim some wildly higher "up to" clock rates in order to catch the enthusiast who did not come across the results we have presented here?  The R9 290X, empirically, has a base clock of 727 MHz with a "typical" clock rate in the 850-880 MHz range.  "Up to 1000 MHz" has very little relevance.
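To be concrete about the rating I am proposing, here is a minimal sketch of how "base" and "typical" clocks could be derived from a clock-over-time log.  The sample values are hypothetical, purely for illustration:

```python
from statistics import median

# Hypothetical per-second clock samples (MHz) from a long gaming session.
samples = [1000, 940, 880, 868, 861, 852, 847, 839, 811, 727]

base = min(samples)        # the floor the card held even while throttling
typical = median(samples)  # the sustained clock a gamer actually experiences
peak = max(samples)        # the "up to" number, rarely sustained under load

print(f"base: {base} MHz, typical: {typical:.0f} MHz, up to: {peak} MHz")
```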


AMD's claim that the R9 290X is an "enthusiast configurable GPU" does have some merit though.  I would agree that most users purchasing a $500+ graphics card will have a basic knowledge of the control panel and its settings.  Gamers will be able to get some extra performance out of the R9 290X by simply increasing the maximum fan speed, as long as they are comfortable with the increased sound levels.

What you cannot do, though, is drop the maximum fan speed below 40% to get a quieter solution than stock.  With settings of 20% or 30%, the card ignored my fan preference in order to maintain 95C at the 727 MHz base clock.  Hawaii appears to be pegged at that base level until AMD's partners can come up with quieter and more efficient cooling solutions.

Have questions?  Thoughts?  Leave them in the comments below!!

November 4, 2013 | 09:07 AM - Posted by ZoranICS

Slap a waterblock on it and it will be both quiet and superfast. Problem? :)

November 4, 2013 | 09:21 AM - Posted by D1RTYD1Z619

I agree. AMD should have shipped this thing watercooled in its own closed loop.

November 4, 2013 | 10:12 AM - Posted by Coupe

The price wouldn't remain the same, and that would ruin the whole value/enthusiast advantage of the card.

November 4, 2013 | 10:21 AM - Posted by ZoranICS

You do realize that a part of the card's price is that huge cooler, right? A semi-decent self-contained loop would be maybe $50-60 more than the current price and the performance would be stellar all the time. If done right, it wouldn't move away from the 1GHz core clock for a second and could also be almost silent at that...

November 5, 2013 | 12:58 AM - Posted by Anonymous (not verified)

Actually, the coolers are around $25~30 for a standard 290(X).

But for me the benefits of water cooling outstrip the cost, and the cost is lower than most people think; most of the time you get a 10~20% extra overclock and also a more silent system.

Some people and friends call me crazy for having a +€1000 water loop with 4x 140mm 80mm-thick rads and 2x 180mm rads in my TJ11, but when I went over from quad 5870 2GB Matrix cards on an R4E mobo with a 3930K @ 5.1GHz, I just had to replace the VGA waterblocks with two new Titan blocks for €250.

But the rest of my loop I will probably use for maybe another 10 years.

For all that money I got back a +/-10% extra overclock, but above all, even when I had 4 cards in my system pushing around 1000 watts, I still could not hear my system.

November 4, 2013 | 10:18 AM - Posted by ZoranICS

That would be an option :) But I'd rather have a good-quality block on it prepared for my own loop. I don't really like those self-contained thingies ;)

November 4, 2013 | 03:35 PM - Posted by Anonymous (not verified)

Or why not ship it with no cooler at all and give the owners that price cut for buying their own aftermarket coolers?

November 4, 2013 | 03:28 PM - Posted by Soap Aik (not verified)

Or buy a GTX 780Ti and be done with it.

November 4, 2013 | 03:31 PM - Posted by Anonymous (not verified)

And put a hole in your pocket as well ^_^

November 7, 2013 | 02:36 PM - Posted by Anonymous (not verified)

The same hole you get by buying an aftermarket cooler/waterblock... It comes down to preferences.

November 5, 2013 | 06:54 AM - Posted by Anonymous (not verified)

yeah.. not all users have liquid cooling or would plan to get one. Problem?

November 4, 2013 | 09:41 AM - Posted by BSBuster (not verified)

" Will AMD allow its partners to just claim some wildly higher "up to" clock rates in order to catch the enthusiast who did not come across the results we have presented here?"

Good Catch

AMD is again blurring the performance lines with marketing propaganda; you can't trust a debt-laden, desperate company. Nvidia's Titan-class coolers are far better than AMD's stock cooler. Their GPUs run cooler and more consistently without all the blower noise. Add in better drivers, SLI, the control panel, CUDA, PhysX, and visual quality, and Nvidia is all I've been recommending.

You get what you pay for with Nvidia; with AMD you get buyer's remorse from the bogus benchmarks you got duped by.

I do have to support both, so what's the best way to install a new AMD GPU driver? It's been nothing but problems for me. Nvidia driver updates work like a charm; maybe I wouldn't be so hard on AMD if at least that was made easy. About 2/3 of the GPUs I support are Nvidia, but about 2/3 of the driver problems are from AMD.

November 4, 2013 | 10:25 AM - Posted by Coupe

The AMD driver excuse is getting a bit old. That was from many years ago, and they have come light years since then.

Nvidia has had more driver issues recently, ones that caused catastrophic failures on their cards.

Comparing the $550 R9 290X to the $1000 Titan and talking about quality is pointless. For less than the price of a Titan you can add a water cooling solution that would run circles around the Titan in noise, temperature, and performance.

Even if you prefer Nvidia, you want this card because it brought down the price of the 780 a ton.

November 4, 2013 | 10:28 AM - Posted by ZoranICS

I am getting really tired of this "AMD drivers suck" thing. BOTH companies have good and bad drivers. nVidia drivers kill their cards at times; I've never heard of a suicidal Radeon...

To the point: both teams suck on the driver front, that is a given. But for the past 2-3 years I have personally experienced fewer driver-related problems with the Radeons than with the GeForces.

November 4, 2013 | 09:56 AM - Posted by Mechromancer (not verified)

Lesson: Don't buy a high-end GPU with a reference cooler.

November 4, 2013 | 11:16 AM - Posted by BSBuster (not verified)

"Don't buy a high-end GPU with a reference cooler."

I would say, "Don't buy a high-end GPU with a reference cooler from AMD."

Nvidia's Titan-class coolers are great in build quality and performance; you can OC right out of the box. The new GTX 780 Ti sounds like a good OCer with its stock cooler.

Water cooling is not an option; it would open a huge can of worms for me. Closed-loop cooling is great for CPUs but not for GPUs, since warranties would be voided. You would be surprised how many are returned because of leaks.

November 5, 2013 | 06:51 AM - Posted by Anonymous (not verified)

It's not that the Titan or GTX 780 coolers are good... it's the architecture that is more efficient at using electricity without leaking it away.

November 6, 2013 | 07:49 PM - Posted by Principle (not verified)

I beg to differ; those Nvidia GPUs can draw a lot of power too. But their reference coolers ARE superior to AMD's reference coolers. That's easy to see.

November 7, 2013 | 12:23 AM - Posted by arbiter

They are better, but I would bet that the Hawaii GPU makes more heat than its Nvidia counterparts.

November 9, 2013 | 11:30 PM - Posted by Anonymous (not verified)

I intentionally bought the reference card so that I can slap a good ol' waterblock on it, in the hope that the components would be of higher quality. Anyone willing to use the stock leaf-blowers and have their cards running at 95C should be very aware of heat-related issues. Better aftermarket cooling at about the same price would mean the use of cheaper electrical parts. I guess anyone willing to get a 290(X) should consider the cooling options available, since a simple hair or dust bunny stuck in the fan can ultimately destroy the equipment.
For me, personally, any piece of hardware running more than 20C over room temperature is a potential risk, so I invested in a massive water cooling kit. All you need to do then is get a new waterblock for the new card every 2 years or so. Also, the noise is almost non-existent.

November 4, 2013 | 09:57 AM - Posted by JohnGR (not verified)

Enthusiasts have known what they are buying since the first day this card was reviewed on the Internet.

People who don't know about specs and only buy the best card no matter the cost don't care about clock speed, only about performance. Is it really a problem if the card drops 100MHz but gives performance numbers equal to or better than the GTX 780? Really?

I understood the articles about Frame Rating and I supported PCPer 100%, but now you are just trying to find ways to make the card look bad and faulty.
If the performance is there, and IT IS, and if the default mode is the quiet mode, and IT IS the mode the card was presented and reviewed in, then the buyer can unlock more performance with a better air cooler or watercooling.
The only case where you are right is if someone takes this card and installs it in the smallest, worst-ventilated case ever created. It could also be a problem if he lives in the Sahara desert.....

November 4, 2013 | 10:36 AM - Posted by Scott Michaud

It is more like we are trying to make two points:

1) Show people what the actual performance is (because it can dip like 10-15% at default fan settings after just a couple of minutes). What if they based their decision on reading a website that runs a 30-second benchmark without allowing the card to warm up? They're going to get a significantly different experience 10 minutes into their gaming session unless they manually throttle up their fans (a warm-up-aware test loop is sketched after this list).

2) They should not say "up to" in their marketing material. This could be particularly bad if other companies (beyond AMD, beyond a less-than-reputable partner, beyond GPUs even) take this as an open invitation to advertise nothing except what their card (or whatever) can do for 30 seconds strapped to a leaf blower outside in a snowbank. Personally, I believe the system Ryan developed would produce much better numbers "on the box": the card obviously ignores minimum fan speed to maintain 727MHz (so that should be its base) but is comfortable at ~850-880MHz with default settings. I think this metric should be adopted by AMD.
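To illustrate point 1, a warm-up-aware test pass might look something like this; run_benchmark_pass is a hypothetical stand-in for whatever harness actually drives the game:

```python
import time

WARM_UP_SECONDS = 600  # heat-soak the card for ~10 minutes before measuring

def run_benchmark_pass() -> float:
    """Hypothetical stand-in: run one pass of the game benchmark, return avg FPS."""
    raise NotImplementedError("wire this up to the actual benchmark harness")

# Phase 1: warm up. These results are thrown away; the passes only bring the
# GPU to its steady-state temperature so throttling behavior is representative.
start = time.monotonic()
while time.monotonic() - start < WARM_UP_SECONDS:
    run_benchmark_pass()

# Phase 2: measure. Only these steady-state passes get reported.
measured = [run_benchmark_pass() for _ in range(3)]
print(f"steady-state average: {sum(measured) / len(measured):.1f} FPS")
```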

But you do highlight our challenge: what if you use headphones, or you purchase a custom cooler? We have seen settings where AMD's card can run stably at 1 GHz, but it is not the default... or does the switch give it two defaults? Does that even make sense to say?

If it is difficult for us to describe, it is difficult for our readers to fully understand. That means we need to try harder and find a better way to explain it.

Thanks for your comments!

November 4, 2013 | 10:20 AM - Posted by Coupe

Great article Ryan and some good points.

I think the advantages AMD has are how precisely PowerTune adjusts the 290X, how configurable it is, and the cost.

When I see the clock history in Afterburner being adjusted with extreme frequency and precision, it shows that, right now, the R9 has the best control over its GPU.

When you pass more control over to the user, it alleviates many concerns over temperature and noise. However, people who value power and temperature got a great treat because the 780's price dropped much lower.

Overall, I think these factors made the AMD launch great for the consumer. That is the most important part of its wide acceptance, and the difference from the Fermi launch some years ago.
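AMD has not published PowerTune's internals, but the behavior visible in the Afterburner clock history (rapid, fine-grained adjustments that ride a temperature target) is the classic shape of a feedback loop. A toy sketch of that general idea, with all constants made up:

```python
TARGET_TEMP_C = 95.0         # the temperature the 290X rides at under load
BASE_MHZ, MAX_MHZ = 727, 1000
STEP_MHZ = 13                # made-up adjustment granularity per control tick

def next_clock(current_mhz: int, temp_c: float) -> int:
    """One control tick: back off when over target, creep up when under it."""
    if temp_c > TARGET_TEMP_C:
        return max(BASE_MHZ, current_mhz - STEP_MHZ)
    return min(MAX_MHZ, current_mhz + STEP_MHZ)

# Example: a heat-soaked card gets walked down toward a sustainable clock.
clock = 1000
for temp in (96.2, 96.0, 95.4, 94.8):  # hypothetical sensor readings
    clock = next_clock(clock, temp)
print(clock)  # 974: three steps down while hot, one step back up
```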

November 4, 2013 | 10:42 AM - Posted by mab (not verified)

The first BIOS switch is called quiet.... I wonder why... maybe to run quiet? And the second BIOS switch should be called "up to 1000MHz", because it seems like it sticks to 1000MHz most of the time.

Mantle... Star Citizen... can't wait!

November 4, 2013 | 10:45 AM - Posted by Ryan Shrout

I don't consider the fan at 40% on this cooler to be quiet.  

November 4, 2013 | 11:56 AM - Posted by mab (not verified)

Your load reading on quiet for the 290X is 42.6 dBA (which, I know, is nowhere near the 35.6 dB from the Titan/780 in your own review).
40 dB is the equivalent of raindrops.
Not quiet compared to Nvidia? I'll give you that.

AMD still managed to make it 5dB lower on Uber than the older blower-style coolers on the GTX 680, AMD 7950, and AMD 7970 (which were at 55dB).

We just need 3rd parties to make the noise/heat moot.

November 4, 2013 | 12:43 PM - Posted by Scott Michaud

Raw decibels are good for quantifying sound (and relatively easy to measure and graph) but lousy for qualifying noise. Our ears have drastically different hearing profiles dependent on frequency and, even then, some noises are more annoying than others.

Personally, I hope we can figure out a better method of conveying noise at some point (at least weighting by the frequency response curve of an average young adult?).

November 4, 2013 | 01:10 PM - Posted by ZoranICS

Isn't there a special scale for sound pressure level metering that accounts for the ear of an average human? I believe there is. However, that will not tell the whole story either... some sounds can be annoying at 35 dBA and some aren't distracting at 55 dBA...

November 4, 2013 | 06:38 PM - Posted by David Willmore (not verified)

There is, it's called weighting. You see that "A" at the end of dBA? That's a dB value with A-weighting.

November 5, 2013 | 05:29 AM - Posted by ZoranICS

I put that "A" on the end on purpose... ;) Like I said "I believe there is" ;)
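For reference, the A-weighting curve being described here is standardized (IEC 61672) and straightforward to compute; a minimal sketch:

```python
import math

def a_weighting_db(f_hz: float) -> float:
    """A-weighting gain in dB at frequency f_hz, per the IEC 61672 curve."""
    f2 = f_hz ** 2
    ra = (12194.0**2 * f2**2) / (
        (f2 + 20.6**2)
        * math.sqrt((f2 + 107.7**2) * (f2 + 737.9**2))
        * (f2 + 12194.0**2)
    )
    return 20.0 * math.log10(ra) + 2.0  # normalized so 1 kHz sits at ~0 dB

for f in (100, 1000, 4000):
    print(f"{f:>5} Hz: {a_weighting_db(f):+.1f} dB")
# Low frequencies are heavily discounted (about -19 dB at 100 Hz), which is
# why a deep fan rumble can measure loud in flat dB yet read modestly in dBA.
```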
