AMD Radeon R9 290X Hawaii - The Configurable GPU?

Author: Ryan Shrout
Manufacturer: AMD

Power Consumption, Sound Levels and NVIDIA Variances

Performance and clock speed are not the only things that change when you adjust the maximum fan speed on the R9 290X.  Power consumption and noise levels increase along with it.

Power consumption was measured during Run 2 of this scenario, when the GPU was at its hottest.  Clearly, as the GPU sustains higher clocks, the power draw rises as well.  At the 20% fan speed setting, which kept the GPU clock at 727 MHz most of the time, the R9 290X system was drawing 317 watts.  That is 37 watts lower than the power draw at the "reference" 40% fan speed setting and 119 watts lower (!!) than the power draw of the card at the 60% fan speed setting.

It is also worth noting that the maximum performance setting of 60% fan speed results in a 23% increase in power consumption (whole system) compared to reference; a difference of 82 watts is not insignificant.
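As a quick sanity check, here is the arithmetic behind those comparisons in a few lines of Python.  The 317 watt figure is from our meter; the 354 W and 436 W values are back-calculated from the deltas quoted above.

```python
# Whole-system power draw (watts) at each maximum fan speed setting.
# 317 W was measured directly; 354 W and 436 W follow from the quoted deltas.
draw = {20: 317, 40: 354, 60: 436}

reference = draw[40]  # 40% is the card's out-of-box "reference" setting
for fan, watts in sorted(draw.items()):
    delta = watts - reference
    print(f"{fan}% fan: {watts} W ({delta:+d} W, {100 * delta / reference:+.0f}% vs. reference)")
```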

As fan speed increases, so does the noise level from the R9 290X's blower-style cooler.  At the reference setting we were getting 42.6 dBA, which jumps to 53.4 dBA in the maximum performance mode of our Hawaii GPU.  Make no mistake, this card can get loud. Compared to the current GeForce GTX options on the market, even at the default settings, the R9 290X is louder.
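For readers who want to translate those figures into something tangible: decibels are logarithmic, so the jump from 42.6 to 53.4 dBA is much bigger than it looks.  A sketch of the math, using the common rule of thumb (approximate, not exact) that perceived loudness doubles for every 10 dB:

```python
quiet, uber = 42.6, 53.4   # measured dBA at the 40% and 60% fan settings
delta = uber - quiet       # +10.8 dB

power_ratio = 10 ** (delta / 10)     # ratio of acoustic power: ~12x
loudness_ratio = 2 ** (delta / 10)   # rough perceived loudness: ~2.1x as loud
print(f"+{delta:.1f} dB is ~{power_ratio:.0f}x the acoustic power, "
      f"or roughly {loudness_ratio:.1f}x as loud to the ear")
```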

NVIDIA GeForce Comparison

Although this article focuses on the claims that AMD is making about Hawaii, I know that readers would like to see NVIDIA Kepler included in some way.  NVIDIA introduced GPU Boost with Kepler in early 2012 and was criticized by many at the outset for offering too much variability in clock speed; it did not specify a clock rate that the GPU would run at for all applications.  This was a very new concept for us at the time: GPUs that did not run at a single clock rate were confusing and scary.  Obviously both parties now see the benefits this option can offer.


These graphs compare a high-end GeForce GTX graphics card to the R9 290X at the 40% fan setting in terms of clock rates.  Performance isn't measured here directly, although these are comparable parts.

During the Run 1 results, both GPUs are able to maintain near-maximum clock speeds, as you would expect based on previous data.  The Run 2 graph shows the same 40% fan speed results for the 290X from the previous pages, but it also shows how the GeForce GTX GPU reacts.  The NVIDIA card also drops in clock speed (from about 1006 MHz to 966 MHz, 4%) but it maintains that level throughout.  That frequency is also above the advertised base clock.

This is only a single game and a single level being tested, but my experience with both AMD's and NVIDIA's variable clock implementations shows this is a typical scenario.

 

Closing Thoughts

There are two key issues worth looking at from this story.  The first issue is the notion that the R9 290X is an enthusiast configurable graphics card and the second is the high variability in the clocks of the 290X (even at default settings).

Let me first comment on the clock variability.  In my initial 290X review I wrote:

Notice that not one of them (average clock speeds we reported) actually averaged over 900 MHz (though Skyrim got very close) and Bioshock Infinite actually was the slowest at 821 MHz.  That is an 18% drop in frequency from the "up to" rating that AMD provided us and provided add-in card vendors.

...

I think AMD needs to come up with a frequency rating that is more useful than "up to" for the R9 290X, as it is misleading at best.  NVIDIA adopted a "typical Boost clock" for its GPUs that seems to have worked, and it also indicates a "base clock" that it guarantees the GPU will never go below during normal games.  R9 290X users should want the same information.

After this further research, I firmly believe that AMD needs to take charge of the situation and help present relevant performance data to consumers.  Imagine the variance we'll see once retail cards start showing up in the market with better (or worse) coolers.  Will AMD allow its partners to just claim some wildly higher "up to" clock rates in order to catch the enthusiast who did not come across the results we have presented here?  The R9 290X, empirically, has a base clock of 727 MHz with a "typical" clock rate in the 850-880 MHz range.  "Up to 1000 MHz" has very little relevance.
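To make that suggestion concrete, here is a minimal sketch of how a "base / typical / up to" rating could be derived from a logged clock trace.  The sample values below are illustrative only, not our measured data; in practice you would log the real clock once per second over a long, heat-soaked session.

```python
import statistics

# Illustrative per-second clock samples (MHz) from a warmed-up gaming session.
clocks = [727, 727, 745, 812, 838, 852, 861, 866, 872, 878, 884, 1000]

base = min(clocks)                   # the floor the card never drops below
typical = statistics.median(clocks)  # an honest "typical" rating
peak = max(clocks)                   # the marketing "up to" number

print(f"base {base} MHz / typical ~{typical:.0f} MHz / up to {peak} MHz")
```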

AMD's claim that the R9 290X is an "enthusiast configurable GPU" does have some merit, though.  I would agree that most users purchasing a $500+ graphics card will have a basic knowledge of the control panel and its settings.  Gamers will be able to get some extra performance out of the R9 290X by simply increasing the maximum fan speed, as long as they are comfortable with the increased sound levels.

What you cannot do, though, is set the maximum fan speed below 40% to get a quieter solution than stock.  With settings of 20% or 30%, the card ignored my fan preference in order to maintain 95C at the 727 MHz base clock.  Hawaii appears to be pegged at that base level until AMD's partners can come up with quieter and more efficient cooling solutions.

Have questions?  Thoughts?  Leave them in the comments below!!


November 4, 2013 | 12:07 PM - Posted by ZoranICS

Slap a waterblock on it and it will be both quiet and superfast. Problem? :)

November 4, 2013 | 12:21 PM - Posted by D1RTYD1Z619

I agree. AMD should have shipped this thing watercooled in its own closed loop.

November 4, 2013 | 01:12 PM - Posted by Coupe

The price wouldn't remain the same, and that would ruin the whole value/enthusiast advantage of the card.

November 4, 2013 | 01:21 PM - Posted by ZoranICS

You do realize that part of the card's price is that huge cooler, right? A semi-decent self-contained loop would be maybe $50-60 more than the current price and the performance would be stellar all the time. If done right, it wouldn't move away from the 1GHz core clock for a second and could also be almost silent at that...

November 5, 2013 | 03:58 AM - Posted by Anonymous (not verified)

Actually the coolers are around $25-30 for a standard 290(X).

But for me the benefits of water cooling outstrip the cost, and the cost is lower than most people think, as most of the time you get a 10-20% extra overclock and also a more silent system.

Some people and friends call me crazy for having a €1000+ water loop with 4x + 2x 140mm 80mm-thick rads and 2x 180mm rads in my TJ11, but when I went over from quad 5870 2GB Matrix cards on an R4E mobo with a 3930K @ 5.1GHz, I just had to replace the VGA waterblocks with two new Titan blocks for €250.

The rest of my loop I will probably use for maybe another 10 years.

For all that money I got back a +/-10% extra overclock, but above all, even when I had 4 cards in my system and was pushing around 1000 watts, I still could not hear my system.

November 4, 2013 | 01:18 PM - Posted by ZoranICS

That would be an option :) But I'd rather have a good quality block on it prepared for my own loop. I don't really like those self contained thingies ;)

November 4, 2013 | 06:35 PM - Posted by Anonymous (not verified)

Or why not ship it with no cooler at all and give the owners that price cut for buying their own aftermarket coolers?

November 4, 2013 | 06:28 PM - Posted by Soap Aik (not verified)

Or buy a GTX 780Ti and be done with it.

November 4, 2013 | 06:31 PM - Posted by Anonymous (not verified)

And put a hole in your pocket as well ^_^

November 7, 2013 | 05:36 PM - Posted by Anonymous (not verified)

The same hole you get by buying an aftermarket cooler/waterblock... It comes down to preferences.

November 5, 2013 | 09:54 AM - Posted by Anonymous (not verified)

yeah.. not all users have liquid cooling or would plan to get one. Problem?

November 4, 2013 | 12:41 PM - Posted by BSBuster (not verified)

" Will AMD allow its partners to just claim some wildly higher "up to" clock rates in order to catch the enthusiast who did not come across the results we have presented here?"

Good Catch

AMD is again blurring the performance lines with marketing propaganda; you can't trust a debt-laden, desperate company. Nvidia's Titan-class coolers are far better than AMD's stock cooler. Their GPUs run cooler and more consistently without all the blower noise. Add in better drivers, SLI, the control panel, CUDA, PhysX, and visual quality, and Nvidia is all I've been recommending.

You get what you pay for with Nvidia; with AMD you get buyer's remorse from the bogus benchmarks you got duped into.

I do have to support both, so what's the best way to install a new AMD GPU driver? It's been nothing but problems for me. Nvidia driver updates work like a charm; maybe I wouldn't be so hard on AMD if at least that was made easy. About 2/3 of the GPUs I support are Nvidia, but about 2/3 of the driver problems are from AMD.

November 4, 2013 | 01:25 PM - Posted by Coupe

The AMD driver excuse is getting a bit old. That was from many years ago and they have come light years from that experience.

Nvidia has had more driver issues recently that caused catastrophic failure on their cards.

Comparing the $550 R9 290X to the $1000 Titan and talking about quality is pointless. For less than a $1000 Titan you can install a water cooling solution that would run circles around the Titan in noise, temperature, and performance.

Even if you prefer Nvidia, you want this card because it brought down the price of the 780 a ton.

November 4, 2013 | 01:28 PM - Posted by ZoranICS

I am getting really tired of this "AMD drivers suck" thing. BOTH companies have good and bad drivers. nVidia drivers kill their cards at times; I've never heard of a suicidal Radeon...

To the point: both teams suck on the driver front - that is a given. But for the past 2-3 years I have personally experienced fewer driver-related problems with the Radeons than with the GeForces.

November 4, 2013 | 12:56 PM - Posted by Mechromancer (not verified)

Lesson: Don't buy a high-end GPU with a reference cooler.

November 4, 2013 | 02:16 PM - Posted by BSBuster (not verified)

"Don't buy a high-end GPU with a reference cooler."

I would say, "Don't buy a high-end GPU with a reference cooler" from AMD.

Nvidia's Titan-class coolers are great in build quality and performance; you can OC right out of the box. The new GTX 780 Ti sounds like a good OCer with its stock cooler.

Water cooling is not an option; it would open a huge can of worms for me. Closed-loop cooling for CPUs is great, but not for GPUs, where warranties would be voided. You would be surprised how many are returned because of leaks.

November 5, 2013 | 09:51 AM - Posted by Anonymous (not verified)

It's not that the Titan or GTX 780 coolers are good... it's the architecture that is more efficient at using electricity without leaking it away as heat.

November 6, 2013 | 10:49 PM - Posted by Principle (not verified)

I beg to differ, those Nvidia GPUs can draw a lot of power too. But their reference coolers ARE superior to AMD's reference coolers. That's easy to see.

November 7, 2013 | 03:23 AM - Posted by arbiter

They are better, but I would bet that the Hawaii GPU makes more heat than its Nvidia counterparts.

November 10, 2013 | 02:30 AM - Posted by Anonymous (not verified)

I intentionally bought the reference card so that I can slap a good ol' waterblock on it, in the hope that the components would be of higher quality. Anyone willing to use the stock leaf-blowers and have their cards running at 95C should be very aware of heat-related issues. Better aftermarket cooling at about the same price would mean the use of cheaper electrical parts. I guess anyone willing to get a 290(X) should consider the cooling options available, since a simple hair or dust bunny stuck in the fan can ultimately destroy the equipment.
For me, personally, any piece of hardware running at more than 20C over room temperature is a potential risk, so I invested in a massive water cooling kit. All you need to do then is get a new waterblock for the new card every 2 years or so. Also, the noise is almost non-existent.

November 4, 2013 | 12:57 PM - Posted by JohnGR (not verified)

Enthusiasts know what they are buying from the first day this card was reviewed on the Internet.

People who don't know about specs and only buy the best card no matter the cost don't care about clock speed, only about performance. Is it really a problem if the card drops 100MHz but gives performance numbers equal to or better than a GTX 780? Really?

I understand the articles about frame rating and I supported PCPer 100%, but now you are just trying to find ways to make the card look bad and faulty.
If the performance is there, and IT IS, and if the default mode is the quiet mode, and IT IS (the mode in which the card was presented and reviewed), then the buyer can unlock more performance with a better air cooler or watercooling.
The only case where you are right is if someone takes this card and installs it in the smallest, worst-ventilated case ever created. It could also be a problem if he lives in the Sahara desert.....

November 4, 2013 | 01:36 PM - Posted by Scott Michaud

It is more like we are trying to make two points:

1) Show people what the actual performance is (because it can dip like 10-15% at default fan settings after just a couple of minutes). What if they based their decision on reading a website that runs a 30-second benchmark without allowing the card to warm up? They're going to get a significantly different experience 10 minutes into their gaming session unless they manually throttle up their fans (see the sketch below).

2) They should not say "up to" in their marketing material. This could be particularly bad if other companies (beyond AMD, beyond a less-than-reputable partner, beyond GPUs even) take this as an open invitation to advertise nothing except what their card (or whatever) can do for 30 seconds strapped to a leaf blower outside in a snowbank. Personally, I believe the system Ryan developed would produce much better numbers "on the box": the card obviously ignores the minimum fan speed setting to maintain 727MHz (so that should be its base) but is comfortable at ~850-880MHz with default settings. I think this metric should be adopted by AMD.
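A minimal sketch of the warm-up-aware testing from point 1, with simulated clock numbers and a hypothetical warmed_up() check; the idea is simply to record results only once the per-pass clock readings stop drifting:

```python
def warmed_up(clock_log, window=5, tolerance_mhz=10):
    """True once the last `window` per-pass clock readings stop drifting."""
    recent = clock_log[-window:]
    return len(recent) == window and max(recent) - min(recent) <= tolerance_mhz

# Simulated average clock (MHz) for each benchmark pass: the card sags as it
# heats up, then settles near its sustainable level.
passes = [1000, 992, 960, 930, 900, 884, 880, 879, 881, 880]
for i in range(len(passes)):
    if warmed_up(passes[:i + 1]):
        print(f"card is heat-soaked from pass {i + 1}; record results from here")
        break
```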

But you do highlight our challenge: what if you use headphones or you purchase a custom cooler? We have seen settings where AMD's card can run stably at 1 GHz, but it is not default... or does the switch give it two defaults? Does that even make sense to say?

Difficult for us to describe? Difficult for our readers to fully understand. That means we needed to try harder and find a better way to explain it.

Thanks for your comments!

November 4, 2013 | 01:20 PM - Posted by Coupe

Great article Ryan and some good points.

I think the advantages AMD has are how precise PowerTune is at adjusting the 290X, how configurable it is, and the cost.

When I see the clock history in Afterburner being adjusted with extreme frequency and precision, it shows that, right now, the R9 has the best control over its GPU.

When you pass more control over to the user, it alleviates many concerns over temperature and noise. However, people who value power and temperature got a great treat because the 780's price dropped much lower.

Overall, I think because of these factors the AMD launch was great for the consumer. That is the most important part of its wide acceptance, and the difference from the Fermi launch some years ago.

November 4, 2013 | 01:42 PM - Posted by mab (not verified)

The first BIOS switch is called quiet.... I wonder why... maybe to run quiet? And the second BIOS switch should be called "up to 1000MHz" because it seems like it sticks to 1000MHz most of the time.

Mantle ...Star citizen can't wait!

November 4, 2013 | 01:45 PM - Posted by Ryan Shrout

I don't consider the fan at 40% on this cooler to be quiet.  

November 4, 2013 | 02:56 PM - Posted by mab (not verified)

Your load number on quiet for the 290X is 42.6 dBA (which, I know, is nowhere near the 35.6 dB from the Titan/780 in your own review).
40 dB is the equivalent of rain drops.
Not quiet compared to Nvidia? I'll give you that.

AMD still managed to make it 5dB lower on uber compared to the older blower-type fans on the GTX 680, AMD 7950 and AMD 7970 (which were at 55dB).

We just need 3rd parties to make the noise/heat moot.

November 4, 2013 | 03:43 PM - Posted by Scott Michaud

Raw decibels are good for quantifying sound (and relatively easy to measure and graph) but lousy for qualifying noise. Our ears have drastically different hearing profiles dependent on frequency and, even then, some noises are more annoying than others.

Personally, I hope we can figure out a better method (at least weigh by the frequency response curve of an average young adult?) of conveying noise at some point.

November 4, 2013 | 04:10 PM - Posted by ZoranICS

Isn't there a special scale for sound pressure level metering that considers the ear of an average human? I believe there is. However, that will not tell the whole story either... some sounds can be annoying at 35dBA and some aren't distracting at 55dBA...

November 4, 2013 | 09:38 PM - Posted by David Willmore (not verified)

There is, it's called weighting. You see that "A" at the end of dBA? That's a dB value with A-weighting.
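For the curious, the A-weighting curve itself is published (IEC 61672); here is a small sketch of the standard formula, which attenuates low frequencies the way an average ear does:

```python
import math

def a_weighting_db(f):
    """Gain in dB applied at frequency f (Hz) by the standard A-weighting curve."""
    f2 = f * f
    ra = (12194 ** 2 * f2 ** 2) / (
        (f2 + 20.6 ** 2)
        * math.sqrt((f2 + 107.7 ** 2) * (f2 + 737.9 ** 2))
        * (f2 + 12194 ** 2)
    )
    return 20 * math.log10(ra) + 2.00  # normalized to ~0 dB at 1 kHz

for f in (100, 1000, 3000):
    print(f"{f} Hz: {a_weighting_db(f):+.1f} dB")  # ~-19.1, ~0.0, ~+1.2
```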

November 5, 2013 | 08:29 AM - Posted by ZoranICS

I put that "A" on the end on purpose... ;) Like I said "I believe there is" ;)

November 4, 2013 | 11:13 PM - Posted by Ryan Shrout

While the sound of rain drops or a low urban setting (another comparison I have seen for 40 dB) doesn't sound like much, just consider that this is sitting on your desk, or under it, the entire time you are gaming.  I think noise level is something that more people care about than they realize.

November 4, 2013 | 02:02 PM - Posted by Easycure

Perhaps a comparison at 80 degrees would be a good one.
Nvidia Titan and 780 at 80 degrees vs 290X at 80 degrees.

Have not seen a performance comparison at 80C anywhere.

November 4, 2013 | 11:14 PM - Posted by Ryan Shrout

While interesting from a scientific view, it wouldn't really be fair to AMD.  They made the decision to allow 95C temps, so we'll go with that for most testing.

November 6, 2013 | 10:53 PM - Posted by Principle (not verified)

How about we do the 95C comparison? Oh wait, Nvidia's GPU won't live at that temp. That's a good thing from AMD; it's a robust design that will benefit overclockers who use good cooling solutions.

November 7, 2013 | 03:25 AM - Posted by arbiter

You are assuming that the reference cooler is just that bad and that AMD's GPUs don't make that much heat. I bet Nvidia's GPUs could handle 90C+; they just know people don't want to turn their computer case into an oven.

November 4, 2013 | 02:14 PM - Posted by rezes

I would like to congratulate you.
Again, you hit the target dead on.
First the FCAT problem, now the temperature problem.
Thanks for putting the facts out.

November 4, 2013 | 03:01 PM - Posted by bchiker

Thanks for the article, and I believe that the overall point is spot on. An operating clock speed range would be more honest than the "up to" advertising.
Luckily, most reviews that I have read have recognized this and run their tests while the card was already hot.

One thing I would like to see is examples of what the sound levels are like. Reading "40.8 dB" or "53.4 dB" does not tell me what it sounds like.
Looking at a handy chart from Industrial Noise Control (http://www.industrialnoisecontrol.com/comparative-noise-examples.htm) helps put this in perspective. Forty dB is like a "library, bird calls (44 dB); lowest limit of urban ambient sound" and 50 dB is a "quiet suburb, conversation at home; large electrical transformers at 100 ft". They describe 60 dB as a restaurant conversation. This puts the noise levels into perspective and gives a better idea of what they compare to.

It appears that this is one card where it pays to wait on the custom cooler designs and see who comes up with the best one.

November 4, 2013 | 11:15 PM - Posted by Ryan Shrout

We might be able to do some recordings.  Thanks for the idea!

November 4, 2013 | 03:19 PM - Posted by Lithium (not verified)

Custom coolers, right
And all that heat into the case, right

November 4, 2013 | 04:19 PM - Posted by MasterT (not verified)

Are there any custom coolers on any of the high-end cards that don't dump the heat into the case?

November 4, 2013 | 05:30 PM - Posted by Anonymous (not verified)

Which "Geforce GTX card" is used? Where is it's performance variances data?

November 4, 2013 | 06:53 PM - Posted by Anonymous (not verified)

"During the Run 1 results, both GPUs are able to maintain near-maximum clock speeds, as you would expected based on previous data. The Run 2 graph shows the same 40% fan speed we results for the 290X on the previous pages but it also shows how the GeForce GTX GPU reacts. The NVIDIA card also drops in clock speed (from about 1006 MHz to 966 MHz, 4%) but it maintains that level throughout. That frequency is also above the advertised base clock."

Double standard.

He seems to have a gripe with AMD. It's perfectly fine for Nvidia to drop MHz from boost to base, but not okay if AMD drops from "up to 1GHz".
Not AMD's fault if you fail to comprehend it.

He's also comparing the 40% Silent Mode to Nvidia. He's glossing over the fact that in Uber Mode the drop is almost non-existent, since the fan will be at 55%.

Probably since Nvidia doesn't have an Uber Mode switch yet, he felt the need not to compare it.

What it really says is that his benchmarking method has been faulty, since it's only a benchmark run and not a game-environment test. Nothing common sense wouldn't have solved.

P.S.
Thanks followers for helping us make you dumber.

November 4, 2013 | 11:17 PM - Posted by Ryan Shrout

40% is the default, out-of-box setting; that's why it was tested.  We clearly show the drop in clocks for NVIDIA as well, but it's only showing a 5% drop, compared to the 20-35% drop for the AMD card here.  Again, at the default setting.

Also, NVIDIA clearly advertises the clock variance while AMD does not.

November 4, 2013 | 11:32 PM - Posted by Anonymous (not verified)

not the same Anonymous.

I asked which Nvidia card was used and whether you have any of Nvidia's performance data related to that run.

November 5, 2013 | 12:20 AM - Posted by Anonymous (not verified)

Pathetic explanation.

"default setting"

You tested non-default settings at 20, 30, 40, 50 & 60.

At 50% the graph shows it returns to 1GHz, unlike the Nvidia card, which dips 20-35% and stays there.

Uber Mode is a simple flip of the switch, which will give you 55% fan speed with almost no clock drops. After all, these graphics cards don't install themselves.

Unless these buyers are utterly clueless, like you seem to indicate your readers are. Then I might be inclined to agree with you.

November 5, 2013 | 01:31 AM - Posted by serpico (not verified)

I'm not sure what statements you disagree with. Does AMD advertise base clocks and boost clocks and give the buyer an idea of what to expect?

Does Nvidia not advertise base clocks and boost clocks, as Ryan stated? What are you disagreeing with?

"You tested none default settings at 20, 30, 40, 50 & 60."
yeah, he tested these out of scientific curiosity. They show some interesting results. They are not, strictly, necessary for the keypoints of this article.

"Uber Mode is a simple flip of the switch which will give you 55% fan speed with almost no clock drops. After all these graphic cards don't install themselves."
Sure, but if everyone should switch to 55% why does the card even ship at 40% default? Maybe you should contact amd about this problem.

Are you asking why Ryan didn't have the same scientific curiosity when he compared to Nvidia? I don't know; why don't you simply ask him nicely to test at 55% against Nvidia?

It's not like he's posting anti-AMD articles all day long. He already gave the PCPer Gold award to the R9 290X and showed that the CrossFire results were very good.

November 4, 2013 | 06:32 PM - Posted by Brad (not verified)

So we can assume that a good custom cooler will keep the card at or close to 1GHz all the time? Plus overclocking headroom.

November 4, 2013 | 07:37 PM - Posted by Anonymous (not verified)

I can smell a little fanboy reviewer. How much did Nvidia pay you for this?

November 4, 2013 | 11:18 PM - Posted by Ryan Shrout

Like, 100 million dollars man.

Wanna see my gold Lambo?

November 5, 2013 | 12:25 AM - Posted by Anonymous (not verified)

It's not a gold Lambo.

It has to have room for him and Tom Peterson to continue their bromance.

I'm thinking a spacious minivan.

November 4, 2013 | 08:58 PM - Posted by Anonymous (not verified)

Ryan, honestly now, who is making you write these articles, Nvidia or AMD, or both? It might be just me, but I have a feeling you are getting some Nvidia reply to AMD reviews ready; as we talk you are probably working on it. I get it: AMD used to sponsor your site, then they didn't, then Tomy showed up in your office, then Nvidia sponsored you (or maybe the other way around). Not only do you probably own an Nvidia card at home yourself (because the wife probably doesn't mind the noise that it doesn't make, and she would probably divorce or kill you, or even worse, if you bought a 4K ASUS PQ321Q), so you are limited to a 2560 x 1600 experience on your Apple 30", which Nvidia SLI may be great at so far. But once you get into multi-monitor stuff and 4K territory, which Nvidia lacks so far, I think I could have a better experience with an AMD Radeon R9 290X, maybe even better in XDMA. I think it's something that SLI was not designed for, or is capping somewhere, versus XDMA, plus the 3GB cap; you know, the more GBs the better.
Let's say I'm divorced at the time and that I can hypothetically afford some crazy things like: 3x R9 290 + 1x R9 290X [(XDMA) I think it might work since AMD is not so uptight about having everything the same down to the manufacturer and model of card] http://www.youtube.com/watch?v=3TEYkRwkrg8
Will they get hot? Will I still make heat in my house if I water cool it, for the sake of silence? From the money I saved on each card vs 4x 780 Ti SLI ($2,800 vs 3x <$550 + $550 = less than $2,200, which is what I paid for my 4 Yamakasi monitors (3 portrait, 1 landscape center top for browsing and other crap)), I would be able to afford a good water loop for the value of $600, sure, with the extra work and everything else that comes with it. Would you think I would have a better experience at it, for an extreme enthusiast, based on your http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-AMD-Radeon-R9-2... 4K review and also http://hardocp.com/article/2013/11/01/amd_radeon_r9_290x_crossfire_video... these guys' CrossFire review? Maybe throw in some 4-way RAID 0 SSDs, then RAID 1 with 4 other SSDs, for Allan's sake.
If I do any of this crazy talk at some point in my busy life, would you consider posting about it on your site?

November 4, 2013 | 11:19 PM - Posted by Ryan Shrout

I don't...  Who said...  What...?

:/

November 5, 2013 | 01:45 PM - Posted by Jeremy Hellstrom

Hope that guy wasn't one of the Indiegogo backers that gets to come on the show.

November 6, 2013 | 09:26 AM - Posted by Anonymous (not verified)

Keep hoping, you might be surprised. I was ironically joking, more or less; I guess you guys don't like those kinds of jokes.
I like Ryan's response though, he's playing along and acting like nothing happened. After all that writing I still like your honesty and the podcasts you people do. I am the one pushing you to the edge so you can be a better you... or not.

November 6, 2013 | 08:47 PM - Posted by TY City (not verified)

Thanks for the review. What do you think of Tom's review, and the fact that they bought two retail R9 290Xs and got drastically different results from the review samples?

Would love for you guys to buy one (a 290x) anonymously and test it out.

November 7, 2013 | 03:29 AM - Posted by arbiter

Ryan did say they bought a few retail cards to test that. As for what was on Tom's Hardware: they asked AMD about it, and AMD said it was a bad card, which is possible, since a 15-20% drop is pretty bad. It could very well be. Most reviews are done on an open-air bench, which helps all cards get lower temps, but when a card like this has no problem getting to 95C, it makes you wonder how much the card will clock back inside an enclosed case.

November 5, 2013 | 06:51 PM - Posted by Anonymous (not verified)

Why don't you crawl back into your AMD fanboy cave retard & GTFO!

Good review Ryan, I wanted to know this as well before buying an R9 290 card. But I will wait for some aftermarket coolers and reviews of those to see if the throttling is better. If not, I think I might go Nvidia.

November 6, 2013 | 09:31 AM - Posted by Anonymous (not verified)

I was joking, but I am still waiting for the 780 Ti reviews too, and then I'll wait some more and so forth, but I gotta say I like your Nvy Dia fanboy reaction. It's a discussion, not a battle, but I'll take you up on both; name the time and place. And yes Jeremy, I am one of Ryan's backers, and a pretty big one too.

November 4, 2013 | 09:04 PM - Posted by Anonymous (not verified)

http://www.pugetsystems.com/labs/articles/Gaming-PC-vs-Space-Heater-Effi...

November 5, 2013 | 09:14 AM - Posted by jon (not verified)

Nice info there Ryan.

What Nvidia card is that?
If it's not the GTX 780, it would be nice if you could add it in.
I would love to know the difference in perf/clock of Hawaii vs GK110-based cards.

Maybe you could do a clock-for-clock review?
Sadly, no review site offers such info :(

November 5, 2013 | 10:30 AM - Posted by Anonymous (not verified)

... Which "Geforce GTX card" is used?

November 6, 2013 | 09:10 AM - Posted by Steve (not verified)

Since it wasn't specifically mentioned, it is probably still under NDA. So my guess is the GTX 780 Ti.

November 6, 2013 | 09:14 AM - Posted by Jabbadap (not verified)

Which Boost?

Boost 1.0 starts throttling at 70°C, throttles more at 85°C, 90°C and 95°C (step size of 13.5 MHz), and drops to safety clocks at 98°C.

Boost 2.0 throttles only at 80°C and drops to safety clocks at 95°C.
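A toy model of the stepped behavior described above; the thresholds and the 13.5 MHz step size are taken from this comment, not from NVIDIA documentation:

```python
def boost1_clock(boost_mhz, temp_c, step_mhz=13.5):
    """Illustrative Boost 1.0-style throttle: one step down per threshold crossed."""
    if temp_c >= 98:
        return None  # safety clocks kick in
    steps = sum(temp_c >= t for t in (70, 85, 90, 95))
    return boost_mhz - steps * step_mhz

for temp in (65, 75, 88, 93, 99):
    clock = boost1_clock(1006, temp)
    print(f"{temp}C -> " + (f"{clock} MHz" if clock is not None else "safety clocks"))
```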

November 6, 2013 | 12:28 PM - Posted by Mac (not verified)

Is there like a 60-second limitation to your benchmark runs that can't be overcome?
Also, what's the point of those comparisons with the Nvidia card without any performance #s? All that shows is how much more dynamic PowerTune is compared to Boost 2.0 :~

November 6, 2013 | 04:49 PM - Posted by Jimbo (not verified)

There are no performance numbers because seeing the 40% fan speed 290X still beating the 780 was contrary to the desired effect of the article.

November 7, 2013 | 03:36 AM - Posted by arbiter

Um, I doubt it's still beating the 780, since the card is underclocked from its "up to" max. AMD is starting to sound like an ISP now. The AMD card is crippled by almost 15-20% with that GPU clock loss, so at that point its "lead" over the 780 is gone.

November 6, 2013 | 11:02 PM - Posted by Principle (not verified)

Alright, call me crazy, but is there anything at all scientific about setting a fan speed at 40%?? Not one thing, just simplicity. How about setting the fan RPMs or the fan decibels to something actually performance-related? 40% is arbitrary between two different fans; the size and blade pitch could dictate wildly differing CFM values.

Regardless, it's a pretty crazy comparison when it's clear AMD put almost zero effort into the reference cooler. They planned to give consumers a good value, not the best of everything. It's loud, we all get it, but how about an actually useful review of someone strapping a water block on it, or another cooler? How many first adopters of $500 hardware leave well enough alone? If you buy 290X cards you know what you are buying and likely have water cooling, so why pay more money for a great cooler you will take off?

In this way its performance is also configurable in FPS, temps and sound levels.

December 2, 2013 | 02:47 PM - Posted by Greg B (not verified)

I am not sure what to make of my R9 290, to be honest. My waterblock arrives tomorrow, so my opinions are likely to change; however, from the testing that I have done with this card over the last two weeks I can say that overclocking is totally pointless with stock cooling. My average clock in BF4 with the fan on auto is around 820MHz with a low of 720MHz... I quickly hit 94C. The latest Catalyst 13.11 beta 9.4 drivers have no performance tab on my test machine (known bug), so I'm not bothering to test with the power limit increased until this is fixed and my waterblock is installed.
I am sure that once adequately cooled this problem will disappear.
Still, I maintain that AMD should be blamed for shoddy marketing and a lousy stock HSF.
They should have been honest enough to state that the clocks on this card are 800MHz boosted to 950MHz (depending on heat and power)... Just my opinion.

I sold my GTX 780 for a stock R9 290 and believe that this was a good deal. I got $500 for my GTX 780, and this allowed me to buy the R9 290 and a new EK waterblock. Fully cooled, I will be able to get better performance from the R9 290, and as I only play BF4, I am excited to see what Mantle brings to the table later in December.
Nice review as always.
