AMD Radeon R9 290X Hawaii Review - Taking on the TITANs

Author: Ryan Shrout
Manufacturer: AMD

The Radeon R9 290X 4GB Graphics Card and Specs

If you were paying attention to the noise that AMD was creating during its GPU14 event then you likely already know what the Hawaii-based Radeon R9 290X looks like.  If you are a fan of the black and red color scheme, then prepare to be ecstatic. 

At just about 290mm in length, the R9 290X is just a little bit longer than NVIDIA's GeForce GTX TITAN and GTX 780 cards.  The plastic housing is well designed and maintains the red/black scheme that has been with AMD for quite a while.  The reference card also keeps the pretty standard heatsink and single fan combination we've had since the introduction of the Radeon HD 5000 cards. 

Though we touched on most of this on the previous page, it is worth recapping the specifications of the new R9 290X here.  It has 2816 stream processors, 37% more than the R9 280X / HD 7970 that came before it.  This helps bump the top theoretical compute power up to 5.6 TFLOPS, giving it an edge of 36% over the R9 280X.  By far the most important architectural change was the increase in primitives per clock, from 2 to 4.
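The compute figure quoted above is simple arithmetic, and a quick sketch makes it easy to check (the "2 FLOPs per clock" factor is the standard fused multiply-add rate per stream processor; small differences from the quoted percentages come down to rounding):

```python
def peak_tflops(stream_processors, clock_ghz, flops_per_clock=2):
    """Peak single-precision throughput in TFLOPS."""
    return stream_processors * flops_per_clock * clock_ghz / 1000.0

r9_290x = peak_tflops(2816, 1.0)   # 5.632 TFLOPS
r9_280x = peak_tflops(2048, 1.0)   # 4.096 TFLOPS

print(f"R9 290X: {r9_290x:.3f} TFLOPS, "
      f"{(r9_290x / r9_280x - 1) * 100:.1f}% over the R9 280X")
```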

The engine clock of "up to" 1.0 GHz will need some more discussion on another page as we found it to be quite a bit more variable than expected. 

With 4GB of GDDR5 memory, the R9 290X should have more than enough space for 4K gaming, and even though it runs at a slower memory clock speed (5.0 Gbps), the bus width increases to 512-bit for a massive amount of memory bandwidth: 320 GB/s, 11% more than the R9 280X.
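The bandwidth math works out as follows (bus width in bits divided by 8 gives bytes per transfer, multiplied by the per-pin data rate; the R9 280X figures of 384-bit at 6.0 Gbps are from its own spec sheet):

```python
def memory_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

r9_290x = memory_bandwidth_gbs(512, 5.0)  # 320.0 GB/s
r9_280x = memory_bandwidth_gbs(384, 6.0)  # 288.0 GB/s
print(f"{r9_290x:.0f} GB/s vs {r9_280x:.0f} GB/s "
      f"(+{(r9_290x / r9_280x - 1) * 100:.0f}%)")
```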

The R9 290X includes 176 texture units, 64 ROPs and double the Z/Stencil rate of its predecessors.  Clearly the R9 290X does not lack in the hardware department.

The fan on the R9 290X is very similar to the fans we have seen on previous Radeon cards, which should tell you right away that this product isn't going to be much quieter than previous generations.

Despite the huge 6.2 billion transistor die, the R9 290X only requires an 8+6 power connector config. 

We don't need no stinking CrossFire connectors!  But clearly, AMD has left that option open as a contingency plan, as the solder points are still located on the reference PCB.  We'll go over the updated CrossFire technology on the next page, and you'll find some early CrossFire and 4K benchmark results in a follow-up article as well!

You may notice the switch on the right hand side of the image as well; that moves the BIOS from "Quiet" mode to "Uber" mode.  All that really means is a shift in the maximum fan speed from 40% to 55% allowing for slightly higher clocks for a slightly longer period of time.  See my PowerTune page for details on that.

The back side of the R9 290X is left bare for all to see, though there isn't much to gawk at.  There are a lot of pads for that 4GB of memory, though...

Just like we saw with the R9 280X launch, AMD has gone with a more intelligent display output configuration that includes a pair of dual-link DVI outputs, a full size HDMI port and a full size DisplayPort.  Without a doubt this is the most sensible output configuration I have seen on graphics cards in the last several years.

October 24, 2013 | 12:19 AM - Posted by Mnemonicman

Can't wait to see how this card performs under watercooling.

October 24, 2013 | 01:10 AM - Posted by Indeed (not verified)

Watercooling is probably called for...the higher power density makes it harder to cool.

October 24, 2013 | 12:27 AM - Posted by arbiter

Given all the hype this card got over the last few months about how it would be a Titan killer, etc.: looking at most games, it's pretty much in the same group as the Titan and almost on par with the 780. There's still a question about the 95C temp, whether that's where it would get to in games. Given the same temp limit, which side would win, since the 780/Titan cards do have some overclocking headroom? 95C is what it got to in gaming tests on an open-air bench, which surely affects what real-world users would see in an enclosed case. If those prices are correct then it does look to be a good deal.

October 24, 2013 | 01:14 AM - Posted by Fergdog (not verified)

I don't see why there'd be a lot of hype for this card. It's on the same 28nm process, and it's the same architecture as the 7790, which brought just very minor compute changes. It does give good competition to Nvidia's ridiculously priced high-end cards, though, which is very good.

October 24, 2013 | 12:51 AM - Posted by PapaDragon

The first page I went to was Battlefield 3. Great performance all around!! I'm surprised that it costs $550 (good), but that heat output/noise and temp... it's OK. Can't wait to see the Lightning/DirectCU versions. Also can't wait for the price drops on the GTX 780 too.

@ Ryan, in the BF3 section, the screenshot of the BF3 settings puts it at 1920x1200, but the graphs show it at 1080p??

2. The Sound levels of the 280x are missing!

Thanks 4 the review!!

October 24, 2013 | 10:42 AM - Posted by Ryan Shrout

Trust the settings, not the resolution on that screen.  :)

October 24, 2013 | 01:18 AM - Posted by Scyy (not verified)

I see what you meant about watching for other reviews not allowing the cards to throttle. Everywhere else is showing huge leads that likely diminish somewhat after playing a short while, once the card throttles.

October 24, 2013 | 01:38 AM - Posted by Anonymous (not verified)

The card looks like a toy compared to the Titan visually, and the quality of the plastics probably contributes to its poor thermals. But wow on the price!! I guess the budget-limited would go for this, but the higher quality card is the Titan. If you've got a high-end PC you're still going to get the Titan; what's a few hundred bucks when you've got 5 or 7k in it?

October 24, 2013 | 02:19 AM - Posted by MArc-Andre B (not verified)

flip uber mode + fan slider to 100% (will run up to 100% but not always) and watch the magic! (maybe save yourself warming up the card 5 min too!)

October 24, 2013 | 10:42 AM - Posted by Ryan Shrout

Watch the magic and listen to the wind!! :D

October 24, 2013 | 02:40 AM - Posted by 63jax

i'm very curious about the Toxic version of this, can't wait!

October 24, 2013 | 03:44 AM - Posted by Klimax (not verified)

Simply brute force. Nothing smart about it.

Also, what is the highest tessellation setting in Metro LL, and if it is not "normal", why wasn't it used? (I guess because AMD still sucks at tessellation.)

October 24, 2013 | 03:47 AM - Posted by Matt (not verified)

"The AMD Radeon R9 290X delivers here as well, coming in not only $350 less expensive than the GeForce GTX Titan, but $100 less than the GeForce GTX 780 as well."

Titan: $999
R290x: $549

Price difference: $450

October 24, 2013 | 10:43 AM - Posted by Ryan Shrout

Doh, math is hard.

Fixed, thanks!

October 24, 2013 | 05:00 AM - Posted by nasha thedog (not verified)

The conclusion and overall results are very different from several other tests I've seen so far. Personally I don't think 4K results really matter; 99% of gamers won't have it for a year or two, and this gen will have been replaced by then. AMD are silly for demanding reference-only on release; it looks cheaply made for a top-end release. I want to see it compared at relevant resolutions, i.e. from 1920x1080 to 2560x1600, and I want to see, for example, the MSI 290X Gaming against the MSI 780 Gaming, both out of the box and overclocked, before I make any final purchases. As it stands, reference vs. reference, even with your results leaning to AMD, I'd still go with a 780 reference over a 290X reference, as it seems it'd be faster, cooler and quieter when both are overclocked. But like I said, that's neither here nor there; bring on the non-reference models.

October 24, 2013 | 06:49 AM - Posted by symmetrical (not verified)

Ryan, shouldn't it be $450 less than a Titan...?

October 24, 2013 | 10:43 AM - Posted by Ryan Shrout

Yup, sorry!

October 24, 2013 | 07:12 AM - Posted by HeavyG (not verified)

As a dual Titan owner, I am very impressed with these scores. I am not sure if I would trade off the gain on performance for the heat and noise penalty, but either way, if my cash balance was the determining factor of which card to buy, it would clearly be the R9 290X.

Luckily my cash balance doesn't make all of the final decisions. Anyways, I can't wait to see how well this card does when some non-reference cooled versions are tested. It might make a big difference compared to that crappy AMD cooler.

October 24, 2013 | 08:34 AM - Posted by Anonymous (not verified)

"coming in not only $350 less expensive than the GeForce GTX Titan" 999-549=450

October 24, 2013 | 10:44 AM - Posted by Ryan Shrout


October 24, 2013 | 09:30 AM - Posted by Mac (not verified)

I'm not seeing the issue with the "up to 1GHz" clocks thing.
This thing has DPM states anywhere from 300 to 1000MHz.
We've seen as low as 727MHz in FurMark.

October 24, 2013 | 09:48 AM - Posted by Anonymous (not verified)

Has there been any testing on multi-display setups? I would like to see if the 4K performance also helps with multi-display gaming, like Battlefield on 3 screens.

October 24, 2013 | 10:44 AM - Posted by Ryan Shrout

4K on tiled displays and Eyefinity should be very similar - but I'll be doing testing this week.

October 24, 2013 | 10:05 AM - Posted by Remon (not verified)

Did you warm up the Nvidia cards too?

October 24, 2013 | 10:45 AM - Posted by Ryan Shrout


Though to be honest the performance deltas we've seen on NVIDIA's cards are very minimal.

October 24, 2013 | 10:59 AM - Posted by nabokovfan87

I know it sounds trivial Ryan, but I'd love to see the breakdown of average clock vs. boosted/advertised like you did for the AMDs going into ALL reviews if not just a follow-up article with titan and the 290x.

As a consumer I am extremely hesitant to trust boost speeds on CPUs and GPUs. If it can run at that speed at one point on all cores, then that's the max (in my opinion). There is also the matter of this all being temperature dependent. As someone suggested above, for the sake of proving out their internal temp/boost logic, perhaps do the same above-average clock speed tests after warmup with and without the fan at 100%.

I live down in so cal, and my room is ALWAYS 85-90 F when running my PC, and during the summer it's possibly worse. So if I spend this amount of money on a card and it says it will perform at those temps then I'm likely going to hit them, and with boost being the new fad I get the vibe that it will always downclock and run horribly.

When you were discussing the clocks with Josh and you showed average clocks, i think it's half the story. Is it because of temp, is it because the gpu doesn't NEED to be at 1 GHz for that sequence of the game, or is it time limited boost? Hard to say, but when you have it at ~850 MHz vs. 1 GHz and it still kills the 1000 card, I think it's pretty hard to be that upset by. With updates it will get even faster also.

Lastly, I started watching your thoughts and conversation with Josh via the YouTube video. I know you don't like some of the things AMD is saying vs. doing, and I respect you pointing those out, but when you go to review their parts and every sentence is finished with "but the Titan does X" or "but Nvidia has X", it seems like you are an advertiser more than an impartial observer of facts and information.

October 24, 2013 | 01:54 PM - Posted by Ryan Shrout

Thanks for the feedback.  I'll address one by one.

1. I can do that in the coming weeks, yes.  Note that we did do some of this when Kepler launched and found the "typical" boost clocks that NVIDIA advertised to be pretty damn accurate.  Also, no fan runs at 100%, thank goodness.  :)

2. An ambient temp that high will definitely have adverse effects on your graphics card, NVIDIA or AMD.  I hope you have good cooling in there and use some noise-canceling headphones!

3. There is never a time when the GPU decides it doesn't "need to be" at 1.0 GHz for AMD.  The lower clocks are simply a result of what the GPU can run at while not crossing the 95C temperature level.

4. Keep in mind that my video on the R9 290X is not an ad for the AMD R9 290X.  It is meant to be an editorial commentary on the product which I believe requires direct comparison to the previous generation and the current competition.  That means I will talk about what the TITAN and 780 do better, and what they do worse.  The fact that I comment on that throughout should be indicative of that and nothing more.
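Point 3 above can be pictured with a toy control loop. This is only an illustrative sketch, not AMD's actual PowerTune algorithm: the clock climbs toward the "up to" ceiling whenever thermals allow and backs off once the 95C limit is reached (the step size here is made up; 727MHz is the lowest clock a commenter reports seeing):

```python
# Toy model of temperature-capped boosting (illustrative only).
TEMP_LIMIT_C = 95
MAX_CLOCK_MHZ = 1000   # the advertised "up to" engine clock
MIN_CLOCK_MHZ = 727    # lowest clock reported under FurMark

def next_clock(current_mhz, temp_c, step_mhz=13):
    """Step the clock down at the temp limit, otherwise back up toward max."""
    if temp_c >= TEMP_LIMIT_C:
        return max(MIN_CLOCK_MHZ, current_mhz - step_mhz)
    return min(MAX_CLOCK_MHZ, current_mhz + step_mhz)
```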

October 24, 2013 | 10:42 PM - Posted by nabokovfan87

Thanks for the reply back, Ryan. Can you elaborate on number 3? If temp doesn't cause it, then why not just have a base clock and avoid boost altogether?

I agree with you that you have to compare/contrast, but perhaps a better format is simply stating the results and then doing the "but the competition does X" bit afterwards. Yes, you have to compare by the very nature of using charts and graphs, but as someone who ultimately wants to make their own judgment based on valid and reliable data, the constant interruption of "but it's loud vs. the competitor" compared to simply saying "it's loud compared to last gen/our usability standards" was very off-putting, as an example.

October 24, 2013 | 10:35 AM - Posted by Truth Seeker (not verified)

As the previous poster said, if you are not warming up the Nvidia cards, then you have unfairly biased the review against AMD.

Vast majority of other reviews conclude AMD R9-290x wins on performance (cruises over Titan at 4K) and clear winner in price per performance.

Coupled with water blocks (as EK indicates at Overclock link), the R9-290x hits 1200/1600 easy and becomes an overclocked rocket.

October 24, 2013 | 10:45 AM - Posted by Ryan Shrout

That is basically the same conclusion that I made...?

October 28, 2013 | 08:29 AM - Posted by Panta

R9-290X hits 1200/1600 easy? I doubt it.
I assume it's a cherry-picked card..

October 24, 2013 | 11:58 AM - Posted by Anonymous (not verified)

Almost considered getting one of these until I saw the 95C temps.

You go SLI and end up overheating all your other expensive components and drive down your PSUs efficiency.

C'mon AMD! I want to buy from you and support you guys but get your sh!t together.

That's too damn hot. I wonder how many RMA's due to overheated parts are coming down the line.

October 24, 2013 | 12:51 PM - Posted by Chloiber (not verified)

It's a great card, but the reference cooling is, as always, unusable. I had hoped that AMD would improve this time. It's really a sad story: they should invest way more in their coolers (and raise production cost a bit; they can easily go up a bit in price).
I know that custom designs will appear which remedy this problem quite a bit, but it still leaves a very big disadvantage. And everyone who reads the reviews gets a bad impression of the card. I think AMD will never learn.

October 25, 2013 | 12:11 PM - Posted by TwinShadowx (not verified)

Can't please everyone. I'd rather have a cheap-looking card that runs hot and loud and costs $550 than a card that's overpriced; at the end of the day they both get the job done. Also, AMD could have spent the time and money on a real cooler and then charged $1000.

October 24, 2013 | 01:00 PM - Posted by Anonymous (not verified)

This price is excellent. We won't have to pay a huge premium for the mods that Asus, MSI and others are going to make on the stock 290X. Imagine an Asus Ares 290X version with a silent cooler. Energy consumption cannot change much but noise is something I won't tolerate.

Happy happy happy.

October 24, 2013 | 01:16 PM - Posted by Anonymous (not verified)

Looks like the Titan kicked its arse.

October 24, 2013 | 10:43 PM - Posted by nabokovfan87

are you mad?

October 24, 2013 | 04:04 PM - Posted by ltguy005 (not verified)

I would like to see the R9 290x and the GTX Titan benchmarked together with the fans at the same rpm and the temperature throttle point set to the same value (like 70C) so that each card can boost up as much as possible while staying in that noise/temp envelope.

October 24, 2013 | 06:31 PM - Posted by jon (not verified)

great review ryan,
I really like your frame time graphs especially @1440p
It was also shocking to see how low the 290X avg. clocks really are. And still it competes!

It would be great if you could add in the real average clock speeds for a particular bench instead of stating the boost clock/base clock, behind the name of each card.(in future reviews).

That would make for an easier comparison since clock speeds seem to be all over the place on both the 290X and nvidia cards.

October 25, 2013 | 07:06 AM - Posted by Johnny Rook (not verified)

The gaming experience I can get from a reference R9 290X would be better than the reference GTX 780 gaming experience if:

a) I use noise-canceling headphones;

b) I use a water block;

c) I am willing to pay a bit extra in my power bills.

But, wouldn't that make the R9 290X "experience" cost me ~$750, or more?

No, the R9 290X is not worthy of a Gold Award.
Sorry Ryan, but I think you are not being consistent with yourself; the GTX 780, which already performs better than a TITAN, didn't even get a Brass Award in your review!

And I don't even want to continue writing about the "gaming experience" a card delivers being more important to me than its "raw performance" numbers, because that is a personal preference.

Anyways, for those that only care about "raw performance": real owners, in real-world gaming rigs, have already started to post results from the R9 290X in forums, and those results are not better than the results posted previously by GTX 780 owners.

October 27, 2013 | 08:13 AM - Posted by Daniel Nielsen (not verified)

It's a review, it's his opinion, and it's just as valid as yours.


December 11, 2014 | 12:38 AM - Posted by Anonymous (not verified)

I'm a long-time gamer and tester, and far from being biased; I have loved both Nvidia and ATI for years. The reference card does run hotter, but with a little know-how the R9 290X reference can be controlled for heat, for free, and its noise limited if set right. Just use MSI Afterburner for this. Under load in FurMark at 1280x720 (open-air, room temp 75F), my GTX 780 hits 87C; the R9 290X hit 87 to 90C at the same settings. The Nvidia card actually adjusts its fan speed like it should; the R9 does not. So I basically set my R9 fan speed to the same temperature targets as the 780 and, magic, the fan works perfectly and my temps are now 78C under full load, with the fan only slightly louder. Nothing to complain about.

If you have never used FurMark, go get it from Geeks3D; this program loads a card like no other. This card at its price point is a great deal for the performance. It's not that AMD has put on a cheap cooler/heatsink, it's actually a great one, but the reference BIOS is not set right for the fan. And honestly, the fan in an AMD card is a much higher RPM fan; these things can put off some serious noise, but you can pick up a reference card fairly cheap because of these issues. Trust me, get MSI Afterburner, set it to boot with Windows, and adjust these settings. If you need my settings, reply and I'll get back to you and help. Here's an easy, great fix and a very fast card for a lot less cash.

Like I said, I like both, but with all new games now being developed and streamlined for AMD support for the next few years, I see AMD cards being a bit on top, unlike the past when Nvidia had the mainstream connection with game developers.
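The custom fan setting described above amounts to a temperature-to-fan-duty curve like the one Afterburner lets you draw. A minimal sketch of such a piecewise-linear curve, with made-up illustrative points rather than recommended settings:

```python
# Hypothetical fan curve: (temperature C, fan duty %) points, interpolated
# linearly, in the spirit of a custom MSI Afterburner profile.
CURVE = [(40, 30), (70, 45), (85, 70), (95, 100)]

def fan_percent(temp_c):
    """Linearly interpolate fan duty between the curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin to max above the last point
```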

October 26, 2013 | 08:21 PM - Posted by wujj123456

International Stutter Units... LMAO

November 5, 2013 | 01:59 PM - Posted by Anonymous (not verified)

Amazing review , thanks :)

November 9, 2013 | 04:21 PM - Posted by Anonymous (not verified)

Hi, page 1 of this article states that the GTX Titan/780/280X/7970 GHz have a 320-bit bus, when in reality they are 384-bit.
