AMD Radeon R9 290X Hawaii Review - Taking on the TITANs

Author: Ryan Shrout
Manufacturer: AMD

The Radeon R9 290X 4GB Graphics Card and Specs

If you were paying attention to the noise that AMD was creating during its GPU14 event then you likely already know what the Hawaii-based Radeon R9 290X looks like.  If you are a fan of the black and red color scheme, then prepare to be ecstatic. 

At just about 290mm in length, the R9 290X is just a little bit longer than NVIDIA's GeForce GTX TITAN and GTX 780 cards.  The plastic housing is well designed and maintains the red/black scheme that has been with AMD for quite a while.  The reference card also keeps the pretty standard heatsink and single fan combination we've had since the introduction of the Radeon HD 5000 cards. 

Though we touched on most of this on the previous page, it is worth recapping the specifications of the new R9 290X here.  It has 2816 stream processors, 37% more than the R9 280X / HD 7970 that came before it.  This bumps the top theoretical compute power up to 5.6 TFLOPS, giving it an edge of 36% over the R9 280X.  By far the most important architectural change is the increase in primitives per clock, from 2 to 4.
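Those throughput numbers fall out of simple arithmetic: shader count, times two FLOPs per clock (one fused multiply-add), times clock speed. A minimal sketch, assuming the 1.0 GHz "up to" clock for both parts (AMD's quoted percentage depends on exactly which R9 280X clock is assumed, so the ratio below lands slightly above it):

```python
# Theoretical single-precision throughput: shaders x 2 FLOPs (FMA) x clock.
# Clocks here are the "up to" figures; sustained clocks vary (see PowerTune page).
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000  # GFLOPS -> TFLOPS

r9_290x = tflops(2816, 1.0)  # ~5.6 TFLOPS
r9_280x = tflops(2048, 1.0)  # ~4.1 TFLOPS

print(f"R9 290X: {r9_290x:.2f} TFLOPS")  # 5.63
print(f"R9 280X: {r9_280x:.2f} TFLOPS")  # 4.10
# Ratio comes to ~37%; AMD's 36% figure implies a slightly higher 280X clock.
print(f"Advantage: {100 * (r9_290x / r9_280x - 1):.1f}%")
```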

The engine clock of "up to" 1.0 GHz will need some more discussion on another page, as we found it to be quite a bit more variable than expected. 

With 4GB of GDDR5 memory, the R9 290X should have more than enough space for 4K gaming.  Even though the memory runs at a slower clock speed (5.0 Gbps), the bus width increases to 512-bit for a massive 320 GB/s of memory bandwidth, 11% faster than the R9 280X.
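The bandwidth figure checks out directly from bus width and data rate; a quick sketch (the 384-bit / 6.0 Gbps numbers are the R9 280X reference memory spec):

```python
# GDDR5 bandwidth: bus width (bits) x data rate (Gbps per pin) / 8 bits per byte.
def bandwidth_gb_s(bus_bits, rate_gbps):
    return bus_bits * rate_gbps / 8  # GB/s

r9_290x = bandwidth_gb_s(512, 5.0)  # 320.0 GB/s
r9_280x = bandwidth_gb_s(384, 6.0)  # 288.0 GB/s

# The wider bus more than makes up for the slower memory clock.
print(f"R9 290X: {r9_290x:.0f} GB/s "
      f"({100 * (r9_290x / r9_280x - 1):.0f}% over R9 280X)")  # 320 GB/s, 11%
```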

The R9 290X includes 176 texture units, 64 ROPs and double the Z/Stencil rate of its predecessors.  Clearly the R9 290X does not lack in the hardware department.

The fan on the R9 290X is very similar to the fans we have seen on previous Radeon cards, which should tell you right away that this product isn't going to be much quieter than previous generations. 

Despite the huge 6.2 billion transistor die, the R9 290X requires only an 8-pin + 6-pin power connector configuration. 

We don't need no stinking CrossFire connectors!  Clearly, though, AMD left that option open as a contingency plan, as the solder points are still located on the reference PCB.  We'll go over the updated CrossFire technology on the next page, and you'll find some early CrossFire and 4K benchmark results in a follow-up article as well!

You may notice the switch on the right-hand side of the image as well; it moves the BIOS from "Quiet" mode to "Uber" mode.  All that really means is a shift in the maximum fan speed from 40% to 55%, allowing for slightly higher clocks for a slightly longer period of time.  See my PowerTune page for details on that.

The back side of the R9 290X is left bare for all to see, though there isn't much to gawk at.  There are a lot of pads for that 4GB of memory though...

Just like we saw with the R9 280X launch, AMD has gone with a more intelligent display output configuration that includes a pair of dual-link DVI outputs, a full-size HDMI port and a full-size DisplayPort.  Without a doubt this is the most agreeable configuration I have seen on graphics cards in the last several years. 

October 24, 2013 | 12:19 AM - Posted by Mnemonicman

Can't wait to see how this card performs under watercooling.

October 24, 2013 | 01:10 AM - Posted by Indeed (not verified)

Watercooling is probably called for...the higher power density makes it harder to cool.

October 24, 2013 | 12:27 AM - Posted by arbiter

Given all the hype this card got over the last few months about being a Titan killer, etc., looking at most games it's pretty much even with the Titan and almost on par with the 780. There's still the question of the 95C temp: is that where it would get to in games? And given the same temp limit, which side would win, since the 780/Titan cards do have some overclocking headroom? 95C is what it hit in the gaming tests on an open-air bench, which surely affects what real-world users would see in an enclosed case. If those prices are correct then it does look to be a good deal.

October 24, 2013 | 01:14 AM - Posted by Fergdog (not verified)

I don't see why there'd be a lot of hype for this card. It's on the same 28nm process, and it's the same architecture as the 7790, which brought only very minor compute changes. It does give good competition to Nvidia's ridiculously priced high-end cards, though, which is very good.

October 24, 2013 | 12:51 AM - Posted by PapaDragon

The first page I went to was the Battlefield 3 one. Great performance all around!!! I'm surprised that it costs $550 (good), but that heat output/noise and temp... it's OK. Can't wait to see the Lightning/DirectCU versions. Also can't wait for the price drops on the GTX 780 too.

@Ryan, in the BF3 section, the screenshot of the BF3 settings puts it at 1920x1200, but the graphs show it at 1080p??

2. The sound levels of the 280X are missing!

Thanks 4 the review!!

October 24, 2013 | 10:42 AM - Posted by Ryan Shrout

Trust the settings, not the resolution on that screen.  :)

October 24, 2013 | 01:18 AM - Posted by Scyy (not verified)

I see what you meant about watching for other reviews not allowing the cards to throttle. Everywhere else is showing huge leads that likely diminish somewhat after playing a short while, once the card throttles.

October 24, 2013 | 01:38 AM - Posted by Anonymous (not verified)

The card looks like a toy compared to the Titan visually, and the quality of the plastics etc. probably contributes to its thermals too. But wow on the price!! I dunno, I guess the budget-limited would go for this, but the higher-quality card is the Titan. If you've got a high-end PC you're still gonna get the Titan; what's a few hundred bucks when you've got 5 or 7k in it?

October 24, 2013 | 02:19 AM - Posted by MArc-Andre B (not verified)

Flip Uber mode + the fan slider to 100% (it will run up to 100%, but not always) and watch the magic! (Maybe save yourself the 5 minutes of warming up the card too!)

October 24, 2013 | 10:42 AM - Posted by Ryan Shrout

Watch the magic and listen to the wind!! :D

October 24, 2013 | 02:40 AM - Posted by 63jax

I'm very curious about the Toxic version of this, can't wait!

October 24, 2013 | 03:44 AM - Posted by Klimax (not verified)

Simply brute force. Nothing smart about it.

Also, what is the highest tessellation setting in Metro LL, and if it is not normal, why wasn't it used? (I guess because AMD still sucks at tessellation.)

October 24, 2013 | 03:47 AM - Posted by Matt (not verified)

"The AMD Radeon R9 290X delivers here as well, coming in not only $350 less expensive than the GeForce GTX Titan, but $100 less than the GeForce GTX 780 as well."

Titan: $999
R290x: $549

Price difference: $450

October 24, 2013 | 10:43 AM - Posted by Ryan Shrout

Doh, math is hard.

Fixed, thanks!

October 24, 2013 | 05:00 AM - Posted by nasha thedog (not verified)

The conclusion and overall results are very different from several other tests I've seen so far. Personally I don't think 4K results really matter; 99% of gamers won't have it for a year or two and this gen will have been replaced by then. AMD are silly for demanding reference-only on release; it looks cheaply made for a top-end release. I want to see it compared at relevant resolutions, i.e. from 1920x1080 to 2560x1600, and I want to see, for example, the MSI 290X Gaming against the MSI 780 Gaming, both out of the box and overclocked, before I make any final purchases. As it stands, reference vs. reference, even with your results leaning to AMD, I'd still go with a 780 reference over a 290X reference, as it seems it'd be faster, cooler and quieter when both are overclocked. But like I said, that's neither here nor there; bring on the non-reference models.

October 24, 2013 | 06:49 AM - Posted by symmetrical (not verified)

Ryan, shouldn't it be $450 less than a Titan...?

October 24, 2013 | 10:43 AM - Posted by Ryan Shrout

Yup, sorry!

October 24, 2013 | 07:12 AM - Posted by HeavyG (not verified)

As a dual Titan owner, I am very impressed with these scores. I am not sure if I would trade off the gain in performance for the heat and noise penalty, but either way, if my cash balance were the determining factor of which card to buy, it would clearly be the R9 290X.

Luckily my cash balance doesn't make all of the final decisions. Anyways, I can't wait to see how well this card does when some non-reference cooled versions are tested. It might make a big difference compared to that crappy AMD cooler.

October 24, 2013 | 08:34 AM - Posted by Anonymous (not verified)

"coming in not only $350 less expensive than the GeForce GTX Titan" 999-549=450

October 24, 2013 | 10:44 AM - Posted by Ryan Shrout


October 24, 2013 | 09:30 AM - Posted by Mac (not verified)

I'm not seeing the issue with the "up to 1GHz" clocks thing.
This thing has DPM states anywhere from 300 to 1000 MHz.
We've seen as low as 727 MHz in FurMark.

October 24, 2013 | 09:48 AM - Posted by Anonymous (not verified)

Has there been any testing on multi-display setups? I would like to see if the 4K performance also helps with multi-display gaming, like Battlefield on 3 screens.

October 24, 2013 | 10:44 AM - Posted by Ryan Shrout

4K on tiled displays and Eyefinity should be very similar - but I'll be doing testing this week.

October 24, 2013 | 10:05 AM - Posted by Remon (not verified)

Did you warm up the Nvidia cards too?

October 24, 2013 | 10:45 AM - Posted by Ryan Shrout


Though to be honest the performance deltas we've seen on NVIDIA's cards are very minimal.

October 24, 2013 | 10:59 AM - Posted by nabokovfan87

I know it sounds trivial Ryan, but I'd love to see the breakdown of average clock vs. boosted/advertised like you did for the AMDs going into ALL reviews if not just a follow-up article with titan and the 290x.

As a consumer I am extremely hesitant to trust boost speeds on CPUs and GPUs. If it can run it at one point on all cores, then that's the max (in my opinion). There is also the matter of this all being temperature dependent. As someone suggested above, for the sake of proving out their internal temp/boost logic, perhaps do the same average clock speed tests after warmup with and without the fan at 100%.

I live down in SoCal, and my room is ALWAYS 85-90 F when running my PC, and during the summer it's possibly worse. So if I spend this amount of money on a card and it says it will perform at those temps, then I'm likely going to hit them, and with boost being the new fad I get the vibe that it will always downclock and run horribly.

When you were discussing the clocks with Josh and you showed average clocks, I think that's half the story. Is it because of temp, is it because the GPU doesn't NEED to be at 1 GHz for that sequence of the game, or is it a time-limited boost? Hard to say, but when it's at ~850 MHz vs. 1 GHz and it still kills the $1000 card, I think it's pretty hard to be that upset. With updates it will get even faster also.

Lastly, I started watching your thoughts and conversation with Josh via the YouTube video. I know you don't like some of the things AMD is saying vs. doing, and I respect you pointing those out, but when you review their parts and every sentence is finished with "but the Titan does X" or "but Nvidia has X", it seems like you are an advertiser more than an impartial observer of facts and information.

October 24, 2013 | 01:54 PM - Posted by Ryan Shrout

Thanks for the feedback.  I'll address one by one.

1. I can do that in the coming weeks, yes.  Note that we did do some of this when Kepler launched and found the "typical" boost clocks that NVIDIA advertised to be pretty damn accurate.  Also, no fan runs at 100%, thank goodness.  :)

2. An ambient temp that high will definitely have adverse effects on your graphics card, NVIDIA or AMD.  I hope you have good cooling in there and use some noise-canceling headphones!

3. There is never a time when the GPU decides it doesn't "need to be" at 1.0 GHz for AMD.  The lower clocks are simply a result of what the GPU can run at while not crossing the 95C temperature level.

4. Keep in mind that my video on the R9 290X is not an ad for the AMD R9 290X.  It is meant to be an editorial commentary on the product which I believe requires direct comparison to the previous generation and the current competition.  That means I will talk about what the TITAN and 780 do better, and what they do worse.  The fact that I comment on that throughout should be indicative of that and nothing more.

October 24, 2013 | 10:42 PM - Posted by nabokovfan87

Thanks for the reply back, Ryan. Can you elaborate on number 3? If temp doesn't cause it, then why not just have a base clock and avoid boost altogether?

I agree with you that you have to compare/contrast, but perhaps a better format is simply stating the results and then doing the "but the competition does X" bit afterwards. Yes, you have to compare by the very nature of using charts and graphs, but as someone who ultimately wants to make their own judgment based on valid and reliable data, the constant interruption of "but it's loud vs. the competitor" compared to simply saying "but it's loud compared to last gen / our usability standards" was very off-putting, as an example.

October 24, 2013 | 10:35 AM - Posted by Truth Seeker (not verified)

As the previous poster said, if you are not warming up the Nvidia cards, then you have unfairly biased the review against AMD.

The vast majority of other reviews conclude the AMD R9 290X wins on performance (it cruises over the Titan at 4K) and is the clear winner in price per performance.

Coupled with water blocks (as EK indicates at the Overclock link), the R9 290X hits 1200/1600 easily and becomes an overclocked rocket.

October 24, 2013 | 10:45 AM - Posted by Ryan Shrout

That is basically the same conclusion that I made...?
