
AMD Radeon R9 290X Hawaii Review - Taking on the TITANs

Author: Ryan Shrout
Manufacturer: AMD

A slightly new architecture

Note: We also tested the new AMD Radeon R9 290X in CrossFire and at 4K resolutions; check out that full Frame Rating story right here!

Last month AMD brought media, analysts, and customers out to Hawaii to talk about a new graphics chip coming out this year. As you might have guessed from the location, the code name for this GPU was, in fact, Hawaii. It was targeted at the high end of the discrete graphics market to take on the likes of the GTX 780 and GTX TITAN from NVIDIA.

Earlier this month we reviewed the AMD Radeon R9 280X, R9 270X, and R7 260X. None of these were based on that new GPU. Instead, these cards were all rebrands and repositionings of existing hardware in the market (albeit at reduced prices). Those lower prices made the R9 280X one of our favorite GPUs of the moment, as it offers a price-to-performance ratio currently unmatched by NVIDIA.

But today is a little different: today we are talking about a much more expensive product that has to live up to some pretty lofty goals and ambitions set forth by the AMD PR and marketing machine. At a $549 MSRP, the new AMD Radeon R9 290X becomes the flagship of the Radeon brand. The question is: where does that ship sail?


The AMD Hawaii Architecture

To be quite upfront about it, the Hawaii design is very similar to that of the Tahiti GPU found in the Radeon HD 7970 and R9 280X. Based on the same GCN (Graphics Core Next) architecture that AMD assured us would be its long-term vision, Hawaii ups the ante in a few key areas while maintaining the same core.


Hawaii is built around Shader Engines, of which the R9 290X has four. Each of these includes 11 CUs (compute units), and each CU holds four 16-wide SIMD arrays. Doing the quick math (4 Shader Engines x 11 CUs x 64 stream processors per CU) brings us to a total stream processor count of 2,816 on the R9 290X.

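To sanity check that math, here is a minimal sketch (in Python, purely illustrative) of how the stream processor count falls out of the unit counts above:

    # Stream processor count implied by the Hawaii unit layout above
    shader_engines = 4        # Shader Engines on the R9 290X
    cus_per_engine = 11       # compute units per Shader Engine
    simds_per_cu = 4          # SIMD arrays per CU
    lanes_per_simd = 16       # each GCN SIMD is 16 lanes wide
    print(shader_engines * cus_per_engine * simds_per_cu * lanes_per_simd)  # 2816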


The four Shader Engines allow the R9 290X to process nearly twice as many primitives as the earlier R9 280X (HD 7970). There are four tessellation engines in total, so tessellation performance should get a modest boost compared to previous products, as well as being more in line with what NVIDIA offers in its latest cards.


AMD's GCN architecture has proven to be very robust and flexible. Each CU comprises four 16-wide vector units, a single scalar unit, four texture filter units, the associated texture fetch load/store units, and plenty of registers, data shares, and cache. While the basic unit remains the same on the R9 290X as on the parts released last year, these are still very high-performance compute units and a massive step forward from the VLIW4/5 architectures of past AMD cards.
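Those per-CU resources translate directly into peak compute. As a rough illustration (assuming one fused multiply-add, i.e. two FLOPs, per lane per clock, and the card's "up to 1 GHz" peak engine clock), the layout works out to roughly 5.6 TFLOPS of single-precision throughput:

    # Peak single-precision throughput implied by the CU layout above
    cus = 4 * 11              # 44 CUs across four Shader Engines
    lanes_per_cu = 4 * 16     # four 16-wide vector units per CU
    flops_per_lane = 2        # one FMA per lane per clock = 2 FLOPs
    clock_ghz = 1.0           # "up to 1 GHz" peak engine clock
    print(cus * lanes_per_cu * flops_per_lane * clock_ghz)  # 5632.0 GFLOPS, ~5.6 TFLOPS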


Double the primitives, double the geometry throughput, and double the tessellation. AMD has beefed up the front end of Hawaii so it will be better able to compete with the high-end GTX 780 and GTX TITAN from NVIDIA. Tessellation and geometry performance have always been lower than what NVIDIA offers, but this should cover most of the bases when it comes to real-world gameplay involving tessellation.


The chip outputs 64 pixels per clock, along with up to 128 Z/stencil operations. Each render back end is attached to a memory controller, and the design offers a tremendous amount of fillrate per clock. Running between 800 MHz and 1 GHz, a single card should be able to color a whole lot of pixels across multiple screens without issue.
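A quick back-of-the-envelope on what that fillrate means, using only the figures above (an illustrative sketch, not a measured result):

    # Pixel fillrate across the quoted 800 MHz - 1 GHz clock range
    pixels_per_clock = 64
    for clock_mhz in (800, 1000):
        print(clock_mhz, "MHz ->", pixels_per_clock * clock_mhz / 1000, "Gpixels/s")
    # 800 MHz -> 51.2 Gpixels/s, 1000 MHz -> 64.0 Gpixels/s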


Speaking of memory controllers, we see that the R9 290X has eight of them, which allows for a memory bus 512 bits wide. AMD is shipping this card with 4GB of memory running at 5 GHz effective, giving around 320 GB/s of bandwidth. This outstrips the 384-bit memory buses of the HD 7970/R9 280X, the GTX 780, and the GTX TITAN.
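That bandwidth figure is easy to verify from the bus width and data rate (a quick sketch):

    # Memory bandwidth from bus width and effective data rate
    bus_width_bits = 512      # 8 x 64-bit memory controllers
    data_rate_gtps = 5.0      # 5 GHz effective GDDR5 (billions of transfers/sec)
    print(bus_width_bits / 8 * data_rate_gtps)  # 320.0 GB/s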


Hawaii is a bigger chip than AMD is used to building. It is still quite a bit smaller than the 550+ mm² GK110, yet it appears to land a small step above that part in terms of performance across different workloads. In this table we see a doubling of primitives, 1.3x the compute, higher texture and pixel fillrates, and higher memory bandwidth (which could potentially go higher with faster memory), all with a die-area increase from Tahiti to Hawaii that is well below those ratios. AMD has rearranged the GCN architecture to maximize all of those characteristics without exploding the die size and transistor count. This is a very powerful chip, and it is not all that much bigger than AMD's previous high end.
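For reference, here is how those ratios fall out of the raw specs. The Tahiti-side figures (2,048 stream processors, 288 GB/s on the R9 280X, roughly 352 mm² versus Hawaii's roughly 438 mm²) are commonly cited numbers rather than values from the slide above, so treat this as an illustrative sketch:

    # Hawaii vs. Tahiti ratios from commonly cited specs (see caveat above)
    ratios = {
        "primitives/clock": 4 / 2,         # 4 geometry engines vs. 2
        "stream processors": 2816 / 2048,  # ~1.38x raw; ~1.3x once clocks factor in
        "memory bandwidth": 320 / 288,     # GB/s, R9 290X vs. R9 280X
        "die area": 438 / 352,             # mm^2, approximate
    }
    for name, r in ratios.items():
        print(f"{name}: {r:.2f}x")
    # primitives 2.00x, SPs 1.38x, bandwidth 1.11x, die area 1.24x

The die-area ratio (about 1.24x) sits well below the throughput gains, which is exactly the point AMD is making with this comparison.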

October 23, 2013 | 09:19 PM - Posted by Mnemonicman

Can't wait to see how this card performs under watercooling.

October 23, 2013 | 10:10 PM - Posted by Indeed (not verified)

Watercooling is probably called for...the higher power density makes it harder to cool.

October 23, 2013 | 09:27 PM - Posted by arbiter

Given all the hype this card got over the last few months (how it would be a Titan killer, etc.), looking at most games it's pretty much even with the Titan and almost on par with the 780. There's still a question about the 95C temp: is that where it would get to in games? If given the same temp limit, which side would win, since the 780/Titan cards do have some overclocking headroom? 95C is what it hit in the gaming tests on an open-air bench, so surely that would affect what you'd see in the real world inside an enclosed case. If those prices are correct, then it does look to be a good deal.

October 23, 2013 | 10:14 PM - Posted by Fergdog (not verified)

I don't see why there'd be a lot of hype for this card. It's on the same 28nm process, and it's the same architecture as the 7790, which brought only very minor compute changes. It does give good competition to NVIDIA's ridiculously priced high-end cards, which is very good though.

October 23, 2013 | 09:51 PM - Posted by PapaDragon

The first page I went to was the Battlefield 3 page, lol. Great performance all around! I'm surprised that it costs $550 (good), but that heat output/noise and temp... it's OK. Can't wait to see the Lightning/DirectCU versions. Also can't wait for the price drops on the GTX 780 too.

@Ryan, in the BF3 section, the screenshot of the BF3 settings puts it at 1920x1200, but the graphs show it at 1080p??

2. The sound levels of the 280X are missing!

Thanks 4 the review!!

October 24, 2013 | 07:42 AM - Posted by Ryan Shrout

Trust the settings, not the resolution on that screen.  :)

October 23, 2013 | 10:18 PM - Posted by Scyy (not verified)

I see what you meant about watching for other reviews not allowing the cards to throttle. Everywhere else is showing huge leads that likely diminish somewhat after playing for a short while once the card throttles.

October 23, 2013 | 10:38 PM - Posted by Anonymous (not verified)

Visually, the card looks like a toy compared to the Titan, which probably also contributes to its poor thermals: the quality of the plastics, etc. But wow on the price! I dunno, I guess the budget-limited would go for this, but the higher quality card is the Titan. If you've got a high-end PC you're still gonna get the Titan; what's a few hundred bucks when you've got 5 or 7k in it?

October 23, 2013 | 11:19 PM - Posted by MArc-Andre B (not verified)

Flip on Uber mode + fan slider to 100% (it will run up to 100%, but not always) and watch the magic! (Maybe save yourself the 5 min of warming up the card too!)

October 24, 2013 | 07:42 AM - Posted by Ryan Shrout

Watch the magic and listen to the wind!! :D

October 23, 2013 | 11:40 PM - Posted by 63jax

I'm very curious about the Toxic version of this, can't wait!

October 24, 2013 | 12:44 AM - Posted by Klimax (not verified)

Simply brute force. Nothing smart about it.

Also, what is the highest tessellation setting in Metro: LL, and if it's not "normal," why wasn't it used? (I guess because AMD still sucks at tessellation.)

October 24, 2013 | 12:47 AM - Posted by Matt (not verified)

"The AMD Radeon R9 290X delivers here as well, coming in not only $350 less expensive than the GeForce GTX Titan, but $100 less than the GeForce GTX 780 as well."

Titan: $999
R9 290X: $549

Price difference: $450

October 24, 2013 | 07:43 AM - Posted by Ryan Shrout

Doh, math is hard.

Fixed, thanks!

October 24, 2013 | 02:00 AM - Posted by nasha thedog (not verified)

The conclusion and overall results are very different from several other tests I've seen so far. Personally I don't think 4K results really matter; 99% of gamers won't have it for a year or two, and this gen will have been replaced by then. AMD is silly for demanding reference-only on release; it looks cheaply made for a top-end release. I want to see it compared at relevant resolutions, i.e. from 1920x1080 to 2560x1600, and I want to see, for example, the MSI 290X Gaming against the MSI 780 Gaming, both out of the box and overclocked, before I make any final purchases. As it stands, reference vs. reference, even with your results leaning to AMD, I'd still go with a reference 780 over a reference 290X, as it seems it would be faster, cooler, and quieter when both are overclocked. But like I said, that's neither here nor there; bring on the non-reference models.

October 24, 2013 | 03:49 AM - Posted by symmetrical (not verified)

Ryan, shouldn't it be $450 less than a Titan...?

October 24, 2013 | 07:43 AM - Posted by Ryan Shrout

Yup, sorry!

October 24, 2013 | 04:12 AM - Posted by HeavyG (not verified)

As a dual Titan owner, I am very impressed with these scores. I am not sure if I would trade the gain in performance for the heat and noise penalty, but either way, if my cash balance were the determining factor in which card to buy, it would clearly be the R9 290X.

Luckily my cash balance doesn't make all of the final decisions. Anyways, I can't wait to see how well this card does when some non-reference cooled versions are tested. It might make a big difference compared to that crappy AMD cooler.

October 24, 2013 | 05:34 AM - Posted by Anonymous (not verified)

"coming in not only $350 less expensive than the GeForce GTX Titan" 999-549=450

October 24, 2013 | 07:44 AM - Posted by Ryan Shrout

Fixed!

October 24, 2013 | 06:30 AM - Posted by Mac (not verified)

I'm not seeing the issue with the "up to 1 GHz" clocks thing.
This thing has DPM states anywhere from 300 to 1000 MHz.
We've seen as low as 727 MHz in FurMark.

October 24, 2013 | 06:48 AM - Posted by Anonymous (not verified)

Has there been any testing on multi-display setups? I would like to see if the 4K performance also helps multi-display gaming, like Battlefield on 3 screens.

October 24, 2013 | 07:44 AM - Posted by Ryan Shrout

4K on tiled displays and Eyefinity should be very similar - but I'll be doing testing this week.

October 24, 2013 | 07:05 AM - Posted by Remon (not verified)

Did you warm up the Nvidia cards too?

October 24, 2013 | 07:45 AM - Posted by Ryan Shrout

Yup!

Though to be honest the performance deltas we've seen on NVIDIA's cards are very minimal.

October 24, 2013 | 07:59 AM - Posted by nabokovfan87

I know it sounds trivial Ryan, but I'd love to see the breakdown of average clock vs. boosted/advertised, like you did for the AMD cards, going into ALL reviews, if not just a follow-up article with the Titan and the 290X.

As a consumer I am extremely hesitant to trust boost speeds on CPUs and GPUs. If it can run at one point on all cores, then that's the max (in my opinion). There is also the matter of this all being temperature dependent. As someone suggested above, for the sake of proving their internal temp/boost logic out, perhaps do the same average-clock-speed tests after warmup with and without the fan at 100%.

I live down in SoCal, and my room is ALWAYS 85-90°F when running my PC, and during the summer it's possibly worse. So if I spend this amount of money on a card and it says it will perform at those temps, then I'm likely going to hit them, and with boost being the new fad I get the vibe that it will always downclock and run horribly.

When you were discussing the clocks with Josh and you showed average clocks, I think that's half the story. Is it because of temp, is it because the GPU doesn't NEED to be at 1 GHz for that sequence of the game, or is it time-limited boost? Hard to say, but when you have it at ~850 MHz vs. 1 GHz and it still kills the $1000 card, I think it's pretty hard to be that upset. With updates it will get even faster too.

Lastly, I started watching your thoughts and conversation with Josh via the YouTube video. I know you don't like some of the things AMD is saying vs. doing, and I respect you pointing those out, but when you go to review their parts and every sentence is finished with "but the Titan does X" or "but NVIDIA has X," it seems like you are an advertiser more than an impartial observer of facts and information.

October 24, 2013 | 10:54 AM - Posted by Ryan Shrout

Thanks for the feedback.  I'll address one by one.

1. I can do that in the coming weeks, yes.  Note that we did do some of this when Kepler launched and found the "typical" boost clocks that NVIDIA advertised to be pretty damn accurate.  Also, no fan runs at 100%, thank goodness.  :)

2. An ambient temp that high will definitely have adverse effects on your graphics card, NVIDIA or AMD.  I hope you have good cooling in there and use some noise canceling headphones!

3. There is never a time when the GPU decides it doesn't "need to be" at 1.0 GHz for AMD.  The lower clocks are simply a result of what the GPU can run at while not crossing the 95C temperature level.

4. Keep in mind that my video on the R9 290X is not an ad for the AMD R9 290X.  It is meant to be an editorial commentary on the product which I believe requires direct comparison to the previous generation and the current competition.  That means I will talk about what the TITAN and 780 do better, and what they do worse.  The fact that I comment on that throughout should be indicative of that and nothing more.

October 24, 2013 | 07:42 PM - Posted by nabokovfan87

Thanks for the reply, Ryan. Can you elaborate on number 3? If temp doesn't cause it, then why not just have a base clock and avoid boost altogether?

I agree with you that you have to compare/contrast, but perhaps a better format is simply stating the results and then doing the "but the competition does X" spiel afterwards. Yes, you have to compare by the very nature of using charts and graphs, but as someone who ultimately wants to make their own judgment based on valid and reliable data, the constant interruption of "but it's loud vs. the competitor," compared to simply saying "but it's loud compared to last gen/our usability standards," was very off-putting, as an example.

October 24, 2013 | 07:35 AM - Posted by Truth Seeker (not verified)

As the previous poster said, if you are not warming up the NVIDIA cards, then you have unfairly biased the review against AMD.

The vast majority of other reviews conclude the AMD R9 290X wins on performance (it cruises past the Titan at 4K) and is the clear winner in price-per-performance.

Coupled with water blocks (as EK indicates at the Overclock link), the R9 290X hits 1200/1600 easily and becomes an overclocked rocket.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/...

October 24, 2013 | 07:45 AM - Posted by Ryan Shrout

That is basically the same conclusion that I made...?
