AMD Radeon R9 290X Hawaii Review - Taking on the TITANs

Manufacturer: AMD

A slightly new architecture

Note: We also tested the new AMD Radeon R9 290X in CrossFire and at 4K resolutions; check out that full Frame Rating story right here!

Last month AMD brought media, analysts, and customers out to Hawaii to talk about a new graphics chip coming out this year.  As you might have guessed from the location, the code name for this GPU was, in fact, Hawaii. It was targeted at the high end of the discrete graphics market to take on the likes of the GTX 780 and GTX TITAN from NVIDIA.

Earlier this month we reviewed the AMD Radeon R9 280X, R9 270X, and the R7 260X. None of these were based on that new GPU.  Instead, these cards were all rebrands and repositionings of existing hardware in the market (albeit at reduced prices).  Those lower prices made the R9 280X one of our favorite GPUs of the moment, as it offers a price-to-performance ratio currently unmatched by NVIDIA.

But today is a little different, today we are talking about a much more expensive product that has to live up to some pretty lofty goals and ambitions set forward by the AMD PR and marketing machine.  At $549 MSRP, the new AMD Radeon R9 290X will become the flagship of the Radeon brand.  The question is: to where does that ship sail?


The AMD Hawaii Architecture

To be quite upfront about it, the Hawaii design is very similar to that of the Tahiti GPU from the Radeon HD 7970 and R9 280X cards.  Based on the same GCN (Graphics Core Next) architecture AMD assured us would be its long term vision, Hawaii ups the ante in a few key areas while maintaining the same core.


Hawaii is built around Shader Engines, of which the R9 290X has four.  Each of these includes 11 CUs (compute units), each holding four 16-wide SIMD arrays.  Doing the quick math brings us to a total stream processor count of 2,816 on the R9 290X.
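The quick math above can be written out explicitly; this is just a sketch of the multiplication, using the unit counts described in this section (16 lanes per SIMD array follows from the 16-wide vector units detailed below):

```python
# Stream processor count for the R9 290X, from the unit counts above.
shader_engines = 4     # Hawaii's four Shader Engines
cus_per_engine = 11    # compute units per Shader Engine
simds_per_cu = 4       # SIMD arrays per CU
lanes_per_simd = 16    # each SIMD array is a 16-wide vector unit

stream_processors = shader_engines * cus_per_engine * simds_per_cu * lanes_per_simd
print(stream_processors)  # 2816
```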



The four shader engines allow the R9 290X to process nearly double the primitives of the earlier R9 280X (HD 7970).  With four tessellation engines in total, tessellation performance should get a modest boost compared to previous products and be more in line with what NVIDIA offers in its latest cards.


AMD’s GCN architecture has proven to be very robust and flexible.  Each CU is comprised of 4 x 16-wide vector units, a single scalar unit, 4 texture filter units, the associated texture fetch load/store units, and plenty of registers, data shares, and cache.  While the basic unit remains the same on the R9 290X as with the parts released last year, these are still very high-performance compute units and a massive step forward from the VLIW4/VLIW5 architectures of past AMD cards.


Double the primitives, double the geometry throughput, and double the tessellation.  AMD has beefed up the front end of Hawaii so it is better equipped to compete with the high-end GTX 780 and GTX TITAN from NVIDIA.  Tessellation and geometry performance have always trailed what NVIDIA offers, but this should cover most of the bases in real-world gameplay involving tessellation.


This chip offers 64 pixels per clock and up to 128 Z/stencil operations.  Each render back end is attached to a memory controller, giving a tremendous amount of fillrate per clock.  Running between 800 and 1000 MHz, a single card should be able to color a whole lot of pixels across multiple screens without issue.
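As a rough upper bound, the peak pixel fillrate follows directly from those figures; this sketch uses the top of the 800-1000 MHz clock range, and real-world clocks will vary:

```python
# Peak pixel fillrate for the R9 290X at its maximum clock.
pixels_per_clock = 64   # pixels output per clock across the render back ends
clock_mhz = 1000        # top of the quoted 800-1000 MHz range

# MHz * pixels/clock = megapixels/sec; divide by 1000 for gigapixels/sec.
fillrate_gpix_s = pixels_per_clock * clock_mhz / 1000
print(fillrate_gpix_s)  # 64.0
```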


Speaking of memory controllers, the R9 290X has eight of them, for a total memory bus width of 512 bits.  AMD is shipping this card with 4GB of memory running at 5 GHz effective, giving around 320 GB/sec of bandwidth.  This outstrips the 384-bit memory buses of the HD 7970/R9 280X, the GTX 780, and the GTX Titan.
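The bandwidth figure is just bus width times effective data rate; a minimal sketch of that arithmetic, using the numbers quoted above:

```python
# Memory bandwidth from bus width and effective per-pin data rate.
bus_width_bits = 512    # eight 64-bit memory controllers
data_rate_gbps = 5      # 5 GHz effective = 5 Gbit/s per pin

# Multiply bits by rate, then divide by 8 to convert bits to bytes.
bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8
print(bandwidth_gb_s)  # 320.0
```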


Hawaii is a bigger chip than AMD is used to building.  It is quite a bit smaller than the 530 mm² GK110, yet it appears to be a small step above that part in performance across different workloads.  In this table we see a doubling of primitives, 1.3x the compute, higher texture and pixel fillrate, and higher memory bandwidth (which could potentially go higher with faster memory), all with a die-area increase well below those ratios between Tahiti and Hawaii.  AMD has rearranged the GCN architecture to maximize those characteristics without exploding the die size and transistor count.  This is a very powerful chip, and it is not all that much bigger than AMD's previous high end.
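Those scaling ratios can be checked directly. The sketch below uses Tahiti's 2,048 stream processors and 2 primitives per clock against Hawaii's figures; the die areas (roughly 352 mm² for Tahiti, 438 mm² for Hawaii) are commonly cited approximations, not figures from the table above:

```python
# Tahiti-to-Hawaii scaling ratios (die areas are approximate).
tahiti = {"stream_processors": 2048, "prims_per_clock": 2, "die_mm2": 352}
hawaii = {"stream_processors": 2816, "prims_per_clock": 4, "die_mm2": 438}

# Compute grows ~1.4x and primitives 2x, while die area grows only ~1.24x.
for key in tahiti:
    print(f"{key}: {hawaii[key] / tahiti[key]:.2f}x")
```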

October 28, 2013 | 08:29 AM - Posted by Panta

R9-290x hits 1200/1600 easy? I doubt it.
I assume it's a cherry-picked card..

October 24, 2013 | 11:58 AM - Posted by Anonymous (not verified)

Almost considered getting one of these until I saw the 95C temps.

You go SLI and end up overheating all your other expensive components and driving down your PSU's efficiency.

C'mon AMD! I want to buy from you and support you guys but get your sh!t together.

That's too damn hot. I wonder how many RMA's due to overheated parts are coming down the line.

October 24, 2013 | 12:51 PM - Posted by Chloiber (not verified)

It's a great card, but the reference cooling is, as always, unusable. I had hoped that AMD would improve this time. It's really a sad story: they should invest way more in their coolers (and raise production cost by a bit; they can go up a bit in price easily).
I know that custom designs will appear which remedy this problem quite a bit, but it still leaves a very big disadvantage. And everyone who reads the reviews gets a bad impression of the card. I think AMD will never learn.

October 25, 2013 | 12:11 PM - Posted by TwinShadowx (not verified)

Can't please everyone. I'd rather have a cheap-looking card that runs hot and loud and costs $550 than a card that's overpriced; at the end of the day they both get the job done. Also, AMD could have spent time and money on a real cooler and charged $1000.

October 24, 2013 | 01:00 PM - Posted by Anonymous (not verified)

This price is excellent. We won't have to pay a huge premium for the mods that Asus, MSI and others are going to make on the stock 290X. Imagine an Asus Ares 290X version with a silent cooler. Energy consumption cannot change much but noise is something I won't tolerate.

Happy happy happy.

October 24, 2013 | 01:16 PM - Posted by Anonymous (not verified)

Looks like Titan kicked its arse.

October 24, 2013 | 10:43 PM - Posted by nabokovfan87

are you mad?

October 24, 2013 | 04:04 PM - Posted by ltguy005 (not verified)

I would like to see the R9 290x and the GTX Titan benchmarked together with the fans at the same rpm and the temperature throttle point set to the same value (like 70C) so that each card can boost up as much as possible while staying in that noise/temp envelope.

October 24, 2013 | 06:31 PM - Posted by jon (not verified)

great review ryan,
I really like your frame time graphs especially @1440p
It was also shocking to see how low the 290X avg. clocks really are. And still it competes!

It would be great if you could add in the real average clock speeds for a particular bench instead of stating the boost clock/base clock, behind the name of each card.(in future reviews).

That would make for an easier comparison since clock speeds seem to be all over the place on both the 290X and nvidia cards.

October 25, 2013 | 07:06 AM - Posted by Johnny Rook (not verified)

The gaming experience I can get from a reference R9 290X would be better than the reference GTX 780 gaming experience if:

a) I use noise-canceling headphones;

b) I use a water block;

c) I am willing to pay a bit extra in my power bills.

But, wouldn't that make the R9 290X "experience" cost me ~$750, or more?

No, the R9 290X is not worthy of a Gold Award.
Sorry Ryan, but I think you are not being consistent with yourself; the GTX 780, which already performs better than a TITAN, didn't even get a Brass Award in your review!

And I don't even want to continue writing about the "gaming experience" a card delivers being more important to me than its "raw performance" numbers, because that is a personal preference.

Anyway, for those who only care about "raw performance": real owners, in real-world gaming rigs, have already started to post results from the R9 290X in forums, and those results are not better than the results posted previously by GTX 780 owners.

October 27, 2013 | 08:13 AM - Posted by Daniel Nielsen (not verified)

It's a review, it's his opinion, and it's just as valid as yours.

December 11, 2014 | 12:38 AM - Posted by Anonymous (not verified)

I'm a long-time gamer and tester, and far from biased; I have loved both NVIDIA and ATI for years. The reference is a hotter-running card, but with a little common know-how, the R9 290X reference's heat can be controlled for free and its noise limited if set right. Just use MSI Afterburner for this. Under load in FurMark at 1280x720, open air, 75F room temp, my GTX 780 hits 87C; the R9 290X hit 87 to 90 at the same settings. The NVIDIA card actually adjusts the fan speed like it should; the R9 does not. So I basically set my R9's fan to the same temperature settings as the 780 and, magic, the fan works perfectly. My temps are now 78C under full load, and the fan is only slightly louder, nothing to complain about. If you have never used FurMark, go get it from Geeks3D; this program loads a card like no other.

This card at its price point is a great deal for the performance. It's not that AMD put on a cheap cooler/heatsink; it's actually a great one, but the reference BIOS is not set right for the fan. And honestly, the fan in an AMD card is a lot higher RPM; these things can put off some serious noise. But you can pick up a reference card fairly reasonably because of these issues. Trust me: get MSI Afterburner, set it to boot with Windows, and adjust these settings. If you need my settings, reply and I'll get back to you and help. It's an easy, great fix and a very fast card for a lot less cash. Like I said, I like both, but with all new games now being developed and streamlined for AMD support over the next few years, I see AMD cards being a bit on top, unlike the past when NVIDIA had the mainstream connection with game developers.

October 26, 2013 | 08:21 PM - Posted by wujj123456

International Stutter Units... LMAO

November 5, 2013 | 01:59 PM - Posted by Anonymous (not verified)

Amazing review , thanks :)

November 9, 2013 | 04:21 PM - Posted by Anonymous (not verified)

Hi, page 1 of this article states that the GTX Titan/780/280X/7970 GHz have a 320-bit bus, when in reality they are 384-bit.
