A slightly new architecture
AMD's new flagship GPU is finally here, and we have all the benchmark results to tell you where it stands!
Note: We also tested the new AMD Radeon R9 290X in CrossFire and at 4K resolutions; check out that full Frame Rating story right here!
Last month AMD brought media, analysts, and customers out to Hawaii to talk about a new graphics chip coming out this year. As you might have guessed based on the location, the code name for this GPU was, in fact, Hawaii. It was targeted at the high end of the discrete graphics market to take on the likes of the GTX 780 and GTX TITAN from NVIDIA.
Earlier this month we reviewed the AMD Radeon R9 280X, R9 270X, and the R7 260X. None of these were based on that new GPU. Instead, these cards were all rebrands and repositionings of existing hardware in the market (albeit at reduced prices). Those lower prices made the R9 280X one of our favorite GPUs of the moment, as it offers a price-to-performance ratio currently unmatched by NVIDIA.
But today is a little different: we are talking about a much more expensive product that has to live up to some pretty lofty goals and ambitions set forth by the AMD PR and marketing machine. At a $549 MSRP, the new AMD Radeon R9 290X will become the flagship of the Radeon brand. The question is: where does that ship sail?
The AMD Hawaii Architecture
To be quite upfront about it, the Hawaii design is very similar to that of the Tahiti GPU from the Radeon HD 7970 and R9 280X cards. Based on the same GCN (Graphics Core Next) architecture AMD assured us would be its long term vision, Hawaii ups the ante in a few key areas while maintaining the same core.
Hawaii is built around Shader Engines, of which the R9 290X has four. Each of these includes 11 CUs (compute units), and each CU holds four 16-wide SIMD arrays. Doing the quick math (4 × 11 × 4 × 16) brings us to a total stream processor count of 2,816 on the R9 290X.
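The stream processor math above can be sketched in a few lines. The 16-lane SIMD width is an assumption drawn from AMD's public GCN documentation rather than this article:

```python
# Stream processor count for the R9 290X (Hawaii), from the
# shader engine / CU / SIMD hierarchy described above.
shader_engines = 4
cus_per_engine = 11
simds_per_cu = 4
lanes_per_simd = 16  # assumed GCN SIMD width

stream_processors = shader_engines * cus_per_engine * simds_per_cu * lanes_per_simd
print(stream_processors)  # 2816
```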
The four shader engines allow the R9 290X to process almost double the number of primitives of the earlier R9 280X (HD 7970). There are four tessellation engines in total, so tessellation performance should get a modest boost compared to previous products, as well as being more in line with what NVIDIA offers in its latest cards.
AMD's GCN architecture has proven to be very robust and flexible. Each CU is comprised of four 16-wide vector units, a single scalar unit, four texture filter units, the associated texture fetch load/store units, and plenty of registers, data shares, and cache. While the basic unit remains the same for the R9 290X as it was for the parts released last year, these are still very high-performance compute units that are a massive step forward from the VLIW4/VLIW5 architectures of past AMD cards.
Double the primitives, double the geometry throughput, and double the tessellation. AMD has beefed up the front end of Hawaii so it will be more apt to compete with the high end GTX 780 and GTX Titan from NVIDIA. Tessellation and geometry performance has always been lower than what NVIDIA offers, but this should cover most of the bases when it comes to real-world gameplay involving tessellation.
This chip outputs 64 pixels per clock and up to 128 Z/stencil operations. Each render back end is attached to a memory controller, and together they offer a tremendous amount of fillrate per clock. Running between 800 and 1000 MHz, a single card should be able to color a whole lot of pixels on multiple screens without issue.
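To put the per-clock pixel rate in context, here is the peak pixel fillrate at the two ends of the clock range quoted above:

```python
# Peak pixel fillrate = pixels per clock × clock speed.
pixels_per_clock = 64

for clock_mhz in (800, 1000):
    gpixels_per_sec = pixels_per_clock * clock_mhz / 1000  # Gpixels/s
    print(f"{clock_mhz} MHz -> {gpixels_per_sec} Gpixels/s")
```

At 1000 MHz that works out to 64 Gpixels/s of peak color fill, and 51.2 Gpixels/s at the 800 MHz floor.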
Speaking of memory controllers, the R9 290X has eight of them, giving a memory bus width of 512 bits. AMD is shipping this card with 4GB of memory running at 5 GHz effective, good for around 320 GB/sec of bandwidth. This outstrips the 384-bit memory buses of the HD 7970/R9 280X, the GTX 780, and the GTX Titan.
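The bandwidth figure falls straight out of the bus width and the effective data rate:

```python
# Memory bandwidth = bus width (bits) × per-pin data rate (Gbps) / 8 bits-per-byte.
bus_width_bits = 512
effective_rate_gbps = 5  # "5 GHz effective" GDDR5

bandwidth_gb_s = bus_width_bits * effective_rate_gbps / 8
print(bandwidth_gb_s)  # 320.0 GB/s
```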
Hawaii is a bigger chip than AMD is used to building. It is quite a bit smaller than the 530 mm² GK110, yet it actually appears to be a small step above that part in performance across different workloads. In this table we see a doubling of primitives, 1.3x compute, higher texture and pixel fillrate, higher memory bandwidth (which could potentially go higher with faster memory), and a die-area increase well below those ratios between the Tahiti chip and Hawaii. AMD has rearranged the GCN architecture to maximize all of those characteristics without exploding the die size and transistor count. This is a very powerful chip, and it is not all that much bigger than AMD's previous high end.
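As a rough sanity check on the "1.3x compute for much less than 1.3x die area" claim, here is the Tahiti-to-Hawaii ratio math. Tahiti's 2,048 stream processors, the clock speeds, and the die areas used below are assumptions drawn from AMD's published specs, not from this article:

```python
# Rough Tahiti (HD 7970 GHz) vs. Hawaii (R9 290X) ratios.
# All Tahiti figures and die areas are assumed spec values.
tahiti_sp, tahiti_clock_ghz, tahiti_die_mm2 = 2048, 1.05, 352
hawaii_sp, hawaii_clock_ghz, hawaii_die_mm2 = 2816, 1.00, 438

# Peak single-precision TFLOPS: 2 ops per FMA per stream processor per clock.
tahiti_tflops = 2 * tahiti_sp * tahiti_clock_ghz / 1000
hawaii_tflops = 2 * hawaii_sp * hawaii_clock_ghz / 1000

print(round(hawaii_tflops / tahiti_tflops, 2))    # ~1.31x compute
print(round(hawaii_die_mm2 / tahiti_die_mm2, 2))  # ~1.24x die area
```

The compute uplift (~1.31x) matches the article's 1.3x figure while the die grows only ~1.24x, which is the efficiency point the paragraph is making.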
Looks like Titan kicked its arse.
NVIDIA FTW !
are you mad?
I would like to see the R9 290x and the GTX Titan benchmarked together with the fans at the same rpm and the temperature throttle point set to the same value (like 70C) so that each card can boost up as much as possible while staying in that noise/temp envelope.
Great review Ryan, I really like your frame time graphs, especially @1440p.
It was also shocking to see how low the 290X's average clocks really are. And still it competes!
It would be great if, in future reviews, you could add the real average clock speeds for a particular benchmark behind the name of each card, instead of stating the boost/base clocks.
That would make for an easier comparison, since clock speeds seem to be all over the place on both the 290X and NVIDIA cards.
The gaming experience I can get from a reference R9 290X would be better than the reference GTX 780 gaming experience if:
a) I use noise-canceling headphones;
b) I use a water block;
c) I am willing to pay a bit extra in my power bills.
But, wouldn’t that make the R9 290X “experience” cost me ~$750, or more?
No, the R9 290X is not worthy of a Gold Award.
Sorry Ryan, but I think you are not being consistent with yourself; the GTX 780, which already performs better than a TITAN, didn't even get a Brass Award in your review!
And I don't even want to continue writing about the "gaming experience" a card delivers being more important to me than its "raw performance" numbers, because that is a personal preference.
Anyway, for those who only care about "raw performance": real owners, in real-world gaming rigs, have already started posting R9 290X results in forums, and those results are no better than results posted previously by GTX 780 owners.
It's a review, it's his opinion, and it's just as valid as yours.
I'm a long-time gamer and tester, and far from being biased; I've loved both NVIDIA and ATI for years. The reference R9 290X is a hotter-running card, but with a little know-how its heat can be controlled for free and its noise limited if set right. Just use MSI Afterburner. My GTX 780 under load in FurMark hits 87C at 1280×720 in an open-air room at 75F; the R9 290X hits 87 to 90C at the same settings. The NVIDIA card actually adjusts its fan speed like it should, while the R9 does not, so I basically set my R9 fan to the same temperature target as the 780, and magic: the fan works perfectly. My temps are now 78C under full load and the fan is only slightly louder, nothing to complain about. If you have never used FurMark, go get it from Geeks3D; this program loads a card like no other. At this price point the card is a great deal for the performance. It's not that AMD has put on a cheap cooler/heatsink (it's actually a great one), but the reference BIOS is not set right for the fan. The fan in an AMD card runs at much higher RPM and can put off some serious noise, but you can pick up a reference card fairly reasonably because of these issues. Trust me: get MSI Afterburner, set it to boot with Windows, and adjust these settings. If you need my settings, reply and I'll get back to you. Like I said, I like both, but with new games being developed and streamlined for AMD for the next few years, I see AMD cards being a bit on top, unlike the past when NVIDIA had the mainstream connection with game developers.
International Stutter Units… LMAO
Amazing review, thanks 🙂
Hi, page 1 of this article states that the GTX Titan/780/280X/7970 GHz have a 320-bit bus when in reality they are 384-bit.