
The AMD Radeon R9 Fury X 4GB Review - Fiji Finally Tested

Author: Ryan Shrout
Manufacturer: AMD
Tagged: 4GB, amd, Fiji, Fury, fury x, hbm, R9, radeon

Power Consumption and Sound Levels

With flagship GPUs now touting power consumption and efficiency as key tenets of design, how much was AMD able to improve its standing compared to the incredibly impressive Maxwell GPU from NVIDIA and its own Hawaii GPU from a couple of years back?

[Chart: GPU power consumption under gaming load]

This is a very impressive result for AMD. The brand new Fury X, built on the same 28nm process technology as the R9 290X using the Hawaii GPU, is able to operate under a gaming load at about 40 watts less power! Keep in mind Fiji is offering improved performance in excess of 35% and is packing in 45% more stream processors than Hawaii. This is great news for AMD as its engineers are clearly headed in the right direction and took a huge leap in power efficiency this generation.
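
For those who want to put those two figures together, here is a quick back-of-the-envelope performance-per-watt sketch in Python. The R9 290X baseline wattage is an assumed placeholder, not a measured figure from this review; only the roughly 40 W delta and the 35%+ performance uplift come from the paragraph above.

```python
# Rough perf-per-watt sketch using the deltas quoted above.
# NOTE: the 290X baseline wattage is an assumed placeholder,
# not a measured value from this review.

r9_290x_power_w = 290.0                      # hypothetical gaming-load draw for the R9 290X
fury_x_power_w = r9_290x_power_w - 40.0      # "about 40 watts less power"

r9_290x_perf = 1.00                          # normalize Hawaii performance to 1.0
fury_x_perf = 1.35                           # "improved performance in excess of 35%"

perf_per_watt_290x = r9_290x_perf / r9_290x_power_w
perf_per_watt_fury = fury_x_perf / fury_x_power_w

improvement = perf_per_watt_fury / perf_per_watt_290x
print(f"Fiji perf/watt vs Hawaii: {improvement:.2f}x")   # ~1.57x with these assumptions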

The GTX 980 Ti from NVIDIA, though, still used about 20 watts less power than the Fury X while running faster than the AMD flagship in all but a couple of games. Fiji is drastically improved over Hawaii, but Maxwell is still king here.

One thing to keep in mind with the power consumption numbers: it is very likely that part of the reason AMD went with a water cooler design was to keep temperatures as low as possible. Lower temperatures reduce leakage in the GPU and help maintain lower power consumption for a given performance target. AMD clearly could have used an air cooler for the Fury X, but the cost might have been more than just extra noise; it probably would have meant lower efficiency as well.

Sound level testing was a bit more complex with the Fury X thanks to its water cooler.

[Chart: sound level measurements (dBA)]

Our results, as measured by the sound meter, show the R9 Fury X in a very positive light. Keep in mind that I moved the radiator and fan near the back plate of the Fury X for this testing to be sure we were combining the sound output from the card and the cooler in our measurement. At just 36.4 dBA, the Fury X is much quieter than the GTX 980 Ti reference cooler and only loses to the ASUS DirectCU II cooler on our retail R9 290X.

One thing that is not shown in this graph though is the high pitched whine that is present 100% of the time with our review sample powered up. The sound is clearly part of the pump mechanism and I know from discussions with other reviewers that this is a common problem in the launch samples. AMD addressed this to me in an email, stating that the issue was limited to the initial batch of engineering samples and that the issues had "been resolved and a fix added" for all production parts going on sale to the public. Obviously we'll have to wait for reports from the field to verify that.

I will say that installing the R9 Fury X into a chassis, or even just putting a piece of cardboard between the open test bench and my head, resulted in the sound being nearly imperceptible. Still, this is a concern worth keeping an ear on.


June 24, 2015 | 08:22 AM - Posted by Anonymous (not verified)

Why compare it with the 980 Ti if it can't keep up? Stupid marketing spoiled what was otherwise an excellent product with massive improvements. They should've sold this for $600, and no one could've complained.

June 24, 2015 | 09:05 AM - Posted by Anonymous (not verified)

Tom's Hardware's review came out a little more positive, with better power figures too. But I expected a little more from them. Hope they can come up with a driver that delivers a flat 5% improvement.

June 24, 2015 | 09:07 AM - Posted by Leyaena (not verified)

Why they compare it against the 980 Ti is pretty clear to me:
it comes in at the exact same price point.

I honestly feel bad for AMD; I had really hoped this would be their big break. But as it stands, the 980 Ti and especially the Titan X still reign supreme.

June 24, 2015 | 10:21 AM - Posted by Eric (not verified)

I think he was saying 'why did AMD line it up with the 980 Ti' - that's why he said they should have priced it cheaper and then people would be saying 'yeah, it's not quite as fast, but it doesn't cost quite as much either'.

June 24, 2015 | 09:14 PM - Posted by Gary Lee (not verified)

If they had put the price point at $500-550, AMD would have sold scads more of them and given legendary performance for a gamer's buck.

June 24, 2015 | 09:58 PM - Posted by Anonymous (not verified)

your 980Ti is destroyed by Fury X
http://www.forbes.com/sites/jasonevangelho/2015/06/18/amd-radeon-fury-x-...

June 25, 2015 | 12:37 AM - Posted by Mandrake

Are you seriously expecting people to take AMD's own benchmark slides as proof? Every single major review from reputable sites like PCPer, HardOCP, TechReport, etc. has shown the 980 Ti is a little ahead of the Fury X.

Ryan's review was generous to AMD. A silver award. The Fury X isn't a bad card at all, it just isn't as good as the 980 Ti. It makes AMD far more competitive than they have been for a long while now. Given that they are using HBM technology and water cooling, though, I did hope for at least parity with the 980 Ti.

June 25, 2015 | 07:43 AM - Posted by Anonymous (not verified)

Idiot, the Fury X got beaten left, right, and center. Even overclock vs. overclock:

http://www.purepc.pl/karty_graficzne/premierowy_test_amd_radeon_r9_fury_...

Any enthusiast paying $650 for a high-end card will overclock it like mad, and this just shows how piss-poor the Fury X performs. Heck, even a well overclocked 980 is comparable to it lmfao XD

June 25, 2015 | 09:33 AM - Posted by Morphius (not verified)

Exactly. "Good enough computing" is getting pretty boring, with reviewers and consumers seemingly only caring about making computational components less and less powerful.

Guess what, take a sledgehammer to a 290X and it will drop the power consumption by 100%! I really don't understand why they seem to not understand this.

So much money gets wasted on making lower performing components these days...
I can only dream of what a 1~2 kW liquid-cooled graphics subsystem would be able to achieve when paired with a 250~500 Hz display.
But I digress, the human eye cannot even see more than 24 (or was it 12) frames per second (or minute?) so no one really cares either way. Sigh.

June 25, 2015 | 10:06 AM - Posted by Squall Loire (not verified)

"But I digress, the human eye cannot even see more than 24 (or was it 12) frames per second (or minute?) so no one really cares either way. Sigh."

Common misconception. The human eye requires 24 frames per second to blend them into motion, but it is capable of seeing much, much more than that. Tests have been performed that showed pilots were able to see and identify images of aircraft displayed for just 1/2000 of a second.
The only limiting factor to framerates with current tech is display refresh rate. No point going over 60 fps if your screen only does 60 Hz.

June 25, 2015 | 10:48 AM - Posted by Morphius (not verified)

Sadly, I am well aware of this piece of actual scientific information. I happen to fall into a minuscule minority who are not able to perceive smooth motion even on today's most advanced 144 Hz G-Sync displays (I bought one hoping it would help, as my CRT died a couple of years back).
At 185 Hz the CRT was still extremely flickery, but at least I was able to aim on target...

My comment was merely targeted at explaining why we are not seeing any real push for increased frame rates (or decreased latencies) beyond what was already available at the turn of the century. Untrue or not, what the masses believe automatically translates into an economic (dis)incentive.
The massive investment required in developing a high-power semiconductor process is simply insurmountable; therefore the single-threaded performance of a ~5GHz Sandy-Bridge caps the ultimate performance of most games at somewhere between 100~200fps, irrespective of GPU-power or screen resolution, for at least the next 10-15 years.

June 30, 2015 | 06:29 AM - Posted by Anonymous (not verified)

It's like 72 frames per second.... after that, the human eye can't really tell the difference.

June 24, 2015 | 01:22 PM - Posted by StephanS

It's compared to a 980 Ti because it's within a 10% performance range.
The stock Fury is 14% faster than the 980 Ti in Metro; this tells you how much shader compute this monster has to offer vs. the GTX 980 Ti.

Even so, the Fury X is often faster than the GTX 980 Ti. I agree with your $600 price. At $599 AMD would have cleaned up the high-end gaming market. At $650 there is a barrier, since NVIDIA carries brand-name value over AMD. Sadly, we know many people are ready to spend 2x the money for something barely faster because it's NVIDIA.

Anyway, I hope AMD sells out so they can continue to push GPUs forward. Because if NVIDIA is the sole GPU provider for gaming, it's not going to be good for any of us.

June 24, 2015 | 09:55 PM - Posted by Anonymous (not verified)

HERE IS A BENCHMARK: 980 Ti vs. Fury X x2 :D and the winner issssss:
http://www.forbes.com/sites/jasonevangelho/2015/06/18/amd-radeon-fury-x-...

The FURY X X2, of course....
And these two cards have the same price!!! But the Fury X is becoming cheaper (and is better) and NVIDIA will not change its price... because this AMD series has destroyed it!!!

June 28, 2015 | 11:18 AM - Posted by Rob G (not verified)

These aren't real benchmarks; these were "leaked" official benchmarks released by AMD where they tweaked in-game settings to favor the Fury X. I can tweak in-game settings and OC my GTX 970 to make it look faster than an R9 390X too.

August 24, 2015 | 06:32 PM - Posted by memester (not verified)

If you are doing non-gaming workloads, the R9 Fury X can be the better GPU by far, with its much higher compute performance.

June 24, 2015 | 08:24 AM - Posted by Anonymous (not verified)

https://www.youtube.com/watch?t=138&v=BWbRSHtHI6c

AMD's Richard Huddy answers Q&A questions about HDMI 2.0:

Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.

June 24, 2015 | 08:26 AM - Posted by Ryan Shrout

I've used these types of adapters for 5-6 years and I can say that they never quite work 100% like they are supposed to. I don't consider this a good excuse for not having HDMI 2.0.

June 24, 2015 | 08:38 AM - Posted by Anonymous (not verified)

No, you have not "used these types of adapters for 5-6 years."
The adapters from 5-6 years ago were passive DVI-to-HDMI!

The active DP1.2a-to-HDMI 2.0 adapters arrive in the winter.
Active = converting DP1.2 at 300 MHz to HDMI 2.0 at 600 MHz.

A completely different product.

June 24, 2015 | 09:09 AM - Posted by Ryan Shrout

I understand that the HDMI 2.0 part is going to be new. But I have definitely been using DP to DL-DVI adapters for 5-6 years, despite what your petulant comment indicated. Things like this have been around a LONG TIME: http://www.newegg.com/Product/Product.aspx?Item=N82E16812607011

It's the same fundamental technology to convert between two digital signals.

June 24, 2015 | 10:58 AM - Posted by Mobile_Dom

We don't see the word petulant as often as we should anymore.

June 25, 2015 | 10:09 AM - Posted by Squall Loire (not verified)

You just said yourself that you've been using DP to DVI adapters, and not the DP to HDMI adapters the OP was referring to...

June 25, 2015 | 07:06 PM - Posted by Wolvenmoon (not verified)

HDMI and DVI use the same signaling over different physical jacks.

June 25, 2015 | 09:36 PM - Posted by Anonymous (not verified)

Agreed. I've been patiently waiting for this adapter for quite some time now. I would like to know which retailer was selling them for the last 5-6 years, because I know for sure it wasn't Newegg or Amazon.

June 24, 2015 | 08:57 AM - Posted by Genova84

I've been hearing about these rumored adapters for over a year now. People have posted their emails with both Accell and Bizlink, and neither company is saying these are definitely launching this year. They say, "maybe by year end." I don't know why AMD wouldn't want to control this basic piece of hardware. It already has an HDMI port, FFS! Why not just make that HDMI 2.0? I don't get it.

June 24, 2015 | 09:18 AM - Posted by Anonymous (not verified)

Because there is no such product in the world yet!

The problem is converting DP1.2a at a low-voltage 300 MHz to a high-voltage 600 MHz,
and the driver needs to support all of the protocols of the HDMI 2.0 standard.
It is very complicated.

MSI is saying their 380, 390 and 390X all have HDMI 2.0.

AMD admits it is at HDMI 1.4 bandwidth:
http://oi57.tinypic.com/27wsoxt.jpg

AMD has "HDMI 2.0" today, NVIDIA style;
the only difference is AMD admits it (HDMI 1.4 bandwidth, 10.2 Gbps).

ASK NVIDIA IF THEY HAVE:

HDCP 2.2?
DCI-P3 color?
SiI9777 chip?
HDR?
18 Gbps bandwidth?
...

Reviews without checking the truth are no reviews!

June 24, 2015 | 09:26 AM - Posted by Ryan Shrout

You might be our most persistent commenter to date. ;)

June 24, 2015 | 09:51 AM - Posted by TOMI (not verified)

Ryan, ASK NVIDIA.

June 24, 2015 | 09:52 AM - Posted by Jabbadap (not verified)

Nah, Maxwell 2.0's integrated display controller can do a 600 MHz pixel clock and has full YUV 4:4:4 / RGB 4:4:4 color at UHD over HDMI 2.0. And it does have an HDCP 2.2 link.

More problematic is the chip on the TV side, which in the early days (and even today) often was not really capable of HDMI 2.0 (they were just modified HDMI 1.4). Thus at the time the GTX 980/GTX 970 were released, there were TVs marketed as "HDMI 2.0 4K" which in reality were only capable of YUV 4:2:0 at 2160p@60Hz (which even Keplers can do over HDMI 1.4 via drivers).

June 24, 2015 | 10:23 AM - Posted by BlackDove (not verified)

What monitor do you have that's DCI-P3? And you do realize that Rec.2020 monitors will be much better, and Rec.2020 will be a consumer and pro color standard, right? What content do you even need P3 over that cable for? Shouldn't you have a Quadro with SDI outputs?

June 24, 2015 | 12:34 PM - Posted by Allyn Malventano

Dude, when are you going to understand that Maxwell *does* support HDMI 2.0 with either 4:4:4 or full RGB 8- or 10-bit color at 4K60?

June 24, 2015 | 12:48 PM - Posted by Anonymous (not verified)

I don't think you can support something higher than the standard.

http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx#146

4K @ 60 Hz:
10-bit 4:2:0

4K @ 30 Hz:
8-bit 4:4:4

HDMI 2.0 can't do 4:4:4 @ 60 Hz or support Rec. 2020 @ 60 Hz. You have to use DisplayPort 1.2 or higher for that.
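
For anyone trying to follow the bandwidth argument in this thread, here is a rough link-bandwidth sketch in Python. It assumes the common CTA-861 timing for 3840x2160@60 (4400 x 2250 total pixels, for the roughly 594-600 MHz pixel clock mentioned above) and TMDS 8b/10b encoding; the blanking figures are assumptions for illustration, not a spec quotation, but they show which modes fit inside the 10.2 Gbps and 18 Gbps limits cited earlier in the thread.

```python
# Rough HDMI link-bandwidth arithmetic for 3840x2160 @ 60 Hz.
# Assumes CTA-861 blanking (4400 x 2250 total) and TMDS 8b/10b encoding;
# treat the exact figures as illustrative, not a spec quotation.

h_total, v_total, refresh = 4400, 2250, 60
pixel_clock_mhz = h_total * v_total * refresh / 1e6        # ~594 MHz

def tmds_gbps(pclk_mhz):
    # Three TMDS channels, 10 bits on the wire per 8-bit symbol
    return pclk_mhz * 3 * 10 / 1000

full_444 = tmds_gbps(pixel_clock_mhz)        # ~17.8 Gbps -> needs an 18 Gbps (HDMI 2.0 class) link
ycbcr_420 = tmds_gbps(pixel_clock_mhz / 2)   # ~8.9 Gbps  -> fits in 10.2 Gbps (HDMI 1.4 class)

print(f"4K60 RGB/4:4:4 8-bit:   {full_444:.1f} Gbps")
print(f"4K60 YCbCr 4:2:0 8-bit: {ycbcr_420:.1f} Gbps")
```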

June 25, 2015 | 07:18 AM - Posted by PCPerversion

Apart from the fact that Ryan Shrout and Allyn Malventano clearly chug Nvidia cock, this isn't such a bad review.

June 24, 2015 | 09:42 PM - Posted by Anonymous (not verified)

Exactly!!! Nvidia DOESN'T HAVE HALF the features which AMD has!!! Nvidia fanboys only know how to say: Nvidia has lower TDP :((( WOOOW, lower TDP!!!! Fanboys, BUY yourselves a GOOD PSU if you are gamers; every true gamer has a great PSU. And most important, does Nvidia maybe have VSR???? If you fanboys even know what that is... no, they don't! Because Nvidia is SHIT!!! They can't steal any more and now make terrible cards... the drivers are especially bad. They do not have a good driver for the 9xx series! They don't even know how much VRAM they have on their cards.. so... Nvidia is simply a big shit!!!!

June 25, 2015 | 12:41 AM - Posted by Mandrake

VSR is essentially a blanket copy of DSR. Oops. DSR works on everything back to Fermi.

July 2, 2015 | 03:29 PM - Posted by Anonymous (not verified)

Every tru gamer speak english good

June 24, 2015 | 12:37 PM - Posted by Anonymous (not verified)

Somehow I don't think the 20 people in the world with 4k 60Hz HDMI 2.0 televisions who are looking at buying a high-end graphics card for a gaming HTPC are the market AMD is shooting for with this card. By the time 4k 60Hz televisions become so widespread that HDMI 2.0 is a requirement, the original Fury is going to be long obsolete.

June 24, 2015 | 12:49 PM - Posted by Martin Trautvetter

You should shop around for a new TV, HDMI 2.0 / 4k60 is everywhere.

Which makes AMD's omission of HDMI 2.0 so baffling.

June 24, 2015 | 01:19 PM - Posted by trenter (not verified)

Ryan, please consider adding more games to your testing suite. I think AMD will come out looking a lot better in some of the newer games like Shadow of Mordor, Far Cry 4, and possibly upcoming games not developed for last-gen console hardware. Good review though; the power figures were surprising.

June 24, 2015 | 08:30 AM - Posted by Martin Trautvetter

And I'm sure Mr Huddy will hand-deliver one of those to every new owner of a Fury X, right?

June 24, 2015 | 09:14 AM - Posted by Gunbuster

Indeed, the classic AMD Coming Soon™ with a splash of not our problem.

It's hard enough to find a 4k TV that even does HDMI 2.0 4k 60Hz 4:4:4 correctly. You don't want to be throwing a janky active converter into the middle of that.

June 24, 2015 | 10:25 AM - Posted by BlackDove (not verified)

They're all a complete waste since NONE of the current 4K TVs support the REAL 4K standard, which is Rec.2020.

June 24, 2015 | 12:48 PM - Posted by Anonymous (not verified)

"It's hard enough to find a 4k TV that even does HDMI 2.0 4k 60Hz 4:4:4 correctly."

This is pretty much why it's no big deal that Fury X doesn't have HDMI 2.0 - nobody needs it. By the time 4k 60Hz televisions are widespread enough that HDMI 2.0 is a requirement, the Fury X will be obsolete anyway.

June 24, 2015 | 08:26 AM - Posted by Martin Trautvetter

Too bad about the pump noise.

June 24, 2015 | 09:26 AM - Posted by obababoy

The other websites said this is a non-issue once you change the speed profile a bit.

June 24, 2015 | 10:33 AM - Posted by Ryan Shrout

Maybe someone did, but TR, HWC, and Guru3D all made comments on the pump noise.

June 24, 2015 | 11:38 AM - Posted by Martin Trautvetter

AFAIK you can't change the speed of the pump. Ryan?

btw: computerbase.de reports that a retail-sample Fury X has toned-down, but still present, pump noise compared to their press sample.

June 24, 2015 | 05:17 PM - Posted by djotter

I have the same pump noise on my CoolerMaster Nepton 140XL. I turned down the pump to reduce the whine, but it is still audible outside my case.

June 24, 2015 | 08:35 AM - Posted by Tedders (not verified)

"AMD told me this week that the driver would have to be tuned "for each game". This means that AMD needs to dedicate itself to this cause if it wants Fury X and the Fury family to have a nice, long, successful lifespan."

This scares me.

June 24, 2015 | 08:40 AM - Posted by Cataclysm_ZA

Don't be afraid. That's exactly the reason why an HD 7970 GHz is still a decent performer today and still gets game improvements added to it. Same thing with the R9 290X, which actually improved drastically over time to where it beats the GTX 980 in quite a few games. AMD does need to work on drivers and apply the same tessellation fix to Fiji that they did with Hawaii, but overall I can see this card becoming more competitive over time.

June 24, 2015 | 01:30 PM - Posted by StephanS

I actually can't wait to see DX12 games on a 290X vs. a GTX 980.

From Mordor & Metro, we see that the 290X has more oomph than the GTX 980 at high resolution, but gets clobbered at 1080p.

If the DX12 drivers are at parity on both platforms, the numbers might hold across resolutions, and we might see that the 290X is always faster than the GTX 980...

But then again, Battlefield 4 Mantle seems broken, so maybe AMD won't be able to capitalize on DX12?

June 24, 2015 | 07:51 PM - Posted by Titan_V (not verified)

Mantle in Battlefield 4 is most definitely NOT broken. I use it every day. Works great!

June 24, 2015 | 12:39 PM - Posted by Barfly (not verified)

So they make a card that took years to develop and they could not release it with a decent driver on day one?
This is like game devs releasing PC ports.
So sick of this stuff.

June 24, 2015 | 08:45 AM - Posted by Cataclysm_ZA

Ryan, any word from AMD on why tessellation performance for the Fury X is low-ish? It's barely better than Tonga (R9 285), let alone the R9 290X.

June 24, 2015 | 08:53 AM - Posted by Martin Trautvetter

Can't cheat fast enough? :p

June 24, 2015 | 08:58 AM - Posted by Cataclysm_ZA

I'm not sure what you're referring to here.

June 24, 2015 | 12:44 PM - Posted by Martin Trautvetter

Just making light of AMD 'finding' massive amounts of extra tessellation performance in one- and two-year-old GPUs.

June 24, 2015 | 09:11 AM - Posted by Ryan Shrout

The answer as to why is just because it doesn't integrate any more tessellation hardware than Tonga did. Same with the ROP count.

June 24, 2015 | 09:17 AM - Posted by Cataclysm_ZA

Ah! Well, it's not a bad result and still a pretty good deal even at $650. I hope AMD's driver team makes this card last just as long and improve just as much as the HD 7970 did over time.

TechReport's review theorized that the number of ROP units wasn't what was holding back Hawaii. Is that your view as well?

June 24, 2015 | 05:21 PM - Posted by trenter (not verified)

Is it not using the Tonga improvements? TechReport claims Tonga improved tessellation substantially over Hawaii when they reviewed the 285.

June 25, 2015 | 12:04 PM - Posted by Cataclysm_ZA

It did indeed, and Tonga is much, much faster than any Radeon before it when it comes to tessellation. But the GTX 780 is still faster in tessellation benches than Fiji, which obviously plays out when testing performance in titles that have good HairWorks implementations.

June 24, 2015 | 08:45 AM - Posted by General Lee (not verified)

2% faster than 290X in GTA V at 1440p. How's that even possible...

$600 would've been a more fitting price. Sad to see AMD is still not able to create a faster GPU than Nvidia. They could really use a halo product, but Fury X doesn't really impress. I guess they could have much to gain with drivers, but going with a 980 Ti is probably the safer bet overall.

June 24, 2015 | 09:11 AM - Posted by Ryan Shrout

I agree that $600 would feel a lot better for this, and I asked AMD for a price drop yesterday, but they didn't listen to me. :)

In general though I think they are not going to have issues selling through their stock of the Fury X for several months.

June 24, 2015 | 11:02 AM - Posted by Rustknuckle (not verified)

I think it's going to be a while before you see a price drop, since they have already dropped down from the $800 they were planning to charge for Fury before the 980 Ti came out.

June 24, 2015 | 12:33 PM - Posted by Azix (not verified)

How did you know they were at $800?

June 24, 2015 | 03:37 PM - Posted by arbiter

That was the rumored price for the Fury X in the months leading up to launch, $850; the 980 Ti was rumored at $650.

June 24, 2015 | 09:00 AM - Posted by Anonymous (not verified)

So much for getting more dGPU market share, AMD....

June 24, 2015 | 09:00 AM - Posted by Dark_wizzie (not verified)

Hey Ryan, why do you guys still use Unigine Heaven? Isn't Unigine Valley a newer benchmark?

June 24, 2015 | 09:12 AM - Posted by Ryan Shrout

Meh, not that big of a difference. Unigine Heaven is just a standardized synthetic test with an emphasis on tessellation.

June 24, 2015 | 10:20 AM - Posted by Dark_wizzie

The benchmark is 10x more satisfying to watch though. :')

June 24, 2015 | 09:04 AM - Posted by Anonymous (not verified)

All the AMD fanboys on various tech sites hyping it and telling us how this will destroy the Titan X / 980 Ti, what now, huh?
The Fury X got destroyed even with HBM, and it still uses more power.

June 24, 2015 | 09:18 AM - Posted by Gunbuster

I would not say "destroyed," but coming months late to the party, only being able to hang at ±5% performance in most cases, and still using more power is a pretty poor showing.

Thanks for beta testing HBM though, AMD; I'm sure I'll enjoy it in an NVIDIA card down the road ;)

June 24, 2015 | 03:41 PM - Posted by arbiter

Yeah, I am one of those people that always gets attacked for a lot of the things I say, despite the fact that they're true. There are things I am wrong about, no one's perfect, but I was pretty much spot on with this card not being much faster, if at all, than the 980 Ti.

I was wrong about the power draw, but it was possible for it to be in that range given AMD's history, but meh.

June 24, 2015 | 05:24 PM - Posted by trenter (not verified)

Go read other reviews with more games tested; the Fury is 5% faster than the 980 Ti at 4K.

June 24, 2015 | 09:10 AM - Posted by Snake Pliskin (not verified)

This review sucks.

June 24, 2015 | 09:27 AM - Posted by Ryan Shrout

Nuh-uh!

June 24, 2015 | 10:00 AM - Posted by Dark_wizzie (not verified)

<3
I love your reviews.

psst: Tell Allyn I want trace-based analysis for gaming on an SSD. :)

June 24, 2015 | 10:34 AM - Posted by Ryan Shrout

I keep telling him too!

June 24, 2015 | 11:06 AM - Posted by Dark_wizzie

:D

June 24, 2015 | 12:37 PM - Posted by Allyn Malventano

For gaming? Hmm, interesting idea, but the trace would be mostly a line riding 0 (for a game at least). I am cooking up some new stuff though - to be rolled out in a big enterprise review first and consumer parts later.

June 25, 2015 | 07:20 AM - Posted by PCPerversion

Apart from the fact that Ryan Shrout and Allyn Malventano clearly chug Nvidia cock, this isn't such a bad review.

June 24, 2015 | 10:44 AM - Posted by John H (not verified)

Clearly this entire article was stolen from JoshTekk

June 24, 2015 | 01:49 PM - Posted by Josh Walrath

Clearly!

June 24, 2015 | 09:12 AM - Posted by Anonymous (not verified)

Here choose your pickings, read it and weep:

http://www.guru3d.com/articles-pages/amd-radeon-r9-fury-x-review,1.html

http://hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_re...

I'm sure more will follow with the exact same figures and conclusions LOL

June 24, 2015 | 09:12 AM - Posted by Keven Harvey (not verified)

I wonder if it will pick up some steam on July 29th. Surely less driver overhead can only help.

June 24, 2015 | 09:27 AM - Posted by Ryan Shrout

The release of Windows 10 and DX12 will not suddenly make every game built on a DX12 engine.

June 24, 2015 | 09:32 AM - Posted by Boggins (not verified)

Like with every new DX generation, it always takes several years after release before games truly switch over to the new standard. Let's hope Microsoft's offer to roll Windows 10 out to Windows 7 and 8/8.1 users will speed adoption up and give developers a reason to go to DX12 earlier.

But by the time DX12 is standard in games, we'll be on to the next big GPU generation.

June 24, 2015 | 03:43 PM - Posted by arbiter

The graphics improvements will surely take some time, but I expect the speed improvements DX12 offers will be adopted pretty quickly, since they will open up things to make games look and run better.

June 24, 2015 | 05:34 PM - Posted by trenter (not verified)

The difference is the Xbox One has DX12 and devs have been using it for a while now. Windows 10 is free, allowing a much larger install base. If people buy GPUs with the intention of keeping them for 2-3 years, then DX12 games will be shipping in that period of time.

June 24, 2015 | 09:43 AM - Posted by Anonymous (not verified)

It indeed does not, but testing Win10 right now, together with the Win10 beta drivers, already gives us better performance in DX11 games.

It seems that either Win10 is more efficient, using DX12 magically improves DX11 performance, or the beta drivers for Win10 are actually better than the current beta drivers for Win8.1.

June 24, 2015 | 10:34 AM - Posted by Ryan Shrout

Source on this claim?

June 24, 2015 | 11:03 AM - Posted by General Lee (not verified)

There have been plenty of claims that AMD's drivers get improved DX11 CPU utilization on Win 10. Of course, this is just some people on the forums; I haven't heard of any HW site making proper tests yet.

Here's an example someone has made for Project Cars:
https://www.youtube.com/watch?v=XzFe5OOHZko

June 24, 2015 | 11:13 AM - Posted by Anonymous (not verified)

Don't forget about the Vulkan graphics API, and some SteamOS/Linux testing as titles become available for Steam Box/SteamOS-based systems. Also some DX12 multi-adapter testing; I'm not sure about multi-adapter support in Vulkan, but there should be some. There are a lot of newer graphics API technologies that will have to be tested on the latest cards, and back-tested on older-generation cards that may be able to take advantage of the newer graphics API technologies to a lesser degree than the newest cards, but there may still be improvements.

June 25, 2015 | 03:42 PM - Posted by Keven Harvey (not verified)

The game, no, but the driver, maybe. Your own 3DMark API testing has shown that AMD gets no benefit from going from DX11 single-threaded to DX11 multi-threaded. Maybe it can somehow free up that main thread the game is using.

June 24, 2015 | 09:23 AM - Posted by onion uk (not verified)

I'm kinda getting the feeling the drivers are playing a sizable role in the weaker performance.

June 24, 2015 | 09:26 AM - Posted by Gunbuster

Of course, better drivers Coming Soon™ from the software experts at AMD...

June 24, 2015 | 09:25 AM - Posted by Anonymous (not verified)

What a joke, AMD

June 24, 2015 | 11:24 AM - Posted by Anonymous (not verified)

What high prices (higher than the already too-high prices in Nvidia's case) you Nvidia fans would have to pay if AMD was not at least within the margin of error with their competing products. And every driver needs tweaking no matter the GPU maker, especially for the latest hardware. AMD is not doing too shabbily with only 4GB of memory, and as soon as HBM2 is available it will be on AMD's updated Fiji cards, with very little extra engineering needed. Nvidia will still be in its testing and certification phases with HBM, while AMD will have an almost drop-in capability to accept the newer HBM standard.

June 24, 2015 | 09:28 AM - Posted by Boggins (not verified)

I guess the 980 Ti ended up far faster than AMD expected. They probably wanted to pit the Fury X against the Titan X, and the Fury against the 980/980 Ti. Unfortunately, at the same price point as the 980 Ti, there doesn't seem to be much reason for anyone to consider buying the AMD Fury X right now, especially since it doesn't sound like the drivers are fully optimized, plus there are availability problems. It's unfortunate, but it also sounds like Nvidia has priced the 980 Ti aggressively enough that they won't need to make a price drop.

As much as I commend AMD's different approach with this new card, I am also disappointed that they weren't able to achieve better performance to shake up Nvidia's share of the market.

June 24, 2015 | 09:48 PM - Posted by Anonymous (not verified)

You stupid noob... the Fury X eats the GTX 980 Ti!!! Here is a benchmark:
http://www.forbes.com/sites/jasonevangelho/2015/06/18/amd-radeon-fury-x-...

And the GTX 980 Ti and Fury X have the same PRICE!!! Get informed before you start to write nonsense... and the most powerful cards in the world are the Fury X x2 and R9 295X2!!! All Nvidia cards are not even close. This is real power! AMD... Nvidia will always be SHIT!!! And a thief!

June 24, 2015 | 09:28 AM - Posted by obababoy

Ryan,

Did you guys use the newer 15.15 driver? I am assuming so.

June 24, 2015 | 09:41 AM - Posted by Ryan Shrout

Yes, that's listed on our Test Setup page.

June 24, 2015 | 01:21 PM - Posted by Anonymous (not verified)

15.6 was released a day before you posted this review.

June 24, 2015 | 01:57 PM - Posted by Josh Walrath

I believe 15.15 is based on a newer rev of Catalyst as compared to 15.6 (which features Batman optimizations).

June 24, 2015 | 09:39 AM - Posted by justin150 (not verified)

Love the look of the card (shame the water-cooling tubes aren't on the side of the card with the power connections at the back, rather than the other way around), love the size, but...

It's just not good enough, except for mini-ITX builds.

They need to drop the price, and I will wait for a version with a custom water block and no pump.

Knock $100 off the price and they have a winner, but head to head with the 980 Ti it is a bit disappointing.

June 24, 2015 | 09:40 AM - Posted by Anonymous (not verified)

All these cards, including the Titan X, are not for 4K PC gaming, period. Veteran PC gamers don't play games at console framerates, FFS. It's either a minimum of 60 fps or GTFO.

Older games you can, but only the really old games.....

1440p is where it's at now, and the 980 Ti and Titan X can do that best, with higher overclocks out of the box and lower power consumption. Not forgetting the multi-monitor & 21:9 guys either.

June 24, 2015 | 09:42 AM - Posted by Ryan Shrout

I actually agree with you here. Every card that comes out claiming it is "perfect for 4K" is being disingenuous. And I think both NVIDIA and AMD are guilty of it.

June 24, 2015 | 11:20 AM - Posted by Anonymous (not verified)

Yep, I have a 980 Ti Hybrid. It's perfect for maxed-out 1440p with Vsync on. It's stutter free.

June 24, 2015 | 09:51 PM - Posted by Anonymous (not verified)

The 980 Ti and Titan Z can go hide in front of the Fury X x2!!! The Fury X x2 and R9 295X2 are the most powerful GPUs in the world!!! And they're not loud like the 980 Ti and Titan.. which sound like a tractor...

June 24, 2015 | 09:42 AM - Posted by J Nev (not verified)

The bottom line: should have come out 6 months ago.

June 24, 2015 | 09:46 AM - Posted by TheAnonymousJuan (not verified)

Was really hoping for something better from AMD. Oh well, I just ordered up my 980 ti from Newegg after reading this and a few other reviews.

June 24, 2015 | 09:47 AM - Posted by xfsdfg (not verified)

For god's sake, some games barely play at 1080p on max settings.... where is the 1080p benchmark again? There is absolutely no point in chasing a bigger display with every new graphics card when a solid 60 fps can't be guaranteed in every game at max.

June 24, 2015 | 09:49 AM - Posted by obababoy

Tell us about 1080 60fps 13 or 14 more times and I swear I will flip out!

June 24, 2015 | 09:50 AM - Posted by J Nev (not verified)

Excuse me, Ryan: I hope you'll do a 390X review soon?

June 24, 2015 | 10:35 AM - Posted by Ryan Shrout

AMD has not seemed eager to send these out for review. But I'll get my hands on one!

June 24, 2015 | 03:44 PM - Posted by arbiter

The 390X is an overclocked 290X, so the performance of those cards is only about 10% higher.

June 24, 2015 | 09:53 AM - Posted by Bri (not verified)

Now that the reviews are out and it is indeed so very close to the 980 Ti, it does make me stop and think. Did Nvidia engage in some corporate espionage here? Why else would the company introduce a part that undercuts their own higher priced Titan unless they were already expecting the Fury and attempting to dampen the reaction...

June 24, 2015 | 09:57 AM - Posted by obababoy

Unlikely :) Nvidia has just been overcharging for the Titan X and had enough wiggle room to undercut it with a $650, almost equal version. Besides, AMD set the price based on the 980 Ti.

It compares pretty well. My issue with AMD is that on paper this card should be beating the 980 Ti... Something fishy with the drivers has to be a factor, right?

June 24, 2015 | 10:02 AM - Posted by Josh Walrath

They both use the same suppliers, have the same board partners, and many employees know one another.  There aren't a lot of secrets in the industry that stay secret for long.

June 24, 2015 | 10:38 AM - Posted by Ryan Shrout

I am quite sure that NVIDIA was doing some research to figure out what AMD might release. In actuality, I think NVIDIA would have aimed to run at just UNDER the performance of the Fury X at the same price if they'd had their choice, so I am guessing NVIDIA overestimated the perf that Fiji brought to the table.

Interesting discussion for the podcast maybe.

June 24, 2015 | 12:04 PM - Posted by Dark_wizzie

Linus totally called this one. He said that NVIDIA knew the performance of the Fury X far in advance. The 980 Ti and 980 releases and price drops were not accidents that happened to hurt the Fury X in all the right ways. Although, it's very likely that AMD knew what NVIDIA was up to as well. Cards don't just magically fall into those price points; it was pre-planned. So people wondered what NVIDIA will do with the price of the 980 Ti now that the Fury is out; the answer, I believe, is nothing: the release of the 980 Ti was already the response.

June 24, 2015 | 01:03 PM - Posted by Martin Trautvetter

I'm sure AMD had a rather unhappy day when they found out about the 980 Ti's pricing (much like they were probably quite upbeat after the Titan X).

Current Fury X pricing seems to indicate that they're trying to ride it out for the moment, but with custom Tis coming to market, I can't imagine it sticking at $650 unless supply is severely constrained.

June 24, 2015 | 03:47 PM - Posted by arbiter

NVIDIA has such a large lead in market share that they know they could take a loss on one card. They have a win in just about every other card right now, as most of their main lineup is a new chip, not an old one renamed to look new.

June 24, 2015 | 03:26 PM - Posted by Anonymous (not verified)

I suspected this when the 980 Ti so closely matched the Titan X. I suspect the 980 Ti would have been cut down more significantly without the competition from Fury X. If the 980 Ti had been cut down more significantly, then Fury X would have been competitive with a $1000 Titan X, rather than a $650 980 Ti. Nvidia not only did not cut it down much, they also bumped up the clock compared to the Titan X. If Nvidia can supply the demand for the 980 Ti, then it is acceptable; it seems to be in stock. I have to wonder if Nvidia will release another card though. It seems like they would have GPUs which have defects preventing them from being sold in 980 Ti cards, but would still be a good product.

June 24, 2015 | 09:55 AM - Posted by YTech

Ryan,

The new AMD product seems promising and still in its early stages. There's room for improvement.

Comparing the specs with its competitor, have you attempted to overclock both cards to see if there are any comparable improvements vs. the competitor?

It appears that the memory frequency on the Fury X is pretty low. Not sure if that's a typo. They do have the bandwidth capability, but may be lacking on the frequency, which can be noticed in some games.

June 24, 2015 | 04:12 PM - Posted by BillDStrong

The frequency is correct; they get their bandwidth from the 4096-bit memory bus. Remember the memory and the chip are much closer together, so when you raise the speed of one, it heats up both parts. They also probably don't have the third-party drivers that would allow them to boost the voltage of the cards, which can limit the boost they can get.
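
As a sanity check on the wide-and-slow vs. narrow-and-fast point, here is the bandwidth arithmetic in Python. The clock and bus-width figures are the commonly published specs for the Fury X's HBM and the 980 Ti's GDDR5, quoted here as assumptions for illustration rather than numbers taken from this review.

```python
# Memory bandwidth: wide-and-slow HBM vs narrow-and-fast GDDR5.
# Figures below are commonly published specs, used here for illustration.

def bandwidth_gbs(bus_width_bits, data_rate_gbps_per_pin):
    return bus_width_bits * data_rate_gbps_per_pin / 8   # bits -> bytes

# Fury X: 4096-bit HBM at 500 MHz, double data rate -> 1 Gbps per pin
fury_x = bandwidth_gbs(4096, 0.5 * 2)       # 512 GB/s

# GTX 980 Ti: 384-bit GDDR5 at 7 Gbps effective per pin
gtx_980_ti = bandwidth_gbs(384, 7.0)        # 336 GB/s

print(f"Fury X HBM: {fury_x:.0f} GB/s")
print(f"GTX 980 Ti: {gtx_980_ti:.0f} GB/s")
```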

June 24, 2015 | 10:48 PM - Posted by YTech

As I said, still an early product. Maybe there will be third-party drivers that can provide some improvements, as you mention. But I agree in regards to the additional challenges due to the short distance between units.

Ryan went into more detail during the podcast in regards to the overclocking.

June 24, 2015 | 09:59 AM - Posted by Anonymous (not verified)

I am surprised you didn't use Witcher 3 as a test game. It definitely is a good benchmark for newer cards.

June 24, 2015 | 10:02 AM - Posted by obababoy

Witcher 3 doesn't have an actual benchmark for it though.

June 24, 2015 | 10:13 AM - Posted by Dark_wizzie

What about benchmarking it by playing a section of the game? I thought that was the default way to benchmark.

June 24, 2015 | 10:22 AM - Posted by obababoy

But it isn't exact or repeatable, especially with wandering AI, time of day, etc. I get what you are saying, but it wouldn't be accurate unless you did the same motion like 50 times for each GPU in roughly the same area and compared averages. In Witcher 3, just looking in a different direction sometimes changes my FPS by 10!

June 24, 2015 | 03:49 PM - Posted by arbiter

It's how they do it in Crysis 3: there is a set save they load and run through to a certain spot. There is going to be a difference each time, yes, but that is why you run the benchmark a few times and take an average. It's about the only way to do it.
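
For what it's worth, averaging repeated manual runs is simple to script; a minimal sketch is below, assuming you already have a per-run average FPS for each pass through the same saved-game section (the values shown are made-up placeholders).

```python
# Averaging repeated manual benchmark runs of the same saved-game section.
# The FPS values are made-up placeholders; substitute your own run results.
from statistics import mean, stdev

runs_fps = [62.4, 60.8, 63.1, 61.5, 62.0]   # average FPS from each repeated run

print(f"runs:     {len(runs_fps)}")
print(f"mean FPS: {mean(runs_fps):.1f}")
print(f"std dev:  {stdev(runs_fps):.1f}")   # a large spread means run-to-run variance is too high
```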

June 24, 2015 | 10:24 AM - Posted by Ciddo (not verified)

It's a little disappointing seeing the Fury X lose marginally to the 980 Ti at the same price point. However, I think the real card we have to wait for is the Fury. If the Fury card that comes out later this year performs within 10-15% of this with $100 off the price point, that will probably be the card to get.

When Nvidia released the 980 and the 970, the 970 was an amazing buy up until the whole 3.5GB memory issue came to light. If AMD can avoid that with the release of the air cooled Fury card, consumers can probably take better advantage of the HBM with their own water cooling solutions.

June 24, 2015 | 10:26 AM - Posted by Dark_wizzie

There's an interesting difference between the Fraps FPS and Observed FPS for 295x2 on Skyrim @ 4k. o.o

June 24, 2015 | 10:38 AM - Posted by Ryan Shrout

Yeah, AMD never fixed DX9 frame pacing...

June 24, 2015 | 11:17 AM - Posted by Dark_wizzie

Yeah... That ruled out Crossfire for me entirely. Not working with Skyrim is unacceptable. I'm also worried about the 4GB of VRAM for Skyrim, and it seems to be chugging a bit in GTA V at 4K. If there is a voltage unlock coming, I hope we get it soon, because as it stands the 980 Ti will overclock better.

Some guys on OCN are pointing out some hot VRM temperatures on the back of the card. I dunno if it's a problem or not. (The backplate remains cool, but under it the VRMs are supposed to be really hot. The plate isn't touching the VRMs, and there's just air in there acting as insulation more than anything, or so it is claimed.)

June 24, 2015 | 01:07 PM - Posted by Anonymous (not verified)

They fixed the CrossFire frame pacing on DX9 for lower resolutions, just not for 4K/Eyefinity, I think.

If you compare your tests from before they fixed anything (when FCAT was new) and now, the DX9 1080p CF results were fixed, I think.

June 24, 2015 | 10:33 AM - Posted by Searching4Sasquatch (not verified)

Free Canadian bacon with every Fury X!

June 24, 2015 | 11:11 AM - Posted by PCPerFan (not verified)

I'm super impressed with the improvements in power consumption, but that's the only thing I'm impressed with.

Performance trails the 980 ti - I don't know what AMD was thinking pricing this the same as the 980 ti as the underdog. No DVI, no HDMI 2.0, do not want.

If this was priced at $550, this would be a solid release.

June 24, 2015 | 11:16 AM - Posted by Dark_wizzie

Hopefully the Fury without the water cooling will have the same chip and improved driver optimization when it launches, to make it a more compelling option. Personally, I don't care about the HDMI support, but I'm running a Korean monitor and I need my DVI-D. This and some other things are pushing me towards the 980 Ti, TBH.

June 24, 2015 | 03:51 PM - Posted by arbiter

Power draw is improved, but a lot of that could be due to the water cooler. You can see in the 295X2 how keeping the GPU at a very cool temperature lowers power draw because of reduced leakage. So I would guess some of the power savings is due to that. We will know for sure when the non-water-cooled one comes out.

June 24, 2015 | 11:15 AM - Posted by Anonymous (not verified)

Fury X got REKT!

June 24, 2015 | 11:29 AM - Posted by Anonymous (not verified)

Not quite rekt; it's no Bulldozer. More like the old comparison between the 290X and 780 Ti (which, interestingly enough, has shifted heavily in the 290X's favor with current drivers and games - I wonder if the same will happen with the Fury and 980 Ti).

They definitely hyped it too much, but it's really not a bad card by any means. The biggest surprise is how much higher the Fury is in FLOPS than the 980 Ti, yet it delivers similar or slightly lower performance. Drivers? Memory limitations? Tessellation?
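
That FLOPS gap is easy to quantify from the published shader counts and clocks; the figures below are the commonly quoted reference specs, used here as assumptions for illustration rather than numbers from this review, with peak FP32 throughput taken as 2 FLOPs per shader per clock.

```python
# Peak FP32 throughput: 2 FLOPs (one fused multiply-add) per shader per clock.
# Shader counts and clocks are the commonly quoted reference specs,
# used here as assumptions for illustration.

def peak_tflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz / 1000

fury_x     = peak_tflops(4096, 1.05)   # ~8.6 TFLOPS
gtx_980_ti = peak_tflops(2816, 1.00)   # ~5.6 TFLOPS (base clock; boost is higher)

print(f"Fury X:     {fury_x:.1f} TFLOPS")
print(f"GTX 980 Ti: {gtx_980_ti:.1f} TFLOPS")
```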

June 24, 2015 | 11:57 AM - Posted by Anonymous (not verified)

And Nvidia is increasingly segmenting their gaming SKUs from their accelerator SKUs, while at least with AMD some number-crunching advantages can be had for around the same price point. I see where Nvidia is getting the power savings from: by stripping out the FLOPS/FP capabilities! So AMD's product provides more computational performance, should the newer gaming engines need it for physics and other enhancements, and AMD's continued internal improvements to Mantle, quickly provided downstream to Khronos and Vulkan (the public-facing version of most of Mantle's and others' API contributions), will allow Fiji the same improvements over time.

Really, the jury is still out on the Fury X and its derivatives until more complete testing on the newer graphics APIs. And what about AMD's continued internal Mantle developments that will make their way into the software stacks of gaming engines and games, through sharing with M$ for DX12, Khronos for Vulkan, and any special development sharing of Mantle with specific game makers for their products? The gaming comparisons alone are not enough at this early stage to totally dismiss AMD's competing products; this is just the first match in a series, and hopefully the prices will get better on both sides, so the consumer wins.

June 24, 2015 | 03:57 PM - Posted by arbiter

If you look at the 680/780 Ti/980 vs. the competing AMD part, the GFLOPS figure has generally been in AMD's favor by a bit most of the time. But in a lot of games that doesn't matter a bit.

June 24, 2015 | 04:42 PM - Posted by Anonymous (not verified)

A bit, sure, but I don't think the gap was ever this big. Makes me more hopeful about future driver improvements since there's so much raw power there.

June 24, 2015 | 05:16 PM - Posted by Anonymous (not verified)

Games not as much, but for other uses, and some non-gaming graphics usage, that extra processing power comes in handy. With Blender 3D getting support for Cycles rendering on AMD GPUs, things are about to change for low-cost 3D graphics projects, especially given the costs of the professional GPU cards that most independent users cannot readily afford. I'm looking at the future dual-Fury-based SKUs, as well as what pricing may happen around the professional/HPC APU workstation variant that AMD has in the works, based on Zen CPU cores and Greenland graphics sharing HBM on an interposer, for an APU-type workstation system on an interposer.

Nvidia's pricing is way beyond what Abdul-Jabbar could reach with a rocket-assisted skyhook, for users that need all those GFLOPS without the drained bank accounts.
It was Fury that brought on that lower pricing, and Fury does not have bitcoin mining to keep the costs high; in fact, I see a relatively quick price drop on the Fury SKU, more so than with the previous generation.

A price war is where things are heading, more so from AMD's side, which needs to gain larger market share and the extra revenues that come with it, even at the expense of large profits. Large sales volumes (revenues) and more market share can make up for lesser profit margins and produce a better economy of scale for AMD with its suppliers; those larger unit volumes can get bigger bulk materials savings from AMD's suppliers, and that will eventually bring the cost of HBM in line with, or even below, GDDR5.

June 24, 2015 | 11:22 AM - Posted by aurhinius (not verified)

Seems like Bulldozer to Excavator again: HBM slapped on a larger but failing architecture. It doesn't matter how fast the memory is if you can't get the core right. That might explain the use of the interposer. I don't think this is the card AMD intended to release, but perhaps the 20nm failure forced the issue and placing HBM on an old core was the only choice.

The similarity to the CPU situation is uncanny, I feel.

June 24, 2015 | 11:29 AM - Posted by Master Chen (not verified)

So much paid shill liar BS in this "review" article that it's outright laughable. More than 80% of the most respected and most accurate hardware reviewing sources out there made clear reports that the stock Fury X beats the living SHIT out of the 980 Ti in 8 cases out of 10 (and even manages to beat the Titanic X in 6 gaming tests out of 10 while not even being overclocked), losing noticeably to the 980 Ti and Titanic X only in heavily Nvidia-biased and GayWorks-gimped titles.

PcPer is pretty much done for, at least for me personally. I prefer my sources to be accurate, unbiased, and truthful to the very end. This here "article" has clearly shown me that PcPer is a no-good source for hardware testing reviews AT THE VERY LEAST.
