Rounding out the Fiji reviews, Fury X on tour

Subject: Graphics Cards | June 25, 2015 - 02:42 PM
Tagged: 4GB, amd, Fiji, Fury, fury x, hbm, R9, radeon

[H]ard|OCP used a slightly different configuration to test the new R9 Fury X: an i7-3770K and an ASUS PB287Q, as opposed to our i7-3960X and ASUS P9X79. The SSD is slightly different, but the RAM remains the same at 16GB of DDR3-1600. [H] also used the same driver we did and found similar difficulties using it with R9 2xx cards, which is why those cards were tested with the Catalyst 15.5 Beta instead. When testing The Witcher 3, the GTX 980 Ti came out on top overall, but the Fury X's 70% performance increase over the 290X with HairWorks enabled is worth noting. Their overall conclusions matched what Ryan saw; read them for yourself right here.


"We review AMD's new Fiji GPU comprising the new AMD Radeon R9 Fury X video card with stacked chip technology High Bandwidth Memory. We take this video card through its paces, make comparisons and find out what it can do for us in real world gameplay. Is this $649 video card competitive? Is it truly geared for 4K gaming as AMD says?"


Source: [H]ard|OCP



June 25, 2015 | 03:07 PM - Posted by Terry Suave (not verified)

Hello Jeremy,

I'd like to know whether there's any truth to this post on reddit: http://www.reddit.com/r/pcmasterrace/comments/3b2ep8/fury_x_possibly_rev...

Although it was Ryan who actually wrote the review, I'd like to know: if this post is true, does it impact your opinion of the Fury X?

June 25, 2015 | 03:46 PM - Posted by Jeremy Hellstrom

We aren't done testing it; there are several remaining questions, with drivers being one area of investigation.

June 28, 2015 | 12:39 AM - Posted by Anonymous (not verified)

AMD debunked the "wrong driver" rumor.

https://www.reddit.com/r/hardware/comments/3b518c/amd_rep_denies_wrong_d...

June 25, 2015 | 03:12 PM - Posted by Anonymous (not verified)

Plus or minus 5fps per game is still pretty much an even match, and it will only get better with driver maturity. Give it a month or two and revisit; I'm positive you will see the card beating the 980 Ti.

June 25, 2015 | 03:15 PM - Posted by Anonymous (not verified)

Fury X running near silent at 50C under full load is worth the negligible fps difference. In comparison, the 980 Ti runs hot and is loud, imo.

June 25, 2015 | 03:53 PM - Posted by kenjo

Is it silent? I haven't really seen any data supporting that. I currently have a dual (CrossFire) 7970 setup that is LOUD.

If I could get the same performance out of an R9 Fury X and have it be substantially less noisy, I would jump at it.

June 25, 2015 | 04:00 PM - Posted by Anonymous (not verified)

Compared to 2x 7970, it's much quieter.

http://techreport.com/r.x/radeon-r9-fury-x/noise-load.gif

June 25, 2015 | 04:18 PM - Posted by kenjo

Well, not by much, according to this.

http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,13.html

June 26, 2015 | 07:35 AM - Posted by Anonymous (not verified)

The 980 Ti Hybrid is totally different, AND A FAIRER COMPARISON. It costs about the same as a normal Ti now because of the Nvidia price drops, and it overclocks reliably to 50% over stock, lol, so 1000MHz to 1500MHz! And it DECIMATED THE FURY X that no one can buy, lol.

Newegg confirmed it received 100 units for launch. In Australia the biggest retailer, Umart Online, hasn't even made a listing, as it's not here AT ALL, lol.

June 25, 2015 | 03:28 PM - Posted by Titan_V (not verified)

Everyone is ignoring the $100-200 you'll save with FreeSync vs. G-Sync when you go Fury. There is value there.

June 25, 2015 | 03:52 PM - Posted by funandjam

Yes, but you give up frame multiplication when the framerate dips below the lower end of a monitor's VRR window. Until AMD implements this on their cards, G-Sync will remain the better solution.

June 25, 2015 | 04:07 PM - Posted by Titan_V (not verified)

My point is not to compare them. People will likely be looking at total outlay and will be updating monitors. $100-200 matters.

June 25, 2015 | 04:38 PM - Posted by funandjam

You have to compare them; the difference in the quality of the experience is well worth it. Quite a few reviews have shown that when FreeSync drops below the VRR window, the transition is quite jarring.

June 25, 2015 | 05:48 PM - Posted by Terry Suave (not verified)

I don't really think that is a very valid argument. If you have the horsepower to drive a high-quality display such as an ROG Swift, it seems to me that you shouldn't be dropping below its window very often if you've tuned your settings properly.

I've had a FreeSync monitor for about a week now (Acer XG270HU) using an R9 290, and I haven't noticed any harsh transitions, though I think I've only been playing in the range of ~60-144 in games like GTA V and Payday 2.

June 25, 2015 | 06:46 PM - Posted by Titan_V (not verified)

Precisely. The point that seems to elude funandjam is that both technologies improve the gaming experience markedly, and thus, if you're going to have to pick a lane, the benefit with FreeSync/Fury is the $100-200 savings PER MONITOR. That is significant. funandjam, you can play the snob card here and say that it is worth it, and that is your opinion, but LOTS of folks will argue that it isn't. If your goal is to run triple monitors, then at $100-200 saved per display you just about bought yourself a second Fury with the savings.

It *IS* an important point and I think it is negligent on the part of the reviewers to fail to bring it to the forefront.

June 25, 2015 | 09:35 PM - Posted by Anonymous (not verified)

You can pick up G-Sync monitors for under $300. The price isn't as astronomical as you're making it out to be.

June 25, 2015 | 09:44 PM - Posted by Anonymous (not verified)

Quality means a lot to some people, and most people already spending $600-1200 on a GPU and monitor alone will consider the extra $100-200 for a noticeable increase in quality. A friend of mine replaced his 290 with a GTX 970 after borrowing my 970 for just a few days. He noticed something I can't relate to; I haven't had AMD since my HD 4670, which I LOVED. Whenever I hear people defending AMD nowadays, I can only think they have never tried the other side.

June 26, 2015 | 03:35 AM - Posted by Anonymous (not verified)

All those people that complain in the GeForce Forums certainly aren't defending AMD.

June 26, 2015 | 10:58 AM - Posted by Anonymous (not verified)

Compare the AMD and Nvidia forums and Nvidia's is overflowing with complaints such as BSODs, artifacts, and hard locks.

June 27, 2015 | 07:50 AM - Posted by renz (not verified)

So for people who spend this kind of money, would they care about the savings they can get when buying monitors?

July 2, 2015 | 12:54 PM - Posted by Rroc (not verified)

That's a good point. People in this market tend to just go for it and not care about the money. However, a very friendly number such as a $100 or $200 saving can sway people toward AMD's side, because they are thinking they are saving $100, $200, $300, or $400 depending on the number of monitors they buy. These are very easy numbers to grasp and may have a big enough impact.
