AMD Radeon Fury X Graphics Card Pictured, Uses 2 x 8-pin Power

Subject: Graphics Cards | June 13, 2015 - 02:31 AM |
Tagged: Skylake, leak, hbm, fury x, Fury, Fiji, amd

You just never know what's going to come your way on Facebook on a Friday night. Take this evening: there I was, sitting on the laptop minding my own business, when up pops a notification about new messages on the PC Perspective Facebook page. An anonymous user asks, very simply, "do you want pictures of skylake and r9 fury x".

With a smirk, knowing that I am going to be Rick-rolled in some capacity, I reply, "sure".

Well, that's a lot more than I was expecting! For the first time that I can see, we are getting a full view of the upcoming AMD Fury X graphics card with the water cooler installed. The self-contained water cooler that will keep the Fiji GPU and its HBM memory at reliable temperatures looks quite robust. Morry, one of our experts in the water cooling field, is guessing the radiator thickness to be around 45mm, but that's just a guess based on the images we have here. I like how the fan is inset into the cooler design so that the total package looks more svelte than it might actually be.

The tubing for the liquid transfer between the GPU block and the radiator is braided heavily, which should protect it from cuts and wear as well as help reduce evaporation. The card is definitely shorter than other flagship graphics cards, and that allows AMD to route the tubing out the back of the card rather than out the top. This should help in smaller cases where users want to integrate multi-GPU configurations.

This shot shows the front of the card and details the display outputs: 3x DisplayPort and 1x HDMI.

Finally, and maybe most importantly, we can see that Fiji / Fury X will indeed require a pair of 8-pin power connections. That allows the card to draw as much as 375 watts in total, but that doesn't mean that will be the TDP of the card when it ships.
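For reference, that 375 watt ceiling is simply the sum of the standard PCI Express power delivery limits: 75 W from the x16 slot plus 150 W per 8-pin connector.

\[
P_{\max} = 75\,\mathrm{W} + 2 \times 150\,\mathrm{W} = 375\,\mathrm{W}
\]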

Also, for what it's worth, this source did identify himself to me and I have no reason to believe these are bogus. And the name is confirmed: AMD Radeon Fury X.

Overall, I like the design AMD has gone with for this new flagship offering. It's unique, it will stand out from the normal cards on the market, and that alone will help get users' attention, which is what AMD needs to make a splash with Fiji. I know that many people will lament the fact that Fury X requires a water cooler to stay competitive, and that it might restrict installation in some chassis (if you already have a CPU water cooler, for example), but I think ultra-high-end enthusiasts looking at $600+ GPUs will be just fine with the configuration.

There you have it - AMD's Fury X graphics card is nearly here!


June 13, 2015 | 02:34 AM - Posted by Rafmaninoff (not verified)

That fan looks very interesting indeed...

June 13, 2015 | 02:49 AM - Posted by Anonymous (not verified)

It looks exactly like the Nidec Gentle Typhoons; it even has the little support ring from the high-RPM Typhoons. An excellent fan with enterprise build quality and a price to match, but as far as I know, Nidec hasn't produced them in a while. Looks like AMD isn't messing around.

June 13, 2015 | 03:04 AM - Posted by Anonymous (not verified)

because when I think video card, I think $30 case fan attached to a radiator.

June 13, 2015 | 03:41 AM - Posted by Anonymous (not verified)

I'm not really sure what the problem with it is. It's a great fan whether on a case or on a radiator, and if using a $20 fan instead of a $5 fan makes the card cooler and/or quieter, then why not? Should last a while as well.

June 13, 2015 | 09:33 AM - Posted by mouse032 (not verified)

Bought 2 OEM GT1850s, only $14 x 2. For AMD, maybe $5~$6 each.

June 13, 2015 | 03:42 AM - Posted by Anonymous (not verified)

I've read that the fans just aren't sold to consumers directly. If you have a corporate account they are still available to order.

June 13, 2015 | 10:51 AM - Posted by skline00

They are being made. Check with Cooler Guys to buy the OEM version.

June 15, 2015 | 02:29 PM - Posted by Anonymous (not verified)

Defiantly isn't a GT.

June 15, 2015 | 02:31 PM - Posted by Anonymous (not verified)

definitely*

Thanks auto correct.

June 15, 2015 | 03:38 PM - Posted by Anonymous (not verified)

Nidec has always produced them ;) they are the Gentle Typhoon High Speeds, it's just that they are no longer sold at retail since Scythe pulled the contract. You can still buy the whole Gentle Typhoon range from Nidec if you order a few thousand posing as an OEM.

June 13, 2015 | 04:52 AM - Posted by hyp36rmax (not verified)

That fan is a Cooler Master Silencio FP 120, the one on the Nepton 120XL, 240M and Silencio 652S chassis; it's a high static pressure fan.

June 13, 2015 | 04:54 AM - Posted by hyp36rmax (not verified)

or maybe not... :)

June 13, 2015 | 05:04 AM - Posted by hyp36rmax (not verified)

Here's my Gentle Typhoon AP29 for comparison:

http://www.overclock.net/t/1560118/pcper-amd-radeon-fury-x-graphics-card...

June 13, 2015 | 08:25 AM - Posted by Map (not verified)

No, it is a high speed Scythe Gentle Typhoon (http://www.scythe-eu.com/en/products/fans/gentle-typhoon-120-mm-high-rpm...). Look at the support ring, and at least one photo of the Fury X with a low speed Typhoon exists (http://www.guru3d.com/news-story/amd-radeon-fury-x-photos-appear-online....).
So it will be at least a 3000 RPM fan.

June 16, 2015 | 06:20 PM - Posted by hyp36rmax

I'll just leave this link here, from the AMD announcement at E3 2015:

http://cdn.videocardz.com/1/2015/06/AMD-Radeon-Fury-radiator.jpg

Sure looks like the final version will have a Cooler Master Silencio FP 120 ;)

June 13, 2015 | 02:56 AM - Posted by Anonymous (not verified)

1st pic: Why is his shoe blocked out? Did he have a hole in it exposing his toes?

3rd pic: I'm guessing the leaker's name is Barry Allen? That would explain the hole in the shoe.

June 14, 2015 | 04:44 AM - Posted by Anonymous (not verified)

Shoes are used to identify suspects in crimes. Every shoe is quite distinctive and can expose a person, especially the soles and the print they leave...

This guy blocked his shoes so they couldn't ID the leaker. He could lose his job instantly for opening one of the boxes.

June 13, 2015 | 03:08 AM - Posted by Anonymous (not verified)

No VGA port? How will I use this with my CRT? Active adapters don't even support the full res and refresh rate of a good CRT

June 13, 2015 | 05:36 AM - Posted by Anonymous (not verified)

Nobody cares. VGA is dead, CRT is dead, get with the times. Only a very small minority of customers for this card would be using a CRT.

June 13, 2015 | 05:54 AM - Posted by Anonymous (not verified)

And only a very small minority purchase cards like these. What's your point?

June 13, 2015 | 10:53 AM - Posted by Anonymous (not verified)

My point is that a small minority of a small minority is a very small number of people. There's basically no reason for AMD to put VGA on this card, because most customers are going to use digital interfaces, mainly DP.

June 13, 2015 | 05:03 PM - Posted by Anonymous (not verified)

And very few people truly care about graphics and displays (just look at how many people run rubbish 1366x768 TN panels with integrated graphics). This card is aimed at the few who do.

I would like to get with the times, but first, monitor manufacturers need to give me something worth buying. After 15 years, there are still no monitors that are clearly better than the GDM-FW900. Everything involves significant compromises somewhere.

I could understand no VGA if they made this card single slot, but as it is, AMD has an entire slot utterly wasted. It isn't even being used for exhaust. Nvidia manages to have VGA support even on air-cooled cards that benefit from exhaust ports.

June 14, 2015 | 08:06 AM - Posted by Anonymous (not verified)

And what would need to be exhausted from there? All the cooling is being done by the radiator, and it's not as if those holes in the bracket are gonna make any actual difference anyway....

June 13, 2015 | 08:59 AM - Posted by Anonymous (not verified)

Contemporary GPUs already don't support D-SUB over their DVI ports; they support digital only, so this is not something you can bash the Fury for.

June 13, 2015 | 05:44 PM - Posted by Anonymous (not verified)

That was only really the 290(X). The more recent AMD 285 and all current Nvidia cards still support DVI-I.

June 13, 2015 | 04:24 PM - Posted by Anonymous (not verified)

also no S-VIDEO/Composite out... very disappointing

June 13, 2015 | 03:11 AM - Posted by Anonymous (not verified)

Crazy how small that looks, just glancing at my GTX770. Looks like it might be time for an upgrade.

June 13, 2015 | 03:26 AM - Posted by Anonymous (not verified)

My body is ready.

June 13, 2015 | 03:27 AM - Posted by pdjblum

Shit, does this mean you don't have one in house for testing? I was hoping to see a review the day of the release and figured you were under an NDA. Could it be that sharing these photos is somehow not precluded under the NDA, which is what I am hoping for?

June 13, 2015 | 03:40 AM - Posted by Anonymous (not verified)

Fuck Yeah!!!!!!
Single slot water cooled card for custom loops.
I love how AMD always have single rowed I/O for flagship cards. You can always get a single slot bracket.
Hell Yeah!!!!

June 13, 2015 | 03:48 AM - Posted by BlackDove (not verified)

A tiny graphics card with a giant radiator is a ridiculous product.

June 13, 2015 | 03:56 AM - Posted by whatever

Traditionally, card sizes are determined by heat output: a more powerful GPU generates more heat, hence a bigger cooler to cool the chip down, and a bigger cooler leads to a bigger overall card.
Another factor is memory chips. High end cards usually have more memory, hence more memory chips to fit on the PCB, so they usually have a bigger PCB.
Those two factors make high end cards HUGE.

But Fury doesn't have this problem: all its memory is mounted on the same package as the GPU, which makes the PCB smaller. Also, having a standalone radiator eliminates the need for a big onboard cooler.

That's why it's so small.

Get your facts straight before complaining.

June 13, 2015 | 09:04 AM - Posted by Anonymous (not verified)

I don't think the cooler size dictates the graphics card size much; the cards are usually crowded with components. High-end cards usually have a lot of memory chips which take up a lot of space, as you said. The other component that consumes a lot of board space is the power regulation circuitry. The extra power connectors supply 12 V, and this needs to be converted to the low voltage/high current required by the GPU. The VRMs needed to supply ~300 W will take a lot of space. This is what will be on the Fiji board: the interposer package and the power delivery circuitry.

June 13, 2015 | 11:29 PM - Posted by BlackDove (not verified)

I have the facts on this card:

4GB of RAM on a flagship card while the competition has 6-12

Small card with a massive water loop, making it take up more space than conventional vapor chambers

2x8 pin power meaning it probably uses plenty of power

This is the new 7970. GP100 is the next GK110. AMD has 3D RAM first. Intel (Knights Landing) and Nvidia aren't rushing to release it until it's ready.

Having a tiny PCB with a 70x70mm chip that requires a huge radiator is kind of pointless, like I said. Get your engineering and design right.

June 14, 2015 | 08:12 AM - Posted by Anonymous (not verified)

1. Vram is all there is to a video card I take it?
2. And also more silent and better cooled. Almost any case has room for a single 120mm radiator, especially the cases bought by the audience for this product.
3. So what? This isn't a laptop. You're buying a top of the line GPU, I'm not assuming you can't pay your electricity bills
4. The 7970 was a great card, so what's your point?
5. Thanks for sharing your knowledge, oh great and knowledgeable engineer with years of experience in the field and who obviously knows more than the people who've been working on it.
/rant

June 14, 2015 | 10:43 PM - Posted by BlackDove (not verified)

First, VRAM hasn't been used on GPUs for about 15 years. It's GDDR, or HBM in this case.

And no, it's not all there is to it. I never said it was.

My point about the 7970 is that it was a big chip with a 384 bit bus like GK110, but when GK110 came out a few months later it was significantly more powerful. I think something similar will happen here: AMD did 3D RAM first, Nvidia will do it better.

You're welcome.

June 15, 2015 | 09:33 AM - Posted by Silver Sparrow

It's been stated that Nvidia will be using HBM from SK Hynix, the very memory modules that AMD co-designed. I guess HMC is off the cards for Nvidia, at least for the time being.

You're welcome.

p.s. There's being a fanboy, being smug and then being both at the same time. All over some logic gates... you go you!

June 15, 2015 | 01:03 PM - Posted by BlackDove (not verified)

I was replying to someone being smug, so I don't care about being polite.

What was I wrong about, exactly?

June 13, 2015 | 06:41 AM - Posted by Anonymous (not verified)

Laughable really. Compare that to the beautiful GTX TITAN.

June 13, 2015 | 03:35 PM - Posted by Gerard (not verified)

Would that be the same card that thermally throttles and has a loud cooler?

June 15, 2015 | 04:42 PM - Posted by Anonymous (not verified)

When it was AMD with the card that thermally throttled while using a really loud reference cooler, Nvidia fanboys laughed and mocked.

But now that it's Nvidia's card that throttles under a loud reference cooler, it's perfectly okay.

June 14, 2015 | 03:59 PM - Posted by Goldmember (not verified)

Yea for sure, but not half as ridiculous as your comment.

June 13, 2015 | 04:05 AM - Posted by Anonymous (not verified)

Where are the skylake pics then?

June 13, 2015 | 04:13 AM - Posted by iSkylaker

Where is the Skylake part?

June 13, 2015 | 04:43 AM - Posted by Ryan Shrout

The Skylake image(s) I have didn't turn out to be nearly as exciting... Just a bare processor.

June 13, 2015 | 04:30 AM - Posted by arbiter

The bad thing about 2x 8-pin: for AMD that could mean the card's power draw can be anywhere between 300 watts and 600 watts. Case in point, the R9 295X2, which has 2x 8-pin as well.

June 13, 2015 | 07:03 AM - Posted by Anonymous (not verified)

Might need to check your figures again, boss. (1×75W + 2×150W)

June 13, 2015 | 07:24 AM - Posted by JohnGR

He is a known Nvidia fanboy, he doesn't have to check anything. He just throws mud.

June 13, 2015 | 08:00 AM - Posted by arbiter

It's called knowing my hardware, smartass. You call me an Nvidia fanboy, but it's sad I know more about your AMD hardware than your AMD fanboy ass does.

June 13, 2015 | 08:45 AM - Posted by Anonymous (not verified)

What is sad is you're an Nvidia fanboy that trolls multiple forums with the exact same post.

June 13, 2015 | 08:53 AM - Posted by arbiter

Logic and Truth are on my side; what do you have? AMD Marketing?

June 13, 2015 | 06:46 PM - Posted by Anonymous (not verified)

Logically speaking, 375 is between 300 and 600, so your basic math is correct. Good job.

June 13, 2015 | 09:41 AM - Posted by Anonymous (not verified)

They may be an Nvidia employee. I have worked at tech companies where some of the sales guys admitted to getting on forums and stirring up FUD about the competition in their spare time. It isn't just fanboys with some misguided sense of brand loyalty.

June 13, 2015 | 07:48 AM - Posted by arbiter

As I pointed out, if you could read: AMD has used 2x 8-pin to power a 600 watt GPU. So just because the spec says 150 watts, AMD has pulled a lot more from it.

If you can count, you see 2x 8-pins on that card. That card draws at least 500 watts under load, and can draw as much as 600 watts under max load.

http://images.bit-tech.net/content_images/2014/04/amd-radeon-r9-295x2-re...

June 13, 2015 | 09:38 AM - Posted by Anonymous (not verified)

That is a dual gpu card which makes your point completely irrelevant.

June 13, 2015 | 10:09 AM - Posted by Martin Trautvetter

Nah, he's actually correct in pointing it out. AMD has ignored PCIe power specs before, so taking the connectors on the board and coming up with max power from that doesn't work for them anymore.

June 13, 2015 | 10:38 AM - Posted by Anonymous (not verified)

BS. He is trying to imply that Fiji could take up to 600 W which is ridiculous FUD. It is pretty obvious to me that there is no way that this single gpu is going to consume anywhere near the amount of power that the 295x2 can draw. I would not be surprised if Fiji gets close to 375 though. The Titan X is a 250 W card with 3072 shaders. Fiji is supposed to have 4096 shaders. With 1024 more shaders, it isn't going to be lower power than the Titan X. It also looks like it has a lot more TMUs, ROPs and other hardware than the Titan X. I would say most gamers will not care about 250 W vs. 375 W if the performance is there.

June 13, 2015 | 07:53 PM - Posted by Anonymous (not verified)

your the BSer here.. Fiji CAN take up to 600w, and there have been plenty of single GPU cards in the past that draw 450-500+ watts when overclocked and running on a "we don't give a shit about power consumption or PCI-E standards" mentality like AMD does. Fiji will do more than get "close" to 375 watts; that will be its BASE TDP, as in, once overclocked and put under load it'll hit MORE than 375w. Even if its TDP is only 300, which would be physically impossible for AMD to do as their stream processors are nearly 50% less efficient than a CUDA core, powering 4096 of those inefficient things will easily draw 325-350 watts base TDP at least, with 375 being a fairer number. And don't forget this is the reference board; custom AIB designs with more power phases etc., like a Vapor-X or an XFX Double D, will be using over 400w in some cases. Plus you have to factor the wattage of an AIO liquid cooler into the TDP of the card as well, so that's another few dozen watts added in under stressful load.

This card could easily hit 450-500w when overclocked under full load. You AMD fanboys seem to forget that TDP != load wattage; TDP is simply the baseline reading for a moderate load on the card. For example, the GTX 970 has a TDP of ~160w, but when you put it under full load, even with no overclock, you hit 250-300w; when OC'ed that goes up to 325-350ish. Whereas the Titan X/980 Ti can hit ~400w under OC and heavy load. So even if the Fury X hits 500-600w at max load, it's not any different from AMD's past TDP designs.

June 13, 2015 | 07:53 PM - Posted by Anonymous (not verified)

you're*

June 13, 2015 | 10:51 PM - Posted by Anonymous (not verified)

Prove it. I have seen no evidence that Nvidia's design is actually inherently lower power. The Titan X is not on the same process tech as the 290X was, so they are not directly comparable. Current rumors say around 300 W with the new process for Fiji. If you are overclocking, then you obviously are not staying within the preset limits, by definition.

June 13, 2015 | 11:32 PM - Posted by BlackDove (not verified)

They're not both TSMC 28nm?

June 14, 2015 | 01:04 PM - Posted by Anonymous (not verified)

I have read that Nvidia did work with TSMC to optimize the process. I am not sure about the specifics; the information is hard to find. I have seen some indications that the 290X uses 28HPM while the 980 uses the 28HPC process variant. I believe there are 4 or 5 different 28 nm process variants available from TSMC, but not all of them would be suitable for building a GPU.

June 15, 2015 | 09:45 PM - Posted by Anonymous (not verified)

To be honest, that works out at about £30 a year for 25 hours a week. If the difference in price were, say, £200, then you would be looking at 10 years to recoup the difference; it hardly seems a financially viable thing to do.
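For anyone wanting to sanity-check that sort of running-cost figure, here is a rough worked example; the ~150 W power-draw difference and the ~£0.15/kWh electricity price are illustrative assumptions, not numbers from the comment above:

\[
150\,\mathrm{W} \times 25\,\tfrac{\mathrm{h}}{\mathrm{week}} \times 52\,\mathrm{weeks} \approx 195\,\mathrm{kWh/year}, \qquad 195\,\mathrm{kWh} \times £0.15/\mathrm{kWh} \approx £29/\mathrm{year}
\]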

June 13, 2015 | 10:01 AM - Posted by JohnGR

2x 8-pins = 600W
Logic is on your side LOLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLL!

June 13, 2015 | 12:52 PM - Posted by Anonymous (not verified)

Haha yeah that guy really tries to look smart but the stupidity is just stronger

June 13, 2015 | 11:18 AM - Posted by overlord (not verified)

LOL, so much for accurate measurements! Tom had it running at much less: http://www.tomshardware.com/reviews/graphics-card-power-supply-balance,3...

500W PSU http://www.bequiet.com/en/powersupply/527

June 13, 2015 | 01:53 PM - Posted by Ryan Shrout

I have seen the R9 295X2 with 2x 8-pin power connectors hit over 425 watts of power draw, measured directly at the GPU: http://www.pcper.com/image/view/54167?return=node%2F62575

But nothing like 600 watts.

June 13, 2015 | 09:37 AM - Posted by Anonymous (not verified)

Well, if you want 4096 shader cores (compared to 3072 on the Titan X), you're going to have to be willing to deal with the power requirements and heat output to run them. Comparing this card to the 295X2 is ridiculous though. The 295X2 has two large GPUs (2816 shaders each) plus 4 GB of GDDR5 each. There is no way that a single Fiji board will consume that much power. I doubt a single Fiji board will go over the 375 W limit. I suspect the water cooler is needed because of the small size of the card. The Titan X is gigantic, so it has a lot of space to get the heat out into the air. This card looks like it might be half the size. It would be hard to fit an air cooler in such a small space that could dissipate even 250 W. If you mounted a blower the size of the one on the Titan X, the blower would take up most of the space and only leave a very small area for heat exchange.

June 13, 2015 | 10:10 AM - Posted by JohnGR

I wouldn't compare cores from different architectures.

The water cooler is probably a decision made after seeing what happened with Hawaii. Hawaii was killing Nvidia cards in performance, but the GREEN/D NVIDIA press found the excuse they needed to direct reader's attention to the cooler. In a Hi End card's review this was THE FIRST TIME that performance was something secondary. Completely and utterly ridiculous.

The double standards the GREEN/D NVIDIA press has been using the last two years are probably the reason why AMD chose to show the card first to the public and then give it to reviewers. Reviewers that are crying foul now all over the internet because we are talking about AMD. When we are talking about Nvidia, fake specs are not important, and miscommunication FOR 6 MONTHS in a company like Nvidia is completely logical and expected and justified.

June 13, 2015 | 10:57 AM - Posted by Anonymous (not verified)

I think the water cooler is necessary because of the size of the card. It will be interesting to see comparison pictures with this next to a Titan X or 980 Ti. If you used a blower like Nvidia does, it would take up most of the space on the card. There would be very little space for the actual cooling fins in a two slot card. How big is a CPU cooler that can dissipate ~300 W? The 980 Ti is a huge card. If you look at it without the shroud, the blower takes up about half the card with the other half being the heat exchanger cooling fins. An air cooler simply would not have worked on such a small card. They might be able to do it by extending the blower beyond the end of the card and only having cooling fins on top of the card. At this price range though, water cooling makes a lot of sense. It is quieter, more flexible for case size, and it cools better.

June 13, 2015 | 12:12 PM - Posted by JohnGR

Yes. But AMD could easily show an air cooled one as a reference card, and yes by extending the cooler.

AMD has changed the way they look at their reference cards. In the past they were giving a just-OK version, so that they left their partners room to create something that would not only look better, but also be built and perform much better. That way their partners could charge a little extra and have better margins. That was typical in graphics cards for ages.

I believe that after seeing how the press used the cooling system's mediocre performance as a way to make the best performing card look inferior to other cards, they changed their attitude, something we already saw with the 295X2. AMD now makes a top card as a reference card, not just an OK card. That probably limits profit margins for their partners, but it is preferable to have lower profit margins on something you can sell than bigger profit margins on something that got a very negative response from the press.

PS. Did you notice how those two 8-pin connectors made it so easily into the title? You have a new small form factor, water cooled, hi-end card that is meant to go into cases with PSUs of many hundreds of watts if not 1000+, and what is the most important thing in the world? Those two 8-pin connectors. There are GTX 980 cards with two 8-pin connectors to offer more power and at the same time more room for overclocking, but here we treat those two 8-pin connectors as something negative, like we are talking about a mid range card.

June 13, 2015 | 01:56 PM - Posted by Ryan Shrout

It's in the title because it's the only NEW information really provided in these photos.

But you are correct, I hope, that AMD has been rethinking its reference designs and is building something higher quality and with more care than they did on the 290/290X launch.

June 13, 2015 | 03:09 PM - Posted by JohnGR

10 days ago we had this picture that shows the 2x 8-pin connectors:
http://cdn.videocardz.com/1/2015/06/AMD-R9-FURY-vs-GTX-980-Ti-PCB-compar...

The next was posted at least 3 days ago and shows the two 8pin connectors clearly.
http://i.imgur.com/OI8HOfA.jpg

Anyway, the fact that you are a hardware site doesn't mean that you spend 24/7 googling everything, so you probably haven't seen those.

Waiting for your review. I know it will be well written; I hope it will also be fair.

One last thing
http://www.pcper.com/news/Graphics-Cards/CES-2015-EVGA-Shows-Two-New-GTX...
EVGA Classified Kingpin

2x 8-pin connectors + one 6-pin connector for a card that doesn't need more than 2x 6-pin connectors. IF (big if) Fury beats the 980 Ti, I am expecting to see many reviews with OCed 980 Tis in their charts.

June 14, 2015 | 07:40 PM - Posted by Snow_Eater (not verified)

It's not even just the manually overclocked 980 Tis I would worry about. Ever since Nvidia introduced their GPU Boost tech, there have been cases where I rolled my eyes at benchmark comparisons. A really good benchmark comparison site is Anandtech Bench, but if you compare a 290 or 290X to a 970, the 290(X) cards are reference cooled/clocked while the 970 is an EVGA FTW card with an out of the box boost clock of 1410MHz on the stock BIOS (at least for Anandtech's review sample). Granted, the reference cooled Hawaii GPUs don't have a lot of overclocking headroom unless you get a golden chip, and testing an Nvidia card with boost left on is completely fair because that's how the card works out of the box. I find AMD's PowerTune technology to be really good and probably more advanced than Nvidia's GPU Boost, but they really need to adopt the overclocking nature of GPU Boost into PowerTune. Even though it ruins some of the fun of overclocking, I think AMD really needs to go in that direction.

June 14, 2015 | 04:33 PM - Posted by Goldmember (not verified)

Hi Ryan!

Out of curiosity, what proof is there that AMD's reference cooler for Hawaii isn't as good as Nvidia's? Yes it revs higher, but that is only natural, as Hawaii is quite a bit smaller than, for example, GK110, while at the same time drawing more power.

That being said, I haven't seen a test that confirms that Nvidia's cooler would do a better job at that (cooling Hawaii, that is). One might argue that the third party coolers succeed in keeping it cooler, but they are larger, and the trend is still there: it's harder to keep Hawaii's temperatures down.

Other than that, keep up the good work! :)

June 13, 2015 | 06:38 PM - Posted by Anonymous (not verified)

I don't see why you would want an air cooler. How big are CPU coolers that can dissipate ~300 W? Water cooling just makes good design sense for GPUs. Due to the current form factor, GPU coolers are limited to a tiny area while CPUs have a huge area with a case exhaust fan right behind it. Nvidia has already shown their future plans for a mezzanine type connector rather than a slot connector. We will get new form factors, but the current form factor is still skewed towards CPUs being the most important component, which is no longer the case. For gaming, I would not even bother overclocking the CPU unless I was trying to use a really low end CPU.

There is almost certainly some marketing at play here. AMD wants this to be seen as a revolutionary product, which it is. Extending the cooler to do air cooling would make it look just like every other card on the market. This design, with its unique look and small size gets the point across that this is something different.

June 13, 2015 | 01:09 PM - Posted by Martin Trautvetter

"the GREEN/D NVIDIA press found the excuse they needed to direct reader's attention to the cooler. In a Hi End card's review this was THE FIRST TIME that performance was something secondary. Completely and utterly ridiculous."

I fear you might be misremembering, maybe you should check out a certain Mr Shrout's articles on the matter?

What happened was that AMD shipped a reference card with insufficient cooling to retain the initial clock speeds for any length of time. Hence, performance suffered if the card was used for more than a few minutes at a time, and any testing that enabled the card to cool down between benchmark runs produced results not indicative of real game play.

As such, performance was very much at the centre of the issue.

June 14, 2015 | 03:50 AM - Posted by Anonymous (not verified)

You mean like Titan X throttling?

June 14, 2015 | 09:13 AM - Posted by Martin Trautvetter

I can't seem to remember the Titan X being a thing when Hawaii hit the market in late 2013.

June 15, 2015 | 04:53 PM - Posted by Anonymous (not verified)

I think the point was that the Titan X, today, is experiencing the same issue that Hawaii experienced in late 2013 - thrashing the competition but running really hot and thermally-throttling under a very loud reference cooler - and yet the tech press is either glossing over that fact or ignoring it completely, instead of making it the entire focus the way they did with Hawaii reviews.

IOW: for Hawaii it was, "OMG look at how hot and loud this thing is (regardless of performance)!"

For Titan X it's, "OMG look at how powerful this thing is (regardless of the heat and noise)!"

June 15, 2015 | 06:54 PM - Posted by JohnGR

And this is not the only example of double standards from the press.

Nvidia recently had many problems with their drivers, beta AND WHQL, going from new version to new version, even posting a fix for one of their latest drivers. Not to mention 700 series performance issues with the latest drivers, after version 347. Did anyone read about that anywhere?

On the other hand, the fact that AMD hasn't produced a WHQL driver but only betas has been promoted as a major issue by many websites, like for example TechPowerUp. Of course, the fact that there were no stability problems with AMD's beta drivers was ignored. The only issues reported were with Nvidia GameWorks, and those were about performance in specific, Maxwell-friendly titles.

Once again the press used double standards. They ignored the main thing about drivers, which is stability, and tried to draw the reader's attention to the title, whether that title had the word Beta or WHQL in it.

That's why I am SCARED of the press with the Fury card reviews, and that's why I believe AMD chose to show the card FIRST to the public and later to the press.

June 16, 2015 | 03:12 PM - Posted by Martin Trautvetter

AMD's "new" stance on drivers is ridiculous. They haven't had an update in 6 months, and nobody can tell me that 6+ months old drivers are properly optimized for games coming out today.

I didn't consider providing regular updates in form of properly tested and released drivers optional when I bought my R9 290X.

June 17, 2015 | 08:34 AM - Posted by JohnGR

Performance is one thing, stability another. If you prefer Nvidia's approach, where they offer performance enhancements for the 900 series complete with bugs and instability even in the WHQL drivers, and only bugs and instability for the 700 series or older cards, fine.

But I prefer losing a couple of fps here and there while having a 100% stable driver in my system that doesn't throw BSODs, doesn't lock the system, or turn the screen black.

Also, AMD hasn't gone six months without giving a driver. The last driver is less than a month old. So what you say about 6 months without drivers is a complete lie.

The Beta in the driver's title means nothing when that driver is totally stable. The same is true for WHQL. WHQL means nothing when there are instability problems all over the place.

June 17, 2015 | 12:29 PM - Posted by Martin Trautvetter

"WHQL means nothing when there are instability problems all over the place."

Agreed, this is not a performance vs. reliability thing, cause even 14.12 isn't stable and has crashed numerous times for me over the past 6 months just browsing the web.

Beyond that, I prefer my GPU vendor to offer continued driver support for the hardware I bought. When I purchased my 290X AMD was offering just that, only to go back on that promise full-speed last December.

And no, releasing a couple of beta drivers over the past 6+ months doesn't qualify as hardware support. I know first hand how crappy AMD's fully tested and properly released drivers are, so I can only imagine the fun the beta ones are, if AMD can't be bothered to have them tested and released regularly.

June 16, 2015 | 03:07 PM - Posted by Martin Trautvetter

I've seen the Titan X measured as uncomfortably loud under load; does it really throttle below its minimum clock, too?

And as far as coolers are concerned, I learned my lesson with the 8800 GTX: if it's got a blower, I won't buy it. Blowers are evil.

June 14, 2015 | 04:29 AM - Posted by donut (not verified)

The nvidia gimp boy strikes again.

June 15, 2015 | 06:21 AM - Posted by Anonymous (not verified)

Nvidia trolls are so sad when they come to AMD threads just to bash. I guess they feel threatened and scared that AMD might come up with a better product.

June 13, 2015 | 04:41 AM - Posted by Screwyluie (not verified)

it's a brick... a goddamn brick. I wouldn't buy it just because of how damn ugly it is.

June 13, 2015 | 05:56 PM - Posted by Anonymous (not verified)

Maybe there will be aftermarket pimp kits for mono-brows like yourself to dress it up to meet your irrational expectations. It's a GPU, not a trophy; that brick will fit into much smaller form factor cases. I'll take a few of those bricks, and the game play will be beautiful. It's not how it looks, it's how it plays, for what is paid.

June 13, 2015 | 06:08 PM - Posted by Anonymous (not verified)

Just don't buy it and VOILA!! Done.. you don't have to scream it to the world, kiddo.

June 13, 2015 | 06:22 PM - Posted by Anonymous (not verified)

Yeah, so many things made by humans are just square. You would think we could be a bit more imaginative.

June 13, 2015 | 10:52 PM - Posted by Anonymous (not verified)

I guess there is the trash can Mac pro from Apple...

June 13, 2015 | 05:09 AM - Posted by Hakuren

Nice. Finally it arrived. Potential for CF with a true 1-slot config. With DVI gone it's finally a possibility. Hooray for that! No more wasted slots. To be honest, I'm bloody sick and tired of gargantuan 280mm+ VGA cards, which are just a disaster waiting to happen. It's ironic, but I own an MSI TF3 GTX 580, probably the longest single-GPU card ever produced, yet it's more than enough for my gaming. With these AMD beauties I will most likely switch - with the proviso that the drivers are working, which is a real pain of biblical proportions with AMD (I have a few "Red" VGAs).

Josh do something about it! :P

June 13, 2015 | 05:10 AM - Posted by Anonymous (not verified)

Considering the watercooling at 2x8pin power, this is going to be one hungry beast.

I wouldn't expect much overclocking headroom either. Knowing AMD, they're already pushing it with "up to" performance with the watercooling being required to prevent the card from burning up.

Replace it with a custom loop, and there may be overclocking headroom.

June 13, 2015 | 05:21 AM - Posted by arbiter

It's one thing to go with a water cooler on a card because you can. It's kind of another to do a water cooler on a card because you have no choice, to prevent the card from frying itself or throttling. Likely for AMD the ladder is the reason for the water loop. Kind of does say power draw is gonna be pretty high.

June 13, 2015 | 10:22 AM - Posted by Anonymous (not verified)

So Nvidia can only compete on power consumption? That would make sense considering that Fiji has a lot more resources than a Titan X.

June 13, 2015 | 06:04 PM - Posted by Anonymous (not verified)

What's the reason for the "ladder"? For you, it's the ladder you use to reach Mom's cookie jar, and swipe the cash to get the overpriced Nvidia Kit. You are a paid troll for you green overlords, and only for a few dog biscuits, Good doggie, Good boy! pat, pat here go fetch!

June 15, 2015 | 01:12 AM - Posted by arbiter

No one pays me a dime for anything.

June 13, 2015 | 09:47 AM - Posted by Anonymous (not verified)

I don't think air cooling was much of an option. If you look at the size of this card, a blower the size of the one used on the Titan X would take up almost all of the card. Same thing with a fan. You could only put one fan on there, and it would take most of the space. You wouldn't have enough space for the actual heat exchanger.

June 13, 2015 | 05:12 AM - Posted by Anonymous (not verified)

Who cares how it looks? 3rd party sellers will likely spruce it up, hopefully

June 13, 2015 | 01:58 PM - Posted by Ryan Shrout

I do hope you are right, but I fear that the cost of this cooler and the rumors about limited availability will mean that it's reference only for quite some time.

June 13, 2015 | 06:43 AM - Posted by Anonymous (not verified)

I'll be interested to see how they price this thing competitively and STILL make money.....They really need to make money....

June 13, 2015 | 08:37 AM - Posted by Master Chen (not verified)

What's that fan? Looks nothing like the one Chip Hell showed in their unboxed sample.

June 13, 2015 | 09:21 AM - Posted by ZoA (not verified)

What happened shrout, did you not insinuate Fiji does not exist only few mounts back when AMD refused to show it to you? Or was it your oversized ego overpowering your limited intellect making you come to stupid conclusions as usual?

June 13, 2015 | 09:33 AM - Posted by MingLord (not verified)

Are you jockeying for the head troll position? Is that why you mention the horses?

June 13, 2015 | 10:29 AM - Posted by Anonymous (not verified)

I am wondering if I could integrate that card into an existing water cooling system :-/

June 13, 2015 | 11:12 AM - Posted by Anonymous (not verified)

There will probably be water cooled versions for custom loops from other manufacturers. I don't know if they will make an air cooled version though. The card is so small that making a sufficient air cooler in that size would be difficult unless they extend the cooler beyond the end of the card. Perhaps they could just cover the card with cooling fins and put a blower that extends out beyond the end of the card. They could make the blower portion removable if you are using water cooling, for a combo card (water or air).

June 13, 2015 | 10:58 AM - Posted by skline00

Looks like a beast of a card. 2x 8-pins are reasonable in light of running both the GPU and the entire liquid cooling system.

As for the fans, if you check Cooler Guys you can still buy the OEM version of the Gentle Typhoon in various speeds. Nidec makes them.

I have 12 AP15 GT fans in my custom water cooling system and they are superb, robust fans.

June 13, 2015 | 11:02 AM - Posted by skline00

Integrating the card into an existing water cooling system would require either an aftermarket block made by EK, etc., or jury-rigging the existing block into your system. I'm not sure that would provide sufficient cooling, especially if the present cooling is some kind of pressurized system using a particular type of coolant.

June 13, 2015 | 11:15 AM - Posted by Anonymous (not verified)

Why would the system be pressurized?

June 13, 2015 | 01:33 PM - Posted by Lystfiskern (not verified)

Hmm....this might very well end up being my next gpu.

June 13, 2015 | 02:50 PM - Posted by btdog

I will state that I've been trying to avoid news about the upcoming Fiji for fear that the hype would be overblown, and the criticism equally excessive.

But I have serious concerns if the card is watercooled only. That implies heat issues, and then there's the difficulty of finding space on the case for potentially two radiators (1 CPU, 1 GPU). Compound that with 2x 8pin connectors...

D@MM!T. I was hoping this would be the first in a series of successive "wins." Instead - by nature of its design - AMD has already limited the potential buyers to the top 2% of the gaming community.

June 13, 2015 | 06:11 PM - Posted by Anonymous (not verified)

Would you say that you are feeling fear, uncertainty, and/or doubt?

June 14, 2015 | 12:42 AM - Posted by btdog

Yes, but that's completely unrelated to this topic...and how did you know?

June 15, 2015 | 03:03 PM - Posted by BBMan (not verified)

Exactly my thought. The energy that you draw for these things has little option but to be converted into heat. This is just basic physics. This is a performance move, not a power efficiency one- unless they are after charging you more for coolers. So you will likely have to deal with yet another AMD record for power draw.

June 13, 2015 | 03:30 PM - Posted by AMD is a sinking ship (not verified)

The 390X series and below are rehashes of the 200 series.

If the Fury X has a 300W TDP or more, the card's size prohibits the use of large air coolers. Now, if AMD allows custom PCB designs from card makers, then they will enlarge it to allow bigger air coolers.
Also, the Fury X will only come with 4GB when it releases, and all the leaks/rumors suggest that the GPU's performance isn't any better than the 980 Ti or Titan X, which means that with a $600 price tag, only 4GB of VRAM, and a possible TDP of up to 375W, the Fury X is a garbage option when you have a GPU (980 Ti) that has 2GB more VRAM and is up to 40% more efficient per watt.

June 13, 2015 | 06:47 PM - Posted by remon (not verified)

Nvidia's efficiency is overblown. In reality it is much closer to AMD's cards.

June 13, 2015 | 07:13 PM - Posted by AMD is a sinking ship (not verified)

Nvidia's efficiency per watt is not overblown; do you actually know how to look at benchmarks?

Look at the 290X using 295W at peak vs., say, a GTX 970 using 180W; the performance of these cards is on par with each other overall. There is a 40% difference in power usage between them.

June 13, 2015 | 07:59 PM - Posted by Anonymous (not verified)

They are not on the exact same process even though they are both still 28 nm. I have seen things indicating that some, if not most, of Nvidia's lower power consumption is due to the difference in process tech, not anything inherent to the design. AMD will be using a different process for upcoming parts.

June 14, 2015 | 10:20 AM - Posted by arbiter

I don't think it's because of different process tech. Nvidia was showing Maxwell's power improvements back with the 750 Ti.

June 14, 2015 | 01:40 PM - Posted by Anonymous (not verified)

It is hard to find specific information, but I have seen some claims that Nvidia is using a different variant, or at least a slightly customized process. TSMC offers several different 28 nm variants.

Anyway, I suspect a lot of the power savings from Nvidia actually came from cutting out FP64 hardware. The 290X still has FP64 at 1/8th of FP32. The Titan X only has FP64 at 1/32 of FP32. This is probably fine for gaming, but it means they probably will not be selling it as a compute card. It is foolish to think that AMD would not do the same thing to save power and die area with their completely new design. FP64 hardware takes up a huge amount of die area, and I am not sure it's reasonable to try to power gate it off when not in use. That would cause significant leakage for the 290X and variants. Fiji should be significantly more power efficient for FP32, but FP64 will probably be reduced to levels similar to Nvidia's 980/980 Ti/Titan X.

June 14, 2015 | 03:10 AM - Posted by Anonymous (not verified)

Typical garbage from nvidiot shills. GM204 != 165W. 185W at stock is the minimum power draw if you run a board at minimum voltage; then it won't run boost. Most AIB cards are >200W. An AIB 290X draws 250W. People wonder why there's a turn-off on bought-and-paid-for "HW review sites". Nvidia ditched compute flexibility in Maxwell, so they perform well in current game workloads. How well they will do under Mantle (er, DX12...) with a simplified command processor & 2 threads will be apparent soon, I guess.

June 14, 2015 | 10:28 AM - Posted by arbiter

Mantle, you mean that proprietary API AMD was working on that was closed source so no one else could use it? The same BS everyone attacks Nvidia over? No, it never was open source as AMD claimed it would be.

Stop calling people shills. AMD has taken heat from every site, rightfully so. AMD has said and claimed so many things over the years and ended up NOT living up to their claims. Instead of being an "Another Moron Deciple" (if you want to go down the name calling route).

June 14, 2015 | 04:03 PM - Posted by JohnGR

Yes, that closed thing that is so closed it is now called Vulkan, and thanks to it you now have DX12 less than 2 months away.

PS Deciple? At least you know how to spell Moron, moron.

June 14, 2015 | 08:42 PM - Posted by arbiter

Welcome to autocorrect, smartass, maybe you've heard of it? Tends to be wrong at times.

June 15, 2015 | 06:27 AM - Posted by JohnGR

LOL. Don't cry.

June 15, 2015 | 05:00 PM - Posted by Anonymous (not verified)

I've never seen an autocorrect replace a misspelled word with another very badly misspelled word, except when the replacement misspelled word was intentionally added to the autocorrect dictionary.

By the way, the word you're looking for is "disciple".

June 15, 2015 | 12:28 PM - Posted by Anonymous (not verified)

That's the most foolish thing I've read.
So how is working PR at nvidia?

June 13, 2015 | 04:02 PM - Posted by Anonymous (not verified)

The thing I hate about these types of designs, especially with a radiator that thick, is that you have to find special cases if you even want to think about CrossFiring it. It's not like you can find any old mATX case and throw two in (Titan X). You just can't do that when your radiators are pretty thick or you have two radiators to mount, plus trying to mount one for your CPU.

IMHO, watercooled graphics shouldn't be released from AMD or Nvidia, but by select AIBs.

June 13, 2015 | 06:17 PM - Posted by Anonymous (not verified)

If you are building a custom system, then it is your responsibility to get components that work together. If you can't do that, then you should buy something from a company that does it for you.

June 14, 2015 | 07:12 PM - Posted by Anonymous (not verified)

Yes, yes, yes, but think about it from a business standpoint for a moment. Tell me what sells your card better: the ability right off the bat for any case to CF, or only selective CF?

June 13, 2015 | 05:34 PM - Posted by whatever

I just hope the air cooled version is actually cheaper, but with everything else the same as the water cooled version. That way, I can custom water cool it without much extra cost.

June 13, 2015 | 05:36 PM - Posted by Anonymous (not verified)

Off with the plate! I wanna see what's inside.

June 13, 2015 | 06:18 PM - Posted by H1tman_Actua1

POS space heater anyone?

June 13, 2015 | 07:10 PM - Posted by Titan_V (not verified)

Brain-dead Nvidia marketing sock-puppet, anyone?

June 13, 2015 | 07:55 PM - Posted by Anonymous (not verified)

I am kind of wondering if the black top cover is replaceable for different looks. If you are water cooling the whole thing, then you can put any cover on it you want.

June 13, 2015 | 10:05 PM - Posted by vailr

They're re-using the "Fury" moniker from a 1999 AGP video card:
http://img19.imageshack.us/img19/2807/uvnn.jpg

June 13, 2015 | 10:59 PM - Posted by Anonymous (not verified)

That picture was posted on pcper several days ago. I don't really like "Fury" or "Fury X". Perhaps it should have been "Radeon Serious Edition" and "Radeon Super Serious Edition". It is what it is regardless of what they call it.

June 13, 2015 | 11:31 PM - Posted by Anonymous (not verified)

I am wondering what the dual GPU card will look like. Wild speculation, but is it possible that this is the dual GPU card? The radiator design is different. If the pump is in the radiator, then it may be plausible to stack 2 PCBs in there. I would expect more than two 8-pin connectors for a dual GPU card though.

June 14, 2015 | 03:40 AM - Posted by pdjblum

Here's to hoping for the best and fucking the rest. Please don't let us down again.

June 14, 2015 | 03:53 AM - Posted by reaper (not verified)

Hey Ryan, FCAT says hello... Funny how things drop out of favour (Unigine Heaven anyone?) when they don't suit Team Green anymore. You calling for "balanced coverage" is a joke. (https://twitter.com/ryanshrout/status/609437983134089216)

Reap what you sow...

June 14, 2015 | 10:34 AM - Posted by arbiter

FCAT only has its use in CF/SLI setups; it shows if a frame was skipped/dropped because it took too long. I am sure Ryan could still use it if he wanted to, but it probably wouldn't yield any results worth the time to do it.

If review sites don't point out BS, then it would end up being the buyers that suffer from seeing it after they buy the product. The reason Nvidia doesn't get flak for much is, well, they don't do anything hardware-wise to get any flak for. When they say their card or hardware works in such and such a way, it pretty much works that way. They don't have a bunch of idiot PR marketing guys putting their foot in their mouth making stupid claims every time they open their mouth. (If you try to point out the GTX 970 thing, don't; that is 1 thing vs. like 20 things AMD has claimed and been wrong about.)

June 14, 2015 | 01:26 PM - Posted by Ryan Shrout

I literally use FCAT / Frame Rating in every single GPU review I do. And I think I'm the only outlet to do so. Lol

June 14, 2015 | 09:20 PM - Posted by Anonymous (not verified)

Hello Ryan,

on the topic of FCAT / Frame Rating:
did you ever plan a review of "lucid virtu MVP" with FCAT testing?

a couple of years ago, before the FCAT methods, this lucid virtu tech was included as a selling point on some motherboards ... if I remember correctly, it was supposed to use the IGP to 'help' the GPU output more frames, or keep them better in sync with the monitor

at that time, this "virtual vsync" could not really be tested, because FPS reporting was showing the virtual frames instead of the real ones ... so no objective conclusions were made about this tech (or at least, I haven't seen any)

with all these new testing methods, I wonder what results might show up when putting lucid virtu tech on an FCAT bench

the tech is probably no longer relevant, if it ever was, but it is still topical (with some imagination):
- help with overhead (similar to DX12/Vulkan?)
- some FPS/sync magic (similar to G-Sync/FreeSync?)

I would love to see this topic revisited with current testing methods ... if only to make me stop wondering about it :)

kind regards

June 14, 2015 | 12:26 PM - Posted by Anonymous (not verified)

The cover on the front of the card still looks weird. I would expect it to say "Radeon Fury" at a minimum. Perhaps this is still some pre-release card and the actual shipping product will look different. I also have to wonder if all of these leaks are just planned marketing.

June 14, 2015 | 01:29 PM - Posted by Martin Trautvetter

Now that would be funny marketing, handing Ryan a set of spy shots, but not sampling him a card.

Chances are they simply hadn't settled on the final branding when locking down the physical design.

June 15, 2015 | 08:49 PM - Posted by Anonymous (not verified)

If Ryan has some of these cards for testing, then he is certainly under NDA and cannot post any of his own pictures until the embargo expires. If someone else leaks them, then he can post them without violating his NDA. If AMD leaked the photos themselves, then it is just part of some marketing plans.

The weird peg-board like cover on the front of the card still looks strange. I am still wondering if there is a surprise underneath. Rumors have indicated that we may get water cooled and air cooled versions of this card with different clock speeds. Also, I would expect a dual card rather quickly since the interposer is so small and PCB is very simple. The PCB only has power delivery, PCI-e, and video outputs. AMD may want to show off the small size of the water cooled version to emphasize how different their interposer design is. I expect an air cooled version will need to be significantly larger due to the need of a large air cooler. We haven't seen any leaks of these possible other versions.

June 15, 2015 | 08:53 PM - Posted by Anonymous (not verified)

It looks like there could be room for FURY along the side also. There is some black covering over it, but there is some space between the "RADEON" logo and the two 8-pin connectors. This makes me think of how we often see shots of new car designs being tested with covers or other things obscuring the actual look of the car.
