AMD Fury X2 Dual Fiji GPU Card Shown at PC Gaming Show

Subject: Graphics Cards | June 16, 2015 - 10:36 PM |
Tagged: hbm, fury x2, fury x, Fury, Fiji, dual gpu, amd

During the PC Gamer PC Gaming Show, much of the industry was on hand to talk about its take on the state of PC gaming. While there, AMD took the opportunity to show off the dual-GPU, Fiji-based AMD Radeon R9 Fury X2 card. (Editor's note: we don't have official confirmation on that name for the card, but it would make sense, right? We'll go with that for the time being.) We don't know much about the specifics on clocks, shader counts, or performance, but we do know that AMD is able to cram a HUGE amount of GPU compute capability into an incredibly small space thanks to the innovation of High Bandwidth Memory (HBM).


Shown at the PC Gaming Show tonight...

Interestingly, just a couple of days ago we were sent this image anonymously:


What's interesting here is that I was told "this is how they test" the GPUs before installing the water block on it. Those are high-end CPU coolers that have been modified slightly to be installed on the lay-flat Fury X2 PCB. This gives you an idea of the development process of building a graphics card like this...


A little blurry, but still informative.

This image, posted by Anshel Sag, Staff Technologist and Technical Writer at Moor Insights & Strategy, shows the bare PCB with the two Fiji GPUs and their HBM memory stacks. (Also, note those "Moor" logos are not really printed on the GPU dies...) There are also two 8-pin power connectors on the PCB, which is odd considering that the single-GPU Fury X uses the same configuration.


This card has been promised to us in the fall, though pricing, power, and performance are to be discussed later. 2015 just keeps getting better for PC gamers!


June 16, 2015 | 10:39 PM - Posted by Joe (not verified)

Not sure why AMD is investing in those Fiji boxes. I doubt any real PC gamer will buy one.

June 16, 2015 | 10:44 PM - Posted by Ryan Shrout

I don't think they plan on selling them. It's more about showing OEMs what is possible.

June 16, 2015 | 11:27 PM - Posted by Titan_V (not verified)

Define "real" PC gamer. I'd love one of those and I've been building PCs for two decades.

June 17, 2015 | 06:37 AM - Posted by Irishgamer01

I like it too, and I wouldn't say no to one.

June 16, 2015 | 10:40 PM - Posted by Heavy (not verified)

I wasn't really excited when the GTX 700 series and R9 200 series came out, and I wasn't excited when the 900 series came out, but I'm squealing like a schoolgirl at these new AMD GPUs. Can't wait to see some benchmarks; I also want to see how high these bad boys can overclock.

June 17, 2015 | 11:37 AM - Posted by Anonymous (not verified)

I think that's because this is new technology, and the fact that a high-end GPU can be very small is great for a lot of people.

June 16, 2015 | 11:14 PM - Posted by doeboy (not verified)

So when can we expect benchmarks, or will AMD send out samples for review?

June 16, 2015 | 11:39 PM - Posted by funandjam

I'd say the Fury X2 card will go for about $1,500 at first, which would be about $200 more than two of the Fury X single-GPU cards.

Not that I could afford one, but I'd be very curious to see if Asus puts out another one of those limited ARES cards; those were very cool.

June 17, 2015 | 01:09 AM - Posted by Simms (not verified)

I want to see $999 and eat the Titan X for lunch.

June 17, 2015 | 01:15 PM - Posted by Disturbed Jim (not verified)

The leaked benchmarks I have seen show a single Fury X stomping all over the Titan X from 1080p up to 5K, so two will obliterate it.

June 18, 2015 | 07:17 AM - Posted by Snake Pliskin (not verified)

Someone please love me.

June 23, 2015 | 10:36 AM - Posted by TinkerToyTech

we love you snake

June 17, 2015 | 11:56 AM - Posted by Anonymous (not verified)

Prices were already released, and the Fury is gonna be priced at $649, so a $1,000 price tag for this one isn't so surprising. I'd bet it'll go cheaper.

June 17, 2015 | 04:13 PM - Posted by obababoy

I doubt it will go cheaper than $1K. It will be at least $1,200, IMO.

June 16, 2015 | 11:45 PM - Posted by Anonymous (not verified)

So I guess nothing launched today... LOL. No reviews = PAPER LAUNCH, right?

Anandtech says "NEW ERA OF COMPUTING". It should be titled "NEW ERA OF PAPER LAUNCHES".

The future is always exciting, but it would be nice if we dealt with the present at some point...

June 17, 2015 | 03:02 AM - Posted by JohnGR

Thank you for informing us that you are in total denial.

PS We don't care.

June 17, 2015 | 09:19 AM - Posted by Anonymous (not verified)

There's a caption that says the card is promised in the fall. That's anywhere from three to six months out, a long way away in the tech world.

June 17, 2015 | 11:17 AM - Posted by JohnGR

So? Fury X is here; buy two and run CrossFire. The X2 is a real card, and it is running in Project Quantum. That's the takeaway from the news.

June 17, 2015 | 11:35 AM - Posted by Martin Trautvetter

"Fury X is here"

No, it was shown. We don't know how it performs (which might change in another week) or when it will become available for purchase.

June 17, 2015 | 01:04 PM - Posted by JohnGR

Well, you can always hope it's a total failure, and that all this is a bad dream. Don't worry. Pascal will fix everything. IN A YEAR. :P

June 17, 2015 | 06:50 PM - Posted by Martin Trautvetter

How will Pascal fix you posting straight-up BS?

June 17, 2015 | 03:35 PM - Posted by Barfly (not verified)

Answer me this, sir: why is Project Quantum using an Intel CPU?
You seem to have the answers, so what Intel CPU is Project Quantum using?

June 17, 2015 | 05:27 PM - Posted by Allyn Malventano

I get that the card is eventually coming, but claiming that something was running in a tech demo that is not for sale is a stretch.

June 16, 2015 | 11:51 PM - Posted by djotter

I think this is more likely to be a Nano X2, or they will have to pull the same thing they did with the 295X2 and overdraw from the 8-pin connectors.

. . . or it's just really really power efficient :p

June 17, 2015 | 12:43 AM - Posted by biohazard918

I'm sure Josh will agree with me that this card needs more DP.

June 17, 2015 | 09:53 AM - Posted by Anonymous (not verified)

Just go to Redtube for that, man. This is a tech site.


June 17, 2015 | 12:47 AM - Posted by Chaitanya Shukla

Looks like another card that is going to break ATX power standard.

June 17, 2015 | 02:24 AM - Posted by Anonymous (not verified)

It may actually draw less than the 295X2 at stock speeds. The single-GPU card is only supposed to take 275 W. To support overclocking, though, the amperage requirement may go up. For the single-GPU, water-cooled card, they said up to 400 A of power delivery (according to Ryan's post) to support overclocking. Supporting that level on a dual-GPU card would require around 34 A from each connector, rather than 28 A (I think) for the 295X2. Anyway, pairing a high-end PSU with such a card is expected, so I don't think this is really an issue.

June 17, 2015 | 01:01 AM - Posted by Hakuren

Looks so cute and tiny for what it is, like my venerable and indestructible 8800 GTS. Finally somebody is going back to side-mounted PEG connectors like many VGAs of old. Top-mounted annoys me so much I cannot even describe it.

But only one mDP? And 3 HDMI? WTH?

And I hope they don't come standard-equipped with tower coolers. We would have to redefine the phrase "computer case". :D

June 17, 2015 | 05:20 AM - Posted by Pixy Misa (not verified)

Pretty sure it's 3 DP and one HDMI, just like the other Fury cards. But I haven't seen a picture that shows the port layout clearly end-on.

June 17, 2015 | 01:26 AM - Posted by Anonymous (not verified)

I am surprised that they don't seem to be putting any lid on top of the interposer; it is better for cooling, I guess. This board looks like it is almost 2x the Nano board, so perhaps it is the same PCB layout.

October 28, 2015 | 05:06 PM - Posted by NobodyNotable (not verified)

It is possible the lid might be part of the final construction; this is a display unit. I know if I built this thing, I would wanna show off the little memory stacks on the interposer. It might be that's what they're doing?

In any event, to mount a cooling block, they're going to want to put something on there to distribute the stress and fill the air gaps over the GPU.

June 17, 2015 | 03:00 AM - Posted by JohnGR

"There are two 8-pin power connectors on the PCB as well, odd considering that the single GPU Fury X uses the same configuration."

I wrote it yesterday. Like the GTX 980 cards that have two 8-pin power connectors when in fact they only need two 6-pin, so they can offer better overclocking, the Fury X doesn't really need 2x 8-pin.

AMD created its best card from the beginning. So, no more mediocre reference AMD card model versus a super-overclocked EVGA GTX 980 Ti card model in the comparison charts. That changed.

June 17, 2015 | 04:00 AM - Posted by DerekR (not verified)

I feel like either nobody is reading what you've posted or they strongly disagree. I also think the two 8-pin power connectors are just there to give some overhead. We'll have to wait until AMD comes out with its official presentation to know the real power requirements.

As you've posted before, we can't determine the card's power draw just from looking at how many X-pin connectors it has until AMD tells us.

June 17, 2015 | 10:10 AM - Posted by DiceAir (not verified)

I think the 2x 8-pin is for better load distribution across the wires. Better stability means better cooling, and it looks better when you don't have the two extra pins hanging off the side. So I don't see the dual 8-pins as meaning it uses all that power.

June 17, 2015 | 08:23 AM - Posted by razor512

They may give each GPU a lower power target, since each 8-pin connector is rated for 225 watts. This may limit overclocking potential.

June 17, 2015 | 09:17 AM - Posted by Pixy Misa (not verified)

At some point, maybe the video card is fast enough as it is.

June 17, 2015 | 09:33 AM - Posted by razor512

While it is sure to be a fast card, it is difficult to feel completely satisfied with any processing component unless you are 100% sure that it is overclocked as far as possible without the need for exotic cooling.

Imagine how horrible it would feel to get a high-end video card and only run it at the reference speed.

June 18, 2015 | 12:02 AM - Posted by Pixy Misa (not verified)

I can understand the feeling. As a programmer, inefficient code drives me crazy.

But on the hardware side of things, I'd rather lose 10 or 20% in performance and stick to a config I'm sure is stable. (Partly because I just don't have the time to fiddle with such things any more.)

June 20, 2015 | 09:42 AM - Posted by BlackDove (not verified)

That's not true for everyone. I have my 780 at an 80% power target most of the time for longevity. It is a GHz Edition, though, lol.

Not everyone overclocks, because stability and longevity are more important to some people.

June 17, 2015 | 11:25 AM - Posted by Martin Trautvetter

Actually, 8-pin PEG power connections are rated for 150 W.

Unless of course you're AMD and just don't give a fµck about industry group specifications.
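
(Editor's note: for readers following the spec debate above, here is a quick sketch of the in-spec math, using the well-known PCI-SIG ratings of 75 W for the x16 slot, 75 W for a 6-pin PEG connector, and 150 W for an 8-pin. The function and connector names are just illustrative.)

```python
# In-spec 12V power budget for a graphics card, per PCI-SIG ratings:
# the PCIe x16 slot supplies up to 75 W, a 6-pin PEG connector 75 W,
# and an 8-pin PEG connector 150 W.
RATINGS_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_budget(connectors):
    """Sum the spec-rated power for the slot plus auxiliary connectors."""
    return sum(RATINGS_W[c] for c in connectors)

# A dual 8-pin card like this Fury X2 prototype stays in spec up to:
print(board_power_budget(["slot", "8-pin", "8-pin"]))  # 375
```

A ~500 W dual-GPU board on that connector layout would exceed the 375 W spec budget, which is exactly the "overdraw" situation commenters cite for the 295X2.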

June 17, 2015 | 01:01 PM - Posted by razor512

Yep, AMD pretty much decided to skip the official spec, since the wires can physically handle more current; you can push the 8-pin connectors to above 200 watts without much trouble, especially since more and more power supplies are pushing well above 18 amps on the 12V rail.

The physical limit of a 6-pin connector would be 576 watts (for 3 positive and 3 ground connections, based on the normal 18-gauge wires used in these connectors). Typically with electrical work, people keep around 40-50% of headroom, which puts you at 288 watts; to be on the safe side, 225 watts takes into account some cheaper cables.

For your dirt-cheap $20 power supplies, you will typically find 20-gauge wire, and even in that case, a 6-pin will have a max physical limit of 396 watts.

Most of the ATX limits also date from a time when the amps on each 12V rail were capped. Once the industry largely abandoned that, the next step is to abandon the current limits for the connectors and instead stay within the physical limits of the wires themselves.
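
(Editor's note: working backward from the wattages in the comment above, the arithmetic assumes 3 current-carrying 12V wires per 6-pin connector and per-wire ampacities of 16 A for 18 AWG and 11 A for 20 AWG. Those ampacity figures are the commenter's assumptions, not a published standard; this sketch just reproduces the math.)

```python
# Rough wire-limit arithmetic matching the comment above: a 6-pin PEG
# connector has 3 current-carrying 12V conductors (plus grounds). The
# per-wire ampacities (16 A for 18 AWG, 11 A for 20 AWG) are inferred
# from the quoted wattages, not taken from a standard.
V_RAIL = 12.0  # volts on the PEG connector's power wires
PAIRS = 3      # 12V conductors in a 6-pin connector

def physical_limit_w(amps_per_wire, pairs=PAIRS, volts=V_RAIL):
    """Max power the copper itself could carry, ignoring spec derating."""
    return pairs * amps_per_wire * volts

print(physical_limit_w(16))        # 576.0 (18-gauge wire)
print(physical_limit_w(11))        # 396.0 (20-gauge wire)
print(physical_limit_w(16) * 0.5)  # 288.0 (with ~50% headroom)
```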

June 17, 2015 | 01:01 PM - Posted by JohnGR

Have mercy in the next podcast and don't ask Allyn too many questions about Fury cards. :D

June 17, 2015 | 05:19 PM - Posted by Allyn Malventano

Why would they? I'm not the GPU tester...

June 17, 2015 | 06:11 PM - Posted by JohnGR

Yeah, right :p

June 18, 2015 | 05:21 AM - Posted by Anonymous (not verified)

AMD Radeon Fury X doesn't have HDMI 2.0 support

June 18, 2015 | 05:01 PM - Posted by Anonymous (not verified)

Probably because HDMI costs a lot to license, while DisplayPort is both better and cheaper. :)

June 18, 2015 | 05:36 AM - Posted by pobear (not verified)

Finally, with this I can build a Mac Pro-sized computer that will obliterate the Mac Pro. So excited; I couldn't care less about the cost, as I live in a country where daily needs are cheap.

June 19, 2015 | 05:53 AM - Posted by Anonymous (not verified)

I do not understand why there is no HDMI 2.0.
Why is there no DP 1.3?

June 19, 2015 | 04:25 PM - Posted by Gradius (not verified)

The X2 should be $999, or $1,199 max. I'm getting one for sure!
