During the PC Gamer PC Gaming Show, much of the industry was on hand to talk about the state of PC gaming. While there, AMD took the opportunity to show off the dual-GPU, Fiji-based AMD Radeon R9 Fury X2 card. (Editor's note: we don't have official confirmation on that name for the card, but it would make sense, right? We'll go with it for the time being.) We don't know much about specifics on clocks, shader counts, or performance, but we do know that AMD is able to cram a HUGE amount of GPU compute capability into an incredibly small space thanks to High Bandwidth Memory (HBM).
Shown at the PC Gaming Show tonight…
Interesting, just a couple of days ago we were sent this image anonymously:
What's interesting here is that I was told "this is how they test" the GPUs before installing the water block. Those are high-end CPU coolers that have been modified slightly to mount on the lay-flat Fury X2 PCB. This gives you an idea of the development process behind building a graphics card like this…
A little blurry, but still informative.
This image, posted by Anshel Sag, Staff Technologist and Technical Writer at Moor Insights & Strategy, shows the bare PCB with the two Fiji GPUs and their HBM memory stacks. (Also, note those "Moor" logos are not really printed on the GPU dies…) There are two 8-pin power connectors on the PCB as well, odd considering that the single GPU Fury X uses the same configuration.
This card has been promised to us in the fall, though pricing, power, and performance are to be discussed later. 2015 just keeps getting better for PC gamers!
Not sure why AMD is investing in those Fiji boxes. I doubt any real PC gamer will buy one.
I don't think they plan on selling them. It's more about showing OEMs what is possible.
Define “real” PC gamer. I’d love one of those and I’ve been building PCs for two decades.
I like it too, and I wouldn't say no to one.
I wasn't really excited when the GTX 700 series and R9 200 series came out, and I wasn't excited when the 900 series came out, but I'm squealing like a schoolgirl at these new AMD GPUs. Can't wait to see some benchmarks. I also want to see how high these bad boys can overclock.
I think that's because this is new technology, and the fact that a high-end GPU can be very small is great for a lot of people.
So when can we expect benchmarks, or will AMD send out samples for review?
I'd say the Fury X2 card will go for about $1500 at first, which would be about $200 more than two of the single-GPU Fury X cards.
Not that I could afford one, but I'd be very curious to see if Asus puts out another one of those limited ARES cards; those were very cool.
I want to see $999; eat the Titan X for lunch.
The leaked benchmarks I have seen show a single Fury X stomping all over the Titan X from 1080p up to 5K, so two will obliterate it.
Someone please love me.
we love you snake
Prices were already released, and the Fury is gonna be priced at $649 – so a $1,000 price tag for this one isn't so surprising. I'd bet it'll go cheaper.
I doubt it will go cheaper than $1k. It will be at least $1200, imo.
So I guess nothing launched today…LOL. No reviews=PAPER LAUNCH, right?
Anandtech says “NEW ERA OF COMPUTING”. Should be titled “NEW ERA OF PAPER LAUNCHES”.
The future is always exciting, but it would be nice if we dealt with the present at some point…
Thank you for informing us that you are in total denial.
PS We don’t care.
There's a caption that says the card is promised in the fall. That's anywhere from 3 to 6 months out. A long way away in the tech world.
So? Fury X is here; buy two and run CrossFire. The X2 is a real card, and it is running in Project Quantum. That's the takeaway from the news.
“Fury X is here”
No, it was shown. We don’t know how it performs, which might change in another week, or when it will become available for purchase.
Well, you can always hope for this to be a total failure, for all of it to be a bad dream. Don't worry. Pascal will fix everything. IN A YEAR. 😛
How will Pascal fix you posting straight-up BS?
Answer me this, sir: why is Project Quantum using an Intel CPU?
You seem to have the answers, what Intel CPU is Project Quantum using?
I get that the card is eventually coming, but claiming that something was running in a tech demo that is not for sale is a stretch.
I think this is more likely to be a Nano X2, or they will have to pull the same thing they did with the 295X2 and overdraw from the 8-pin connectors.
. . . or it’s just really really power efficient :p
I’m sure josh will agree with me that this card needs more dp.
Just go to Redtube for that, man. This is a tech site.
lol
Looks like another card that is going to break the ATX power standard.
It may actually draw less than the 295X2 at stock speeds. The single-GPU card is only supposed to take 275W. To support overclocking, though, the amperage requirement may go up. For the single-GPU, water-cooled card, they said up to 400A power delivery (according to Ryan's post) to support overclocking. Supporting that level on a dual-GPU card would require around 34A from each connector, rather than 28A (I think) for the 295X2. Anyway, pairing a high-end PSU with such a card is expected, so I don't think this is really an issue.
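(Editor's sketch: the per-connector current math above, generalized. The 75 W slot figure is the PCIe slot limit; the 500 W example board power is an illustrative assumption, not an AMD spec.)

```python
# Editorial sketch: 12 V current each auxiliary connector must carry for a
# given total board power, assuming the PCIe slot supplies up to 75 W and
# the remainder splits evenly across the connectors. Illustrative only.

def amps_per_connector(board_watts, n_connectors=2, slot_watts=75.0, rail_volts=12.0):
    """Current (A) per auxiliary connector on the 12 V rail."""
    from_connectors = max(board_watts - slot_watts, 0.0)
    return from_connectors / n_connectors / rail_volts

# A hypothetical 500 W dual-GPU board with two 8-pin inputs:
print(round(amps_per_connector(500.0), 1))  # -> 17.7 A per connector
```

Plug in whatever total board power AMD ends up quoting and the per-connector load falls out directly.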
Looks so cute and tiny for what it is. Like my venerable and indestructible GTS8800. Finally somebody is going back to side mounted PEG connectors like many VGAs of old. Top mounted annoys me so much I cannot even describe it.
But only one mDP? And 3 HDMI. WTH?
And I hope they don’t come with standard equipped tower coolers. We would have to redefine phrase: computer case. 😀
Pretty sure it’s 3 DP and one HDMI just like the other Fury cards. But I haven’t seen a picture that shows the port layout clearly end-on.
I am surprised that they don't seem to be putting any lid on top of the interposer. It is better for cooling, I guess. This board looks like it is almost 2x the Nano board; perhaps it is the same PCB layout.
It is possible the lid might be part of the final construction; this is a display unit. I know if I built this thing, I would want to show off the little memory stacks on the interposer.
Maybe that's what they're doing?
In any event, to mount a cooling block, they're going to want something to distribute the stress and fill the air gaps on the GPU.
“There are two 8-pin power connectors on the PCB as well, odd considering that the single GPU Fury X uses the same configuration.”
I wrote it yesterday. Like the GTX 980 cards that have two 8-pin power connectors when in fact they only need two 6-pin, so they can offer better overclocking, the Fury X doesn't really need 2x 8-pin.
AMD created the best card from the beginning. So, no mediocre reference AMD card model vs. super-overclocked EVGA GTX 980 Ti card model for the comparison charts. That changed.
I feel like either nobody is reading what you’ve posted or they strongly disagree. I also think the two 8pin power connectors are just to give some overhead. We’ll have to wait till AMD comes out with its official presentation to know the real power requirements.
As you’ve posted before, we can’t determine the card’s power draw just from looking at how many X-pin connectors it has till AMD tells us.
I think the 2x 8-pins are for better load distribution across the wires, so better stability = better cooling, and it looks better when you don't have the two extra pins hanging off the side. So I don't see that the dual 8-pins mean it uses all that power.
They may give each GPU a lower power target, since each 8-pin connector is rated for 225 watts. This may limit overclocking potential.
At some point, maybe the video card is fast enough as it is.
While it is sure to be a fast card, it is difficult to feel completely satisfied with any processing component unless you are 100% sure it is overclocked as far as possible without the need for exotic cooling.
Imagine how horrible it would feel to get a high-end video card and only run it at reference speed.
I can understand the feeling. As a programmer, inefficient code drives me crazy.
But on the hardware side of things, I’d rather lose 10 or 20% in performance and stick to a config I’m sure is stable. (Partly because I just don’t have the time to fiddle with such things any more.)
That's not true for everyone. I have my 780 at 80% power target most of the time for longevity. It is a GHz Edition though, lol.
Not everyone overclocks, because stability and longevity are more important to some people.
Actually, 8-pin PEG power connections are rated for 150 W.
Unless of course you’re AMD and just don’t give a fµck about industry group specifications.
Yep, AMD pretty much decided to skip the official spec, since the wires can physically handle more current; you can push the 8-pin connectors above 200 watts without much trouble, especially since more and more power supplies are pushing well above 18 amps on the 12V rail.
The physical limit of a 6-pin connector would be 576 watts (for 3 positive and 3 ground connections, based on the normal 18-gauge wires used in these connectors). Typically with electrical work, people keep around 40-50% headroom, which puts you at 288 watts; to be on the safe side, 225 watts takes into account some cheaper cables.
For your dirt-cheap $20 power supplies, you will typically find 20-gauge wire, and even in that case a 6-pin will have a max physical limit of 396 watts.
Most of the ATX limits also date from a time when the industry limited the amps on each 12V rail. Once the industry largely abandoned that, the next step is to abandon the current limits for the connectors and instead stay within the physical limits of the wires themselves.
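(Editor's sketch: the wire-limit arithmetic above, in code. The ~16 A per 18-gauge wire figure and the 50% headroom derating come from the comment itself; note that real connectors are also limited by their pin/contact ratings, which is what the official 75 W / 150 W spec limits reflect.)

```python
# Editorial sketch of the comment's arithmetic: raw wire-limited power of a
# PCIe power connector. live_wires is the number of +12 V conductors
# (3 for both 6-pin and 8-pin); amps_per_wire depends on gauge (~16 A for
# 18 AWG chassis wiring, per the comment). Pin/contact ratings, not the
# wires, are what the official spec limits actually reflect.

def connector_watts(live_wires, amps_per_wire, rail_volts=12.0, headroom=0.0):
    """Wire-limited power (W), derated by the given headroom fraction."""
    return live_wires * amps_per_wire * rail_volts * (1.0 - headroom)

print(connector_watts(3, 16.0))                # 576.0 W -- the comment's 6-pin figure
print(connector_watts(3, 16.0, headroom=0.5))  # 288.0 W -- with 50% headroom
```

The same function reproduces the comment's other numbers, e.g. `connector_watts(3, 11.0)` gives 396 W for the 20-gauge case.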
Have mercy in the next podcast and don’t ask Allyn too many questions about Fury cards. 😀
Why would they? I'm not the GPU tester…
Yeah, right :p
http://www.guru3d.com/news-story/amd-radeon-fury-x-doesnt-have-hdmi-2-support.html
AMD Radeon Fury X doesn’t have HDMI 2.0 support
Probably because HDMI costs a lot to license, while DisplayPort is both better and cheaper 🙂
Finally, with this I can build a Mac Pro-sized computer that will obliterate the Mac Pro. So excited; couldn't care less about the cost, I live in a country where daily needs are cheap.
I don't understand why there is no HDMI 2.0.
Why is there no DP 1.3?
The X2 should be $999, or $1,199 max. I'm getting one for sure!