AMD Shows Dual-Fiji Graphics Card in a Falcon Northwest PC at VRLA

Subject: Graphics Cards | January 25, 2016 - 11:51 AM
Tagged: fury x2, Fiji, dual fiji, amd

Lo and behold! The dual-Fiji card that we have previously dubbed the AMD Radeon Fury X2 still lives! Based on a tweet from AMD PR dude Antal Tungler, a Falcon Northwest PC at the VRLA convention was using a dual-GPU Fiji graphics card to power some demos.

This prototype Falcon Northwest Tiki system housed the GPU beast, though no images of the system's interior were shown. Still, it's good to see AMD acknowledge that this piece of hardware still exists at all, since it was initially promised to the enthusiast market by "fall of 2015." Even in October we had hints that the card might be coming soon, after some shipping manifests leaked out to the web.


Better late than never, right? One theory floating around the offices here is that AMD will release the Fury X2 alongside the VR headsets coming out this spring, with hopes of making it THE VR graphics card of choice. The value of multi-GPU for VR is interesting, with one GPU dedicated to each eye, though the pitfalls that could haunt both AMD and NVIDIA in this regard (latency, frame time consistency) make the technological capability an open debate.
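To make the one-GPU-per-eye idea concrete, here is a minimal sketch of the dispatch pattern. The Gpu and EyeFrame types are hypothetical stand-ins invented for illustration; real implementations live in vendor-specific driver extensions, not in any API shown here. The join at the end of each loop iteration is where the frame time consistency concern lives.

```cpp
// Minimal sketch of "one GPU per eye" dispatch. All types here are
// hypothetical placeholders, not a real driver API.
#include <array>
#include <chrono>
#include <cstdio>
#include <future>
#include <string>
#include <thread>

struct EyeFrame {
    std::string eye;
    int frameId;
};

// Hypothetical handle to one physical GPU on the card.
struct Gpu {
    int index;

    EyeFrame renderEye(const std::string& eye, int frameId) const {
        // Stand-in for real command submission and execution.
        std::this_thread::sleep_for(std::chrono::milliseconds(5));
        std::printf("GPU %d rendered %s eye of frame %d\n",
                    index, eye.c_str(), frameId);
        return EyeFrame{eye, frameId};
    }
};

int main() {
    std::array<Gpu, 2> gpus{{Gpu{0}, Gpu{1}}};

    for (int frame = 0; frame < 3; ++frame) {
        // Fork: each eye's view is dispatched to its own GPU in parallel.
        auto left = std::async(std::launch::async,
                               [&, frame] { return gpus[0].renderEye("left", frame); });
        auto right = std::async(std::launch::async,
                                [&, frame] { return gpus[1].renderEye("right", frame); });

        // Join: the headset needs both eyes inside the same refresh window
        // (roughly 11 ms at 90 Hz), so the slower GPU sets the frame time.
        EyeFrame l = left.get();
        EyeFrame r = right.get();
        std::printf("frame %d: composited %s + %s\n",
                    frame, l.eye.c_str(), r.eye.c_str());
    }
    return 0;
}
```

The win only materializes if both GPUs finish on time; a single slow eye stalls the composite, which is exactly the frame-pacing pitfall mentioned above.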

Source: Twitter

January 25, 2016 | 12:50 PM - Posted by RoyalK (not verified)

Like others have stated in comments on earlier articles: how the hell do they plan on cooling the damn thing?

January 25, 2016 | 12:52 PM - Posted by sheady (not verified)

Closed loop, just like the 295X2 and Fury X, most likely. Why would they do something different?

January 26, 2016 | 03:01 AM - Posted by Rob T (not verified)

The dual-Fiji card is rumoured to have a TDP 75 W less than the R9 295X2's, so AMD's existing cooling solutions should be more than sufficient.

January 26, 2016 | 09:55 AM - Posted by Mobile_Dom

Correct me if I'm wrong here, but I was led to believe the Fury X2 is going to have a TDP around the 375 W area, which is 125 W lower than the 295X2, not 75 W.

January 26, 2016 | 01:29 PM - Posted by Rob T

OK you are corrected :)

From the following article, and several others, it appears the TDP of the R9 295X2 stated by AMD was 450 W, and it actually measured slightly lower than that during testing. I think the 500 W value came from the card's cooling solution, which is rated as being able to cope with a 500 W TDP.

http://www.tomshardware.co.uk/graphics-card-power-supply-balance,review-...

AMD tells us that the Radeon R9 295X2’s TDP is 450W. We’ve known since the second page of this story, given its PowerTune technology and the underlying theory, that 450W isn't just a rough estimate either, but a real limit. We measured just under 430W while gaming, which is in line with the company's specifications and a lot less than the >500W figure we've seen thrown around.
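For what it's worth, the two deltas in this sub-thread are both internally consistent; they just measure from different baselines:

\[
500\,\mathrm{W} - 375\,\mathrm{W} = 125\,\mathrm{W} \quad \text{(against the cooler's 500 W rating)}
\]
\[
450\,\mathrm{W} - 375\,\mathrm{W} = 75\,\mathrm{W} \quad \text{(against AMD's stated 450 W TDP)}
\]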

January 25, 2016 | 12:51 PM - Posted by sheady (not verified)

In an AnandTech article back in December, AMD said it had delayed dual Fiji to align with the launch of VR headsets.

January 25, 2016 | 12:56 PM - Posted by Anonymous (not verified)

With the CrossFire performance the Fury cards have shown, and async compute performance for reduced lag, I have to agree the Fury X2 looks to be the PERFECT card for VR.

Maybe not for desktop use, but certainly for VR, where 2 GPUs is actually better than 1!

January 25, 2016 | 02:05 PM - Posted by jimecherry

'Cause nothing says PC master race like a $1,200 graphics card in a system hooked up to a $600 peripheral.

January 25, 2016 | 05:56 PM - Posted by djotter

Bwahahaha

January 25, 2016 | 09:46 PM - Posted by Anonymous (not verified)

Nonono... $1,200? Ppppfffffftttt! Cheapskate! 2x Titans for $2,000!

And people think Apple stuff is too expensive, ROFL!

January 25, 2016 | 07:14 PM - Posted by Anonymous (not verified)

If Arctic Islands weren't coming out this summer, I'd definitely have gotten a Fury X.
I don't know about the Fury X2, as some of the main games I play aren't optimized for multi-GPU configs, but I'd love to see what kind of power this beast has when it launches.

January 25, 2016 | 10:45 PM - Posted by Anonymous (not verified)

Will the Fury still be the top of AMD's lineup after the release of the 14 nm parts? I kind of expected them to release low- and mid-range parts this year, but I was expecting the mid-range die to perform close to the current top end due to the jump in process tech. I guess even if the Fury is outperformed by upcoming parts, the dual Fury may still be on top. Even after the release of the Fury, it was often outperformed at a similar price by the Radeon R9 295X2.

January 26, 2016 | 08:36 AM - Posted by Anonymous (not verified)

I wouldn't pick up a dual-GPU card or setup for the initial crop of VR titles: nobody outside of NVIDIA (and now AMD) has demonstrated an actual functioning VR dual-GPU demo. And unlike with current SLI/CrossFire, dual-GPU can't simply be retrofitted for VR; it needs to be designed in and optimized by the engine and game developers to ensure job dispatch is performed in a way that actually reduces latency rather than increasing it. That's going to take time.
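A back-of-envelope model of why that is: alternate frame rendering (AFR), the default SLI/CrossFire mode, doubles throughput but leaves per-frame latency untouched, while a per-eye split shortens the frame itself. The sketch below just prints the two timing models; the 16 ms single-GPU render time is an illustrative assumption, not a measured figure.

```cpp
// Illustrative timing comparison: AFR vs. a per-eye split across two GPUs.
// The render time is an assumed placeholder, not a benchmark result.
#include <cstdio>

int main() {
    const double renderMs = 16.0; // assumed time for one GPU to draw both eyes

    // AFR: GPUs alternate whole frames. Frames complete twice as often,
    // but each frame still takes the full render time from input to photons,
    // so motion-to-photon latency does not improve (queuing can worsen it).
    const double afrLatencyMs  = renderMs;
    const double afrIntervalMs = renderMs / 2.0;

    // Per-eye split: each GPU draws half the work for the same frame,
    // so the frame finishes sooner. That latency cut is what VR needs,
    // and it only happens if the engine dispatches work this way.
    const double splitLatencyMs  = renderMs / 2.0;
    const double splitIntervalMs = renderMs / 2.0;

    std::printf("AFR:       latency %.1f ms, frame interval %.1f ms\n",
                afrLatencyMs, afrIntervalMs);
    std::printf("Eye split: latency %.1f ms, frame interval %.1f ms\n",
                splitLatencyMs, splitIntervalMs);
    return 0;
}
```

The catch, as the comment above notes, is that the per-eye split has to be expressed by the engine's own job dispatch; a driver can't infer it from an AFR-era code path.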

January 26, 2016 | 08:50 AM - Posted by Justin150 (not verified)

I will pass until GPUs move to 14 nm parts and HBM2. I like what AMD are trying to do with GPUs.

Maybe in fall 2016, when I can put one in a custom water loop.

January 26, 2016 | 11:27 AM - Posted by remc86007

Shouldn't dual-GPU for VR be relatively easy? One GPU, one screen. Admittedly, I know nothing about how the software side works.

January 28, 2016 | 03:01 PM - Posted by Anonymous (not verified)

Come to me! :)

Nice card, I'm in love. :)
