Subject: Graphics Cards | May 16, 2016 - 07:52 PM | Jeremy Hellstrom
Tagged: amd, r9 380x, crossfire
A pair of R9 380Xs will cost you around $500, about $100 less than a single GTX 980 Ti and on par with or a little less expensive than a single GTX 980. You have likely seen these cards compared, but how often have you seen them pitted against a pair of GTX 960s, which cost a little less than two 380X cards? [H]ard|OCP decided it was worth investigating, perhaps for those who currently own a single one of these cards and are considering a second if the price is right. The results are very tight; overall the two setups performed very similarly, with some games favouring AMD and others NVIDIA. Check out the full review here.
"We are evaluating two Radeon R9 380X video cards in CrossFire against two GeForce GTX 960 video cards in a SLI arrangement. We will overclock each setup to its highest, to experience the full gaming benefit each configuration has to offer. Additionally we will compare a Radeon R9 380 CrossFire setup to help determine the best value."
Here are some more Graphics Card articles from around the web:
The Dual-Fiji Card Finally Arrives
This weekend, leaks of information on both WCCFTech and VideoCardz.com have revealed all the information about the pending release of AMD’s dual-GPU giant, the Radeon Pro Duo. While no one at PC Perspective has been briefed on the product officially, all of the interesting data surrounding the product is clearly outlined in the slides on those websites, minus some independent benchmark testing that we are hoping to get to next week. Based on the report from both sites, the Radeon Pro Duo will be released on April 26th.
AMD actually revealed the product and branding for the Radeon Pro Duo back in March, during its live streamed Capsaicin event surrounding GDC. At that point we were given the following information:
- Dual Fiji XT GPUs
- 8GB of total HBM memory
- 4x DisplayPort (this has since been modified)
- 16 TFLOPS of compute
- $1499 price tag
The design of the card follows the same industrial design as the reference designs of the Radeon Fury X, and integrates a dual-pump cooler and external fan/radiator to keep both GPUs running cool.
Based on the slides leaked out today, AMD has revised the Radeon Pro Duo design to include a set of three DisplayPort connections and one HDMI port. This was a necessary change, as the Oculus Rift requires an HDMI port to work; only the HTC Vive has built-in support for a DisplayPort connection, and even in that case you would need a full-size to mini-DisplayPort cable.
The 8GB of HBM (high bandwidth memory) on the card is split between the two Fiji XT GPUs, just like other multi-GPU options on the market. The 350 watt power draw mark is exceptionally high, exceeded only by AMD’s previous dual-GPU beast, the Radeon R9 295X2, which used 500+ watts, and the NVIDIA GeForce GTX Titan Z, which draws 375 watts!
Here is the specification breakdown of the Radeon Pro Duo. The card has 8192 total stream processors and 128 Compute Units, split evenly between the two GPUs. You are getting two full Fiji XT GPUs in this card, an impressive feat made possible in part by the use of High Bandwidth Memory and its smaller physical footprint.
| | Radeon Pro Duo | R9 Nano | R9 Fury | R9 Fury X | GTX 980 Ti | TITAN X | GTX 980 | R9 290X |
|---|---|---|---|---|---|---|---|---|
| GPU | Fiji XT x 2 | Fiji XT | Fiji Pro | Fiji XT | GM200 | GM200 | GM204 | Hawaii XT |
| Rated Clock | up to 1000 MHz | up to 1000 MHz | 1000 MHz | 1050 MHz | 1000 MHz | 1000 MHz | 1126 MHz | 1000 MHz |
| Memory | 8GB (4GB x 2) | 4GB | 4GB | 4GB | 6GB | 12GB | 4GB | 4GB |
| Memory Clock | 500 MHz | 500 MHz | 500 MHz | 500 MHz | 7000 MHz | 7000 MHz | 7000 MHz | 5000 MHz |
| Memory Interface | 4096-bit (HBM) x 2 | 4096-bit (HBM) | 4096-bit (HBM) | 4096-bit (HBM) | 384-bit | 384-bit | 256-bit | 512-bit |
| Memory Bandwidth | 1024 GB/s | 512 GB/s | 512 GB/s | 512 GB/s | 336 GB/s | 336 GB/s | 224 GB/s | 320 GB/s |
| TDP | 350 watts | 175 watts | 275 watts | 275 watts | 250 watts | 250 watts | 165 watts | 290 watts |
| Peak Compute | 16.38 TFLOPS | 8.19 TFLOPS | 7.20 TFLOPS | 8.60 TFLOPS | 5.63 TFLOPS | 6.14 TFLOPS | 4.61 TFLOPS | 5.63 TFLOPS |
| Transistor Count | 8.9B x 2 | 8.9B | 8.9B | 8.9B | 8.0B | 8.0B | 5.2B | 6.2B |
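The peak-compute and bandwidth figures in the table follow from simple arithmetic: stream processors × clock × 2 FLOPs per clock (one fused multiply-add), and bus width × effective memory clock ÷ 8 bits per byte. A quick Python sanity check of the Pro Duo's numbers:

```python
def peak_tflops(stream_processors: int, clock_ghz: float) -> float:
    """Peak single-precision compute: shaders x clock x 2 FLOPs (one FMA per clock)."""
    return stream_processors * clock_ghz * 2 / 1000

def memory_bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Memory bandwidth: bus width x effective (post-DDR) clock / 8 bits per byte."""
    return bus_width_bits * effective_clock_mhz / 8 / 1000

# Radeon Pro Duo: 8192 shaders at up to 1.0 GHz -> 16.38 TFLOPS
print(peak_tflops(8192, 1.0))                # 16.384

# HBM: a 4096-bit stack at 500 MHz (1000 MHz effective) -> 512 GB/s per GPU
print(memory_bandwidth_gbs(4096, 1000) * 2)  # 1024.0 GB/s across both GPUs
```

The same formula reproduces the GeForce numbers, e.g. the GTX 980 Ti's 384-bit bus at 7000 MHz effective works out to 336 GB/s.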
The Radeon Pro Duo has a rated clock speed of up to 1000 MHz. That’s the same clock speed as the R9 Fury and the rated “up to” frequency on the R9 Nano. It’s worth noting that we did see a handful of instances where the R9 Nano’s power limiting capability resulted in some extremely variable clock speeds in practice. AMD recently added a feature to its Crimson driver to disable power metering on the Nano, at the expense of more power draw, and I would assume the same option would work for the Pro Duo.
Subject: Graphics Cards | March 15, 2016 - 06:02 AM | Ryan Shrout
Tagged: vulkan, raja koduri, Polaris, HBM2, hbm, dx12, crossfire, amd
After hosting the AMD Capsaicin event at GDC tonight, the SVP and Chief Architect of the Radeon Technologies Group Raja Koduri sat down with me to talk about the event and offered up some additional details on the Radeon Pro Duo, upcoming Polaris GPUs and more. The video below has the full interview but there are several highlights that stand out as noteworthy.
- Raja claimed that one of the reasons to launch the dual-Fiji card as the Radeon Pro Duo for developers, rather than as a pure Radeon product aimed at gamers, was to “get past CrossFire.” He believes we are at an inflection point with APIs. Where previously you would abstract two GPUs to appear as a single GPU to the game engine, with DX12 and Vulkan the problem is more complex than that, as we have seen in testing with early titles like Ashes of the Singularity.
But with the dual-Fiji product mostly developed and prepared, AMD was able to find a market between the enthusiast and the creator to target, and thus the Radeon Pro branding was born.
Raja further expands on it, telling me that in order to make multi-GPU useful and productive for the next generation of APIs, getting multi-GPU hardware solutions in the hands of developers is crucial. He admitted that CrossFire in the past has had performance scaling concerns and compatibility issues, and that getting multi-GPU correct from the ground floor here is crucial.
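The "abstract two GPUs as one" model Raja refers to is classically alternate-frame rendering: the driver hands whole frames to each GPU round-robin behind the engine's back, whereas DX12 and Vulkan push that scheduling decision up into the engine itself. A toy sketch of the implicit round-robin assignment (purely illustrative, not any real driver API):

```python
from itertools import cycle

def afr_schedule(num_gpus: int, num_frames: int) -> list:
    """Alternate-frame rendering: assign whole frames to GPUs round-robin,
    the scheme implicit CrossFire/SLI used to hide multiple GPUs from the engine."""
    gpu_ids = cycle(range(num_gpus))
    return [next(gpu_ids) for _ in range(num_frames)]

print(afr_schedule(2, 6))  # [0, 1, 0, 1, 0, 1]
```

Under the explicit model, the engine would instead enumerate the physical GPUs itself and decide per-workload which one renders what, which is exactly why developer hardware matters.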
- With changes in Moore’s Law and the realities of process technology and processor construction, multi-GPU is going to be more important for the entire product stack, not just the extreme enthusiast crowd. Why? Because realities are dictating that GPU vendors build smaller, more power efficient GPUs, and to scale performance overall, multi-GPU solutions need to be efficient and plentiful. The “economics of the smaller die” are much better for AMD (and we assume NVIDIA) and by 2017-2019, this is the reality and will be how graphics performance will scale.
Getting the software ecosystem going now is going to be crucial to ease into that standard.
- The Polaris naming scheme (10, 11…) follows no equation; it’s just “a sequence of numbers” and we should only expect it to increase going forward. The next Polaris chip will be bigger than 11; that’s the secret he gave us.
There have been concerns that AMD was only going to go for the mainstream gaming market with Polaris but Raja promised me and our readers that we “would be really really pleased.” We expect to see Polaris-based GPUs across the entire performance stack.
- AMD’s primary goal here is to get many millions of gamers VR-ready, though getting the enthusiasts “that last millisecond” is still a goal and it will happen from Radeon.
- No solid date on Polaris parts at all – I tried! (Other than that the launches start in June.) Though Raja did promise that after tonight, he will not have another alcoholic beverage until the launch of Polaris. Serious commitment!
- Curious about the HBM2 inclusion in Vega on the roadmap and what that means for Polaris? Though he didn’t say it outright, it appears that Polaris will be using HBM1, leaving me to wonder about the memory capacity limitations inherent in that. Has AMD found a way to get past the 4GB barrier? We are trying to figure that out for sure.
Why is Polaris going to use HBM1? Raja pointed to the extreme cost of building the HBM ecosystem and prepping the pipeline for the new memory technology as the culprit, and AMD obviously wants to recoup some of that cost with another generation of GPUs using it.
Speaking with Raja is always interesting, and the confidence and knowledge he showcases are what give me assurance that the Radeon Technologies Group is headed in the right direction. This is going to be a very interesting year for graphics, PC gaming and GPU technologies, as showcased throughout the Capsaicin event, and I think everyone should be looking forward to it.
Subject: Motherboards | November 3, 2015 - 09:00 AM | Sebastian Peak
Tagged: Z170A SLI PLUS, X99 SLI PLUS, sli, msi, motherboard, crossfire, black PCB
MSI has announced their third PRO Series motherboard, the all-black Z170A SLI PLUS.
"MSI, leading in motherboard design, completes the PRO Series motherboard line-up with the launch of the all black Z170A SLI PLUS motherboard. Inheriting DNA from the critically acclaimed X99A SLI PLUS motherboard, the Z170A SLI PLUS is powerful, packed with features and styled with class."
While not lacking in stealthy looks from the black PCB to the matching RAM slots and heatsinks, the specifications of this new SLI PLUS board place it firmly into the premium category.
Features include Intel Gigabit LAN, MSI’s DDR4 Boost technology (which isolates memory traces on the board), Turbo M.2 (and U.2) with speeds up to 32 Gb/s with NVMe SSDs, and the Steel Armor PCI Express slots. And while SLI is right there in the title, there is also support for AMD Crossfire with those super-strong PCIe slots.
The board is fabricated with MSI’s Military Class 4 components, including super ferrite chokes and all solid capacitors, and also offers an “EZ Debug” LED and overvoltage protection. In addition to support for the fastest internal storage standards (M.2, U.2, and SATA Express), the Z170A SLI PLUS offers full USB 3.1 Gen2 support for transfers up to 10 Gb/s with compatible devices.
- Supports 6th Gen Intel Core / Pentium / Celeron processors for LGA 1151 socket
- Supports DDR4-3600 Memory (OC)
- DDR4 Boost
- USB 3.1 Gen2
- Turbo M.2 32Gb/s + Turbo U.2 ready + USB 3.1 Gen2 Type-C + SATA 6Gb/s
- Steel Armor PCI-E slots; Supports NVIDIA SLI and AMD Crossfire
- OC Genie 4
- Click BIOS 5
- Audio Boost
- Military Class 4
- EZ Debug LED
- Overvoltage Protection
- 4K UHD Support
- Windows 10 Ready
Pricing and availability were not specified with this morning’s announcement.
Subject: Graphics Cards | October 6, 2015 - 06:40 PM | Jeremy Hellstrom
Tagged: 4k, gtx titan x, fury x, GTX 980 Ti, crossfire, sli
[H]ard|OCP shows off just what you can achieve when you spend over $1000 on graphics cards and have a 4K monitor in their latest review. In Project Cars you can expect never to see less than 40 fps with everything cranked to maximum, and if you invested in Titan Xs you can even enable DS2X anti-aliasing to render at double the resolution before downsampling. The Witcher 3 is a bit more challenging, and no card is up for HairWorks without a noticeable hit to performance. Far Cry 4 still refuses to believe in CrossFire, and as far as NVIDIA performance goes, if you want to see soft shadows you are going to have to invest in a pair of Titan Xs. Check out the full review to see what the best of the current market is capable of.
"The ultimate 4K battle is about to begin, AMD Radeon R9 Fury X CrossFire, NVIDIA GeForce GTX 980 Ti SLI, and NVIDIA GeForce GTX TITAN X SLI will compete for the best gameplay experience at 4K resolution. Find out what $1300 to $2000 worth of GPU backbone will buy you. And find out if Fiji really can 4K."
Subject: Graphics Cards | September 28, 2015 - 08:45 PM | Jeremy Hellstrom
Tagged: R9 Fury, asus strix r9 fury, r9 390x, GTX 980, crossfire, sli, 4k
Bring your wallets to this review from [H]ard|OCP, which pits multiple AMD and NVIDIA GPUs against each other at 4K resolution; no matter the outcome, it won't be cheap! They used the Catalyst 15.8 Beta and GeForce 355.82 WHQL drivers, the latest available at the time of writing, running on Windows 10 Pro x64. There were some interesting results: for instance, you want an AMD card when driving in the rain in Project Cars, as the GTX 980s immediately slowed down in inclement weather. In The Witcher 3, AMD again provided frames faster, but unfortunately the old spectre of stuttering appeared, and those of you familiar with our Frame Rating tests will understand its source. Dying Light proved to be a game that likes VRAM, with the 390X taking the top spot, though sadly neither AMD card could handle CrossFire in Far Cry 4. There is a lot of interesting information in the review, and AMD's cards certainly show their mettle, but the overall winner is not perfectly clear; [H] chose the R9 Fury, with a caveat about CrossFire support.
"We gear up for multi-GPU gaming with AMD Radeon R9 Fury CrossFire, NVIDIA GeForce GTX 980 SLI, and AMD Radeon R9 390X CrossFire and share our head-to-head results at 4K resolution and find out which solution offers the best gameplay experience. How well does Fiji game when utilized in a CrossFire configuration?"
Subject: General Tech | July 16, 2015 - 06:04 PM | Ken Addison
Tagged: podcast, video, amd, Fury, fury x, sli, crossfire, windows 10, 10240, corsair, RM850i, IBM, 7nm, kaby lake, Skylake, Intel, 14nm, 10nm
PC Perspective Podcast #358 - 07/16/2015
Join us this week as we discuss the AMD R9 Fury, Fury X Multi-GPU, Windows 10 and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 0:54:27
Week in Review:
News item of interest:
Hardware/Software Picks of the Week:
Josh: Saved me this week
SLI and CrossFire
Last week I sat down with a set of three AMD Radeon R9 Fury X cards, our sampled review card as well as two retail cards purchased from Newegg, to see how the reports of pump whine noise from the cards were shaping up. I'm not going to dive into that debate again here, as I think we have covered it pretty well thus far in that story as well as on our various podcasts, but rest assured we are continuing to look into revisions of the Fury X to see if AMD and Cooler Master were actually able to fix the issue.
What we have to cover today is something very different, and likely much more interesting for a wider range of users. When you have three AMD Fury X cards in your hands, you of course have to do some multi-GPU testing with them. With our set I was able to run both 2-Way and 3-Way CrossFire with the new AMD flagship card and compare them directly to the comparable NVIDIA offering, the GeForce GTX 980 Ti.
There isn't much else I need to do to build up this story, is there? If you are curious how well the new AMD Fury X scales in CrossFire with two and even three GPUs, this is where you'll find your answers.
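Scaling results like the ones in the full article are usually summarized as efficiency: the multi-GPU frame rate divided by the single-GPU frame rate times the GPU count. A small helper shows the idea (the frame rates below are hypothetical, not figures from our testing):

```python
def scaling_efficiency(single_gpu_fps: float, multi_gpu_fps: float, num_gpus: int) -> float:
    """Fraction of ideal linear scaling achieved by a multi-GPU setup."""
    return multi_gpu_fps / (single_gpu_fps * num_gpus)

# Hypothetical example: 40 fps on one card, 72 fps on two cards
print(scaling_efficiency(40.0, 72.0, 2))  # 0.9, i.e. 90% of perfect 2-way scaling
```

Anything close to 1.0 means the second (or third) GPU is pulling its full weight; values well below it point to CPU limits, driver overhead, or poor game support.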
Subject: Motherboards | May 22, 2015 - 03:34 AM | Josh Walrath
Tagged: msi, amd, 990fx, FX-8370, FX-9590, sli, crossfire, SoundBlaster, killer nic, usb 3.1
Several weeks ago MSI officially announced the 990FXA-Gaming motherboard for the AM3+ market. The board is based on the tried and true 990FX and SB950 combo, but it adds a new wrinkle to the game: USB 3.1 support. MSI also released the other AMD-based USB 3.1 board on the market, the 970 Krait.
Quite a few people were excited about this part, as the AM3+ market has been pretty stagnant as of late. This is not necessarily surprising, considering that AMD has not launched a new AM3+ chip since the Fall of 2014, when it released a couple of "efficiency" chips along with the slightly faster FX-8370.
There was some speculation based on early photographs that the board could have a more robust power delivery system than previous AM3+ boards, but alas, that is not the case. Upon closer inspection it appears that MSI has gone the 6+2 phase route. With good quality components in there, you can potentially run the 220 watt TDP FX-9000 series parts, but these puppies are not officially supported. In fact, I received an email suggesting that I be really careful in my choice of CPUs and extremely careful when overclocking.
The board still has real potential as a really nice home for the 125 watt TDP and below parts. The audio portion looks very well designed and features SoundBlaster Cinema 2. It supports both SLI and CrossFire in a native x16/x16 configuration (three cards are highly doubtful given the way the slots are arranged). It has the Killer NIC ethernet suite, which may or may not be a selling point, depending on who you ask.
Overall the board is an interesting addition to the club, but I really wouldn't trust it with the FX-9000 series chips. That said, I have a 970 Gaming with a similar power delivery system that came with the FX-9590, and it ran like a champ, so there is a possibility this board will handle the combination. It is getting installed this weekend and I will start the benchmarking. Stay tuned!
Subject: General Tech | May 6, 2015 - 07:46 PM | Jeremy Hellstrom
Tagged: gaming, GTAV, 1440p, 4k, crossfire, sli
Last week [H]ard|OCP investigated the performance of GTAV on single GPU systems, and this week comes the promised follow-up featuring CrossFire and SLI, including the expensive Titan X. With these high-end setups they tested 1440p and 4K performance, as running these GPUs at 1080p is a crime against silicon. At 1440p, GTX 980s in SLI could handle more than a single Titan X, though nowhere near what that card managed in SLI, while on the AMD side the R9 295X2 could keep up frame-wise, but at the cost of some graphical extras. At 4K, not even the mighty Titans could run with every graphics option turned up, though they certainly provided the best performance. AMD's GPUs lagged behind in raw performance; in scaling, however, they were significantly better than NVIDIA's offerings, though there is still room for improvement. The real battle is at the $650 mark, where you can choose between a pair of GTX 970s or a single R9 295X2, which offer relatively similar performance, but if you want the most out of GTAV you are going to have to pay much more than that.
You should also make sure to free up your calendar at the end of the month, as the Fragging Frogs Virtual LAN Party #10 is coming up on the 30th!
"This is Part 2 of our full evaluation of Grand Theft Auto V's video card gaming performance. In this part we dive into NVIDIA SLI and AMD CrossFire highest playable gameplay settings and apples-to-apples at 1440p and 4K resolutions. We find out just what it takes to get the most out of GTA V at its highest settings."
Here is some more Tech News from around the web: