Subject: General Tech | April 29, 2014 - 06:29 PM | Jeremy Hellstrom
Tagged: 4k, amd, crossfire, quad crossfire, r9 295x2, radeon, video
Ryan isn't the only crazy one out there stringing two PSUs together to power a pair of AMD's massively powerful 295X2s in CrossFire; the gang at [H]ard|OCP did as well after taking the mickey out of a certain Brian. As with Ryan's experiment, they required a second PSU, in this case a 1350W plus an 850W, to stop the rig from crashing. Their test components also differed somewhat: a Maximus V Extreme instead of a P9X79 Deluxe, slightly different RAM, and Windows 8.1 installed on their SSD. The other reason to check them out is the Eyefinity 5760x1200 tests in addition to the 4K tests.
"Got extra PCIe slots and have no idea what in the world you can do with those? Well if you have $3000 burning a hole in your pocket, wiring in your house that is up to code, a good air conditioning system, and a Type C fire extinguisher that you are not using, AMD's Radeon R9 295X2 QuadFire may be just what the fire marshal ordered."
Here are some more Graphics Card articles from around the web:
- Custom-cooled Radeon R9 290X cards from Asus and XFX @ The Tech Report
- Sapphire Vapor-X R9 290 Tri-X OC Video Card Review @ Legit Reviews
- MSI Radeon R9 290X Lightning 4 GB @ techPowerUp
- Sapphire R9 280X Vapor-X (Tri-X) OC 3GB @ eTeknix
- XFX Radeon R7 250 Core Edition Video Card Review @ Hardware Secrets
- GeForce 700 vs. Radeon Rx 200 Series With The Latest Linux Drivers @ Phoronix
- 13-Way Low-End GPU Comparison With AMD's AM1 Athlon @ Phoronix
- EVGA Backplate Install for the GTX 780 Ti Classified @ Hardware Asylum
- Gigabyte GTX 750 Ti WindForce 2X OC 2GB @ eTeknix
- ASUS GTX 750 Ti OC 2GB @ eTeknix
- KFA2 GTX 750 Ti OC 2GB @ eTeknix
You need a bit of power for this
PC gamers. We do some dumb shit sometimes. Those on the outside looking in, forced to play on static hardware with fixed image quality and low expandability, turn up their noses and question why we do the things we do. It's not an unfair reaction; they just don't know what they are missing out on.
For example, what if you decided to upgrade your graphics hardware to improve performance and allow you to up the image quality on your games to unheard of levels? Rather than using a graphics configuration with performance found in a modern APU, you could decide to run not one but FOUR discrete GPUs in a single machine. You could water cool them for optimal temperature and sound levels. This lets you drive not 1920x1080 (or 900p), not 2560x1440, but 4K gaming – 3840x2160.
All for the low, low price of $3000. Well, crap, I guess those console gamers have a right to question the sanity of SOME enthusiasts.
After the release of AMD’s latest flagship graphics card, the Radeon R9 295X2 8GB dual-GPU beast, our minds immediately started to wander to what magic could happen (and what might go wrong) if you combined a pair of them in a single system. Sure, two Hawaii GPUs running in tandem produced the “fastest gaming graphics card you can buy,” but surely four GPUs would be even better.
The truth is, though, that isn’t always the case. Multi-GPU is hard; just ask AMD or NVIDIA. The software and hardware demands placed on the driver team to coordinate data sharing, timing control, etc. are extremely high even when you are working with just two GPUs in tandem. Moving to three or four GPUs complicates the story even further, and as a result we have typically noted poor performance scaling, increased frame time jitter and stutter, and sometimes even complete incompatibility.
During our initial briefing covering the Radeon R9 295X2 with AMD there was a system photo that showed a pair of the cards inside a MAINGEAR box. As one of AMD’s biggest system builder partners, MAINGEAR and AMD were clearly insinuating that these configurations would be made available for those with the financial resources to pay for it. Even though we are talking about a very small subset of the PC gaming enthusiast base, these kinds of halo products are what bring PC gamers together to look and drool.
As it happens I was able to get a second R9 295X2 sample in our offices for a couple of quick days of testing.
Working with Kyle and Brent over at HardOCP, we decided to do some hardware sharing in order to give both outlets the ability to judge and measure Quad CrossFire independently. The results are impressive and awe inspiring.
A Powerful Architecture
AMD likes to toot its own horn. Just take a look at the not-so-subtle marketing buildup to the Radeon R9 295X2 dual-Hawaii graphics card, released today. I had photos of me shipped to…me…overnight. My hotel room at GDC was also given a package which included a pair of small Pringles cans (chips) and a bottle of volcanic water. You may have also seen some photos posted of a mysterious briefcase with its side stickered with the silhouette of a Radeon add-in board.
This tooting is not without some validity though. The Radeon R9 295X2 is easily the fastest graphics card we have ever tested and that says a lot based on the last 24 months of hardware releases. It’s big, it comes with an integrated water cooler, and it requires some pretty damn specific power supply specifications. But AMD did not compromise on the R9 295X2 and, for that, I am sure that many enthusiasts will be elated. Get your wallets ready, though, this puppy will run you $1499.
Both AMD and NVIDIA have a history of producing high quality dual-GPU graphics cards late in the product life cycle. The most recent entry from AMD was the Radeon HD 7990, a pair of Tahiti GPUs on a single PCB with a triple fan cooler. While a solid performing card, the product was released in a time when AMD CrossFire technology was well behind the curve and, as a result, real-world performance suffered considerably. By the time the drivers and ecosystem were fixed, the HD 7990 was more or less on the way out. It was also notorious for some intermittent, but severe, overheating issues, documented by Tom’s Hardware in one of the most harshly titled articles I’ve ever read. (Hey, Game of Thrones started again this week!)
The Hawaii GPU, first revealed back in September and selling today under the guise of the R9 290X and R9 290 products, is even more power hungry than Tahiti. Many in the industry doubted that AMD would ever release a dual-GPU product based on Hawaii, as the power and thermal requirements would be just too high. AMD has worked around many of these issues with a custom water cooler and by placing specific power supply requirements on buyers, all without compromising on performance. This is the real McCoy.
BF4 Integrates FCAT Overlay Support
Back in September AMD publicly announced Mantle, a new lower level API meant to offer more performance for gamers and more control for developers fed up with the restrictions of DirectX. Without diving too much into the politics of the release, the fact that Battlefield 4 developer DICE was integrating Mantle into the Frostbite engine for Battlefield was a huge proof point for the technology. Even though the release was a bit later than AMD had promised us, coming at the end of January 2014, one of the biggest PC games on the market today had integrated a proprietary AMD API.
When I did my first performance preview of BF4 with Mantle on February 1st, the results were mixed, but we had other issues to deal with. First and foremost, our primary graphics testing methodology, called Frame Rating, couldn't be used because of the API change. Instead we were forced to use an in-game frame rate counter built by DICE, which worked fine but didn't give us the fine-grained data we really wanted to put the platform to the test. It worked, but we wanted more. Today we are happy to announce we have full support for our Frame Rating and FCAT testing with BF4 running under Mantle.
A History of Frame Rating
In late 2012 and throughout 2013, testing graphics cards became a much more complicated beast. Terms like frame pacing, stutter, jitter and runts were not in the vocabulary of most enthusiasts but became an important part of the story just about one year ago. Though complicated to fully explain, the basics are pretty simple.
Rather than using software on the machine being tested to measure performance, our Frame Rating system uses a combination of local software and external capture hardware. On the local system with the hardware being evaluated, we run a small piece of software called an overlay that draws small colored bars on the left-hand side of the game screen, which change successively with each frame rendered by the game. Using a secondary system, we capture the output from the graphics card directly, intercepting it from the display output in real time in an uncompressed form. With that video file captured, we then analyze it frame by frame, measuring the length of each of those colored bars: how long they are on the screen and how consistently they are displayed. This allows us to find the average frame rate, but also how smoothly the frames are presented, whether there are dropped frames, and whether there are jitter or stutter issues.
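The color-counting step can be sketched in a few lines. This is a hypothetical simplification, not our actual tooling, and it works at whole-refresh granularity (the real analysis measures bar height in scanlines, which is how runts are caught): given the overlay color visible on each captured refresh and the number of colors the overlay cycles through, it recovers how long each rendered frame stayed on screen and counts frames that never reached the display.

```python
# Hypothetical sketch of the Frame Rating analysis step (not the actual
# tooling): each captured display refresh reports the overlay color index
# visible at that moment; runs of the same color mean one frame persisted,
# and a jump of more than one palette step means frames were dropped.

CAPTURE_HZ = 60  # assumed 60 Hz capture of the display output

def frame_times(observed_colors, palette):
    """Return (times_ms, dropped): display time of each frame that reached
    the screen, and the count of frames that were dropped entirely."""
    refresh_ms = 1000.0 / CAPTURE_HZ
    times_ms, dropped = [], 0
    prev, run = observed_colors[0], 1
    for color in observed_colors[1:]:
        if color == prev:
            run += 1  # same frame still on screen for another refresh
        else:
            times_ms.append(run * refresh_ms)
            # skipping palette steps means rendered frames never displayed
            dropped += (color - prev) % palette - 1
            prev, run = color, 1
    times_ms.append(run * refresh_ms)
    return times_ms, dropped

# Example: 16-color overlay; frame 0 persists two refreshes, frame 2 is dropped
times, dropped = frame_times([0, 0, 1, 3, 4], palette=16)
print(times, dropped)
```

From the per-frame times you can then compute the average frame rate and the frame-to-frame variance that shows up to the user as stutter.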
Subject: General Tech, Graphics Cards | February 7, 2014 - 03:54 AM | Scott Michaud
Tagged: sli, crossfire
I will not even call this a thinly-veiled rant. Linus admits it. To make a point, he assembled a $5000 PC running a pair of NVIDIA GeForce 780 Ti GPUs and another pair of AMD Radeon R9 290X graphics cards. While Bitcoin mining would likely utilize all four video cards well enough, games will not. Of course, he did not even mention the former application (thankfully).
Honestly, he's right. One of the reasons why I am excited about OpenCL (and its WebCL companion) is that it simply does not care about devices. Your host code manages the application but, when the jobs get dirty, it enlists help from an available accelerator by telling it to perform a kernel (think of it like function) and share the resulting chunk of memory.
This can be an AMD GPU. This can be an NVIDIA GPU. This can be an x86 CPU. This can be an FPGA. If the host has multiple, independent tasks, it can be several of the above (and in any combination). OpenCL really does not care.
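That host/kernel split can be sketched loosely in Python. This is not the real OpenCL API (which goes through platforms, contexts, and command queues), just the shape of the model Scott describes: the host hands a kernel and a buffer to whatever device happens to be available, and the device's identity is irrelevant to the result.

```python
# Loose sketch of the OpenCL host/kernel model -- NOT the real OpenCL API.
# The host enqueues a kernel (a plain function here) on whichever device is
# available and reads back the resulting buffer; the device type never matters.

class Device:
    """Stand-in for any accelerator: AMD GPU, NVIDIA GPU, x86 CPU, FPGA."""
    def __init__(self, name):
        self.name = name

    def run_kernel(self, kernel, buffer):
        # A real device compiles the kernel and runs one work-item per
        # element in parallel; here we just apply it element-wise.
        return [kernel(gid, x) for gid, x in enumerate(buffer)]

def host_program(devices, data):
    # The kernel body: one work-item per element, as in OpenCL C.
    square = lambda gid, x: x * x
    # Pick any available accelerator -- vendor and device type are irrelevant.
    return devices[0].run_kernel(square, data)

print(host_program([Device("AMD GPU"), Device("x86 CPU")], [1, 2, 3, 4]))
# [1, 4, 9, 16]
```

Swap the first device for any other and the host code, the kernel, and the output are unchanged, which is exactly the indifference to hardware that makes OpenCL appealing.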
To be fair, AMD is very receptive to open platforms. NVIDIA is less so, and they are honest about that, but they conform to standards when doing so benefits their users more than their proprietary alternatives do. I know that point can be taken multiple ways, and several readings will be hotly debated, but I really cannot find the words to narrow it properly.
Despite the fragmentation in features, there is one thing to be proud of as a PC gamer. You may have different experiences depending on the components you purchase.
But, at least you will always have an experience.
Subject: General Tech | December 12, 2013 - 01:35 AM | Ken Addison
Tagged: z87, xfire, video, shield, R9 290X, podcast, pcper, nvidia, litecoin, grid, frame rating, eyefinity, crossfire, amd
PC Perspective Podcast #280 - 12/12/2013
Join us this week as we discuss the NVIDIA GRID Beta, R9 290X Custom Coolers, 2TB SSDs and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Josh Walrath, Allyn Malventano and Scott Michaud
Subject: General Tech, Graphics Cards | December 11, 2013 - 05:58 PM | Scott Michaud
Tagged: frame pacing, frame rating, amd, southern islands, 4k, eyefinity, crossfire, microstutter
The frame pacing issue has been covered at our website for almost a year now. It stems from the original "microstutter" problem, which dates back more than a year before we could quantify it. We use the term "Frame Rating" to denote the testing methodology we now use for our GPU tests.
AMD fared worse in these tests than NVIDIA (although even NVIDIA had some problems in certain configurations). AMD has dedicated a lot of man-hours to the problem, resulting in driver updates for certain scenarios, but CrossFire while utilizing Eyefinity or 4K MST was one area they did not focus on. That issue has been addressed in hardware on Hawaii, and AMD asserted that previous cards will get a software fix soon.
The good news is that we have just received word from AMD that they plan on releasing a beta driver for Southern Islands and earlier GPUs (AMD believes it should work for anything that's not "legacy"). As usual, until it ships anything could change, but it looks good for now.
The beta "frame pacing" driver addressing Crossfire with 4K and Eyefinity, for supported HD-series and Southern Islands-based Rx cards, is expected to be public sometime in January.
Subject: General Tech | November 5, 2013 - 01:22 PM | Jeremy Hellstrom
Tagged: radeon, r9 290, hawaii, crossfire, amd, 290x, powertune
How does all the power of a GTX 780 for a price tag $100 lower sound to you? Honestly, it might sound a little loud: the reference cooler on the R9 290 gets noisy at 50% fan speed, which is what you need to keep this card running flat out. As long as you don't mind the noise, or are willing to wait for custom air or water cooling solutions, there are no negatives to the 290. Frame pacing makes CrossFire much smoother, and it sports the hardware improvements to Eyefinity that improve your experience in 4K and multi-monitor usage. [H]ard|OCP actually uses the word epic just before giving this card a Gold Award; check out their full review here.
Ryan's review, including Frame Rating can be found by clicking here.
"It is time now to look at AMD's Radeon R9 290. This lower-cost R9 290 series video card packs a punch, not only in performance, but also in price. Watch it compete with the GeForce GTX 780, and win while being priced lower. This is the value you have been waiting for with gaming performance."
Here are some more Graphics Card articles from around the web:
- AMD's Radeon R9 290 @ The Tech Report
- AMD Radeon R9 290 @ Legion Hardware
- AMD Radeon R9 290 @ Hardware.info
- AMD Radeon R9 290 4GB Video Card Review @ Legit Reviews
- AMD Radeon R9 290 @ Techspot
- AMD Radeon R9 290 4GB Review @ Hardware Canucks
- AMD R9 290 @ Kitguru
- AMD Radeon R9 290 4 GB @ techPowerUp
- AMD Radeon R9 290X CrossFire @ [H]ard|OCP
- Powercolor R9 290X OC Review @ OCC
- PowerColor R9 290X OC 4 GB @ techPowerUp
- HIS R9 270X IceQ X² Turbo Boost Clock 2GB GDDR5 Video Card Review @ Madshrimps
- AMD Radeon R9 290X 4GB @ eTeknix
- 4K Gaming Showdown – AMD R9 290X & R9 280X Vs Nvidia GTX Titan & GTX 780 @ eTeknix
- AMD Radeon R9 290X vs NVIDIA GeForce GTX 780 at 4K @ Legit Reviews
- Sapphire R9 280X Toxic Review @ OCC
- MSI R9 280X GAMING @ [H]ard|OCP
- VisionTek Radeon R9 280X Video Card Review @ Modders-Inc
- HIS R7 260X iPower IceQ X² 2 GB @ techPowerUp
- Sapphire Radeon R9 270X Vapor-X Video Card Review @ TechwareLabs
- AMD's Radeon Gallium3D Starts Posing A Threat To Catalyst @ Phoronix
- Three GeForce GTX 780 Graphics Cards @ X-bit Labs
- GeForce GTX 770 Graphics Cards Roundup @ X-bit Labs
- Gigabyte GTX 780 WindForce OC @ eTeknix
More of the same for a lot less cash
The week before Halloween, AMD unleashed a trick on the GPU world under the guise of the Radeon R9 290X, the fastest single-GPU graphics card we had tested to date. With a surprising price point of $549, it was able to outperform the GeForce GTX 780 (and the GTX TITAN in most cases) while undercutting the competition's price by $100. Not too bad!
Today's release might be more surprising (and somewhat confusing). The AMD Radeon R9 290 4GB card is based on the same Hawaii GPU with a few fewer compute units (CUs) enabled and an even more aggressive price and performance placement. Seriously, has AMD lost its mind?
Can a card with a $399 price tag cut into the same performance levels as the JUST DROPPED price of $499 for the GeForce GTX 780?? And, if so, what sacrifices are being made by users that adopt it? Why do so many of our introduction sentences end in question marks?
The R9 290 GPU - Hawaii loses a small island
If you are new to the Hawaii GPU and you missed our first review of the Radeon R9 290X from last month, you should probably start back there. The architecture is very similar to that of the HD 7000-series Tahiti GPUs, with some modest changes to improve efficiency; the biggest jump is in raw primitive throughput, from 2 per clock to 4 per clock.
The R9 290 is based on Hawaii though it has four fewer compute units (CUs) than the R9 290X. When I asked AMD if that meant there was one fewer CU per Shader Engine or if they were all removed from a single Engine, they refused to really answer. Instead, several "I'm not allowed to comment on the specific configuration" lines were given. This seems pretty odd as NVIDIA has been upfront about the dual options for its derivative GPU models. Oh well.
A bit of a surprise
Okay, let's cut to the chase here: it's late, we are rushing to get our articles out, and I think you all would rather see our testing results NOW rather than LATER. The first thing you should do is read my review of the AMD Radeon R9 290X 4GB Hawaii graphics card which goes over the new architecture, new feature set, and performance in single card configurations.
Then, you should continue reading below to find out how the new XDMA, bridge-less CrossFire implementation actually works in both single panel and 4K (tiled) configurations.
A New CrossFire For a New Generation
CrossFire has caused a lot of problems for AMD in recent months (and a lot of problems for me as well). But, AMD continues to make strides in correcting the frame pacing issues associated with CrossFire configurations and the new R9 290X moves the bar forward.
Without the CrossFire bridge connector on the 290X, all of the CrossFire communication and data transfer occurs over the PCI Express bus that connects the cards to the entire system. AMD claims that this new XDMA interface was designed for Eyefinity and UltraHD resolutions (the subject of our most recent article). By accessing the memory of the GPU through PCIe, AMD claims it can alleviate the bandwidth and sync issues that were causing problems with Eyefinity and tiled 4K displays.
Even better, this updated version of CrossFire is said to be compatible with the frame pacing updates to the Catalyst driver to improve multi-GPU performance experiences for end users.
When an extra R9 290X accidentally fell into my lap, I decided to take it for a spin. And if you have followed my graphics testing methodology over the past year, then you'll understand the importance of these tests.