Subject: Motherboards | May 21, 2015 - 11:34 PM | Josh Walrath
Tagged: msi, amd, 990fx, FX-8370, FX-9590, sli, crossfire, SoundBlaster, killer nic, usb 3.1
Several weeks ago MSI officially announced the 990FXA-Gaming motherboard for the AM3+ market. The board is based on the tried and true 990FX and SB950 combo, but it adds a new wrinkle to the game: USB 3.1 support. MSI also released the only other AMD-based USB 3.1 board on the market, the 970 Krait.
Quite a few people were excited about this part, as the AM3+ market has been pretty stagnant as of late. This is not necessarily surprising considering that AMD has not launched a new AM3+ chip since Fall of 2014 with a couple of "efficiency" chips as well as the slightly faster FX-8370.
There was some speculation based on early photographs that the board could have a more robust power delivery system than previous AM3+ boards, but alas, that is not the case. Upon closer inspection it appears as though MSI has gone the 6+2 phase route. If there are good quality components in there, you can potentially run the 220 watt TDP FX-9000 series parts, but these puppies are not officially supported. In fact, I received an email saying that I might want to be really careful in my choice of CPUs and extremely careful when overclocking.
The board still has real potential as a really nice home for the 125 watt TDP and below parts. The audio portion looks very well designed and features SoundBlaster Cinema 2. It supports both SLI and CrossFire in native dual x16 mode (three-card setups look highly doubtful given the way the slots are configured). It has the Killer NIC ethernet suite, which may or may not be a selling point, depending on who you ask.
Overall the board is an interesting addition to the club, but I really wouldn't trust it with the FX-9000 series chips. That said, I have a 970 Gaming with a similar power delivery system that came with the FX-9590, and it ran like a champ, so there is a possibility that this board will handle the combination. It is going to be installed this weekend and I will start the benchmarking. Stay tuned!
Subject: General Tech | May 6, 2015 - 03:46 PM | Jeremy Hellstrom
Tagged: gaming, GTAV, 1440p, 4k, crossfire, sli
Last week [H]ard|OCP investigated the performance of GTAV on single-GPU systems, and this week comes the promised follow-up featuring CrossFire and SLI, including the expensive Titan X. With these high-end setups they tested 1440p and 4K performance, as running these GPUs at 1080p is a crime against silicon. At 1440p, the GTX 980 in SLI could handle more than a single Titan X, though nowhere near what that card managed in SLI, while on the AMD side the R9 295X2 could keep up frame-wise, but at the cost of some graphical extras. At 4K, not even the mighty Titans could run with all graphics options turned up, though they certainly provided the best performance. AMD's GPUs lagged behind in raw performance, but their scaling was significantly better than NVIDIA's offerings, though there is still some room for improvement. The real battle is at the $650 mark, where you can choose between a pair of GTX 970s or a single R9 295X2; they offer relatively similar performance, but if you want the most out of GTAV you are going to have to pay much more than that.
You should also make sure to free up your calendar at the end of the month, as the Fragging Frogs Virtual LAN Party #10 is coming up on the 30th!
"This is Part 2 of our full evaluation of Grand Theft Auto V's video card gaming performance. In this part we dive into NVIDIA SLI and AMD CrossFire highest playable gameplay settings and apples-to-apples at 1440p and 4K resolutions. We find out just what it takes to get the most out of GTA V at its highest settings."
Here is some more Tech News from around the web:
Subject: Motherboards | May 6, 2015 - 10:21 AM | Josh Walrath
Tagged: usb 3.1, sli, piledriver, msi, gaming, crossfire, amd, am3+, 990FXA-Gaming, 990fx
Some months ago MSI announced that it was releasing a slew of USB 3.1 compliant parts. What was surprising was the mention of a brand new AM3+ board based on the now nearly geriatric AMD 990FX chipset. The 990FX has had quite a lifespan with PCI-E 2.0 support and the accompanying SB950 southbridge with USB 2.0 and SATA 6G features.
It looks as if MSI is doing a clean sheet design for the 990FXA-Gaming. This looks to be a class leading product with plenty of features. Not only does it have the USB 3.1 support, but it also implements the enhanced audio design that we have seen on other top end boards from MSI. It also embraces the Killer ethernet software suite (utilizing Qualcomm's Atheros Gig-E chip).
The power delivery system looks to be a full 8+2 unit, so it can officially handle the 220 watt TDP FX-9000 series of CPUs. It supports both SLI and CrossFire. The cooling on the board looks to be top notch as well, with a heatpipe stretching from the Southbridge, through the Northbridge, and finally to the VRMs.
We expect these boards to be available sometime around the middle of this month. We should also be receiving a sample for testing around then. It is nice to see new support for AMD's FX CPUs, and this should be a cost effective member of the club. Though AM3+ is a dead end in terms of socket infrastructure, there is still a lot of value in AMD's FX CPU line.
There is no word on pricing at this time, but I would not be surprised to see it hit the $149 mark. It does not seem as decked out as the 990FXA-GD80 which is priced around $179. With the robust featureset that they do implement, it does look to be a value if it can hit that aggressive price point.
Quick Performance Comparison
Earlier this week, we posted a brief story that looked at the performance of Middle-earth: Shadow of Mordor on the latest GPUs from both NVIDIA and AMD. Last week also marked the release of the v1.11 patch for Sniper Elite 3 that introduced an integrated benchmark mode as well as support for AMD Mantle.
I decided that this was worth a quick look with the same line up of graphics cards that we used to test Shadow of Mordor. Let's see how the NVIDIA and AMD battle stacks up here.
For those unfamiliar with the Sniper Elite series, it focuses on the impact of an individual sniper on a particular conflict, and Sniper Elite 3 doesn't change that formula much. If you have ever seen video of a bullet slowly passing through a body, letting you see the bones and muscle of the particular enemy being killed...you've probably been watching the Sniper Elite games.
Gore and such aside, the game is fun and combines sniper action with stealth and puzzles. It's worth a shot if you are the kind of gamer that likes to use the sniper rifles in other FPS titles.
But let's jump straight to performance. You'll notice that in this story we are not using our Frame Rating capture performance metrics. That is a direct result of wanting to compare Mantle to DX11 rendering paths: since we have no way to create an overlay for Mantle, we have resorted to using FRAPS and the integrated benchmark mode in Sniper Elite 3.
Our standard GPU test bed was used with a Core i7-3960X processor, an X79 motherboard, 16GB of DDR3 memory, and the latest drivers for both parties involved. That means we installed Catalyst 14.9 for AMD and 344.16 for NVIDIA. We'll be comparing the GeForce GTX 980 to the Radeon R9 290X, and the GTX 970 to the R9 290. We will also look at SLI/CrossFire scaling at the high end.
Subject: Graphics Cards | July 29, 2014 - 02:27 PM | Jeremy Hellstrom
Tagged: asus, gtx 780, R9 290X DC2 OC, sli, crossfire, STRIX GTX 780 OC 6GB, R9 290X
We have seen [H]ard|OCP test ASUS' STRIX GTX 780 OC 6GB and R9 290X DirectCU II before, but this time they have been overclocked and paired up for a 4K showdown. For a change, NewEgg gives the price advantage to AMD, $589 versus $599 at the time of writing (with odd blips in prices on Amazon). The GTX 780 has been set to 1.2GHz core and 6.6GHz memory, while the 290X runs 1.1GHz core and 5.6GHz memory; keep in mind dual-GPU setups may not reach the same frequencies as single cards. Read on for their conclusions and decide if you prefer to brag about a higher overclock or have better overall performance.
"We take the ASUS STRIX GTX 780 OC 6GB video card and run two in SLI and overclock both of these at 4K resolutions to find the ultimate gameplay performance with 6GB of VRAM. We will also compare these to two overclocked ASUS Radeon R9 290X DirectCU II CrossFire video cards for the ultimate VRAM performance showdown."
Here are some more Graphics Card articles from around the web:
- ASUS GTX 780 STRIX OC 6GB Review @ Hardware Canucks
- Gigabyte GeForce GTX 770 (GV-N770OC-2GD) vs. ASUS Matrix Platinum (R9280X-P-3GD5) Video Card Review @ Hardware Secrets
- Ice-cold Temperature Killa: Arctic Accelero Hybrid II-120 GPU Cooler Review @ Techgage
- Examining AMD’s Driver Progress Since Launch Drivers: R9 290X & HD 7970 @ eTeknix
- HIS Radeon R7 250X and 260X iCooler @ Funky Kit
- Sapphire Dual-X R9 280 3GB OC Video Card Review @ Legit Reviews
- AMD Radeon R9 280X Round-up @ Legion Hardware
- HIS R9 280 IceQ X2 OC 3GB GDDR5 Video Card Review @ Madshrimps
- Sapphire Radeon R9 290 Vapor-X 4 GB @ techPowerUp
Subject: Graphics Cards | July 4, 2014 - 01:40 PM | Jeremy Hellstrom
Tagged: STRIX GTX 780 OC 6GB, sli, crossfire, asus, 4k
Multiple monitor and 4K testing of the ASUS STRIX GTX 780 OC cards in SLI is not about the 52MHz out-of-box overclock but about the 12GB of VRAM that your system will have. Apart from an issue with BF4, [H]ard|OCP tested the STRIX against a pair of reference GTX 780s and R9 290X cards at resolutions of 5760x1200 and 3840x2160. The extra RAM made the STRIX shine in comparison to the reference card: not only was performance better, but [H] could raise many of the graphical settings. It was not enough, however, to push its performance past the 290X cards in CrossFire. One other takeaway from this review is that even 6GB of VRAM is not enough to run Watch_Dogs with Ultra textures at these resolutions.
"You’ve seen the new ASUS STRIX GTX 780 OC Edition 6GB DirectCU II video card, now let’s look at two of these in an SLI configuration! We will explore 4K and NV Surround performance with two ASUS STRIX video cards for the ultimate high-resolution experience and see if the extra memory helps this GPU make better strides at high resolutions."
Here are some more Graphics Card articles from around the web:
- ASUS GTX 780 Strix 6 GB @ techPowerUp
- MSI GTX 780 Gaming 6 GB @ techPowerUp
- HIS R7 260X iCooler 2GB GDDR5 Video Card Review @ Madshrimps
- XFX Radeon R9 290X Double Dissipation 4GB @ eTeknix
- PowerColor Devil 13 Dual Core R9 290X 8GB Review @ OCC
- PowerColor Devil 13 R9 290X Dual Core Review @ Hardware Canucks
- XFX R9 280 Black OC Edition @ Kitguru
- HIS Radeon R9 280 IceQ X² OC 3GB @ Benchmark Reviews
- ASUS R7 260X DirectCU II OC @ [H]ard|OCP
Subject: General Tech | April 29, 2014 - 06:29 PM | Jeremy Hellstrom
Tagged: 4k, amd, crossfire, quad crossfire, r9 295x2, radeon, video
Ryan isn't the only crazy one out there stringing two PSUs together to power a pair of AMD's massively powerful 295X2s in CrossFire; the gang at [H]ard|OCP did as well, after taking the mickey out of a certain Brian. As with Ryan's experiment, they required a second PSU, in this case a 1350W plus an 850W, in order to stop the rig from crashing. Their test components also differed somewhat: a Maximus V Extreme instead of a P9X79 Deluxe, slightly different RAM, and Windows 8.1 installed on their SSD. The other reason to check them out is the Eyefinity 5760x1200 tests in addition to the 4K tests.
"Got extra PCIe slots and have no idea what in the world you can do with those? Well if you have $3000 burning a hole in your pocket, wiring in your house that is up to code, a good air conditioning system, and a Type C fire extinguisher that you are not using, AMD's Radeon R9 295X2 QuadFire may be just what the fire marshal ordered."
Here are some more Graphics Card articles from around the web:
- Custom-cooled Radeon R9 290X cards from Asus and XFX @ The Tech Report
- Sapphire Vapor-X R9 290 Tri-X OC Video Card Review @ Legit Reviews
- MSI Radeon R9 290X Lightning 4 GB @ techPowerUp
- Sapphire R9 280X Vapor-X (Tri-X) OC 3GB @ eTeknix
- XFX Radeon R7 250 Core Edition Video Card Review @ Hardware Secrets
- GeForce 700 vs. Radeon Rx 200 Series With The Latest Linux Drivers @ Phoronix
- 13-Way Low-End GPU Comparison With AMD's AM1 Athlon @ Phoronix
- EVGA Backplate Install for the GTX 780 Ti Classified @ Hardware Asylum
- Gigabyte GTX 750 Ti WindForce 2X OC 2GB @ eTeknix
- ASUS GTX 750 Ti OC 2GB @ eTeknix
- TKFA2 GTX 750 Ti OC 2GB @ eTeknix
You need a bit of power for this
PC gamers. We do some dumb shit sometimes. Those on the outside looking in, forced to play on static hardware with fixed image quality and low expandability, turn up their noses and question why we do the things we do. It's not an unfair reaction; they just don't know what they are missing out on.
For example, what if you decided to upgrade your graphics hardware to improve performance and allow you to up the image quality on your games to unheard of levels? Rather than using a graphics configuration with performance found in a modern APU, you could decide to run not one but FOUR discrete GPUs in a single machine. You could water cool them for optimal temperature and sound levels. This allows you to power not 1920x1080 (or 900p), not 2560x1440, but 4K gaming – 3840x2160.
All for the low, low price of $3000. Well, crap, I guess those console gamers have a right to question the sanity of SOME enthusiasts.
After the release of AMD’s latest flagship graphics card, the Radeon R9 295X2 8GB dual-GPU beast, our mind immediately started to wander to what magic could happen (and what might go wrong) if you combined a pair of them in a single system. Sure, two Hawaii GPUs running in tandem produced the “fastest gaming graphics card you can buy” but surely four GPUs would be even better.
The truth is though, that isn’t always the case. Multi-GPU is hard, just ask AMD or NVIDIA. The software and hardware demands placed on the driver team to coordinate data sharing, timing control, etc. are extremely high even when you are working with just two GPUs in series. Moving to three or four GPUs complicates the story even further and as a result it has been typical for us to note low performance scaling, increased frame time jitter and stutter and sometimes even complete incompatibility.
During our initial briefing covering the Radeon R9 295X2 with AMD there was a system photo that showed a pair of the cards inside a MAINGEAR box. As one of AMD’s biggest system builder partners, MAINGEAR and AMD were clearly insinuating that these configurations would be made available for those with the financial resources to pay for it. Even though we are talking about a very small subset of the PC gaming enthusiast base, these kinds of halo products are what bring PC gamers together to look and drool.
As it happens I was able to get a second R9 295X2 sample in our offices for a couple of quick days of testing.
Working with Kyle and Brent over at HardOCP, we decided to do some hardware sharing in order to give both outlets the ability to judge and measure Quad CrossFire independently. The results are impressive and awe inspiring.
A Powerful Architecture
AMD likes to toot its own horn. Just take a look at the not-so-subtle marketing buildup to the Radeon R9 295X2 dual-Hawaii graphics card, released today. I had photos of me shipped to…me…overnight. My hotel room at GDC was also given a package which included a pair of small Pringles cans (chips) and a bottle of volcanic water. You may have also seen some photos posted of a mysterious briefcase with its side stickered with the silhouette of a Radeon add-in board.
This tooting is not without some validity though. The Radeon R9 295X2 is easily the fastest graphics card we have ever tested and that says a lot based on the last 24 months of hardware releases. It’s big, it comes with an integrated water cooler, and it requires some pretty damn specific power supply specifications. But AMD did not compromise on the R9 295X2 and, for that, I am sure that many enthusiasts will be elated. Get your wallets ready, though, this puppy will run you $1499.
Both AMD and NVIDIA have a history of producing high quality dual-GPU graphics cards late in the product life cycle. The most recent entry from AMD was the Radeon HD 7990, a pair of Tahiti GPUs on a single PCB with a triple fan cooler. While a solid performing card, the product was released in a time when AMD CrossFire technology was well behind the curve and, as a result, real-world performance suffered considerably. By the time the drivers and ecosystem were fixed, the HD 7990 was more or less on the way out. It was also notorious for some intermittent, but severe, overheating issues, documented by Tom’s Hardware in one of the most harshly titled articles I’ve ever read. (Hey, Game of Thrones started again this week!)
The Hawaii GPU, first revealed back in September and selling today under the guise of the R9 290X and R9 290 products, is even more power hungry than Tahiti. Many in the industry doubted that AMD would ever release a dual-GPU product based on Hawaii as the power and thermal requirements would be just too high. AMD has worked around many of these issues with a custom water cooler and placing specific power supply requirements on buyers. Still, all without compromising on performance. This is the real McCoy.
BF4 Integrates FCAT Overlay Support
Back in September AMD publicly announced Mantle, a new lower level API meant to offer more performance for gamers and more control for developers fed up with the restrictions of DirectX. Without diving too much into the politics of the release, the fact that Battlefield 4 developer DICE was integrating Mantle into the Frostbite engine for Battlefield was a huge proof point for the technology. Even though the release was a bit later than AMD had promised us, coming at the end of January 2014, one of the biggest PC games on the market today had integrated a proprietary AMD API.
When I did my first performance preview of BF4 with Mantle on February 1st, the results were mixed, but we had other issues to deal with. First and foremost, our primary graphics testing methodology, called Frame Rating, couldn't be used due to the change of API. Instead we were forced to use an in-game frame rate counter built by DICE, which worked fine but didn't give us the fine-grained data we really wanted to put the platform to the test. It worked, but we wanted more. Today we are happy to announce we have full support for our Frame Rating and FCAT testing with BF4 running under Mantle.
A History of Frame Rating
In late 2012 and throughout 2013, testing graphics cards became a much more complicated beast. Terms like frame pacing, stutter, jitter and runts were not in the vocabulary of most enthusiasts but became an important part of the story just about one year ago. Though complicated to fully explain, the basics are pretty simple.
Rather than using software on the machine being tested to measure performance, our Frame Rating system uses a combination of local software and external capture hardware. On the local system with the hardware being evaluated, we run a small piece of software called an overlay that draws small colored bars on the left hand side of the game screen, changing color successively with each frame rendered by the game. Using a secondary system, we capture the output from the graphics card directly, intercepting it from the display output in real time in an uncompressed form. With that video file captured, we then analyze it frame by frame, measuring the length of each of those colored bars, how long they are on the screen, and how consistently they are displayed. This allows us to find the average frame rate, but also to determine how smoothly frames are presented, whether there are dropped frames, and whether there are jitter or stutter issues.
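To make the analysis step concrete, here is a minimal sketch of the idea in Python. It is not PC Perspective's actual tooling; the function names, the 60 Hz capture rate, and the input format (a list of left-edge overlay colors per captured frame) are all assumptions for illustration. The sketch collapses the left-edge scanlines of one captured frame into colored-bar runs, then converts each bar's height into the slice of screen time that rendered frame received.

```python
CAPTURE_HZ = 60  # assumed capture refresh rate of the secondary system

def bar_runs(left_edge_colors):
    """Collapse the left-edge scanline colors of one captured frame into
    (color, scanline_count) runs -- one run per rendered-frame bar."""
    runs = []
    for color in left_edge_colors:
        if runs and runs[-1][0] == color:
            runs[-1][1] += 1        # same bar continues down the screen
        else:
            runs.append([color, 1])  # a new overlay color: a new rendered frame
    return [(c, n) for c, n in runs]

def frame_times_ms(left_edge_colors, total_scanlines):
    """Convert each bar's scanline count into the time (in ms) that the
    corresponding rendered frame was visible within this captured frame."""
    capture_ms = 1000.0 / CAPTURE_HZ
    return [(color, count / total_scanlines * capture_ms)
            for color, count in bar_runs(left_edge_colors)]

# Example: a 1080-line captured frame whose left edge shows three bars,
# i.e. three rendered frames shared this one scanout.
edge = ["lime"] * 270 + ["aqua"] * 540 + ["red"] * 270
print(frame_times_ms(edge, len(edge)))
```

A very short bar in this output would correspond to a runt frame, and large swings in per-frame times from capture to capture would show up as the stutter and jitter described above.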