The Radeon R9 280
Though not really new, the AMD Radeon R9 280 GPU is a part that we really haven't spent time with at PC Perspective. Based on the same Tahiti GPU found in the R9 280X, the HD 7970, the HD 7950 and others, the R9 280 fits at a price point and performance level that I think many gamers will see as enticing. MSI sent along a model that includes some overclocked settings and an updated cooler, allowing the GPU to run at its top speed without much noise.
With a starting price of just $229 or so, the MSI Radeon R9 280 Gaming graphics card has some interesting competition as well. From the AMD side it butts heads with the R9 280X and the R9 270X. The R9 280X costs $60-70 more, though, and as you'll see in our benchmarks, the R9 280 will likely cannibalize some of those sales. From NVIDIA, the GeForce GTX 760 is priced right at $229 as well, but does it really have the horsepower to keep up with Tahiti?
Subject: Graphics Cards | June 19, 2014 - 10:35 AM | Ryan Shrout
Tagged: video, richard huddy, radeon, openworks, Mantle, freesync, amd
On Tuesday, AMD's newly minted Gaming Scientist, Richard Huddy, stopped by the PC Perspective office to talk about the current state of the company's graphics division. The entire video of the interview is embedded below but several of the points that are made are quite interesting and newsworthy. During the discussion we hear about Mantle on Linux, a timeline for Mantle being opened publicly as well as a surprising new idea for a competitor to NVIDIA's GameWorks program.
Richard is new to the company but not new to the industry, starting with 3DLabs many years ago and taking jobs at NVIDIA, ATI, Intel and now returning to AMD. The role of Gaming Scientist is to directly interface with the software developers for gaming and make sure that the GPU hardware designers are working hand in hand with future, high end graphics technology. In essence, Huddy's job is to make sure AMD continues to innovate on the hardware side to facilitate innovation on the software side.
AMD Planning an "OpenWorks" Program
(33:00) After the volume of discussion surrounding the NVIDIA GameWorks program and its potential to harm the gaming ecosystem by not providing source code in an open manner, Huddy believes that the answer to the problem is simply for NVIDIA to release the SDK with source code publicly. Whether or not NVIDIA takes that advice remains to be seen, but if it doesn't, it appears that AMD is going down the road of creating its own competing solution that is open and flexible.
The idea of OpenFX, or OpenWorks as Huddy refers to it, is to create an open source repository for gaming code and effects examples that can be updated, modified and improved upon by anyone in the industry. AMD would be willing to start the initiative by donating its entire SDK to the platform and then invite other software developers, as well as other hardware developers, to add to or change the collection. The idea is to create a competitor to what GameWorks accomplishes, but in a license-free and open way.
NVIDIA GameWorks has been successful; can AMD OpenWorks derail it?
Essentially the "OpenWorks" repository would work in a similar way to a Linux group where the public has access to the code to submit changes that can be implemented by anyone else. Someone would be able to improve the performance for specific hardware easily but if performance was degraded on any other hardware then it could be easily changed and updated. Huddy believes this is how you move the industry forward and how you ensure that the gamer is getting the best overall experience regardless of the specific platform they are using.
"OpenWorks" is still in the planning stages and AMD is only officially "talking about it" internally. However, bringing Huddy back to AMD wasn't done without some direction already in mind and it would not surprise me at all if this was essentially a done deal. Huddy believes that other hardware companies like Qualcomm and Intel would participate in such an open system but the real question is whether or not NVIDIA, as the discrete GPU market share leader, would be in any way willing to do as well.
Still, this initiative continues to show the differences between the NVIDIA and AMD style of doing things. NVIDIA prefers a more closed system that it has full control over to perfect the experience, to hit aggressive timelines and to improve the ecosystem as they see it. AMD wants to provide an open system that everyone can participate in and benefit from but often is held back by the inconsistent speed of the community and partners.
Mantle to be Opened by end of 2014, Potentially Coming to Linux
(7:40) The AMD Mantle API has been an industry changing product, I don't think anyone can deny that. Even if you don't own AMD hardware or don't play any of the games currently shipping with Mantle support, the re-focusing on a higher efficiency API has impacted NVIDIA's direction with DX11, Microsoft's plans for DX12 and perhaps even Apple's direction with Metal. But for a company that pushes the idea of open standards so heavily, AMD has yet to offer up Mantle source code in a similar fashion to its standard SDK. As it stands right now, Mantle is only given to a group of software developers in the beta program and is specifically tuned for AMD's GCN graphics hardware.
Huddy reiterated that AMD has made a commitment to release a public SDK for Mantle by the end of 2014, which would allow any other hardware vendor to create a driver that could run Mantle game titles. If AMD lives up to its word and releases the full source code, then in theory NVIDIA could offer support for Mantle games on GeForce hardware, and Intel could support those same games on Intel HD Graphics. There would be no license fees and no restrictions at all.
The obvious question is whether or not any other IHV would choose to do so, both for competitive reasons and because of the proximity of DX12's release in late 2015. Huddy agrees with me that the pride of these other hardware vendors may prevent them from considering Mantle adoption, though the argument can be made that the work required to implement it properly might not be worth the effort with DX12 (and its very similar feature set) around the corner.
(51:45) When asked about AMD input on SteamOS and its commitment to the gamers that see that as the future, Huddy mentioned that AMD was considering, but not promising, bringing the Mantle API to Linux. If the opportunity exists, says Huddy, to give the gamer a better experience on that platform with the help of Mantle, and developers ask AMD for that support, then AMD will at the very least "listen to that." It would be incredibly interesting to see a competitor API in the landscape of Linux, where OpenGL is essentially the only game in town.
AMD FreeSync / Adaptive Sync Benefits
(59:15) Huddy discussed the differences, as he sees them, between NVIDIA's G-Sync technology and the AMD option called FreeSync, now officially called Adaptive Sync as part of the DisplayPort 1.2a standard. Besides the obvious difference of added hardware and licensing costs, Adaptive Sync is apparently going to be easier to implement, as the maximum and minimum frequencies are actually negotiated by the display and the graphics card when the monitor is plugged in. G-Sync requires a white list in the NVIDIA driver to work today, and as long as NVIDIA keeps that list updated, the impact on gamers buying panels should be minimal. But with DP 1.2a and properly implemented Adaptive Sync monitors, once a driver supports the negotiation it doesn't require knowledge about the specific model beforehand.
AMD demos FreeSync at Computex 2014
According to Huddy, the new Adaptive Sync specification will go as high as 240 Hz and as low as 9 Hz; these are specifics that weren't known before today. Of course, not every panel (and maybe no panel) will support that extreme a range for variable refresh rate technology, but this leaves a lot of potential for improved panel development in the years to come. More likely you'll see Adaptive Sync ready displays listing a range closer to 30-60 Hz or 30-80 Hz initially.
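The negotiation Huddy describes ultimately just gives the driver a supported refresh window to stay inside. As a rough illustration (this is a simplified sketch with made-up numbers, not AMD driver code), the per-frame logic for a variable refresh panel boils down to clamping the instantaneous frame rate to the negotiated range, repeating a frame when the game runs slower than the panel's minimum:

```python
def adaptive_sync_rate(frame_time_ms, panel_min_hz, panel_max_hz):
    """Pick an effective refresh rate for one frame under variable-refresh rules.

    The instantaneous rate is 1000 / frame_time_ms. Frames arriving faster
    than the panel's maximum are capped at that maximum; frames arriving
    slower than the minimum are redisplayed enough times to keep the panel
    inside its supported range.
    """
    rate = 1000.0 / frame_time_ms
    if rate > panel_max_hz:
        return panel_max_hz
    while rate < panel_min_hz:
        rate *= 2  # redisplay the frame, doubling the effective refresh
    return rate

# A 40 fps frame (25 ms) on a hypothetical 30-80 Hz panel is shown as-is:
print(adaptive_sync_rate(25.0, 30, 80))  # 40.0
# A 20 fps frame (50 ms) is repeated once, landing at an effective 40 Hz:
print(adaptive_sync_rate(50.0, 30, 80))  # 40.0
```

With a range as wide as 9-240 Hz, the repeat-frame branch would almost never trigger, which is why those spec limits matter more than any single panel's capabilities.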
Prototypes of FreeSync monitors will be going out to some media in the September or October time frame, while public availability will likely occur in the January or February window.
How does AMD pick game titles for the Never Settle program?
(1:14:00) Huddy describes the fashion in which games are vetted for inclusion in the AMD Never Settle program. The company looks for games that have a good history of course, but also ones that exemplify the use of AMD hardware. Games that benchmark well and have reproducible results that can be reported by AMD and the media are also preferred. Inclusion of an integrated benchmark mode in the game is also a plus as it more likely gets review media interested in including that game in their test suite and also allows the public to run their own tests to compare results.
Another interesting note: the games included in bundles are often picked based on restrictions in certain countries. Germany, for example, has very strict guidelines for violence in games, and thus add-in card partners would much prefer a well-known racing game to an ultra-bloody first-person shooter.
First and foremost, a huge thanks to Richard Huddy for making time to stop by the offices and talk with us. And especially for allowing us to live stream it to our fans and readers. I have had the privilege to have access to some of the most interesting minds in the industry, but they are very rarely open to having our talks broadcast to the world without editing and without a precompiled list of questions. For allowing it, both AMD and Mr. Huddy have gained some respect!
There is plenty more discussed in the interview including AMD's push to a non-PC based revenue split, whether DX12 will undermine the use of the Mantle API, and how code like TressFX compares to NVIDIA GameWorks. If you haven't watched it yet I think you'll find the full 90 minutes to be quite informative and worth your time.
UPDATE: I know that some of our readers, and some contacts at NVIDIA, took note of Huddy's comments about TressFX from our interview. Essentially, NVIDIA denied that TressFX was actually made available before the release of Tomb Raider. When I asked AMD for clarification, Richard Huddy provided me with the following statement.
I would like to take the opportunity to correct a false impression that I inadvertently created during the interview.
Contrary to what I said, it turns out that TressFX was first published in AMD's SDK _after_ the release of Tomb Raider.
Nonetheless the full source code to TressFX was available to the developer throughout, and we also know that the game was available to NVIDIA several weeks ahead of the actual release for NVIDIA to address the bugs in their driver and to optimize for TressFX.
Again, I apologize for the mistake.
That definitely paints a somewhat different picture of the situation around the release of TressFX with the rebooted Tomb Raider title. NVIDIA's complaint that "AMD was doing the same thing" holds a bit more weight. Since Richard Huddy was not with AMD at the time of this arrangement, I can see how he would mix up the specifics, even after being briefed by other staff members.
If you want to be sure you don't miss any more of our live streaming events, be sure to keep an eye on the schedule on the right hand side of our page or sign up for our PC Perspective Live mailing list right here.
Subject: Editorial | June 12, 2014 - 02:28 PM | Ken Addison
Tagged: Z97X-SOC Force, video, titan z, radeon, project tango, podcast, plextor, nvidia, Lightning, gtx titan z, gigabyte, geforce, E3 14, amd, 4790k, 290x
PC Perspective Podcast #304 - 06/12/2014
We have lots of reviews to talk about this week including the GeForce GTX TITAN Z, Core i7-4790K, Gigabyte Z97X-SOC Force, E3 News and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Josh Walrath, Jeremy Hellstrom and Allyn Maleventano
Pimp Next Week Events
0:03:45 Podcast #305 with David Hewlett!
Week in Review:
News items of interest:
Hardware/Software Picks of the Week:
0:57:40 Ryan: Ken's Switching Hardware
1:02:15 Josh: This saved me last weekend.
1:04:00 Allyn: Sony DSC-RX10
With the GPU landscape mostly settled for 2014, we have the ability to really dig in and evaluate the retail models that continue to pop up from NVIDIA and AMD board partners. One of our favorite series of graphics cards over the years comes from MSI in the form of the Lightning brand. These cards push engineering to levels other designers simply won't attempt - and we love it! Obviously the target of this capability is additional overclocking headroom and stability, but what if the target GPU already has issues scaling?
That is more or less the premise of the Radeon R9 290X Lightning from MSI. AMD's Radeon R9 290X Hawaii GPU is definitely a hot and power hungry part, and that caused quite a few issues at the initial release. Since then, though, both AMD and its add-in card partners have worked to improve the coolers installed on these cards, boosting performance and reliability while decreasing the LOUD NOISES produced by the stock reference cooler.
Let's dive into the latest to hit our test bench, the MSI Radeon R9 290X Lightning.
The MSI Radeon R9 290X Lightning
MSI continues to utilize the yellow and black color scheme that many of the company's high end parts integrate and I love the combination. I know that both NVIDIA and AMD disapprove of the distinct lack of "green" and "red" in the cooler and box designs, but good on MSI for sticking to its own thing.
The box for the Lightning card is equal to the prominence of the card itself and you even get a nifty drawer for all of the included accessories.
We originally spotted the MSI R9 290X Lightning at CES in January and the design remains the same. The cooler is quite large (and damn heavy) and is cooled by a set of three fans. The yellow fan in the center is smaller and spins a bit faster, creating more noise than I would prefer. All fan speeds can be adjusted with MSI's included fan control software.
Subject: Graphics Cards | June 2, 2014 - 11:41 PM | Sebastian Peak
Tagged: computex, radeon, r9 295x2, Hawaii XT, dual gpu, computex 2014, ASUS ROG, asus, Ares, amd
The latest installment in the ASUS ARES series of ultra-powerful, limited-edition graphics cards has been announced, and the Ares III is set to be the “world’s fastest” video card.
The dual-GPU powerhouse is driven by two “hand-selected” Radeon Hawaii XT GPUs (R9 290X cores) with 8GB of GDDR5 memory. The card is overclockable according to ASUS, and will likely arrive factory overclocked, as they claim it will be faster out of the box than the reference R9 295X2. The ARES III features a custom-designed EK water block, so unlike the R9 295X2 the end user will need to supply the liquid cooling loop.
ASUS claims that the ARES III will “deliver 25% cooler performance than reference R9 295X designs”, but to achieve this ASUS “highly” recommends a high flow rate loop with at least a 120x3 radiator “to extract maximum performance from the card,” and they “will provide a recommended list of water cooling systems at launch”.
Only 500 of the ARES III will be made, each individually numbered. No pricing has been announced, but ASUS says to expect it to cost more than an R9 295X2 ($1499) but less than a TITAN Z ($2999). The ASUS ROG ARES III will be available in Q3 2014.
For more Computex 2014 coverage, please check out our feed!
Subject: General Tech, Graphics Cards | May 27, 2014 - 12:00 AM | Scott Michaud
Tagged: radeon, R9, R7, eyefinity, amd
AMD has just launched their Catalyst 14.6 Beta drivers for Windows and Linux. This driver will contain performance improvements for Watch Dogs, launching today in North America, and Murdered: Soul Suspect, which arrives next week. On Linux, the driver now supports Ubuntu 14.04 and its installation process has been upgraded for simplicity and user experience.
Unless performance improvements are more important to you, the biggest feature is the support for Eyefinity with mixed resolutions. With Catalyst 14.6, you no longer need a grid of identical monitors. One example use case, suggested by AMD, is a gamer who purchases an ultra-wide 2560x1080 monitor. They will be able to add a pair of 1080p monitors on either side to create a 6400x1080 viewing surface.
If the monitors are very mismatched, the driver will allow users to letterbox to the largest rectangle contained by every monitor, or "expand" to draw the largest possible rectangle (which will lead to some assets drawing outside of any monitor). A third mode, fill, behaves like Eyefinity currently does. I must give AMD a lot of credit for leaving the choice to the user.
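For side-by-side panels, the two new modes reduce to simple rectangle math: letterbox crops to the tallest rectangle every panel can show (the minimum height), while expand uses the bounding box (the maximum height). A rough sketch of that calculation, using AMD's suggested 2560x1080-plus-two-1080p setup (illustrative only, not actual Catalyst driver code):

```python
def eyefinity_surface(monitors, mode):
    """Compute the desktop surface for side-by-side mixed-resolution Eyefinity.

    monitors: list of (width, height) tuples, left to right.
    'letterbox' crops to what every panel can display;
    'expand' draws the full bounding box, so some pixels fall off-panel.
    """
    width = sum(w for w, _ in monitors)
    heights = [h for _, h in monitors]
    height = min(heights) if mode == "letterbox" else max(heights)
    return (width, height)

# AMD's example: a 2560x1080 ultra-wide flanked by two 1920x1080 panels.
panels = [(1920, 1080), (2560, 1080), (1920, 1080)]
print(eyefinity_surface(panels, "letterbox"))  # (6400, 1080)

# With a mismatched 1440p center panel the two modes diverge:
mixed = [(1920, 1080), (2560, 1440), (1920, 1080)]
print(eyefinity_surface(mixed, "letterbox"))  # (6400, 1080)
print(eyefinity_surface(mixed, "expand"))     # (6400, 1440)
```

In AMD's own example the heights all match, so every mode yields the same 6400x1080 surface; the mode choice only matters once the panels genuinely differ.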
Returning to performance with actual figures, AMD claims "up to" 25% increases in Watch Dogs at 1080p, or 28% at 1600p, compared to the previous version. The new CrossFire profile also claims up to 99% scaling in that game at 2560x1600 with 8x MSAA. Murdered: Soul Suspect will see "up to" 16% improvements on a single card and "up to" 93% scaling. Each of these results was provided by AMD, which tested on Radeon R9 290X cards. If these CrossFire profiles are indicative of actual performance and really do see 99% scaling across two cards, that is pretty remarkable.
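"Scaling" here simply measures how much of the second GPU's theoretical contribution is actually realized: 99% means the pair very nearly doubles the single-card frame rate. A quick check of the arithmetic (the FPS figures below are made up for illustration; AMD did not publish raw frame rates):

```python
def crossfire_scaling(fps_single, fps_multi, extra_gpus=1):
    """Percent of the extra GPUs' theoretical contribution actually realized."""
    return 100.0 * (fps_multi - fps_single) / (fps_single * extra_gpus)

# If one card hit 40 fps and a pair hit 79.6 fps, that would be "99% scaling":
print(round(crossfire_scaling(40.0, 79.6)))  # 99
```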
A brief mention: AMD has also expanded its JPEG decoder to Kabini. Previously, it was available on Kaveri as of Catalyst 14.1. This allows the GPU to be used to display images, with AMD's test showing a series of images being processed in about half the time. While not claimed by AMD, I expect that the GPU will also be more power-efficient (as the processor can return to its idle state much more quickly, despite activating another component to do so). Ironically, the three images I used for this news post are encoded as PNG. You might find that amusing.
AMD Catalyst 14.6 Beta drivers should now be available at AMD's download site.
Subject: Graphics Cards | May 26, 2014 - 05:16 PM | Jeremy Hellstrom
Tagged: amd, radeon, r9 295x2, R9 290X
Through hard work or good luck you find yourself the proud owner of an R9 295X2 and a 4K display but somehow the performance just isn't quite good enough. You can't afford another X2 though there is an R9 290X in your price range but you just aren't sure if it will help your system out at all. That is where [H]ard|OCP steps in with this review where they prove that tri-fire in this configuration does indeed work. Not only does it work, it allows you to vastly increase your performance over a 295X2 or to improve the performance somewhat while raising your graphics settings to new highs. For those using 5760x1200 Eyefinity you probably already have your graphics options cranked; this upgrade will still offer you a linear increase in performance. Not bad if you have the money to invest!
"Will adding a single AMD Radeon R9 290X video card to the AMD Radeon R9 295X2 work? Will you get triple-GPU performance, ala TriFire CrossFire performance? This just might be a more financially feasible configuration for gamers versus QuadFire that provides a great gaming experience in Eyefinity and 4K resolutions."
Here are some more Graphics Card articles from around the web:
- Sapphire R9 290X Vapor-X OC @ Kitguru
- Sapphire Vapor-X R9 290X - Cooling the Savage Beast @HiTech Legion
- MSI Radeon R9 280X Gaming 6 GB @ techPowerUp
- Sapphire R9 290X Vapor-X OC 4GB Video Card Review @ Legit Reviews
- CLUB3D R9 290X RoyalAce Superoverclock @ Kitguru
- AMD Radeon R9 290 On Ubuntu 14.04 With Catalyst Can Beat Windows 8.1 @ Phoronix
- Catalyst On Ubuntu 14.04 Linux Competes Well With Windows 8.1 @ Phoronix
- High-End NVIDIA GeForce vs. AMD Radeon Linux Gaming Comparison @ Phoronix
- Windows 8.1 Still Outperforms Linux With Latest Intel GPU Drivers @ Phoronix
- Raijintek Morpheus @ techPowerUp
- PNY GTX 780 XLR8 OC Edition @ [H]ard|OCP
- Palit GTX780 Jetstream 6GB SLi (Ultra HD 4K) @ Kitguru
- Gigabyte GeForce GTX 750 @ Hardware Secrets
- ASUS GeForce GTX 780 Ti DirectCU II OC @ X-bit Labs
Subject: Graphics Cards | May 15, 2014 - 06:16 PM | Ryan Shrout
Tagged: radeon, R9 290X, r9 290, r9 280x, r9 280, amd
Just the other day AMD sent out an email to the media to discuss the current pricing situation of the Radeon R9 series of graphics cards. This email started with the following statement.
You’ve seen many articles, discussions online about the AMD Radeon™ R9 lineup – especially chatter about pricing and availability. As we’ve talked about it before, the demand for the R9 lineup has been nothing but astonishing, and went well beyond our most optimistic expectations. That created a situation where gamers weren’t able to purchase their desired R9 graphics card.
Clearly AMD would not bring up the subject if the current situation was BAD news, so guess what? All seems to be back to normal (or as expected) in terms of AMD Radeon R9 pricing and card availability. Take a look at the table below to get an idea of where Radeons currently stand.
| Card | Current Price | SEP |
|Radeon R9 295X2|$1524|$1499|
|Radeon R9 290X|$549|$529|
|Radeon R9 290|$379|$399|
|Radeon R9 280X|$289|$299|
|Radeon R9 280|$249|$249|
|Radeon R9 270X|$199|$189|
|Radeon R9 270|$169|$179|
There is one price change that differs from the products' launch - the SEP of the Radeon R9 280 has dropped from $279 to $249. Nothing dramatic but a nice change.
Maybe most interesting is this line from the AMD email.
Now that product is available and at suggested pricing, these prices will remain stable. No more madness like you saw in Q1.
That emphasis is AMD's. I'm not quite sure how the company thinks it can keep tight control on pricing now if it wasn't able to do so before, but more than likely, with the rush for coin mining hardware somewhat dying off, the prediction will hold true. (As a side note, there appear to be some discounts to be found on used Radeon hardware these days...)
Of course, the AMD bundling promotion known as Never Settle Forever is still going strong with these new prices as well. Scott wrote up a story detailing this latest incarnation of the promotion, and he and I both agree that while free is always great, the age of most of the titles in the program is a bit of a problem. But AMD did note in this email that they have "lined up a few brand new games to add to this promotion, and they'll [sic] be sharing more info with you in the next few weeks!"
Subject: General Tech | April 29, 2014 - 06:29 PM | Jeremy Hellstrom
Tagged: 4k, amd, crossfire, quad crossfire, r9 295x2, radeon, video
Ryan isn't the only crazy one out there stringing 2 PSUs together to power a pair of AMD's massively powerful 295X2s in CrossFire; the gang at [H]ard|OCP did as well after taking the Mickey with a certain Brian. As with Ryan's experiment they required a second PSU, in this case a 1350W plus an 850W in order to stop the rig from crashing. Their test components also differed somewhat, a Maximus V Extreme instead of a P9X79 Deluxe and slightly different RAM and Win 8.1 installed on their SSD. The other reason to check them out is the Eyefinity 5760 x 1200 tests in addition to the 4K tests.
"Got extra PCIe slots and have no idea what in the world you can do with those? Well if you have $3000 burning a hole in your pocket, wiring in your house that is up to code, a good air conditioning system, and a Type C fire extinguisher that you are not using, AMD's Radeon R9 295X2 QuadFire may be just what the fire marshal ordered."
Here are some more Graphics Card articles from around the web:
- Custom-cooled Radeon R9 290X cards from Asus and XFX @ The Tech Report
- Sapphire Vapor-X R9 290 Tri-X OC Video Card Review @ Legit Reviews
- MSI Radeon R9 290X Lightning 4 GB @ techPowerUp
- Sapphire R9 280X Vapor-X (Tri-X) OC 3GB @ eTeknix
- XFX Radeon R7 250 Core Edition Video Card Review @ Hardware Secrets
- GeForce 700 vs. Radeon Rx 200 Series With The Latest Linux Drivers @ Phoronix
- 13-Way Low-End GPU Comparison With AMD's AM1 Athlon @ Phoronix
- EVGA Backplate Install for the GTX 780 Ti Classified @ Hardware Asylum
- Gigabyte GTX 750 Ti WindForce 2X OC 2GB @ eTeknix
- ASUS GTX 750 Ti OC 2GB @ eTeknix
- TKFA2 GTX 750 Ti OC 2GB @ eTeknix
You need a bit of power for this
PC gamers. We do some dumb shit sometimes. Those on the outside looking in, forced to play on static hardware with fixed image quality and low expandability, turn up their noses and question why we do the things we do. It's not an unfair reaction; they just don't know what they are missing out on.
For example, what if you decided to upgrade your graphics hardware to improve performance and allow you to up the image quality on your games to unheard of levels? Rather than using a graphics configuration with performance found in a modern APU, you could decide to run not one but FOUR discrete GPUs in a single machine. You could water cool them for optimal temperature and sound levels. This allows you to power not 1920x1080 (or 900p), not 2560x1440, but 4K gaming – 3840x2160.
All for the low, low price of $3000. Well, crap, I guess those console gamers have a right to question the sanity of SOME enthusiasts.
After the release of AMD’s latest flagship graphics card, the Radeon R9 295X2 8GB dual-GPU beast, our mind immediately started to wander to what magic could happen (and what might go wrong) if you combined a pair of them in a single system. Sure, two Hawaii GPUs running in tandem produced the “fastest gaming graphics card you can buy” but surely four GPUs would be even better.
The truth is though, that isn’t always the case. Multi-GPU is hard, just ask AMD or NVIDIA. The software and hardware demands placed on the driver team to coordinate data sharing, timing control, etc. are extremely high even when you are working with just two GPUs in series. Moving to three or four GPUs complicates the story even further and as a result it has been typical for us to note low performance scaling, increased frame time jitter and stutter and sometimes even complete incompatibility.
During our initial briefing covering the Radeon R9 295X2 with AMD there was a system photo that showed a pair of the cards inside a MAINGEAR box. As one of AMD’s biggest system builder partners, MAINGEAR and AMD were clearly insinuating that these configurations would be made available for those with the financial resources to pay for it. Even though we are talking about a very small subset of the PC gaming enthusiast base, these kinds of halo products are what bring PC gamers together to look and drool.
As it happens I was able to get a second R9 295X2 sample in our offices for a couple of quick days of testing.
Working with Kyle and Brent over at HardOCP, we decided to do some hardware sharing in order to give both outlets the ability to judge and measure Quad CrossFire independently. The results are impressive and awe inspiring.