Subject: Graphics Cards | January 30, 2014 - 12:15 PM | Jeremy Hellstrom
Tagged: xfx, double d, R9 290X
The only thing more fun than an XFX Double Dissipation R9 290X is two of them in CrossFire, which is exactly what [H]ard|OCP just tested. These cards sport the familiar custom cooler, though they are not overclocked; [H] is not testing overclocking in this review, but will revisit these cards in the future to do exactly that. This review is about the CrossFire performance of these cards straight out of the box, and it is rather impressive. When [H] tested 4K performance they could feel the frame pacing improvements the new driver brings, and they saw these cards outperform SLI'd GTX 780 Ti cards in every test, though not always by a huge margin. The current selling price of these cards is about $100 above MSRP, but they still come in cheaper than the competing NVIDIA cards; these particular cards really show off what Hawaii is capable of.
"Take two custom XFX R9 290X Double Dissipation Edition video cards, enable CrossFire, and let your jaw hit the floor. We will test this combination against the competition in a triple-display Eyefinity setup as well as 4K Ultra HD display gaming. We will find out if custom cards hold any advantage over the reference designed R9 290X."
Here are some more Graphics Card articles from around the web:
- Sapphire Radeon R7 260X @ Phoronix
- Gigabyte R9 290 WindForce OC @ Kitguru
- XFX Radeon R9 280X Black Edition @ Benchmark Reviews
- XFX Radeon R9 290X Double Dissipation Review @ Hardware Canucks
- Gigabyte R9 290X WindForce OC 4GB @ eTeknix
- Sapphire Dual-X R9 270 Graphics Card Review @ Modders-Inc
- EK Waterblocks R280X Matrix Edition Full Cover Block Review @ Madshrimps
- 24-Way AMD Radeon vs. NVIDIA GeForce Linux Graphics Card Comparison @ Phoronix
- 25-Way Open-Source Linux Graphics Card Comparison @ Phoronix
- MSI GTX 760 Mini-ITX Gaming 2 GB @ techPowerUp
A troubled launch to be sure
AMD has released some important new drivers with drastic feature additions over the past year. Remember back in August of 2013 when Frame Pacing was first revealed? Today's Catalyst 14.1 beta release will actually complete the goals that AMD set for itself in early 2013 regarding (nearly) complete Frame Pacing integration for non-XDMA GPUs, while also adding support for Mantle and HSA capability.
Frame Pacing Phase 2 and HSA Support
When AMD released the first frame pacing capable beta driver in August of 2013, it added support to existing GCN designs (HD 7000-series and a few older generations) at resolutions of 2560x1600 and below. While that definitely addressed a lot of the market, the fact was that CrossFire users were also amongst the most likely to have Eyefinity (3+ monitors spanned for gaming) or even 4K displays (quickly dropping in price). Neither of those advanced display options were supported with any Catalyst frame pacing technology.
That changes today as Phase 2 of the AMD Frame Pacing feature has finally been implemented for products that do not feature XDMA technology (found in Hawaii GPUs, for example). That includes HD 7000-series GPUs, the R9 280X and 270X cards, as well as older generation products and Dual Graphics hardware combinations such as the new Kaveri APU and R7 250. In fact, I have already tested Kaveri with the R7 250, and you can read about the scaling and experience improvements right here. That means that users of the HD 7970, R9 280X, etc., as well as those of you with HD 7990 dual-GPU cards, will finally be able to utilize the power of both GPUs in your system with 4K displays and Eyefinity configurations!
This is finally fixed!!
As of this writing I haven’t had time to do more testing (other than the Dual Graphics article linked above) to demonstrate the potential benefits of this Phase 2 update, but we’ll be targeting it later in the week. For now, it appears that you’ll be able to get essentially the same performance and pacing capabilities on the Tahiti-based GPUs as you can with Hawaii (R9 290X and R9 290).
Catalyst 14.1 beta is also the first public driver to add support for HSA technology, allowing owners of the new Kaveri APU to take advantage of the appropriately enabled applications like LibreOffice and the handful of Adobe apps. AMD has since let us know that this feature DID NOT make it into the public release of Catalyst 14.1.
The First Mantle Ready Driver (sort of)
A technology that has been in development for more than two years according to AMD, the newly released Catalyst 14.1 beta driver is the first to enable support for the revolutionary new Mantle API for PC gaming. Essentially, Mantle is AMD’s attempt at creating a custom API that will replace DirectX and OpenGL in order to more directly target the GPU hardware in your PC, specifically the AMD-based designs of GCN (Graphics Core Next).
Mantle runs at a lower level than DX or OpenGL, accessing the hardware resources of the graphics chip more directly, and with that access it can better utilize the hardware in your system, both CPU and GPU. In fact, the primary benefit of Mantle is going to be seen in the form of less API overhead and fewer bottlenecks such as real-time shader compiling and code translation.
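The overhead argument can be illustrated with a toy model. The numbers below are purely hypothetical, not measured Mantle or DirectX figures: if each draw call pays a fixed CPU cost for API work (validation, translation, shader patching), shrinking that cost raises how many draw calls fit in a single frame's CPU budget.

```python
# Toy model of per-draw-call API overhead (illustrative numbers only).
# At 60 FPS, the CPU has roughly 16.7 ms per frame to submit work.

FRAME_BUDGET_MS = 1000 / 60          # ~16.67 ms of CPU time per frame

def max_draw_calls(api_cost_us):
    """Draw calls that fit in one frame if each costs api_cost_us microseconds."""
    return int(FRAME_BUDGET_MS * 1000 / api_cost_us)

high_level = max_draw_calls(20.0)    # hypothetical "thick" API per-call cost
low_level = max_draw_calls(2.0)      # hypothetical "thin" API per-call cost

print(high_level, low_level)
```

With these made-up costs the thinner API fits roughly ten times as many draw calls per frame, which is the general shape of the argument AMD makes for Mantle.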
If you are interested in the meat of what makes Mantle tick and why it was so interesting to us when it was first announced in September of 2013, you should check out our first deep-dive article written by Josh. In it you’ll get our opinion on why Mantle matters and why it has the potential for drastically changing the way the PC is thought of in the gaming ecosystem.
Subject: Graphics Cards, Processors | January 29, 2014 - 12:44 PM | Ryan Shrout
Tagged: video, nvidia, Intel, gt 630, APU, amd, A10-7850K, 7850k
The most interesting aspect of the new Kaveri-based APUs from AMD, in particular the A10-7850K part, is how they improve mainstream gaming performance. AMD has long claimed that these APUs remove the need for low-cost discrete graphics, and when we got the new APU in the office we ran a couple of quick tests to see how much validity there is to that claim.
In this short video we compare the A10-7850K APU against a combination of the Intel Core i3-4330 and GeForce GT 630 discrete graphics card in five of 2013's top PC releases. I think you'll find the results pretty interesting.
UPDATE: I've had some questions about WHICH of the GT 630 SKUs were used in this testing. Our GT 630 was this EVGA model that is based on 96 CUDA cores and a 128-bit DDR3 memory interface. You can see a comparison of the three current GT 630 options on NVIDIA's website here.
If you are looking for more information on AMD's Kaveri APUs you should check out my review of the A8-7600 part as well our testing of Dual Graphics with the A8-7600 and a Radeon R7 250 card.
Subject: Graphics Cards | January 29, 2014 - 12:00 PM | Ryan Shrout
Tagged: R9 290X, r9 290, r9 270, mining, litecoin, dogecoin, amd
I know we have posted about this a few times on PC Perspective and have discussed it on the PC Perspective Podcast as well, but if you are curious as to why the prices of AMD's latest generation of R9 graphics cards have skyrocketed, look no further than this enterprising consumer and his/her Dogecoin mining rig.
What you are looking at are six MSI Gaming series R9 270 cards running through the aid of PCI Express to USB 3.0 riser cards.
Hybrid CrossFire that actually works
The road to redemption for AMD and its driver team has been a tough one. Since we first started to reveal the significant issues with AMD's CrossFire technology back in January of 2013, the Catalyst driver team has been hard at work on a fix, though I will freely admit it took longer to convince them that the issue was real than I would have liked. We saw the first steps of the fix in August of 2013 with the release of the Catalyst 13.8 beta driver. It supported DX11 and DX10 games at resolutions of 2560x1600 and under (no Eyefinity support), but was obviously still less than perfect.
In October, with the release of AMD's latest Hawaii GPU, the company took another step by reorganizing the internal architecture of CrossFire at the chip level with XDMA. The result was frame pacing that worked on the R9 290X and R9 290 at all resolutions, including Eyefinity, though it still left out older DX9 titles.
One thing that had not been addressed, at least not until today, were the issues surrounding AMD's Hybrid CrossFire technology, now known as Dual Graphics. This is the ability of an AMD APU with integrated Radeon graphics to pair with a low-cost discrete GPU to improve graphics performance and the gaming experience. Recently, Tom's Hardware discovered that Dual Graphics suffered from the exact same scaling issues as standard CrossFire: frame rates in FRAPS looked good, but the actual perceived frame rate was much lower.
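The FRAPS-versus-perceived gap can be sketched numerically. The timestamps below are made up for illustration: alternate-frame rendering without pacing tends to deliver frames in tight pairs (a full frame followed by a "runt" a millisecond later). A frame counter sees every frame, but the on-screen cadence is closer to one frame per pair.

```python
# Sketch of why counted FPS can overstate perceived FPS (made-up timestamps).
# Frames arrive in pairs: a normal frame, then a "runt" ~1 ms behind it.

timestamps = []                        # frame times in milliseconds
for i in range(30):
    timestamps.append(i * 33.0)        # full frame every 33 ms
    timestamps.append(i * 33.0 + 1.0)  # runt frame 1 ms later

span_s = (timestamps[-1] - timestamps[0]) / 1000.0
fraps_fps = (len(timestamps) - 1) / span_s      # naive count of all frames

# Perceived rate: collapse any frame arriving < 5 ms after its predecessor,
# since a runt contributes almost no new on-screen time.
RUNT_MS = 5.0
effective = [timestamps[0]] + [
    t for prev, t in zip(timestamps, timestamps[1:]) if t - prev >= RUNT_MS
]
perceived_fps = (len(effective) - 1) / span_s

print(round(fraps_fps), round(perceived_fps))
```

In this toy run the counter reports roughly double the frame rate the display cadence actually delivers, which is exactly the pattern frame pacing is meant to eliminate.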
A little while ago a new driver made its way into my hands under the name of Catalyst 13.35 Beta X, a driver that promised to enable Dual Graphics frame pacing with Kaveri and R7 graphics cards. As you'll see in the coming pages, the fix definitely is working. And, as I learned after doing some more probing, the 13.35 driver is actually a much more important release than it at first seemed. Not only is Kaveri-based Dual Graphics frame pacing enabled, but Richland and Trinity are included as well. And even better, this driver will apparently fix resolutions higher than 2560x1600 in desktop graphics as well - something you can be sure we are checking on this week!
Just as we saw with the first implementation of Frame Pacing in the Catalyst Control Center, with the 13.35 Beta we are using today you'll find a new set of options in the Gaming section to enable or disable Frame Pacing. The default setting is On, which makes me smile inside every time I see it.
The hardware we are using is the same basic setup from my initial review of the AMD Kaveri A8-7600 APU. That includes the A8-7600 APU, an ASRock A88X mini-ITX motherboard, 16GB of DDR3-2133 memory, and a Samsung 840 Pro SSD. Of course, for our testing this time we needed a discrete card to enable Dual Graphics, and we chose the MSI R7 250 OC Edition with 2GB of DDR3 memory. This card will run you an additional $89 or so on Amazon.com. You could use either the DDR3 or GDDR5 versions of the R7 250, as well as the R7 240, but in our talks with AMD they seemed to think the R7 250 DDR3 was the sweet spot for the CrossFire implementation.
Both the R7 250 and the A8-7600 actually share the same shader count of 384 stream processors, or 6 Compute Units under the new nomenclature AMD is creating. However, the MSI card is clocked at 1100 MHz while the GPU portion of the A8-7600 APU runs at only 720 MHz.
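With identical shader counts, the clock gap sets the theoretical throughput gap. Using the standard peak single-precision formula (shaders × 2 FLOPs per clock × clock), a quick back-of-the-envelope comparison:

```python
# Theoretical peak single-precision throughput for the two GPUs in this
# Dual Graphics pair (both have 384 shader processors / 6 Compute Units).

def peak_gflops(shaders, clock_mhz):
    # Each shader can issue a fused multiply-add (2 FLOPs) per clock.
    return shaders * 2 * clock_mhz / 1000.0

r7_250 = peak_gflops(384, 1100)    # MSI R7 250 OC at 1100 MHz
a8_7600 = peak_gflops(384, 720)    # A8-7600 GPU portion at 720 MHz

print(r7_250, a8_7600)   # 844.8 vs 552.96 GFLOPS
```

So on paper the discrete card brings a bit over 50% more shader throughput to the pairing, purely from its clock speed advantage.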
So the question is, has AMD truly fixed the issues with frame pacing with Dual Graphics configurations, once again making the budget gamer feature something worth recommending? Let's find out!
Subject: General Tech, Graphics Cards | January 28, 2014 - 04:00 PM | Scott Michaud
Tagged: Mantle, BF4, amd
A number of sites have reported on Toshiba's leak of the Catalyst 13.35 BETA driver. Mantle and TrueAudio support highlight its rumored changelog. Apparently Ryan picked it up, checked it out, and found that it does not have the necessary DLLs included. I do not think he has actual Mantle software to test against, and I am not sure how he knew what libraries Mantle requires, but this package apparently does not include them. Perhaps it was an incomplete build?
Sorry folks, unlike the above image, these are not the drivers you are looking for.
The real package should be coming soon, however. Recent stories which reference EA tech support (at this point we should all know better) claim that the Mantle update for Battlefield 4 will be delayed until February. Fans reached out to AMD's Robert Hallock who responded that it was, "Categorically not true". It sounds like AMD is planning on releasing at least their end of the patch before Friday ends.
This is looking promising, at least. Something is being done behind the scenes.
Subject: Graphics Cards | January 23, 2014 - 03:01 PM | Jeremy Hellstrom
Tagged: amd, asus, R9 290X DC2 OC, overclocking
[H]ard|OCP has had a chance to take the time to really see how well the R9 290X can overclock; since frequencies drop as heat rises, a quick gaming session is not enough to truly represent the performance of this GPU. The ASUS R9 290X DirectCU II OC offers a custom cooler which demonstrated the overclocking potential of this GPU on air cooling, or at least of this specific sample, as we have seen solid evidence of performance variability among 28nm Hawaii GPUs. You should read the full review to truly understand what they saw when overclocking, but the good news is that once they found a sweet spot for fan speed and voltage, the GPU remained at the frequency they chose. Unfortunately, at 1115MHz the overclock they managed was only 75MHz higher than the card's default speed, and while that could beat a stock GTX 780 Ti, the NVIDIA product overclocked higher and proved the superior card.
"We will take the ASUS R9 290X DC2 OC custom AMD R9 290X based video card and for the first time see how well the 290X can overclock. We will also for the first time compare it to an overclocked GeForce GTX 780 Ti video card head-to-head and see who wins when overclocking is accounted for."
Here are some more Graphics Card articles from around the web:
- Sapphire R9 290 4GB TRI-X OC Review @ Hardware Canucks
- HIS R9 270X IceQ X² Turbo Boost 2GB @ eTeknix
- Sapphire R9 290 Tri-X 4GB @ eTeknix
- Powercolor R9 280X TurboDuo 3GB @ eTeknix
- ASUS R9 290X DirectCU II OC 4 GB @ techPowerUp
- Gigabyte AMD Radeon R9 290X WF OC Video Card Review @ Madshrimps
- Sapphire Radeon R7 260X OC Review @ TechwareLabs
- EVGA GTX 780 Ti Classified 3072 MB @ techPowerUp
- Gigabyte GTX 780 Ti GHZ Edition Review! @ Bjorn3D
Subject: General Tech, Graphics Cards | January 23, 2014 - 12:29 AM | Scott Michaud
Tagged: ShadowPlay, nvidia, geforce experience
NVIDIA has been upgrading their GeForce Experience just about once per month, on average. Most of their attention has been focused on ShadowPlay which is their video capture and streaming service for games based on DirectX. GeForce Experience 1.8.1 brought streaming to Twitch and the ability to overlay the user's webcam.
Until this version, users could choose between "Low", "Medium", and "High" quality presets. GeForce Experience 1.8.2 adds "Custom", which allows manual control over resolution, frame rate, and bit rate. NVIDIA wants to make it clear: frame rate controls the number of images per second, and bit rate controls the file size per second. Reducing the frame rate without adjusting the bit rate will result in a file of the same size (just with better quality per frame).
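NVIDIA's distinction is easy to verify with arithmetic: file size depends only on bit rate and duration, while the quality budget per frame is the bit rate divided by the frame rate. A quick sketch with an example 10 Mbps stream:

```python
# File size is bit rate x duration; per-frame data budget is bit rate / fps.

def file_size_mb(bitrate_mbps, seconds):
    return bitrate_mbps * seconds / 8.0          # megabits -> megabytes

def frame_budget_kb(bitrate_mbps, fps):
    return bitrate_mbps * 1000.0 / fps / 8.0     # kilobytes of data per frame

# The same 10 Mbps stream for one minute, captured at two frame rates:
size_60 = file_size_mb(10, 60)       # recording at 60 FPS
size_30 = file_size_mb(10, 60)       # recording at 30 FPS: same duration, same size
budget_60 = frame_budget_kb(10, 60)
budget_30 = frame_budget_kb(10, 30)

print(size_60, size_30)              # identical file sizes
print(budget_60, budget_30)          # 30 FPS gets twice the data per frame
```

Halving the frame rate leaves the file the same size but doubles the bits the encoder can spend on each frame, which is exactly the "better quality per frame" trade-off described above.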
Also with this update, NVIDIA allows users to set a push-to-talk key. I expect this will be mostly useful for Twitch streaming in a crowded dorm or household. Only transmitting your voice when you have something to say prevents someone else from accidentally transmitting theirs globally and instantaneously.
GeForce Experience 1.8.2 is available for download at the GeForce website. Users with a Fermi-based GPU will no longer be pushed GeForce Experience (because it really does not do anything for those graphics cards). The latest version can always be manually downloaded, however.
Subject: General Tech, Graphics Cards, Processors | January 22, 2014 - 06:41 PM | Scott Michaud
AMD had a decent quarter and came close to a profitable year as a whole. For the quarter ending on December 28th, the company managed $89 million in profit, accounting for interest payments on loans and everything else. The whole year brought a $103 million gain in operating income, although that still works out to a net loss of $74 million (for the year) all things considered. Still, set a quarterly gain of $89 million against an annual loss of $74 million: one more quarter like this would forgive the whole year.
This is a hefty turn-around from their billion dollar operating loss of last year.
This gain was led by Graphics and Visual Solutions. While Computing Solutions revenue has declined, the graphics team has steadily increased in both revenue and profits. Graphics and Visual Solutions are in charge of graphics processors as well as revenue from the game console manufacturers. Even then, their processor division is floating just below profitability.
Probably the best news for AMD is that they expect each of the next four quarters to be profitable. Hopefully this means that there are no foreseen hurdles in the middle of their marathon.
Subject: Editorial, General Tech, Graphics Cards | January 21, 2014 - 11:12 PM | Scott Michaud
Tagged: linux, intel hd graphics, haswell
Looking through this post by Phoronix, it would seem that Intel had a significant regression in performance on Ubuntu 14.04 with the Linux 3.13 kernel. In some tests, HD 4600 only achieves about half of the performance recorded on the HD 4000. I have not been following Linux iGPU drivers and it is probably a bit late to do any form of in-depth analysis... but yolo. I think the article actually made a pretty big mistake and came to the exact wrong conclusion.
Let's do this!
According to the article, in Xonotic v0.7, Ivy Bridge's Intel HD 4000 scores 176.23 FPS at 1080p on low quality settings. When you compare this to Haswell's HD 4600 and its 124.45 FPS result, this seems bad. However, even though they claim this as a performance regression, they never actually post earlier (and supposedly faster) benchmarks.
So I dug one up.
Back in October, the same test was performed with the same hardware. The Intel HD 4600 was not significantly faster back then; rather, it was actually a bit slower, with a score of 123.84 FPS. The Intel HD 4000 managed 102.68 FPS. Haswell did not regress between that time and Ubuntu 14.04 on Linux 3.13; rather, Ivy Bridge received a 71.63% increase over the same period.
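The percentages are easy to check from the published scores: Haswell's HD 4600 moved from 123.84 to 124.45 FPS (essentially flat), while Ivy Bridge's HD 4000 jumped from 102.68 to 176.23 FPS.

```python
# Percent change between the October benchmark and the Ubuntu 14.04 / 3.13 run.

def pct_change(old, new):
    return (new - old) / old * 100.0

hd4600 = pct_change(123.84, 124.45)   # Haswell HD 4600: essentially flat
hd4000 = pct_change(102.68, 176.23)   # Ivy Bridge HD 4000: the 71.63% gain

print(round(hd4600, 2), round(hd4000, 2))
```

Half a percent of movement on Haswell against a 71.63% jump on Ivy Bridge supports the reading that the driver improved Ivy Bridge rather than regressing Haswell.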
Of course, there could have been a performance increase between October and now and that recently regressed for Haswell... but I could not find those benchmarks. All I can see is that Haswell has been quite steady since October. Either way, that is a significant performance increase on Ivy Bridge since that snapshot in time, even if Haswell had a rise-and-fall that I was unaware of.