Author:
Manufacturer: AMD

You need a bit of power for this

PC gamers. We do some dumb shit sometimes. Those on the outside looking in, forced to play on static hardware with fixed image quality and limited expandability, turn up their noses and question why we do the things we do. It’s not an unfair reaction; they just don’t know what they are missing out on.

For example, what if you decided to upgrade your graphics hardware to improve performance and push the image quality of your games to unheard-of levels? Rather than settling for a graphics configuration with the performance of a modern APU, you could run not one but FOUR discrete GPUs in a single machine. You could water cool them for optimal temperatures and noise levels. All of that power drives gaming not at 1920x1080 (or 900p), not at 2560x1440, but at 4K – 3840x2160.
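To put that resolution in perspective, here is a quick back-of-the-envelope pixel count (my arithmetic, not from AMD's materials) showing why the GPU demand balloons: 4K pushes exactly four times the pixels of 1080p every single frame.

```python
# Back-of-the-envelope pixel counts for common gaming resolutions.
resolutions = {
    "1600x900":  (1600, 900),
    "1920x1080": (1920, 1080),
    "2560x1440": (2560, 1440),
    "3840x2160": (3840, 2160),  # 4K
}

base = 1920 * 1080  # reference point: 1080p
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels ({pixels / base:.2f}x 1080p)")
```

Running it shows 3840x2160 comes to 8,294,400 pixels, a clean 4.00x over 1080p, which is roughly why a four-GPU configuration starts to sound less insane.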

All for the low, low price of $3000. Well, crap, I guess those console gamers have a right to question the sanity of SOME enthusiasts.

After the release of AMD’s latest flagship graphics card, the Radeon R9 295X2 8GB dual-GPU beast, our minds immediately started to wander to what magic could happen (and what might go wrong) if you combined a pair of them in a single system. Sure, two Hawaii GPUs running in tandem produced the “fastest gaming graphics card you can buy,” but surely four GPUs would be even better.

The truth, though, is that isn’t always the case. Multi-GPU is hard; just ask AMD or NVIDIA. The software and hardware demands placed on the driver teams to coordinate data sharing, timing control, and more are extremely high even when you are working with just two GPUs in tandem. Moving to three or four GPUs complicates the story even further, and as a result it has been typical for us to note poor performance scaling, increased frame time jitter and stutter, and sometimes even complete incompatibility.
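To make "frame time jitter" concrete: smoothness depends less on the average frame rate than on how evenly frames arrive. Here is a minimal sketch of the statistic involved, with illustrative numbers of my own rather than data from an actual Frame Rating capture:

```python
from statistics import mean

def frame_time_stats(frame_times_ms):
    """Summarize smoothness: average frame time, effective FPS, and
    frame-to-frame jitter (mean absolute delta between consecutive frames)."""
    avg = mean(frame_times_ms)
    jitter = mean(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))
    return {"avg_ms": round(avg, 1), "fps": round(1000.0 / avg, 1),
            "jitter_ms": round(jitter, 1)}

# Two hypothetical runs with the same ~60 FPS average: one smooth, one stuttering.
smooth  = [16.7] * 6
stutter = [8.0, 25.4, 8.0, 25.4, 8.0, 25.4]

print(frame_time_stats(smooth))   # jitter ~0 ms: every frame lands on time
print(frame_time_stats(stutter))  # jitter ~17 ms: same average, visible stutter
```

Both runs report the same average FPS, but only the second one would feel choppy in person, which is exactly why raw frame rates flatter multi-GPU setups.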

During our initial briefing with AMD covering the Radeon R9 295X2, there was a system photo that showed a pair of the cards inside a MAINGEAR box. As one of AMD’s biggest system builder partners, MAINGEAR and AMD were clearly hinting that these configurations would be made available to those with the financial resources to pay for them. Even though we are talking about a very small subset of the PC gaming enthusiast base, these kinds of halo products are what bring PC gamers together to look and drool.

As it happens, I was able to get a second R9 295X2 sample into our offices for a couple of days of quick testing.

Working with Kyle and Brent over at HardOCP, we decided to do some hardware sharing in order to give both outlets the ability to judge and measure Quad CrossFire independently. The results are impressive and awe-inspiring.

Continue reading our review of the AMD Radeon R9 295X2 CrossFire at 4K!!

NVIDIA Announces Watch_Dogs Bundle with GeForce GPUs

Subject: Graphics Cards | April 29, 2014 - 07:22 AM |
Tagged: nvidia, watch_dogs, watch dogs, bundle, geforce

A bit of a surprise email found its way to my inbox today, announcing NVIDIA's partnership with Ubisoft to include copies of the upcoming Watch_Dogs game with GeForce GTX graphics cards.

Gamers that purchase a GeForce GTX 780 Ti, GTX 780, GTX 770 or GTX 760 from select retailers will qualify for a free copy of the game. You can find details on this bundle and the GPUs available to take advantage of it at Amazon.com!

The press release also confirms the inclusion of NVIDIA-exclusive features like TXAA and HBAO+ in the game itself, which is interesting. From what I am hearing, Watch_Dogs is going to be a beast of a game on GPU hardware, and we are looking forward to using it as a test platform going forward.

Full press release is included below.

OWN THE TECH AND CONTROL THE CITY WITH NVIDIA® AND UBISOFT®

Select GeForce GTX GPUs Now Include the Hottest Game of the Year: Watch_Dogs™

Santa Clara, CA  April 29, 2014 — Destructoid calls it one of the “most wanted games of 2014.” CNET said it was “one of the most anticipated games in recent memory.” MTV said it’s one of the “Can’t-Miss Video Games of 2014.” This, all before anyone out there has even played it.

So, starting today(1), gamers who purchase select NVIDIA® GeForce® GTX® 780 Ti, 780, 770 and 760 desktop GPUs can get their chance to play Watch_Dogs™, the new PC game taking the world by storm and latest masterpiece from Ubisoft®.

Described as a “polished, refined and truly next generation experience,” in Watch_Dogs you play as Aiden Pearce, a brilliant hacker whose criminal past led to a violent family tragedy. While seeking justice, you will monitor and hack those around you, access omnipresent security cameras, download personal information to locate a target, control traffic lights and public transportation to stop the enemy and more.

Featuring NVIDIA TXAA and HBAO+ technology for an interactive, immersive experience, it’s clear that gamers can’t wait to play Watch_Dogs, especially considering the effusive praise that the official trailer received. Launched mere weeks ago, the trailer has already been viewed more than a combined 650,000 times. For gamers, Watch_Dogs seamlessly blends a mixture of single-player and multiplayer action in a way never before seen, and Ubisoft has gone one step further in creating a unique ctOS mobile companion app for users of smartphone and tablet devices allowing for even greater access to the fun. If you haven’t checked out the trailer, please check it out here: https://www.youtube.com/watch?v=3eHCJ8pWdf0.

The GeForce GTX and Watch_Dogs bundle is available starting today from leading e-tailers including Newegg, Amazon.com, TigerDirect, NCIX; add-in card vendors such as EVGA; and nationwide system builders including AVADirect, CyberPower, Digital Storm, Falcon Northwest, iBUYPOWER, Maingear, Origin PC, Puget Systems, V3 Gaming PC and Velocity Micro. For a full list of participating partners, please visit: www.GeForce.com/GetWatchDogs.

Source: NVIDIA

Another GPU Driver Showdown: AMD vs NVIDIA in Linux

Subject: General Tech, Graphics Cards | April 27, 2014 - 01:22 AM |
Tagged: nvidia, linux, amd

GPU drivers have been a hot and sensitive topic on the site, especially recently, probably spurred on by the announcements of Mantle and DirectX 12. These two announcements admit and illuminate (like a Christmas tree) the limitations APIs place on gaming performance. Both AMD and NVIDIA have had their recent successes and failures on their respective fronts. This article will not deal with that, though. This is a straight round-up of new GPUs running the latest drivers... in Linux.

Again, results are mixed and somewhat open to interpretation.

In all, NVIDIA tends to have better performance with its 700-series parts than equivalently priced R7 or R9 products from AMD, especially in less demanding Source Engine titles such as Team Fortress 2. Sure, even the R7 260X was almost at 120 FPS, but the R9 290 was neck-and-neck with the GeForce GTX 760. The GeForce GTX 770, about $50 cheaper than the R9 290, had a healthy 10% lead over it.

In Unigine Heaven, however, the AMD R9 290 passed the NVIDIA GTX 770 by a small margin, falling right in line with its aforementioned $50-higher price tag. In that situation, where performance became non-trivial, AMD caught up (but did not pull ahead). Also, AMD embraces third-party driver support more than NVIDIA does. On the other hand, NVIDIA's proprietary drivers are demonstrably better, even if you would argue that the specific cases are trivial because of overkill.

And then there's Unvanquished, where AMD's R9 290 did not achieve triple-digit FPS scores despite the $250 GTX 760 getting 110 FPS.

Update: As pointed out in the comments, some games perform significantly better on the $130 R7 260X than the $175 GTX 750 Ti (HL2: Lost Coast, TF2, OpenArena, Unigine Sanctuary). Some other games are the opposite, with the 750 Ti holding a sizable lead over the R7 260X (Unigine Heaven and Unvanquished). Again, Linux performance is a grab bag between vendors.
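One rough way to weigh results like these is frames per second per dollar. A tiny sketch using the figures quoted above where available; the R9 290 entries are placeholders (the article only says it stayed under 100 FPS, and the ~$400 street price is my assumption):

```python
# FPS-per-dollar for the Unvanquished example above. The GTX 760 numbers are
# quoted in the article; the R9 290 values are placeholders ("sub-100 FPS",
# assumed ~$400 street price at the time).
cards = {
    "GTX 760": {"price": 250, "fps": 110},
    "R9 290":  {"price": 400, "fps": 99},
}

for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['price']:.2f} FPS per dollar")
```

Even with generous placeholder numbers, the value gap in that particular title is hard to ignore.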

There are a lot of things to consider, especially if you are getting into Linux gaming. I expect that it will be a hot topic soon, as it picks up... ... Steam.

Source: Phoronix

AMD Catalyst 14.4 Release Candidate is now available

Subject: Graphics Cards | April 22, 2014 - 10:06 AM |
Tagged: catalyst 14.4, catalyst, amd

The latest available AMD Catalyst Windows and Linux drivers can be found here:
AMD Catalyst Windows: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
AMD Catalyst Linux: http://support.amd.com/en-us/kb-articles/Pages/latest-linux-beta-driver.aspx

Highlights of AMD Catalyst™ 14.4 Windows Driver

  • Support for the AMD Radeon R9 295X2

CrossFire fixes and enhancements:

  • Crysis 3 – frame pacing improvements
  • Far Cry 3 – 3- and 4-GPU performance improvements at high quality and high resolution settings
  • Anno 2070 – Improved CrossFire scaling up to 34%
  • Titanfall – Resolved in game flickering with CrossFire enabled
  • Metro Last Light – Improved CrossFire scaling up to 10%
  • Eyefinity 3x1 (with three 4K panels) no longer cuts off portions of the application
  • Stuttering has been improved in certain applications when selecting mid-Eyefinity resolutions with V-sync Enabled

Full support for OpenGL 4.4
Mantle beta driver improvements:

  • Battlefield 4: Performance slowdown is no longer seen when performing a task switch/Alt-Tab
  • Battlefield 4: Fuzzy images are no longer seen when playing in a rotated SLS resolution on an A10 Kaveri system

Highlights of AMD Catalyst™ 14.4 Linux Driver

  • Support for the AMD Radeon R9 295X2
  • Ubuntu 12.04.4 support
  • Full support for OpenGL 4.4

Resolved Issue highlights:

  • Corruption and system hang observed while running Sanctuary BM with Tear Free Desktop enabled
  • Memory leak about hardware context EGL create context error for glesx
  • GPU hang in CrossFire Mode [Piglit]
  • Test "spec/arb_vertex_array_object" failed [Piglit]
  • Test "glx/GLX_EXT_import_context/free context" failed [Piglit]
  • Test "spec/ARB_seamless_cube_map" failed Piglit]
  • Test "texture swizzle with border color" failed
  • Glxtest failures observed in log file
  • Blank screen observed while running Steam games with Big Picture
  • 4ms delay observed in glXSwapBuffers when V-sync is enabled
  • RBDoom3BFG the game auto quit when use the security camera terminal
  • ETQW segmentation fault

Source: AMD

Nope, Never Settling... Forever. More Bundles.

Subject: General Tech, Graphics Cards | April 21, 2014 - 10:55 AM |
Tagged: radeon, never settle forever, never settle, amd

AMD has been taking PC gaming very seriously, especially over the last couple of years. While they have a dominant presence in the console space, with only IBM in opposition, I believe that direct licensing revenue was not their main goal; rather, they hope to see benefits carry over to the PC and maybe mobile spaces, eventually. In the PC space, Never Settle launched as a very successful marketing campaign. While it stumbled with the launch of the R9 (and R7) product lines, it is back and is still called "Never Settle Forever".

Keeping with Forever's alteration to the Never Settle formula, the type of card you purchase yields a Gold, Silver, or Bronze reward. Gold (the R9 280 and R9 290 series, and the R9 295X2) gets three free games from the Gold tier, Silver (R9 270 and R7 260 series) gets two from the Silver tier, and Bronze (R7 250 and R7 240 series) gets one free game from the Bronze tier. By and large, the tiers are the same as last time, plus a few old games and one upcoming Square Enix release: Murdered: Soul Suspect. AMD has also made deals with certain independent developers, where two indie titles bundled together count as one choice.

The complete breakdown of games is as follows:

 
| Game | Gold (Choose 3) | Silver (Choose 2) | Bronze (Choose 1) |
| --- | --- | --- | --- |
| Murdered: Soul Suspect (June 3, 2014) | Yes | Yes | No |
| Thief | Yes | Yes | No |
| Tomb Raider | Yes | Yes | No |
| Hitman: Absolution | Yes | Yes | No |
| Sleeping Dogs | Yes | Yes | No |
| Dungeon Siege III | Yes | Yes | Yes |
| Dirt 3 | Yes | Yes | Yes |
| Alan Wake | Yes | Yes | Yes |
| Darksiders | Yes | Yes | Yes |
| Darksiders II | Yes | Yes | Yes |
| Company of Heroes 2 | Yes | Yes | Yes |
| Total War: Shogun 2 | Yes | Yes | Yes |
| Titan Quest (Gold Edition) | Yes | Yes | Yes |
| Supreme Commander (Gold Edition) | Yes | Yes | Yes |
| Deus Ex: Human Revolution | Yes | Yes | No |
| Payday 2 | Yes | Yes | No |
| Just Cause 2 | Yes | Yes | Yes |
| Banner Saga + Mutant Blobs Attack (indie combo) | Yes | Yes | Yes |
| Guacamelee + DYAD (indie combo) | Yes | Yes | Yes |
| Mutant Blobs Attack + DYAD (indie combo) | Yes | Yes | Yes |
| Banner Saga + DYAD (indie combo) | Yes | Yes | Yes |
| Mutant Blobs Attack + Guacamelee (indie combo) | Yes | Yes | Yes |

Oddly enough, there does not seem to be a Banner Saga + Guacamelee combo...

... the only impossible combination.
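That claim is easy to verify: four indie titles allow six possible pairings and the table lists five, so exactly one must be missing. A quick sketch (my own check, not AMD's) that enumerates it:

```python
from itertools import combinations

indies = ["Banner Saga", "Mutant Blobs Attack", "Guacamelee", "DYAD"]

# The five indie combos offered in the table above.
offered = {
    frozenset({"Banner Saga", "Mutant Blobs Attack"}),
    frozenset({"Guacamelee", "DYAD"}),
    frozenset({"Mutant Blobs Attack", "DYAD"}),
    frozenset({"Banner Saga", "DYAD"}),
    frozenset({"Mutant Blobs Attack", "Guacamelee"}),
}

# All six possible pairs minus the five offered leaves exactly one.
missing = [set(p) for p in combinations(indies, 2) if frozenset(p) not in offered]
print(missing)  # [{'Banner Saga', 'Guacamelee'}]
```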

AMD has also announced that Never Settle will continue with more "additions" in 2014. Which ones? Who knows. It is clear that they have a great working relationship with Square Enix Europe, having included basically its last six major titles in Never Settle and kept them there, but there is not really anything from them on the horizon (at least, nothing announced). AMD does sound confident in having other deals lined up this year, however.

Never Settle Forever graphics cards are available now "at participating retailers". Bundle codes can be redeemed any time between now and August 31st.

There is some regional variance in game availability, however. Read up before you purchase (especially if you live in Japan). You should be fine if you live in North America, Europe, the Middle East, Africa, New Zealand, Australia, or Latin America, at least where AMD products are available. Still, it is a good idea to check.

Source: AMD
Author:
Manufacturer: Various

Competition is a Great Thing

While doing some testing with the AMD Athlon 5350 Kabini APU to determine its flexibility as a low cost gaming platform, we decided to run a handful of tests to measure something else that is getting a lot of attention right now: AMD Mantle and NVIDIA's 337.50 driver.

Earlier this week I posted a story that looked at performance scaling of NVIDIA's new 337.50 beta driver compared to the previous 335.23 WHQL. The goal was to assess the DX11 efficiency improvements that the company stated it had been working on and implemented into this latest beta driver offering. In the end, we found instances where games scaled by as much as 26% and even 35%, but other cases where there was little to no gain with the new driver. We looked at both single GPU and multi-GPU scenarios, though mostly on high end CPU hardware.

Earlier in April I posted an article looking at Mantle, AMD's lower-level API unique to its ecosystem, and how it scaled on various pieces of hardware in Battlefield 4. This was the first major game to implement Mantle and it remains the biggest name in the field. While we definitely saw some improvements in gaming experience with Mantle, there was work to be done when it comes to multi-GPU scaling and frame pacing.

Both parties in this debate were showing promise but obviously both were far from perfect.

While we were benchmarking the new AMD Athlon 5350 Kabini based APU, an incredibly low cost processor that Josh reviewed in April, it made sense to test both Mantle and NVIDIA's 337.50 driver in an interesting side-by-side comparison.

Continue reading our story on the scaling performance of AMD Mantle and NVIDIA's 337.50 driver with Star Swarm!!

An overclocked flagship GPU duel

Subject: Graphics Cards | April 17, 2014 - 01:10 PM |
Tagged: amd, nvidia, gigabyte, asus, R9 290X, GeForce GTX 780 Ti, factory overclocked

In the green trunks is the ASUS GTX 780 Ti DirectCU II OC, which [H]ard|OCP overclocked to the point that they saw in-game clocks of 1211MHz on the GPU and 7.2GHz on the memory.  In the red trunks we find Gigabyte's R9 290X 4GB OC, weighing in at 1115MHz and 5.08GHz for the GPU and memory respectively.  Both cards have been pushed beyond the factory overclocks they came with and will fight head to head in such events as Battling the Field, Raiding the Tomb, and counting to three twice, once in a Crysis and again in a Far Cry from safety.  Who will triumph?  Will the battle be one sided, or will the contenders trade the top spot depending on the challenge?  Get the full coverage at [H]ard|OCP!

"Today we look at the GIGABYTE R9 290X 4GB OC and ASUS GeForce GTX 780 Ti DirectCU II OC video cards. Each of these video cards features a custom cooling system, and a factory overclock. We will push the overclock farther and put these two video cards head-to-head for a high-end performance comparison."

Source: [H]ard|OCP
Author:
Manufacturer: NVIDIA

SLI Testing

Let's see if I can start this story without sounding too much like a broken record compared to the news post I wrote late last week on the subject of NVIDIA's new 337.50 driver. In March, while attending the Game Developers Conference to learn about the upcoming DirectX 12 API, I sat down with NVIDIA to talk about changes coming to its graphics driver that would affect current users with shipping DX9, DX10, and DX11 games.

As I wrote then:

What NVIDIA did want to focus on with us was the significant improvements that have been made to the efficiency and performance of DirectX 11.  When NVIDIA is questioned as to why they didn’t create their own Mantle-like API if Microsoft was dragging its feet, they point to the vast improvements possible, and made, with existing APIs like DX11 and OpenGL. The idea is that rather than spend resources on creating a completely new API that needs to be integrated in a totally unique engine port (see Frostbite, CryEngine, etc.), NVIDIA has instead improved the performance, scaling, and predictability of DirectX 11.

NVIDIA claims that these fixes are not game specific and will improve performance and efficiency for a lot of GeForce users. Even if that is the case, we will only really see these improvements surface in titles that have addressable CPU limits or very low end hardware, similar to how Mantle works today.

In truth, this is something that both NVIDIA and AMD have likely been doing all along, but NVIDIA has renewed purpose with the pressure that AMD's Mantle has placed on them, at least from a marketing and PR point of view. It turns out that the driver that starts to implement all of these efficiency changes is the recent 337.50 release, and on Friday I wrote up a short story that tested a particularly good example of the performance changes, Total War: Rome II, with a promise to follow up this week with additional hardware and games. (As it turns out, the results from Rome II are... an interesting story. More on that on the next page.)

Today I will be looking at a seemingly random collection of gaming titles, running on a reconfigured test bed we had in the office, in an attempt to get some idea of the overall robustness of the 337.50 driver and its advantages over the 335.23 release that came before it. Does NVIDIA have solid ground to stand on when it comes to the capabilities of current APIs versus what AMD is offering today?

Continue reading our analysis of the new NVIDIA 337.50 Driver!!

Win a Galaxy GeForce GTX 750 Ti GC or GeForce GTX 750 GC!

Subject: General Tech, Graphics Cards | April 16, 2014 - 08:39 AM |
Tagged: video, giveaway, galaxy, contest

UPDATE: Our winners have been selected and notified! Thanks to everyone for participating, and stay tuned to pcper.com as we'll have more contests starting VERY SOON!!!

Our sponsors are the best, they really are. Case in point: Galaxy would like us to give away a pair of graphics cards to our fans. On the block for the contest are a Galaxy GTX 750 Ti GC and a Galaxy GTX 750 GC, both based on NVIDIA's latest generation Maxwell GPU architecture.

I posted a GTX 750 Ti Roundup story that looked at the Galaxy GTX 750 Ti GC, and it impressed in both stock performance and the amount of overclocking headroom provided by the custom cooler.

How can you win these awesome prizes? Head over to our YouTube channel to find the video, or just watch it below! You need to be a subscriber to our YouTube channel and leave a comment on the video itself over on YouTube.

Anyone, anywhere in the world, can win. We'll pick a winner on April 16th – good luck!

Source: YouTube

NVIDIA GeForce Driver 337.50 Early Results are Impressive

Subject: Graphics Cards | April 11, 2014 - 12:30 PM |
Tagged: nvidia, geforce, dx11, driver, 337.50

UPDATE: We have put together a much more comprehensive story based on the NVIDIA 337.50 driver that includes more cards and more games while also disputing the Total War: Rome II results seen here. Be sure to read it!!

When I spoke with NVIDIA after the announcement of DirectX 12 at GDC this past March, a lot of the discussion centered around a pending driver release that promised impressive performance advances with current DX11 hardware and DX11 games. 

What NVIDIA did want to focus on with us was the significant improvements that have been made to the efficiency and performance of DirectX 11.  When NVIDIA is questioned as to why they didn’t create their own Mantle-like API if Microsoft was dragging its feet, they point to the vast improvements possible, and made, with existing APIs like DX11 and OpenGL. The idea is that rather than spend resources on creating a completely new API that needs to be integrated in a totally unique engine port (see Frostbite, CryEngine, etc.), NVIDIA has instead improved the performance, scaling, and predictability of DirectX 11.

NVIDIA claims that these fixes are not game specific and will improve performance and efficiency for a lot of GeForce users. Even if that is the case, we will only really see these improvements surface in titles that have addressable CPU limits or very low end hardware, similar to how Mantle works today.

Lofty goals to be sure. This driver was released last week and I immediately wanted to test and verify many of these claims. However, a certain other graphics project kept me occupied most of the week and then a short jaunt to Dallas kept me from the task until yesterday. 

To be clear, I am planning to look at several more games and card configurations next week, but I thought it was worth sharing our first set of results. The test bed in use is the same as in our standard GPU reviews.

Test System Setup

| Component | Configuration |
| --- | --- |
| CPU | Intel Core i7-3960X Sandy Bridge-E |
| Motherboard | ASUS P9X79 Deluxe |
| Memory | Corsair Dominator DDR3-1600 16GB |
| Hard Drive | OCZ Agility 4 256GB SSD |
| Sound Card | On-board |
| Graphics Cards | NVIDIA GeForce GTX 780 Ti 3GB, NVIDIA GeForce GTX 770 2GB |
| Graphics Drivers | NVIDIA 335.23 WHQL, 337.50 Beta |
| Power Supply | Corsair AX1200i |
| Operating System | Windows 8 Pro x64 |

The most interesting claims from NVIDIA were spikes as high as 70%+ in Total War: Rome II, so I decided to start there. 

First up, let's take a look at SLI results for the GTX 780 Ti, NVIDIA's flagship gaming card.

(Charts: GeForce GTX 780 Ti SLI, Total War: Rome II at 2560x1440 – average FPS, frame time percentiles, and frame time plot.)

With this title running at the Extreme preset, the GTX 780 Ti SLI setup jumps from an average frame rate of 59 FPS to 88 FPS, an increase of 48%! Frame rate variance does increase a bit with the faster average frame rate, but it stays within the limits of smoothness, if barely.

Next up, the GeForce GTX 770 SLI results.

(Charts: GeForce GTX 770 SLI, Total War: Rome II at 2560x1440 – average FPS, frame time percentiles, and frame time plot.)

Results here are even more impressive, as the pair of GeForce GTX 770 cards running in SLI jumps from an average of 29.5 FPS to 51 FPS, an increase of 72%!! Even better, this occurs without any increase in frame rate variance; in fact, the blue line of the 337.50 driver actually performs better in that respect.
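For reference, those percentages are simply the relative change in average frame rate. A couple of lines reproduce them; the small differences from the quoted figures presumably come from rounding of the published averages:

```python
def gain(before_fps: float, after_fps: float) -> float:
    """Relative improvement in average frame rate, as a percentage."""
    return (after_fps - before_fps) / before_fps * 100

print(f"GTX 780 Ti SLI: {gain(59.0, 88.0):.0f}%")  # ~49%, quoted as 48%
print(f"GTX 770 SLI:    {gain(29.5, 51.0):.0f}%")  # ~73%, quoted as 72%
```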

All of these tests were run with the latest patch on Total War: Rome II and I did specifically ask NVIDIA if there were any differences in the SLI profiles between these two drivers for this game. I was told absolutely not - this just happens to be the poster child example of changes NVIDIA has made with this DX11 efficiency push.

Of course, not all games are going to see performance improvements like this, or even improvements that are measurable at all. Just as we have seen with other driver enhancements over the years, different hardware configurations, image quality settings, and even the scenes used to test each game will shift the deltas considerably. I can tell you already that, based on some results I have (but am holding for my story next week), performance improvements in other games range from <5% up to 35%+. While those aren't reaching the 72% level we saw in Total War: Rome II above, these kinds of experience changes from driver updates are impressive to see.

Even though we are likely looking at the "best case" for NVIDIA's 337.50 driver changes with the Rome II results here, clearly there is merit behind what the company is pushing. We'll have more results next week!