AMD Eyefinity and NVIDIA Surround plus 4K
As I mentioned on the previous page, Eyefinity and Surround are technologies that have been around for a long time but are picking up steam as the price of high quality 1080p monitors drops. AMD was the progenitor of multi-display gaming with Eyefinity, and the technology has been part of its promotion and marketing for Radeon graphics cards, both high-end and mid-range, for years.
These configurations put a lot of pressure on GPUs. A 2560×1440 screen contains 3.68 million pixels, while a 5760×1080 setup (three 1080p screens) contains 6.22 million. To maintain a steady frame rate of 60 FPS, the gamer’s PC must therefore push 69% more pixels per second. Single GPU configurations (with the possible exception of NVIDIA’s GeForce GTX TITAN) are unable to meet this demand at higher quality settings, so many users that invest in multiple panels for Eyefinity/Surround are also investing in multiple GPUs for CrossFire or SLI.
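As a quick sanity check on the numbers above, the pixel-count arithmetic works out as follows (an illustrative sketch, not part of the actual test setup):

```python
# Pixel counts for the two display configurations discussed above.
single_1440p = 2560 * 1440      # one 2560x1440 panel
triple_1080p = 5760 * 1080      # three 1080p panels in Eyefinity/Surround

print(f"2560x1440: {single_1440p:,} pixels")    # 3,686,400
print(f"5760x1080: {triple_1080p:,} pixels")    # 6,220,800

# Extra per-second pixel load at the same frame rate:
extra = triple_1080p / single_1440p - 1
print(f"extra load: {extra:.0%}")               # 69%
```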
And thus our problem is stated. AMD has said that the 13.8 Catalyst beta driver (and the fix found in it) was only good for resolutions of 2560×1600 and below. There was some confusion over whether this included Eyefinity, as some people speculated that because each screen was only 1920×1080 it would work. As it turns out, that isn’t the case.
Another problem AMD may have coming up is the move to 4K. Current 60 Hz 4K gaming panels like the ASUS PQ321Q use a dual-head configuration. That means that although only a single DisplayPort cable connects to the monitor, multiple streams are sent across it, and the panel reports to the PC as a multi-monitor connection. Essentially, gaming at 4K 60 Hz on current screens amounts to running a two-screen Eyefinity or Surround configuration, with each “head” driving 1920×2160. The combined resolution is 3840×2160 for a total pixel count of 8.29 million pixels, or 2.25x the pixels of 2560×1440.
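The dual-head arithmetic can be checked the same way (again, just an illustrative sketch):

```python
# Each MST "head" of a first-generation 60 Hz 4K panel drives half the screen.
head_w, head_h = 1920, 2160
panel_w, panel_h = head_w * 2, head_h        # combined: 3840 x 2160

pixels_4k = panel_w * panel_h
pixels_1440p = 2560 * 1440

print(f"4K pixels: {pixels_4k:,}")                 # 8,294,400
print(f"vs 1440p:  {pixels_4k / pixels_1440p}x")   # 2.25x
```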
Obviously, multi-GPU setups are going to be in high demand for 4K gaming.
The ASUS PQ321Q
In order to test AMD Eyefinity and NVIDIA Surround with our Frame Rating configuration, we had to be a bit more creative. Previous testing was done simply by using a dual-link DVI splitter sitting between the panel and the graphics card, with the secondary output going into our external system and capture card. That card records the raw data and then scripts analyze the video (with overlay) to get performance data.
With Eyefinity and Surround there are three screens, so which one do we capture? The answer turns out to be any of them. Thanks to an updated overlay that can present as many sets of overlay bars on the screen as we request, we can capture the left, center, or right-hand screen to see performance data. We are simply intercepting ONE of the screens rather than the ONLY screen.
Our Datapath DVI-DL Capture Card
As it turns out, performance measurements are the same regardless of which screen you capture from, as you would expect. For my testing, then, we decided to capture the center monitor, which also allows us to create some side-by-side animated comparison videos.
| Test System Setup | |
| --- | --- |
| CPU | Intel Core i7-3960X Sandy Bridge-E |
| Motherboard | ASUS P9X79 Deluxe |
| Memory | Corsair Dominator DDR3-1600 16GB |
| Hard Drive | OCZ Agility 4 256GB SSD |
| Sound Card | On-board |
| Graphics Cards | AMD Radeon HD 7970 3GB CrossFire / NVIDIA GeForce GTX 770 2GB SLI |
| Graphics Drivers | AMD: 13.8 (beta) / NVIDIA: 326.41 |
| Power Supply | Corsair AX1200i |
| Operating System | Windows 8 Pro x64 |
For my testing I wanted to use the highest-end graphics offerings while maintaining price parity. The Radeon HD 7970 GHz Edition has come down steeply in price and currently sells for as low as $370 with a 3GB frame buffer. So while NVIDIA does have faster single GPU offerings, the best comparison for our purposes is the GeForce GTX 770 2GB cards that sell for about $390. I do realize that the GTX 780 and GTX TITAN would offer better single GPU and SLI results for NVIDIA, but keeping the prices within range is key for any PC component discussion.
Keep the faith, Ryan and co. Just continue to call it like you see it and let the chips fall where they may.
Hopefully AMD will get its stuff together otherwise they are going to lose a few folks.
I sincerely admire your journalistic integrity Ryan… as well everyone else at the PCper team!
-Stewart Graham
What a difference AMD’s graft makes; they improved their driver. Is this also the case with an APU + graphics card? Good job Ryan.
Well, currently rolling with 2×7970s on a 1920×1200 triple display setup. Can’t say I’ve ever really been personally bothered enough by the various issues raised in the article, regarding the frame interleaving and stepped tearing, to stop playing, though I trust the guys over at PCPer to give it to me straight. I noticed the stuttering with CrossFire more than anything else you guys brought up with your new testing methodology. I think most of us gamers at least gained a better understanding of the various issues involved. Sometimes my benchmarking application (be it FRAPS or Dxtory) would say I was getting a certain frame rate but the game just felt too jittery, whereas if I disabled CrossFire the game felt smoother even with a lower framerate.
That is not to say I haven’t thoroughly enjoyed my 7970s/Eyefinity setup. When I’ve been able to play at Eyefinity resolutions I’ve done so; when I haven’t, I’ve just adjusted my quality or resolution settings until I could get a smooth enough playing experience.
Do I hope that AMD is able to smooth out those circumstances where I can’t play at a given resolution/quality due to micro-stuttering with CrossFire? Yeah, that would be awesome. I think a lot of us out here still don’t have a full appreciation for the phenomenon, not having been able to test multi-GPU solutions side by side, so it just comes down to “the game doesn’t feel fluid enough at my current settings so I’ll dial them down until it does”, which I’m sure people have different sensitivities to. Keep up the good work PCPer crew.
What about this, Ryan?
http://www.brightsideofnews.com/news/2013/9/18/nvidia-launches-amd-has-issues-marketing-offensive-ahead-of-hawaii-launch.aspx#.Ujo-ScnFMsU.twitter
Why use two HDMI cables when you can use a single DisplayPort cable? The problem does not exist with DisplayPort.
At first I thought this article may have been over-egging the problem with Eyefinity + CrossFire. Having now disconnected my second HD 7970 and played a few games in Eyefinity, I have seen that it is not. RadeonPro may tell me that I’m getting half the FPS that I was, but my eyes see the same low-FPS experience.
Not impressed, AMD. I feel like a chump for spending £300 on a card whose only additional effect on my system has been extra heat and noise.
Still, at least I can go back and play Far Cry 3 now without the giant oversized HUD problem.
Thanks For the good article and thanks for bending AMD’s ear.
A damned good read, thanks Ryan. AMD owners should be pleased that these issues are highlighted, making sure AMD keeps on its toes. Like the FCAT article, it was good to see AMD address the issue and get it fixed, and again it was PCPer who made AMD aware of the issues (like they didn’t already know!) and forced them into sorting it out for their users.
I think a lot of hardware makers define CPU differently than Microsoft does. You can ask; I wrote a bug report, and today it’s 13.10 beta. If I recall, message signaled interrupts (MSI) and the extended variant (MSI-X) were implemented in Vista for consumers? ROFL, we know how Vista was received, so this might be one overlooked good thing. My case? In regedit MSI was enabled (sadly some couldn’t be enabled for some reason), but no MSI message count was set! If it isn’t set, isn’t it defaulting to one MSI per socket? But I have 4 CPUs in my i5 2500K (well, only one physical CPU, as MS counts it), so imagine an AMD 8-core FX stuck with 1 MSI/MSI-X. I think this is the cause. Sadly, on my system none were set. I normally tweak, but from what I saw on MS it isn’t a case of 0 or 1, and MS recommends a hex value. ROFL, a bit too complex for my knowledge, but you guys know a lot of hardcore tweakers. If I’m right? I would be like, what the heck, am I the only one that used Vista?
PS: What I wrote is for Windows 8 64-bit! But I suspect a lot of hardware makers default to 1 (probably easier to implement), since socket counts range from 2 to 12; detecting might be entertaining.
2.1.4.1. Resolution, Granularity, and Accuracy of System Time
http://www.windowstimestamp.com/description
Bottom line? I hate compromise!