
Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing

AMD CrossFire and Eyefinity Concerns

It will become painfully apparent as we dive into the benchmark results on the following pages, but I feel that addressing the issues CrossFire and Eyefinity create up front will make those results easier to understand.  As we showed for the first time in Frame Rating Part 3, AMD CrossFire configurations have a tendency to produce a lot of runt frames, in many cases in a nearly perfect alternating pattern.  Not only does this mean that frame time variance will be high, it also tells me that the performance gained by adding a second GPU is essentially worthless in this case.  The obvious question then becomes: in Battlefield 3, does it even make sense to use a CrossFire configuration?  Based on the graphs below, my answer would be no.
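
To make the runt problem concrete, here is a minimal sketch of the idea behind runt classification, assuming per-frame scanline heights have already been extracted from the captured, overlay-tagged video. The threshold, function names, and example numbers are illustrative assumptions, not the exact values used in our Perl tooling.

    RUNT_THRESHOLD_SCANLINES = 20  # illustrative cutoff; a "real" frame should be taller than this

    def observed_fps(frame_heights, capture_fps=60, lines_per_capture_frame=1080):
        # frame_heights: number of scanlines each rendered frame occupied in the capture
        useful = [h for h in frame_heights if h >= RUNT_THRESHOLD_SCANLINES]
        capture_frames = sum(frame_heights) / lines_per_capture_frame
        seconds = capture_frames / capture_fps
        return len(useful) / seconds

    # A CrossFire run that alternates full frames with runts: FRAPS-style counting
    # sees 120 FPS here, but only about 60 FPS of it is useful animation.
    heights = [1070, 10] * 30
    print(observed_fps(heights))  # -> 60.0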


Another issue that cropped up with AMD configurations in CrossFire mode showed its face when we tried 5760x1080 testing.  In nearly every case, an AMD dual-GPU CrossFire setup running Eyefinity at 5760x1080 showed alternating dropped frames.  Whereas with single-monitor gaming we were seeing some full game frames turned into runts at the display, with Eyefinity those frames were missing completely (seen as overlay colors absent from the expected pattern). 

This actually caused two things to happen.  First, our Perl scripts and data generation would sometimes error out completely (after the capture and extraction steps) because the code was never able to find the initial sequence of 16 colors in the correct order to validate the video capture.  We are still working on a fix for this in order to present that information in a usable format, but for now we will point out specifically when that occurred.  Second, the frame delivery was smoother!  Without the runts getting in the way of the animation, the motion on screen was actually more fluid than it would have been otherwise, but don't take this as a vindication of what AMD's Eyefinity is doing.  While the animation is smoother, you are still not seeing any benefit from the second card in your CrossFire configuration and you could view it simply as a wasted investment.
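
For readers curious what that validation step looks like, here is a rough sketch of the logic, assuming the extraction step has already reduced each captured frame to an overlay color index from 0 to 15. The actual tool is a set of Perl scripts and handles many more edge cases; the names below are hypothetical.

    EXPECTED_SEQUENCE = list(range(16))  # the 16 overlay colors, in order

    def find_sequence_start(colors):
        # Locate the initial run of all 16 colors in order; if frames are being
        # dropped (as with Eyefinity), this run never appears and validation fails.
        n = len(EXPECTED_SEQUENCE)
        for i in range(len(colors) - n + 1):
            if colors[i:i + n] == EXPECTED_SEQUENCE:
                return i
        return None

    def count_dropped_frames(colors):
        # Any jump of more than one step in the repeating 16-color cycle means at
        # least one rendered frame never made it to the display.
        drops = 0
        for prev, curr in zip(colors, colors[1:]):
            step = (curr - prev) % 16
            if step > 1:
                drops += step - 1
        return drops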

Finally, we saw some interesting visual problems in our captures of Eyefinity that caused us to raise some eyebrows.  Because the overlay changes color with every frame, I was able to notice instances where the digital scan out of the graphics cards was not matching up; frames were being split by other frames.


This second screen uses an alternate overlay for specific multi-GPU scenarios

At first I was afraid something was going on with our capture hardware, that somehow the EDID of the Datapath VisionDVI-DL capture card was communicating incorrectly with the AMD GPUs.  But in fact, we saw several problems and inconsistencies with AMD's graphics performance when more than one display was attached to the system, even if we were NOT in an Eyefinity setup!  As I later learned, Vsync does not work at all with Eyefinity, and that, combined with the results of our testing (of which the screenshot above is one indicator), led me to believe that something is fundamentally wrong with AMD's Eyefinity implementation.  And if it's not "wrong", it is definitely counterintuitive.  We'll be asking AMD about this in the coming weeks and hope to get more information from them as our Frame Rating process evolves. 

 

For our first set of data we will be looking at the Radeon HD 7970 GHz Edition and the GeForce GTX 680 in both single and dual-GPU configurations.  By looking at the high-end graphics cards first, we can set a baseline for what to expect as we move on to the mid-range and even mainstream options.

March 30, 2013 | 08:47 PM - Posted by bystander (not verified)

I have a hard time trying to grasp exactly how erratic input would affect the results. I have a feeling, based on my constitution (I get simulator sickness with poor latency), that the best case is whichever has the lowest worst-case interval times.

March 31, 2013 | 01:55 AM - Posted by bystander (not verified)

...But then you have occasional latency increases. Of course those increases are there to remove redundant frames, and once increased, they probably don't need much adjustment most of the time.

This whole topic always gets me going back and forth, but my instinct is that overall, even if latency is considered, even spacing matters more, as it adds more useful points of input, assuming it adds only marginal/occasional increases in latency.

March 28, 2013 | 08:20 PM - Posted by Bob Jones (not verified)

Can you address the visual quality differences between the two cards? Specifically on Sleeping Dogs, the 660 Ti seems to be missing some lighting sources outside - most noticeable are the cafe/shop lights before you go down the stairs, and then the store across the street right at the end of the video.

March 29, 2013 | 10:46 AM - Posted by Ryan Shrout

Time of day in the game when those videos were recorded?  They should be basically identical, I'll check.

March 29, 2013 | 03:10 AM - Posted by I don't have a name. (not verified)

Fascinating article. I think it'll take a few reads to fully comprehend everything that is being said. Thank you indeed, I found it fascinating. Certainly, as a 7970 owner, I'll be holding off on a potential second 7970 purchase for the time being.

March 29, 2013 | 05:09 AM - Posted by rezes (not verified)

The latest drivers are GeForce 314.22 beta and Radeon 13.3 beta 3. Please use these drivers in your tests.

The AMD cards may be more stable on these drivers!

March 29, 2013 | 10:46 AM - Posted by Ryan Shrout

Nothing has changed on the issues in question.

March 29, 2013 | 05:15 AM - Posted by technogiant (not verified)

Thanks for the great article and all the work you've done guys.

I run 2x GTX 460s in SLI, and while I dislike screen tearing, I've noticed that options such as vsync, adaptive vsync and frame rate limiters actually make the experience less smooth, as appears to have been highlighted in this article.

I've considered getting a 120Hz monitor just so I can run without any of those options at a decent frame rate, using sufficiently high settings so as not to go above 120 FPS and incur screen tearing.

Thinking further, I'd like Nvidia to develop a variation of their GPU Boost technology that would actually downclock the GPU to prevent frame rates from exceeding the monitor's refresh rate... I think this would give the benefit of no screen tearing without the negatives of vsync and the like.

Thanks again for the article guys.

March 29, 2013 | 05:37 AM - Posted by technogiant (not verified)

Actually, using GPU Boost dynamically to both underclock and overclock the GPU to achieve a target frame rate could be a very nice way of producing a smoother experience without any of the negatives of other methods, as it occurs directly at the GPU instead of in the game-engine-to-display timeline.

March 29, 2013 | 08:01 AM - Posted by Pick23 (not verified)

So is this article saying that even with the new testing methodologies:
1. A single-card 7970 GHz is still slightly better than the 680
2. CrossFire absolutely sucks
?

March 29, 2013 | 08:18 AM - Posted by John Doe (not verified)

The 7970 GHz is slightly better than a 680 ONLY at stock.

When you're comparing the 7970 GHz to the 680, things ENTIRELY depend on clock speeds, since the 7970 GHz is nothing more than a pre-overclocked 7970.

But yes, CF does indeed sorta suck. Still.

March 29, 2013 | 07:26 PM - Posted by Anonymous (not verified)

Sweet, the 7970 is still the best card under... well, under $1000 lol

March 30, 2013 | 09:21 PM - Posted by steen (not verified)

What's stock for a 680? ;) 7970 GE is slightly slower than Titan...

CF sucks just like SLI. What's your poison, input lag or frame metering? Do people understand what "runts" are? CF is actually rendering the frames, you just don't benefit because they're too close together. One frame has rendered only the top 1/4 of the screen when the next frame starts. Your top ~200 lines are the fastest on your screen. ;)

(Sorry for the previous multi-posts. Don't know what happened there.)

April 7, 2013 | 01:15 PM - Posted by John Doe (not verified)

I have run far more CF and SLI setups than you have FOR YEARS and understand far more about these things than you and your silly little mind do.

March 29, 2013 | 11:39 AM - Posted by fausto412 (not verified)

Interesting piece, good job pcper.com.

Now I wonder: when AMD does a global fix, will my 6990's performance be dramatically improved in BF3?

And what is the effect of correcting this on latency and actual framerate? Will we see FPS go down for the sake of frame times?

It is obvious Nvidia has been on top of this for some time... I just don't see a proper fix in 120 days.

March 29, 2013 | 12:24 PM - Posted by Chad Wilson (not verified)

Just out of scientific curiosity, did you do a run without the splitter in the mix to confirm that the splitter is not adding any latency to your tests?

March 29, 2013 | 12:33 PM - Posted by Anonymous (not verified)

Do people actually use vsync without some form of triple buffering?

I don't see how. Jumping between 30 and 60fps or whatever is neither an enjoyable nor a smooth experience.

So, if you can enable vsync AND allow the game to sweep through a normal range of available framerates, does this negate the increased frame times of constantly switching back and forth between high fps and low fps?

March 30, 2013 | 12:56 AM - Posted by bystander (not verified)

V-sync, even with triple buffering, still jumps back and forth between 16ms and 33ms, but it does it between frames. A 120Hz monitor helps here, as you can have 25ms frames too, so there is less variance.
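
A quick sketch of that quantization, under the simplifying assumption that every frame waits for the next vertical refresh (real buffering pipelines differ):

    import math

    def vsync_frame_time_ms(render_time_ms, refresh_hz):
        # With vsync, a frame stays on screen for a whole number of refresh
        # intervals, so its effective frame time rounds up to the next boundary.
        interval = 1000.0 / refresh_hz
        return math.ceil(render_time_ms / interval) * interval

    for hz in (60, 120):
        print(hz, [round(vsync_frame_time_ms(t, hz), 1) for t in (14, 20, 30)])
    # 60  -> [16.7, 33.3, 33.3]  only 16.7ms or 33.3ms steps are possible
    # 120 -> [16.7, 25.0, 33.3]  25ms frames become possible, so less variance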

March 29, 2013 | 12:36 PM - Posted by Anonymous (not verified)

Furthermore, is playing a game without vsync enabled REALLY an option?

Are you sure gamers all over the world disable it to be rid of the latency issues? I'm not so sure.

I'll happily take a little latency in a heated round of Counter-Strike rather than end up dead, or miss my shot because 50% of the screen shifted 8 feet to the right (screen tearing).

Pretty much all games are unplayable without vsync, and I'm not convinced it's a personal preference either; if you enjoy your experience while your frames are tearing, I'd just call you a mean name that insinuates you're not telling the truth.

March 29, 2013 | 02:00 PM - Posted by Marty (not verified)

If you are a competitive player, VSync is not an option; you are lagging an extra frame behind.

March 29, 2013 | 12:45 PM - Posted by rezes

Where are the new tests? And when?

March 29, 2013 | 02:20 PM - Posted by Anonymous (not verified)

So how much did nVidia pay you?

While I can see the potential in this kind of testing, and some of the issues you have mentioned are valid, you have drawn quite a bold and one-sided conclusion using the competitor's software. I'll save my judgement for when this becomes open source.

March 29, 2013 | 08:12 PM - Posted by Fergdog (not verified)

It's not purely Nvidia-made software. If you had read the article or paid attention to this site, you'd know Ryan co-developed this benchmark with Nvidia and has been working on it for a long time.

April 2, 2013 | 11:10 PM - Posted by Anonymous (not verified)

nVidia has to do almost everything; the AMD fans need to get used to it.

AMD's years-long broken bottom line and years of massive layoffs and closings, and yet they claim they "weren't even aware of this issue!?"

- like every other total screw-up AMD gets caught on, with their dragon drawers on the floor next to their spider platform, buck naked in epic fail and presumably clueless...

Maybe we need some depositions and subpoenas of internal emails to see just how much they covered up their epic fail here.

March 29, 2013 | 08:00 PM - Posted by Fergdog (not verified)

Quick question: for adaptive vsync, you put it on half refresh rate, didn't you?

March 29, 2013 | 08:02 PM - Posted by Fergdog (not verified)

Half-refresh adaptive only really makes sense on 120Hz monitors; not sure why you used that setting on a 60Hz monitor for benchmarking.

March 29, 2013 | 11:06 PM - Posted by Anonymous (not verified)

Hoping you post the Titan and AMD info today as promised!!!!!!!!!

March 29, 2013 | 11:07 PM - Posted by Anonymous (not verified)

Titan and amd

March 30, 2013 | 12:56 AM - Posted by MaxBV (not verified)

Waiting on that GTX 690 and Titan article still, hope you guys haven't forgotten.
