Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing

Vsync and its Effect on Frame Rating – Does it fix CrossFire?

After publishing the Frame Rating Part 3 story, I started to see quite a bit of feedback from readers and other enthusiasts, with many requests for information about Vsync and how it might affect the results we are seeing here.  Vertical Sync is the fix for screen tearing, a common artifact seen in gaming (and other media) when the frame rendering rate doesn't match the display's refresh rate.  Enabling Vsync forces the rendering engine to display and switch frames in the buffer only at the vertical refresh rate of the monitor or a divisor of it.  So a 60 Hz monitor can only display frames at 16 ms (60 FPS), 33 ms (30 FPS), 50 ms (20 FPS), and so on, while a 120 Hz monitor is also capable of 8 ms (120 FPS), etc.
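
To make those numbers concrete, here is a minimal sketch (our own illustration, not part of any test tooling) that lists the frame times and rates a Vsync'd panel can actually show:

```python
# Under Vsync, a panel can only hold a completed frame for whole multiples
# of its refresh period, so the possible rates are refresh_rate / n.
def vsync_rates(refresh_hz, steps=4):
    period_ms = 1000.0 / refresh_hz
    return [(n * period_ms, refresh_hz / n) for n in range(1, steps + 1)]

for frame_time_ms, fps in vsync_rates(60):
    print(f"{frame_time_ms:5.1f} ms -> {fps:5.1f} FPS")
# 16.7 ms -> 60.0 FPS, 33.3 ms -> 30.0 FPS, 50.0 ms -> 20.0 FPS, 66.7 ms -> 15.0 FPS
```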

Many early readers hypothesized that simply enabling Vsync would fix the stutter and runt issues that Frame Rating was bringing to light.  To test this, we looked for a game that ran right around the 60 FPS mark in our normal testing with Vsync disabled and then set about re-running the results with it on.  We are using a standard 60 Hz monitor, with the goal of testing some 120 Hz capability soon, after we figure out a final bug or two with our capture configuration.

First up, let’s take a look at the NVIDIA GeForce GTX 680 and GTX 680 SLI and see what shows up.

Because the average frame rate per second graph averages the frame times over one-second windows, the lines won't quite be the straight lines you might have expected.  Looking at the GTX 680 SLI Vsync enabled results, the only key item is that the frame rate doesn't go above 60 FPS like it does with Vsync disabled.
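
For reference, this is roughly how a per-second average flattens an uneven trace; a minimal sketch, assuming simple one-second bucketing rather than whatever windowing our graphing scripts actually use:

```python
# Bucket frames into one-second windows and divide the frame count by the
# window length - an alternating 16/33 ms trace averages out smoothly.
def fps_per_second(frame_times_ms):
    averages, window_ms, frames = [], 0.0, 0
    for t in frame_times_ms:
        window_ms += t
        frames += 1
        if window_ms >= 1000.0:
            averages.append(frames / (window_ms / 1000.0))
            window_ms, frames = 0.0, 0
    return averages

# 30 pairs of 16.7/33.3 ms frames plot at roughly 40 FPS, not 60 or 30.
print(fps_per_second([16.7, 33.3] * 30))
```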

The single card and SLI configurations with Vsync disabled look just like they did on previous pages, but the graph for GTX 680 SLI with Vsync on is very different.  Frame times are switching back and forth between only 16 ms and 33 ms (60 and 30 instantaneous FPS) due to the restrictions of Vsync.  What might not be obvious at first is that the constant shifting back and forth between these two rates (two refresh cycles with one frame, then one refresh cycle with one frame) can actually cause more stuttering and animation inconsistencies than would otherwise appear.

Based on our graph here, we found that with Vsync enabled about 87% of our frames ran at 60 FPS (16 ms) and 13% at 30 FPS (33 ms).  You might be curious how the frame rate can sit at 60 FPS so often with Vsync on when very few frames hit 60 FPS with Vsync off; the answer lies in the rate limiting caused by Vsync.  Because of the back pressure on the game engine caused by the longer frame times (30 FPS, 33 ms) under Vsync, there is more time for the GPUs to “catch up” and render the next frame in 16 ms.
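
Some quick back-of-the-envelope arithmetic with that split (our own numbers, rounded) shows how much real-world performance hides behind the "mostly 60 FPS" result:

```python
# Weighted average frame time for a trace that is 87% 16.7 ms frames and
# 13% 33.3 ms frames - well below a locked 60 FPS.
avg_ms = 0.87 * 16.7 + 0.13 * 33.3   # ~18.9 ms
print(f"{avg_ms:.1f} ms -> {1000.0 / avg_ms:.0f} FPS average")
```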

Our ISU graph of stutter potential tells the story in a more damning light; starting at the 30th percentile, the Vsync enabled setup of GTX 680s in SLI is already running at much higher frame variance, and it only gets worse as we hit the 60s, 80s and 90s.  At the 90th percentile we are seeing frame variance over 12 ms, which is nearly a complete monitor refresh cycle!
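
To illustrate the concept behind the ISU-style percentile plot (a sketch only; our capture tools compute the real metric), sorting the frame-to-frame deltas and reading values off at percentiles looks like this:

```python
# Sort the absolute deltas between consecutive frame times and read off
# percentiles; large values high in the distribution signal stutter.
def frame_variance_percentiles(frame_times_ms, pcts=(30, 60, 80, 90)):
    deltas = sorted(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))
    return {p: round(deltas[min(len(deltas) - 1, len(deltas) * p // 100)], 1)
            for p in pcts}

# A Vsync'd trace bouncing between 16.7 and 33.3 ms shows a ~16.6 ms delta
# at every percentile - exactly the kind of variance the graph flags.
print(frame_variance_percentiles([16.7, 33.3] * 50))
```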

 

Now let’s see how the AMD Radeon HD 7970 results change.

Something interesting is already happening here – the Vsync enabled results from the HD 7970 CrossFire configuration are running at HIGHER average frame rates per second than with Vsync disabled!  The orange line clearly never hits the 60 FPS mark while the black line (Vsync) does. 

Without Vsync we clearly see the runts affecting the plot of frame times here on the HD 7970s in CrossFire but enabling Vsync does appear to eliminate them! 

With our observed frame rate, we have the same results for the HD 7970 CrossFire as we did with our FRAPS results, indicating no dropped frames or runt frames.  Standard CrossFire mode (Vsync off) still shows the horrible results we have come to expect from our analysis today.
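
For anyone curious how runts get folded out of the observed results, here is a conceptual sketch.  The real extractor works from the colored overlay bands in the captured video, and the 21-scanline cutoff below is an illustrative number, not the tool's actual threshold:

```python
RUNT_SCANLINES = 21  # illustrative cutoff only

def observed_frame_times(frames):
    """frames: list of (frame_time_ms, scanlines_on_screen) tuples."""
    merged = []
    for time_ms, lines in frames:
        if lines < RUNT_SCANLINES and merged:
            merged[-1] += time_ms   # runt: fold its time into the prior frame
        else:
            merged.append(time_ms)
    return merged

# A full frame followed by a sliver counts as one 16.7 ms frame, not two.
print(observed_frame_times([(14.7, 1072), (2.0, 8)]))
```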

Our Min FPS percentile graph shows us that we are running at 60 FPS (16 ms) 85% of the time and 30 FPS (33 ms) the rest.  Because our data here is based on the observed frame rates and not the FRAPS frame rates, there is no correlation between the two CrossFire runs.

The ISU graph of stutter potential again indicates that the Vsync enabled option is introducing higher frame variance than we would like, and it is doing so more dramatically and earlier than on the GTX 680s in SLI.

It does appear that enabling Vsync helps alleviate the runt issue seen with AMD Radeon cards in CrossFire, but at the cost of much more frame variance and stuttered animation in games that previously didn't exhibit that problem.

Let's take a look at another CrossFire example with a particular set of circumstances.  I theorized that in a gaming scenario that sat just under 60 FPS with a single GPU, we would still see problematic results when jumping to HD 7970s in CrossFire.  Take our Battlefield 3 2560x1440 testing: with only one HD 7970 we run just under 60 FPS most of the time, which, with Vsync enabled, forces the game to run at 30 FPS with 33 ms frame times.  Ideally we would like to see the move from 33 ms frame times to 16 ms frame times when adding a second HD 7970 in CrossFire, the extra performance pushing the cards steadily over 60 FPS.

Our FRAPS graph looks how we would hope and expect real-world performance to look.  The single HD 7970 ran at a non-standard frame rate where performance was under 60 FPS, but towards the end (the 50 second point), where it could, we see a flat line that is partially hidden behind the pink line.  That pink line represents CrossFire HD 7970s; by doubling the number of GPUs we expected to maximize performance at 60 Hz with Vsync enabled, and we have.

The observed frame rates, calculated by removing runts, show the Vsync DISABLED results on the HD 7970s in CrossFire mirroring what we have seen before, with much lower performance.  However, the Vsync ENABLED results did not change!

The somewhat complicated plot of frame times indicates that at no time did the HD 7970 cards in CrossFire drop below 60 FPS or rise above the 16 ms mark - even though there are thousands of frames under 16 ms (runts) when Vsync is disabled.  Not only that, but performance over the single HD 7970 with Vsync enabled is improved - rather than jumping between 16 ms and 33 ms frame times, we are locked in at 16 ms, matching the 60 Hz refresh of our panel.

The minimum FPS percentile graph shows the same story - the pink line representing the HD 7970s with Vsync turned on looks solid.

Notice as well that with a static 16 ms frame time we see no frame time variance at all in our ISU graph, indicating that the kinds of stutter we are searching for are not showing up at all.

How is this happening?  How is enabling Vsync 'fixing' the runts and frame time issues of CrossFire?  The secret lies in the inherent back pressure of vertical sync, which paces the graphics cards and AMD's CrossFire engine even against their will.  By forcing the GPUs to present at most one frame every 16 ms, Vsync makes the GPUs pace themselves in a way they otherwise would not.  This doesn't happen in every game, though, as we saw in the Crysis 3 results first, and there is a lot more testing that needs to be done with Vsync before we can make a firm decision.
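
A toy model shows the effect; this is our own illustration of the back pressure, not AMD's pipeline.  However quickly the GPUs finish frames, each one waits for the next 16.7 ms refresh boundary, and that wait paces the frame behind it:

```python
import math

REFRESH_MS = 1000.0 / 60

def present_times(render_times_ms):
    gpu_free, presents = 0.0, []
    for render in render_times_ms:
        done = gpu_free + render                          # frame finishes rendering
        flip = math.ceil(done / REFRESH_MS) * REFRESH_MS  # waits for the vblank
        presents.append(round(flip, 1))
        gpu_free = flip   # back pressure: the next frame starts at the flip
    return presents

# Uneven 3/13 ms AFR-style render pairs - runt territory with Vsync off -
# come out presented a steady 16.7 ms apart with Vsync on.
print(present_times([3, 13, 3, 13, 3, 13]))
# [16.7, 33.3, 50.0, 66.7, 83.3, 100.0]
```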

 

NVIDIA has a couple of different solutions in the NVIDIA Control Panel that might help: Adaptive Vsync and Smooth Vsync.  Adaptive Vsync was released with the first Kepler GPUs last year, and we found it to be very effective at reducing stutter while also eliminating tearing.  Smooth Vsync is a little known feature that only exists in the driver when SLI is enabled, as it takes advantage of many of the same frame metering features that SLI uses.  It attempts to keep frame rates “settled” at a level until it decides it has enough horsepower to move up to the next frame rate option for an extended period of time.  It is a vague description at best, and NVIDIA didn't go into much detail on how it decides whether there is enough GPU headroom remaining or how long that “period of time” really is.
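
Since NVIDIA hasn't documented the heuristic, the following is a purely hypothetical sketch of the behavior described to us; every name and threshold here is our own guess, not NVIDIA's implementation:

```python
# Settle at a refresh divisor (60/30/20 FPS on a 60 Hz panel) and only step
# up after the GPU has proven it can sustain the faster rate for a while.
def smooth_vsync_divisors(render_times_ms, refresh_ms=1000.0 / 60, hold=60):
    divisor, streak = 2, 0            # start settled at 30 FPS
    for t in render_times_ms:
        if divisor > 1 and t <= (divisor - 1) * refresh_ms:
            streak += 1               # fast enough for the next level up...
            if streak >= hold:        # ...for an "extended period of time"
                divisor, streak = divisor - 1, 0
        else:
            streak = 0
            if t > divisor * refresh_ms:
                divisor += 1          # cannot sustain this level; settle lower
        yield divisor
```

Feeding it a trace of render times yields the divisor such a driver would hold at each frame, which matches the "fixed at 30 FPS with occasional jumps" shape we see below.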

I decided to run through the same Crysis 3 sequences at 1920x1080 on the GTX 680s in SLI under all four NVIDIA options: Vsync off, Vsync on, Adaptive Vsync and Smooth Vsync.

Our FRAPS based results for standard Vsync on and off look the same as before, but the Adaptive and Smooth Vsync options appear to be fixed at 30 FPS, with the occasional hiccup on Smooth Vsync.

The plot of frame times is kind of confusing, but the important comparison is standard Vsync On against Adaptive and Smooth.  With the exception of the six or so spikes on the Smooth configuration, the frames are basically fixed at 33 ms, resulting in a perfectly smooth gameplay experience but at the expense of limiting performance.

The observed FPS doesn’t change at all.

Another view here shows the same thing, with a fixed frame rate of 30 FPS for the Adaptive and Smooth Vsync options.

NVIDIA’s Adaptive Vsync shows essentially zero variance, and the Smooth Vsync option only very minimal variance at the 96th percentile.  So even though performance is lower on average, the experience is smoother.

 

NVIDIA’s additional Vsync options are definitely a strong point in favor of its technology, though Smooth Vsync only exists on SLI configurations.  I have been told that NVIDIA was considering adding it to single graphics card configurations, and I certainly hope they do, as it adds significant value in the same way Adaptive Vsync and Frame Rate Limiting do.

For both NVIDIA and AMD multi-GPU solutions, enabling standard Vsync definitely changes the story.  NVIDIA's cards pretty much perform as we expected, but for CrossFire we didn't really know what to expect given the various visual concerns.  It does appear that the runts problem was at least mostly solved by enabling Vsync, though to be clear we are only testing a couple of games at this point – much more needs to be done.

However, enabling Vsync creates a whole host of other potential issues that gamers have to deal with.  Even though the goal of removing visual tearing is met with the option turned on, you do add latency to the gameplay experience, as much as 60 ms in some cases, from input to display.  Putting back pressure on the GPU pipeline, for both NVIDIA and AMD, means that some frames are going to run behind schedule or behind the input timing of the game itself.  Many gamers won't want to deal with those kinds of input problems, and that is why many still play games with Vsync disabled.  Turning on Vsync does help AMD's CrossFire performance, but it isn't the final answer just yet.
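
As a rough illustration of where a figure like 60 ms comes from (our arithmetic, assuming a simple swap chain on a 60 Hz panel):

```python
# Worst-case input-to-display chain with Vsync on, 60 Hz panel.
refresh_ms = 1000.0 / 60
input_to_sim = refresh_ms        # input sampled after the current frame began
wait_for_vblank = refresh_ms     # finished frame still waits for the next flip
buffered_ahead = 2 * refresh_ms  # frames already queued in the swap chain
print(f"{input_to_sim + wait_for_vblank + buffered_ahead:.0f} ms")  # ~67 ms
```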

March 28, 2013 | 04:20 AM - Posted by Beany (not verified)

I agree. I have 2x 6970 and the Crossfire results here are just embarrassing for AMD. It clearly don't work well and i've noticed myself with certain games, because you don't need graphs and stuff, as often you can see it with your own eyes. It's just BAD.

I used to build hundreds of PC's with both Nvidia and AMD, and from that experience i also noticed that 1) NV's drivers are generally better and less problematic and 2) Crossfire don't work as well as SLI.

The only reason i use AMD is for image quality. Not 3D/gaming stuff but general image output on the desktop. Colours and whites look better and more intense with AMD cards on my high-end monitors (but without being saturated), and no amount of tweaking on NV hardware will get this stuff to look as good. I've tried it with multiple different monitors and AMD just looks better. NV hardware is washed out in comparison. Most people wont be able to tell the difference and you need a good IPS display to properly notice but it's always bothered me because i'm a graphics designer and image quality freak. If NV sort this out i'll switch to them instantly.

March 28, 2013 | 02:24 PM - Posted by Anonymous (not verified)

Dont know if you read the patch notes for Radeon, But they stated that they have improved crysis 3 and more performance with their latest releases AND that they are working on GREATLY increasing CROSSFIRE performance. Keep an eye open PCPER!

March 28, 2013 | 05:35 PM - Posted by Anonymous (not verified)

lol - it's just amazing isn't it. We'll have the same excuse the politicians always give us...

" "Tired of dropping frames instead of opponents? Find a CrossFire™-certified graphics configuration that’s right for you."

" We had no idea!!!! "

I want to know how many YEARS this has been going on.

These tools need to be used on 3870 Cf on up - 4850 4870 5850 5860 6850 6870 6950 6970 and so on....

So for many years we have been fed a complete freaking lie across the board, and from every "scientific unbiased tester website"....
Just wow. If there are conspiracy theorists here, you missed this one, because right now full blown conspiracy (if amd seemed it knew what it was and is doing) would be the conclusion. (instead amd marketing lies and incompetence and pure luck at the massive cover up over many years looks correct)

I also have to say, PALMFACE !

This also SHAMES EVERY REVIEW SITE OUT THERE. lol How stupid they have been - they should have spent far less time attacking nVidia looking and hunting for every tiny chink in their armor they could dream up and squared off at AMD since these COMPLAINTS about Crossfire have been present since DAY ONE.

Just think, all those reviewers, all those tests, all those years, and complete obliviousness till now... and a CONSTANT HAMMERING on people like me who tried in vain to point out the CF issues.
"it's your imagination"
"you're crazy"
"nVidia shill"
"paid operative"
"I don't see anything wrong ! "

Just W O W.

March 29, 2013 | 02:59 AM - Posted by John Doe (not verified)

You're a moron and have some real issues.

I ran that HD 2900 CF setup up to 200 HZ and haven't had a SINGLE frametime issue with it.

I was playing Cube on it at 160 HZ on a Diamondtron SB2070 and it didn't skip a SINGLE beat.

March 29, 2013 | 09:01 AM - Posted by Anonymous (not verified)

LOL - a single example, untested, swears the amd fanboy, on an obsolete hd2900 said to be the most terrible hd flagship even by amd fanboys, and one game, from the historical mind of the fanboy....

R O F L - and you call me names.

No, we need the entire lineup of amd tested for the YEARS their scam has existed.

I feel a MASSIVE VINDICATION coming.

BTW - hold onto your undies feller, AS THIS SOFTWARE FCAT WITH THE COLOR BANDS IS BEING RELEASED TO THE PUBLIC FIRST ON MSI AFTERBURNER ON MONDAY AND THEN ON EVGA PRECISION X LATER IN THE WEEK !

ROFL - The truth is coming amd fan. NOTHING will stop it now.

March 29, 2013 | 12:31 PM - Posted by John Doe (not verified)

Indeed.

I bought those 2900's because they were fucking cheap as hell because I bought them SECOND HAND and had a couple of PULSE Digital PWM's and 512-bit bus with a ton of memory bandwidth and HDCP onboard.

How many words in that text did you have to Google?

April 2, 2013 | 06:15 PM - Posted by Anonymous (not verified)

Hey I, that's ME, I loved the HD2900, but all the amd fanboys were crying, it lost miserably to the nVidia flagship.
Have a good friend who got a Sapphire hd2900 "pro" version, but it wasn't the pro, as I recall it had an unlisted weird and in between 320 shaders with the 256 bus.
He had EXTENSIVE trouble with it - I spent countless hours hammering out the bugs in every board he ever ran it in - a Dell dimension 4700 for instance kept it locked at 500mhz core and 600mhz memory.
He hammered the crap out of it for years, he broke it three times, I resurrected it three times, then finally against advice he kept the pink rubbery foam strip off the VRMs upon reassembly after cleaning - he was told NOT to but insisted it was just a cushion and not needed - that freaking cooked it for good -
I didn't hate the HD2900 but every amd fanboy did, and said so, it was a crushing blow to their egos, as nVidia stomped it on performance.

March 29, 2013 | 12:50 PM - Posted by Anonymous (not verified)

yet another fan boy, this time on the nvidia side. (paid?)
i wish you all disappeared.

March 29, 2013 | 12:52 PM - Posted by John Doe (not verified)

I've never been paid by anybody in my entire life to speak garbage about a product.

You have absolutely no idea who I am, which just goes to show how much of an ignorant idiot you are.

Have a nice and a long life.

April 3, 2013 | 08:17 PM - Posted by Anonymous (not verified)

He was replying to me, please be able to follow the threading insets.

Now you were calling whom an idiot ? Uhh, nevermind.

April 10, 2013 | 03:21 PM - Posted by John Doe (not verified)

The more you speak to yourself, the more of a retard you'll realise you are.

March 29, 2013 | 12:51 PM - Posted by Anonymous (not verified)

Well it could just as well be "lazy" game developers not bothering to optimize for AMD and just going with nVidia and then just accepting the faults that comes when using AMD and just hoping no one notices??

March 29, 2013 | 01:05 PM - Posted by John Doe (not verified)

Yes.

April 3, 2013 | 08:15 PM - Posted by Anonymous (not verified)

lol

It happens on many AMD Gaming Evolved games that have direct amd/developer input fella.

Nice try but you obviously are clueless and a total amd fan.

March 27, 2013 | 07:50 AM - Posted by JonasMX (not verified)

My only concern is that in my country there are a bunch of people who don't care about this kind of stuff. They think it is too boring and that readers don't care.

We are years behind what you are trying to demonstrate and bring to light, but definitely I'll follow you like an example.

Thanks for your hard work

March 27, 2013 | 08:12 AM - Posted by Anonymous (not verified)

Hi Ryan
I was wondering if you could also post the RUN graphs for the 680 sli. I didn't get a clear picture from the article if the 680 sli also dropped any frames, had any runt frames or totally avoided those behaviors.

March 27, 2013 | 08:25 AM - Posted by Ryan Shrout

I'll do that now!

March 28, 2013 | 05:42 PM - Posted by Anonymous (not verified)

No drops no runts for nVidia 680 SLI - NONE

LOL

That's why they were left out... like no news is good news.

I feel so good about myself, so many years, so many attacks on me by so many raging amd lovers.
I sooooo want to see them cover their heads in shame and admit how treasonous they were.
Yes treasonous liars. (lol whatever couldn't think of the word)
I suspect heck would freeze over first.

Oh look at that the river styx has ice skaters on it.

March 27, 2013 | 08:32 AM - Posted by Eriol (not verified)

Great job on the article! This is how GPUs should be benched in the future.

What I would also like to see is how the input latencies are affected by things like vsync or nvidia frame metering. Also, digging deeper into how frame rate limits can help alleviate the stutter issues would be welcome. I found that back when I was testing 6950cf that a 3rd party fps limiter was the best solution combating micro stutter.

In the end CF is just not worth it atm, regardless of vsync or fps limiters. AMD has a lot of work to do to get CF sorted out. I doubt a new driver 2-3 months down the line will be the end of it, but hopefully they're good to go when the next gen arrives.

March 27, 2013 | 04:22 PM - Posted by Ryan Shrout

Input latency testing is my next target!

March 27, 2013 | 08:47 AM - Posted by JLM (not verified)

Great work and great review/article!

I do wish you guys had included the effects of a framerate limiter in addition to vsync, though... I've actually had a better experience using a framerate limiter through Afterburner on 7970 CF than I do with vsync; it seems to almost always remove stutter completely while also preserving the scaling.

March 27, 2013 | 04:30 PM - Posted by Ryan Shrout

I'll definitely be giving that a shot!

March 27, 2013 | 09:20 AM - Posted by Anonymous (not verified)

What about older Nvidia cards?
What about Crossfire 6970s?

We played a bit of Natural Selection 2 the other day..meh..silly bunny jumpy game..imbalanced..go Marine..
Anyway..

Crossfire enabled 72 FPS
Crossfire disabled 64 FPS
What a ripoff!

Then we tried it on 2 GTX 580s same room and systems and almost the same results..

Multi GPU setups are just a ripoff unless you get Exactly DOUBLE for performance.

March 27, 2013 | 10:35 AM - Posted by John Doe (not verified)

Scaling depends on A TON of factors. GPU's, drivers, applications so on and so forth.

You need to understand the basics of these things before you throw out such pointless and silly statements.

If you want to compare scalability, first compare a pair of single cards in FurMark. Then SLi/CF them and compare again.

March 29, 2013 | 12:07 AM - Posted by Anonymous (not verified)

CF is failing in half the games across the board, and has been for some time.
TPU for instance is an example where he mentions it if you don't believe me.
That's the other big laugh, shouldn't have been recommended for a long time now.

Maybe AMD will actually do something now, I doubt it though, I suspect apologetics, misdirection, and lies.

March 27, 2013 | 11:26 AM - Posted by Noah Taylor (not verified)

What is or isn't a ripoff is completely subjective and will never be an agreed upon issue. However, you can get a great experience moving to multi gpu setups that isn't necessarily double performance. Hi resolution gaming especially, at 2560x most single cards can't quite get you to >40fps avg with everything turned up in most modern games and what we will see in the next year. If i can get 15 more fps at that resolution it really DOES make all the difference.

However, if I buy 2 graphics cards because AMD says that framerate is how i should rate their cards, and there are frames being counted that detract from the quality of the experience, and essentially bloat their test scores up to 300% over what is actually being displayed, then not only have we been deceived, but AMD has really scammed you, since the extra card is actually detracting from the experience in many ways.

March 27, 2013 | 11:34 AM - Posted by John Doe (not verified)

Moral of the story, get this before it goes out of stock.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814187196

It's a card of limited availability and is one hell of a damn good, limited custom design.

I'm eagerly waiting for mine to be shipped.

Heh.

March 29, 2013 | 12:58 PM - Posted by Anonymous (not verified)

2GB is not enough for 1080p these days.
