
Frame Rating: Visual Effects of Vsync on Gaming Animation

Author: Ryan Shrout
Publisher: PC Perspective

Not a simple answer

After publishing the Frame Rating Part 3 story, I started to see quite a bit of feedback from readers and other enthusiasts, with many requests for information about Vsync and how it might affect the results we are seeing here.  Vertical Sync is the fix for screen tearing, a common artifact seen in gaming (and other media) when the frame rendering rate doesn't match the display's refresh rate.  Enabling Vsync forces the rendering engine to display and switch frames in the buffer only in step with the vertical refresh rate of the monitor, or a divisor of it.  So a 60 Hz monitor can only display frames at 16 ms (60 FPS), 33 ms (30 FPS), 50 ms (20 FPS), and so on.
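To make that quantization concrete, here is a minimal Python sketch (my own illustration, not part of our testing tools) of how a Vsync'd swap rounds each frame's on-screen time up to the next refresh boundary of a 60 Hz display; the render times fed in are hypothetical:

```python
import math

REFRESH_MS = 1000.0 / 60.0  # one refresh cycle on a 60 Hz display (~16.7 ms)

def displayed_interval(render_ms: float) -> float:
    """With Vsync on, a frame can only be shown on a refresh boundary, so its
    on-screen interval is the render time rounded up to a whole number of
    refresh cycles (16.7 ms, 33.3 ms, 50.0 ms, ...)."""
    cycles = max(1, math.ceil(render_ms / REFRESH_MS))
    return cycles * REFRESH_MS

# Hypothetical render times (ms) and the interval the display actually holds each frame
for t in [10.0, 16.0, 20.0, 30.0, 40.0]:
    shown = displayed_interval(t)
    print(f"render {t:5.1f} ms -> shown for {shown:4.1f} ms ({1000.0 / shown:.0f} FPS instantaneous)")
```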

Many early readers hypothesized that simply enabling Vsync would fix the stutter and runt issues that Frame Rating was bringing to light.  In fact, AMD was a proponent of this fix, as many conversations we have had with the GPU giant drifted toward Vsync as an answer to their multi-GPU issues.

In our continuing research on graphics performance, part of our Frame Rating story line, I recently spent many hours playing games on different hardware configurations and at different levels of Vertical Sync.  After that testing, I am comfortable saying that I do not think simply enabling Vsync on platforms that exhibit a large number of runt frames fixes the issue.  It may prevent runts, but it does not actually produce a completely smooth animation.

To be 100% clear - the issues with Vsync and animation smoothness are not limited to AMD graphics cards or even to multi-GPU configurations.  The situations we are demonstrating here present themselves equally on AMD and NVIDIA platforms and with single or dual card configurations, as long as the other conditions are the same.  Our goal today is only to compare a typical Vsync situation from either vendor to a reference result at 60 FPS and at 30 FPS; not to compare AMD against NVIDIA!!

[Frame time graph: GTX 680 single / SLI with and without Vsync]

In our initial research with Frame Rating, I presented this graph on the page discussing Vsync.  At the time, I left this note with the image:

The single card and SLI configurations with Vsync disabled look just like they did on previous pages, but the graph for GTX 680 SLI with Vsync on is very different.  Frame times are only switching back and forth between 16 ms and 33 ms (60 and 30 instantaneous FPS) due to the restrictions of Vsync.  What might not be obvious at first is that the constant shifting back and forth between these two rates (two refresh cycles for one frame, one refresh cycle for the next) can actually cause more stuttering and animation inconsistencies than would otherwise appear.

Even though I had tested this out and could literally SEE that animation inconsistency, I didn't yet have a way to demonstrate it to our readers, but today I think we do.

The plan for today's article is going to be simple.  I am going to present a set of three videos to you that show side by side runs from different configuration options and tell you what I think we are seeing in each result.  Then on another page, I'm going to show you three more videos and see if you can pinpoint the problems on your own.


 

Battlefield 3 - 2560x1440 - Ultra Settings

Our first video comparison will look at two fixed frame rate runs of a portion of Battlefield 3, one at a consistent 60 FPS and one at a consistent 30 FPS.  The first question I'll want to address is the hardware behind these "reference" runs.  While I will tell you we used Titan cards in SLI for our recordings, the truth is that it matters very little which configuration we used to get these results, as the goal was to have so much additional performance that we never had to worry about frame rates falling below the Vsync rates.  By enabling standard Vsync we were able to capture a steady 60 FPS result, and with NVIDIA's half-refresh rate Adaptive Vsync I could capture a solid 30 FPS result.

Download the 250MB MP4 from Mega.co.nz

Reports from most users are telling us that you NEED to download these files for a solid comparison!

Battlefield 3 - 60 FPS vs 30 FPS Comparison

You should be able to tell pretty easily that the left hand side of this video is the 60 FPS version and the right hand side is the 30 FPS version.  The animation on the left is clearly smoother, though neither has any "stutter" or variance in the frame rate.  Yes, the right side won't look as good in comparison, but when viewed on its own (cover the left side with a piece of paper) it should look great both in real time and at lower speeds.

In data form, this is what this comparison looks like:

[Frame time graph: 60 FPS Vsync reference vs. 30 FPS half-refresh Adaptive Vsync reference]

The black line is nearly completely static at 16 ms frame times (only a single frame time spike to the higher 33 ms rate), resulting in a completely smooth 60 FPS animation rate on the screen.  Our orange line shows the result of the half-refresh rate Adaptive Vsync setting from NVIDIA's control panel, giving us a static 30 FPS (33 ms) animation rate, with one instance of higher / lower frame times.

 

Our second video brings in a typical graphics card configuration with standard Vsync enabled and compares it to the 60 FPS result above.  In this case the test uses a single Radeon HD 7970 GHz Edition card, but again, this could be any card, in any game, at any settings that keep frame rates under the maximum refresh rate of your display for significant amounts of time.

Download the 250MB MP4 from Mega.co.nz

Reports from most users are telling us that you NEED to download these files for a solid comparison!

Battlefield 3 - 60 FPS vs Standard Vsync Comparison

In this video, the 60 FPS result is on the left and the HD 7970 running standard Vsync is on the right hand side.  You should be able to see the difference in smoothness between these two user experiences in real time, and it becomes even more apparent when we slow the video down to 50% and 20%.

What does this look like in data form?

[Frame time graph: 60 FPS reference vs. Radeon HD 7970 with standard Vsync]

The black line is our 60 FPS static reference video while the orange line represents the standard Vsync run with the Radeon HD 7970 card.  What appears as "blocks" of orange on the graph is actually very quick, repeated variation in the instantaneous frame time between 16 ms and 33 ms.  This is due to the function of Vsync that forces a frame to be displayed only at a refresh cycle of the display.  In the first 20 seconds of the game, Battlefield 3 with these settings and this hardware is switching between 60 FPS and 30 FPS pretty regularly, and because of that you see the differences in animation smoothness above.
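To illustrate what those "blocks" amount to numerically, here is a small Python sketch (an illustration of the idea, not our actual Frame Rating tooling) that snaps a list of captured frame times to the nearest refresh multiple and reports how much of the run sat at each step; the frame times below are made up:

```python
from collections import Counter

REFRESH_MS = 1000.0 / 60.0  # refresh interval of a 60 Hz display

def summarize(frame_times_ms):
    """Snap each Vsync'd frame time to the nearest whole refresh cycle and
    report what fraction of the run landed on each step (16.7 ms, 33.3 ms, ...)."""
    buckets = Counter(max(1, round(t / REFRESH_MS)) for t in frame_times_ms)
    total = len(frame_times_ms)
    for cycles, count in sorted(buckets.items()):
        print(f"{cycles * REFRESH_MS:4.1f} ms (~{round(60 / cycles):>2} FPS): "
              f"{100 * count / total:5.1f}% of frames")

# Hypothetical capture: rapid alternation between one- and two-refresh frames,
# the pattern that draws solid-looking "blocks" on the frame time graph.
summarize([16.7, 16.7, 33.3, 16.7, 33.3, 33.3, 16.7, 16.7, 33.3, 16.7])
```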

 

What is perhaps most interesting is our final video, which compares a flat 30 FPS to the same Vsync result shown above.

Download the 250MB MP4 from Mega.co.nz

Reports from most users are telling us that you NEED to download these files for a solid comparison!

Battlefield 3 - 30 FPS vs Standard Vsync Comparison

The left hand side is the static 30 FPS result and on the right again is the Vsync run from the Radeon HD 7970 GHz Edition. Comparing the video in this case is much more interesting, as in my experience opinions are divided.  In a purely mathematical view the screen on the left should be "smoother" than the animation on the right hand side, even though on average it is running at a lower frame rate.  However, the Vsync result has variance in frame times, and thus you can see patterns in the frames that don't exist in the static 30 FPS or 60 FPS results.  It kind of halts, or appears to freeze at times, as a result of seeing frames at 16 ms, 16 ms, 16 ms, 33 ms, 16 ms, 16 ms...
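One way to put a number on that halting feeling is to look at how much the frame time changes from one frame to the next. The rough Python sketch below (made-up frame-time sequences, not our captured data) compares a flat 30 FPS run against a Vsync run that bounces between 16.7 ms and 33.3 ms: the Vsync run has a higher average frame rate but far more frame-to-frame variation.

```python
def judder(frame_times_ms):
    """Mean absolute change between consecutive frame times: 0 means perfectly
    even pacing, larger values mean more visible hitching."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas)

flat_30 = [33.3] * 10                                  # constant 30 FPS
mixed   = [16.7, 16.7, 16.7, 33.3, 16.7, 16.7, 33.3,   # Vsync bouncing between
           16.7, 33.3, 16.7]                           # 60 and 30 FPS

for name, run in (("flat 30 FPS", flat_30), ("mixed Vsync", mixed)):
    avg_fps = 1000.0 / (sum(run) / len(run))
    print(f"{name}: avg {avg_fps:.0f} FPS, judder {judder(run):.1f} ms")
```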

Maybe looking at the data will help describe the phenomenon.

[Frame time graph: 30 FPS reference vs. Radeon HD 7970 with standard Vsync]

Clearly the black line of frame times is the same as or slower than every instance of the orange line that represents the Vsync video output.  However, the black line is consistently at 30 FPS while the orange line varies between 30 FPS and 60 FPS.  Those periods of 60 FPS visuals are definitely smoother than the 30 FPS result (as we showed you in the first video on this page), but the variance in frame rates is actually more noticeable than you might have otherwise realized.

Despite all the arguing back and forth on what the limit of frame rate perception of the human eye is, there is one thing that is true without doubt - the human eye and brain can detect very subtle changes in animations pretty easily.  Looking at a five second animation at 55 FPS and then 60 FPS, you'd be hard pressed to tell which is which.  But if you see a video running at 60 FPS suddenly drop to 30 FPS, you can clearly see the effect.

Now comes the real debate - which side of the video above is better?  "Better" is a term that has many meanings and I don't have any doubts that there will be variation in the answers from our readers across the world.  I fall on the side of the more static frame rate - consistent 30 FPS performance is better than what we get in many cases with traditional Vsync.

 

Now, on the next page, we are going to present the same videos and data but without telling you which result is which.

April 16, 2013 | 03:59 PM - Posted by corhen (not verified)

We see all these comments about Vsync, but how does NVIDIA's "adaptive vsync" fare?

April 16, 2013 | 10:51 PM - Posted by svnowviwvn

I would also like to see an analysis of Nvidia's "adaptive vsync"

April 16, 2013 | 11:43 PM - Posted by Ryan Shrout

Fair enough, we just wanted to show you a vendor agnostic comparison on Vsync at first.

April 18, 2013 | 06:20 PM - Posted by Anonymous (not verified)

While you have your Request Book open what about:

Virtual VSync
http://www.lucidlogix.com/technology-virtual-v-sync.html

A quote from that page: "Virtual Vsync enables games to run at any FPS, with no image tearing, stuttering, lags and latency artifacts."

Can you tell us if it does indeed 'fix everything' -- If it does you can use it as the Base System.

Thanks for these Articles and working on this for us.

April 26, 2013 | 09:10 AM - Posted by endiZ (not verified)

nVidia's Adaptive sync vs RadeonPro Dynamic sync control comparison would be great!

http://www.radeonpro.info/features/dynamic-vsync-control/

April 16, 2013 | 04:12 PM - Posted by Martin (not verified)

Was the conclusion that with a sufficiently fast card that consistently gives frame times less than 16 ms, Vsync will always give a smoother animation at the cost of up to 16 ms extra input lag?

April 20, 2013 | 07:38 AM - Posted by Bogdan (not verified)

My sentiments exactly! Obviously, even with no direct comparison between the two giant GPU manufacturers, there's no doubt in my mind that Adaptive VSync really "brings the best of both worlds" in terms of smooth transitions between various frame times while under the 60FPS barrier. Even at constant 60 FPS I could live with extra 16ms input latency even on more demanding games in this respect (like maybe Dirt 3), provided I get state of the art image quality.
Also, bear in mind that when I turn on VSync in the Crysis 3 menu on my GTX690, it is for sure Adaptive VSync that's at work, and not standard VSync. That is with default settings in NVidia Control Panel! So no worries there on the NVidia side...
Thank you Ryan for your dedication and professionalism in bringing us all these really fine reviews! You guys are doing a great job there, and most impressive it's out of real passion.
Keep it up! Sky is the limit! :-)

April 16, 2013 | 04:37 PM - Posted by ezjohny

The left side was clearly the winner here.
Your way of testing a certain portion of the game is very educational to the public; thank you for this!

April 16, 2013 | 09:57 PM - Posted by Anonymous (not verified)

The winner of what?

He's got what seems like an SLI setup that can sustain 60 fps with vsync on, compared to a non-CrossFire setup at half the price with vsync on, to create inconsistency.

April 16, 2013 | 11:44 PM - Posted by Ryan Shrout

The goal of these videos was NOT to compare SLI vs a single card.  The goal was to compare what you can see at 60 FPS fixed against ANY Vsync situation, and 30 FPS against ANY Vsync situation.

Reading comprehension!

April 17, 2013 | 02:12 AM - Posted by Anonymous (not verified)

Thank you for editing your article to help make clear what you stated above.

Maybe edit this as well?

"What does this really mean based on our previous Frame Rating data sets, AMD's CrossFire issues and user experiences?"

It means absolutely nothing since non-crossfire doesn't have issues without Vsync.

"In fact, AMD was a proponent of this fix, as many conversations we have had with the GPU giant trailed into the direction of Vsync as answer to their multi-GPU issues. "

Reading comprehension!

April 17, 2013 | 07:23 AM - Posted by JCCIII

That was my conclusion, too. The videos were so well done that I made my choices within seconds of each. The sign “Recycling Limited Company” was irregular on the right.

And, the truth comes: am I a videophile or a poser; I sure hope I will sleep tonight!

Joseph C. Carbone III

April 16, 2013 | 04:48 PM - Posted by Paul Keeble (not verified)

It took me a maximum of 5 seconds in every case to identify those. Identifying them was immediately obvious, I did not need the 50% or the 20% slow down to be able to tell the difference. You could have done that test with considerably smaller files.

April 16, 2013 | 09:58 PM - Posted by Anonymous (not verified)

Are we able to identify what the 7970 was being compared to exactly?

April 17, 2013 | 10:18 AM - Posted by bystander (not verified)

The 7970 wasn't the comparison. The comparison was the difference between a fixed 30 FPS or fixed 60 FPS compared to v-sync on when there is variation between the two.

April 16, 2013 | 05:10 PM - Posted by Bryan Watson (not verified)

Just a tip for recording these demonstration videos in the future:

I found myself more distracted by the actual time difference between actions in the videos than by the smoothness of the animation.

You can get much better comparison videos by running a keyboard / mouse macro to replicate the exact same keypresses and mouse movements every time.

That way you would have near identical videos that you could sync easily and provide a much clearer ground for comparison.

April 16, 2013 | 07:02 PM - Posted by Ryan Shrout

Thanks for the tip but we have tried that before and macro programs don't respond to input for games consistently...

April 17, 2013 | 12:39 AM - Posted by Steve W. (not verified)

Hey Ryan,

I would like to propose adding a canned benchmark to the mix. They might not be ideal for real-world benchmarking between cards, but I think they should do fine in judging smoothness. Either in split copy or half and half or both =).

My picks for canned benchmark:

-Batman - Arkham City (this benchmark has a good amount of panning, especially that "hiccup" in the beginning when panning up the stairs, and a bunch of fast shooting objects, although done by PhysX).

-AVP (lots of up close and large moving "beings")

-Lost Planet 2 Benchmark (the one with 3rd person in-game play, and closest to what the previous poster Bryan Watson suggests)

-Catzilla (not a real game, but it uses newer capabilities (OpenGL 4.0 and DirectX 9/11)... and a faster pace as opposed to Heaven, and it's just plain cooler)

-there are probably better ones, but these are the ones I've played around with on my 680.

Other thoughts... The 60fps vs Vsync difference is very unnoticeable, but jumping to a possible next level of testing is mouse/keyboard lag comparison (you mentioned it somewhere but I can't remember) which in my opinion is icing on the cake. (maybe that could lead to debunking "gaming" keyboard and mice vs ps2,standard or even wireless! very destructive info imho)

ps. the music needs more wub wub

April 16, 2013 | 08:56 PM - Posted by Humanitarian

This was distracting me a little too. Blocking off half your monitor with a card or piece of paper makes it a lot easier.

April 16, 2013 | 06:03 PM - Posted by looniam (not verified)

if i listen to that music anymore i will promptly put a spike through my head.

i know there is a mute but, it takes away from the experience! :p

April 16, 2013 | 07:02 PM - Posted by Ryan Shrout

So, you're saying you need new background music, check.

April 16, 2013 | 08:56 PM - Posted by Humanitarian

Definitely needs more wub wub.

April 16, 2013 | 07:09 PM - Posted by Maester Aemon (not verified)

This time I'm a bit confused by your results. Everybody knows that enabling standard vsync in a game where the GPU can't sustain fps above 60 is a bad idea, because even if the fps drops to 59, everything is rendered at 30 fps until the GPU can get back to 60 or above.

The real question is, what about the case where a Crossfire configuration is capable of running a game without ever going below 60 fps, and then, when you enable vsync, the game runs at a constant 60fps with no drops to 30fps at all? Does that solve the runt frames and stuttering issue?

April 16, 2013 | 09:42 PM - Posted by Anonymous (not verified)

Showing this wouldn't be consistent with what the sponsor wants to present.

You have to think of all parties involved.

I'm curious, what was the actual Nvidia setup used in the battlefield 3 comparison? If it's something that can handle 60+ fps without issue, then it's easy to see why this comparison is silly.

April 16, 2013 | 09:53 PM - Posted by Anonymous (not verified)

Reread this...

"And of course, enabling Vsync results in a situation where the frame rate does NOT dip below 60 FPS, as we did with our reference videos used in this article, will not result in animation inconsistencies."

So why not use crossfire 7970 as well? A 7970 on its own doesn't have tearing issues with v-sync off.

April 16, 2013 | 11:47 PM - Posted by Ryan Shrout

You guys and your conspiracies never get tired, do you?  :)

To quote a comment I left above...

"The goal of these vidoes was NOT to compare SLI vs a single card.  The goal was compare what you can see at 60 FPS fixed against ANY Vsync situation and 30 FPS against ANY Vsync situation. 

Reading comprehension!"

April 18, 2013 | 02:11 AM - Posted by techno (not verified)

I know this is a comparison of vsync vs fixed frame rates....but the reason you are doing this is because of the hullabaloo your new methods have caused in relation to crossfire.

The question that was raised was "does vsync address the crossfire issue?"

This article does not address this and merely tries to slur vsync as a possible fix.

We all know vsync has the issues you have highlighted, but please answer the question.

"Does vsync in situations where frame rates are consistently above 60fps fix the crossfire problems?"

You have not answered this and have just tried to slur AMD's response without proper testing.

Ryan your blood runs GREEN.

April 20, 2013 | 09:16 AM - Posted by Bogdan (not verified)

You know what? I too think Ryan's blood runs Green, and to be perfectly honest I wouldn't mind if mine ran the same color, just as long as what's been shown comes from an editorial stance and one's dedication to show us ALL how actual technologies impact various game experiences.

Even if he hasn't clearly stated (and it is said from the beginning of this article that we're not comparing the two GPU giants), it is my understanding that any GPU configuration powerful enough to consistently run over 60FPS will have no problems delivering smooth gaming experience at 60Hz-60FPS. By all means, that's good news for me and my GTX690, and it'd be the same if I owned a pair of 7970s (JUST for me personally no thanks! - due to greater power consumption, extra space requirements, worse noise and temperature levels inside my CM690 II Advanced case).

I think what Ryan means to show us ALL (and THAT means not only me or others with an ARES II in their cases) is just what happens when you have a GTX670, GTX680, an HD7970, or other SLI/CF GPU configurations which don't always stay above the 60FPS barrier, and what your choices and tradeoffs would be with respect to VSync, image quality and gaming experience.

As for his sponsors, I think that's entirely his problem and doesn't concern us, just as long as what we read, see and hear is pertinent and objectively explained on specific topics. As for what everyone understands, that's another story...

PS: Just to be clear, not taking into consideration the high-end/highly expensive GTX690 and Titan (especially this one!), I think that hardware implemented Adaptive VSync is an obvious step forward from the Green Team, and that is especially valid for those of us who don't necessarily get to buy extreme graphics cards. I can't wait for AMD to pick up the pace and give Nvidia some rough times, but for now please don't mind if "my blood runs green" too!

Thanks again, Ryan - for your dedication and professionalism!

Peace to all! May it be that we fight only on game servers and really nice forums like PC-PER.

April 16, 2013 | 07:15 PM - Posted by AlienAndy (not verified)

Thanks again for your tireless efforts, Ryan.

Really looking forward to seeing some older cards tested.

April 16, 2013 | 08:49 PM - Posted by tbone (not verified)

Good comparison, but certain sections distracted me from the left vs right comparison, like when one scene has motion and the other doesn't (a car or motorbike)... for example, the motorcycle at the end on the left distracted me from viewing the right.

first video was obvious when at 50%

second video not so clear to me which was better even at 50%, by 20% maybe a little yes.

third video was impossible for me to tell the difference between the 2.

April 16, 2013 | 10:00 PM - Posted by Anonymous (not verified)

The difference between 7970 with vsync vs what exactly?

April 17, 2013 | 01:35 AM - Posted by PoliteMaster (not verified)

After watching the first and third videos:
In 30 FPS vs. real world V-sync--at 100% and 50% speeds--I had a hard decision to make. However, at 20% speed, the real world V-sync pulsated (at first, I described it as a heartbeat). After rewinding a couple minutes later, I could identify this quality at the higher speeds.

So after seeing the slowed-down footage, I'm on board with saying a consistent 30 FPS is better. I liked that PC Per got the real-world experience right, meaning it closely mirrors my own computer experiences. As a new builder, it is interesting to see what's gained purely from consistent frame rates (as opposed to only higher frame rates). V-sync aside, what needs to happen between software and hardware devs to overcome the "heartbeat" users have accepted as the best that can be done (well, until retroactively running a game or app way down the road)? I'd be interested in seeing that discussion.

Great work! Keep on observing.

Intel i7-3770K
2x GTX 670s
1920x1080 (only), 60Hz, Adaptive V-Sync

- I think an improvement for future videos of this kind would be to use built-in benchmarks to have identical split screens (as performance rating is not the focus).

April 17, 2013 | 02:07 AM - Posted by Panta (not verified)

When you say standard vsync vs. xFPS vsync, is the xFPS just a frame limiter vs. the in-game vsync option?

April 17, 2013 | 10:10 AM - Posted by Ryan Shrout

No frame limiters, just enabling Vsync on a REALLY powerful configuration to make sure we get a static 60/30 FPS recording.

April 17, 2013 | 06:44 PM - Posted by Anonymous (not verified)

Enabling vsync on a really powerful machine to get a static 60 fps, sounds like what everyone was saying may fix crossfire issue.

Gotta please sponsors I guess.

April 22, 2013 | 07:35 AM - Posted by Anonymous (not verified)

Well, clearly it's not fixed Vsync @ 30 FPS, otherwise it would not change between 60 and 30 fps.

And also, if your SLI/CrossFire solution can maintain above 60fps, vsync will eliminate stuttering/runt frames etc.

April 17, 2013 | 02:48 AM - Posted by Tri Wahyudianto (not verified)

Crazy!
So where did you get your professor's degree in computer science, Ryan?

with the videos everything is much clearer now
and nice background music, perfecto

April 17, 2013 | 04:06 AM - Posted by dragosmp (not verified)

Great stuff. It seems to me this article puts the many forums' collections of "known facts" on a much more scientific basis. I guess in a sense everybody knew that when experiencing screen tearing one of the things to try was Vsync - for some it worked, for some it didn't, and we guessed why; now we can say we know.

Vsync can still be a good solution. It could be even better if, for example, we had manual control over it. From this article I can conclude that the worst case scenario for Vsync would be a game whose output is between 50-70 FPS - Vsync makes the animation just jump between 33 and 16 ms; it would be nice to cap manually at 33 ms and be done with it.

Keep up the good stuff, looking forward to the adaptive Vsync testing.

April 17, 2013 | 07:55 AM - Posted by gamerk2 (not verified)

Not shocked by the results in the least. Despite the limitations of the human eye, the jump from 60FPS to 30FPS is quite noticeable, and most people would agree the constant 30FPS is "better" to watch.

April 17, 2013 | 10:27 AM - Posted by FenceMan (not verified)

Ryan,

This is all very confusing (and interesting), can you tell me in your opinion how I should setup if I had the following cards (enable / disable V-Sync or Adaptive V-Sync or what) and cannot stand frame tearing but want maximum smoothness?

GTX 690

Crossfire 7970

April 17, 2013 | 02:37 PM - Posted by bystander (not verified)

I'm pretty sure adaptive v-sync is your option, unless the limited tearing you get below your refresh rate still bothers you. At that point, v-sync plus lowering your settings is your only option. If you want to throw in hardware, getting a 120hz monitor helps, as it gives you a 25ms frame-time step between the 16.7 and 33.3ms steps you get with a 60hz monitor.
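(For reference, a quick illustrative Python snippet, not from the article, showing the frame-time steps Vsync allows at 60 Hz versus 120 Hz; the 25 ms step mentioned above is the three-refresh interval of a 120 Hz panel.)

```python
def vsync_steps(refresh_hz, max_cycles=4):
    """Frame times Vsync can present on a display: whole multiples of the refresh period."""
    period = 1000.0 / refresh_hz
    return [round(period * n, 1) for n in range(1, max_cycles + 1)]

print("60 Hz :", vsync_steps(60))    # [16.7, 33.3, 50.0, 66.7]
print("120 Hz:", vsync_steps(120))   # [8.3, 16.7, 25.0, 33.3]
```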

April 17, 2013 | 02:37 PM - Posted by |ALE| (not verified)

it seems that triple buffering is an unknown and mysterious "something" that can't be enabled to avoid the fps cutting to 30fps...

April 17, 2013 | 04:22 PM - Posted by Anonymous (not verified)

exactly. these tests are useless for anyone who forces triple buffering via D3DOverrider.

it's well known that double-buffered vsync is worthless if you cannot maintain 60fps.

sure, triple buffering adds input lag, but lowering max pre-rendered frames to 1 (on a single gpu) helps a lot.

April 17, 2013 | 05:04 PM - Posted by bystander (not verified)

Triple buffering helps prevent you from getting stuck at a solid 30 FPS when you cannot maintain 60 FPS, but when v-sync is on and you are getting 45 FPS, you still get displayed times of 16.7 ms and 33.3 ms between frames. Since you cannot update more than one frame per refresh, the system will alternate between waiting one refresh and two in order to maintain 45 FPS.

Triple buffering just allows your GPU to continue rendering a new frame, on another buffer, while the next frame is waiting for the frame buffer to be writable.
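(A toy Python simulation of the behavior described above, using assumed numbers rather than measured data: with double buffering the GPU waits for each flip before starting the next frame, so a 22 ms render time gets pinned to a constant 30 FPS, while triple buffering lets rendering continue into a spare buffer and the display alternates between one- and two-refresh frames for an average in the mid-40s.)

```python
import math

REFRESH_MS = 1000.0 / 60.0  # 60 Hz refresh interval

def simulate(render_ms, frames, triple_buffered):
    """Return the on-screen interval of each frame with Vsync on, assuming
    every frame takes `render_ms` to draw. Simplified: a spare buffer is
    always available when triple buffering is enabled."""
    intervals = []
    gpu_free_at = 0.0        # when the GPU can start rendering the next frame
    next_flip = REFRESH_MS   # earliest refresh boundary for the next flip
    for _ in range(frames):
        done = gpu_free_at + render_ms                     # frame finished rendering
        flip = max(next_flip, math.ceil(done / REFRESH_MS) * REFRESH_MS)
        intervals.append(flip - (next_flip - REFRESH_MS))  # time since the previous flip
        # Triple buffering: start the next frame immediately in a spare buffer;
        # double buffering: wait for the flip before rendering again.
        gpu_free_at = done if triple_buffered else flip
        next_flip = flip + REFRESH_MS
    return intervals

for mode, tb in (("double buffered", False), ("triple buffered", True)):
    shown = simulate(render_ms=22.0, frames=8, triple_buffered=tb)
    avg_fps = 1000.0 * len(shown) / sum(shown)
    print(f"{mode}: {[round(t, 1) for t in shown]} -> avg {avg_fps:.0f} FPS")
```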

April 17, 2013 | 06:10 PM - Posted by SiberX (not verified)

I'm looking forward to seeing more about using frame rating tools to empirically calculate input latency; I believe this deserves as much attention as frame delivery smoothness as it can be equally detrimental to immersive gameplay to have slow, "soupy" controls simply because a developer can't be arsed to minimize the 3-5 frames of delay their poorly coded engine introduces!

April 17, 2013 | 06:13 PM - Posted by Luciano (not verified)

Ryan, one very good input lag test would be simracing.
There is a Brazilian guy, an iRacing world champion, called Hugo Luis, who I'm sure would be very pleased to tell you how vsync types and triple buffering affect input lag.
Or Greg Huttu, another champion.

When nVidia tried to market 3D Vision Surround years ago with the 400 series, simracers immediately detected that "something was wrong":

http://www.youtube.com/watch?v=2vls-71ofkI

April 18, 2013 | 05:10 AM - Posted by JCCIII

Dear Mr. Shrout,

Unnatural changes in motion are the concern; therefore, while examining "Battlefield 3 - 30 FPS vs Standard Vsync Comparison," the right side of the screen, with standard vertical synchronization, was smooth, while the left side of the screen, at a static 30 frames per second via Adaptive Vsync, was less smooth.

This is empirically evident while paying attention to the wall between 2 min. 5 seconds and 2 min. 11 seconds or while paying attention to the yellow dumpster between 3 min. 15 seconds and 3 min. 21 seconds; jumpy motion is easily seen at 20% speed on the left side of the screen, with the right side of the screen demonstrating much smoother motion.

With “Sleeping Dogs - 30 FPS vs Standard Vsync Comparison,” the left side of the screen is smoother. Although both sides are irregular, the right side is more so, with sharp regular jumps, attributed to the Adaptive Vsync, being much more disturbing.

Please, examine this from 1 min. 50 seconds through 1 min. 54 seconds at 20%. Your thoughts about my conclusion would be appreciated.

I am thankful for your time and for all of this work.

Sincerely,
Joseph C. Carbone III; 17 April 2013

April 18, 2013 | 02:02 AM - Posted by myloginisbroken (not verified)

I could tell within a few seconds for the first two videos (and then double-checked at 20%, just to be safe), but for the third video I am completely unable to tell which one is which. I can almost-sorta-kinda see it at 50%, and it is obvious at 20%.

Anyone else?

April 18, 2013 | 02:26 AM - Posted by techno (not verified)

My fix for all these issues....get a 120Hz monitor....set graphical settings high enough so you don't go above the refresh rate and experience tearing, even down clock the gpu if the game isn't demanding enough to hold you under 120hz....so you can enjoy a nice smooth 60-80fps without screen tearing, additional input lag or frame rate switching issues....job done.

April 18, 2013 | 02:57 AM - Posted by Anonymous (not verified)

Unless you are running crossfire. That is where all of this started.

April 18, 2013 | 03:13 AM - Posted by techno (not verified)

^^^...lol

April 18, 2013 | 04:24 AM - Posted by Sam Maghsoodloo (not verified)

Ryan, thanks for doing this work.
I'm not sure you will believe me, but when I have vsync disabled, I can tell with my eyes when my games are hitting exactly 60fps, plus or minus 3 or 4. I can do it every time, all the time, and I use Fraps to check. The motion "clicks" into smoothness, often for only a second or two, but long enough for me to glance at my Fraps readout and see that the number is between 57-63 every time. Is this because the tear lines are showing up at the very top or bottom of the screen (where I'm not looking)?

April 18, 2013 | 07:29 AM - Posted by Goldmember (not verified)

Please remove the 50% and 20% from the blind test. The whole point is to see if you can see a difference between 60 and 30 FPS, so slowing it down changes the scenario completely. It just makes the file size larger for no benefit. The constant 60 FPS was best, the 30 FPS was worst and the v-sync was in between. But who runs v-sync if you cannot maintain >60 FPS anyway?

April 18, 2013 | 09:13 AM - Posted by Anonymous (not verified)

Just fuck off pcper and end the witch hunt against AMD. You don't even know how to use a fucking computer. You can create frame latency issues in BF3 with Nvidia or AMD cards by not knowing how to use Vsync, or, if you know what you are doing, reduce frame latency issues with it. If you fucking morons don't know how, just stop writing these articles, seriously. The AMD witch hunt is bullshit and needs to end.

I can assure you, your frame rating results for BF3 are wrong and you don't know how to use Vsync.

April 18, 2013 | 09:22 AM - Posted by AlienAndy (not verified)

Stupid troll will troll.

April 18, 2013 | 02:27 PM - Posted by FenceMan (not verified)

I still don't get the point of all of this?

I have a GTX 690, how do I set it up? What is the "proper" pcper method of setting this up so it runs the way you want / expect it to?

Same goes for 7970CFX...

I am not trying to be weird, just saying, you are puking out all of this info, please give me something I can use, tell me how to best setup my rig.

April 18, 2013 | 11:10 PM - Posted by bystander (not verified)

This was a subjective test. It is to show you what different refresh rates look like with v-sync on. Do you notice stuttering when you are not maintaining a constant FPS? Everyone would likely agree that solid 60 FPS with v-sync on looks the best, but do you find solid 30 FPS with v-sync to be better than 40-50 FPS with v-sync? Some people do, some people don't, but it does show the weakness of v-sync regardless.

April 20, 2013 | 03:17 AM - Posted by Martin Trautvetter

So, I think something's wrong with the BF3 videos, or at least BF3-60v30.mp4 that I downloaded.

The phenomenon I'm referring to is easily seen between 2:03 and 2:10 of said video, when looking at the tile around the door that you're turning into on the left side and the rug on the floor in the kitchen.

On the left side of the video, from the 60 FPS test, both the tile and the rug stay crystal clear with every frame. On the right side, the 30 FPS part, there clearly is some form of motion blur applied. The tile, the rug, in fact everything but the gun is blurred while the player is turning left.

Just pause the video at 2:08~2:09 and have a direct comparison - clear tile and textures on the left, motion-blur on the right:

http://imgur.com/2jaftwP

So, was there a mix-up with different BF3 settings, or is this an artifact of the video pipeline? And is that video still a valid basis for comparison?

April 20, 2013 | 06:54 PM - Posted by Anonymous (not verified)

The download site Mega has changed its terms of service. Two days ago, I was able to download the Battlefield 3 comparison video files. Today, the download site is claiming this:

"Please update your browser. Warning: You are using an outdated browser that is not supported by MEGA. Please update your browser and make sure that you keep the default settings."

This is completely untrue since I have done nothing in the last two days in regards to my browser which is IE 10. Mega only gives me the option of downloading and installing Google chrome. I will not be forced to install that browser just to see the Sleeping Dogs comparison videos. Please inform Mega that this is completely unacceptable behavior to viewers of PC Perspective. Thank you.

April 22, 2013 | 12:19 AM - Posted by katamari (not verified)

60 vs 30: 60 wins.
30 vs vsync: 30 wins.
60 vs vsync: tough call, but 60 wins BARELY. I can only tell the difference when you turned the camera at 20%.

April 23, 2013 | 09:08 AM - Posted by loc

Good article, but what I miss is a test of a game with v-sync enabled and triple buffering built into the game. World of Warcraft is a good example of this: it runs smoothly without tearing even with v-sync enabled, thanks to triple buffering.

April 23, 2013 | 11:46 AM - Posted by bystander (not verified)

Most games do not have a way to force triple buffering, but it is a pretty safe bet that if a game runs in the 40-50 FPS range with v-sync on, triple buffering is being used.

Triple buffering does not fix the problem v-sync causes, in which some frames take 16.7ms to display, and others take 33.3ms of time to display. It can't. It is not possible. Triple buffering makes it possible for frames to be rendered while the current frame is waiting to be sent to the frame buffer. Without triple buffering, you end up forcing the GPU to wait until the previous frame is displayed, before it can create a new frame, resulting in a constant 30 FPS, even if 50 is possible.

April 23, 2013 | 11:49 AM - Posted by bystander (not verified)

The comparison you want is the constant 30 FPS example, compared to normal v-sync on the 7970. The constant 30 FPS with v-sync is like not having triple buffering, and the normal v-sync on the 7970 is like having triple buffering, because it does.

June 9, 2013 | 11:20 AM - Posted by Anonymous (not verified)

Assuming it's 33ms and not 33.33ms, isn't the best solution to this simply to cap at a multiple of 16.5ms? That's what I do and it feels smooth as butter.

16.5x4 - cap at 66
8.25x16 - cap at 132

Which obviously gives us a clean multiple for each step. If you wanted to play at 30fps you'd cap at 33 - to play at double that frame rate, you'd cap at 66 - on a 120hz monitor you'd cap at 132 - and so on.

The game engine would be sending you frame-per-frame. It seems to improve hit-detection and all that other crap too.
