
Frame Rating Part 2: Finding and Defining Stutter

Author: Ryan Shrout
Publisher: PC Perspective

Another update

In our previous article and video, I introduced you to our upcoming testing methodology for evaluating graphics cards based not only on frame rates but also on frame smoothness and the efficiency of those frame rates.  I showed off some of the new hardware we are using for this process and detailed how direct capture of graphics card output allows us to find interesting frame and animation anomalies using some Photoshop still frames.


Today we are taking that a step further and looking at a couple of captured videos that demonstrate a "stutter" and walking you through, frame by frame, how we can detect, visualize and even start to measure them.


This video takes a couple of examples of stutter in games, DiRT 3 and Dishonored to be exact, and shows what they look like in real time, at 25% speed and then finally in a much more detailed frame-by-frame analysis.

 


 

Obviously these are just a couple of instances of stutter, and there are often less apparent in-game stutters that are even harder to see in video playback.  Not to worry - this capture method is capable of seeing those issues too, and we plan on diving into the "micro" level shortly.

We aren't going to start talking about whose card and which driver is being used yet, and I know that there are still a lot of questions to be answered on this topic.  You will be hearing more from us quite soon, and I thank you all for your comments, critiques and support.

Let me know below what you thought of this video and any questions that you might have. 

 

January 16, 2013 | 06:41 PM - Posted by Ghost (not verified)

I'm glad to see you guys giving this issue some coverage, Ryan. Keep up the great work!

March 28, 2013 | 02:58 PM - Posted by Anonymous (not verified)

Yeah, total thumbs up on the video, and on taking the time to point out the frames, moving back and forth, and showing the sliver frame (the second/last time you made it CLEAR, so marking that sliver frame helped).

Very good: the video was not too long (which is good), with a good explanation and walkthrough, and the green penning was very necessary.

Thank you it was a good teaching video.

January 16, 2013 | 06:45 PM - Posted by David (not verified)

So, I'm guessing you're developing automated tools that generate the quantitative benchmarks, but analysis like in the video above is for qualitative evaluation?

January 17, 2013 | 11:47 AM - Posted by Ryan Shrout

That is correct!

January 16, 2013 | 09:27 PM - Posted by Nilbog

I am so glad you are testing this way.
That still with the 5 frames alone shows why this is a superior method.

What a waste of power.

Will you have some way of pointing this kind of thing out for regular benchmarks?
Would it be possible to note how many times the game stutters, and for how long? Maybe how many times scenes are rendered with unnecessary frames?
Or would that be a pain?
I really want to know how many useless frames my computer is wasting cycles on that don't enhance my experience.

January 17, 2013 | 01:53 AM - Posted by Anonymous (not verified)

An easy way to show this is to make Fraps record all the individual frame times, not just the fps each second. That way you get a visual picture of the drops, like in HardOCP's reviews. The green bar is the 60 fps target and the red bar is the least-acceptable 30 fps line. If we see any major drops like the one between frames 281 and 301, then you have stuttering. Now, HardOCP isn't recording all the individual frames here, but it's fortunately very easy to enable in Fraps and use.

http://www.hardocp.com/image.html?image=MTM1NzQ5MTQwNlhWUlh0a2ZVRnhfNV81...

This gives us a graph like this one.

http://www.digital-daily.com/video/vga_testing_2007/index3.htm

Another site with focus on this...

http://techreport.com/review/24051/geforce-versus-radeon-captured-on-hig...

http://techreport.com/review/24022/does-the-radeon-hd-7950-stumble-in-wi...
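As a rough sketch of the per-frame-time approach described above (the thresholds match the 60 fps green and 30 fps red lines mentioned; the data here is synthetic, standing in for a real Fraps frametimes log):

```python
# Sketch: flag stutter frames in a list of per-frame render times (ms),
# using the 60 fps (16.7 ms) and 30 fps (33.3 ms) lines described above.

TARGET_MS = 1000 / 60   # green line: 60 fps target
LIMIT_MS = 1000 / 30    # red line: least-acceptable 30 fps

def find_stutters(frame_times_ms, limit_ms=LIMIT_MS):
    """Return (frame_index, frame_time_ms) pairs that blew past the limit."""
    return [(i, t) for i, t in enumerate(frame_times_ms) if t > limit_ms]

# Synthetic example: a mostly smooth run with one spike, like the drop
# between frames 281 and 301 described in the comment above.
times = [16.7] * 10 + [70.0] + [16.7] * 10
print(find_stutters(times))   # the single spike at index 10
```

A per-second fps average would smooth that 70 ms spike away, which is exactly why recording individual frame times matters.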

January 17, 2013 | 02:32 AM - Posted by Anonymous (not verified)

And this new article

http://techreport.com/review/24218/a-driver-update-to-reduce-radeon-fram...

January 17, 2013 | 09:22 AM - Posted by YTech2 (not verified)

I do not fully agree with Anonymous.

These graphs and results don't identify where and when the stutters occur. There could be 80 frames in a period, which would sound great; however, they could be displayed in a stuttered pattern.
The reverse reasoning applies to dropped frames.

January 17, 2013 | 11:48 AM - Posted by Ryan Shrout

Nilbog, yes we are working on things to do exactly as you are asking: how many stutters, how BAD the stutters are, etc.  Runts = frames that are wasted and shouldn't be used as part of a frame rate measurement.  

Soon!

January 17, 2013 | 02:37 PM - Posted by Nilbog

Wow, that is going to be a lot of data. This method is genius!
It's sounding like PC Per is going to have the most in-depth GPU performance info. I can't wait to read these. Thanks guys!

Are you saying that you are going to try and note how long the stutters are, like in milliseconds?

Will Runts be pointed out separately or will the FPS have an asterisk?

March 28, 2013 | 03:04 PM - Posted by Anonymous (not verified)

Well, so much for the long-standing argument that the human eye can't see more than 30 FPS... lol

What we have here is a fake "5 FPS" in one screen, cut to mega sliver jags, so the eyeball equivalent is 6 FPS (on the worst stutter frame shown).

So it certainly appears people claiming 30-60 FPS is a very noticeable difference for them have a lot of facts to back themselves up with here. Of course, it would mean they don't have "supervision" or a "superfastbrain", just some lousy vidcard output... LOL

Yes, exactly where things should wind up, a reasonable explanation... yet so funny...

January 17, 2013 | 05:55 AM - Posted by orvtrebor

The scientific breakdown/explanation is awesome, but what is the solution to the micro stutter issue?

Personally it doesn't really bother me, I don't even notice it most of the time.

Having micro stutter identified in future card reviews (based on different games) will be awesome for those who are really bothered by the issue though, so keep up the awesome work :)

January 17, 2013 | 11:49 AM - Posted by Ryan Shrout

The solution will be a combination of better / smarter game engines and better / smarter graphics cards / drivers.

That's a lot of /.

January 17, 2013 | 08:40 AM - Posted by YTech2 (not verified)

Great video!
I love these type of explanatory and comparison research work!

Regarding the frame stutter you note, I wonder if it is caused by how the 3D engine generates all objects based on the camera view's perspective.
What I mean is, this project may be tapping into the relationship between the game engine, the GPU and the drivers. Generally, to resolve this, the whole math may need to be rethought. It reminds me of how 3D image rendering works.

I do agree that fewer of these cuts makes the experience more enjoyable and closer to realism.

Keep it up :)

January 17, 2013 | 10:48 AM - Posted by Anonymous (not verified)

Do you have data on how long each frame takes to render? The previous anonymous commenter's link to the FPS histogram was interesting, but it might be more revealing to see a histogram of frame rendering times over the course of the benchmark rather than averaged. I think stutter would show up as "tailouts" toward long rendering times next to an otherwise tight grouping of times.
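A minimal sketch of what this commenter is suggesting (summary statistics rather than a drawn histogram, with hypothetical data; a tight median next to a much larger 99th percentile and worst frame is the "tailout" signature):

```python
# Sketch: summarize per-frame render times (ms) so a long tail stands out
# next to an otherwise tight grouping of times.

def percentile(sorted_times, p):
    """Nearest-rank percentile of an ascending-sorted list."""
    k = max(0, min(len(sorted_times) - 1,
                   round(p / 100 * len(sorted_times)) - 1))
    return sorted_times[k]

def tail_summary(frame_times_ms):
    s = sorted(frame_times_ms)
    return {
        "median_ms": percentile(s, 50),   # the tight grouping
        "p99_ms": percentile(s, 99),      # start of the tail
        "worst_ms": s[-1],                # the worst single frame
    }

# Hypothetical run: 98 smooth frames plus two tail frames.
times = [16.7] * 98 + [45.0, 80.0]
print(tail_summary(times))
```

With the synthetic data above, the median stays at 16.7 ms while the 99th percentile and worst frame jump to 45 ms and 80 ms, which a per-second average would hide.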

January 17, 2013 | 11:50 AM - Posted by Ryan Shrout

Yes, we can tell that kind of information with the tools we have here and as we continue to cover this method we'll be sharing that data.

January 17, 2013 | 01:04 PM - Posted by SetiroN (not verified)

Single stutters aren't that big of a deal, really. They happen, the causes can be multiple (and usually I/O related), and they're annoying, but they don't excessively alter the experience as long as they're not frequent.
The matter at hand, in my opinion, is smooth frame delivery at a micro level: framerates that don't equate to the same level of animation smoothness, the very thing that has always been so apparent with multi-GPU setups and that, to a lesser extent, also exists on single cards. That's something that can substantially alter the experience and that needs to be focused on.
I'll never be thankful enough to Scott Wasson at Tech Report for finally putting a magnifying glass on this long-standing issue.

January 17, 2013 | 01:10 PM - Posted by SetiroN (not verified)

That's to say:
Ryan, you have equipment that allows you to delve deeper into it and with the scientific objectivity that should convince everybody;
you already mentioned that you will take it to the micro level, I just hope that that's where you will focus your work instead of on normal stutters, which are already clearly visible to everyone without slow motion videos.

January 17, 2013 | 01:34 PM - Posted by Ryan Shrout

Yup, we will indeed do that!  For this video and step in the teaching process I thought it relevant to show a single stutter before diving into LOTS of them.

January 17, 2013 | 05:08 PM - Posted by Jonathan (not verified)

Ryan, based on the data you've gathered so far, is there any reason for people who bought AMD cards to think they should have gone nVidia (or vice versa)?

/nervous 7870 owner.

January 19, 2013 | 07:09 AM - Posted by Ryan Shrout

Not yet.

January 17, 2013 | 06:04 PM - Posted by Jose B (not verified)

So I guess micro stutter and tearing are things that come with being on PC. When I was strictly a console gamer I was not very aware of these issues. Now that I am strictly a PC gamer I have become more sensitive to them.

January 18, 2013 | 12:57 PM - Posted by YTech2 (not verified)

I have noticed stutter and tearing on console systems as well. Yes, even on the Nintendo systems, although the low resolution makes it less noticeable or negligible. At the time, there wasn't anything better.

However, I believe no one complained about them because you could not apply upgrades the way you can with a PC.

This is another reason why I prefer PC.
You can tune it without a corporate organization telling you otherwise.

January 19, 2013 | 07:20 AM - Posted by Ryan Shrout

It can be eliminated more easily on consoles because of the fixed hardware that a software developer targets.


January 17, 2013 | 08:27 PM - Posted by Anonymous (not verified)

Micro stutter is especially a problem with CF; I wonder if you are going to do some work on that too?

Also, heavily loaded Eyefinity could be interesting to test. Of course you cannot capture all the screens, but you can still capture one monitor, and as I see it that would be enough for testing.

I also had tons of problems in the beginning, and was on the verge of selling my second 5870, when I read on the WSGF about a guy who got a 3rd card, and it reduced the problem for him to within tolerable limits.

I then got a 3rd 2GB Matrix, thinking that if it did not fix my problem I would just send it back and sell my second card. But yes, for me the problem was also reduced significantly. Then, 6 months ago, I got a cheap 4th 5870 Matrix to bridge me over until there are 8970s or 780s with more than 4GB of memory (as I want to use a 6000x1920 setup).

And the 4th card reduced the micro stutters even a little more; the general theory was that every card gets more time to prepare for the next frame.

Now I hear that the driver for GCN 7xx0 cards is badly optimized to prevent micro stutters, compared to my VLIW 5870 cards, but new optimized anti-micro-stutter drivers are in the making.
(So I am actually glad I waited an extra generation ;-)

Now I am wondering, when you have your testbed ready for prime time, if you are going to do some single, CF/SLI, Tri-CF/SLI and Quad-CF/SLI testing?

And I think it actually would be interesting to see 5870 vs 6970 vs 7970 vs 480 vs 580 vs 680, up to quad setups, to see if different generations act differently, though it's probably a bit much to test.

Also, what I think would make reading the frame-time graph better is to arrange the frame-time data in a graph that does not go up and down, but starts with the longest frame time, then the second longest, and so on.

Also, scale the frame times in the graph, so that a 150ms frame is as long on the graph as ten 15ms frames.

So in the example image/graph below, graphs A and B show the same data, only A displays the data as it occurs, whereas in B the data is ordered neatly from long to short.

http://tweakers.net/ext/f/WN64NIkghTjI5saCrCPzdIy1/full.png

And it is probably easier to read for most readers.
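The reordering this commenter proposes for graph B is simply the frame-time data sorted longest-first, which reads like a percentile curve. A sketch with hypothetical frame times:

```python
# Sketch: graph A plots frame times (ms) in capture order; graph B is the
# same data sorted longest-first, so the worst frames lead the curve.

def ordered_for_graph_b(frame_times_ms):
    """Return the frame times reordered from longest to shortest."""
    return sorted(frame_times_ms, reverse=True)

# Hypothetical capture-order data (graph A), including one 150 ms stutter.
times_a = [15.0, 16.0, 150.0, 15.0, 16.0, 40.0]
times_b = ordered_for_graph_b(times_a)
print(times_b)   # [150.0, 40.0, 16.0, 16.0, 15.0, 15.0]
```

This is essentially how frame-time percentile plots are built: the sorted curve makes the size of the tail obvious, at the cost of losing when in the run the stutters occurred.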

January 19, 2013 | 07:21 AM - Posted by Ryan Shrout

Yeah, multi-GPU testing is in the books as well.  Also, Eyefinity / Surround should be testable with this method by capturing a single screen, yes.

January 18, 2013 | 02:12 PM - Posted by Lucian (not verified)

There is one game genre whose developers have been dealing with this matter for a long time (due to the nature of the genre's gameplay): racing sims.
For stutter, the game simply does not display everything it can "spit out". But this causes input lag.
For input lag, one common tweak is to decouple (up to a point) the video from the controller input and from the physics simulation.

rFactor (2005) config:

Render Once Per VSync="0" // Attempts to render once per vsync; 1 = use timer, no wait; 2 = use vblank, no wait; 3 = use vblank, wait
Max Framerate="100.00000" // 0 to disable (note: positive numbers only, we always use the 'alternate' method now)
Steady Framerate Thresh="0.00000" // Allowed threshold in seconds to try to 'catch up' when falling behind using Max Framerate (use 0 for original behavior). This helps steady the framerate but may introduce more latency.
Flush Previous Frame="0" // Make sure command queue from previous frame is finished (may help prevent stuttering)
Synchronize Frame="0" // Extrapolate graphics using estimated render time in attempt to more accurately synchronize physics with graphics, 0.0 (off) - 1.0 (full)
Delay Video Swap="0" // Whether to delay video swap if card is busy - this should only be used if framerate clearly improves - otherwise it is only delaying response time

January 19, 2013 | 02:09 AM - Posted by theone2030

Great stuff!! Nobody has ever really paid attention to this!!! Keep it up :)

January 19, 2013 | 04:56 AM - Posted by Erich (not verified)

This highlights one of the great advantages of consoles, in that developers can iron out these problems with absolute certainty prior to the game shipping.

I am a long-time PC gamer, but recently I started playing more PS3 titles simply because the overall experience (as opposed to the gee-whiz tech specs) was superior to that offered by my PC.

I have a high-powered rig: 4.8GHz i7-2600K, GTX 670 OC'ed, SSD, etc. And it all looks good on paper. But when you start playing a lot of these PC games, they're really glitchy, and I find myself yearning for the ability to just play the game without worrying so much. Throw in all the poorly-optimized console ports with blurry textures and we've got a real problem here.
