
Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing

Visualizing Runts and Lower Frame Rates – Video Examples

I know that with the publication of this article finalizing our testing methods for Frame Rating, we are going to get lots of comments from gamers who have these configurations and say they don't see the issues we are discussing and penalizing AMD for.  Whether they are telling the truth or there is a bit of self-preservation / fan-boyism at work, we wanted to do our best to provide examples of these phenomena, so we made a handful of videos to share with our readers.

Keep in mind that these videos use content captured through our Frame Rating system from Radeon HD 7950s in CrossFire and GeForce GTX 660 Ti cards in SLI – but the effect on visuals is identical to what we see with the HD 7970s and GTX 680s.  We used recordings from our 1920x1080 testing.  Also, because we are doing manual run-throughs of all of these games now, no two runs are exactly alike; there will be differences in the overall gameplay sequences, but we have done our best to make sure they are as close as possible.

If you would prefer to download this video file, you can do so here.

This is the longest of our example videos and it shows playback at 50% speed and 30% speed.  Keep an eye on the smoothness of the animations on the screen, both while walking straight ahead and while turning and rotating.  Because we have Vsync disabled on these systems you will see some horizontal tearing on BOTH configurations, and that is fully expected.  In order to show the examples as exactly as possible we had to leave that feature turned off, but you should try to ignore those large “tears” and focus on the “micro” stutter on the grass on the ground, etc.

Did you see it?  We debated making this video a blind A/B test but instead decided it was better to just be up front with the results. 

If you would prefer to download this video file, you can do so here.

This video captured from Sleeping Dogs shows the same kind of animation smoothness advantage that NVIDIA SLI has over the Radeon HD 7950s in CrossFire.  This animation difference is not due strictly to stutter but to the fact that the AMD configuration is producing every other frame as a runt, thereby cutting the number of frames in each time period in half compared to the 660 Tis in SLI. 
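To make the arithmetic behind that halving concrete, here is a minimal Python sketch.  It is purely illustrative and not our analysis code: the real Frame Rating tools classify runts by on-screen scanline height, while this example stands in a simple frame-time cutoff, and all of the numbers below are assumed.

```python
# Alternating full frames (8 ms) and runts (0.5 ms), as an idealized model
# of the every-other-frame-is-a-runt CrossFire pattern described above.
frame_times_ms = [8.0, 0.5] * 30

# A software FPS counter counts every frame, runt or not.
reported_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# Discard runts (assumed 2 ms cutoff for this illustration) before counting.
RUNT_THRESHOLD_MS = 2.0
full_frames = [t for t in frame_times_ms if t >= RUNT_THRESHOLD_MS]
observed_fps = 1000.0 * len(full_frames) / sum(frame_times_ms)

print(f"reported: {reported_fps:.0f} FPS, observed: {observed_fps:.0f} FPS")
# reported: 235 FPS, observed: 118 FPS -- half the frames add no animation
```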

If you would prefer to download this video file, you can do so here.

Our next video is of Far Cry 3, and while I admit that neither experience is as smooth as we would like, the HD 7950s in CrossFire are showing a lot more stuttering than the 660 Ti cards.

If you would prefer to download this video file, you can do so here.

The visual animation issues are a bit harder to see in our Battlefield 3 video due to some darker than normal scenes, but if you watch the sprint section down the alleyway closely you will clearly see the slower animation refresh rate of the AMD CrossFire system.

If you would prefer to download this video file, you can do so here.

Our capture of Crysis 3 is another case where neither configuration is running as smoothly as we should expect, but it is very apparent that the AMD HD 7950s are running at a much lower observed frame rate than the 660 Ti SLI solution. 

If you would prefer to download this video file, you can do so here.

DiRT 3 is a bit different - this one compares performance and smoothness on SINGLE cards, showing you that without multi-GPU issues in the mix, both vendors can provide high quality real-world experiences.  Both sides of the video look equally smooth and appear to be providing a solid gameplay experience, which more or less backs up the results we found in our 1920x1080 data in today’s article.   

 

Even though we are showing videos slowed down to 50% and even 30% of their full frame rate, these animation differences are very apparent to me in real time, while playing the games.  Without a doubt there will be some variability in annoyance thresholds from gamer to gamer, but even if you can’t see the difference when looking at ONLY your own video, you should be able to see the differences when looking at the two options side by side. 

One of the advantages of our capture system is that we keep literally every video we record, though obviously in a compressed format.  (Keeping the raw 25GB files for every 60 second benchmark run would be hilariously impractical.)  This allows us to verify the data analysis results from the extractor and Perl scripts against the original recorded video to see if the animation stutters, variances, runts, etc. are all there. 
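As a rough illustration of the kind of cross-check this enables, the sketch below mimics how an overlay-based analysis can recover per-frame scanline heights from a single captured frame.  This is a Python approximation of the idea only, not our extractor or Perl code, and the 21-scanline runt cutoff, color names, and data are assumptions for the example.

```python
from itertools import groupby

def frame_slices(scanline_colors):
    """Collapse consecutive scanlines sharing an overlay color into
    (color, height) runs -- each run is one rendered frame's slice."""
    return [(color, sum(1 for _ in run))
            for color, run in groupby(scanline_colors)]

def classify(scanline_colors, runt_cutoff=21):
    """Label each frame slice as a runt or a full frame by its height."""
    return [(color, height, "runt" if height < runt_cutoff else "full")
            for color, height in frame_slices(scanline_colors)]

# Example: a 1080-line capture where the middle frame only got 10 scanlines.
capture = ["lime"] * 500 + ["red"] * 10 + ["blue"] * 570
for color, height, kind in classify(capture):
    print(f"{color:>5}: {height:4d} scanlines -> {kind}")
```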

I realize we need more videos and demonstrations of these animation issues, and we are working on producing more that run at various playback speeds, including 100%.  If you have any ideas for how to use standard video editing tools to create comparisons that would help our readers understand the experience we are having with this hardware, please email me or let us know in the comments below!

March 30, 2013 | 05:47 PM - Posted by bystander (not verified)

I have a hard time trying to grasp exactly how erratic input would affect the results. I have a feeling, based on my constitution (I get simulator sickness with poor latency), that the best case is whichever setup has the lowest worst-case interval times.

March 30, 2013 | 10:55 PM - Posted by bystander (not verified)

...But then you have occasional latency increases. Of course those increases are there to remove redundant frames, and once increased, they probably don't need much adjustment most of the time.

This whole topic always gets me going back and forth, but my instinct is that overall, even if latency is considered, even spacing matters more, as it adds more useful points of input, assuming it only adds marginal/occasional increases in latency.

March 28, 2013 | 05:20 PM - Posted by Bob Jones (not verified)

Can you address the visual quality differences between the two cards? Specifically in Sleeping Dogs, the 660 Ti seems to be missing some lighting sources outside - most noticeably the cafe/shop lights before you go down the stairs, and then the store across the street right at the end of the video.

March 29, 2013 | 07:46 AM - Posted by Ryan Shrout

Time of day in the game when those videos were recorded, maybe?  They should be basically identical. I'll check.

March 29, 2013 | 12:10 AM - Posted by I don't have a name. (not verified)

Fascinating article. I think it'll take a few reads to fully comprehend everything that is being said. Thank you indeed, I found it fascinating. Certainly, as a 7970 owner, I'll be holding off on a potential second 7970 purchase for the time being.

March 29, 2013 | 02:09 AM - Posted by rezes (not verified)

The latest drivers are the GeForce 314.22 beta and the Radeon 13.3 beta 3. Please use these drivers in your tests.

The AMD cards may be more stable on these drivers!

March 29, 2013 | 07:46 AM - Posted by Ryan Shrout

Nothing has changed on the issues in question.

March 29, 2013 | 02:15 AM - Posted by technogiant (not verified)

Thanks for the great article and all the work you've done guys.

I run 2x GTX 460's in SLI, and while I dislike screen tearing, I've noticed that options such as vsync, adaptive vsync and frame rate limiters actually make the experience less smooth, as appears to have been highlighted in this article.

I've considered getting a 120Hz monitor just so I can run without any of those options at a decent frame rate, using sufficiently high settings that the frame rate doesn't go above 120Hz and incur screen tearing.

Thinking further, I'd like Nvidia to develop a variation of their GPU Boost technology that would actually downclock the GPU to prevent frame rates from exceeding the monitor's refresh rate... I think this would give the benefit of no screen tearing without the negatives of vsync and the like.

Thanks again for the article guys.

March 29, 2013 | 02:37 AM - Posted by technogiant (not verified)

Actually, using GPU Boost dynamically to both underclock and overclock the GPU to achieve a target frame rate could be a very nice way of producing a smoothed experience without the negatives of the other methods, since it happens directly at the GPU instead of somewhere in the game-engine-to-display timeline.
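For what it's worth, the feedback loop being described would look something like the toy Python sketch below. It is entirely hypothetical - no vendor exposes such a control loop, and the inverse clock-to-frame-time model is a gross simplification - but it shows the basic proportional feedback involved.

```python
# Toy proportional controller: nudge the GPU clock so render time converges
# on one frame per refresh of a 120 Hz panel. Hypothetical model only.
TARGET_MS = 1000.0 / 120
GAIN = 5.0  # MHz of clock adjustment per ms of frame-time error

clock_mhz = 900.0
for frame in range(8):
    frame_time_ms = 8000.0 / clock_mhz  # pretend time scales inversely with clock
    error_ms = frame_time_ms - TARGET_MS
    clock_mhz += GAIN * error_ms  # clock up when late, down when early
    print(f"frame {frame}: {frame_time_ms:6.2f} ms at {clock_mhz:5.0f} MHz")
```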

March 29, 2013 | 05:01 AM - Posted by Pick23 (not verified)

So is this article saying that even with the new testing methodologies:
1. Single card 7970 GHz is still slightly better than the 680
2. Crossfire absolutely sucks
?

March 29, 2013 | 05:18 AM - Posted by John Doe (not verified)

7970 GHz is slightly better than a 680 ONLY at stock.

When you're comparing the 7970 GHz to the 680, things ENTIRELY depend on clock speeds, since the 7970 GHz is nothing more than a pre-OC'ed 7970.

But yes, CF does indeed sorta suck. Still.

March 29, 2013 | 04:26 PM - Posted by Anonymous (not verified)

Sweet 7970 still the best card under.. well under $1000 lol

March 30, 2013 | 06:21 PM - Posted by steen (not verified)

What's stock for a 680? ;) 7970 GE is slightly slower than Titan...

CF sucks just like SLI. What's your poison, input lag or frame metering? Do people understand what "runts" are? CF is actually rendering the frames, you just don't benefit because they arrive too close together. One frame renders only the top 1/4 of the screen before the next frame starts. Your top ~200 lines are the fastest on your screen. ;)

(Sorry for the previous multi-posts. Don't know what happened there.)

April 7, 2013 | 10:15 AM - Posted by John Doe (not verified)

I have run far more CF and SLI setups than you have FOR YEARS and understand far more about these things than you and your little silly mind do.

March 29, 2013 | 08:39 AM - Posted by fausto412 (not verified)

interesting piece, good job pcper.com

now I wonder: when AMD does a global fix, will my 6990's performance in BF3 be dramatically improved?

and what is the effect of correcting this on latency and actual framerate? will we see FPS go down in exchange for better frametimes?

It is obvious Nvidia was on top of this for some time... I just don't see a proper fix coming in 120 days.

March 29, 2013 | 09:24 AM - Posted by Chad Wilson (not verified)

Just out of scientific curiosity, did you do a run without the splitter in the mix to confirm that the splitter is not adding any latency to your tests?

March 29, 2013 | 09:33 AM - Posted by Anonymous (not verified)

Do people actually use vsync without some form of triple buffering?

I don't see how. Jumping between 30 and 60fps or whatever is not an enjoyable, nor smooth experience.

So, if you can enable vsync AND allow the game to sweep through a normal range of available framerates, does this negate the increased frame times of constantly switching back and forth between high fps and low fps?

March 29, 2013 | 09:56 PM - Posted by bystander (not verified)

V-sync, even with triple buffering, still jumps back and forth between 16ms and 33ms frames, but it does so between frames. A 120Hz monitor helps here, as you can have 25ms frames too, so there is less variance.
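To illustrate the quantization being described here, a small Python sketch (assuming plain double/triple buffering with no tearing, so a finished frame is only shown on a refresh boundary):

```python
import math

def displayed_time_ms(render_ms, refresh_hz):
    """With v-sync a frame waits for the next refresh boundary, so its
    on-screen time rounds up to a multiple of the refresh interval."""
    interval = 1000.0 / refresh_hz
    return math.ceil(render_ms / interval) * interval

for hz in (60, 120):
    times = [round(displayed_time_ms(t, hz), 1) for t in (12.0, 20.0, 28.0)]
    print(f"{hz} Hz: {times}")
# 60 Hz:  [16.7, 33.3, 33.3]  -> frames jump between 16.7 ms and 33.3 ms
# 120 Hz: [16.7, 25.0, 33.3]  -> the 25 ms step softens the swing
```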

March 29, 2013 | 09:36 AM - Posted by Anonymous (not verified)

Furthermore, is playing a game without vsync enabled REALLY an option?

Are you sure gamers all over the world disable it to be rid of the latency issues? I'm not so sure.

I'll happily take a little latency in a heated round of Counter-Strike rather than end up dead, or miss my shot, because 50% of the screen shifted 8 feet to the right (screen tearing).

Pretty much all games are unplayable to me without vsync, and I'm not convinced it's a matter of personal preference either - if you say you enjoy your experience while your frames are tearing, I'd just call you a mean name that insinuates you're not telling the truth.

March 29, 2013 | 11:00 AM - Posted by Marty (not verified)

If you are a competitive player, VSync is not an option, you are lagging an extra frame behind.

March 29, 2013 | 09:45 AM - Posted by rezes

Where are the new tests? And when?

March 29, 2013 | 11:20 AM - Posted by Anonymous (not verified)

So how much did nVidia pay you?

While I can see the potential in this kind of testing, and some of the issues you have mentioned are valid, you have drawn quite a bold and one sided conclusion using the competitor's software. I'll save my judgements for when this becomes open source.

March 29, 2013 | 05:12 PM - Posted by Fergdog (not verified)

It's not purely Nvidia-made software; if you read the article or paid attention to this site, you'd know Ryan co-developed this benchmark with Nvidia and has been working on it for a long time.

April 2, 2013 | 08:10 PM - Posted by Anonymous (not verified)

nVidia has to do almost everything, the amd fans need to get used to it.

AMD's years-long broken bottom line and years of massive layoffs and closings, and yet they claim they "weren't even aware of this issue!?!!"

- like every other total screw up AMD gets caught on with their dragon drawers on the floor next to their spider platform, buck naked in epic fail and presumably clueless...

Maybe we need some depositions and subpoenas of internal emails to see just how much they covered up their epic fail here.

March 29, 2013 | 05:00 PM - Posted by Fergdog (not verified)

Quick question: for adaptive vsync, you put it on half refresh rate, didn't you?

March 29, 2013 | 05:02 PM - Posted by Fergdog (not verified)

Half-refresh adaptive only really makes sense on 120Hz monitors; I'm not sure why you used that setting on a 60Hz monitor for benchmarking.

March 29, 2013 | 08:06 PM - Posted by Anonymous (not verified)

Hoping you post the titan and and info today as promised!!!!!!!!!

March 29, 2013 | 08:07 PM - Posted by Anonymous (not verified)

Titan and amd

March 29, 2013 | 09:56 PM - Posted by MaxBV (not verified)

Waiting on that GTX 690 and Titan article still, hope you guys haven't forgotten.
