
Frame Rating: A New Graphics Performance Metric


A change is coming in 2013

If the new year brings us anything, it looks like it might be the end of using "FPS" as the primary measuring tool for graphics performance on PCs.  A long, long time ago we started with simple "time demos" that recorded rendered frames in a game like Quake and then played them back as quickly as possible on a test system.  The lone result was a time, in seconds, which was then converted to an average frame rate using the known total number of frames in the recording.

More recently we saw a transition to frame rates over time and the advent of frame time graphs like the ones we have been using in our graphics reviews on PC Perspective. This expanded the amount of data required to get an accurate picture of graphics and gaming performance, but it was indeed more accurate, giving us a clearer image of how GPUs (and CPUs and systems, for that matter) perform in games.

And even though the idea of frame times has been around just as long, not many people were interested in getting into that level of detail until this past year.  A frame time is the amount of time each frame takes to render, usually listed in milliseconds, and can range from 5ms to 50ms depending on performance.  For reference, 120 FPS equates to an average of 8.3ms per frame, 60 FPS to 16.7ms, and 30 FPS to 33.3ms.  But rather than averaging those out over each second of time, what if you looked at each frame individually?
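The conversion between the two views is simple arithmetic; here is a minimal Python sketch for reference (illustrative only, not part of our test tooling):

```python
# Minimal sketch: converting between an average frame rate and the
# corresponding per-frame render time.
def fps_to_frame_time_ms(fps: float) -> float:
    """Average milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

def frame_time_ms_to_fps(ms: float) -> float:
    """Frame rate equivalent to a given per-frame render time."""
    return 1000.0 / ms

for fps in (120, 60, 30):
    print(f"{fps} FPS -> {fps_to_frame_time_ms(fps):.1f} ms per frame")
# 120 FPS -> 8.3 ms, 60 FPS -> 16.7 ms, 30 FPS -> 33.3 ms
```

The point of looking at frames individually is that two cards with identical averages can have very different distributions of those per-frame times.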


Scott over at Tech Report started doing exactly that this past year and found some interesting results.  I encourage all of our readers to follow what he has been doing, as I think you'll find it incredibly educational and interesting.

Through emails and tweets, many PC Perspective readers have been asking for our take on it, why we weren't testing graphics cards in the same fashion yet, and so on.  I've stayed quiet about it simply because we were working on quite a few different angles on our side and I wasn't ready to share results.  I am still not ready to share the glut of information we have gathered, but I am ready to start the discussion, and I hope our community finds it compelling and offers some feedback.

[Image: the high-end dual-link DVI capture card at the heart of our GPU test bed]

At the heart of our unique GPU testing method is this card, a high-end dual-link DVI capture card capable of handling 2560x1600 resolutions at 60 Hz.  Essentially this card will act as a monitor to our GPU test bed and allow us to capture the actual display output that reaches the gamer's eyes.  This method is the best possible way to measure frame rates, frame times, stutter, runts, smoothness, and any other graphics-related metrics.

Using that recorded footage, which sometimes reaches 400 MB/s of sustained writes at high resolutions, we can then analyze the frames one by one, albeit with the help of some additional software.  There are a lot of details I am glossing over, including the need for perfectly synced frame rates and absolutely zero dropped frames in the recording and analysis, but trust me when I say we have been spending a lot of time on this.
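To give a sense of what "analyzing the frames one by one" involves, here is a deliberately naive Python sketch (illustrative only, not our actual analysis software) that hunts for the tear lines where one rendered frame ends and the next begins within a captured refresh, by comparing consecutive captures row by row:

```python
# Illustrative sketch only, not the actual analysis tool: locate candidate
# tear lines inside a captured refresh by comparing it row-by-row against
# the previous capture. Assumes raw RGB captures as numpy arrays of shape
# (height, width, 3).
import numpy as np

def tear_rows(prev: np.ndarray, curr: np.ndarray, threshold: float = 8.0) -> list[int]:
    """Return scanline indices where the capture flips between unchanged
    and newly rendered content - candidate tear locations."""
    # Mean absolute difference per scanline between consecutive captures.
    row_diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16)).mean(axis=(1, 2))
    changed = row_diff > threshold
    # A tear candidate is any row where the changed/unchanged state flips.
    return [y for y in range(1, changed.size) if changed[y] != changed[y - 1]]
```

A real pipeline also has to cope with captures where consecutive frames barely differ (standing still in a game, for instance), which is one reason this kind of analysis is harder than it looks.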


The result is a set of multi-gigabyte files (60 seconds of game play produces a 16GB file) that contain every frame presented to the gamer's monitor, exactly as it was displayed.
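Those figures are easy to sanity-check with some back-of-the-envelope arithmetic, assuming an uncompressed 24-bit RGB stream (a simplifying assumption; the exact capture format matters a great deal in practice):

```python
# Back-of-the-envelope check on the data rates quoted above. The 24-bit RGB
# pixel format is an assumption for illustration.
width, height, refresh = 2560, 1600, 60
bytes_per_pixel = 3

uncompressed = width * height * bytes_per_pixel * refresh   # bytes per second
print(f"Uncompressed capture: {uncompressed / 2**20:.0f} MiB/s")  # ~703 MiB/s

per_second = 16 * 2**30 / 60   # "60 seconds of game play produces a 16GB file"
print(f"Implied on-disk rate: {per_second / 2**20:.0f} MiB/s")    # ~273 MiB/s
```

The gap between the raw and on-disk rates suggests a lighter pixel format or some compression in the recording path, though that is a guess from the numbers alone.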

[Image: a captured frame from Unigine]

[Image: the same Unigine capture, annotated in Photoshop to mark the three partial frames]

The second image was made in Photoshop to show you the three different "frames" of Unigine that make up the single frame the gamer would actually see.  With a 60 Hz display, this equates to three frames being shown within a single 16.7ms refresh, though each frame occupies a variable amount of real estate on the screen.
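One way to think about that variable real estate (a sketch of one possible interpretation, not a formula we have settled on) is to apportion the refresh interval among the partial frames in proportion to the scanlines each one occupies:

```python
# Sketch of one possible interpretation: apportion a single 60 Hz refresh
# among the partial frames it contains, proportional to the scanlines each
# frame occupies. The example row counts are hypothetical.
REFRESH_MS = 1000.0 / 60   # ~16.7 ms per refresh at 60 Hz
TOTAL_ROWS = 1600          # vertical resolution of the capture

def on_screen_time_ms(rows: int) -> float:
    """Display time attributable to a partial frame, in milliseconds."""
    return REFRESH_MS * rows / TOTAL_ROWS

# e.g. three partial frames splitting one refresh:
for rows in (900, 688, 12):
    print(f"{rows:4d} rows -> {on_screen_time_ms(rows):5.2f} ms on screen")
```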

We are still finalizing what we can do with all of this data, but it definitely allows us to find some unique cases:

[Image: a captured frame from Sleeping Dogs showing three partial frames]

This shot from Sleeping Dogs shows three frames in a single 16.7ms refresh sent to the monitor, but notice how little screen space the green frame takes up - is measuring this frame even useful for gamers?  Should it be dropped from the performance metrics altogether?
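One option (sketched below with an arbitrary placeholder threshold, not a value we have settled on) is to flag frames below a certain number of scanlines as "runts" and track them separately from frames the gamer could plausibly perceive:

```python
# Hypothetical sketch: treat frames that occupy only a sliver of the screen
# as "runts" and count them separately. The 20-scanline threshold is an
# arbitrary placeholder, not a published value.
RUNT_THRESHOLD_ROWS = 20

def split_runts(frame_heights: list[int]) -> tuple[list[int], list[int]]:
    """Partition per-frame scanline counts into countable frames and runts."""
    real = [h for h in frame_heights if h >= RUNT_THRESHOLD_ROWS]
    runts = [h for h in frame_heights if h < RUNT_THRESHOLD_ROWS]
    return real, runts

real, runts = split_runts([900, 688, 12])   # the refresh described above
print(f"{len(real)} countable frames, {len(runts)} runt(s)")
```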

I put together the short video below with a bit more of my thoughts on this topic, but I can assure you that January and February are going to bring some major changes to the way graphics cards are tested.  Please, we want your thoughts!  This is an important dialogue not only for us but for all PC gamers going forward.  Scott has done some great work with "Inside the Second" and we hope that we can offer additional information and clarification on this problem.  Rest assured we have not been sitting on our hands here!

[Video: further thoughts on Frame Rating]

Is this kind of testing methodology going to be useful?  Is it overkill?  Leave us your thoughts below!

January 4, 2013 | 06:22 PM - Posted by David (not verified)

This is awesome!

Scott and the guys over at Tech Report have made me a believer and this promises to offer a nice twist on latency-based testing. Very curious to see how the actual benchmark testing evolves out of this!

January 5, 2013 | 10:24 AM - Posted by I don't have a name. (not verified)

Looks awesome Ryan. This sounds like a unique and innovative way of doing things, and should provide a great second perspective to what Scott at TR has been doing. :)

January 5, 2013 | 12:28 PM - Posted by shellbunner (not verified)

Always loved your reviews and have been a long-time fan of this site since the AMD days. I'm REALLY looking forward to your input on this topic, as it has exploded over the last month. It will definitely be a great resource, especially for the next generation of GPUs that will be here in a few months.
Thanks for the hard work, Ryan.

January 5, 2013 | 04:21 PM - Posted by The Sorcerer (not verified)

I am more curious about that tool. Will you be making it available to the public? At least putting up some screenshots of the tool would really help!

January 5, 2013 | 05:51 PM - Posted by mtcn77

I cannot wait to read your review.
I just wonder if the display splitter might incur some nasty "input lag", just as that epic 30" Dell display might in real time. I have noticed some unexplained input lag discrepancies in reviews, growing the bigger the screen gets from 23" and up.

January 5, 2013 | 09:15 PM - Posted by Anonymous (not verified)

No point in testing when PCPer doesn't use the latest Nvidia drivers, constantly handicapping Nvidia hardware by running outdated drivers.

January 5, 2013 | 10:36 PM - Posted by Anonymous Coward (not verified)

Dear Ryan

I hope that by the sheer volume of comments on a post that's not a giveaway, you can see the intense interest in this sort of testing. We're sick and f***ing tired of being lied to with FPS, and of having both AMD and Nvidia optimize their cards for the metrics they know will end up in reviews. Smoothness is a big deal, and is the #1 thing people want to know about. I look forward to PCPer having some friendly competition with Tech Report in advancing the state of the art, because we're all so sick of FPS charts that lie to our faces. I remember playing Battlefield 3 at 100+ FPS and having it feel like garbage (tri-fire 6970s, for those curious).

January 6, 2013 | 01:22 AM - Posted by Anonymous (not verified)

Micro stutter is especially a problem with CF!

I also had tons of problems in the beginning and was on the verge of selling my second 5870, when I read on the WSGF about a guy who got a third card, which reduced the problem to within tolerable limits for him.

I then got a third 2GB Matrix, thinking that if it didn't fix my problem I would just send it back and sell my second card, but yes, for me the problem was also reduced significantly. Six months ago I got a cheap fourth 5870 Matrix to bridge me over until there are 8970s or 780s with more than 4GB of memory (as I want to use a 6000x1920 setup).

And the fourth card reduced the micro stutter even a little more; the general theory was that every card got more time to prepare its next frame.

Now I hear that the driver for the GCN 7xx0 cards is badly optimized for preventing micro stutter, compared to my VLIW 5870 cards.
(So I am actually glad I waited an extra generation ;-)

Now I am wondering, once you have your testbed ready for prime time, are you going to do some single, CF/SLI, Tri-CF/SLI and Quad-CF/SLI testing?

And I think it would actually be interesting to see the 5870 vs 6970 vs 7970 vs 480 vs 580 vs 680, up to quad setups, to see if different generations act differently.

January 8, 2013 | 12:07 AM - Posted by Anonymous (not verified)

How will this affect low-end cards that might have trouble splitting the signal to both computers? Would the actual splitter handle the bulk of the issue?

January 10, 2013 | 01:52 PM - Posted by arbiter

They are using an inline DVI splitter to split the signal, so the low-end cards don't even see that it's being split.

January 8, 2013 | 06:07 AM - Posted by kukreknecmi (not verified)

How will the capture card and capture system actually determine / distinguish each frame? Is there a signal on the DVI link saying that one frame is done and this is a new frame, so the capture card/system knows it is a new frame and can timestamp it? From Fraps' point of view, I understand that since it runs on the same system, it can near-ideally identify and timestamp the frame generation times and write them to a CSV. Trying to identify the frames from outside the system, I'm just confused about how it will work. Of course, I don't know if there is a frame-drawn kind of signal/feedback on the DVI link.

January 10, 2013 | 01:54 PM - Posted by arbiter

The different frames being drawn can be seen by the tearing effects in the recorded video. Look at the screenshots above and you can see slight tearing of the image.

January 10, 2013 | 09:47 PM - Posted by kukreknecmi (not verified)

Do you really have an idea of how it works, or are you just assuming? How can it detect frames if I stand still? I guess there is almost no tearing then. Does the capturing system know which frame is being drawn on the gaming system so it can timestamp it? It is a different version of recording with a camera, I got that part; it can detect tearing/cheating etc., I got that too. I just don't get how it will detect the real frame generation times and build statistics from them like Fraps does. Without the frame time data it can't accurately measure the frame generation / latency times on the gaming system.


January 8, 2013 | 11:46 PM - Posted by bootdiscerror

16GB a minute is an insane amount of data to sort through. I look forward to the future and what this will bring. No more 'padding the specs'.

Great stuff!

January 9, 2013 | 07:08 AM - Posted by BigMack70 (not verified)

Yes please! This stuff is awesome, guys... I've been hoping for a while now that sites like yours and others would take the idea Tech Report had of looking at frame times and develop it further yourselves.

Can't wait to see what you guys come up with as a final method of evaluation.

January 9, 2013 | 01:12 PM - Posted by markt (not verified)

Ryan, reading all the stuff I've read, I understand why you have kept this project hush-hush... some people get big-time butt-hurt.

January 10, 2013 | 02:03 PM - Posted by arbiter

Yeah, well, it's going to reveal which side is cheating on the FPS numbers. Look at Tech Report, where they credited the idea from. Just because you have higher FPS doesn't mean more fluid gameplay, and frame latency can show which setup is smoother overall. This will put the proverbial instant replay on the cards to see what each card is doing to get those numbers - like in the photo above, where one 16ms refresh has three frames counted when at best two frames are fully drawn.

January 14, 2013 | 03:34 AM - Posted by Fairplay (not verified)

If you want to see some fuzzy numbers, have a look at the frame latencies of the GTX 560/570.
They have huge latency spikes compared with the AMD 6-series cards, and yet we never heard a word from Nvidia fans.
Suddenly, now that Nvidia is well behind in frame rate per dollar... "OMG it's a HUGE problem"!!!!

Typical...

January 17, 2013 | 05:57 AM - Posted by uartin (not verified)

Why does everybody keep saying that Tech Report started a new way to benchmark, when micro stutter has been discussed on review sites like PCGamesHardware and ComputerBase for ages (we are talking 2008, maybe even earlier)?

I can also remember interminable discussions in ancient threads on the Nvidia forums where people kept posting their Unreal Tournament Fraps frame time dumps, showing how slow frames would break the smoothness of gameplay in multi-GPU configurations...

January 19, 2013 | 04:46 PM - Posted by NitroX infinity (not verified)

Would you share some info on that capture card (company/model)?

And how do you measure the frame times, dropped frames, etc.? With the custom software you mentioned in the video? If so, will you be releasing that software or keeping it to yourself? I've got a lot of OLD 3D cards that I'd like to run the Frame Rating test on, and Fraps won't do since it only works with DirectX/OpenGL.

September 20, 2013 | 02:03 AM - Posted by ezjohny

This is a very good method you have going here; it shows AMD and Nvidia that someone out there is paying attention to video games and that someone cares! Good job guys, keep up the good work!
Question: I do not like it when you buy a video game and your character goes through a solid object. Who is responsible for this - is it the game developers? "Hope someone could get on them for this"!
