Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing

Crysis 3 – HD 7970 versus GTX 680

Crysis 3 is currently the biggest GPU hog around, and both the GTX 680 and HD 7970 handle it equally well at 1920x1080. Even using our FRAPS metrics, though, GTX 680s in SLI are scaling better than the HD 7970s in CrossFire.

The frame time plot from our Frame Rating system shows another instance of CrossFire’s inability to keep consistent animation on the screen. The single card configurations are fairly consistent with each other, though both exhibit tiny bumps in frame times in a repeating pattern, apparently a peculiarity of the CryEngine. SLI does show some increase in frame time variance across the board, with a few minor “hitches” as well. CrossFire, though, appears to be alternating between 2 ms frame times and frame times of up to 50 ms, resulting in…

Not only a lower observed frame rate, but a frame rate that is LOWER than the single card’s! I can tell you from first-hand experience that this was definitely the case in our playthroughs as well; it felt slower than the single card experience.
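To make that alternation concrete, here is a minimal sketch (not our actual analysis code) of how you might flag the short/long pattern in a list of per-frame render times, assuming times in milliseconds as a FRAPS frametimes log would provide them; the 2x ratio threshold is illustrative:

```python
# Minimal sketch: flag alternating short/long frame times (micro-stutter).
# frame_times_ms would come from a FRAPS frametimes log; the 2x ratio
# cutoff is an assumption for illustration, not a published threshold.

def stutter_ratio(frame_times_ms, ratio=2.0):
    """Fraction of adjacent frame pairs whose times differ by more than
    `ratio`x -- e.g. a 2 ms frame followed by a 50 ms frame."""
    pairs = list(zip(frame_times_ms, frame_times_ms[1:]))
    uneven = sum(1 for a, b in pairs if max(a, b) / max(min(a, b), 1e-6) > ratio)
    return uneven / max(len(pairs), 1)

crossfire_like = [2, 50, 3, 48, 2, 51, 4, 47]   # alternating pattern
single_card = [33, 34, 33, 35, 33, 34, 33, 34]  # consistent animation
print(stutter_ratio(crossfire_like))  # 1.0 -> every adjacent pair is uneven
print(stutter_ratio(single_card))     # 0.0 -> smooth
```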

SLI looks fantastic in this graph, taking the matched performance of the GTX 680 and HD 7970 from a 30 FPS average over the entire run up to 57 FPS.

Our custom ISU rating tells me that the GTX 680 SLI configuration looks GREAT, differing from the single card configurations only at the 95th percentile and above. The HD 7970s in CrossFire, though, show huge amounts of variance from the outset and do in fact exhibit a lot of stutter in game as well.
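For readers who want to reproduce something in the spirit of these curves, here is a rough sketch; the exact math behind our ISU rating differs, so treat the formula below as an assumption for illustration:

```python
# Sketch of a percentile frame-variance curve, in the spirit of the ISU
# graphs; the formula here is an illustrative assumption only.

def variance_by_percentile(frame_times_ms, percentiles=(50, 75, 90, 95, 99)):
    """Frame-to-frame time deltas (ms), sorted, sampled at each percentile.
    A flat curve means smooth animation; a curve that explodes at high
    percentiles means stutter concentrated in a few very bad frames."""
    deltas = sorted(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))
    pick = lambda p: deltas[min(int(len(deltas) * p / 100), len(deltas) - 1)]
    return {p: pick(p) for p in percentiles}

smooth = [16.7] * 99 + [30.0]   # one isolated hitch at the end of the run
stuttery = [2, 50] * 50         # constant 2 ms / 50 ms alternation
print(variance_by_percentile(smooth))    # near zero until the very top
print(variance_by_percentile(stuttery))  # ~48 ms from the 50th percentile on
```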

At the higher resolution the single card HD 7970 has a slight edge over the GTX 680 this time around, and this time the scaling of CrossFire appears to be better than SLI's.

A quick glance at our observed frame rate results clearly shows that isn't the case, though, as the CrossFire experience only matches that of the dual GTX 680s in SLI in short bursts.
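The difference between the two numbers comes down to what gets counted: FRAPS counts every frame the game presents, while the observed rate counts only frames that occupy meaningful screen area in the capture. A toy contrast, where the 21-scanline runt cutoff is an assumption rather than our exact rule:

```python
# Toy contrast between FRAPS FPS and observed FPS. The scanline heights
# would come from the captured overlay; the runt cutoff is illustrative.

def fraps_fps(frame_count, seconds):
    return frame_count / seconds            # counts every Present() call

def observed_fps(scanlines_per_frame, seconds, runt_cutoff=21):
    visible = [h for h in scanlines_per_frame if h >= runt_cutoff]
    return len(visible) / seconds           # counts only frames you can see

# Alternating full frames and runts over one second of capture:
heights = [540, 8, 540, 6, 540, 9] * 10
print(fraps_fps(len(heights), 1.0))   # 60.0 "FPS" by FRAPS' counting
print(observed_fps(heights, 1.0))     # 30.0 FPS actually observed
```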

Our frame time plot indicates where the alternating frame times occur with CrossFire in Crysis 3 and how that relates to the performance of SLI. Other than the single large hitch in SLI at around the 12 second mark, the GTX 680s handle Crysis 3 much better.

The GTX 680s are able to scale from about 19 FPS on average to 36 FPS - a solid 89% scaling factor (36 / 19 ≈ 1.89x). The HD 7970 GHz Edition cards are not so impressive, going only from 21 FPS to 25 FPS, and even that advantage quickly falls back to parity with a single card.

Ouch, another very poor result here for HD 7970s in CrossFire with Crysis 3 at 2560x1440, with as much as 25 ms of frame variance (about one and a half refresh cycles at 60 Hz).

This is one of the few Eyefinity runs where the HD 7970 CrossFire configuration was able to run to completion and generate the necessary graphs for us, so we’ll finally get to see some interesting results. Even at first glance, we can tell that something here isn't quite right: according to FRAPS, the HD 7970s are pushing out more than 200 FPS in Crysis 3 at 5760x1080, which is obviously inaccurate.

Ouch, there are definitely some problems here, not the least of which is the graph's poorly chosen range maximums (we'll fix that soon!). Notice the spots on the plot of the orange line (HD 7970 CF) where there is no data - these indicate dropped frames and a lot of frame time variance.

Removing those, and any runts, we find that the observed FPS is actually right in line with that of the single HD 7970 graphics card. Also, with the misreported CrossFire results out of the picture, the updated scale lets us see the scaling of the GTX 680s in SLI.

Here again is another one of our RUN files to show you the effects of dropped frames on Eyefinity testing. The FRAPS-based frame rate skyrockets even though the observed frame rate is much lower, in line with a single HD 7970 GHz Edition card. There are some runts involved, but the biggest factor is obviously the dropped frames (missing colors in our pattern).
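At its core, this analysis walks the overlay colors frame by frame: a color skipped in the repeating sequence is a dropped frame, and a color that occupies only a sliver of scanlines is a runt. A simplified sketch, where the 16-color cycle length and the runt cutoff are assumptions rather than our production values:

```python
# Simplified RUN-style classification from the overlay color bar.
# `captured` lists (color_index, scanline_height) for each sliver seen
# on screen; the 16-color cycle and runt cutoff are assumed values.

CYCLE = 16        # length of the repeating color sequence (assumption)
RUNT_CUTOFF = 20  # slivers shorter than this count as runts (assumption)

def classify(captured):
    runts, drops = [], 0
    prev = None
    for color, height in captured:
        if prev is not None:
            # A color skipped in the sequence was rendered (FRAPS counted
            # it) but never reached the display: a dropped frame.
            drops += (color - prev - 1) % CYCLE
        if height < RUNT_CUTOFF:
            runts.append(color)
        prev = color
    return runts, drops

# Colors 2 and 5 never reach the screen; color 3 shows as a 4-line runt.
print(classify([(0, 500), (1, 510), (3, 4), (4, 505), (6, 498)]))
# -> ([3], 2)
```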

Again, for comparison, here is the RUN graph for the GTX 680s running in SLI at 5760x1080.  Notice that the frame rate is consistent with no drops or runts.

Interesting results here, with the CrossFire setup taking a lower position across the board in our percentile minimum frame rate graphs. We also see the pair of GTX 680s in SLI start out much faster than anything else tested, but fall below the HD 7970 at around the 94th percentile.
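As a reminder of how to read these graphs, the value at the Xth percentile is the frame rate the card sustains for X percent of the run's frames. In sketch form, with made-up inputs:

```python
# Sketch of a percentile minimum frame rate: the FPS that p percent of
# frames meet or beat. Inputs here are illustrative numbers.

def percentile_min_fps(frame_times_ms, p):
    fps = sorted((1000.0 / t for t in frame_times_ms), reverse=True)
    return fps[min(int(len(fps) * p / 100), len(fps) - 1)]

times = [17.0] * 94 + [40.0] * 6       # mostly smooth, a slow tail
print(percentile_min_fps(times, 50))   # ~58.8 FPS at the median
print(percentile_min_fps(times, 94))   # 25.0 FPS -- the curve falls off
```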

Keeping in mind that we are looking at pretty low frame rates across the board, the HD 7970 has the best overall result here in our ISU graph, with the least frame variance over the course of our 60 second run. Obviously CrossFire has a big issue once again, seeing significant variance starting at the 50th percentile, and it only gets worse from there.

March 27, 2013 | 06:16 AM - Posted by grommet

Hey Ryan, is the variable "t_ready" that the text refers to "t_present" in the diagram two paragraphs above?

March 27, 2013 | 06:22 AM - Posted by Ryan Shrout

Yes, fixed, thanks.

March 27, 2013 | 07:20 AM - Posted by Prodeous (not verified)

I was just wondering... since the capture and analysis system relies on the left bar only, why doesn't it truncate the rendered frame itself and keep only 1-3 pixels from the left side of the frame?

For some tests, if you want to show off specific "runts" or "stutters", you can keep the entire frame captured.

But for most tests, you could record only the left colour bar and do the analysis on that bar alone, so you wouldn't have to save the 600GB per second of uncompressed frames.

Just a thought.
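For what it's worth, the crop itself is trivial once the raw frame is in memory. A rough sketch of the suggestion (in Python with numpy; the 3-pixel bar width and the array layout are assumptions):

```python
# Sketch of the suggestion above: keep only the left overlay bar before
# frames hit disk. Assumes raw frames as numpy arrays (H x W x RGB).
import numpy as np

BAR_WIDTH = 3  # pixels of the left color bar to keep (assumption)

def crop_to_bar(frame):
    return frame[:, :BAR_WIDTH, :].copy()

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # one raw 1080p frame
print(crop_to_bar(frame).nbytes, "vs", frame.nbytes)
# 9720 bytes vs 6220800 -- roughly a 640x reduction in storage.
```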

March 27, 2013 | 07:40 AM - Posted by Ryan Shrout

We have that option (notice the crop capture option in our software) but we often like to reference recorded video down the road.

March 28, 2013 | 05:53 AM - Posted by Luciano (not verified)

If you write a piece of software with that "colored" portion only in mind, you don't need any of the additional hardware, and any user could use it just like Fraps.

March 28, 2013 | 04:02 PM - Posted by Anonymous (not verified)

LOL - AMD you are soooo freaking BUSTED !

Runt frames and zero-time ghost frames are an AMD specialty.

AMD is 33% slower than nVidia, so amd pumped in ghosts and runts !!

Their crossfire BF3 is a HUGE CHEAT.

ROFL - man there are going to be some really pissed off amd owners whose $1,000 dual gpu frame rate purchases have here been exposed as LIES.

Shame on you amd, runts and ghosts, and all those fanboys screaming bang for the buck ... little did they know it was all a big fat lie.

March 28, 2013 | 11:42 PM - Posted by Anonymous (not verified)

true words of a fan boy.

May 26, 2013 | 05:23 PM - Posted by Johan (not verified)

Even though I have Nvidia, his comment really was a fan-boy comment.

March 27, 2013 | 06:43 AM - Posted by Anon (not verified)

Run the benchmarks like any sane gamer would!

Running v-sync disabled benches with 60+ fps is dumb!

Running v-sync enabled benches with 60- fps is way more dumb!

YOU'RE DOING IT WRONG!

March 27, 2013 | 07:00 AM - Posted by John Doe (not verified)

Don't forget that V-sync also sometimes fucks shit up.

The best solution is to limit the frames on the game's end, or to use a 3rd party frame limiter.
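The core idea of any frame limiter is the same: render, then sleep away the rest of the frame's time budget instead of presenting immediately. A toy sketch of that loop (illustrative only, not any particular limiter's implementation):

```python
# Toy frame limiter: render, then sleep off the rest of each frame's
# time budget. Real limiters hook the game's present call rather than
# wrapping a render function like this.
import time

def run_capped(render_frame, cap_fps=60, frames=300):
    budget = 1.0 / cap_fps
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        deadline += budget
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)           # under budget: wait it out
        else:
            deadline = time.perf_counter()  # over budget: don't spiral

run_capped(lambda: None, cap_fps=60, frames=60)  # ~1 second wall time
```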

March 27, 2013 | 07:51 AM - Posted by Anon (not verified)

And sometimes disabling it fucks physics like in Skyrim.

Your point?

March 27, 2013 | 07:18 AM - Posted by grommet

Read the whole article before commenting- there is an entire page that focuses on your argument (surprise- you're not the first to suggest this!!)

March 27, 2013 | 07:32 AM - Posted by John Doe (not verified)

And if you have actually watched one of his streams, you'd have seen that he INDEED IS doing it "wrong".

When he was testing Tomb Raider, the card was artifacting, flickering and shimmering. It was completely obvious that he didn't know what he was doing.

March 27, 2013 | 07:31 AM - Posted by Anonymous (not verified)

To the guy who says "you're doing it wrong".

Not true. He's doing it right. Many hardcore gamers run v-sync disabled over 60 fps, and only a few of the scenarios tested are consistently over 60fps.

And read the story to see that the OBSERVED FPS is much less in many cases, not just the FRAPS FPS (which does not properly reflect what users actually see at the end of the pipeline, at the display).

March 27, 2013 | 07:38 AM - Posted by John Doe (not verified)

There's absolutely nothing "hardcore" about running an old ass game at 1000 FPS with a modern card.

All it does is waste resources and make the card work at its hardest for no reason. It's no different than shifting gear at, say, a maximum of 6500 RPM in a car that revs to 7000 per gear. You should keep it lower so that the hardware doesn't get excessively stressed.

If the card is obviously way beyond the generation of the game you're playing, then you're best off, if possible, LIMITING frames/putting a frame lock on your end.

March 27, 2013 | 07:42 AM - Posted by Ryan Shrout

Vsync does cause a ton of other issues including input latency.

March 27, 2013 | 07:44 AM - Posted by Anon (not verified)

You've said this already in both the article and comments but you only talked about input latency.

What are those other issues?

March 27, 2013 | 07:52 AM - Posted by John Doe (not verified)

It makes the entire game laggy and unplayable depending on conditions.

You can read up on Vsync over at that TweakGuides guy's site.

March 27, 2013 | 08:09 AM - Posted by Josh Walrath

Some pretty fascinating implications about the entire game/rendering pipeline.  There are so many problems throughout, and the tradeoffs tend to impact each other in varyingly negative ways.  Seems like frames are essentially our biggest issue, but how do you get around that?  I had previously theorized that per pixel change in an asynchronous manner would solve a lot of these issues, but the performance needed for such a procedure is insane.  Essentially it becomes a massive particle model with a robust physics engine being the base of changes (movement, creation, destruction, etc.).

March 27, 2013 | 11:18 AM - Posted by Jason (not verified)

research voxels... the technology exists but is pretty far off.

March 27, 2013 | 02:44 PM - Posted by Anonymous (not verified)

Regarding per-pixel-updates: Check this out: https://www.youtube.com/watch?feature=player_embedded&v=pXZ33YoKu9w

It's obvious the result must be delivered in frames (no way around it with current monitor technology), but the engine in the video clearly works differently from the usual approach by putting into the framebuffer only the newly calculated pixels that are ready in time.

March 27, 2013 | 11:07 AM - Posted by bystander (not verified)

Because many frames are forced to wait for the next vertical retrace while others are not, you get some time metering issues. This can result in a form of stuttering.

March 28, 2013 | 11:49 PM - Posted by Anonymous (not verified)

Hardcore gamer here, playing BF3 with v-sync ON!

Because:

1. My monitor (like 90% of us have) is 60 Hz!
I don't need the extra stress/heat/electric juice on the card/system.

2. I'm fed up with frame tearing every time I turn around.

March 27, 2013 | 07:27 AM - Posted by Prodeous (not verified)

With regards to the Nvidia v-sync settings, it seems that Adaptive (half refresh rate) was selected, capping it at 30 FPS, versus Adaptive, which would cap at 60 FPS.

Was that on purpose?

March 27, 2013 | 07:31 AM - Posted by Noah Taylor (not verified)

I'm still interested to see what kind of results you come up with when using AMD CrossFire with a program like Afterburner's RivaTuner to limit the fps to 60, which seems to be everyone's preference for this situation.

March 27, 2013 | 07:42 AM - Posted by Ryan Shrout

I think you'll find the same problems that crop up with Vsync enabled.

March 27, 2013 | 08:14 AM - Posted by Noah Taylor (not verified)

I have to admit the observed FPS graphs are DAMNING to AMD, and I own two 7970s and two 7950s, so I'm not remotely biased against them in any way. One thing I did notice is that dropped frames don't affect every game, so hopefully this is something AMD may be able to mitigate through driver tweaks.

I have to admit, Crysis 3 can be a sloppy affair in AMD CrossFire, and now I can see exactly what I'm experiencing without trying to make guesswork out of it.

Regardless, the bottom line EVERYONE should take away from this is that CrossFire DOES NOT function as intended whatsoever, and we can now actually say AMD is deceptive in their marketing as well. This is taken directly from AMD's website advertising CrossFire:

"Tired of dropping frames instead of opponents? Find a CrossFire™-certified graphics configuration that’s right for you."

They have built their business on a faulty product, and every CrossFire owner should speak up so that AMD makes the changes they have control over to FIX a BROKEN system.

March 27, 2013 | 08:17 AM - Posted by Josh Walrath

Just think how bad a X1900 CrossFire Edition setup would do here...  I think Ryan should dig up those cards and test them!

March 27, 2013 | 08:49 AM - Posted by SB48 (not verified)

I wonder if that's one of the reasons why AMD never really released the 7990, and whether this was already an obvious problem with the 3870 X2 or previous generation cards like the HD 5000 and 6000 series.

Anyway, is there any input lag difference between NV and AMD (SLI vs. CF) without vsync?

Also, I would be curious to see some CPU comparisons.

March 27, 2013 | 10:31 AM - Posted by John Doe (not verified)

I had both a dual 3870 and a dual 2900 setup; both were more or less the same thing.

Both drove a CRT at 160 Hz for Cube and both were silky smooth.

This is a recent issue; it has nothing to do with cards of that age. The major problem with those old cards was the lack of CrossFire application profiles. Before the beginning of 2010, you had absolutely no application profiles like you have with nVidia, so CF either worked or you had to change the game ".exe" names, which either worked, made the game mess up, or just kicked you out of Steam servers due to VAC.
