
Frame Rating: GeForce GTX 660 Ti and Radeon HD 7950

Battlefield 3

Battlefield 3 (DirectX 11)


 

Battlefield 3™ leaps ahead of its time with the power of Frostbite 2, DICE's new cutting-edge game engine. This state-of-the-art technology is the foundation on which Battlefield 3 is built, delivering enhanced visual quality, a grand sense of scale, massive destruction, dynamic audio and character animation utilizing ANT technology as seen in the latest EA SPORTS™ games.

Frostbite 2 now enables deferred shading, dynamic global illumination and new streaming architecture. Sounds like tech talk? Play the game and experience the difference!

Our Settings for Battlefield 3

Here is our testing run through the game, for your reference.


In our FRAPS data the HD 7950s in CrossFire appear to be faster than the GTX 660 Tis in SLI, but after removing runts the observed average frame rates are much lower, more in line with the single Radeon HD 7950.
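The runt-removal arithmetic is simple to sketch. Assuming we have the number of scanlines each frame occupied in the captured output (the 20-scanline cutoff below is an illustrative value, not the article's exact threshold), the observed frame rate counts only non-runt frames:

```python
# Illustrative sketch: derive an "observed FPS" by discarding runt frames.
# frame_heights holds the scanlines each frame occupied in the capture;
# the 20-scanline runt threshold is our assumption for illustration.

def observed_fps(frame_heights, total_seconds, runt_threshold=20):
    """Count only frames taller than the runt threshold."""
    full_frames = [h for h in frame_heights if h > runt_threshold]
    return len(full_frames) / total_seconds

heights = [540, 8, 532, 6, 534, 10, 540, 4]  # alternating full/runt frames
print(observed_fps(heights, 0.1))  # 40.0 observed, where FRAPS would say 80.0
```

With the alternating long/short pattern shown in the frame time plots, half the frames fall under the cutoff, so the observed rate is half the FRAPS rate.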


Our now familiar frame time plot demonstrates the problem with AMD's CrossFire technology and runt frames: frames taking up such a small area of the screen that they adversely affect animation performance.  While both single-GPU results look quite good, the HD 7950s running in parallel produce the mass of orange alternating long/short frames.  NVIDIA's GTX 660 Tis in SLI are slightly more variable than the single-GPU options but appear to be within reason.


The minimum FPS data shows the GTX 660 Tis in SLI running well ahead of any other option with an average frame rate of about 122 FPS, while the rest hover in the 70 FPS range.


The GTX 660 Ti cards in SLI definitely exhibit more frame time variance than the single GPUs, but the Radeon HD 7950s in CrossFire skew things dramatically.  While this doesn't definitively detect stutter, it is a good indication that you will see some with results like this.

 


Again the Radeon HD 7950s look great in CrossFire at the outset, ahead of the GTX 660 Ti cards in SLI.  But after removing those dreaded runt frames, the "real world" performance of the GTX 660 Ti cards is much better.


The Radeon HD 7950 combo actually looks worse at 2560x1440, with more variance on a frame-to-frame comparison, whereas the NVIDIA GTX 660 Ti cards appear to be tighter, resulting in a better multi-GPU experience.


Our minimum FPS percentile data shows great results from both the single GeForce GTX 660 Ti and the Radeon HD 7950 but adding a second HD 7950 doesn't change much for AMD.  Only the GTX 660 Ti is scaling.
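Percentile charts like this one can be derived from raw frame times. A minimal sketch, with invented sample values (the helper name `fps_at_percentile` is ours):

```python
# Sketch of a minimum-FPS percentile curve: convert per-frame times to
# instantaneous frame rates, sort from worst to best, and read off the
# value at a given percentile (50th = median).
def fps_at_percentile(frame_times_ms, pct):
    rates = sorted(1000.0 / t for t in frame_times_ms)  # worst first
    idx = int(pct / 100.0 * (len(rates) - 1))
    return rates[idx]

times = [8.0, 9.0, 8.5, 30.0, 8.2, 8.8, 9.5, 8.1]  # one long-frame hiccup
print(round(fps_at_percentile(times, 50), 1))  # median frame rate
print(round(fps_at_percentile(times, 0), 1))   # worst-case frame rate
```

Reading the curve left to right walks from the slowest frames to the fastest, which is why a card with occasional long frames sags at the left edge of the chart even when its average looks healthy.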


Again at 2560x1440 we see the frame variance ratings are much higher with the HD 7950s in CrossFire.

 


Eyefinity with CrossFire continues to be an even worse problem for AMD: as we can see here, there are tons of not only runts but flat-out dropped frames (never shown at all) that bring the observed frame rate down considerably.


Ech, a mess of color here.  The blue lines tell us that the 2GB GTX 660 Ti cards are struggling with 5760x1080 in SLI mode, though they are reliably showing frames to the gamer. Single-GPU results (though kind of cut off at the top, sorry!) are much smoother, with the single HD 7950 definitely faster than the single GTX 660 Ti.


The minimum FPS chart shows the single HD 7950 ahead of the GTX 660 Ti, as we expected based on the plot of frame times above (22 FPS average against 27 FPS average).  But while CrossFire sees basically no observed frame rate increase, the SLI result jumps up to 38 FPS or so at the 50th percentile.  Note, though, that SLI in this case does follow a curve that brings the frame rates back down in line with a single HD 7950 at the tail end.


Our frame time variance chart has really interesting things to tell us, starting with the expected "AMD CrossFire isn't working that well" result.  AMD's CrossFire clearly sees the largest frame-to-frame differences, though NVIDIA's SLI isn't immune here, with frame variances crossing the 15 ms mark in some fringe cases.  To put that in perspective, though, CrossFire HD 7950s see variances larger than that for more than 20% of all frames rendered!
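A figure like "20% of frames over 15 ms" comes from counting how often consecutive frame times differ by more than a threshold. One simple definition of that metric, sketched below with invented frame times (the article's exact variance formula may differ):

```python
# Sketch: fraction of frames whose time differs from the previous
# frame's by more than a threshold (15 ms, per the cutoff above).
def frac_over_threshold(frame_times_ms, threshold_ms=15.0):
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(d > threshold_ms for d in deltas) / len(deltas)

times = [10.0, 30.0, 9.0, 28.0, 11.0, 12.0]  # alternating long/short frames
print(frac_over_threshold(times))  # 0.8 -> 80% of transitions over 15 ms
```

The alternating long/short pattern is the worst case for this metric: nearly every transition swings past the threshold, which is exactly what the orange CrossFire plots show.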

 

April 3, 2013 | 07:25 AM - Posted by bystander (not verified)

The TechReport articles "were" accurate when they were released, and sparked change from AMD, who went through their drivers to fix the problem.

They didn't prove TechReport wrong. They prove AMD fixed things on single cards configurations.

April 3, 2013 | 02:59 AM - Posted by Cpt. Obvious (not verified)

Why do the graphs for "FRAPS FPS" and "Observed FPS" match exactly except for Crossfire?!

There should be at least little differences between these graphs.

You don't have FRAPS running during the capture, do you?!

April 3, 2013 | 08:15 AM - Posted by Ryan Shrout

No, FRAPS was not running at the same time as the overlay and capture testing.

April 3, 2013 | 05:38 PM - Posted by Cpt. Obvious (not verified)

So what would be the explanation for it?

The graphs match exactly.

If you measure the same scene twice, the results wont be exactly the same.

So you let the bench run, measure with FRAPS.
You let the bench run and measure with FCAT.
The results are EXACTLY the same except for crossfire.

There has to be something wrong.

April 3, 2013 | 04:48 AM - Posted by Anonymous (not verified)

Another article to confirm that with vsync off CFX is nerfed atm...

Which doesn't affect me in the slightest as i don't play with vsync off .. i won't play with vsync off fullstop.

Twin lightning 7970s @ 1400 mhz , 60 fps solid in crysis 3 with RP 60 DFC and vsync on and its butter smooth. Slightly different to the results i see in this article i might add.

My advice to any CFX user is simply run with vsync on and radeon pro DFC (dynamic frame control) 60 fps with vsync and you won't get 90 % of the issues in these articles.

It is a pity that pcper won't simply do a review with vsync on running radeon pro .. i guess they wouldn't have much to write about then.

April 3, 2013 | 06:09 AM - Posted by Lord Binky (not verified)

I don't think dropped frames are as significant to the viewing experience as the runt frames are which also cause small tears. Also, keeping the image being displayed the most current representation of the game state would seem to support dropping a frame if it gets in the way of a newer frame.

That said, I really want to see some rules of thumb applied to the graphs as to what consitutes perceptable gameplay degredation. For example in the frame variance chart, I would think there is a slope based assessment as to the type of problem the person would percieve.

April 3, 2013 | 08:16 AM - Posted by Ryan Shrout

The idea of "rules of thumb applied to the graphs as to what consitutes perceptable gameplay degredation" is a good idea, but it is much harder to actually do.  That is the eventual goal but in all honesty it will likely be an issue that changes for each gaming title.

April 4, 2013 | 10:43 AM - Posted by bystander (not verified)

I was thinking about how many FPS compared to your refresh rate has an effect on what the average frame size would be. This might mean what is considered a runt frame might be something that should scale based on FPS. This may also help narrow down when something is not acceptable.

April 3, 2013 | 09:03 AM - Posted by Verticals (not verified)

Yeah, the hidden-agenda accusations are rather bizarre considering the AMD ads, but any tech site is going to get its share of unstable posters. don't let it get to ya.

Anyway, AMD's single gpus came out looking nicer than I would have figured. My experience with previous gens had me expecting to see disparity in observed frames for the single cards as well. I'd also like to see tests on the last gen from both manufacturers, but I know that may not be a practical use of copy.

April 3, 2013 | 09:52 AM - Posted by Anonymous (not verified)

Windows causes more of those issues than everything else combined. That is what I meant in the long post Ryan didn't understand.

April 3, 2013 | 10:00 AM - Posted by Anonymous (not verified)

You have to use radeon pro, and set up a profile for each game being used on Crossfire setups. Its the only way to get them to run smoothly. Its an extra program and a bit of work but the results are promising, regarding frame time. I hope Future AMD driver versions have this feature built in.

April 3, 2013 | 10:23 AM - Posted by Anonymous (not verified)

You use nVidia Hardware to measure SOMETHING?
Evilol....

April 3, 2013 | 12:50 PM - Posted by endiZ (not verified)

Hey Ryan, great articles.

Have you tried RadeonPro with Dynamic V-sync Control? It looks they are using a similar technique that Nvidia uses with Adaptive Vsync.

April 3, 2013 | 02:06 PM - Posted by Cannyone

I'm really surprised by this article. And that is an ambivalent sort of surprise! On one hand I'm pleased to see that SLI 660 Ti's do so well at 2560x1440. On the other hand I'm dismayed that Crossfire is NOT competitive. So the crux of what I'm feeling is disappointment. Because "competition" is good for me as a consumer.

See I have a pair of Nvidia (specifically EVGA) GTX 660 Ti's in my computer. And they were chosen because I wanted to run games on a 2560x1440 display at full resolution. But it troubles me to think that AMD isn't being competitive because that points to a Market where Nvidia's dominance is most likely to mean higher prices. Which contributes to the feeling that PC Gaming may indeed be on its last leg.

The bottom line, economically speaking, is that while computers are a boon to our lives our "life" itself does not depend on them. So when things get really tight we will find ways to get by without spending money on new hardware. And both AMD and Nvidia will be forced to change to maintain a revenue stream.

... so despite the fact I own Nvidia cards I sure hope AMD can get their act together!

April 3, 2013 | 04:05 PM - Posted by Zarich (not verified)

I enjoyed this article. However, after watching all the comparison videos I learned one thing. ATI and Nvidia do not have the same colors. So Out of the box which card is producing the most accurate colors?

April 3, 2013 | 05:45 PM - Posted by Cpt. Obvious (not verified)

Why do the graphs for "FRAPS FPS" and "Observed FPS" match exactly except for Crossfire?!

There should be at least little differences between these graphs.

You already said, that tests with FCAT and FRAPS weren't made at the same time.

So what would be the explanation for it?

The graphs match exactly.

If you measure the same scene twice, the results wont be exactly the same.

So you let the bench run, measure with FRAPS.
You let the bench run and measure with FCAT.
The results are EXACTLY the same except for crossfire.

There has to be something wrong.

April 3, 2013 | 07:09 PM - Posted by ThorAxe

The only thing wrong is the borked Crossfire. The single GPUs and SLI are doing what they should.

April 4, 2013 | 12:24 AM - Posted by Cpt. Obvious (not verified)

Read. -> Think. -> Answer.

The results can't be exactly the same.
2 Benches, 2 different tools that give different data,
but 1 result? I don't think so.

April 4, 2013 | 06:22 AM - Posted by ThorAxe

Maybe you should re-read what has been written or at least look at the definition of observed frame-rate.

Observed Frame Rates will, indeed must, show identical frame rates to FRAPS if all frames are defined as useful. That means there were no dropped or runt frames in the benchmark.

The tools use the same data. Only discarding useless frames would result in a variance. If there are no useless frames then there should not be a variance.

April 4, 2013 | 08:40 AM - Posted by Cpt. Obvious (not verified)

So if you bench the same scene two times, the results will be the same? Interesting.

April 4, 2013 | 10:31 AM - Posted by Tri Wahyudianto (not verified)

Completely no; this is what's called "blinding and randomization" in journals — health journals, social journals, and others.

We know the same scene never gives the same results across two test runs on any graphics card.

But since that applies to every graphics card being tested, every card has the same chance of a bad result or a good result.

In the end, we call that a fair comparison.

April 4, 2013 | 08:23 PM - Posted by ThorAxe

And there is your problem. The benchmark is only run once.

In the test run FRAPS displays all Frames rendered regardless of size. The video output is then captured via the capture card and the raw AVI file is analyzed. If there are no runts or drops (as is the case with the single GPUs tested) then the Observed Frame Rate must be identical to the FRAPs results.

April 5, 2013 | 12:46 AM - Posted by Cpt. Obvious (not verified)

Wrong. I already asked. Ryans answer:

April 3, 2013 | 11:15 AM - Posted by Ryan Shrout
No, FRAPS was not running at the same time as the overlay and capture testing.

I am not trolling, I know that AMD has bigger issues with MGPU than Nvidia.
The only thing I am saying:
Something has to be wrong with the graphs, as there always will be slight differences between two bench runs.

April 5, 2013 | 07:47 AM - Posted by ThorAxe

If that is the case then there would normally be a slight variation.

I don't know why it would need to be run separately, though I think the podcast mentioned issues with FRAPS and the overlay being used at the same time.

April 5, 2013 | 12:58 PM - Posted by KittenMasher (not verified)

You're right, there should be at least slight differences between the two runs. However, that doesn't mean those differences when displayed on a graph that is made for human consumption are going to be noticeable. In order for the graphs to be completely accurate, the 90 seconds or so of benchmark would have to be sampled at every frame. In at least one of the cases that's about 150fps, which means the graph would have to have a width of about 810,000 pixels. You may have noticed that the graphs on pcper are a bit smaller than that.


April 4, 2013 | 01:08 AM - Posted by Tri Wahyudianto (not verified)

Thanks for the objective and numberical article despite many critics and subjectivity of the reader

well done ryan

thumbs up pcper.com

April 4, 2013 | 01:54 AM - Posted by sLiCeR- (not verified)

I would very much like to read a quick analysis of previous generations crossfire setup (ie: 2x 5850/5870), to try to understand if these issues are only related to the drivers / GCN arch or the crossfire technology itself.

And of course, AMD should have the ability to discuss these results with pcper.com and eventually revisit these tests with the *future* drivers that will address these issues.

Regardless of the impacts this kind of articles have, it's known today that there are issues with AMD drivers. Even if there are some errors with some tests, if were not for them we wouldn't even be looking at this. So.. well done pcper.com, but don't leave it at this bombshell ;)

cheers.

April 4, 2013 | 06:18 AM - Posted by AlienAndy (not verified)

Thank you for taking your time to do this Ryan. I used to use Crossfire but it never quite felt like what was being displayed. What I mean is when I ran 5770 CF it always seemed jerky and stuttered even though the readings I was seeing seemed pretty high (as it should have been, just ahead of a 5870).

I swore off of Crossfire after that and just concluded it was down to poor drivers. I'm now using GTX 670 in SLI and I have to say it's been fantastic. No stuttering and pretty smooth. I have had the odd issue waiting for drivers (Far Cry 3) but other than that it seems to work great.

Funny really. Crossfire always felt rather crude to me. Would also love to see some 5xxx results !!

April 4, 2013 | 07:22 AM - Posted by AlienAndy (not verified)

BTW could you also test 3dmark11 please?
