Review Index:

Frame Rating: GeForce GTX 660 Ti and Radeon HD 7950

Battlefield 3

Battlefield 3 (DirectX 11)


Battlefield 3™ leaps ahead of its time with the power of Frostbite 2, DICE's new cutting-edge game engine. This state-of-the-art technology is the foundation on which Battlefield 3 is built, delivering enhanced visual quality, a grand sense of scale, massive destruction, dynamic audio and character animation utilizing ANT technology as seen in the latest EA SPORTS™ games.

Frostbite 2 now enables deferred shading, dynamic global illumination and new streaming architecture. Sounds like tech talk? Play the game and experience the difference!

Our Settings for Battlefield 3

Here is our testing run through the game, for your reference.

In our FRAPS data you can see that the HD 7950s in CrossFire appear to be faster than the GTX 660 Tis in SLI, but after taking out runts our observed average frame rates are much lower, more in line with those of the single Radeon HD 7950.
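The runt-removal step can be sketched in a few lines. This is a minimal illustration only, assuming each captured frame carries the number of scan lines it actually occupied on screen; the 21-scan-line threshold and the function name are our own choices, not the exact tooling used for these charts:

```python
# Recompute an "observed" frame rate after discarding runt frames.
# A runt here is any frame that occupied fewer than RUNT_SCANLINES of the
# 1080 scan lines in a 1920x1080 capture (threshold is illustrative).
RUNT_SCANLINES = 21

def observed_fps(frames, total_seconds):
    """frames: list of (frame_id, scanlines_shown) from a capture card."""
    shown = [f for f in frames if f[1] >= RUNT_SCANLINES]
    return len(shown) / total_seconds

# Example: 120 captured frames in one second, every other frame a runt.
frames = [(i, 540 if i % 2 == 0 else 8) for i in range(120)]
print(observed_fps(frames, 1.0))  # 60.0 - only the full frames count
```

FRAPS would report 120 FPS for that capture; the observed rate is half that, which is the gap the charts above are showing.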

Our now familiar frame time plot demonstrates the problem with AMD's CrossFire technology and runt frames - frames taking up such a small area of the screen that they adversely affect animation performance.  While both single-GPU results look quite good, the HD 7950s running in parallel produce the mass of orange alternating long/short frames.  NVIDIA's GTX 660 Tis in SLI show slightly more variance than the single-GPU options but appear to be within reason.

The minimum FPS data shows the GTX 660 Tis in SLI running well ahead of any other option with an average frame rate of about 122 FPS, while the rest hover in the 70 FPS range.
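The idea behind a percentile-based "minimum FPS" chart can be sketched like this - an illustrative calculation, not necessarily the exact method used for these charts:

```python
# "Minimum FPS" percentile: sort frame times, take the frame time at a given
# percentile, and convert it to an instantaneous FPS figure.
def fps_at_percentile(frame_times_ms, pct):
    ordered = sorted(frame_times_ms)  # fastest to slowest
    idx = min(int(len(ordered) * pct / 100), len(ordered) - 1)
    return 1000.0 / ordered[idx]

# 95% of frames at ~8.2 ms (~122 FPS), 5% at ~14.3 ms (~70 FPS).
times = [8.2] * 95 + [14.3] * 5
print(round(fps_at_percentile(times, 50)))  # 122 - the typical frame
print(round(fps_at_percentile(times, 99)))  # 70 - the near-worst case
```

The tail of the curve (the high percentiles) is what "minimum FPS" really measures: how slow the worst frames get, rather than how fast the average is.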

The GTX 660 Ti cards in SLI definitely exhibit more frame time variance than the single GPUs, but the Radeon HD 7950s in CrossFire skew things dramatically.  While this data doesn't definitively detect stutter, it is a good indication that you will see some with results like these.
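A simple way to quantify this kind of variance is the difference between consecutive frame times; alternating long/short frames score badly even when the average frame rate looks fine. A rough sketch using our own simplified metric:

```python
# Frame-to-frame variance: absolute difference between consecutive frame
# times. Runt-heavy output alternates long and short frames, producing
# large swings even at a healthy-looking average frame rate.
def frame_variance(frame_times_ms):
    return [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]

smooth = [16.7, 16.6, 16.8, 16.7]  # consistent ~60 FPS pacing
runty  = [30.0, 2.0, 31.0, 1.5]    # long/short alternation, same average
print(max(frame_variance(smooth)))  # well under 1 ms
print(max(frame_variance(runty)))   # nearly 30 ms of swing
```

Both sequences average roughly 16 ms per frame, but only the first will animate smoothly - which is exactly why the variance charts tell a different story than the FPS charts.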


Again the Radeon HD 7950s look great in CrossFire at the outset, ahead of the GTX 660 Ti cards in SLI.  But after removing those dreaded runt frames, the "real world" performance of the GTX 660 Ti cards is much better.

The Radeon HD 7950 combo actually looks worse at 2560x1440, with more variance in a frame-to-frame comparison, whereas the NVIDIA GTX 660 Ti cards appear tighter, resulting in a better multi-GPU experience.

Our minimum FPS percentile data shows great results from both the single GeForce GTX 660 Ti and the Radeon HD 7950, but adding a second HD 7950 doesn't change much for AMD; only the GTX 660 Ti is scaling.

Again at 2560x1440 we see the frame variance ratings are much higher with the HD 7950s in CrossFire.


Eyefinity with CrossFire continues to be an even worse problem for AMD: as we can see here, there are tons of not only runts but flat-out dropped frames (never shown at all) that bring the observed frame rate down considerably.
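Conceptually, each captured frame falls into one of three buckets. A toy classifier, with illustrative thresholds of our own choosing:

```python
from collections import Counter

# Classify captured frames: dropped (never shown at all), runt (only a
# sliver of the screen), or full. The 21-scan-line runt threshold is
# illustrative, not the review's exact value.
def classify(scanlines, runt_threshold=21):
    if scanlines == 0:
        return "dropped"
    return "runt" if scanlines < runt_threshold else "full"

# Hypothetical capture: some full frames, some drops, one runt.
capture = [540, 0, 8, 540, 0, 540]
print(Counter(classify(s) for s in capture))
```

Only the "full" bucket contributes to the observed frame rate, which is why Eyefinity results with many drops fall so far below what FRAPS reports.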

Ech, a mess of color here.  The blue lines tell us that the 2GB versions of the GTX 660 Ti are struggling with 5760x1080 in SLI mode, though they are reliably showing frames to the gamer. Single-GPU results (though kind of cut off at the top, sorry!) are much smoother, with the single HD 7950 definitely faster than the single GTX 660 Ti.

The minimum FPS chart shows the single HD 7950 ahead of the GTX 660 Ti, as we expected based on the plot of frame times above (22 FPS average against 27 FPS average).  But while CrossFire sees basically no observed frame rate increase, the SLI result jumps up to 38 FPS or so at the 50th percentile.  Note, though, that SLI in this case does follow a curve that brings the frame rates back down in line with a single HD 7950 at the tail end.

Our frame time variance chart has really interesting things to tell us, starting with the expected "AMD CrossFire isn't working that well" story.  AMD's CrossFire clearly sees the largest frame time differences, though NVIDIA's SLI isn't immune here, with frame variances crossing the 15 ms mark in some fringe cases.  To put that in perspective, though: CrossFire HD 7950s see variances larger than that for more than 20% of all frames rendered!
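That 20% figure is simply the share of consecutive-frame differences above a threshold. Sketched below - illustrative code, not the review's actual tooling:

```python
# Share of frame transitions whose frame-to-frame time difference exceeds
# a threshold, e.g. the 15 ms mark discussed above.
def pct_over_threshold(frame_times_ms, threshold_ms=15.0):
    diffs = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    over = sum(1 for d in diffs if d > threshold_ms)
    return 100.0 * over / len(diffs)

# Alternating 28 ms / 4 ms frames: every transition swings by 24 ms.
times = [28.0, 4.0] * 10
print(pct_over_threshold(times))  # 100.0
```

A single-GPU run with steady pacing would score near zero here; the CrossFire results above land past the 20% mark.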


April 4, 2013 | 10:27 AM - Posted by Anonymous (not verified)

Results are mostly within the margin of error. The only factor making this possible is if the operating system is causing it, as I posted in my big post; the operating system is the cause, probably unknowingly.

April 4, 2013 | 01:51 PM - Posted by FenceMan (not verified)

Not sure I grasp the whole methodology. If it looks good to me, what is the problem? I have (2) 7950's in CFX mode and I do not see any 'stutter' or problem, so why should it matter if FRAPS tells me 100 fps in Battlefield 3 and this article tells me it's really only 50? Seems to me this is extreme nitpicking of something that isn't even visible to the naked eye, and quite irresponsible to make AMD out to be so bad.

April 13, 2013 | 07:48 PM - Posted by Anonymous (not verified)

Yet someone like Kyle at HardOCP has said in the Far Cry 3 review that there is no doubt nVidia is smoother, and in the earlier SLI vs CF piece, that EVERY SINGLE TESTER AND REVIEWER THERE NOTICES SLI IS MUCH SMOOTHER THAN CROSSFIRE, for the past number of years.

He also says it's true some people (slow eyes, slow mind, poor vision, crap monitor, low resolution ) "can't" see the differences.

Oh well, sorry you drew a lousy deck. Some cards drawn are not as good as others.

On the other hand, with them side by side, you'd probably not be a blind, doubting amd fanboy anymore, even if you still claimed you were verbally.

April 4, 2013 | 01:54 PM - Posted by FenceMan (not verified)

Can't quite call this "observed frame rate" if you need a capture card and slow motion video to "observe" it??

April 13, 2013 | 07:49 PM - Posted by Anonymous (not verified)

Yes, you don't, thousands of people report on it daily.

Others who sold their souls to the amd red devil can't see it.

April 4, 2013 | 03:07 PM - Posted by Solix (not verified)

<- Me this round for my rig

I generally have good luck with Nvidia drivers, okay two mid-range cards, let's see... 192 bit memory bandwidth on your mid-range card?! argh, no thanks I like anti aliasing. Open CL results are... wow okay. Let's check out ATI...

Open CL is good, I have a dedicated PhysX card, heat and efficiency are... aw nuts. well okay I can undervolt a bit, fine. Let's get some CF 7950s... one week later and a good deal more broke, sees this article. Yargh... Oh hey... look an Nvidia workstation GPU for gaming(Titan right?), maybe compute tasks will... oh nevermind, not so good and 1k, ouch. No bitcoin mining for me.

Maybe I should just get a single GPU? Wait... I just bought three monitors for surround gaming.

I think I'll go downstairs and play a console game to relax.

April 4, 2013 | 05:52 PM - Posted by db87


I would like to know if the 'runt fps' are indeed 100% completely useless for the gaming experience. So let's say we take a test person, put him behind a game running 55fps with 'runts' (read: observed fps is lower), and then put him behind the same game but with the fps capped at the observed fps value. I'd like to know what the test person experiences under real gaming.

How is the mouse movement? Does it feel smooth? Which scenario does the test person think gives the best gaming experience, and so on...

So essentially, are the runt frames so annoying that you are better off with way lower fps? If so, by what margin? For example I can imagine that 60fps with 'runts' is still better than a 100% solid and consistent 30 fps.

April 13, 2013 | 07:55 PM - Posted by Anonymous (not verified)

Since a normal screen resolution means 1080 horizontal scan lines, how do you feel about 7 or 8 scan lines being called a "valuable frame" for AMD ?
Let's check it with something called logic, since the clueless are here everywhere.

154 runts needed to make a single full screen frame pic JUST ONCE

So in that case, 154 fps give you an effective frame rate of 1 fps. LOL How valuable is 1/154th of the screen in a sliver across it ? That's a runt, a drop is ZERO VISUAL VALUE period.

How about 21 scan lines ? Is that "valuable" ?

1080/21 = 54 of them to make just one full screen. LOL

So, AMD counts 54fps, and you get one screen of data - yes I'd say that's a very valuable way for amd to cheat like heck on steroids.

April 4, 2013 | 08:15 PM - Posted by Anon (not verified)

As a competitive gamer, you WANT the frames to start before the previous ones finished. Nvidia is introducing latency with their frame metering crap.....

-coming from a 660ti owner.

April 4, 2013 | 09:14 PM - Posted by icebug

I can't believe the amount of trolling over this article. I consider myself loosely an AMD fanboy when it comes to video cards. I don't really have an issue recommending nVidia cards to friends if that is where their pricing falls, but I personally run AMD. I don't see anything malicious about these articles and I find them very interesting and detailed. It's not going to stop me from buying a 7950 within the next month to replace my 5850 (which, let's be honest, is still pretty darn good for today's games). I will just be happy that AMD knows about the issue and is working on a solution, so I can get a nice fully functional Crossfire set-up later when they drop in price.

April 5, 2013 | 03:04 AM - Posted by thinkbiggar (not verified)

OK, AMD admits an issue, then stop showing the comparisons with crossfire. What is the point in reviewing the same issues time and time again like something is suddenly going to change as we hit a different price point? This isn't a comparison at this point. If what you say is true then post this as a buyer beware when reviewing AMD cards about crossfire. What you're doing here is the same as reviewing an nVidia card and mentioning in every article that AMD crossfire is broken.

If you want a single GPU card, no issue. If you want multi-GPU, go with nVidia at this point or move up to the next higher single-GPU AMD. But remember, with CrossfireX when this fix is in place you can continue to boost your performance by adding the next gen AMD. All nVidia offers is PhysX.

April 5, 2013 | 07:05 AM - Posted by Mac (not verified)

I agree with all of that and would also like to suggest PCper explores the 3rd party apps like radeon pro that fix a lot of these issues. Much more useful than telling us Xfire is broken over and over again

April 5, 2013 | 11:17 AM - Posted by bystander (not verified)

I wouldn't mind seeing 3-way CF/SLI, as an article a while back by THG found 3-way CF helped.

I would like seeing RadeonPro, but realize that the fixes all include FPS limiting, which obviously isn't ideal.

I would also like to see different ranges of runt frames. 20 pixels high is 2% of your screen. That is really small. You could fairly increase that number some.

April 5, 2013 | 11:52 AM - Posted by Trey Long (not verified)

The exposure of Crossfire problems will force AMD to fix it. HOW IS THIS A BAD THING?

April 5, 2013 | 02:03 PM - Posted by FenceMan (not verified)

The way this is a bad thing:

1. I own a 7950 Crossfire setup which looks perfect to me (I use RadeonPro), and my initial reaction was that I should sell them and buy Nvidia; imagine how this could impact sales to people just considering AMD. I think the results, while quite accurate in and of themselves, should be better quantified in the real world (let's be honest, can anyone at PCPer really tell the difference between AMD and Nvidia in a blind test?).

2. The graphs completely misrepresent what is happening on the screen. If you look at these graphs you imagine this stuttering mess on the screen, the truth is nothing close to that (as I said I can watch my rig in the real world).

3. It is completely beating a dead horse. We get it, Crossfire has "major issues" when you enhance and slow down the frames to a crawl (which has zero bearing on how it looks in the real world); no reason to post the same results for each Crossfire compatible card, we already know what they are.

I am no AMD fanboy, I want all of the info I can get so next time I drop $1,000.00 on video cards I have all of the available info, I just think the info should be presented in a more realistic form, to me this is all quite inconsequential to real world gaming performance.

April 5, 2013 | 02:29 PM - Posted by bystander (not verified)

The problem isn't that you don't get playable frames in crossfire. It is that you aren't gaining full benefit from crossfire. When choosing between Crossfire and SLI, you want to know which will help you more so you can plan ahead.

And if you use v-sync, the limited testing has shown it helps or fixes the issue (only tested on 2 games so far, it could be the FPS limiting aspect that helps, and not v-sync itself).

April 5, 2013 | 02:33 PM - Posted by FenceMan (not verified)

"The problem isn't that you don't get playable frames in crossfire. It is that you aren't gaining full benefit from crossfire."

That right there to me is an accurate and thoughtful representation of what is going on here. That would have been a much more accurate way to present this information instead of the current "AMD Crossfire is a stuttering mess" message.

April 13, 2013 | 08:03 PM - Posted by Anonymous (not verified)

I have been on every review website there is and was for well over a decade, and argued with every fanboy of every type - and NEVER has a single amd fan or otherwise EVER MENTIONED "radeon pro"...

So who cares if NOW, you use it. For YEARS not a single amd fanboy CF user was using it, and if they were THEY LIED AND KEPT IT A SECRET BECAUSE THEY KNEW AMD SUCKS WITHOUT IT.

So who really cares at this point what lying hidden crutch amd needs for a "quicky bandaid patch" for it's crapster crap crossfire epic failure ?

It's a big fat excuse for YEARS of total loserville and lies.


What we need now is a 30fps chart and 60 fps chart with crap amd cards listed running vsync and radeon pro and the chart can tell us which pieces of crap can do 30 and 60 fps and we can THROW OUT EVERY OTHER FRAME RATE EVER LISTED FOR THE CRAP AMD CARDS.

April 7, 2013 | 01:25 PM - Posted by Anonymous (not verified)

Interesting test, if what you have tested is valid, AMD is in deep-sh*t once again. If not, there is some kind of money transaction going on...

Nonetheless, thanks for all your efforts.

April 8, 2013 | 01:23 AM - Posted by PClover (not verified)

Quick Google "geforce frame metering" and you will find out why the nVidia cards rarely have runt frames. In fact, nVidia cards DO have them. They just delay those frames a bit to match the other good frames' speed, so the frame time chart miraculously looks good. And you, the user, will have to deal with input delay.

For me, an AMD card and a 3rd party FPS cap software is the best. No input lag, no stuttering. And the image quality from AMD is always superior.

April 11, 2013 | 07:14 PM - Posted by praack

wow, it's spot on data, tough to take in, especially if you purchased Radeons since the beginning, but it is true.

but GOD People - stop with the flaming!!!.

AMD will have to do a hardware re-do to fix this, i don't really expect anything to come out for a year. i don't really think that anyone misled on purpose - this is all new testing data.

AMD still makes an ok card- but the issue is with multiple cards.

answer- don't buy multiple lower cost cards with AMD

April 13, 2013 | 03:45 AM - Posted by Anonymous (not verified)

I wish you'd start testing with newer drivers. They have 314.22's out, had 314.21 out, and even a 314.07 all in WHQL, but you're still on beta .07's? I think they even had a 314.14 whql in there.

Many games have been optimized since then. Great data, just want later Nv drivers as you're a few behind at best.
15% for 314.07 vs. 314.21 in tomb raider, others show basically the same and in some cases more and this was on an old 9600GT card...LOL. Who knew?

June 27, 2014 | 04:58 PM - Posted by Anonymous (not verified)

Sorry for the huge review, computer repair in schaumburg but I'm really loving the new Zune, and hope this, as well as the excellent reviews some other people have written, will help you decide if it's the right choice for you.
