Frame Rating: GeForce GTX 660 Ti and Radeon HD 7950

Summary and Conclusions

Because of the complexity and sheer amount of data we have gathered using our Frame Rating performance methodology, we are breaking it up into several articles that each feature different GPU comparisons.  Here is the schedule:

 

Now that we are a few articles into this Frame Rating series, it is important to step back and evaluate these results, compare them to previous ones, and look for trends, patterns and anything else that stands out. We are hoping to find some answers as to WHY AMD's CrossFire and Eyefinity configurations are having so many problems, so that AMD can address them sooner rather than later.

 

Performance

Much like the HD 7970 and GTX 680 results we launched Frame Rating with, the AMD card here does well against the GTX 660 Ti in terms of single-GPU performance. Both NVIDIA and AMD are doing great with single-GPU frame rates, frame times and frame time variance, especially in our 1920x1080 and 2560x1440 testing. Since the release of the GTX 680, and after it the Radeon HD 7970 GHz Edition, AMD has been careful to position its products on price so that they offer better performance per dollar than NVIDIA. That was the case when we tested with FRAPS exclusively, and it continues to be the case with our new capture-based method, Frame Rating.
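
For readers who want to poke at their own numbers, here is a minimal sketch of the kind of math behind those three metrics, assuming you already have a list of per-frame timestamps in milliseconds; the variance measure shown is an illustrative stand-in, not necessarily the exact statistic behind our charts.

```python
# Minimal sketch: frame rate, frame time, and a simple frame-to-frame variance
# measure from per-frame timestamps (ms). Illustrative only; the exact variance
# statistic used in the Frame Rating charts may differ.
from statistics import mean

def frame_metrics(timestamps_ms):
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg_frame_time = mean(frame_times)
    # How much each frame time differs from the one before it: a steady run
    # scores near zero even if the average frame rate is identical.
    variance = mean(abs(b - a) for a, b in zip(frame_times, frame_times[1:]))
    return 1000.0 / avg_frame_time, avg_frame_time, variance

# A steady 60 FPS run versus one alternating short and long frames:
steady = [i * 16.7 for i in range(61)]
uneven = [0.0]
for i in range(60):
    uneven.append(uneven[-1] + (8.0 if i % 2 == 0 else 25.4))

print(frame_metrics(steady))   # ~60 FPS, variance near zero
print(frame_metrics(uneven))   # same ~60 FPS average, large variance
```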


The Radeon HD 7950 3GB (with Boost) is faster than the GeForce GTX 660 Ti in basically all six of the games we tested today. Battlefield 3, Crysis 3, DiRT 3, Sleeping Dogs, Skyrim and Far Cry 3 all showed better frame rates and frame times when run on the HD 7950 at single-monitor resolutions, though in many cases the GTX 660 Ti was close; very close. In multi-monitor testing (5760x1080), the 3GB frame buffer of the HD 7950 seemed to stretch out the lead AMD had in the single-card results, particularly in games like Skyrim and Battlefield 3. Having 50% more memory at that kind of resolution is definitely an advantage that AMD is holding on to.

The issue of multi-GPU gaming on AMD cropped up again with CrossFire: a pair of HD 7950s consistently produced runt frames and dropped frames that resulted in lower observed frame rates than the GTX 660 Ti in SLI. In fact, it occurred more often here with the HD 7950s than it did with the HD 7970s; both DiRT 3 and Skyrim had problems in today's review that did not appear in the initial launch article. This tells me that as GPU performance goes down (as we step down the product stack), the GPU bottleneck is going to cause more of these problems, not fewer.
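
To make the "observed frame rate" idea concrete, here is a minimal sketch of how runts and drops might be filtered out of a capture, assuming per-frame on-screen heights (in scan lines) have already been extracted from the colored overlay; the 21-line runt threshold is an illustrative assumption, not our exact criterion.

```python
# Minimal sketch: filter runt and dropped frames out of a 1080-line capture to
# get an "observed" frame rate. The heights list and the runt threshold are
# assumptions for illustration, not the actual extraction pipeline.
RUNT_THRESHOLD = 21   # frames shorter than this many scan lines count as runts

def observed_fps(frame_heights, capture_seconds):
    dropped = sum(1 for h in frame_heights if h == 0)
    runts = sum(1 for h in frame_heights if 0 < h < RUNT_THRESHOLD)
    useful = len(frame_heights) - dropped - runts
    return {
        "fraps_style_fps": len(frame_heights) / capture_seconds,  # counts every frame
        "observed_fps": useful / capture_seconds,                 # counts only visible frames
        "runts": runts,
        "dropped": dropped,
    }

# One second of capture where every other frame shows up as an 8-line sliver:
print(observed_fps([540, 8] * 30, capture_seconds=1.0))
# -> 60 "FPS" the FRAPS way, but only ~30 FPS actually observed on screen.
```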


This doesn't paint a very good picture for the Radeon HD 7870 and HD 7850 with multiple graphics cards if the pattern holds. 

(Just a reminder as to why the GeForce GTX 670 was left out of this article: the GeForce GTX 660 Ti 2GB ($289) and the Radeon HD 7950 3GB ($299) are much closer in price. The GeForce GTX 670 ($359) will be included in a future Frame Rating article.)

 

Final Thoughts

There isn't much left to say here but to reiterate what I have written in the last two articles: AMD presently has the better hardware, in terms of raw performance, when looking at single-GPU options (though that is debatable with the inclusion of the GeForce GTX Titan at the high end). But gamers who are using, or thinking of using, multiple GPUs will without a doubt find the CrossFire issues discussed here troublesome.


AMD is aware of the issues, and has been since before the release of our first full Frame Rating article, and is trying to figure out an answer for gamers. The current line of thinking is that they will release a driver some time this summer that gives users the option of enabling frame metering or not, rather than forcing it on as NVIDIA does today. Whether that will work out as well for AMD as it currently does for NVIDIA remains to be seen.
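
For the curious, the idea behind frame metering can be sketched in a few lines: hold back frames that arrive too soon so that the gap between presents tracks a short running average. This is only an illustration of the concept under those assumptions, not NVIDIA's actual driver logic, and the smoothing comes at the cost of a little added latency.

```python
# Minimal sketch of the frame metering concept: delay a frame that arrives
# early so present-to-present gaps track a short running average.
# Conceptual only; not NVIDIA's driver implementation.
import time
from collections import deque

class FrameMeter:
    def __init__(self, history=8):
        self.gaps = deque(maxlen=history)   # recent present-to-present gaps (seconds)
        self.last = None

    def present(self):
        now = time.monotonic()
        if self.last is not None:
            gap = now - self.last
            if self.gaps:
                target = sum(self.gaps) / len(self.gaps)
                if gap < target:
                    time.sleep(target - gap)   # the metering delay: smoother cadence, added latency
                    now = time.monotonic()
                    gap = now - self.last
            self.gaps.append(gap)
        self.last = now
        # ...the actual buffer swap would happen here...
```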

I welcome your thoughts and comments below!!

April 4, 2013 | 07:27 AM - Posted by Anonymous (not verified)

Results are mostly within the margin of error. The only factor that could make this possible is if the operating system is causing the issue. Like I posted in my big post, the operating system is the cause, probably unknowingly.

April 4, 2013 | 10:51 AM - Posted by FenceMan (not verified)

Not sure I grasp the whole methodology. If it looks good to me, what is the problem? I have (2) 7950's in CFX mode and I do not see any 'stutter' or problem, so why should it matter if FRAPS tells me 100 fps in Battlefield 3 and this article tells me it's really only 50? Seems to me this is extreme nitpicking of something that isn't even visible to the naked eye, and quite irresponsible to make AMD out to be so bad.

April 13, 2013 | 04:48 PM - Posted by Anonymous (not verified)

Yet someone like Kyle at HardOCP has said in the Far Cry 3 review that there is no doubt nVidia is smoother, and in the earlier SLI vs CF piece that EVERY SINGLE TESTER AND REVIEWER THERE NOTICES SLI IS MUCH SMOOTHER THAN CROSSFIRE, and has for the past number of years.

He also says it's true some people (slow eyes, slow mind, poor vision, crap monitor, low resolution ) "can't" see the differences.

Oh well, sorry you drew a lousy deck. Some cards drawn are not as good as others.

On the other hand, with them side by side, you'd probably not be a blind, doubting amd fanboy anymore, even if you still claimed you were verbally.

April 4, 2013 | 10:54 AM - Posted by FenceMan (not verified)

Can't quite call this "observed frame rate" if you need a capture card and slow-motion video to "observe" it, can you?

April 13, 2013 | 04:49 PM - Posted by Anonymous (not verified)

Yes, you don't need one; thousands of people report on it daily.

Others who sold their souls to the amd red devil can't see it.

April 4, 2013 | 12:07 PM - Posted by Solix (not verified)

<- Me this round for my rig

I generally have good luck with Nvidia drivers, so okay, two mid-range cards, let's see... a 192-bit memory bus on your mid-range card?! Argh, no thanks, I like anti-aliasing. OpenCL results are... wow, okay. Let's check out ATI...

OpenCL is good, I have a dedicated PhysX card, heat and efficiency are... aw nuts. Well okay, I can undervolt a bit, fine. Let's get some CF 7950s... one week later and a good deal more broke, sees this article. Yargh... Oh hey, look, an Nvidia workstation GPU for gaming (Titan, right?), maybe compute tasks will... oh never mind, not so good and $1k, ouch. No Bitcoin mining for me.

Maybe I should just get a single GPU? Wait... I just bought three monitors for surround gaming.

I think I'll go downstairs and play a console game to relax.

April 4, 2013 | 02:52 PM - Posted by db87

Ryan,

I would like to know if the 'runt fps' are indeed 100% completely useless for the gaming experience. So let's say you take a test person, put him behind a game running 55 fps with 'runts' (read: the observed fps is lower), and then put him behind the same game but with the fps capped at the observed fps value. I'd like to know what the test person experiences under real gaming.

How is the mouse movement? Does it feel smooth? Which scenario does the test person think gives the best gaming experience, and so on...

So essentially, are the runt frames so annoying that you are better off with a much lower fps? If so, by what margin? For example, I can imagine that 60 fps with 'runts' is still better than a 100% solid and consistent 30 fps.

April 13, 2013 | 04:55 PM - Posted by Anonymous (not verified)

Since a normal screen resolution means 1080 horizontal scan lines, how do you feel about 7 or 8 scan lines being called a "valuable frame" for AMD ?
Let's check it with something called logic, since the clueless are here everywhere.

1080 / 7 ≈ 154 runts needed to make a single full-screen frame JUST ONCE.

So in that case, 154 fps gives you an effective frame rate of 1 fps. LOL. How valuable is 1/154th of the screen in a sliver across it? That's a runt; a drop has ZERO VISUAL VALUE, period.

How about 21 scan lines ? Is that "valuable" ?

1080 / 21 ≈ 51 of them to make just one full screen. LOL

So, AMD counts ~51 fps, and you get one screen of data - yes, I'd say that's a very valuable way for amd to cheat like heck on steroids.

April 4, 2013 | 05:15 PM - Posted by Anon (not verified)

As a competitive gamer, you WANT frames to start before the previous ones have finished. Nvidia is introducing latency with their frame metering crap.....

-coming from a 660ti owner.

April 4, 2013 | 06:14 PM - Posted by icebug

I can't believe the amount of trolling over this article. I consider myself loosely an AMD fanboy when it comes to video cards. I don't really have an issue recommending nVidia cards to friends if that is where their pricing falls, but I personally run AMD. I don't see anything malicious about these articles and I find them very interesting and detailed. It's not going to stop me from buying a 7950 within the next month to replace my 5850 (which, let's be honest, is still pretty darn good for today's games). I will just be happy that AMD knows about the issue and is working on a solution, so I can get a nice, fully functional CrossFire setup later when they drop in price.

April 5, 2013 | 12:04 AM - Posted by thinkbiggar (not verified)

OK, AMD admits an issue, so stop showing the comparisons with CrossFire. What is the point in reviewing the same issues time and time again, as if something is suddenly going to change as we hit a different price point? This isn't a comparison at this point. If what you say is true, then post this as a buyer-beware note when reviewing AMD cards for CrossFire. What you're doing here is the same as reviewing an nVidia card and mentioning in every article that AMD CrossFire is broken.

If you want a single-GPU card, no issue. If you want multi-GPU, go with nVidia at this point, or move up to the next higher single-GPU AMD. But remember, with CrossFireX, once this fix is in place you can continue to boost your performance by adding the next-gen AMD. All nVidia offers is PhysX.

April 5, 2013 | 04:05 AM - Posted by Mac (not verified)

I agree with all of that and would also like to suggest PCper explores the 3rd party apps like radeon pro that fix a lot of these issues. Much more useful than telling us Xfire is broken over and over again

April 5, 2013 | 08:17 AM - Posted by bystander (not verified)

I wouldn't mind seeing 3-way CF/SLI, as an article a while back by THG found 3-way CF helped.

I would like to see RadeonPro tested, but realize that the fixes all include FPS limiting, which obviously isn't ideal.

I would also like to see different ranges of runt frames. 20 pixels high is 2% of your screen. That is really small. You could fairly increase that number some.

April 5, 2013 | 08:52 AM - Posted by Trey Long (not verified)

The exposure of Crossfire problems will force AMD to fix it. HOW IS THIS A BAD THING?

April 5, 2013 | 11:03 AM - Posted by FenceMan (not verified)

The way this is a bad thing:

1. I own a 7950 CrossFire setup which looks perfect to me (I use RadeonPro), and my initial reaction was that I should sell them and buy Nvidia; imagine how this could impact sales to people just considering AMD. I think the results, while quite accurate in and of themselves, should be better quantified in the real world (let's be honest, can anyone at PCPer in a blind test really tell the difference between AMD and Nvidia?).

2. The graphs completely misrepresent what is happening on the screen. If you look at these graphs you imagine a stuttering mess on the screen; the truth is nothing close to that (as I said, I can watch my rig in the real world).

3. It is completely beating a dead horse. We get it: CrossFire has "major issues" when you enhance and slow the frames down to a crawl (which has zero bearing on how it looks in the real world). There is no reason to post the same results for each CrossFire-compatible card; we already know what they are.

I am no AMD fanboy, and I want all the info I can get so that next time I drop $1,000.00 on video cards I have everything available; I just think the info should be presented in a more realistic form. To me this is all quite inconsequential to real-world gaming performance.

April 5, 2013 | 11:29 AM - Posted by bystander (not verified)

The problem isn't that you don't get playable frames in crossfire. It is that you aren't gaining full benefit from crossfire. When choosing between CrossFire and SLI, that tells you which will help you more, so you can plan ahead.

And if you use v-sync, the limited testing has shown it helps or fixes the issue (only tested on 2 games so far, it could be the FPS limiting aspect that helps, and not v-sync itself).

April 5, 2013 | 11:33 AM - Posted by FenceMan (not verified)

"The problem isn't that you don't get playable frames in crossfire. It is that you aren't gaining full benefit from crossfire."

That right there to me is an accurate and thoughtful representation of what is going on here. That would have been a much more accurate way to present this information instead of the current "AMD Crossfire is a stuttering mess" message.

April 13, 2013 | 05:03 PM - Posted by Anonymous (not verified)

I have been on every review website there is and was for well over a decade, and argued with every fanboy of every type - and NEVER has a single amd fan or otherwise EVER MENTIONED "radeon pro"...

So who cares if NOW, you use it. For YEARS not a single amd fanboy CF user was using it, and if they were THEY LIED AND KEPT IT A SECRET BECAUSE THEY KNEW AMD SUCKS WITHOUT IT.

So who really cares at this point what lying hidden crutch amd needs for a "quicky bandaid patch" for its crapster crap crossfire epic failure?

It's a big fat excuse for YEARS of total loserville and lies.

I'll also mention once v-sync and radeon pro hack the frame rate down to 30 or 60, YOU MIGHT AS WELL NOT USE AN AMD CROSSFIRE SOLUTION AT ALL BECAUSE YOU CANNOT USE FRAME RATE POTENTIAL ON IT FULLY, YOU HAVE TO CRIPPLE IT FOR IT TO WORK.

What we need now is a 30fps chart and 60 fps chart with crap amd cards listed running vsync and radeon pro and the chart can tell us which pieces of crap can do 30 and 60 fps and we can THROW OUT EVERY OTHER FRAME RATE EVER LISTED FOR THE CRAP AMD CARDS.

April 7, 2013 | 10:25 AM - Posted by Anonymous (not verified)

Interesting test. If what you have tested is valid, AMD is in deep sh*t once again. If not, there is some kind of money transaction going on...

Nonetheless, thanks for all your efforts.

April 7, 2013 | 10:23 PM - Posted by PClover (not verified)

Do a quick Google of "geforce frame metering" and you will find out why the nVi cards rarely have runt frames. In fact, nVi cards DO have them. They just delay those frames a bit to match the pacing of the other good frames, so the frame time chart miraculously looks good. And you, the user, have to deal with the input delay.

For me, an AMD card and 3rd-party FPS cap software is the best: no input lag, no stuttering. And the image quality from AMD is always superior.
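
For reference, the frame-rate cap such tools apply boils down to something like the sketch below: never present a frame sooner than 1/cap seconds after the previous one. This is an illustration of the concept only; RadeonPro and similar tools hook the game's present call internally rather than being invoked like this.

```python
# Minimal sketch of what a third-party FPS cap does: never present a frame
# sooner than 1/cap seconds after the previous one. Conceptual illustration
# only; real limiters hook the game's present call.
import time

class FpsCap:
    def __init__(self, cap_fps=60):
        self.min_gap = 1.0 / cap_fps
        self.last = None

    def wait_for_next_frame(self):
        now = time.monotonic()
        if self.last is not None:
            remaining = self.min_gap - (now - self.last)
            if remaining > 0:
                time.sleep(remaining)   # evens out pacing by never running ahead of the cap
        self.last = time.monotonic()
```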

April 11, 2013 | 04:14 PM - Posted by praack

Wow, it's spot-on data, tough to take in, especially if you have purchased Radeons since the beginning, but it is true.

But GOD, people - stop with the flaming!!!

AMD will have to do a hardware re-do to fix this; I don't really expect anything to come out for a year. I don't really think that anyone misled on purpose - this is all new testing data.

AMD still makes an OK card - but the issue is with multiple cards.

Answer: don't buy multiple lower-cost cards from AMD.

April 13, 2013 | 12:45 AM - Posted by Anonymous (not verified)

I wish you'd start testing with newer drivers. They have the 314.22s out, had the 314.21s out, and even a 314.07, all WHQL, but you're still on the beta .07s? I think they even had a 314.14 WHQL in there.

Many games have been optimized since then. Great data, just want later Nv drivers as you're a few behind at best.
https://www.youtube.com/watch?v=gQ3U2P8ZLz4
15% for 314.07 vs. 314.21 in Tomb Raider; others show basically the same, and in some cases more, and this was on an old 9600GT card... LOL. Who knew?
