
Frame Rating: GeForce GTX 660 Ti and Radeon HD 7950

What to Look For, Test Setup

We are back again with another edition of our continuing reveal of data from the capture-based Frame Rating GPU performance methods.  In this third segment we are moving down the product stack to the NVIDIA GeForce GTX 660 Ti and the AMD Radeon HD 7950 - both cards that fall into a similar price range.

Because of the complexity and sheer amount of data we have gathered using our Frame Rating performance methodology, we are breaking it up into several articles that each feature a different GPU comparison.


I have gotten many questions about why we are using these particular cards in each comparison, and the answer is pretty straightforward: pricing.  In our first article we looked at the Radeon HD 7970 GHz Edition and the GeForce GTX 680, while in the second we compared the Radeon HD 7990 (HD 7970s in CrossFire), the GeForce GTX 690, and the GeForce GTX Titan.  This time around we have the GeForce GTX 660 Ti ($289 on Newegg.com) and the Radeon HD 7950 ($299 on Newegg.com), but we did not include the GeForce GTX 670 because it sits much higher at $359 or so.  I know some of you are going to be disappointed that it isn't in here, but I promise we'll see it again in a future piece!


If you are just joining this article series today, you have missed a lot!  If nothing else, you should read our initial full release article that details everything about the Frame Rating methodology and why we are making this change to begin with.  In short, we are moving away from using FRAPS for average frame rates or even frame times.  Instead, we use a secondary hardware capture system to record all the frames of our game play as they would be displayed to the gamer, then perform post-process analysis on that recorded file to measure real-world performance.

Because FRAPS measures frame times at a different point in the game pipeline (closer to the game engine), its results can vary dramatically from what is presented to the end user on the display.  Frame Rating solves that problem by recording video through a dual-link DVI capture card that emulates a monitor to the test system.  By applying a unique overlay color to each frame the game produces, we can gather a new kind of information that tells a very different story.
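The analysis step can be illustrated with a small sketch.  This is a hypothetical reconstruction, not our actual extraction tool: assume each captured 1080-line frame has been reduced to the overlay color seen on each scan line, and that any on-screen slice shorter than some cutoff counts as a "runt" (both the color names and the 21-line cutoff here are placeholder values).

```python
RUNT_THRESHOLD = 21  # scan lines; slices shorter than this count as "runts" (assumed cutoff)

def analyze_capture(scanline_colors):
    """Group consecutive scan lines by overlay color into (color, height) runs
    and count how many rendered frames arrived as runts."""
    runs = []
    for color in scanline_colors:
        if runs and runs[-1][0] == color:
            runs[-1] = (color, runs[-1][1] + 1)
        else:
            runs.append((color, 1))
    runts = sum(1 for _, height in runs if height < RUNT_THRESHOLD)
    return runs, runts

# One captured 1080-line frame: a 700-line slice, a 10-line runt, a 370-line slice.
capture = ["red"] * 700 + ["lime"] * 10 + ["blue"] * 370
runs, runts = analyze_capture(capture)
print(runs)   # [('red', 700), ('lime', 10), ('blue', 370)]
print(runts)  # 1
```

Because each game frame carries a different overlay color, the height of each color band tells you exactly how much of that frame ever reached the display.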


The capture card that makes all of this work possible.

I don't want to spend too much time on this part of the story here, as I already wrote a solid 16,000 words on the topic in our first article, and I think you'll really find the results fascinating.  So, please check out my first article on the topic if you have any questions before diving into these results today!

Test System Setup
CPU: Intel Core i7-3960X Sandy Bridge-E
Motherboard: ASUS P9X79 Deluxe
Memory: Corsair Dominator DDR3-1600 16GB
Hard Drive: OCZ Agility 4 256GB SSD
Sound Card: On-board
Graphics Cards: NVIDIA GeForce GTX 660 Ti 2GB, AMD Radeon HD 7950 3GB
Graphics Drivers: AMD 13.2 beta 7, NVIDIA 314.07 beta
Power Supply: Corsair AX1200i
Operating System: Windows 8 Pro x64

 

On to the results! 


 

April 4, 2013 | 07:27 AM - Posted by Anonymous (not verified)

Results are mostly within the margin of error. The only factor that could make this possible is if the operating system is causing the issue. Like I posted in my big post, the operating system is the cause, probably unknowingly.

April 4, 2013 | 10:51 AM - Posted by FenceMan (not verified)

Not sure I grasp the whole methodology. If it looks good to me, what is the problem? I have two 7950s in CFX mode and I do not see any 'stutter' or problem. Why should it matter if FRAPS tells me 100 fps in Battlefield 3 and this article tells me it's really only 50? Seems to me this is extreme nitpicking of something that isn't even visible to the naked eye, and quite irresponsible to make AMD out to be so bad.

April 13, 2013 | 04:48 PM - Posted by Anonymous (not verified)

Yet someone like Kyle at Hardocp has said in the far cry 3 review that there is no doubt nVidia is smoother, and in the former SLI vs CF, that EVERY SINGLE TESTER AND REVIEWER THERE NOTICES SLI IS MUCH SMOOTHER THAN CROSSFIRE, for the past number of years.

He also says it's true some people (slow eyes, slow mind, poor vision, crap monitor, low resolution ) "can't" see the differences.

Oh well, sorry you drew a lousy deck. Some cards drawn are not as good as others.

On the other hand, with them side by side, you'd probably not be a blind, doubting amd fanboy anymore, even if you still claimed you were verbally.

April 4, 2013 | 10:54 AM - Posted by FenceMan (not verified)

Can't quite call this "observed frame rate" if you need a capture card and slow-motion video to "observe" it, can you?

April 13, 2013 | 04:49 PM - Posted by Anonymous (not verified)

Yes, you don't, thousands of people report on it daily.

Others who sold their souls to the amd red devil can't see it.

April 4, 2013 | 12:07 PM - Posted by Solix (not verified)

<- Me this round for my rig

I generally have good luck with Nvidia drivers, okay, two mid-range cards, let's see... a 192-bit memory bus on your mid-range card?! Argh, no thanks, I like anti-aliasing. OpenCL results are... wow, okay. Let's check out ATI...

OpenCL is good, I have a dedicated PhysX card, heat and efficiency are... aw nuts. Well okay, I can undervolt a bit, fine. Let's get some CF 7950s... one week later and a good deal more broke, sees this article. Yargh... Oh hey... look, an Nvidia workstation GPU for gaming (Titan, right?), maybe compute tasks will... oh never mind, not so good and $1k, ouch. No bitcoin mining for me.

Maybe I should just get a single GPU? Wait... I just bought three monitors for surround gaming.

I think I'll go downstairs and play a console game to relax.

April 4, 2013 | 02:52 PM - Posted by db87

Ryan,

I would like to know if the 'runt fps' are indeed 100% completely useless for the gaming experience. So let's say you take a test person, put him behind a game running 55 fps with 'runts' (read: observed fps is lower), and then put him behind the same game but with the fps capped at the observed value. I would like to know what the test person experiences under real gaming.

How is the mouse movement? Does it feel smooth? Which scenario does the test person think gives the best gaming experience, and so on...

So essentially, are the runt frames so annoying that you are better off with a far lower fps? If so, by what margin? For example, I can imagine that 60 fps with 'runts' is still better than a 100% solid and consistent 30 fps.

April 13, 2013 | 04:55 PM - Posted by Anonymous (not verified)

Since a normal screen resolution means 1080 horizontal scan lines, how do you feel about 7 or 8 scan lines being called a "valuable frame" for AMD?
Let's check it with something called logic, since the clueless are everywhere here.

1080 / 7 ≈ 154 runts needed to make a single full screen frame pic JUST ONCE.

So in that case, 154 fps gives you an effective frame rate of 1 fps. LOL How valuable is 1/154th of the screen in a sliver across it? That's a runt, and a drop is ZERO VISUAL VALUE, period.

How about 21 scan lines ? Is that "valuable" ?

1080 / 21 ≈ 51 of them to make just one full screen. LOL

So, AMD counts 51 fps, and you get one screen of data - yes, I'd say that's a very valuable way for amd to cheat like heck on steroids.
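The scan-line math in this comment can be checked in a couple of lines; the only assumption is a 1080-line screen height.

```python
SCREEN_LINES = 1080  # assumed vertical resolution

def runts_per_full_screen(runt_height):
    # how many runts of a given scan-line height it takes to cover one full screen
    return SCREEN_LINES / runt_height

print(round(runts_per_full_screen(7)))   # 154
print(round(runts_per_full_screen(21)))  # 51
```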

April 4, 2013 | 05:15 PM - Posted by Anon (not verified)

As a competitive gamer, you WANT the frames to start before the previous ones finished. Nvidia is introducing latency with their frame metering crap.....

-coming from a 660ti owner.

April 4, 2013 | 06:14 PM - Posted by icebug

I can't believe the amount of trolling over this article. I consider myself loosely an AMD fanboy when it comes to video cards. I don't really have an issue recommending nVidia cards to friends if that is where their pricing falls, but I personally run AMD. I don't see anything malicious about these articles, and I find them very interesting and detailed. It's not going to stop me from buying a 7950 within the next month to replace my 5850 (which, let's be honest, is still pretty darn good for today's games). I will just be happy that AMD knows about the issue and is working on a solution, so I can get a nice, fully functional CrossFire setup later when they drop in price.

April 5, 2013 | 12:04 AM - Posted by thinkbiggar (not verified)

OK, AMD admits an issue, then stop showing the comparisons with CrossFire. What is the point in reviewing the same issues time and time again, like something is suddenly going to change as we hit a different price point? This isn't a comparison at this point. If what you say is true, then post a buyer-beware note about CrossFire when reviewing AMD cards. What you're doing here is the same as reviewing an nVidia card and mentioning in every article that AMD CrossFire is broken.

If you want a single-GPU card, no issue. If you want multi-GPU, go with nVidia at this point, or move up to the next higher single-GPU AMD. But remember, with CrossFire, when this fix is in place you can continue to boost your performance by adding the next-gen AMD. All nVidia offers is PhysX.

April 5, 2013 | 04:05 AM - Posted by Mac (not verified)

I agree with all of that and would also like to suggest PCper explores the 3rd party apps like radeon pro that fix a lot of these issues. Much more useful than telling us Xfire is broken over and over again


April 5, 2013 | 08:17 AM - Posted by bystander (not verified)

I wouldn't mind seeing 3-way CF/SLI, as an article a while back by THG found 3-way CF helped.

I would like to see RadeonPro tested, but realize that the fixes all include FPS limiting, which obviously isn't ideal.

I would also like to see different cutoffs for runt frames. 20 pixels high is about 2% of your screen. That is really small. You could fairly increase that number some.
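A quick sketch of how sensitive the runt count is to that cutoff; the frame heights below are made-up values for illustration, not measured data.

```python
# Hypothetical frame heights (in scan lines) from an imagined capture run
frame_heights = [540, 8, 512, 600, 15, 465, 30, 1050, 22, 1058]

for cutoff in (10, 20, 40):
    runts = sum(1 for h in frame_heights if h < cutoff)
    pct = 100.0 * cutoff / 1080  # cutoff as a share of a 1080-line screen
    print(f"cutoff {cutoff:>2} lines ({pct:.1f}% of screen): {runts} runts")
```

Raising the cutoff from 10 to 40 lines doubles the runt count in this toy data, which is exactly why the choice of threshold deserves scrutiny.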

April 5, 2013 | 08:52 AM - Posted by Trey Long (not verified)

The exposure of Crossfire problems will force AMD to fix it. HOW IS THIS A BAD THING?

April 5, 2013 | 11:03 AM - Posted by FenceMan (not verified)

The way this is a bad thing:

1. I own a 7950 CrossFire setup which looks perfect to me (I use RadeonPro), and my initial reaction was that I should sell them and buy Nvidia; imagine how this could impact sales to people just considering AMD. I think the results, while quite accurate in and of themselves, should be better quantified in the real world (let's be honest, can anyone at PCPer in a blind test really tell the difference between AMD and Nvidia?).

2. The graphs completely misrepresent what is happening on the screen. If you look at these graphs you imagine this stuttering mess on the screen, the truth is nothing close to that (as I said I can watch my rig in the real world).

3. It is completely beating a dead horse, we get it, the Crossfire has "major issues" when you enhance and slow down the frames to a crawl (which has zero bearing on how it looks in the real world), no reason to post the same results for each Crossfire compatible card, we already know what they are.

I am no AMD fanboy, I want all of the info I can get so next time I drop $1,000.00 on video cards I have all of the available info, I just think the info should be presented in a more realistic form, to me this is all quite inconsequential to real world gaming performance.

April 5, 2013 | 11:29 AM - Posted by bystander (not verified)

The problem isn't that you don't get playable frames in crossfire. It is that you aren't gaining full benefit from crossfire. When choosing between Crossfire and SLI, it tells you which will help you more so you can plan ahead.

And if you use v-sync, the limited testing has shown it helps or fixes the issue (only tested on 2 games so far; it could be the FPS-limiting aspect that helps, and not v-sync itself).

April 5, 2013 | 11:33 AM - Posted by FenceMan (not verified)

"The problem isn't that you don't get playable frames in crossfire. It is that you aren't gaining full benefit from crossfire."

That right there to me is an accurate and thoughtful representation of what is going on here. That would have been a much more accurate way to present this information instead of the current "AMD Crossfire is a stuttering mess" message.

April 13, 2013 | 05:03 PM - Posted by Anonymous (not verified)

I have been on every review website there is and was for well over a decade, and argued with every fanboy of every type - and NEVER has a single amd fan or otherwise EVER MENTIONED "radeon pro"...

So who cares if NOW, you use it. For YEARS not a single amd fanboy CF user was using it, and if they were THEY LIED AND KEPT IT A SECRET BECAUSE THEY KNEW AMD SUCKS WITHOUT IT.

So who really cares at this point what lying hidden crutch amd needs for a "quicky bandaid patch" for its crapster crap crossfire epic failure?

It's a big fat excuse for YEARS of total loserville and lies.

I'll also mention once v-sync and radeon pro hack the frame rate down to 30 or 60, YOU MIGHT AS WELL NOT USE AN AMD CROSSFIRE SOLUTION AT ALL BECAUSE YOU CANNOT USE FRAME RATE POTENTIAL ON IT FULLY, YOU HAVE TO CRIPPLE IT FOR IT TO WORK.

What we need now is a 30fps chart and 60 fps chart with crap amd cards listed running vsync and radeon pro and the chart can tell us which pieces of crap can do 30 and 60 fps and we can THROW OUT EVERY OTHER FRAME RATE EVER LISTED FOR THE CRAP AMD CARDS.


April 7, 2013 | 10:25 AM - Posted by Anonymous (not verified)

Interesting test, if what you have tested is valid, AMD is in deep-sh*t once again. If not, there is some kind of money transaction going on...

Nonetheless, thanks for all your efforts.

April 7, 2013 | 10:23 PM - Posted by PClover (not verified)

Quick Google "geforce frame metering" and you will find out why nVidia cards rarely have runt frames. In fact, nVidia cards DO have them. They just delay those frames a bit to match the pace of the other good frames, so the frame time chart miraculously looks good. And you, the user, will have to deal with input delay.

For me, an AMD card and 3rd-party FPS cap software is the best. No input lag, no stuttering. And the image quality from AMD is always superior.

April 11, 2013 | 04:14 PM - Posted by praack

wow, it's spot-on data, tough to take in, especially if you have purchased Radeons since the beginning, but it is true.

but GOD, people - stop with the flaming!

AMD will have to do a hardware re-do to fix this; i don't really expect anything to come out for a year. i don't really think that anyone misled on purpose - this is all new testing data.

AMD still makes an OK card - but the issue is with multiple cards.

answer - don't buy multiple lower-cost cards from AMD

April 13, 2013 | 12:45 AM - Posted by Anonymous (not verified)

I wish you'd start testing with newer drivers. They have 314.22's out, had 314.21 out, and even a 314.07 all in WHQL, but you're still on beta .07's? I think they even had a 314.14 whql in there.

Many games have been optimized since then. Great data, just want later Nv drivers as you're a few behind at best.
https://www.youtube.com/watch?v=gQ3U2P8ZLz4
15% for 314.07 vs. 314.21 in tomb raider, others show basically the same and in some cases more and this was on an old 9600GT card...LOL. Who knew?
