
Frame Rating: GeForce GTX 660 Ti and Radeon HD 7950

What to Look For, Test Setup

Because of the complexity and sheer amount of data we have gathered using our Frame Rating performance methodology, we are breaking it up into several articles that each feature different GPU comparisons.  Here is the schedule:

We are back again with another edition of our continued reveal of data from the capture-based Frame Rating GPU performance methods.  In this third segment we are moving on down the product stack to the NVIDIA GeForce GTX 660 Ti and the AMD Radeon HD 7950 - both cards that fall into a similar price range.


I have gotten many questions about why we are using these particular cards in each comparison, and the answer is straightforward: pricing.  In our first article we looked at the Radeon HD 7970 GHz Edition and the GeForce GTX 680, while in the second we compared the Radeon HD 7990 (HD 7970s in CrossFire), the GeForce GTX 690 and the GeForce GTX Titan.  This time around we have the GeForce GTX 660 Ti ($289 on Newegg.com) and the Radeon HD 7950 ($299 on Newegg.com), but we did not include the GeForce GTX 670 because it sits much higher at $359 or so.  I know some of you are going to be disappointed that it isn't in here, but I promise we'll see it again in a future piece!


If you are just joining this article series today, you have missed a lot!  If nothing else, you should read our initial full release article that details everything about the Frame Rating methodology and why we are making this change to begin with.  In short, we are moving away from using FRAPS for average frame rates or even frame times.  Instead, we use a secondary hardware capture system to record all the frames of our gameplay as they would be displayed to the gamer, then perform post-process analysis on that recorded file to measure real-world performance.

Because FRAPS measures frame times at a different point in the game pipeline (closer to the game engine), its results can vary dramatically from what is presented to the end user on their display.  Frame Rating solves that problem by recording video through a dual-link DVI capture card that emulates a monitor to the testing system.  By applying a unique overlay color to each frame the game produces, we can gather a new kind of information that tells a very different story.
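As a rough illustration of the analysis step described above, the extracted overlay colors can be run-length grouped to measure how many scanlines each rendered frame actually occupied on screen. This is only a sketch: the color names, the 21-scanline runt cutoff, and the function names are illustrative assumptions, not PC Perspective's actual tooling.

```python
# Hypothetical sketch of per-frame overlay analysis. Assumes each
# captured video frame carries a left-edge color bar whose color
# changes with every frame the GPU produced.

RUNT_THRESHOLD = 21  # scanlines; an assumed cutoff for illustration


def analyze_capture(scanline_colors):
    """Group consecutive scanlines by overlay color and measure each
    rendered frame's on-screen height in scanlines."""
    segments = []  # list of (color, height_in_scanlines)
    for color in scanline_colors:
        if segments and segments[-1][0] == color:
            segments[-1] = (color, segments[-1][1] + 1)
        else:
            segments.append((color, 1))
    runts = [s for s in segments if s[1] < RUNT_THRESHOLD]
    return segments, runts


# A 1080p capture frame in which three game frames landed:
# one large, one runt (8 scanlines), one large.
capture = ["lime"] * 500 + ["red"] * 8 + ["blue"] * 572
segments, runts = analyze_capture(capture)
```

Runts show up as segments only a handful of scanlines tall; a dropped frame would appear as an overlay color missing from the captured sequence entirely.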


The capture card that makes all of this work possible.

I don't want to spend too much time on this part of the story here, as I already wrote a solid 16,000 words on the topic in our first article, and I think you'll find the results fascinating.  So please check out my first article on the topic if you have any questions before diving into these results today!

Test System Setup
CPU: Intel Core i7-3960X Sandy Bridge-E
Motherboard: ASUS P9X79 Deluxe
Memory: Corsair Dominator DDR3-1600 16GB
Hard Drive: OCZ Agility 4 256GB SSD
Sound Card: On-board
Graphics Cards: NVIDIA GeForce GTX 660 Ti 2GB; AMD Radeon HD 7950 3GB
Graphics Drivers: AMD 13.2 beta 7; NVIDIA 314.07 beta
Power Supply: Corsair AX1200i
Operating System: Windows 8 Pro x64

 

On to the results! 


 

April 3, 2013 | 07:25 AM - Posted by bystander (not verified)

The TechReport articles *were* accurate when they were released, and they sparked change from AMD, which went through its drivers to fix the problem.

They didn't prove TechReport wrong. They prove that AMD fixed things in single-card configurations.

April 3, 2013 | 02:59 AM - Posted by Cpt. Obvious (not verified)

Why do the graphs for "FRAPS FPS" and "Observed FPS" match exactly except for Crossfire?!

There should be at least small differences between these graphs.

You don't have FRAPS running during the capture, do you?!

April 3, 2013 | 08:15 AM - Posted by Ryan Shrout

No, FRAPS was not running at the same time as the overlay and capture testing.

April 3, 2013 | 05:38 PM - Posted by Cpt. Obvious (not verified)

So what would be the explanation for it?

The graphs match exactly.

If you measure the same scene twice, the results won't be exactly the same.

So you let the bench run and measure with FRAPS.
You let the bench run and measure with FCAT.
The results are EXACTLY the same except for CrossFire.

There has to be something wrong.

April 3, 2013 | 04:48 AM - Posted by Anonymous (not verified)

Another article to confirm that with vsync off, CFX is nerfed at the moment...

Which doesn't affect me in the slightest, as I don't play with vsync off... I won't play with vsync off, full stop.

Twin Lightning 7970s @ 1400 MHz: 60 FPS solid in Crysis 3 with RP 60 DFC and vsync on, and it's butter smooth. Slightly different from the results I see in this article, I might add.

My advice to any CFX user is simply to run with vsync on and RadeonPro DFC (Dynamic Frame Control) at 60 FPS, and you won't get 90% of the issues in these articles.

It is a pity that PCPer won't simply do a review with vsync on running RadeonPro... I guess they wouldn't have much to write about then.

April 3, 2013 | 06:09 AM - Posted by Lord Binky (not verified)

I don't think dropped frames are as significant to the viewing experience as runt frames, which also cause small tears. Also, keeping the displayed image the most current representation of the game state would seem to support dropping a frame if it gets in the way of a newer one.

That said, I really want to see some rules of thumb applied to the graphs as to what constitutes perceptible gameplay degradation. For example, in the frame variance chart, I would think there is a slope-based assessment of the type of problem a person would perceive.
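The slope-based rule of thumb suggested above could be sketched as a simple pass over consecutive frame times. The 8 ms jump threshold here is purely an assumed placeholder for illustration, not a validated perceptual limit.

```python
# Rough sketch: flag frame-to-frame time jumps large enough that they
# might read as stutter. The threshold is an assumption, not a
# measured perceptual cutoff.

STUTTER_JUMP_MS = 8.0


def stutter_events(frame_times_ms):
    """Return indices where consecutive frame times diverge sharply."""
    return [i for i in range(1, len(frame_times_ms))
            if abs(frame_times_ms[i] - frame_times_ms[i - 1]) > STUTTER_JUMP_MS]


# A mostly smooth 60 FPS run with one long frame in the middle:
times = [16.7, 16.9, 16.5, 33.4, 16.6, 16.8]
events = stutter_events(times)  # both edges of the long frame are flagged
```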

April 3, 2013 | 08:16 AM - Posted by Ryan Shrout

The idea of "rules of thumb applied to the graphs as to what constitutes perceptible gameplay degradation" is a good one, but it is much harder to actually do.  That is the eventual goal, but in all honesty it will likely be something that changes for each gaming title.

April 4, 2013 | 10:43 AM - Posted by bystander (not verified)

I was thinking about how your FPS compared to your refresh rate affects what the average frame size would be. This might mean that what is considered a runt frame should scale based on FPS. This may also help narrow down when something is not acceptable.
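The scaling idea in this comment might look something like the following sketch, where the runt cutoff is a fraction of the average on-screen frame height implied by the frame rate. The 20% fraction and the function name are assumptions for illustration.

```python
# Sketch: size the runt cutoff relative to the average on-screen frame
# height at a given FPS, rather than using one fixed scanline count.

SCANLINES = 1080  # vertical resolution of the capture
REFRESH_HZ = 60


def runt_threshold(fps, fraction=0.20):
    """Average scanlines per frame at this FPS, scaled by `fraction`."""
    frames_per_refresh = fps / REFRESH_HZ
    avg_height = SCANLINES / frames_per_refresh
    return avg_height * fraction


# At 120 FPS each frame averages 540 scanlines, so the cutoff is 108;
# at 60 FPS a frame averages the full 1080 scanlines, so the cutoff is 216.
threshold = runt_threshold(120)
```

The design point is that a 100-scanline sliver matters much more in a 60 FPS run than in a 300 FPS run, so a fixed cutoff treats the two situations inconsistently.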

April 3, 2013 | 09:03 AM - Posted by Verticals (not verified)

Yeah, the hidden-agenda accusations are rather bizarre considering the AMD ads, but any tech site is going to get its share of unstable posters. Don't let it get to ya.

Anyway, AMD's single GPUs came out looking nicer than I would have figured. My experience with previous gens had me expecting to see disparity in observed frames for the single cards as well. I'd also like to see tests on the last generation from both manufacturers, but I know that may not be a practical use of copy.

April 3, 2013 | 09:52 AM - Posted by Anonymous (not verified)

Windows causes more of those issues than everything else combined. That is what I meant in the long post Ryan didn't understand.

April 3, 2013 | 10:00 AM - Posted by Anonymous (not verified)

You have to use RadeonPro and set up a profile for each game being used on CrossFire setups. It's the only way to get them to run smoothly. It's an extra program and a bit of work, but the results are promising regarding frame times. I hope future AMD driver versions have this feature built in.

April 3, 2013 | 10:23 AM - Posted by Anonymous (not verified)

You use nVidia Hardware to measure SOMETHING?
Evilol....

April 3, 2013 | 12:50 PM - Posted by endiZ (not verified)

Hey Ryan, great articles.

Have you tried RadeonPro with Dynamic V-sync Control? It looks like they are using a technique similar to what NVIDIA does with Adaptive VSync.

April 3, 2013 | 02:06 PM - Posted by Cannyone

I'm really surprised by this article. And that is an ambivalent sort of surprise! On one hand I'm pleased to see that SLI 660 Ti's do so well at 2560x1440. On the other hand I'm dismayed that Crossfire is NOT competitive. So the crux of what I'm feeling is disappointment. Because "competition" is good for me as a consumer.

See, I have a pair of NVIDIA (specifically EVGA) GTX 660 Ti's in my computer. And they were chosen because I wanted to run games on a 2560x1440 display at full resolution. But it troubles me to think that AMD isn't being competitive, because that points to a market where NVIDIA's dominance is most likely to mean higher prices. Which contributes to the feeling that PC gaming may indeed be on its last leg.

The bottom line, economically speaking, is that while computers are a boon to our lives our "life" itself does not depend on them. So when things get really tight we will find ways to get by without spending money on new hardware. And both AMD and Nvidia will be forced to change to maintain a revenue stream.

... so despite the fact I own Nvidia cards I sure hope AMD can get their act together!

April 3, 2013 | 04:05 PM - Posted by Zarich (not verified)

I enjoyed this article. However, after watching all the comparison videos I learned one thing: ATI and NVIDIA do not produce the same colors. So out of the box, which card is producing the most accurate colors?

April 3, 2013 | 05:45 PM - Posted by Cpt. Obvious (not verified)

Why do the graphs for "FRAPS FPS" and "Observed FPS" match exactly except for Crossfire?!

There should be at least small differences between these graphs.

You already said that the tests with FCAT and FRAPS weren't made at the same time.

So what would be the explanation for it?

The graphs match exactly.

If you measure the same scene twice, the results won't be exactly the same.

So you let the bench run and measure with FRAPS.
You let the bench run and measure with FCAT.
The results are EXACTLY the same except for CrossFire.

There has to be something wrong.

April 3, 2013 | 07:09 PM - Posted by ThorAxe

The only thing wrong is the borked Crossfire. The single GPUs and SLI are doing what they should.

April 4, 2013 | 12:24 AM - Posted by Cpt. Obvious (not verified)

Read. -> Think. -> Answer.

The results can't be exactly the same.
Two benches, two different tools that give different data,
but one result? I don't think so.

April 4, 2013 | 06:22 AM - Posted by ThorAxe

Maybe you should re-read what has been written or at least look at the definition of observed frame-rate.

Observed frame rates will indeed show frame rates identical to FRAPS if all frames are counted as useful. That means there were no dropped or runt frames in the benchmark.

The tools use the same data. Only discarding useless frames would produce a variance. If there are no useless frames, there should not be a variance.
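That relationship can be sketched directly: if no frames fall under the runt cutoff, counting only "useful" frames gives exactly the same rate as counting all of them. The function names and the 21-scanline cutoff below are illustrative assumptions.

```python
# Sketch: observed FPS only diverges from a FRAPS-style count when
# frames are discarded as runts (or dropped entirely).

RUNT_THRESHOLD = 21  # scanlines; assumed cutoff for illustration


def fraps_fps(frame_heights, duration_s):
    """FRAPS-style count: every rendered frame, regardless of size."""
    return len(frame_heights) / duration_s


def observed_fps(frame_heights, duration_s):
    """Count only frames tall enough on screen to matter to the viewer."""
    useful = [h for h in frame_heights if h >= RUNT_THRESHOLD]
    return len(useful) / duration_s


# Single-GPU style run with no runts: the two metrics agree exactly.
heights = [540, 530, 550, 541]
# CrossFire-style run with two runts: observed drops below FRAPS.
cf_heights = [1000, 8, 1010, 6]
```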

April 4, 2013 | 08:40 AM - Posted by Cpt. Obvious (not verified)

So if you bench the same scene two times, the results will be the same? Interesting.

April 4, 2013 | 10:31 AM - Posted by Tri Wahyudianto (not verified)

Completely no. This is what journals call "blinding and randomization" — health journals, social science journals, and others.

We know the same scene never gives exactly the same results across two test runs on any graphics card.

But since that applies to every graphics card being tested, every card has the same chance of a good or a bad run.

In the end, we call that a fair comparison.

April 4, 2013 | 08:23 PM - Posted by ThorAxe

And there is your problem. The benchmark is only run once.

In the test run, FRAPS counts all frames rendered regardless of size. The video output is then captured via the capture card and the raw AVI file is analyzed. If there are no runts or drops (as is the case with the single GPUs tested), then the observed frame rate must be identical to the FRAPS result.

April 5, 2013 | 12:46 AM - Posted by Cpt. Obvious (not verified)

Wrong. I already asked. Ryan's answer:

April 3, 2013 | 11:15 AM - Posted by Ryan Shrout
No, FRAPS was not running at the same time as the overlay and capture testing.

I am not trolling; I know that AMD has bigger issues with multi-GPU than NVIDIA.
The only thing I am saying:
Something has to be wrong with the graphs, as there will always be slight differences between two bench runs.

April 5, 2013 | 07:47 AM - Posted by ThorAxe

If that is the case then there would normally be a slight variation.

I don't know why it would need to be run separately, though I think the podcast mentioned issues with FRAPS and the overlay being used at the same time.

April 5, 2013 | 12:58 PM - Posted by KittenMasher (not verified)

You're right, there should be at least slight differences between the two runs. However, that doesn't mean those differences will be noticeable when displayed on a graph made for human consumption. For the graphs to be completely accurate, the 90 seconds or so of benchmark would have to be sampled at every frame. In at least one of the cases that's about 150 FPS, which means the graph would have to be about 13,500 pixels wide. You may have noticed that the graphs on PCPer are a bit smaller than that.
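For reference, the sampling arithmetic works out as follows, assuming one horizontal pixel per frame-time sample:

```python
# Back-of-the-envelope check: how wide would a graph need to be to show
# every frame-time sample of a 90-second run at 150 FPS?

DURATION_S = 90
FPS = 150

samples = DURATION_S * FPS   # total frame-time samples in the run
pixels_needed = samples      # at one pixel per sample
```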


April 4, 2013 | 01:08 AM - Posted by Tri Wahyudianto (not verified)

Thanks for the objective and numerical article despite much criticism and subjectivity from readers

well done ryan

thumbs up pcper.com

April 4, 2013 | 01:54 AM - Posted by sLiCeR- (not verified)

I would very much like to read a quick analysis of a previous-generation CrossFire setup (i.e. 2x 5850/5870), to try to understand whether these issues are related only to the drivers / GCN architecture or to the CrossFire technology itself.

And of course, AMD should have the ability to discuss these results with pcper.com and eventually revisit these tests with the *future* drivers that will address these issues.

Regardless of the impact these kinds of articles have, it's known today that there are issues with AMD drivers. Even if there are some errors in some tests, if it were not for them we wouldn't even be looking at this. So... well done pcper.com, but don't leave it at this bombshell ;)

cheers.

April 4, 2013 | 06:18 AM - Posted by AlienAndy (not verified)

Thank you for taking the time to do this, Ryan. I used to use CrossFire, but it never quite felt like what was being displayed. What I mean is, when I ran 5770 CF it always seemed jerky and stuttery even though the readings I was seeing were pretty high (as they should have been, just ahead of a 5870).

I swore off CrossFire after that and concluded it was down to poor drivers. I'm now using GTX 670s in SLI and I have to say it's been fantastic. No stuttering, and pretty smooth. I have had the odd issue waiting for drivers (Far Cry 3), but other than that it seems to work great.

Funny really, CrossFire always felt rather crude to me. Would also love to see some 5xxx results!

April 4, 2013 | 07:22 AM - Posted by AlienAndy (not verified)

BTW, could you also test 3DMark 11 please?
