Frame Rating: GeForce GTX 660 Ti and Radeon HD 7950

Summary and Conclusions

Because of the complexity and sheer amount of data we have gathered using our Frame Rating performance methodology, we are breaking the results up into several articles, each featuring a different GPU comparison.

Now that we are a few articles into this series called Frame Rating, it is important to go back and start evaluating these results, comparing them to previous ones, and looking for trends, patterns, and anything that stands out.  We are hoping to find some answers as to WHY AMD's CrossFire and Eyefinity are having so many problems, in hopes that AMD can address them sooner rather than later.

 

Performance

In a result very similar to the HD 7970 and GTX 680 numbers we launched Frame Rating with, the AMD card here has done well against the GTX 660 Ti in terms of single GPU performance.  Both NVIDIA and AMD are doing great with single GPU frame rates, frame times, and frame variances, especially in our 1920x1080 and 2560x1440 testing.  Since the release of the GTX 680, and later the Radeon HD 7970 GHz Edition, AMD has been careful to position its products, based on pricing, so that they offer better performance per dollar than NVIDIA.  That was the case when we tested with FRAPS exclusively, and it continues to be the case with our new capture-based method, called Frame Rating.


The Radeon HD 7950 3GB (with Boost) is faster than the GeForce GTX 660 Ti in basically all six of the games tested today.  Battlefield 3, Crysis 3, DiRT 3, Sleeping Dogs, Skyrim, and Far Cry 3 all showed better frame rates and frame times on the HD 7950 at single-monitor resolutions.  In many cases, though, the GTX 660 Ti was close; very close.  In multi-monitor testing (5760x1080) the 3GB frame buffer of the HD 7950 seemed to stretch out the lead that AMD had in single card results, particularly in games like Skyrim and Battlefield 3.  Having 50% more memory at that kind of resolution is definitely an advantage that AMD is holding on to.

The issue of multi-GPU gaming on AMD cropped up again with CrossFire: a pair of HD 7950s consistently produced runt frames and dropped frames, causing lower observed frame rates than we found with the GTX 660 Ti in SLI.  In fact, it occurred more often here with the HD 7950s than it did with the HD 7970s; both DiRT 3 and Skyrim had problems in our review today that did not appear in the initial launch article.  This tells me that as GPU performance goes down (as we step down the product stack), the GPU bottleneck is going to cause more of these problems, not less.
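The runt and drop accounting described above can be sketched in miniature. This is an illustrative model, not FCAT itself: the 21-scanline runt cutoff and the frame heights are assumptions chosen for the example.

```python
# Illustrative sketch: how runts and drops separate "observed" FPS from
# the FRAPS-style count. Heights are scanlines of captured output; the
# 21-line runt cutoff is an assumed value, not FCAT's exact threshold.

RUNT_THRESHOLD = 21  # scanlines

def observed_fps(frame_heights, capture_seconds):
    """Return (fraps_fps, observed_fps): FRAPS counts every rendered frame,
    while observed FPS discards runts (and dropped frames, height 0)."""
    fraps_fps = len(frame_heights) / capture_seconds
    useful = [h for h in frame_heights if h >= RUNT_THRESHOLD]
    return fraps_fps, len(useful) / capture_seconds

# One second of a CrossFire-like pattern: every other frame is a runt.
heights = [540, 5] * 30            # 60 frames, half of them 5-line runts
fraps, observed = observed_fps(heights, 1.0)
# FRAPS would report 60 FPS here, while the observed rate is only 30 FPS.
```

On a single-GPU run with no runts or drops the two numbers are identical, which is why the FRAPS and observed FPS curves overlap everywhere except CrossFire.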


This doesn't paint a very good picture for the Radeon HD 7870 and HD 7850 with multiple graphics cards if the pattern holds. 

(Just a reminder as to why the GeForce GTX 670 was left out of this article.  The GeForce GTX 660 Ti 2GB ($289) and the Radeon HD 7950 3GB ($299) are much closer in price!  The GeForce GTX 670 ($359) will be included in a future Frame Rating article.)

 

Final Thoughts

There isn't much left to say here but to reiterate what I have written in the last two articles: AMD presently has the better hardware (just in terms of raw performance) when looking at single GPU options. (Though that is debatable with the inclusion of the GeForce GTX Titan at the high end.) But gamers that are using, or thinking of using, multiple GPUs will without a doubt find the issues discussed here with Radeon GPUs in CrossFire mode troublesome.


AMD is aware of the issues, and has been since before the release of our first full article on Frame Rating, and is trying to figure out the answer for gamers.  The current line of thinking is that they will release a driver some time in the summer that will give the user the option of enabling frame metering or not, rather than forcing it on as NVIDIA does today.  Whether that will work out as well for AMD as it currently does for NVIDIA remains to be seen.
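To make "frame metering" concrete, here is a toy sketch of the idea: delay presenting finished frames so they reach the display at intervals closer to a running average. The smoothing factor and structure are assumptions for illustration; neither vendor has published its actual algorithm.

```python
def meter_frames(render_times, smoothing=0.9):
    """Toy frame metering: given the times (ms) at which frames finish
    rendering, return presentation times paced toward a running average
    interval. A frame is never shown before it has finished rendering."""
    presented = [render_times[0]]
    avg_interval = render_times[1] - render_times[0]
    for i in range(1, len(render_times)):
        raw_interval = render_times[i] - render_times[i - 1]
        avg_interval = smoothing * avg_interval + (1 - smoothing) * raw_interval
        presented.append(max(render_times[i], presented[-1] + avg_interval))
    return presented

# An alternating fast/slow pattern like AFR micro-stutter, in milliseconds.
raw = [0.0, 5.0, 33.0, 38.0, 66.0, 71.0, 99.0]
paced = meter_frames(raw)
# Short gaps get stretched and long gaps shrink slightly, evening the cadence.
```

The trade-off being weighed here is that any metering adds a little latency between rendering and display, which is presumably why AMD wants to make it a user option.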

I welcome your thoughts and comments below!!

April 3, 2013 | 07:25 AM - Posted by bystander (not verified)

The TechReport articles *were* accurate when they were released, and they sparked change from AMD, who went through their drivers to fix the problem.

They didn't prove TechReport wrong. They prove AMD fixed things in single-card configurations.

April 3, 2013 | 02:59 AM - Posted by Cpt. Obvious (not verified)

Why do the graphs for "FRAPS FPS" and "Observed FPS" match exactly except for Crossfire?!

There should be at least small differences between these graphs.

You don't have FRAPS running during the capture, do you?!

April 3, 2013 | 08:15 AM - Posted by Ryan Shrout

No, FRAPS was not running at the same time as the overlay and capture testing.

April 3, 2013 | 05:38 PM - Posted by Cpt. Obvious (not verified)

So what would be the explanation for it?

The graphs match exactly.

If you measure the same scene twice, the results won't be exactly the same.

So you let the bench run, measure with FRAPS.
You let the bench run and measure with FCAT.
The results are EXACTLY the same except for crossfire.

There has to be something wrong.

April 3, 2013 | 04:48 AM - Posted by Anonymous (not verified)

Another article to confirm that with vsync off CFX is nerfed atm...

Which doesn't affect me in the slightest, as I don't play with vsync off. I won't play with vsync off, full stop.

Twin Lightning 7970s @ 1400 MHz, 60 fps solid in Crysis 3 with RP 60 DFC and vsync on, and it's butter smooth. Slightly different to the results I see in this article, I might add.

My advice to any CFX user is simply to run with vsync on and RadeonPro DFC (Dynamic Frame Control) at 60 fps; you won't get 90% of the issues in these articles.

It is a pity that pcper won't simply do a review with vsync on running RadeonPro .. I guess they wouldn't have much to write about then.

April 3, 2013 | 06:09 AM - Posted by Lord Binky (not verified)

I don't think dropped frames are as significant to the viewing experience as runt frames are, which also cause small tears. Also, keeping the displayed image the most current representation of the game state would seem to support dropping a frame if it gets in the way of a newer frame.

That said, I really want to see some rules of thumb applied to the graphs as to what constitutes perceptible gameplay degradation. For example, in the frame variance chart, I would think there is a slope-based assessment as to the type of problem a person would perceive.

April 3, 2013 | 08:16 AM - Posted by Ryan Shrout

The idea of "rules of thumb applied to the graphs as to what constitutes perceptible gameplay degradation" is a good one, but it is much harder to actually do.  That is the eventual goal, but in all honesty it will likely be an issue that changes for each gaming title.

April 4, 2013 | 10:43 AM - Posted by bystander (not verified)

I was thinking about how FPS relative to your refresh rate affects what the average frame size would be. This might mean that what is considered a runt frame should scale based on FPS. This may also help narrow down when something is not acceptable.
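The suggestion above can be sketched as a cutoff that scales with frame rate: at higher FPS the average frame occupies fewer scanlines of each scanout, so a fixed cutoff misclassifies frames. Everything here is assumed for illustration (the 60 Hz capture, the 20% fraction); it is not how FCAT actually defines a runt.

```python
def runt_threshold(fps, refresh_hz=60, screen_lines=1080, fraction=0.2):
    """Runt cutoff that scales with FPS. Above the refresh rate, multiple
    frames share each scanout, so the average frame occupies roughly
    screen_lines * refresh_hz / fps scanlines; the cutoff is an assumed
    fraction of that average."""
    avg_height = screen_lines * refresh_hz / max(fps, refresh_hz)
    return fraction * avg_height

print(runt_threshold(60))   # 216.0 scanlines
print(runt_threshold(150))  # 86.4 scanlines
```

A fixed cutoff tuned for 60 FPS would flag far too few frames at 150 FPS, which is the scaling problem the comment is pointing at.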

April 3, 2013 | 09:03 AM - Posted by Verticals (not verified)

Yeah, the hidden-agenda accusations are rather bizarre considering the AMD ads, but any tech site is going to get its share of unstable posters. don't let it get to ya.

Anyway, AMD's single GPUs came out looking nicer than I would have figured. My experience with previous gens had me expecting to see disparity in observed frames for the single cards as well. I'd also like to see tests on the last gen from both manufacturers, but I know that may not be a practical use of copy.

April 3, 2013 | 09:52 AM - Posted by Anonymous (not verified)

Windows causes more of those issues than everything else combined. That is what I meant with the long post Ryan didn't understand.

April 3, 2013 | 10:00 AM - Posted by Anonymous (not verified)

You have to use RadeonPro and set up a profile for each game being used on CrossFire setups. It's the only way to get them to run smoothly. It's an extra program and a bit of work, but the results are promising regarding frame times. I hope future AMD driver versions have this feature built in.

April 3, 2013 | 10:23 AM - Posted by Anonymous (not verified)

You use nVidia Hardware to measure SOMETHING?
Evilol....

April 3, 2013 | 12:50 PM - Posted by endiZ (not verified)

Hey Ryan, great articles.

Have you tried RadeonPro with Dynamic V-sync Control? It looks like they are using a technique similar to Nvidia's Adaptive Vsync.

April 3, 2013 | 02:06 PM - Posted by Cannyone

I'm really surprised by this article. And that is an ambivalent sort of surprise! On one hand I'm pleased to see that SLI 660 Ti's do so well at 2560x1440. On the other hand I'm dismayed that Crossfire is NOT competitive. So the crux of what I'm feeling is disappointment. Because "competition" is good for me as a consumer.

See, I have a pair of Nvidia (specifically EVGA) GTX 660 Ti's in my computer. And they were chosen because I wanted to run games on a 2560x1440 display at full resolution. But it troubles me to think that AMD isn't being competitive, because that points to a market where Nvidia's dominance is most likely to mean higher prices. Which contributes to the feeling that PC gaming may indeed be on its last leg.

The bottom line, economically speaking, is that while computers are a boon to our lives our "life" itself does not depend on them. So when things get really tight we will find ways to get by without spending money on new hardware. And both AMD and Nvidia will be forced to change to maintain a revenue stream.

... so despite the fact I own Nvidia cards I sure hope AMD can get their act together!

April 3, 2013 | 04:05 PM - Posted by Zarich (not verified)

I enjoyed this article. However, after watching all the comparison videos I learned one thing: ATI and Nvidia do not have the same colors. So out of the box, which card is producing the most accurate colors?

April 3, 2013 | 05:45 PM - Posted by Cpt. Obvious (not verified)

Why do the graphs for "FRAPS FPS" and "Observed FPS" match exactly except for Crossfire?!

There should be at least small differences between these graphs.

You already said, that tests with FCAT and FRAPS weren't made at the same time.

So what would be the explanation for it?

The graphs match exactly.

If you measure the same scene twice, the results won't be exactly the same.

So you let the bench run, measure with FRAPS.
You let the bench run and measure with FCAT.
The results are EXACTLY the same except for crossfire.

There has to be something wrong.

April 3, 2013 | 07:09 PM - Posted by ThorAxe

The only thing wrong is the borked Crossfire. The single GPUs and SLI are doing what they should.

April 4, 2013 | 12:24 AM - Posted by Cpt. Obvious (not verified)

Read. -> Think. -> Answer.

The results can't be exactly the same.
2 Benches, 2 different tools that give different data,
but 1 result? I don't think so.

April 4, 2013 | 06:22 AM - Posted by ThorAxe

Maybe you should re-read what has been written or at least look at the definition of observed frame-rate.

Observed frame rates will, indeed must, show identical frame rates to FRAPS if all frames are counted as useful. That means there were no dropped or runt frames in the benchmark.

The tools use the same data. Only discarding useless frames would result in a variance. If there are no useless frames then there should not be a variance.

April 4, 2013 | 08:40 AM - Posted by Cpt. Obvious (not verified)

So if you bench the same scene two times, the results will be the same? Interesting.

April 4, 2013 | 10:31 AM - Posted by Tri Wahyudianto (not verified)

Not at all; this is what's called "blinding and randomization" in journals: health journals, social journals, and others.

We know the same scene never gives exactly the same results across two test runs, on any graphics card.

Since that applies to every graphics card being tested, every card has the same chance of a good or bad run.

In the end, we call that a fair comparison.

April 4, 2013 | 08:23 PM - Posted by ThorAxe

And there is your problem. The benchmark is only run once.

In the test run, FRAPS records all frames rendered regardless of size. The video output is then captured via the capture card and the raw AVI file is analyzed. If there are no runts or drops (as is the case with the single GPUs tested), then the observed frame rate must be identical to the FRAPS result.

April 5, 2013 | 12:46 AM - Posted by Cpt. Obvious (not verified)

Wrong. I already asked. Ryan's answer:

April 3, 2013 | 11:15 AM - Posted by Ryan Shrout
No, FRAPS was not running at the same time as the overlay and capture testing.

I am not trolling, I know that AMD has bigger issues with MGPU than Nvidia.
The only thing I am saying:
Something has to be wrong with the graphs, as there will always be slight differences between two bench runs.

April 5, 2013 | 07:47 AM - Posted by ThorAxe

If that is the case then there would normally be a slight variation.

I don't know why it would need to be run separately, though I think the podcast mentioned issues with FRAPS and the overlay being used at the same time.

April 5, 2013 | 12:58 PM - Posted by KittenMasher (not verified)

You're right, there should be at least slight differences between the two runs. However, that doesn't mean those differences will be noticeable when displayed on a graph made for human consumption. In order for the graphs to be completely accurate, the 90 seconds or so of benchmark would have to be sampled at every frame. In at least one of the cases that's about 150 fps, which means the graph would have to be about 13,500 pixels wide. You may have noticed that the graphs on pcper are a bit smaller than that.
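The resolution argument is easy to demonstrate: average a per-frame series down to plot width and small run-to-run differences wash out. The numbers here (13,500 frames, a 600-pixel plot, ±3 FPS noise) are invented for the sketch.

```python
import random

def downsample(samples, width):
    """Average consecutive chunks of samples down to 'width' plotted points."""
    chunk = len(samples) // width
    return [sum(samples[i * chunk:(i + 1) * chunk]) / chunk for i in range(width)]

random.seed(0)
# Two independent "runs": 90 s at ~150 FPS with +/-3 FPS per-frame jitter.
run_a = [150 + random.uniform(-3, 3) for _ in range(13500)]
run_b = [150 + random.uniform(-3, 3) for _ in range(13500)]

plot_a = downsample(run_a, 600)  # ~22 frames averaged per plotted pixel
plot_b = downsample(run_b, 600)

# The largest gap between the plotted curves is much smaller than the
# largest per-frame gap between the raw runs.
max_plot_gap = max(abs(a - b) for a, b in zip(plot_a, plot_b))
max_raw_gap = max(abs(a - b) for a, b in zip(run_a, run_b))
```

At a few hundred pixels wide the two runs become visually indistinguishable even though no two frames match exactly, which is consistent with identical-looking FRAPS and observed curves whenever no frames are being discarded.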

April 4, 2013 | 01:08 AM - Posted by Tri Wahyudianto (not verified)

Thanks for the objective and numerical article, despite the many criticisms and the subjectivity of some readers

well done ryan

thumbs up pcper.com

April 4, 2013 | 01:54 AM - Posted by sLiCeR- (not verified)

I would very much like to read a quick analysis of a previous generation's CrossFire setup (i.e. 2x 5850/5870), to try to understand whether these issues are related only to the drivers / GCN arch or to the CrossFire technology itself.

And of course, AMD should have the ability to discuss these results with pcper.com and eventually revisit these tests with the *future* drivers that will address these issues.

Regardless of the impact these kinds of articles have, it's known today that there are issues with AMD drivers. Even if there are some errors in some tests, if it were not for them we wouldn't even be looking at this. So.. well done pcper.com, but don't leave it at this bombshell ;)

cheers.

April 4, 2013 | 06:18 AM - Posted by AlienAndy (not verified)

Thank you for taking the time to do this, Ryan. I used to use CrossFire, but the experience never quite matched what was being displayed. What I mean is, when I ran 5770 CF it always seemed jerky and stuttery even though the readings I was seeing were pretty high (as they should have been, just ahead of a 5870).

I swore off of Crossfire after that and just concluded it was down to poor drivers. I'm now using GTX 670 in SLI and I have to say it's been fantastic. No stuttering and pretty smooth. I have had the odd issue waiting for drivers (Far Cry 3) but other than that it seems to work great.

Funny really. Crossfire always felt rather crude to me. Would also love to see some 5xxx results !!

April 4, 2013 | 07:22 AM - Posted by AlienAndy (not verified)

BTW could you also test 3dmark11 please?
