
Frame Rating: GTX 660 vs HD 7870, plus HD 7790, HD 7850, GTX 650 Ti BOOST

Author: Ryan Shrout
Publication: PC Perspective

Summary and Conclusions


Because of the complexity and sheer amount of data we have gathered using our Frame Rating performance methodology, we are breaking it up into several articles that each feature different GPU comparisons.  Here is the schedule:


In this article we have looked at two distinct sets of data.  The first compared the Radeon HD 7870 GHz Edition and the GeForce GTX 660 in both single and dual-GPU configurations, and produced results similar to those we found in our HD 7950 vs GTX 660 Ti and HD 7970 vs GTX 680 articles.  The second set compared only single-GPU configurations at the lower end of the market: the Radeon HD 7850 and HD 7790 as well as the GeForce GTX 650 Ti and GTX 650 Ti BOOST.

We'll dive into each set of results separately. 


HD 7870 GHz Edition and GTX 660 Performance

These results should look familiar to those of you who read our Frame Rating story comparing the HD 7950 and the GTX 660 Ti.  We have found that as we continue down the product stack to lower-performing graphics cards, the problems with AMD's CrossFire technology crop up more often. 

At single-monitor resolutions, tested here at 1920x1080 and 2560x1440, the Radeon HD 7870 CrossFire configuration showed the same problems we saw in our initial article, once again in titles like Battlefield 3, Crysis 3 and Sleeping Dogs.  And, as we saw in our previous piece, problems are now creeping into DiRT 3 and Far Cry 3, games that previously didn't exhibit them.  The issue of runt frames continues to haunt AMD's CrossFire technology.


The AMD Radeon HD 7870 is a great single GPU graphics option

It would appear that as the GPU becomes the bottleneck to a greater degree, which occurs in our testing because we keep the settings the same from the GTX Titan down to the cards tested today, CrossFire produces more runt frames, more often and in more games. 

From a single-GPU perspective, though, the Radeon HD 7870 is indeed faster than the GeForce GTX 660 in nearly all of our tests, whether in single-monitor or multi-monitor configurations.  Buyers who are really only concerned with single-card options might not care about the CrossFire or Eyefinity problems we have witnessed over the past two weeks.


HD 7850, HD 7790, GTX 650 Ti BOOST, GTX 650 Ti Performance

With only single-GPU results to consider here, the results fall in line with what I wrote in the initial GTX 650 Ti BOOST review in late March.  With the release of both it and the Radeon HD 7790 that month, the crowded budget graphics market got a bit more complicated.


The Radeon HD 7790 falls behind the GTX 650 Ti BOOST often

In general, the most competitive cards are the GTX 650 Ti BOOST and the Radeon HD 7850 and they swap "wins" in our benchmarks pretty regularly.  If I had to give the nod to a single card it would be the GTX 650 Ti BOOST with slight wins in BF3, Crysis 3, Far Cry 3 and Skyrim.  The HD 7850 is faster in DiRT 3 and Sleeping Dogs.  But because the GTX 650 Ti BOOST is about $15-20 less expensive than the HD 7850, I think it is the better choice, all else being equal.

The secondary comparison is for the HD 7790 and the original GTX 650 Ti; there is very little to debate here though as the GTX 650 Ti is noticeably slower in basically all the games we tested.  It has a smaller frame buffer, lower performance and is only $10 less, not an investment I would make today.


Final Thoughts

Don't get the wrong impression: this is in no way the end of our Frame Rating stories, and I am only finalizing our first complete set of data today.  The steps to get here have been difficult, but now you have essentially all of the data and numbers that I have been building over the last year, along with a lot of troubleshooting and analysis along the way. 

I think the takeaway from the current state of Frame Rating is that while AMD's single-GPU graphics cards offer fantastic performance per dollar, there is a fundamental problem with CrossFire, and CrossFire plus Eyefinity, that needs to be addressed by AMD.  There is still much research to be done on the full effect of the discrepancies we are seeing in our CrossFire results, but the fact is that something is fundamentally different between CrossFire and SLI, and based on my hands-on time with graphics cards over many years, and over the last 18 months with this capture capability, NVIDIA is doing multi-GPU better. 
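The gap driving these conclusions — a reported frame rate that counts runts versus the frame rate a viewer actually perceives — can be sketched in a few lines.  This is an illustrative toy only: it uses a fixed 21-row runt cutoff for simplicity rather than our percentage-based test, and `frame_heights` stands in for per-frame scan-line counts extracted from the captured video.

```python
def reported_and_observed_fps(frame_heights, capture_seconds, runt_rows=21):
    """Reported FPS counts every frame the driver delivers; observed FPS
    discards runts that occupy only a sliver of the screen."""
    reported = len(frame_heights) / capture_seconds
    observed = sum(1 for h in frame_heights if h > runt_rows) / capture_seconds
    return reported, observed

# A CrossFire-like pattern: every other frame is a runt (5 scan lines tall).
heights = [500, 5] * 30          # 60 frames captured in one second
reported, observed = reported_and_observed_fps(heights, 1.0)
```

With every other frame a runt, the tool reports 60 FPS while only 30 full frames actually reach the screen each second.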


The GTX 660 is the better multi-GPU option today

AMD has promised to fix this, and we are going to hold them to it.  AMD and we differ on how dramatic an impact runt frames and dropped frames have on the user experience, but they are not denying that there is an issue many gamers are seeing.  That's the good news.  The bad news is that it may take until well into the summer to see something from AMD that will start to address the concerns we have.  But better late than never, and we'll continue to push forward.

Our next step is to look at the many user-submitted solutions for this AMD issue, including the full effects of enabling Vsync, applications like RadeonPro, and how we can measure changes in input latency due to frame metering. 

It's going to be a busy summer...

April 5, 2013 | 07:29 PM - Posted by Torrijos (not verified)

Great work! I guess you're as happy as your readers to get to the end of this generation cards.

It would be nice, in order to facilitate readers' choice, to have a summary page with all cards present (maybe with dynamic selection).

What I imagine would be best is the graphs, where the reader selects a game and a screen resolution, and gets all the cards on the same graph (with tick-boxes to remove the cards he isn't interested in).

Graph= ƒ(Game, Screen resolution, GPU selection)

That way a reader looking for the best performance for his screen and favourite game would have all the information needed on a single graph. (you could be motivated to create another graph à-la TechReport taking into account the price)

Anyway Thanks again for the good work!

April 6, 2013 | 02:09 AM - Posted by Luciano (not verified)

thumbs up for this suggestion!
that's what here in Brazil they call "info-graphics" (redundant, I know, but you get the idea).

April 5, 2013 | 08:28 PM - Posted by Cerebralassassin

Great work Ryan!

April 5, 2013 | 08:34 PM - Posted by ThorAxe

Thanks for another great article. As a few others have pointed out, it would be great to see older cards tested, such as the 8800 GTX, HD 4870 and up. I realize that this would be a huge amount of work, but not everyone is a graphics whore like me that upgrades almost every cycle.

April 5, 2013 | 11:21 PM - Posted by Tim Verry

Ah, the good ole' 8800 GTX. My little brother is running one of those still, and when I finally upgrade he'll get my pseudo-6970 (unlocked XFX 6950). Unfortunately he missed out on my 3850 upgrade when it died on me and would no longer output anything to the display. It plays Kerbal Space Program, Minecraft, and most other games (with lots of stuff turned down, of course), so yup, I would like to see Frame Rating through the several generations of graphics cards--even if it's only one card from each of NV and AMD's generations and not their whole lineups (heh, we'd never see or hear from Ryan or Ken for 6 months if they had to Frame Rate all those cards :P).

April 6, 2013 | 02:12 AM - Posted by ThorAxe

I still have a HD 3850, 2x 8800 GTX, HD 4870x2, 2x HD 6870s and 2x GTX 570s SLI in various PCs of my wife and kids . My current gaming PC is 2x GTX 680s SLI. I told you I have a sickness. :)

At one point I ran a 4870x2+4870 in Trifire and it didn't seem as bad as the 6870s in Crossfire if I remember correctly, so it would be interesting to see what impact Trifire has on Observed Frame Rates.

April 6, 2013 | 02:25 AM - Posted by Luciano (not verified)

Your sickness gives you the power of intuition:,299...

AMD fans: "see? you just need to buy a third card! im sure is cheaper than 2 nvidias"

April 6, 2013 | 01:32 PM - Posted by bystander (not verified)

Unfortunately only 1 card was tested, so it's hard to draw definite conclusions. What might have fixed it may be the same thing that fixes 2-way CrossFire: a CPU bottleneck. When you add more cards, the GPUs put more stress on the CPU, and if it can't keep up, it will bottleneck, which has also been shown to reduce stuttering.

More testing is needed. Same with V-sync. Only 2 games were used on the v-sync test. It may be that v-sync only helps due to its FPS limiting component, though I have a feeling it would help anyways, but more testing is needed to be sure.

April 6, 2013 | 01:34 PM - Posted by bystander (not verified)

I meant that only 1 game was tested, not one card.

April 6, 2013 | 02:20 AM - Posted by Luciano (not verified)

Maybe its time for you elite of reviews (pcper, techreport, toms, anand) get together to work on the data generation in gpu reviews.
The value of each reviewer is what he points for us to see more closely and his conclusions.
Toms for instance is saying "just turn on vsync and everything would be ok".
PCPER refuted that.
Thats why god gave us multi-tab browsing.

April 6, 2013 | 11:19 AM - Posted by bystander (not verified)

Unfortunately different people have different preferences. You typically find competitive gamers refuse to use v-sync, because it induces latency and you lose the advantage of high FPS.

And curiously enough, v-sync doesn't work with Eyefinity.

April 5, 2013 | 10:32 PM - Posted by bystander (not verified)

I'm having a hard time finding how you are defining a runt frame. THG has it set to under 21 rows, which they said was the default; are you using the same definition, or something else?

April 7, 2013 | 04:07 PM - Posted by Ryan Shrout

Our versions of the scripts define it based on percentages.  If the frame is 20% or less of the average of the preceding 10 frames, then we consider it a "runt".
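That definition can be sketched roughly like so (a simplified illustration, not PC Perspective's actual extractor script; `frame_heights` would come from counting each frame's colored scan lines in the captured output):

```python
def find_runts(frame_heights, window=10, threshold=0.20):
    """Flag frame i as a runt if its scan-line height is 20% or less of
    the average height of the preceding `window` frames."""
    runts = []
    for i in range(window, len(frame_heights)):
        avg = sum(frame_heights[i - window:i]) / window
        if frame_heights[i] <= threshold * avg:
            runts.append(i)
    return runts
```

Because the cutoff tracks the recent average height rather than a fixed row count, the same test works at low frame rates (tall frames) and high frame rates (short frames).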

April 7, 2013 | 05:11 PM - Posted by bystander (not verified)

Cool, that is actually a clever method. The higher the FPS, the smaller the frames shown will be, so having it scale based on the average is good.

April 11, 2013 | 04:06 PM - Posted by Anonymous (not verified)

With 1080 horizontal "scan lines" on the average screen, calling 21 scan lines a frame is a joke really.

25% of one full screen would be 270 scan lines. Calling a quarter of a screen a "frame" is stretching it quite a lot.

Maybe one fifth the screen is a reasonable cutoff point, or one fourth. Anything less is a joke really, as once 5 "frames" are "sharing" one screen, we see slivered runts in all the examples given.

So I expect some reason for the answer, but I suspect we'll get ridiculous things like 7 or 8 scan lines out of over 1,000 on the screen "counts" as "one frame" or some tiny portion of the whole screen, far less than 20%, less than 10%, less than 5%, as 7 or 8 scan lines is less than 1% !

I can do the math, and do the visuals. Good luck arguing me down to 10% of a 1920x1080, or "108 scan lines" counting as one FPS.

April 12, 2013 | 04:42 PM - Posted by bystander (not verified)

I can't tell if you are arguing or agreeing with my thought.

I will add that as your FPS goes higher, the smaller the slice of a frame will be on average. At 300 FPS, the average frame size will be about 20% of the screen. This is why I like what they did, as it scales based on the average frame size.
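A quick back-of-the-envelope check of that arithmetic, assuming a 60 Hz display with vsync off:

```python
def avg_frame_fraction(fps, refresh_hz=60.0):
    """With vsync off, about fps/refresh_hz frames tear into each scanout,
    so the average frame occupies refresh_hz/fps of the screen."""
    return min(1.0, refresh_hz / fps)

# At 300 FPS on a 60 Hz panel, the average slice is 20% of the screen,
# while a fixed 21-row cutoff is only ~1.9% of a 1080-line screen.
fraction_300 = avg_frame_fraction(300)   # 0.2
fixed_cutoff = 21 / 1080                 # ~0.019
```

This is why a percentage-of-recent-average runt test adapts across frame rates where a fixed scan-line cutoff does not.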

April 6, 2013 | 12:13 AM - Posted by Shambles (not verified)

Amazing timing! Needed exactly this.

April 6, 2013 | 12:31 AM - Posted by lindethier

Good work!

April 6, 2013 | 01:23 AM - Posted by Number1 (not verified)

I might have felt this dragged on a bit too long but I was wrong. It was important to do the tests across the whole line up of current cards. I was mainly concerned that you would end the investigation here, after doing so much work on this, but now that you have stated it is on going I am very impressed. Anyway just thought I'd say good work. Really good work. Very glad to hear that you will be looking at user submitted solutions.

It will be interesting to see if people's claims about RadeonPro etc. are true. I have seen people post pretty impressive results in terms of frame latency using these solutions. I have also been promoting this solution going off other people's reports. Hopefully there aren't too many costs associated with using it (other than setting up individual game profiles, and making sure you do it right).

I agree with other users that it would also be great to check, say, 6950s or 6970s in CrossFire. AMD has suggested things like GCN's memory manager needing work, but AMD hasn't specifically suggested that this is where CrossFire is falling down, and I don't think it's related. That isn't stopping people around the internet from claiming it is the cause, though. So it would be interesting to have a single non-GCN card thrown in, even if it's only tested in two or three games, just to confirm it is AMD's whole CrossFire solution rather than their GCN products and their memory management.

Good work nVidia in the last couple of years for making an SLI solution that works out of the box so well.

Good work to AMD for producing cards that are performing so well in the single card tests. Feeling pretty happy about my 7870 tahiti LE purchase.

April 6, 2013 | 03:33 AM - Posted by alwayssts (not verified)

Believe me when I say I understand that asking for more after all this feels terrible, and I am grateful for the work on these articles, but I do wish Ryan would test Tahiti LE as well as show some 'typical' overclocking now that stock comparisons have been made.

It's very simple to argue LE is one of the most relevant cards for the largest market because of its price/performance and absolute (overclocked) performance at 1080p, even if it's 'limited edition'/highly salvaged (which are usually awkward niche parts) and certainly soon to be replaced by Hainan...which will undoubtedly have a slightly better stock placement relative to the 660 Ti. This is similar to the way '7870' became 7950 in performance through Tahiti LE, 7950B became the old 7970, and the 7970 GHz Edition was given a stock config to beat the NVIDIA competition, which often equaled or beat those original configs. Am I the only one telegraphing that coming in stock above the 660 Ti (which will drop to 250), with partner overclocks (like the 7790 has with 1075/6400MHz) very similar to a stock 670, but probably priced cheaper? Think 250, 270, 300, 330 (or something like that).

I would say I hope we get a 670/Tahiti LE update article, but I feel the 670 is best left for when Hainan launches...and wonder if that is Ryan's thinking as well, considering the deliberate choice to leave it out. I do wish both would've been thrown in the 660 Ti/7950B article, but understand his reasoning. The 670 is in its own price niche, just like the 7870 XT. I know the latter may be a 'tweener, but it really would be interesting to see the results, as I think it will show a nice point of reference/cutoff where most people should be looking to get smooth 1080p HQ game-play.

April 6, 2013 | 01:34 PM - Posted by Number1 (not verified)

Personally I think there is enough data to extrapolate how the gtx670 and tahiti LE will perform.

April 6, 2013 | 10:56 AM - Posted by AlienAndy (not verified)

More epic data !

April 6, 2013 | 04:50 PM - Posted by Joshua B. (not verified)

I am interested to see how crossfire/sli are affected by slower processors. I know that there is no end to the types of configurations out there, but I wonder if causing a bottleneck elsewhere in the system will cause either adverse or positive results.

I also know that the original setup was meant to show the performance of the cards unrestricted, which I say is a good thing when it comes to card-to-card comparisons, but now I find that adding the CPU into the mix (for crossfire/sli) may generate different results, unless you feel that the CPU still doesn't cause any "real" problems as far as perceived framerate goes, e.g. the Bulldozer review.

April 7, 2013 | 02:48 AM - Posted by Howie Doohan (not verified)

Shocking results from AMD. These results almost completely invalidate any price savings that might be had by going crossfire (eg CF 7970's vs SLI 680's). I'm definitely going green for my next multi card rig and won't be giving AMD a look until they pull their thumb out and fix crossfire.

April 7, 2013 | 09:03 AM - Posted by Humanitarian

Very comprehensive look at frame rating, excellent write up.

Really interested to see how some of the software solutions previously suggested will affect this problem.

April 7, 2013 | 11:34 AM - Posted by keromyaou (not verified)

Great work!!
I would like you to test older hardware (such as the HD 5870) to see when this crossfire problem first appeared.
I found one link which might be interesting for you (,299...). Although the test in this link is not sufficient for drawing final conclusions, it suggested that there was more microstuttering in an HD 6870 crossfire setup than in a GTX 560 SLI setup, and also that a triple crossfire setup with HD 6870s alleviated microstuttering greatly. Since their methods were not as refined as yours, they probably were not able to detect some issues (like runts) that you were. But still their data is worth noting, I guess. I think that their data hinted that:

(1) Crossfire issues might be older than HD7000 series.
(2) Triple crossfire setup might be able to alleviate the problem which occurs with dual crossfire setup.

I hope that you have a chance to test triple crossfire setup with your apparatus as well as older gpu cards.

Thank you for your interesting data.

April 7, 2013 | 04:09 PM - Posted by Ryan Shrout

I think we are going to test one or two of the previous generations of cards to see how things scaled throughout.  Right now I am considering the HD 5870 and the GTX 480.

April 7, 2013 | 02:27 PM - Posted by Trey Long (not verified)

Great work. This is exactly the type of investigation that will make ALL companies better and go to a higher level.

April 7, 2013 | 06:27 PM - Posted by Anonymous (not verified)

Great test as always, but I think you should test GTX 580 SLI vs 660 SLI ASAP!
both have around the same performance, but Kepler supposedly deals with frame rate metering with dedicated hardware for that, and the 580 doesn't, both will use the same driver, from the same vendor, so it would be a great test and maybe help to understand how close AMD could get just with software tweaks!?

April 7, 2013 | 10:40 PM - Posted by I don't have a name. (not verified)

That is a huge amount of data presented Ryan. On the whole a killer article, one of the best I've ever read. The idea of testing the older 480 and 5870 should prove very interesting.

Thank you for such an informative, cohesive and thorough and downright geeky article. I loved reading through it. I think I went through the first article about five times just to ensure I had all the concepts fully understood. Awesome. :)

April 8, 2013 | 12:00 AM - Posted by Anonymous (not verified)

I just saw a video from valve discussing a tool called Telemetry from RAD games, it seems like it could be useful for analysis of game performance.

April 8, 2013 | 10:41 AM - Posted by Pendulously (not verified)


You have already noted elsewhere that Percentile (1-99) is insufficient, for tracking large SPIKES in Frame Times, spikes which may occur only once every few seconds (i.e., once every few hundred frames, thus well beyond the 99th percentile threshold).

So you need a different independent variable (x-axis), with 'milliseconds' remaining the dependent variable (y-axis).

In a 10 minute gameplay section (36000 Frames at 60 FPS), 36 Frames is equivalent to the 99.9th percentile (36000 divided by 1000).

If you put '36' on the x-axis (or perhaps multiples of 10), you could easily show people the effects of LARGE SPIKES IN FRAME TIMES. In many cases/circumstances, more illuminating than Frame Variance Graphs which show nothing beyond the 99th percentile.
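The spike-hunting idea above can be illustrated with a toy nearest-rank percentile; the frame-time data here is invented for illustration (a 100 ms spike every ~15 seconds over ten minutes at ~60 FPS):

```python
def nearest_rank_percentile(frame_times_ms, p):
    """Nearest-rank percentile: the value below which p% of samples fall."""
    s = sorted(frame_times_ms)
    k = min(len(s) - 1, max(0, int(round(p / 100.0 * len(s))) - 1))
    return s[k]

# 36,000 frames total; 40 of them are large 100 ms spikes.
times = [16.7] * 35960 + [100.0] * 40
p99  = nearest_rank_percentile(times, 99.0)   # spikes invisible at the 99th
p999 = nearest_rank_percentile(times, 99.9)   # spikes revealed at the 99.9th
```

The 99th percentile still reads a smooth 16.7 ms even though a visible hitch occurs every few seconds, which is exactly the commenter's point about extending the x-axis past the 99th percentile.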

April 8, 2013 | 11:52 AM - Posted by Stennan (not verified)

Are there any redeeming features to AMDs Crossfire solution?

April 8, 2013 | 02:06 PM - Posted by Anonymous (not verified)

Yes, they drop every other frame in crossfire and count it anyway and sell it to their fans as a superior frame rate than SLI.

AMD CF frame numbers sold: 80! (actual screen rate 40 + stutter!)

Wow it sure beats 76 from SLI !

Yes, a very "redeeming" quality.

April 10, 2013 | 09:12 AM - Posted by Stennan (not verified)

Thank you for not answering my question :)

I just wanted to know if there might be an upside with regards to perhaps input lag. Kind of hard to measure though.

April 11, 2013 | 04:17 PM - Posted by Anonymous (not verified)

Ok, here's a few thoughts. With the AMD runts and drops, and the fans now trying to claim "less input lag", my question is of course, "What are you aiming at?" !

With every other frame dropped or runted, you're getting ~50% of the on-screen opponent positions from the frames displayed (using a first-person shooter as an example, of course, since most games are). So your opponent's position on screen is "jumping" from one frame over to two frames ahead, changing position in a lurch. With that kind of inaccuracy in on-screen opponents, how is "less lag" going to help when you cannot see half the positions of the opposition, and aiming at those in-between spots just is not directly possible? You have no visual data for those positions, half the positions of your opponent!

You can see how that may present some real issues for the gamer, as far as firing accuracy.
Yes, one could develop a sort of compensation with practice, just like a similar amount of consistent input lag is gotten used to as the mind and muscles acclimate and compensate accordingly.

So, it appears to me missing frames and runts are anything but helpful, and the lag, then no lag, or the lag not running in sync with what appears on screen, could be a real problem.

With equalized frame time outputs to screen, the "smoothness" factor certainly helps aiming at opponents and learning to time things correctly.

April 10, 2013 | 09:13 AM - Posted by Stennan (not verified)

Thank you for not answering my question :)

I just wanted to know if there might be an upside with regards to perhaps input lag. Kind of hard to measure though.

April 9, 2013 | 06:45 AM - Posted by fadeout (not verified)

Are we seeing that crossfire is inherently borked? Or are we seeing that the current drivers are not leveraging the potential of the hardware?

I'm thinking that drivers can be updated.

A note on the fanboy thingy, surely we need both companies to succeed to push one another to create ever faster and more powerful cards. No?

April 9, 2013 | 10:44 AM - Posted by bystander (not verified)

The problem has persisted at least two series of cards, but based on AMD's response, it likely has always existed. AMD had said they never considered frame metering. Instead, they had been focusing on delivering frames as fast as possible.

That to me would say the drivers weren't borked, per se; they just weren't trying to do anything about it. They weren't even aware of it.

Nvidia has their metering done with special hardware, but AMD believes they can do the same with software, which they probably can, though it may not be as ideal.
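A minimal sketch of the kind of software metering described above (hypothetical logic, not AMD's or NVIDIA's actual implementation; a real driver would derive the target interval adaptively rather than take it as a parameter):

```python
def pace_frames(done_times_ms, target_interval_ms):
    """Toy software frame-metering: hold each finished frame back so
    presentations are at least target_interval_ms apart."""
    presents = []
    for t in done_times_ms:
        prev = presents[-1] if presents else None
        presents.append(t if prev is None else max(t, prev + target_interval_ms))
    return presents

# Classic AFR microstutter: frames finish in pairs (2 ms gap, then 18 ms gap).
done = [0, 2, 20, 22, 40, 42]
paced = pace_frames(done, target_interval_ms=10)
```

The alternating 2 ms/18 ms cadence comes out as an even 10 ms cadence; the cost, as noted elsewhere in this thread, is that delaying presentation adds a little input latency.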

April 9, 2013 | 10:23 PM - Posted by Anonymous (not verified)

Not sure why the 660 is being compared to a 7870. MSRP is over $100 difference. 7850 makes more sense to me but there must be a good reason the 7870 was chosen.

April 9, 2013 | 10:57 PM - Posted by bystander (not verified)

The GTX 660 is the cheaper of the two cards. You must have the GTX 660ti in mind.

April 13, 2013 | 03:32 PM - Posted by Anonymous (not verified)

The reason is a big fat smooch on the behind of amd and their fanboys.
Keep the raging natives a bit less restless.
You see, only 2 pages of comments, they didn't come in screaming bias while their radeon I don't believe any of this and "if it's true" it doesn't matter heads exploded.

They have all been made absolute idiots, as the years of their stupidity is apparent.

The years of them screeching that no one can see over 30 fps anyway are exposed. It doesn't matter what they knew or did not know; they made every excuse and lie and fanboy promo they could over the course of years, and now it is absolutely clear they were seeing only half the frames, gobbled with screen-wrenching runts, but still claimed it was beauty incarnate.

They are fools, and we all know it.

April 11, 2013 | 03:15 PM - Posted by Arek (not verified)

It would be nice to see how a pair of gtx 580's fares, as I have these in my setup. Pretty please! *_*

April 13, 2013 | 07:21 PM - Posted by Johnny Rook (not verified)

I can't stress enough how amazing this is.

I have one or two questions though:

1. Do the ATI/AMD dual-GPU cards like the HD 7990 have the same "problems" as dual cards in SLI?

I have an HD 5970 and I do recognize there are problems as far as stutter is concerned; I have a clear visual "feeling" that something is not right in a very few games, mostly "heavy" titles like Metro 2033 and the new Crysis 3.

2. Is the problem something related to the hardware itself, is it drivers or is it a mix of both?

April 13, 2013 | 07:31 PM - Posted by Johnny Rook (not verified)

Sorry, I meant "as dual cards in Crossfire?"
