
Frame Rating: GeForce GTX 660 Ti and Radeon HD 7950

What to Look For, Test Setup

Because of the complexity and sheer amount of data we have gathered using our Frame Rating performance methodology, we are breaking it up into several articles that each feature different GPU comparisons.

We are back again with another edition of our continued reveal of data from the capture-based Frame Rating GPU performance methods.  In this third segment we are moving on down the product stack to the NVIDIA GeForce GTX 660 Ti and the AMD Radeon HD 7950 - both cards that fall into a similar price range.


I have gotten many questions about why we are using the cards in each comparison and the answer is pretty straightforward: pricing.  In our first article we looked at the Radeon HD 7970 GHz Edition and the GeForce GTX 680, while in the second we compared the Radeon HD 7990 (HD 7970s in CrossFire), the GeForce GTX 690 and the GeForce GTX Titan.  This time around we have the GeForce GTX 660 Ti ($289 on Newegg.com) and the Radeon HD 7950 ($299 on Newegg.com), but we did not include the GeForce GTX 670 because it sits much higher at $359 or so.  I know some of you are going to be disappointed that it isn't in here, but I promise we'll see it again in a future piece!


If you are just joining this article series today, you have missed a lot!  If nothing else, you should read our initial full release article that details everything about the Frame Rating methodology and why we are making this change to begin with.  In short, we are moving away from using FRAPS for average frame rates or even frame times; instead we are using a secondary hardware capture system to record all the frames of our game play exactly as they would be displayed to the gamer, then doing post-process analysis on that recorded file to measure real-world performance.

Because FRAPS measures frame times at a different point in the game pipeline (closer to the game engine), its results can vary dramatically from what is presented to the end user on their display.  Frame Rating solves that problem by recording the video output through a dual-link DVI capture card that presents itself to the test system as a monitor; by applying a unique overlay color to each frame the game produces, we can gather a new kind of information that tells a very different story.
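To make that concrete, here is a minimal sketch in Python of the kind of post-processing such a pipeline can do, assuming the capture has already been decoded into, for each captured frame, the list of overlay colors read down the overlay column. The function names, the 60 Hz capture rate, and the 21-scanline runt cutoff are illustrative assumptions, not the actual Frame Rating/FCAT tooling.

    from itertools import groupby

    CAPTURE_HZ = 60        # capture card refresh rate (assumed)
    RUNT_SCANLINES = 21    # frames occupying fewer scanlines than this count as runts (assumed cutoff)

    def rendered_frame_runs(captured_frames):
        """Collapse the capture into (overlay_color, scanline_count) runs.

        captured_frames: one list per captured frame, holding the overlay color
        sampled on every scanline of that frame.
        """
        all_lines = [color for frame in captured_frames for color in frame]
        return [(color, sum(1 for _ in run)) for color, run in groupby(all_lines)]

    def frame_times_and_runts(captured_frames, screen_lines=1080):
        runs = rendered_frame_runs(captured_frames)
        full = [lines for _, lines in runs if lines >= RUNT_SCANLINES]
        runts = sum(1 for _, lines in runs if lines < RUNT_SCANLINES)
        # A rendered frame's on-screen time is proportional to the scanlines it
        # occupied: a full 1080-line frame at a 60 Hz capture is 1/60 s.
        frame_times_ms = [1000.0 * lines / (screen_lines * CAPTURE_HZ) for lines in full]
        return frame_times_ms, runts

Because every rendered frame carries its own overlay color, contiguous runs of one color in the captured output show exactly how many scanlines that frame actually occupied on screen, which is where runt and dropped frames become visible.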


The capture card that makes all of this work possible.

I don't want to spend too much time on this part of the story here as I already wrote a solid 16,000 words on the topic in our first article and I think you'll really find the results fascinating.  So, please check out my first article on the topic if you have any questions before diving into these results today!

Test System Setup
CPU: Intel Core i7-3960X Sandy Bridge-E
Motherboard: ASUS P9X79 Deluxe
Memory: Corsair Dominator DDR3-1600 16GB
Hard Drive: OCZ Agility 4 256GB SSD
Sound Card: On-board
Graphics Cards: NVIDIA GeForce GTX 660 Ti 2GB, AMD Radeon HD 7950 3GB
Graphics Drivers: AMD 13.2 beta 7, NVIDIA 314.07 beta
Power Supply: Corsair AX1200i
Operating System: Windows 8 Pro x64

 

On to the results! 

Continue reading our review of the GTX 660 Ti and HD 7950 using Frame Rating!!

 


April 2, 2013 | 03:24 PM - Posted by technogiant (not verified)

Absolutely god-awful again for AMD... a couple of 7950s in CrossFire was meant to be about the best bang-for-buck high performance system you could build... oh how deluded, or even worse, misled we have been.

April 3, 2013 | 11:59 AM - Posted by Anonymous (not verified)

Deluded, misled, fanboyed out, raging radeon rightist, screaming psychotic frame rate fanboy, totally full of it, amd marketing borg bot, COMPLETELY FOOLISH IDIOT WHO CANNOT SEE THE GAME ON THEIR OWN ULTRA HIGH REZZ BRAGGADOCIO DISPLAYS WITH MEGA BUCK AMD CARDS....

I cannot even fathom the utter SHAME the CF amd fanboys must be feeling now.

A thousand times on every website we were told lies, by the reviewers, by the fanboys ranting away in the comments section.

Now we know they were all deluded idiots, period.

AMD has claimed utter ignorance, and I certainly do not believe that since it's worse than being just incompetent in writing drivers and making hardware, and extends the incompetence out ever further, into "not knowing what is going on at all" instead of covering it up because monetary concerns rule the day.

It's one giant can of fail, and AMD has gulped the entire 55 gallon drum in one fail swill.

I wonder, will they pay the full price refund for one of the amd cards for every crossfire user, because indeed they certainly owe them. Hopefully a giant lawsuit will make the amd fanboys pocketbooks right, and I could see the penny pinching fanboy scrooges getting behind that 100%.

The next thing I want to know is how many of the LYING BUFFOON IDIOT articles from the hundred review sites that show amd fraps! in crossfire winning will be CHANGED, UPDATED, and in big red letters at the top:

NEW INFORMATION HAS SHOWN WE WERE WRONG. AMD FRAME RATES IN CROSSFIRE ARE HALF OF WHAT WE CLAIMED THEY WERE FOR MANY YEARS. WE HAVE MARKED "FAIL!" IN BIG RED LETTERS ACROSS EVERY SINGLE CHART AND IN EVERY REVIEW WE HAVE EVER POSTED.
WE ARE SORRY OUR EYEBALLS DID NOT LIE TO US AS WE WATCHED THE BENCHMARKS STUTTER LIKE A RETARD ON AMD HARDWARE, BUT OUR FANBOY AMD BRAINS TOOK OVER AND WE SCREAMED VICTORY FOR AMD IN THE HIGHEST OF EMOTIONAL STATES AND RECOMMENDATIONS !

(ROFL - Oh man, they ALL need to resign)

That should only take them a few months to fix on all their websites since the amd underdog fanboyism is so freaking strong it could destroy all the rainforests of the world and induce global warming and nuclear winter concurrently.

O M G - thanks for the years and years of total lies

April 3, 2013 | 02:58 PM - Posted by Anonymous (not verified)

Dude...are you okay?

You have issues.

April 3, 2013 | 07:51 PM - Posted by Anonymous (not verified)

poor bitter guy.. go f*ck yourself and try getting a life. these guys at pcper are doing their job right unlike you causing misery to others making them feel bad..

kudos pcper for a well done review..

kudos to amd and nvidia

in the end. consumer wins the deal!! ^_^

April 4, 2013 | 05:05 AM - Posted by Anonymous (not verified)

poor bitter guy is spot on.

they all deserve to feel bad for spreading their ignorance like a virus.

kudos to amd ? get outta here.

April 5, 2013 | 02:53 AM - Posted by thinkbiggar (not verified)

Assuming the data collection method has validity, which on the surface seems to make sense, then the best we can say here is that AMD and Nvidia are neck and neck in single card performance and AMD has some real CrossFire performance issues. However, I play some games that generate a fair amount of lag with single card performance, and when I add a card in CrossFire I see a huge improvement. Secondly, in CrossFire you see the time in ms is all over the place, not as tight, and us guys know tighter is better right? Well how is your data going as low as 0 ms on the CrossFire setup? This is a clear indication that something in your formulation is wrong. For all we know this variance is actually improving gaming visual effects and reality.

April 9, 2013 | 11:05 PM - Posted by Anonymous (not verified)

If you are not satisfied... get two GeForce GTX 660 Ti and SLI it. AMD cards are not designed to handle two cards (CrossFireX).

April 2, 2013 | 03:32 PM - Posted by Shambles (not verified)

Aha, starting to get into cards in the price range I'm looking for. I'll be greedily going through this article and waiting for the next one with the GTX 660 and 7870 before buying my next card.

April 3, 2013 | 02:41 AM - Posted by alwayssts (not verified)

Unless something is shown to be massively off the rails, the results will repeat the pattern.

That said, I own a 7870 and while I could argue all the clear merits, metrics, and reasoning vs. the 660, the gist of the matter is that it too, while slightly better, is at most (overclocked) sitting right at or slightly under where you want high quality 1080p performance nowadays (and moving forward, considering the PS4 spec). I would recommend you look closer at this article; buy a Tahiti LE if these are out of your budget and overclock it to 7950B speeds... or wait for the inevitable Hainan SKU if length/size is a factor (which should also drive down the price of the 660 Ti to around ~$250, in theory).

If you think that performance level is overpriced now, like I did, just wait. It's going to be a very important market soon because it's analogous to the new consoles (1080p 30/60Hz), so the competition will be incredibly fierce, if not then stagnant for quite a while.

April 2, 2013 | 03:45 PM - Posted by Anonymous (not verified)

Thanks for yet another great article, I can only imagine the amount of work that has gone in to testing all these cards and games at different resolutions. I do hope AMD will fix their broken CrossFire system to balance out the competition, even though I'm not a fan of their cards myself.

That said, it would be nice to see how two Titans would perform in SLI. I swore to stick with a single card, but two of them are starting to tempt me as SLI appears to perform with smooth frame times.

April 2, 2013 | 03:54 PM - Posted by seravia

Is AMD implementing the summer solution for the previous HD6000 series too?

April 2, 2013 | 11:21 PM - Posted by arbiter

It's gonna be a software fix unless they re-release their chipset, which I doubt will happen. Nvidia apparently saw this issue a few years ago and handles it via hardware on their boards. Would expect AMD to address it properly with their next generation chip.

April 2, 2013 | 04:07 PM - Posted by Bob Jones (not verified)

What's up with the FRAPS FPS for Far Cry 3 at 1440p in Crossfire? It's showing 4x the performance of a single card - why do no other review sites see that kind of error?

April 2, 2013 | 05:25 PM - Posted by Ryan Shrout

The numbers you see on that metric are based on re-adding the runt frames as well as the frames completely dropped by the game/GPU driver.  Because all of those dropped frames are put back into the compilation of the average frame rate per second, the original or FRAPS result can sometimes be much higher than what is really shown.

We saw the same result at 5760x1080 in many of our tests in this article as well as in the HD 7970 vs GTX 680 piece.
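As a rough illustration of that arithmetic (the record format below is an assumption for illustration, not PC Per's actual data files): counting every submitted frame gives the FRAPS-style rate, while discarding runts and drops gives the observed rate.

    def fps_pair(frames, wall_time_s):
        """frames: list of dicts like {"status": "full" | "runt" | "dropped"} (assumed format)."""
        raw_fps = len(frames) / wall_time_s                    # every frame the game/driver produced
        shown = [f for f in frames if f["status"] == "full"]   # only frames that reached the display intact
        return raw_fps, len(shown) / wall_time_s

    # A CrossFire-like second in which every other frame is a runt reports
    # double the rate that actually reaches the screen.
    frames = [{"status": "full"}, {"status": "runt"}] * 50     # 100 frames in one second
    print(fps_pair(frames, 1.0))                               # (100.0, 50.0)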

April 2, 2013 | 06:11 PM - Posted by Anonymous (not verified)

It's because pcper and nvidia have together invented a reason to throw out frame numbers, using their subjective and biased opinion to then create an illusion metric they call 'perceived fps'

pcper cannot be taken at face value due to their heavy nvidia involvement.

the bright side is that more credible hardware review websites will soon be giving their results using similar methods without the nvidia influence that pcper suffers from.

April 2, 2013 | 06:38 PM - Posted by bystander (not verified)

What about Toms Hardware, TechReport and AnandTech? Are they all creating a conspiracy? Use your own head, see the data, and tell me it doesn't matter.

April 2, 2013 | 07:18 PM - Posted by truthobfuscated (not verified)

Have been following much of this new method closely. Take your own advice. Tom's has already raised the point that the smaller frames may have value, and that their worth or lack of it is subjective and requires further study. pcper has hopped on board with nvidia and is toeing their line on them having no value.

techreport too is saying the data from fcat is insufficient and more information from the api must be exposed for draw calls. you hear none of that here at pcper/nvidia (which is it, hard to tell any more)

anandtech has reserved any results at all until they can do a full and proper investigation with the fcat tools and will then release their findings.

pcper has attempted to get attention by kowtowing to nvidia in order to release data and riding the gravy train of amd bashing to garner a few page clicks from nvidia fanboys who swallow anything so long as it comes in nvidia-positive or amd-negative flavor.

much more investigation to be done, by reviewers unlike this obvious nvidia biased and sponsored one

April 2, 2013 | 07:30 PM - Posted by bystander (not verified)

While most are not giving final verdicts, they all see it as a problem. How severe of a problem is what they are holding back on.

There are a few things that they aren't even considering in all this either. Having evenly spaced frames, also means evenly spaced input. This gives a smoother feel as well.

How much latency is really added by the metering technology if all it does is make a frame that arrives faster than the previous one wait a little? Once they are at the correct offsets, they should remain fairly close in line, with only occasional adjustments needed.

You also have to consider that while the crossfire setup still gives a decent playing experience, is it giving as good of a playing experience? Reviews are meant to help us pick what is the better experience for the end user, not just to find what is acceptable.

And if you've followed a lot of review sites' articles, you probably have noticed many notes about different games giving stuttery results in Crossfire, and much fewer cases where SLI has been mentioned. They never made a big stink about it in their articles, because they are trying to give unbiased articles, and using quantifiable tests to gauge what is better. Now we finally have quantifiable evidence of Crossfire's problems, so it can be given more attention.

I found I always had to run with v-sync with my 6950's. I was ok with it, until I learned that the added latency was the cause of my nausea.

April 2, 2013 | 07:51 PM - Posted by Shambles (not verified)

The trolling is strong in this one.

April 2, 2013 | 07:36 PM - Posted by bystander (not verified)

I will agree with one thing though. Pcper is definitely not holding back on their opinion, and would probably be more well received if they delivered the results without as much judgement.

While they may be right, a lot of people see it as an attack, and seem to be very defensive about it.

April 2, 2013 | 09:47 PM - Posted by ThorAxe

I realise the need to justify your purchasing decision but accusing PC Per of bias is one helluva stretch.

I have used 6870s in Crossfire and noticed these problems and yet I don't have the same issues with SLI.

The numbers don't lie however much people dislike the results.

Bravo Ryan for calling it as you see it.

April 2, 2013 | 10:40 PM - Posted by bystander (not verified)

I was agreeing with Pcper. The guy I originally responded to was the one who was trying to discredit it.

April 2, 2013 | 11:13 PM - Posted by ThorAxe

Sorry about responding to you bystander, it was intended for the guy you mentioned. :)

April 3, 2013 | 11:35 AM - Posted by Anonymous (not verified)

ROFL !!!!

"Smaller frames may have value" HAHAHHAHAHHAHAAAAAAAAAAAA

Please, you and the amd fanboys at Tom's play with runts and ghosts, that's valuable. A value that could not be understated.
Here let me show you what you game will look like:

-------------------------------------------------
_______________________________________ ___________
------------------------- -------------------------
____________ ___________________________________
- -----------------------------------------------

AWESOME ! RUNTS DO HAVE VALUE ! LET'S GIVE THEM A GOLDEN SHORTBUS AND FRONT OF THE STORE PARKING !

April 2, 2013 | 11:24 PM - Posted by arbiter

Maybe you didn't notice the lot of AMD sponsor ads on the site? As for not being taken at face value, if a site is biased in its reviews, no one will trust it. Yes, the tools were developed originally by Nvidia, but PcPer is working on coding their own tools to get away from those.

April 2, 2013 | 11:33 PM - Posted by Ryan Shrout

Try not to be such a troll.

How about this theory?  Because I have been using these tools, regardless of who developed them, for MUCH LONGER than Tech Report, Anandtech or Tom's Hardware, I have more experience and am much more confident in the results they are showing me.

Accusing me of NVIDIA bias while AMD ads run continuously on our site is also pretty stupid.

April 3, 2013 | 03:15 AM - Posted by andrei (not verified)

Well done Ryan !

You have destroyed all AMD's credibility as planned, without any real proof, just with the help of Nvidia and I mean FCAT ( and who knows what else... )

" I have more experience and am much more confident in the results they are showing me" - You know just what Nvidia wants you to know... AND by the way you are a liar! The first time you mentioned this "new frame metric" you said that it was developed by you and your team, but on the first day of your benchmarks you say the converter is made by Nvidia, and I have to find out from other sites that this whole benchmark is actually Nvidia's product???

Now you can go and count your green money, upss I mean tell your readers Nvidia's propaganda, upss I mean the real objective stuff, you do care about your readers don't you??? as it can be seen you DON'T!

From a site with AMD in its name you became Nvidia's marketing guy (puppet), but that is understandable because AMD does not bribe sycophants as Nvidia does ( How about this theory? ). Reading your articles gives me the impression that they are written by Tom Petersen.

PS:go have fun with project shield, this is the future of gaming!- Nvidia marketing.
By the way you should rename this site to www.nvidia-pcper.com

"Accusing me of NVIDIA bias while AMD ads run continuously on our site is also pretty stupid." - you are not doing this freely! you receive good money for that, now who is pretty stupid?

April 3, 2013 | 06:38 PM - Posted by Josh Walrath

Yeah, well... except that AMD has publicly admitted that they are having issues with CF and will have a fix in place by this summer (in beta form).  Damn guys must be biased against themselves!

April 3, 2013 | 09:59 PM - Posted by ThorAxe

It's no use Josh, these zealots won't listen to reason.

I have a 4870x2 and 2 6870s still running in my kids' PCs but I am not blind enough to call Ryan a liar for exposing Crossfire's deficiencies. Instead I greatly appreciate all the hard work that has gone into these tremendous articles.

It turns out that my eyes were not lying to me.

April 4, 2013 | 09:10 AM - Posted by Noah Taylor (not verified)

I really don't understand how incompetent you must be to accuse Ryan's last few articles of being all Nvidia and meant to destroy AMD. He explains his testing methods, his setups, and creates repeatable, measurable data for us to analyze. If you don't believe it's true, don't read it. I for one have been utterly displaced from my own enthusiasm for AMD's crossfire. I just couldn't understand why I had such a rough gameplay experience at 1440p when my frames were so high. What a mess.

BF3, Crysis 3, and Even Bioshock Infinite: anytime i'm in battles or high character count areas it becomes choppy and skippy, even though it says 100 fps :S

April 8, 2013 | 12:51 AM - Posted by raulduke

meh, useless trolls..

April 3, 2013 | 11:10 AM - Posted by Maester Aemon (not verified)

Was the tinfoil hat bundled with Radeon card or did you buy that separately?

April 2, 2013 | 04:31 PM - Posted by Daniel Masterson (not verified)

I'm rooting for AMD. I like to have a 2 card solution but I just can't do that with Xfire like I can with SLI. Here is to hoping they come out with a decent driver that really shows Xfire can work! Then I might get an 8000 series.

April 2, 2013 | 04:40 PM - Posted by Nigel (not verified)

I'm pleased with the frame rating results for AMD's single GPU solutions. I'll never do crossfire anyway but I'm hoping they fix their drivers for scaling... I can see people steering clear of AMD otherwise.

April 2, 2013 | 06:04 PM - Posted by Anonymous (not verified)

AMD is garbage. We are almost to the point of refusing to sell AMD sh*t. It really has gotten that bad.

April 2, 2013 | 06:12 PM - Posted by Shambles (not verified)

You must have read a different article since this one clearly shows that AMDs single GPU products are better than nVidia at this price point. Since the dawn of SLI/Crossfire it's been a bad idea to go multi-gpu on mid-range (lol $300 mid) parts rather than go with a single higher end GPU.

April 2, 2013 | 06:05 PM - Posted by db87

I'd like to know how long these Crossfire problems have existed. How much is the GCN architecture to blame? How long have we been stupid in believing that Crossfire was just as good as SLI?

Take a HD6990 and a HD4870X2 for example and also add some old games (UT3, Doom 3, Half Life 2).

I have been running SLI for years in the old days. I never used Vsync because it was known to not work great for SLI (that was also my experience). Since I have been using a Radeon HD 5850 OC in Crossfire I use Vsync for all singleplayer games and a framecap for multiplayer games. I always thought it had something to do with the Crossfire implementation and that it might have been optimized for Vsync primarily. I also did notice that Crossfire had some additional tearing. Now these days I know better...
Thanks for these great articles!

April 2, 2013 | 06:41 PM - Posted by bystander (not verified)

I know that FRAPS was showing problems in the 6000 and 5000 series. I also have heard SLI had problems back in the 200 series. I believe it was with 400 or 600 series that they started the frame metering improvements, since they said they started this a couple years ago.

April 2, 2013 | 11:34 PM - Posted by Ryan Shrout

This is a really interesting idea though I don't know how much time I'll be able to devote to older GPUs...

April 3, 2013 | 11:26 AM - Posted by Maester Aemon (not verified)

That would be an interesting test just to see if this issue was introduced with the GCN architecture or if it's an underlying problem of ATI/AMD's multi-GPU implementation.

Maybe without going too far back, a test with a pair of 69xx cards shouldn't take too long ;)

April 3, 2013 | 02:38 PM - Posted by Luciano (not verified)

The next step, after AMD puts an option for evenly distributed frames in the drivers, is going to be to test input lag compared to SLI.
If the results of this future test for both are closer to vsync + triple buffering, then AMD could claim that there never was a problem and the option (of smoothing) was already there for all users.

April 2, 2013 | 09:21 PM - Posted by Anonymous (not verified)

The BF3 discussion mentioned GTX 680 SLI twice above and below the 4th plot. I think GTX 660 SLI was intended.

April 2, 2013 | 11:36 PM - Posted by Ryan Shrout

Thanks, fixed!

April 2, 2013 | 09:39 PM - Posted by Rick (not verified)

I know this is a ton of work, but....

I would like to see how VLIW4 compares to GCN. Is this a new problem with AMD cards, or has this been happening for a long time?

Thanks Ryan for doing all this work, it is a very interesting read.

And keep up the good work on the podcasts!!

April 2, 2013 | 09:44 PM - Posted by Rick (not verified)

Even if it is a quick game review (BF3?) of 6970's in crossfire.

April 2, 2013 | 10:18 PM - Posted by Foosh (not verified)

Crossfire and SLI both are unbearable without using something like Afterburner to meter the frames. I'd like to see that tested.

April 3, 2013 | 12:10 PM - Posted by Anonymous (not verified)

Nope. Only amd's crossfire is unbearable.

SLI is awesome and rocks the smoothness.

Frame rate target is ultra awesome on single card nVidia.

Only amd's crossfire is an utter failure and a giant fps lie.

CF fails completely on half the games, too.
On the other half they have false frame rate double pumped when in reality they display single card rates with massive stutter and tearing.

amd cf is epic fail incarnate

Now all the amd fans can scream once again, "you can't see anything beyond 30 fps, the human eye is incapable of it ! "
Then they can move their human eye benchmark to 60fps
Then to 120 since the 120hz monitors have come out...

In any case, is easy to say on amd crossfire you will get 30 frames MAX with massive stutter, so it is true that the amd fanboy human eye cannot see beyond 30 fps.
It's okay to be on disability with amd cards.

April 2, 2013 | 10:45 PM - Posted by Anonymous (not verified)

Guys!there are too many variable involved in somethi ng this massive .I all brain an angle most forget.what is needed to generate graphic?don't say GPU .next ?is control the vast majority of what is going on .Google sRGB 15 and read his view on color profile.this is just the tip of the iceburg .then you have data fragmentation everywhere always along the way and supposedly seemless.give me a break.nothing cause more trouble then the various fragememtation.at cache level at network level at GPU level.add to this the new .net .then we have bufferbloat everywhere which cause more fragmentation.I tell you this because the main cause of my issue have been at the is level I have been trying to get rid of all those fragmentation point but so far the task is so big its insane .I ain't even started on the subject of color profile converting data for x y z to ?all along with the way wiggle room get by and at the end cause all the chaos pcper notice.but I bet if you ask ms they all say it is lower the human perception (85 ms)so it is a non issue.sadly ms is wrong .a fix?lol try getting the exact value of the black point of you monitor for fun.the white point?etc etc etc.there are a lot of value that shouldnt be on the eye ball guess or random yet a huge amount are.this is a good idea but without doing the same for all variable related to the os it is futile exercise because most of the issue come from the os because of all the compromise .imagine when 4k ITU standard is added to scrgb (?)lol it will be even worst.this is why tablet are popular like nexus 7.you pay 200 $and will look as good most computer at 1080p!ya there are issue .they are the same as they were when I was using xp 10 years ago .ms took 10 years to fix the universal clock.so better be patient on the issue you discover here cause most will be caused by the is (window)

April 2, 2013 | 11:36 PM - Posted by Ryan Shrout

I...I...what...?

April 3, 2013 | 01:40 AM - Posted by alwayssts (not verified)

Some people will never learn that online translators do not work in conversation. ;)

April 3, 2013 | 09:11 AM - Posted by Lord Binky (not verified)

Has anyone really been far even as decided to use even go want to do look more like?

April 2, 2013 | 11:51 PM - Posted by Anonymous (not verified)

You know for a site that's supposed to be hardware centric, you sure dropped the ball.

NOT A SINGLE WORD on the mini GTX 670.

HERE, I'll do some of your homework FFS.

http://rog.asus.com/220932013/graphics-cards-2/picture-gallery-heres-the...

April 3, 2013 | 11:13 AM - Posted by Ryan Shrout

You mean words like this?

http://www.pcper.com/news/Graphics-Cards/ASUS-Finalizes-Mini-ITX-System-...

April 3, 2013 | 01:37 AM - Posted by Trey Long (not verified)

Latency and frame time delivery testing, as more and more sites adopt it, will only improve ALL products. Obviously there is a big problem with Crossfire at the moment, and I have no doubt it will eventually be fixed.
My Titan, btw, has the lowest latency of ANY solution, so that makes me feel even better about my addiction.

April 3, 2013 | 02:57 AM - Posted by Number1 (not verified)

We get the deal: AMD and nVidia are both fairly well matched on single card perf. AMD crossfire has problems if you, for some idiotic reason, don't use a frame limiter like RadeonPro.
Blowing things out a little bit here mate. Time to start looking at the quality solutions that people have been using for a long time and that have been reported on for quite some time. Even Tom's Hardware did a bit on RadeonPro some months ago and how it eliminates micro stutter. Pretty sure the end result was a win for the unofficial 7990 against the GTX 690 in terms of latency and smoothness.

By quality solutions I don't mean ticking the "vsync" box in games then running the game at settings that only get you 40fps. I mean using RadeonPro, setting a frame limit of ~59, playing at settings that get you an average of 60+, and watching how smooth your cards look compared to the competition. Oh, and how input-lag free they remain.

April 3, 2013 | 10:21 AM - Posted by bystander (not verified)

The Toms Hardware article is here: http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-...

You will notice that they used Dynamic v-sync within RadeonPro to fix it at different FPS limited points. It appears to be a v-sync/FPS limiter combo they are using.

It is worth using, but you shouldn't be required to use an FPS limiter or v-sync to fix stuttering; still, it is a solution for now.

April 3, 2013 | 03:22 AM - Posted by Pete (not verified)

Wow what a complete blunder for AMD!!! No wonder they are giving out amazing free games!! To sell their broken CF hardware!! What an eye opener!!

Looks like I'm buying Nvidia this round.

April 3, 2013 | 03:31 AM - Posted by ReWind (not verified)

I have to agree with bystander here, seeing even just a slim article about CF/SLI performance with previous generation cards would have great value, as it would let a lot of people (myself included with my HD5770 CF setup) get some information on what an upgrade to a new CF setup will mean for us.

Having just ordered a Tahiti LE I would love to know if I will get the same amount of stuttering with two cards, just more FPS, or if the stuttering is worse, which in turn would make the upgrade of lower perceived value than what the FPS numbers imply.

April 3, 2013 | 05:17 AM - Posted by rrr (not verified)

Kind of proves TechReport wrong on the 660 Ti vs 7950; according to them a SINGLE 660 Ti was supposed to be much better than the 7950.

Then again, flawed conclusions lead to improvements in methodology and finding REAL problems, so I can't complain about that.

Is the problem with CF hardware in nature? In the last article you mentioned nVidia having built-in hardware elements just to handle the issue, whereas AMD cards just output frames ASAP. So does it mean the issue is unresolvable in drivers and AMD has to improve it hardware-wise in the next generation? Looks like we have to disregard CF as a viable option until that happens, then.

April 3, 2013 | 10:25 AM - Posted by bystander (not verified)

The TechReport articles "were" accurate when they were released, and sparked change from AMD, who went through their drivers to fix the problem.

They didn't prove TechReport wrong. They prove AMD fixed things in single-card configurations.

April 3, 2013 | 05:59 AM - Posted by Cpt. Obvious (not verified)

Why do the graphs for "FRAPS FPS" and "Observed FPS" match exactly except for Crossfire?!

There should be at least little differences between these graphs.

You don't have FRAPS running during the capture, do you?!

April 3, 2013 | 11:15 AM - Posted by Ryan Shrout

No, FRAPS was not running at the same time as the overlay and capture testing.

April 3, 2013 | 08:38 PM - Posted by Cpt. Obvious (not verified)

So what would be the explanation for it?

The graphs match exactly.

If you measure the same scene twice, the results won't be exactly the same.

So you let the bench run, measure with FRAPS.
You let the bench run and measure with FCAT.
The results are EXACTLY the same except for crossfire.

There has to be something wrong.

April 3, 2013 | 07:48 AM - Posted by Anonymous (not verified)

Another article to confirm that with vsync off CFX is nerfed atm...

Which doesn't affect me in the slightest as I don't play with vsync off... I won't play with vsync off, full stop.

Twin Lightning 7970s @ 1400 MHz, 60 fps solid in Crysis 3 with RP 60 DFC and vsync on, and it's butter smooth. Slightly different to the results I see in this article, I might add.

My advice to any CFX user is simply run with vsync on and radeon pro DFC (dynamic frame control) 60 fps with vsync and you won't get 90 % of the issues in these articles.

It is a pity that pcper won't simply do a review with vsync on running radeon pro .. i guess they wouldn't have much to write about then.

April 3, 2013 | 09:09 AM - Posted by Lord Binky (not verified)

I don't think dropped frames are as significant to the viewing experience as the runt frames, which also cause small tears. Also, keeping the image being displayed the most current representation of the game state would seem to support dropping a frame if it gets in the way of a newer frame.

That said, I really want to see some rules of thumb applied to the graphs as to what constitutes perceptible gameplay degradation. For example, in the frame variance chart, I would think there is a slope-based assessment as to the type of problem the person would perceive.

April 3, 2013 | 11:16 AM - Posted by Ryan Shrout

The idea of "rules of thumb applied to the graphs as to what consitutes perceptable gameplay degredation" is a good idea, but it is much harder to actually do.  That is the eventual goal but in all honestly it will likely be an issue that changes for each gaming title.

April 4, 2013 | 01:43 PM - Posted by bystander (not verified)

I was thinking about how your FPS compared to your refresh rate has an effect on what the average frame size would be. This might mean what is considered a runt frame should be something that scales based on FPS. This may also help narrow down when something is not acceptable.
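One hedged way to express that suggestion in code: judge a runt against the average slice of the screen a frame gets at the measured frame rate, rather than against a fixed pixel height. The 1080-line/60 Hz output and the 25% fraction below are arbitrary illustrative choices, not anything the article's tooling actually uses.

    def scaled_runt_threshold(avg_fps, screen_lines=1080, refresh_hz=60, fraction=0.25):
        # Above the refresh rate, each rendered frame occupies on average
        # screen_lines * refresh_hz / avg_fps scanlines of the output signal.
        avg_lines_per_frame = screen_lines * refresh_hz / max(avg_fps, refresh_hz)
        return fraction * avg_lines_per_frame

    print(scaled_runt_threshold(60))    # 270.0 scanlines: a quarter of a full screen
    print(scaled_runt_threshold(150))   # 108.0 scanlines: the cutoff shrinks as FPS rises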

April 3, 2013 | 12:03 PM - Posted by Verticals (not verified)

Yeah, the hidden-agenda accusations are rather bizarre considering the AMD ads, but any tech site is going to get its share of unstable posters. don't let it get to ya.

Anyway, AMD's single gpus came out looking nicer than I would have figured. My experience with previous gens had me expecting to see disparity in observed frames for the single cards as well. I'd also like to see tests on the last gen from both manufacturers, but I know that may not be a practical use of copy.

April 3, 2013 | 12:52 PM - Posted by Anonymous (not verified)

Windows causes more of those issues than everything else combined. That is what I meant with the long post Ryan didn't understand.

April 3, 2013 | 01:00 PM - Posted by Anonymous (not verified)

You have to use RadeonPro and set up a profile for each game being used on Crossfire setups. It's the only way to get them to run smoothly. It's an extra program and a bit of work, but the results are promising regarding frame time. I hope future AMD driver versions have this feature built in.

April 3, 2013 | 01:23 PM - Posted by Anonymous (not verified)

You use nVidia Hardware to measure SOMETHING?
Evilol....

April 3, 2013 | 03:50 PM - Posted by endiZ (not verified)

Hey Ryan, great articles.

Have you tried RadeonPro with Dynamic V-sync Control? It looks like they are using a similar technique to what Nvidia uses with Adaptive Vsync.

April 3, 2013 | 05:06 PM - Posted by Cannyone

I'm really surprised by this article. And that is an ambivalent sort of surprise! On one hand I'm pleased to see that SLI 660 Ti's do so well at 2560x1440. On the other hand I'm dismayed that Crossfire is NOT competitive. So the crux of what I'm feeling is disappointment. Because "competition" is good for me as a consumer.

See, I have a pair of Nvidia (specifically EVGA) GTX 660 Ti's in my computer. And they were chosen because I wanted to run games on a 2560x1440 display at full resolution. But it troubles me to think that AMD isn't being competitive, because that points to a market where Nvidia's dominance is most likely to mean higher prices. Which contributes to the feeling that PC gaming may indeed be on its last legs.

The bottom line, economically speaking, is that while computers are a boon to our lives our "life" itself does not depend on them. So when things get really tight we will find ways to get by without spending money on new hardware. And both AMD and Nvidia will be forced to change to maintain a revenue stream.

... so despite the fact I own Nvidia cards I sure hope AMD can get their act together!

April 3, 2013 | 07:05 PM - Posted by Zarich (not verified)

I enjoyed this article. However, after watching all the comparison videos I learned one thing: ATI and Nvidia do not have the same colors. So out of the box, which card is producing the most accurate colors?

April 3, 2013 | 08:45 PM - Posted by Cpt. Obvious (not verified)

Why do the graphs for "FRAPS FPS" and "Observed FPS" match exactly except for Crossfire?!

There should be at least little differences between these graphs.

You already said, that tests with FCAT and FRAPS weren't made at the same time.

So what would be the explanation for it?

The graphs match exactly.

If you measure the same scene twice, the results won't be exactly the same.

So you let the bench run, measure with FRAPS.
You let the bench run and measure with FCAT.
The results are EXACTLY the same except for crossfire.

There has to be something wrong.

April 3, 2013 | 10:09 PM - Posted by ThorAxe

The only thing wrong is the borked Crossfire. The single GPUs and SLI are doing what they should.

April 4, 2013 | 03:24 AM - Posted by Cpt. Obvious (not verified)

Read. -> Think. -> Answer.

The results can't be exactly the same.
2 Benches, 2 different tools that give different data,
but 1 result? I don't think so.

April 4, 2013 | 09:22 AM - Posted by ThorAxe

Maybe you should re-read what has been written or at least look at the definition of observed frame-rate.

Observed frame rates will indeed (in fact, must) show identical frame rates to FRAPS if all frames are counted as useful. That means there were no dropped or runt frames in the benchmark.

The tools use the same data. Only discarding useless frames would result in a variance. If there are no useless frames then there should not be a variance.

April 4, 2013 | 11:40 AM - Posted by Cpt. Obvious (not verified)

So if you bench the same scene two times, the results will be the same? Interesting.

April 4, 2013 | 01:31 PM - Posted by Tri Wahyudianto (not verified)

Completely no; this is called "blinding and randomization" in journals - health journals, social journals, and others.

We know the same scene never gives the same results across two test runs on any graphics card.

So the same applies to every graphics card tested: bad results, good results, every card has the same chance of getting them.

In the end, we call it a fair comparison.

April 4, 2013 | 11:23 PM - Posted by ThorAxe

And there is your problem. The benchmark is only run once.

In the test run FRAPS displays all frames rendered regardless of size. The video output is then captured via the capture card and the raw AVI file is analyzed. If there are no runts or drops (as is the case with the single GPUs tested) then the observed frame rate must be identical to the FRAPS result.

April 5, 2013 | 03:46 AM - Posted by Cpt. Obvious (not verified)

Wrong. I already asked. Ryan's answer:

April 3, 2013 | 11:15 AM - Posted by Ryan Shrout
No, FRAPS was not running at the same time as the overlay and capture testing.

I am not trolling, I know that AMD has bigger issues with MGPU than Nvidia.
The only thing I am saying:
Something has to be wrong with the graphs, as there always will be slight differences between two bench runs.

April 5, 2013 | 10:47 AM - Posted by ThorAxe

If that is the case then there would normally be a slight variation.

I don't know why it would need to be run separately, though I think the podcast mentioned issues with FRAPS and the overlay being used at the same time.

April 5, 2013 | 03:58 PM - Posted by KittenMasher (not verified)

You're right, there should be at least slight differences between the two runs. However, that doesn't mean those differences, when displayed on a graph made for human consumption, are going to be noticeable. In order for the graphs to be completely accurate, the 90 seconds or so of benchmark would have to be sampled every frame. In at least one of the cases that's about 150 fps, which means the graph would need a width of around 13,500 pixels. You may have noticed that the graphs on pcper are a bit smaller than that.

April 4, 2013 | 04:08 AM - Posted by Tri Wahyudianto (not verified)

Thanks for the objective and numerical article despite the many critics and the subjectivity of readers

well done ryan

thumbs up pcper.com

April 4, 2013 | 04:54 AM - Posted by sLiCeR- (not verified)

I would very much like to read a quick analysis of previous generations crossfire setup (ie: 2x 5850/5870), to try to understand if these issues are only related to the drivers / GCN arch or the crossfire technology itself.

And of course, AMD should have the ability to discuss these results with pcper.com and eventually revisit these tests with the *future* drivers that will address these issues.

Regardless of the impact this kind of article has, it's known today that there are issues with AMD drivers. Even if there are some errors with some tests, if it were not for them we wouldn't even be looking at this. So... well done pcper.com, but don't leave it at this bombshell ;)

cheers.

April 4, 2013 | 09:18 AM - Posted by AlienAndy (not verified)

Thank you for taking the time to do this Ryan. I used to use Crossfire but it never quite felt like what the numbers said was being displayed. What I mean is, when I ran 5770 CF it always seemed jerky and stuttered even though the readings I was seeing were pretty high (as they should have been, just ahead of a 5870).

I swore off of Crossfire after that and just concluded it was down to poor drivers. I'm now using GTX 670 in SLI and I have to say it's been fantastic. No stuttering and pretty smooth. I have had the odd issue waiting for drivers (Far Cry 3) but other than that it seems to work great.

Funny really. Crossfire always felt rather crude to me. Would also love to see some 5xxx results !!

April 4, 2013 | 10:22 AM - Posted by AlienAndy (not verified)

BTW could you also test 3dmark11 please?

April 4, 2013 | 10:27 AM - Posted by Anonymous (not verified)

Results are mostly within the margin of error. The only factor making this possible is if the operating system is causing the issue. Like I posted in my big post, the operating system is the cause, probably unknowingly.

April 4, 2013 | 01:51 PM - Posted by FenceMan (not verified)

Not sure I grasp the whole methodology. If it looks good to me, what is the problem? I have (2) 7950's in CFX mode and I do not see any 'stutter' or problem, so why should it matter if FRAPS tells me 100 fps in Battlefield 3 and this article tells me it's really only 50? Seems to me this is extreme nitpicking of something that isn't even visible to the naked eye and quite irresponsible to make AMD out to be so bad.

April 13, 2013 | 07:48 PM - Posted by Anonymous (not verified)

Yet someone like Kyle at Hardocp has said in the far cry 3 review that there is no doubt nVidia is smoother, and in the former SLI vs CF, that EVERY SINGLE TESTER AND REVIEWER THERE NOTICES SLI IS MUCH SMOOTHER THAN CROSSFIRE, for the past number of years.

He also says it's true some people (slow eyes, slow mind, poor vision, crap monitor, low resolution ) "can't" see the differences.

Oh well, sorry you drew a lousy deck. Some cards drawn are not as good as others.

On the other hand, with them side by side, you'd probably not be a blind, doubting amd fanboy anymore, even if you still claimed you were verbally.

April 4, 2013 | 01:54 PM - Posted by FenceMan (not verified)

Can't quite call this "observed frame rate" if you need a capture card and slow motion video to "observe" it??

April 13, 2013 | 07:49 PM - Posted by Anonymous (not verified)

Yes, you don't, thousands of people report on it daily.

Others who sold their souls to the amd red devil can't see it.

April 4, 2013 | 03:07 PM - Posted by Solix (not verified)

<- Me this round for my rig

I generally have good luck with Nvidia drivers, okay two mid-range cards, let's see... a 192-bit memory bus on your mid-range card?! argh, no thanks, I like anti-aliasing. OpenCL results are... wow okay. Let's check out ATI...

OpenCL is good, I have a dedicated PhysX card, heat and efficiency are... aw nuts. Well okay, I can undervolt a bit, fine. Let's get some CF 7950s... one week later and a good deal more broke, sees this article. Yargh... Oh hey... look, an Nvidia workstation GPU for gaming (Titan right?), maybe compute tasks will... oh never mind, not so good and 1k, ouch. No bitcoin mining for me.

Maybe I should just get a single GPU? Wait... I just bought three monitors for surround gaming.

I think I'll go downstairs and play a console game to relax.

April 4, 2013 | 05:52 PM - Posted by db87

Ryan,

I would like to know if the 'runt fps' are indeed 100% completely useless for the gaming experience. So let's say you take a test person, put him behind a game running 55 fps with 'runts' (read: observed fps is lower), and then put him behind the same game but with the fps capped at the observed fps value. I'd like to know what the test person's experience is under real gaming.

How is the mouse movement? Does It feel smooth? Which scenario does the test person thinks is giving the best gaming experience and so on...

So essentially, are the runt frames so annoying that you are better off with a way lower fps? If so, by what margin? For example I can imagine that 60fps with 'runts' is still better than a 100% solid and consistent 30 fps.

April 13, 2013 | 07:55 PM - Posted by Anonymous (not verified)

Since a normal screen resolution means 1080 horizontal scan lines, how do you feel about 7 or 8 scan lines being called a "valuable frame" for AMD ?
Let's check it with something called logic, since the clueless are here everywhere.

1080/7=
154 runts needed to make a single full screen frame pic JUST ONCE

So in that case, 154 fps give you an effective frame rate of 1 fps. LOL How valuable is 1/154th of the screen in a sliver across it ? That's a runt, a drop is ZERO VISUAL VALUE period.

How about 21 scan lines ? Is that "valuable" ?

1080/21 = about 51 of them to make just one full screen. LOL

So, AMD counts 51 fps, and you get one screen of data - yes I'd say that's a very valuable way for amd to cheat like heck on steroids.

April 4, 2013 | 08:15 PM - Posted by Anon (not verified)

As a competitive gamer, you WANT the frames to start before the previous ones finished. Nvidia is introducing latency with their frame metering crap.....

-coming from a 660ti owner.

April 4, 2013 | 09:14 PM - Posted by icebug

I can't believe the amount of trolling over this article. I consider myself loosely an AMD fanboy when it comes to video cards. I don't really have an issue recommending nVidia cards to friends if that is where their pricing falls, but I personally run AMD. I don't see anything malicious about these articles and I find them very interesting and detailed. It's not going to stop me from buying a 7950 within the next month to replace my 5850 (which, let's be honest, is still pretty darn good for today's games). I will just be happy that AMD knows about the issue and is working on a solution so I can get a nice, fully functional Crossfire setup later when they drop in price.

April 5, 2013 | 03:04 AM - Posted by thinkbiggar (not verified)

OK, AMD admits an issue, so stop showing the comparisons with Crossfire. What is the point in reviewing the same issues time and time again, like something is suddenly going to change as we hit a different price point? This isn't a comparison at this point. If what you say is true then post this as a buyer-beware note when reviewing AMD cards about Crossfire. What you're doing here is the same as reviewing an nVidia card and in every article mentioning AMD Crossfire is broken.

If you want a single GPU card, no issue. If you want multi GPU, go with nVidia at this point or move up to the next higher single GPU AMD. But remember, with CrossFireX, when this fix is in place you can continue to boost your performance by adding the next gen AMD. All nVidia offers is PhysX.

April 5, 2013 | 07:05 AM - Posted by Mac (not verified)

I agree with all of that and would also like to suggest PCper explores the 3rd party apps like radeon pro that fix a lot of these issues. Much more useful than telling us Xfire is broken over and over again

April 5, 2013 | 11:17 AM - Posted by bystander (not verified)

I wouldn't mind seeing 3-way CF/SLI, as an article a while back by THG found 3-way CF helped.

I would like to see RadeonPro tested, but realize that the fixes all include FPS limiting, which obviously isn't ideal.

I would also like to see different ranges of runt frames. 20 pixels high is 2% of your screen. That is really small. You could fairly increase that number some.

April 5, 2013 | 11:52 AM - Posted by Trey Long (not verified)

The exposure of Crossfire problems will force AMD to fix it. HOW IS THIS A BAD THING?

April 5, 2013 | 02:03 PM - Posted by FenceMan (not verified)

The way this is a bad thing:

1. I own a 7950 Crossfire setup which looks perfect to me (I use RadeonPro), and my initial reaction was that I should sell them and buy Nvidia; imagine how this could impact the sales decisions of people just considering AMD. I think the results, while quite accurate in and of themselves, should be better quantified in the real world (let's be honest, can anyone at PCPer really tell the difference between AMD and Nvidia in a blind test?).

2. The graphs completely misrepresent what is happening on the screen. If you look at these graphs you imagine this stuttering mess on the screen, the truth is nothing close to that (as I said I can watch my rig in the real world).

3. It is completely beating a dead horse. We get it, Crossfire has "major issues" when you enhance and slow down the frames to a crawl (which has zero bearing on how it looks in the real world); there's no reason to post the same results for each Crossfire-compatible card, we already know what they are.

I am no AMD fanboy, I want all of the info I can get so next time I drop $1,000.00 on video cards I have all of the available info, I just think the info should be presented in a more realistic form, to me this is all quite inconsequential to real world gaming performance.

April 5, 2013 | 02:29 PM - Posted by bystander (not verified)

The problem isn't that you don't get playable frames in crossfire. It is that you aren't gaining full benefit from crossfire. When choosing between Crossfire and SLI, knowing which will help you more lets you plan ahead.

And if you use v-sync, the limited testing has shown it helps or fixes the issue (only tested on 2 games so far, it could be the FPS limiting aspect that helps, and not v-sync itself).

April 5, 2013 | 02:33 PM - Posted by FenceMan (not verified)

"The problem isn't that you don't get playable frames in crossfire. It is that you aren't gaining full benefit from crossfire."

That right there to me is an accurate and thoughtful representation of what is going on here. That would have been a much more accurate way to present this information instead of the current "AMD Crossfire is a stuttering mess" message.

April 13, 2013 | 08:03 PM - Posted by Anonymous (not verified)

I have been on every review website there is and was for well over a decade, and argued with every fanboy of every type - and NEVER has a single amd fan or otherwise EVER MENTIONED "radeon pro"...

So who cares if NOW, you use it. For YEARS not a single amd fanboy CF user was using it, and if they were THEY LIED AND KEPT IT A SECRET BECAUSE THEY KNEW AMD SUCKS WITHOUT IT.

So who really cares at this point what lying hidden crutch amd needs for a "quicky bandaid patch" for it's crapster crap crossfire epic failure ?

It's a big fat excuse for YEARS of total loserville and lies.

I'll also mention once v-sync and radeon pro hack the frame rate down to 30 or 60, YOU MIGHT AS WELL NOT USE AN AMD CROSSFIRE SOLUTION AT ALL BECAUSE YOU CANNOT USE FRAME RATE POTENTIAL ON IT FULLY, YOU HAVE TO CRIPPLE IT FOR IT TO WORK.

What we need now is a 30fps chart and 60 fps chart with crap amd cards listed running vsync and radeon pro and the chart can tell us which pieces of crap can do 30 and 60 fps and we can THROW OUT EVERY OTHER FRAME RATE EVER LISTED FOR THE CRAP AMD CARDS.

April 7, 2013 | 01:25 PM - Posted by Anonymous (not verified)

Interesting test, if what you have tested is valid, AMD is in deep-sh*t once again. If not, there is some kind of money transaction going on...

Nonetheless, thanks for all your efforts.

April 8, 2013 | 01:23 AM - Posted by PClover (not verified)

Quick Google "geforce frame metering" and you will find out why the nVi cards rarely have runt frames. In fact, nVi cards DO have them. They just delays those frames a bit to match with other good frames' speed, therefore the frame time chart looks good miraculously. And you, the user, will have to deal with input delay.

For me, an AMD card and a 3rd-party FPS cap is the best. No input lag, no stuttering. And the image quality from AMD is always superior.
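For what it's worth, the trade-off being described can be sketched in a few lines: hold any frame that arrives too soon after the previous one until a target interval has passed, which evens out frame-to-frame spacing but delays some frames (that delay is the added input latency). The single whole-run target interval and the sample timestamps below are illustrative assumptions, not NVIDIA's actual metering algorithm.

    def meter(ready_times_ms):
        # Target spacing: average interval over the whole run (a real driver would
        # track a short sliding window; this is simplified for illustration).
        target = (ready_times_ms[-1] - ready_times_ms[0]) / (len(ready_times_ms) - 1)
        presented = [ready_times_ms[0]]
        for ready in ready_times_ms[1:]:
            presented.append(max(ready, presented[-1] + target))  # never show a frame early
        return presented

    # AFR-style alternation: one GPU's frame lands right on the heels of the other's.
    ready = [0.0, 2.0, 16.7, 18.7, 33.3, 35.3, 50.0, 52.0]
    print([round(t, 1) for t in meter(ready)])
    # -> [0.0, 7.4, 16.7, 24.1, 33.3, 40.7, 50.0, 57.4]: even spacing, but some frames
    #    are shown several milliseconds after they were ready.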

April 11, 2013 | 07:14 PM - Posted by praack

wow, it's spot-on data, tough to take in, especially if you've purchased Radeons since the beginning, but it is true.

but GOD People - stop with the flaming!!!.

AMD will have to do a hardware re-do to fix this; I don't really expect anything to come out for a year. I don't really think that anyone misled people on purpose - this is all new testing data.

AMD still makes an ok card- but the issue is with multiple cards.

answer- don't buy multiple lower cost cards with AMD

April 13, 2013 | 03:45 AM - Posted by Anonymous (not verified)

I wish you'd start testing with newer drivers. They have 314.22's out, had 314.21 out, and even a 314.07 all in WHQL, but you're still on beta .07's? I think they even had a 314.14 whql in there.

Many games have been optimized since then. Great data, just want later Nv drivers as you're a few behind at best.
https://www.youtube.com/watch?v=gQ3U2P8ZLz4
15% for 314.07 vs. 314.21 in tomb raider, others show basically the same and in some cases more and this was on an old 9600GT card...LOL. Who knew?
