
Frame Rating: GeForce GTX Titan, GeForce GTX 690, Radeon HD 7990 (HD 7970 CrossFire)

Battlefield 3

Battlefield 3 (DirectX 11)


 

Battlefield 3™ leaps ahead of its time with the power of Frostbite 2, DICE's new cutting-edge game engine. This state-of-the-art technology is the foundation on which Battlefield 3 is built, delivering enhanced visual quality, a grand sense of scale, massive destruction, dynamic audio and character animation utilizing ANT technology as seen in the latest EA SPORTS™ games.

Frostbite 2 now enables deferred shading, dynamic global illumination and new streaming architecture. Sounds like tech talk? Play the game and experience the difference!

Our Settings for Battlefield 3

Here is our testing run through the game, for your reference.


While all three cards are able to keep Battlefield 3 running well at 1920x1080, the FRAPS-based data shows the HD 7970s in CrossFire (standing in for the HD 7990) as the clear performance leader, followed by the GTX 690, with the GTX Titan rounding things out. 


Well, things change quickly around these parts: once any runts or drops are removed, the observed frame rate for the HD 7990 comes down considerably. 


Our plot of frame times from the Frame Rating capture technology shows two interesting items.  First, the HD 7970s in CrossFire produce alternating fast/slow frame times, usually indicative of runts: frames that occupy so few of the screen's scanlines that they do not meaningfully contribute to apparent performance.  Second, even though the GTX 690 has higher frame rates than the GTX Titan, it definitely has more frame time variance, as evidenced by the wider blue band of color in the plot above.
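To make the runt idea concrete, here is a minimal sketch of how captured frames might be classified once per-frame scanline counts have been extracted from the recorded video. This is illustrative only, not the actual extraction pipeline used for this article, and the 21-scanline cutoff is an assumed value.

```python
# Illustrative sketch only: classify frames as displayed, runt, or dropped
# based on how many scanlines each one occupied in the captured output.
# The 21-scanline threshold is an assumption for this example.

RUNT_THRESHOLD = 21  # scanlines; assumed cutoff, not the article's exact value

def classify_frames(scanlines_per_frame):
    """Split per-frame scanline counts into displayed frames, runts, and drops."""
    displayed, runts, dropped = [], [], 0
    for lines in scanlines_per_frame:
        if lines == 0:
            dropped += 1                 # frame never reached the screen
        elif lines < RUNT_THRESHOLD:
            runts.append(lines)          # visible, but too small to matter
        else:
            displayed.append(lines)
    return displayed, runts, dropped

# Alternating full/tiny frames, the CrossFire pattern described above
displayed, runts, dropped = classify_frames([540, 8, 536, 10, 544, 0])
print(len(displayed), "counted frames,", len(runts), "runts,", dropped, "dropped")
```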


Minimum frame rates, after taking out the runts, result in an observed FPS average of about 102 FPS for the HD 7970s, 120 FPS for the GTX Titan, and 140 FPS for the GTX 690. 


But once we look at the variance picture again, we find that the GTX 690 and the GTX Titan have swapped places, with the single-GPU performance of the Titan resulting in a smoother overall experience than even the GTX 690.  Both NVIDIA solutions are drastically better than the HD 7990 / HD 7970s in CrossFire.

 


At 2560x1440 the FRAPS results look pretty similar to those above...


But once we take away the runts and drops, the HD 7970s in CrossFire fall behind both of the NVIDIA GeForce cards.


Ouch, another blanket of color from the Radeon solution that indicates unsmooth and inconsistent frame rates!  If we look just at the NVIDIA side of the equation, we again see a thinner band of color on the GTX Titan results, indicating tighter and more consistent frame times throughout the benchmark run.


Here is an individual run graph for the HD 7970s in CrossFire to help demonstrate how the runts cause the observed frame rates to be lower. 


And the two runs for the GTX 690 and the GTX Titan do not indicate any runts at all...


These minimum FPS percentile charts show some pretty dramatic differences, even between the competing NVIDIA options.  The HD 7970s in CrossFire average around 60 FPS, the GTX Titan around 75 FPS, and the GTX 690 around 95 FPS. 


But the frame variance results, our ISUs (International Stutter Units), once again prove that the single-GPU solution delivers a more consistent and fluid frame time result, with the HD 7970s in CrossFire really separating themselves (not in a good way) starting at the 80th percentile.

 


Even though we only have NVIDIA results for 5760x1080, due to the extreme number of dropped frames on the HD 7990 / HD 7970s, comparing these two options is interesting.  In both FRAPS and observed average frame rates, the GTX 690 shows as the faster of the two options, running ahead of the Titan the entire time.


But the plot tells an interesting story - the frame times on the GTX 690 are not as consistent or as smooth as on the GTX Titan.  They average much lower, based on where the bulk of the blue band resides in comparison to the green band, but the spikes that show up on the GTX 690 are gone completely with the GTX Titan. 


If we look at only the minimum FPS marks we find the GTX 690 to be 33% faster in average frame rate over the entire run, but based on the graph above (and the one below) that isn't the whole story.


Here we see the result of all of those "spikes" in frame times - a pretty sizeable difference in frame variance going from the GTX 690 to the GTX Titan. While the Titan never has more than 2.5 ms of variance from one frame against the running average of the past 20, the GTX 690 has 8-9 ms jumps at times, which will likely cause some noticeable stutter.
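The variance metric described here compares each frame time against the running average of the preceding 20 frames. A minimal sketch of that calculation, assuming this plain reading of the metric rather than our exact analysis code:

```python
# Sketch of the frame variance idea above: compare each frame time to the
# running average of the previous `window` frame times. Assumed reading of
# the metric, not the article's exact implementation.

def frame_variance(frame_times_ms, window=20):
    """Absolute deviation of each frame time from the prior `window`-frame average."""
    variances = []
    for i in range(window, len(frame_times_ms)):
        running_avg = sum(frame_times_ms[i - window:i]) / window
        variances.append(abs(frame_times_ms[i] - running_avg))
    return variances

# A steady 10 ms cadence with one 19 ms hitch shows up as ~9 ms of variance
times = [10.0] * 40
times[30] = 19.0
print(max(frame_variance(times)))  # 9.0
```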

 

There are two takeaways from this first page of results.  First, the AMD Radeon HD 7990 or HD 7970s in CrossFire are not going to compare well to the GTX Titan or GTX 690 in many cases because of the runt and dropped frame issues we have detailed.  Second, while the GTX 690 may be "faster" than the GTX Titan in Battlefield 3, at higher resolutions and especially in multi-monitor situations the GTX Titan looks to provide the better overall experience.

 


March 30, 2013 | 01:40 AM - Posted by John Doe (not verified)

Personally, I'll be the first to say that I could care less about ANY of this shit since I own a Sparkle Calibre 680 and a Galaxy 680 White to complement it.

:)

March 30, 2013 | 01:44 AM - Posted by Anon (not verified)

Just because you can post doesn't mean you have to.

March 30, 2013 | 06:35 AM - Posted by John Doe (not verified)

The same applies to you as well.


March 31, 2013 | 12:02 PM - Posted by Anonymous (not verified)

http://www.youtube.com/watch?v=U8HVQXkeU8U&list=PLLpjKzWIuEfmHcyTF7Qxuh1...

^

Battlefield 4 running at 60FPS, 3K resolution on an AMD Radeon 7990.

PCPER=Fail!

March 31, 2013 | 03:20 PM - Posted by renz (not verified)

60 fps won't matter if it doesn't play as smoothly as it should. Hence the whole point of this article.

April 4, 2013 | 03:22 AM - Posted by Anonymous (not verified)

you mean at a reported 60 fps. as you can see by the charts above, reported versus actual is about a 2:1 ratio. so take that as the radeon running at 30 fps.

April 5, 2013 | 09:50 AM - Posted by Anonymous (not verified)

how stupid is this hole page comparing two 7970's and not a 7990

July 8, 2013 | 09:39 AM - Posted by Anonymous (not verified)

How stupid are you for not spelling whole right?

March 30, 2013 | 02:00 AM - Posted by rezes

you using still old drivers!!

AMD: 13.2 beta 7 ????

March 30, 2013 | 03:38 AM - Posted by Edge86 (not verified)

As you should have noticed, the NV drivers were also older (314.07 & 314.09 for TITAN). This is because he started these time-consuming tests a while ago. At that moment the AMD driver wasn't old. :)

Thanks Ryan for the interesting Frame Rating tests so far (and upcoming). Would it be possible to add 3-way & 4-way MGPU setups in an extra review in future?

Greetings,
Edge

March 30, 2013 | 05:37 AM - Posted by Ryan Shrout

LOL 13.2 beta 7 is SOO OOLLDD!!  :)

All of our testing was completed as of March 15th, and it was the latest driver as of that date.  And trust me, nothing that affects what we are seeing here is changed with the latest beta drivers. 

As for 3/4 card configs, we can do it, its just a time concern now.  

March 30, 2013 | 06:27 AM - Posted by John Doe (not verified)

Talk about being subjective.

Not everyone updates their drivers like they change their underwear, dummy.


March 30, 2013 | 07:03 AM - Posted by Prodeous (not verified)

Totally agreed with you.

March 30, 2013 | 09:38 AM - Posted by rezes

Thank you for answering

March 30, 2013 | 03:46 PM - Posted by technogiant (not verified)

Would be interesting....I've heard elsewhere that tri-CrossFire eliminates stuttering.....at least subjectively....would be good to shine the truth light of "frame rating" on that statement.

March 30, 2013 | 05:27 PM - Posted by arbiter

I doubt it eliminates the stuttering problem; it just increases FPS to a point where you can't see it anymore. nvidia doesn't have stuttering because of what they do in hardware to prevent it, which AMD doesn't, but is supposedly going to fix in July, which would have to be a software fix.

April 2, 2013 | 01:16 AM - Posted by ThorAxe

I'd like to see that too. I used to run 4870x2 + 4870 in Tri-fire and I don't recall stutter. However, I have run 6870s in Crossfire and did notice issues in BFBC2.

I don't have problems with my GTX 570 SLI or GTX 680 SLI PCs.

It would be great to see older cards tested too.

April 1, 2013 | 10:30 AM - Posted by mfpterodactyl (not verified)

Oh, Ryan you rascal! Are you not familiar with the MO of the AMD diehards? These guys are more deluded than the Westboro baptist church.

Let me bring you up to speed.

The latest beta driver, released at 3am the night before, not available on the official AMD website, and available only through obscure links passed around by the AMD diehards, is the JESUS DRIVER THAT WILL FINALLY CHANGE EVERYTHING! It will unlock all the untold power that AMD diehards just KNEW was in their cards all along. This situation repeats EVERY SINGLE TIME a new beta driver is released, publicly or not.

To be perfectly honest you're throwing the diehards way too much of a bone by using beta drivers to begin with. You wouldn't review beta (non-production) hardware, now would you? But that's another discussion for another time.

I really am sorry Ryan for all the venom some people will spit at you now and in the future because you're the first tech journalist in quite a while to actually do some investigative journalism instead of being a press release parrot. There are a lot of people that appreciate all your hard work, please don't forget that. The real value in what you're doing is not just in evaluating current hardware and making better buying decisions, but in shaping the future of the industry. Because of your hard work, future hardware WILL run games more smoothly than they would otherwise, and for that we thank you. Thanks, Ryan.

April 3, 2013 | 12:32 PM - Posted by Anonymous (not verified)

Hope and beta driver change. Praise be the new frame rate!
The future is here and this is the driver we have been waiting for! Honor be upon Catalyst Maker, the oceans are receding and all crossfire rigs now rise with this tiding of good joy!
Banished are the runts frames, in Abu Dhabi's name.
AlluAMDahkbar!

April 1, 2013 | 09:48 AM - Posted by steen (not verified)

I agree, 3/4-way multi gpu setups might show interesting results.

People are getting too hung up depending on their preference. This is genuinely interesting stuff that hasn't been applied by end-user review sites before. Both IHVs have resources that are beyond anyone else's. I don't buy for a minute that any of the major IHVs doesn't analyze the render pipeline in great detail.

March 30, 2013 | 02:22 AM - Posted by pdjblum

Ryan,

Great work you have done. Has Scott at techreport had anything to say about it?

March 30, 2013 | 01:19 PM - Posted by pdjblum

Disappointing that I never get a response from the staff. I have been a loyal reader for some time now, and do not get why you guys cannot once in a while respond.

March 30, 2013 | 01:39 PM - Posted by Josh Walrath

I love you, and I care for you.  I just have no idea what Scott has communicated with Ryan about.

March 30, 2013 | 01:43 PM - Posted by pdjblum

Back at you Josh. Thanks for replying.

March 30, 2013 | 02:22 PM - Posted by Ryan Shrout

I'd rather keep conversations between Scott and I private. 

March 30, 2013 | 03:01 PM - Posted by pdjblum

I was just hoping he was impressed with how far you have taken this thing. No doubt, your work has nvidia and amd taking notice, which is quite a thing.

March 30, 2013 | 05:29 PM - Posted by arbiter

nVidia made some of the tools used in this testing; pcper is in the process of making their own to move away from using the nvidia-made tools

April 2, 2013 | 08:45 PM - Posted by Tom Petersen (not verified)

I watch everything Ryan does:)

March 30, 2013 | 06:03 PM - Posted by Anonymous (not verified)

not trying to be a dick, but ignoring him doesn't make the answer obvious.

April 1, 2013 | 02:41 AM - Posted by CaptTomato (not verified)

Is there romance in the air?

March 30, 2013 | 05:58 AM - Posted by billeman

Hey Guys, these articles really confirm what I saw with a CrossFired HD 6950 setup. I noticed big stuttering (3DMark11 springs to mind) on the dual-card setup even though the FPS were double those of the single-card setup.
With one card it was much smoother even though the FPS were half.
I ended up putting one card in a drawer so that was a big waste of money.
Now running a single GTX TITAN.
I will only be running single GPU from now on so I bought the fastest single GPU...... Sorry AMD, been running you for years but you don't have the fastest GPU. Good luck anyway :)

One exception I would like to mention, I remember running Crysis Warhead with crossfire on 1920x1200, enabling VSYNC,
and being able to run at 60 FPS constantly. This setup didn't seem to be stuttering at all so it would be interesting to test crossfire with VSYNC when the game permits at least 60 FPS constantly.

Cheers, keep up the excellent work!

March 30, 2013 | 06:33 AM - Posted by John Doe (not verified)

Titan is a fucking POS card and is overpriced to Moon.

Buy either one of these:

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?Edp...

http://www.newegg.com/Product/Product.aspx?Item=N82E16814187196

http://www.amazon.com/MSI-DisplayPort-PCI-Express-N680-2GD5/dp/B0090EC9IS

March 30, 2013 | 09:00 AM - Posted by Humanitarian

Opinions errywhere.

March 30, 2013 | 11:38 AM - Posted by bystander (not verified)

Did you even read the article?

March 30, 2013 | 12:01 PM - Posted by John Doe (not verified)

c

March 30, 2013 | 12:01 PM - Posted by John Doe (not verified)

No, I did not.

I could care less about what the article is on about actually, since the TITAN is THE most stupid card I've EVER seen in my life.

The only card that matters is the 7990 in this article.

And EVEN that is barely worthwhile, because of all these skipping frametimes and shit.

My weasel is stronger than your fucking stupid TITAN.

March 30, 2013 | 12:28 PM - Posted by bystander (not verified)

Don't you think you have an obligation to know what you are complaining about before you troll?

March 30, 2013 | 12:30 PM - Posted by John Doe (not verified)

I do.

It's explained right in there over the TechSpot site, over TechReport and ON here and on some other tech sites as well, at least, from what I heard.

My weasel is really hot. So will be my new sports car.

April 1, 2013 | 02:43 AM - Posted by CaptTomato (not verified)

Titan is an excellent card, just hideously overpriced.

March 30, 2013 | 05:31 PM - Posted by arbiter

its his money to spend on what ever he wants.

April 2, 2013 | 10:37 PM - Posted by salesATbossrigsDOTcom (not verified)

You sir, are a f*cking idiot.

May 10, 2013 | 10:08 PM - Posted by Anonymous (not verified)

Eat a fuckin dick John. And shove ya 680 up your arse


March 30, 2013 | 12:21 PM - Posted by billeman

Going to reply to my own comment: it seems people get really excited about this and start trolling and calling people names. Quite disappointing. Now, I'm 41 years old, maybe that's the problem :-) For all those people who are able to give their opinion without being rude to others, kudos, your comments are well appreciated.

March 30, 2013 | 12:33 PM - Posted by Daniel Masterson (not verified)

Here here sir. This guy is 54 years old and has nothing better to do than troll this awesome tech site. If you are interested in a laugh I have started a little thing called the John D. Show. You should check it out!

March 30, 2013 | 12:56 PM - Posted by billeman

-

March 30, 2013 | 12:56 PM - Posted by billeman

-

March 30, 2013 | 12:59 PM - Posted by billeman

John D Geek, the show ? Is that it :)

March 30, 2013 | 08:37 AM - Posted by Mac (not verified)

Maybe I missed it, but is there an explanation as to why the Radeons didn't give any data at the highest resolution in some of those runs? What do you mean by "reliable data"?

March 30, 2013 | 07:06 PM - Posted by Ryan Shrout

Check first article page on Eyefinity.

March 30, 2013 | 09:47 AM - Posted by 6GB Guy (not verified)

Great work Ryan, keep it up. I was considering a CrossFire setup but seeing this has helped reassure me to save up for a Titan ... Or two...

March 30, 2013 | 09:48 AM - Posted by Luciano (not verified)

Huge mistake: a 7990 cannot be "emulated" by a pair of 7970s since the "bridge" is done onboard.

March 30, 2013 | 11:15 AM - Posted by Anon (not verified)

This site is somehow harvesting its momentum with all this AMD bashing.

The unreleased HD7990 can't implement frame metering like Kepler cards do.

Anyway, we don't really know how NV's frame metering affects input lag.

March 30, 2013 | 02:25 PM - Posted by Ryan Shrout

That's....just not true. You can precisely emulate it. There will be very little difference. 

March 31, 2013 | 02:12 PM - Posted by Luciano (not verified)

You are right. I was wrong: third parties used a Lucid chip in the past for dual GPU instead of the AMD reference bridge. But the latency remained the same except for the case where the dual-GPU board was paired with at least a third card:

http://www.tomshardware.com/reviews/radeon-geforce-stutter-crossfire,299...

Interesting note:
Back in 2011, it already took 3 AMD gpus + Lucid Bridge to get the smoothness of 2 nVidia

April 2, 2013 | 01:49 PM - Posted by Truth seeker (not verified)

It also CANNOT be reproduced as the boards that are currently 7970 x 2 are NOT 7990!!!!

The 7990 is going to be based off of a GCN respin similar to the 7790 (respin). The conclusions cannot even accurately refer to the 7990 without pointing out this is an NVIDIA-sponsored site, as the 7990 is not even released.

What a joke to discuss an unreleased part and not only that, reflect so negatively on it.

April 2, 2013 | 01:50 PM - Posted by Truth seeker (not verified)

It also CANNOT be reproduced as the boards that are currently 7970 x 2 are NOT 7990!!!!

The 7990 is going to be based off of a GCN respin similar to the 7790 (respin). The conclusions cannot even accurately refer to the 7990 without appearing to be an NVIDIA-sponsored site, as the 7990 is not even released.

What a joke to discuss an unreleased part and not only that, reflect so negatively on it.

March 30, 2013 | 11:45 AM - Posted by bystander (not verified)

Good read, I like the new charts. They are easy to follow. I think these are the best looking graphs I've seen on the subject, so keep up the good work.

I happen to be a 3D Vision user. I play it with about any game I have and when it doesn't work, I use a Helix mod in most cases.

Anyways, I'd be very interested in seeing how Crossfire/SLI works in 3D, be it HD3D or 3D Vision. Theoretically, I believe their results should be similar to a single card setup in variance, but much faster, though some confirmation would be nice. This is because two images are made for each frame, so each card will start a frame at the same time for delivery at the same time, while something like the Titan would have to create two separate images for each frame.

Maybe just one article on it at some point would be nice just to paint a picture. We wouldn't need a lot of them if my theory is correct.

March 30, 2013 | 02:26 PM - Posted by Ryan Shrout

Hmm, not sure how we could use the overlay for that but we will experiment. 

March 31, 2013 | 03:17 PM - Posted by Luciano (not verified)

Running Tridef 3D in passive mode instead of Shutter Glasses the strangest thing I've discovered is that the same FPS feels faster.

3dVision shutter glasses in surround with 400/500 series (it had to be SLI for surround to work with reference boards) according to simracers had stutter.
But that was fixed using .ini configs that PC racing simulators had documented for quite some time.
Those .ini configs, years later I discovered, lead to the same results as nVidia's smoothing techniques.

This is a 2005 racing simulator:

Render Once Per VSync="0"
Max Framerate="0.00000"
Steady Framerate Thresh="0.00000"
Flush Previous Frame="0"
Synchronize Frame="1.00000"
Delay Video Swap="0"

April 2, 2013 | 06:55 AM - Posted by Swolern (not verified)

Hey Bystander! Long time no talk to. Hope you are doing good!

I second the 3d testing please. I'm running 3d vision surround with Titan SLI and would love to see frame time effects with 3d enabled. To me 3d with Lightboost is much smoother than non-Lightboost so I believe the display used will affect your perceived stutter also.


March 30, 2013 | 12:18 PM - Posted by Xen (not verified)

I think most of us understand that 2x 7970s aren't the same as the upcoming 7990, but we appreciate that Ryan and his team used what is available currently from AMD as a comparison to Nvidia's high end solutions.

Thanks for the review!

Guys, please keep the trashy, immature comments to yourselves.

April 2, 2013 | 01:54 PM - Posted by truth seeker (not verified)

Yes, but the article is deviously worded to portray the unreleased 7990 in a bad light. It's true the 7970 x2 may be pretty bad, but that should be discussed as what it is, not used to portray an unreleased card in a bad fashion!

March 30, 2013 | 12:43 PM - Posted by Brokenstorm

I'm curious, have you considered using a dual socket LGA-2011 with 128GB of RAM for RAMDisk as a temporary storage for the capture system?

Obviously such write speeds aren't needed right now, but should a capture card capable of 4K@60Hz or 1600p@120Hz ever be made, then it might be cheaper to use a RAMDisk than to buy the enterprise SSDs that would be needed to capture at such speeds.
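For a rough sense of the write speeds involved, here is a back-of-the-envelope estimate assuming uncompressed 24-bit RGB capture (the real capture card's pixel format and any compression may change these numbers):

```python
# Rough, assumed numbers: uncompressed 24-bit RGB capture bandwidth.
def capture_rate_gb_s(width, height, fps, bytes_per_pixel=3):
    return width * height * bytes_per_pixel * fps / 1e9

print(capture_rate_gb_s(3840, 2160, 60))   # ~1.49 GB/s for 4K @ 60 Hz
print(capture_rate_gb_s(2560, 1600, 120))  # ~1.47 GB/s for 1600p @ 120 Hz
```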

March 30, 2013 | 12:45 PM - Posted by Anonymous (not verified)

PCPER still shilling hard for nvidia. 7990 isn't even released and he is 'reviewing' it here by using two 7970s.. what a joke this site is.

Without knowing what the 7990 actually has on board or what the drivers will be like for it, he is 'reviewing' it now and using the 7990 name. Never mind that the 7990 is using two of the new Malta cores, not the Tahiti cores in the 7970.

You're a joke, dude. Shill harder for nvidia why don't you. When is the next nvidia card release where you'll once again have their PR man and yourself shilling away in your live stream (free nvidia advert)

Same guy who was releasing information using nvidia's toolset without disclosing everything was being done with the help of nvidia. While decent review sites didn't do anything until they first came out and said the tools are from nvidia.

PCPER is a worthless nvidia shill site.

March 30, 2013 | 01:01 PM - Posted by billeman

The 7990 is 2 7970 GPU's with a PCIe bridge, doesn't sound too different to 2 7970's on different PCIe slots

March 30, 2013 | 01:45 PM - Posted by Josh Walrath

Yeah, the 7990 is just two 7970 GPUs connected via a PCI-E 3.0 bridge chip that then communicates with the system at x16 PCI-E 3.0 speeds.  It might show a small improvement in communicating with each other, but it is going to be minimal.  The problems that AMD have will require a pretty hefty driver revision to smooth things out.

March 30, 2013 | 02:28 PM - Posted by Ryan Shrout

Cool story, bro.

March 31, 2013 | 02:30 PM - Posted by Luciano (not verified)

I was questioning that, but as I discovered it wouldn't make any difference.

And PCPER never backed nVidia in the articles I've read.
I had an AMD 6900 and an nVidia 570, and a frame limiter + adaptive vsync or x-buffering eliminates stutter in the racesims I play, giving the same experience as with the nVidia 500.

But as PCPER says: although a solution is available (with RadeonPro), this is NOT an AMD control panel option and is not supported by AMD.

April 3, 2013 | 12:55 PM - Posted by Anonymous (not verified)

The maker of msiafterburner and evga precision has written the fcat type color bars into the new releases.

The problem for you is amd is fail, and the new evidence is unassailable.

The last cry of the dying, lying cf breed, total annihilation is moments away.

None of us will be awaiting your certainly never forthcoming apology, I however will be enjoying your delicious tears, as the full implications of total epic CF amd fail sinks home into the thick, biased, crank of amd fanboy water carrying bloated skull.

BWAHAHAHAAAAAAAAAAAAAAAAAAAAAA !

You may buy the newly released book at online shops everywhere:

: "Death of the amd fanboy"
The lies, the fantasies, the obstinate denial, years in the stuttering darkness, and the gruesome ending when full exposure and half the frame rate ripped their living guts out. Delicious amd fanboy tears ending. Don't miss it !

March 30, 2013 | 02:28 PM - Posted by John Doe (not verified)

Ryan and Josh are doing a pretty damn fine job at deleting EVERY single comment related to JonnyGuru people.

That should tell you about their psyche.

March 30, 2013 | 04:00 PM - Posted by technogiant (not verified)

The bit I don't understand is why the crossfire configuration doesn't have any problems in....I think it was Dirt 3.

I mean if this is supposed to be something that is fundamentally broken in crossfire it should be the same in ALL games.....so whats going on differently??

March 30, 2013 | 07:33 PM - Posted by Ryan Shrout

My theory is it does not happen as often when the CPU is more of the bottleneck in the game. 

March 30, 2013 | 11:55 PM - Posted by bystander (not verified)

It certainly would be a good thing to test.

March 31, 2013 | 03:25 AM - Posted by Anonymous (not verified)

Isn't that where GPUView from Microsoft comes into play? That seems to be a better tool than this is.

Nvidia's FCAT grabs the information overlay at the point where Fraps takes it, at the beginning of the pipeline, and merges it with the frame output at the end of the pipeline.
It's supplying two pieces of information from different points of the pipeline and presenting them as one.

That's an odd way of measuring things, to say the least. Especially if you're discounting what goes on in the middle.

March 30, 2013 | 04:33 PM - Posted by Anonymous (not verified)

Wow people can be assholes huh? Anyway great job Ryan!!!

March 30, 2013 | 04:52 PM - Posted by Braveheart (not verified)

Loving the framerating reviews in comparison with standard benchmark based testing (Which clearly still has its place)

It's like the difference between testing a car on a rolling road and taking it for a thorough test drive.

Well done to all the team.

March 30, 2013 | 06:13 PM - Posted by technogiant (not verified)

Ryan...if your theory is correct that this problem only occurs with crossfire when the gpu is the primary bottle neck...then this should be very easy to test out just by dropping the graphics settings/resolution and seeing if it goes away? I'm sure you've already thought of doing this.

Cheers...I'm off now to see if granny has learned to suck eggs....lol

March 31, 2013 | 02:44 AM - Posted by technogiant (not verified)

Or vice versa: use Dirt 3 and drop the CPU clock speed and see if the runts come back.

March 30, 2013 | 10:31 PM - Posted by Anonymous (not verified)

EPIC FAIL. Where is Titan SLI comparison?

March 30, 2013 | 10:41 PM - Posted by Ryan Shrout

Not in this article...  I don't see how that affects our results at all??

A pair of Titan cards prices out at $2000, not the $1000 that all three of these cards in our comparison today were around.

March 31, 2013 | 03:27 PM - Posted by renz (not verified)

why is it fail? the point is to compare single gpu frame time vs multi gpu configuration. btw since you want Titan in SLI action why don't you donate one so PCPer can make the test?

March 31, 2013 | 12:04 AM - Posted by Maester Aemon (not verified)

Great job Ryan! Kudos to you and PCPer for going through all this testing.

This confirms what I suspected. I have been using ATI/AMD cards in single and xfire configurations for many years and always felt there was something amiss with dual gpus, but could not pinpoint it before now.

March 31, 2013 | 03:56 AM - Posted by Pendulously (not verified)

The Frame Variance Graph: "...What this does NOT really show are the large hitches in game play seen as the spikes in frame times. Another stutter metric is going to be needed to catch and quantify them directly..."

As an example: If Average FPS (over 60 seconds) is 100, then Total Frames observed over 60 seconds is 6000.

If ONE SINGLE FRAME is above 100ms, then for the y-axis value '100' (milliseconds), the x-axis value will be '99.9833' (percentile), i.e. one minus (1/6000).

If FOUR FRAMES are above 30ms, then for the y-axis value '30' (milliseconds), the x-axis value will be '99.9333' (percentile), i.e. one minus (4/6000).

If TEN FRAMES are above 20ms, then for the y-axis value '20' (milliseconds), the x-axis value will be '99.8333' (percentile), i.e. one minus (10/6000).

Therefore, instead of PERCENTILE on the X-AXIS, you should put NUMBER OF FRAMES on the X-AXIS.

Following our example, for the y-axis value of '100' (ms), the x-axis value will be '1' (frames), for y-axis '30' (ms), the x-axis will be '4' (frames), for y-axis '100' (ms), the x-axis will be '10' (frames), and so on.

March 31, 2013 | 04:01 AM - Posted by Pendulously (not verified)

EDIT: Therefore, instead of PERCENTILE on the X-AXIS, you should put NUMBER OF FRAMES on the X-AXIS. Following our example, for the y-axis value of '100' (ms), the x-axis value will be '1' (frames), for y-axis '30' (ms), the x-axis will be '4' (frames), for y-axis '20' (ms), the x-axis will be '10' (frames), and so on.

April 1, 2013 | 08:09 PM - Posted by Pendulously (not verified)

To put it simply: 'NUMBER OF FRAMES WHICH EXCEED THRESHOLD' should be on the X-Axis, and the THRESHOLD (in milliseconds) will be specified by the corresponding Y-Axis value.
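For reference, the mapping these comments describe can be written out directly; the figures below are just the commenter's own example numbers, under the assumption that the charts are constructed this way:

```python
# Percentile at a frame-time threshold, per the suggestion above (assumed reading).
def percentile_at_threshold(frames_exceeding, total_frames):
    return 100.0 * (1.0 - frames_exceeding / total_frames)

print(percentile_at_threshold(1, 6000))   # ~99.98 -> one frame above 100 ms
print(percentile_at_threshold(4, 6000))   # ~99.93 -> four frames above 30 ms
print(percentile_at_threshold(10, 6000))  # ~99.83 -> ten frames above 20 ms
```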

March 31, 2013 | 07:17 AM - Posted by gloomfrost

Fantastic article Ryan. I think this will be the reference for future benchmarks and keep all GPU manufacturers honest! In the podcast you mentioned having uploaded a non-YouTube video without the compression artifacts of Flash and YouTube. Is there anywhere I can download the video for the Skyrim comparison?

Thanks

April 3, 2013 | 01:03 PM - Posted by Anonymous (not verified)

I think you meant "keep the lying amd con artist CF foobar for years, incapable of getting away with straight out fps lying, by sheer force of facts."

In the case of nVidia, they already gave all of us the benefits of smooth frames for years.

This means of course, all the screaming denials by the amd fanboys and websites have been utter and total crap, for years on end.

I said so, and was attacked relentlessly.
Delicious amd fanboy tears, years of raging failure, total VINDICATION.

March 31, 2013 | 12:01 PM - Posted by Anonymous (not verified)

http://www.youtube.com/watch?v=U8HVQXkeU8U&list=PLLpjKzWIuEfmHcyTF7Qxuh1...

^

Battlefield 4 running at 60FPS, 3K resolution on an AMD Radeon 7990.

PCPER=Fail!

March 31, 2013 | 01:37 PM - Posted by bystander (not verified)

What does that video have anything to do with this article?

April 2, 2013 | 01:58 PM - Posted by Anonymous (not verified)

According to the author the 7970 x2 is bad, yet they are playing nicely!

April 3, 2013 | 06:14 AM - Posted by rrr (not verified)

I still see some very little hiccups throughout the video. Not saying it can't be acceptable for anyone, but there are people, who would like better fluidity than this.


March 31, 2013 | 06:42 PM - Posted by db87

AMD: 13.2 beta 7????

Catalyst 13.2 Beta 7 is from 2/26/2013

The latest drivers are Catalyst 13.3 Beta3, dated 3/20/2013

March 31, 2013 | 07:29 PM - Posted by bystander (not verified)

As said a few times, it takes time to do all the data collecting, analysis, graphs and write an article. They used the most up to date drivers when they started the benchmarking.

Do you think this stuff is done instantly?


March 31, 2013 | 08:00 PM - Posted by Anonymous (not verified)

Honestly, running RadeonPro with DFC and vsync fixes most of these games. I have played Sleeping Dogs at 60fps and it was glass smooth with RP. Most of the CFX problems are without vsync on - which in my opinion is not an option. So I've never had these so-called runt frames affecting my gaming by default. I'm playing Crysis 3 atm with RadeonPro DFC and vsync and it's pure smoothness. Sure the current drivers are a problem for people who game without vsync, but I don't see why anybody would game without vsync.

It's interesting they recommend people not to buy AMD cards just because they can't be bothered to run a 3rd party app. Suggesting things to fix the issue would help gamers more.

April 2, 2013 | 01:59 PM - Posted by Anonymous (not verified)

That wouldn't please the sponsor (nvidia).

April 4, 2013 | 12:01 AM - Posted by Anonymous (not verified)

Well, then, we need a totally new set of benchmarks for the past year plus, where 60FPS with vsync on is the bench, and either it does at the settings or it doesn't.

That means scrapping every single claim 7970 cf overclocked or whatever has higher fps, just throw ALL OF IT OUT FOREVER.

Thanks for "the solution", and you're welcome for the "implications of your solution" mr amd fan boy.

March 31, 2013 | 09:09 PM - Posted by Anonymous (not verified)

All the AMD fanboys are pissed. Oh no.

April 3, 2013 | 08:09 AM - Posted by Panta (not verified)

which puts you in the same category.. a fanboy :|

April 3, 2013 | 01:07 PM - Posted by Anonymous (not verified)

WRONG.

Noticing the lying liars and their rager reactionary scunge brains borging out about the calamity of their full facepalm after YEARS of hearing them scream two amd crossfire For The Win !!!!!! does not a fanboy maketh.


April 1, 2013 | 05:16 AM - Posted by CaptTomato

As I said at Techreport, when it comes to single GPU, framerates are all that matter on most occasions, so don't be put off buying a 7870/7950/7970 because of XF results.
I have a single 7950 and it's killer

April 3, 2013 | 01:14 PM - Posted by Anonymous (not verified)

That's fine but all those years of squacking future proof and buy one amd card now because in the near future you can go with CF and have a huge boost when you need it....

BLAH BLAH BLAH BLAH BLAH is what those endless YEARS of amd justifications now mean.

I tell you one thing, I noticed a year ago how some of these websites started shying away from dual card recommendations, which of course means they all had the info and inkling on this problem with amd crossfire, and no doubt nVidia was hammering their little eardrums and deep down inside they knew - not to mention when one of the did talk, they posted how CF failed in no less than 50% of the popular games they were using in their testing suites... it became IMPOSSIBLE to recommend CF with a clear conscience, but the fanboy in them made certain they included SLI in the "do not recommend" category as well.

This is what I saw develop, and there is no doubt why it went that way. When the crap underdog card is failing miserably but the cover is still not blown, make ceratin to cut down the competition, too, that way, you never really fully recommended the failing dirty amd dog double fps lying CF setups...

It's so sad. It's so facepalm. It has made every single site that does reviews look like clueless fools, and in fact, they WERE for YEARS.
Credibility ?
Not so much, so very little. So very, very little.

April 1, 2013 | 05:26 AM - Posted by CaptTomato

Damn, I tried registering for the forums and I couldn't do it.

April 1, 2013 | 09:43 AM - Posted by svnowviwvn

So where is the "3/31: Radeon HD 7950 vs GeForce GTX 660 Ti (Single and Dual GPU)" review?

It is 4/1 and I do not see it up on the site.

April 1, 2013 | 11:53 AM - Posted by Josh Walrath

Sometimes life does not always cooperate with review release schedules.  I should know, I'm the champ when it comes to that!

April 1, 2013 | 02:12 PM - Posted by Anonymous (not verified)

Nvidia looks good under frame rating with either single GPU or multi-GPU. How does Nvidia do with Physx enabled?

Just wondering...

April 1, 2013 | 04:21 PM - Posted by GarTheConquer

Ryan, thank you so much for bringing this issue to light!
I have an Xfire-7970 setup and because of your research, AMD will have no choice but to remedy this problem (eventually).
Thank you, your hard work is greatly appreciated.

April 1, 2013 | 06:00 PM - Posted by svnowviwvn

Ryan thanks for the very detailed reviews.

I for one would also like to see "Frame Rating" reviews done for all single GPUs, both discrete and integrated. That includes AMD APUs, Intel's HD series, Nvidia's Optimus, and AMD's Enduro. Laptop & desktop, both single GPU and/or CF/SLI.

Since we can now see that Fraps numbers can be suspect and that drivers/implementation can result in runts/missing frames I want to see if any of the above mentioned systems result in lower actual FPS than the Fraps number shows.

April 1, 2013 | 07:23 PM - Posted by Pete (not verified)

Amazing work Ryan. Much appreciated. PCper is now my #1 site for hardware reviews. Can you possibly show Titan SLI frame times in the future? Thanks bud.

April 1, 2013 | 10:13 PM - Posted by seravia

Where is the next part?

3/31: Radeon HD 7950 vs GeForce GTX 660 Ti (Single and Dual GPU)

April 2, 2013 | 07:08 AM - Posted by rezes

In the Skyrim game, Fraps FPS and observed FPS are equal for the AMD Radeon cards, yet there is still frame time variance. How can we explain that situation? Skyrim uses DX9.0; is that the answer?

It seems to be an AMD driver problem only in DX10-DX11.

In the meantime, I am using a GTX 690 in my system. I was just wondering about the cause of the problem.

April 2, 2013 | 02:27 PM - Posted by ZoranICS

Great article guys! I hope AMD is watching and are doing all they can to fix this issue that has been known for years.

I am amazed how many ignorant people actually post in this discussion. It seems that even hard facts aren't good enough for some dumb people. :|

I prefer AMD cards (Though I run a GTX670 at the moment) and hope they fix this ASAP.

April 3, 2013 | 08:07 AM - Posted by Panta (not verified)

I would have thought for sure this article
was going to be testing the ASUS ARES II or the PowerColor Devil 13, to see if there's a difference between CF & single-board dual GPUs.

because we already know clearly how 2 7970s would perform in CF.

puzzled.

April 3, 2013 | 10:09 AM - Posted by DrFramesPerSecond (not verified)

AMD gets OWNED. That's why the nvidia prices are so high, too high. Give us a break, Nvidia!

April 3, 2013 | 01:19 PM - Posted by Anonymous (not verified)

Don't forget the 10% IQ cheat AMD introduced with their 10.10 catalyst and initial 6000 series cards, porting it back to all of the 5000 series.

We don't hear about that because it's still ongoing.

So we have the up to 10% IQ cheating, now add in the 100% cf frame rate cheating... what are we up to in terms of percentage of fps epic fail by amd ?

Who even wants to say, it's so humiliating.

April 4, 2013 | 04:00 AM - Posted by Anonymous (not verified)

I just read the ENTIRE thread, 33 pages, at Overclock:

http://www.overclock.net/t/1377275/pcper-frame-rating-comparison-hd-7950...

The amd fanboys are in TOTAL DENIAL, and there are 2 posters who actually read the articles or gave themselves clue one by studying just a bit.

The ENTIRE 33 pages of comments there has not ONE, I repeat NOT ONE comment that points out in CF in the game they are moaning about that EVERY OTHER FRAME WAS ENTIRELY DROPPED FROM THE END USER SCREEN.

So there you have it - the fanboy brain wins out over all other facts, including the often tried and true massive ignorance, total lack of reading, completely ignorant BLISS accompanied by the raging radeon fanboy screed "It just can't be!"

So their tinfoil DUNCE caps are lofted upon their heads in FULL SPLENDOR. It is absolutely amazing.

It also appears that just one commenter on the entire 33 pages had any inkling that the frames presented to the gamer were ALL CAPTURED in real time, and could be gone through manually frame by frame by frame SO THAT NO OVERLAY FCAT CREATED BY NVIDIA COULD BE A BIAS ISSUE !

So expect, I'd say, about 5 or 10 YEARS before the amd fanboys finally admit runt and dropped frames ever even happened, and 5 or 10 years from now they will repeat the current refrain: " AMD driver problems are a thing of the past and the drivers are equal with nVidia, whose had problems too ! > ...link....(from 5 years ago).

Just remember people, the average human is a C grade, and half the people are DUMBER THAN THAT !

So you have to spell it out EXPLICITLY to them, directly, in simple retard friendly terms... then explain how their conspiracy theory about it's all one big lie is not actually possible, because there are things called FACTS.

AMD was DROPPING EVERY OTHER FRAME IN CROSSFIRE IN SOME GAMES. THAT MEANS THAT FRAPS FPS WAS 100% TOO HIGH! CUT IT IN HALF, YOU HAVE THE EQUIVALENT OF "JUST ONE CARD".
ADD IN THE STUTTER FROM EVERY OTHER FRAME BEING DROPPED OR TOTALLY RUNTED, AND GUESS WHAT AMD FANBOYS ?

If you are a low C, or below that, or a B, or claim to be an A student, who cares, the amd fanboy in you will win out over all the collective intelligent consciousness in the entire multiverse.

April 4, 2013 | 07:35 PM - Posted by Anonymous (not verified)

Hey, calm down ... you're going to need new keyboard too soon.

April 4, 2013 | 07:37 PM - Posted by Anonymous (not verified)

Hey, calm down ... you're going to need new keyboard too soon.
And reading 33 pages? You must have been very angry boy.


April 4, 2013 | 03:19 AM - Posted by Luke Daley (not verified)

Thank you ryan, groundbreaking work !

April 4, 2013 | 09:26 AM - Posted by AlienAndy (not verified)

If I had a pound for every time someone told me that the latest AMD driver fixes everything, I would be a millionaire.

Funny really, as it's impossible to believe when you have AMD cards shoved in your system yet someone doesn't seem to spot all of the issues.

April 4, 2013 | 07:33 PM - Posted by Anonymous (not verified)

So this is the cave where nVidia fanboys sleep?

April 4, 2013 | 07:53 PM - Posted by Anonymous (not verified)

I just love how many fine young CrossFire combo owners come here and in such a polite way tell us the story of all their many-years-long unsolved issues with their expensive, totally useless gfx configs.....I don't know if I should laugh when such nicely mannered "AMD owners" don't react in a more believable way to all this bashing from the green side. Amusing masquerade. Many red masks with green smiles underneath.

April 4, 2013 | 08:06 PM - Posted by George

Ryan,
Don’t worry about the negative and biased comments.
Thank you for this great review, it has opened my eyes to the cause of these problems. And hopefully it offers a new way to review all graphics cards in future, instead of just looking at the highest FPS numbers.
I have always thought a smooth experience is better than fast (high FPS) but choppy visual gameplay.
Hopefully AMD and Nvidia will consider these issues in their next GPU and/or driver releases now that they have been exposed, rather than targeting figures. This means a better gameplay experience for the consumer.
Thank you and keep up the good work.


April 5, 2013 | 04:54 AM - Posted by uartin (not verified)

I think that instead of the percentile curve you could reach a more meaningful result using a derived curve(of the frametime curve).
Let's say that the average is 60 fps.
Now let's say that 20 percent of the frames are 25 ms(40fps).
The difference is how these 25 ms values are spread in the curve. If they are all together or if they are alternated to 17 ms ones, forming saw-like shape in the curve.
You will not have the same feeling stutter-wise
What i want to say is that the percentile graph is not appropriate for the kind of analysis that you are doing. You should use a derived curve, since deriving a function measures how quickly a curve grows (negatively or positively), and this is not measured by the percentile curve. After this you could measure the area of this curve and you could arrive at a single number to measure the amount of stutter. In fact, in this way you would take out of the equation the part of the frametime curve that is below the average but that runs steadily.
Calculating the area of a very saw-like derived frametime curve you would obtain a high number, whereas calculating the area of a smooth (even if varying) derived frametime curve you would get a very low number. This would tell you how smooth the transitions are, not whether the gpu is powerful enough to make the game playable. For this you should check the average fps.
So in the end, if you got decent fps and a very low value for the area of this function, you got a great experience;
if you got decent fps but a high derived-function area value, then you got a stuttery experience.
If you got low fps and a low value, you got an underdimensioned gpu but good smoothness.

April 5, 2013 | 05:13 AM - Posted by uartin (not verified)

I think that instead of the percentile curve you could reach a more meaningful result using a derived curve(of the frametime curve).
Let's say that the average is 60 fps.
Now let's say that 20 percent of the frames are 25 ms(40fps).
The difference is how these 25 ms values are spread in the curve. If they are all together or if they are alternated to 17 ms ones, forming saw-like shape in the curve.
You will not have the same feeling stutter-wise (and here i am not saying anything new)
What i want to say is that the percentile graph is not appropriate for the kind of analysis that you are doing. You should use a derived curve, since deriving a function measures how quickly a curve grows (negatively or positively), and this is not measured by the percentile curve. After this you could measure the area of this curve and you could arrive at a single number to measure the amount of stutter. In fact, in this way you would take out of the equation the part of the frametime curve that is below the average but that runs steadily (something you can't do with the percentile curve).
Calculating the area of the derivative of a very saw-like frametime curve you would obtain a high number, whereas calculating the area of the derivative of a smooth (even if varying) frametime curve you would get a very low number. This would tell you how smooth the transitions are, not whether the gpu is powerful enough to make the game playable. For this you should check the average fps.
So in the end, if you got decent fps and a very low value for the area of this function, you got a great experience;
if you got decent fps but a high derived-function area value, then you got a stuttery experience.
If you got low fps and a low value, you got an underdimensioned gpu but good smoothness.
EDITED: I made some corrections to the post I previously wrote since it is not possible to edit it
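A minimal sketch of the metric proposed in the comment above, assuming the simplest discrete reading of "area of the derived frametime curve" (the sum of absolute frame-to-frame changes):

```python
# Illustrative only: sum of absolute frame-to-frame changes in frame time,
# the simplest discrete version of "area under the derived frametime curve".

def stutter_area(frame_times_ms):
    """Total absolute frame-to-frame change in frame time, in ms."""
    return sum(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))

smooth   = [20.0, 21.0, 22.0, 23.0, 24.0, 25.0]   # slowly rising, no stutter
sawtooth = [17.0, 25.0, 17.0, 25.0, 17.0, 25.0]   # alternating fast/slow

print(stutter_area(smooth))    # 5.0  -> low score despite changing frame times
print(stutter_area(sawtooth))  # 40.0 -> high score, matches perceived stutter
```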

April 8, 2013 | 01:16 AM - Posted by PClover (not verified)

Quickly Google "geforce frame metering" and you will find out why the nVidia cards rarely have runt frames. In fact, nVidia cards DO have them. They just delay those frames a bit to match the pacing of the other good frames, therefore the frame time chart miraculously looks good.

That's nVidia, it's meant to SELL, at crazy price tags of course.
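To illustrate the general idea of frame metering mentioned above, here is a toy sketch; NVIDIA has not published its hardware implementation, so this is only an assumption about the concept, not their actual mechanism:

```python
# Toy sketch of frame metering: if a frame is ready "too soon" after the
# previous one, hold it briefly so presentation intervals even out.
# Conceptual illustration only, not NVIDIA's actual mechanism.

def meter(ready_times_ms, target_interval_ms):
    """Return presentation times spaced no closer than the target interval."""
    presented, last = [], None
    for t in ready_times_ms:
        show = t if last is None else max(t, last + target_interval_ms)
        presented.append(show)
        last = show
    return presented

ready = [0, 2, 20, 22, 40, 42]               # AFR pair: frames finish in bursts
print(meter(ready, target_interval_ms=10))   # [0, 10, 20, 30, 40, 50]
```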
