Frame Rating: AMD Improves CrossFire with Prototype Driver

Manufacturer: Various

A very early look at the future of Catalyst

Today is an important day for AMD.  It marks both the release of the reference design of the Radeon HD 7990 graphics card, a dual-GPU Tahiti behemoth, and the first sample of a change to CrossFire technology that will improve animation performance across the board.  Both stories are incredibly interesting and, as it turns out, they feed off each other in a very important way: the HD 7990 depends on CrossFire, and CrossFire depends on this driver. 

If you have already read our review of the Radeon HD 7990 (or any review using the FCAT frame capture system), you likely came away somewhat unimpressed.  The combination of two AMD Tahiti GPUs on a single PCB with 6GB of frame buffer SHOULD have been an incredibly exciting release for us, one that would likely have produced the single fastest graphics card on the planet.  That didn't happen, though, and our results clearly show why: AMD CrossFire technology has some serious issues with animation smoothness, runt frames, and giving users what they are promised. 

Our first results using our Frame Rating performance analysis method were shown during the release of the NVIDIA GeForce GTX Titan card in February.  Since then we have been in constant talks with the folks at AMD to figure out what was wrong, how they could fix it, and what implementing frame metering technology would mean for gamers.  We followed that story with several more that used Frame Rating to show the current state of performance in the GPU market, painting CrossFire in a very negative light.  Even though some outlets accused us of bias, or insisted that AMD wasn't doing anything incorrectly, we stuck by our results, and as it turns out, so does AMD. 

Today's preview of a very early prototype driver shows that the company is serious about fixing the problems we discovered. 

If you are just catching up on the story, you really need some background information.  The best place to start is our article published in late March, which goes into detail about how game engines work, how our completely new testing methods work, and, very specifically, the problems with AMD CrossFire technology.  From that piece:

It will become painfully apparent as we dive through the benchmark results on the following pages, but I feel that addressing the issues that CrossFire and Eyefinity are creating up front will make the results easier to understand.  As we showed you for the first time in Frame Rating Part 3, AMD CrossFire configurations have a tendency to produce a lot of runt frames, in many cases in a nearly perfect alternating pattern.  Not only does this mean that frame time variance will be high, but it also tells me that the performance gained by adding a second GPU is completely useless in this case.  Obviously the story then becomes, “In Battlefield 3, does it even make sense to use a CrossFire configuration?”  My answer, based on the graph below, would be no.


An example of a runt frame in a CrossFire configuration
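Our Frame Rating capture analysis flags these slivers automatically by measuring how many scanlines each frame actually occupied on screen. Here is a minimal sketch of that FCAT-style classification; the 21-scanline cutoff is an assumption for illustration, as the real threshold is configurable:

```python
# Hypothetical sketch of FCAT-style runt detection. The 21-scanline
# threshold is an illustrative assumption, not PCPer's exact value.
RUNT_THRESHOLD_LINES = 21

def classify_frames(scanline_heights):
    """Label each frame by the number of scanlines it occupied on screen:
    'dropped' (never displayed), 'runt' (tiny sliver), or 'full'."""
    labels = []
    for h in scanline_heights:
        if h == 0:
            labels.append("dropped")   # frame never reached the display
        elif h < RUNT_THRESHOLD_LINES:
            labels.append("runt")      # visible sliver, adds no real animation
        else:
            labels.append("full")
    return labels

# The alternating full/runt pattern typical of unpaced CrossFire:
print(classify_frames([540, 3, 537, 2, 1080]))
# ['full', 'runt', 'full', 'runt', 'full']
```

Every "runt" in that output is a frame the game engine rendered and counted toward its frame rate, yet contributed almost nothing to the animation you see.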

NVIDIA's solution for getting around this potential problem with SLI was to integrate frame metering, a technology that balances frame presentation to the user and to the game engine in a way that enabled smoother, more consistent frame times and thus smoother animations on the screen.  For GeForce cards, frame metering began as a software solution but was actually integrated as a hardware function on the Fermi design, taking some load off of the driver.

Continue reading our article on the new prototype driver from AMD that addresses frame pacing issues in CrossFire!

Until today, AMD did not integrate any kind of frame metering in its multi-GPU solutions; it simply rendered frames as quickly as possible when the game engine asked.  Without doing any analysis, that might seem like the best answer, and it is likely the same conclusion AMD came to.  But as we have proven in our various benchmark results and video comparisons, it just isn't true.  All animations are not created equal.


AMD came to me last week with a prototype driver that integrates a software frame metering, or frame pacing, technology.  What is important here is that AMD has to rebuild the driver pipeline around this software model, so it is going to take some time to get it 100% correct.  Also, because the company started this work over a month ago, the base driver for this prototype is from the 13.2 stack, not the 13.5 used in our Radeon HD 7990 review. 

What changes in the new driver?  A new algorithm continuously measures frame render times to determine how long each frame should be displayed on the screen.  AMD is calling this measurement the game's "heartbeat," and that information is used to insert a delay into the Present() call's return to the game.  The return of Present() is how the game knows a frame has been handed off for display and that the GPU is ready to take on the next one. 

Previously, in GPU-bound instances, AMD was completing Present() calls at almost the same time, to which the game replied with new frames that were similarly close together.  Because the two GPUs render very similar scenes at about the same speed, their frames arrived nearly on top of each other and were presented in an almost completely overlapped way, resulting in the very small slivers of frames shown on the screen: runts.  The fix, essentially, is to add an offset to when frames are presented.


In this diagram, the unmetered display output shows runts because of unevenly paced frames.  The metered output adds a little delay but produces a better overall animation.

As the workload changes, AMD can update the frame delay offset in real time.  If frames begin to take longer to render due to a change in scenery, the driver will add more delay to the next Present() return to keep frame presentation on the screen balanced.  It may seem counter-intuitive that introducing latency into the game engine pipeline makes things smoother, but in truth we are oversimplifying the problem in this explanation. 
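The pacing idea can be sketched in a few lines. This is hypothetical illustrative code, not AMD's driver implementation: track a moving average of recent render times (the "heartbeat") and hold back any Present() return that would otherwise arrive too soon after the previous one:

```python
import collections

# Illustrative sketch of software frame pacing. All names and the
# 8-frame averaging window are assumptions for this example only.
class FramePacer:
    def __init__(self, window=8):
        self.history = collections.deque(maxlen=window)  # recent render times
        self.last_present = None                         # time of last present, ms

    def on_frame_rendered(self, render_time_ms, now_ms):
        """Return how long (ms) to delay this Present() return."""
        self.history.append(render_time_ms)
        target_gap = sum(self.history) / len(self.history)  # the "heartbeat"
        if self.last_present is None:
            delay = 0.0                                   # first frame goes straight out
        else:
            elapsed = now_ms - self.last_present
            delay = max(0.0, target_gap - elapsed)        # hold back a too-early frame
        self.last_present = now_ms + delay
        return delay
```

With two GPUs finishing nearly simultaneously, the second frame's `elapsed` is tiny, so it receives almost a full heartbeat of delay and lands evenly spaced on screen instead of as a runt.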

I asked AMD about a "polling time" associated with this new measurement and was told that it is in fact continuous, thanks to its complete integration into the rendering pipeline.  This will likely add some CPU overhead in the driver, but it appears minimal compared to the work a typical GPU driver is already handling. 

There is still a lot of work to be done on the prototype driver AMD is showing here today, including tweaking the algorithm for individual games and fine-tuning the implementation.  But for a first attempt on a very quick turnaround, we are pretty impressed with the results on the following pages.


Many more results on the coming pages...

AMD is still planning to release this driver in beta form this summer, but I wouldn't be surprised to see the schedule moved up a bit given the pressure from the Radeon HD 7990 release and better-than-expected results thus far.  AMD continues to promise the ability to enable and disable this feature in the control panel, as well as to enable it on a per-game basis, something that NVIDIA hasn't done yet.  There are debates about whether AMD's current method actually has input latency benefits, and we are still working on a way to test that at PC Perspective.


Download the 250MB MP4 from

Reports from most users are telling us that you NEED to download these files for a solid comparison!

Crysis 3 - 13.5 beta vs Prototype 2 Comparison

One thing to note: this fix does not yet address Eyefinity + CrossFire problems.  The prototype and the current implementation of the fix only address single-monitor configurations, due to differences in how the multiple rendered images are composited.  Resolutions up to 2560x1600 are handled by a hardware compositor, while 5760x1080 and higher Eyefinity resolutions use a software implementation that is apparently much more complex (and causes quite a few graphical issues we'll dive into later). 


How We Tested

Our testing used the exact same setup as our recently published Radeon HD 7990 review, except that this time I have dropped the Radeon HD 7970 CrossFire results in favor of new HD 7990 results with the Prototype 2 driver.  Due to limited time, and because Eyefinity results were unaffected by the prototype, you are only going to see 2560x1440 results for now!

Test System Setup
CPU: Intel Core i7-3960X Sandy Bridge-E
Motherboard: ASUS P9X79 Deluxe
Memory: Corsair Dominator DDR3-1600 16GB
Hard Drive: OCZ Agility 4 256GB SSD
Sound Card: On-board
Graphics Cards: AMD Radeon HD 7990 6GB
                NVIDIA GeForce GTX 690 4GB
Graphics Drivers: AMD 13.5 beta (HD 7990)
                  AMD Frame Pacing Prototype 2 (HD 7990)
                  NVIDIA 314.07
Power Supply: Corsair AX1200i
Operating System: Windows 8 Pro x64

What you should be watching for

  1. HD 7990 13.5 beta vs HD 7990 Prototype 2 - Here's the big question - what changes and by how much?  Ideally we want to see more consistent frame times in our Frame Rating system.
  2. HD 7990 Prototype 2 vs GTX 690 - If the driver works as promised, how does performance compare to the GTX 690?
  3. HD 7990 Prototype 2 vs GTX Titan - Same here for the Titan!
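The gap between the two FPS numbers on the following pages comes down to what gets counted. These are simplified formulas of my own, not PCPer's exact pipeline: "FRAPS FPS" counts every frame handed to the driver, while "Observed FPS" discards runts and drops that add no visible animation:

```python
# Simplified illustration of FRAPS FPS vs Observed FPS. The 21-line
# runt cutoff and these exact formulas are assumptions for this sketch.
def fraps_fps(frame_times_ms):
    """Average FPS counting every frame the game submitted."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def observed_fps(frame_times_ms, scanline_heights, min_lines=21):
    """Average FPS counting only frames tall enough to be seen."""
    visible = [t for t, h in zip(frame_times_ms, scanline_heights)
               if h >= min_lines]
    return 1000.0 * len(visible) / sum(frame_times_ms)

times = [16.0, 0.5, 16.0, 0.5]      # alternating full/runt pattern
heights = [700, 3, 700, 3]
print(round(fraps_fps(times)))              # 121 - looks great on paper
print(round(observed_fps(times, heights)))  # 61  - what you actually see
```

In this toy example, half of the "121 FPS" is runts; the observed animation rate is closer to what a single GPU would deliver, which is exactly the pattern our CrossFire captures have shown.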



April 24, 2013 | 12:53 AM - Posted by Randomoneh (not verified)

Wouldn't the 1-line high frames (if all the frames were 1-line high) actually convey the most information - effectively reversely simulating what is known (on cameras) as rolling shutter effect?

April 24, 2013 | 01:12 AM - Posted by Randomoneh (not verified)

Frame time [ms] chart and "Observed FPS" chart seem to contradict each other. In frame time chart, there are ~0.1 ms [partial] frames being shown which are not present in "Observed FPS" chart, probably because they would have to be up there, somewhere around 100000 FPS if I'm not mistaken.

April 24, 2013 | 03:23 AM - Posted by Ryan Shrout

Not sure what you are asking...?

April 24, 2013 | 11:59 AM - Posted by Randomoneh (not verified)

What we call "runt" frames - frames less than x lines high, are pretty useless if they come once in a while BUT if all the [partial, of course] frames were as small as possible, that would mean that you get the most information out of your display.

Let's make an example. 100x100 display, 60Hz refresh rate.

16.67, 16.67, 16.67...
would get you ~1 frame per refresh cycle

Ideal (perfect) for such a display limited by 60Hz refresh rate would be:
0.167, 0.167, 0.167... (100 partial frames per refresh cycle)

Every new (refreshed) line would contain new information, meaning display is utilized to its limits, right? When rotating fast, such framerate would produce the reversed rolling shutter effect.

April 24, 2013 | 04:56 PM - Posted by Anonymous (not verified)

If you guarantee that new information comes at a steady rate, that is a form of frame metering. If you don't guarantee, then information comes in as fast as possible but animation may be compromised like we have seen here. It's a tradeoff.

April 24, 2013 | 08:31 PM - Posted by Tim (not verified)

What you are asking of your PC is to render 100 times the framerate that it currently does, on a 100 x 100 display. Yes, the effect would be buttery smooth and would have essentially no visible tearing, ever. But nobody plays a game on a 100 x 100 display, and even at the fairly normal 1080p (1080 lines x 60fps, to maintain one refresh per horizontal line) that's a required framerate of 64,800fps. While I would like to say my PC gets 64 thousand frames per second in BF3 or Crysis 3, it's simply not going to happen any time soon.

April 25, 2013 | 08:08 AM - Posted by Randomoneh

I'm not asking anything. I'm simply saying that "runt" frames or frames of small height are nothing to be afraid of. World would be a better place if all the frames were that small.

If frame intervals are fairly even, smaller is always better. 10 partial frames on a display is better than two partial frames.

April 25, 2013 | 02:39 PM - Posted by Anonymous (not verified)

Partial frames and runts are not desirable at all. A small number of partials produce the tearing effect that everyone is familiar with. But, if every line on your display represented a discrete moment in time like a rolling shutter, you end up with the shearing or "wobble" effect associated with rolling shutters in digital cameras.

April 25, 2013 | 05:21 PM - Posted by Randomoneh

"you end up with the shearing or "wobble" effect associated with rolling shutters in digital cameras."

Yes, that's what I wrote before and that's better than one frame per refresh cycle.

April 26, 2013 | 09:36 PM - Posted by Tim (not verified)

Along these lines, it would be cool to see a monitor technology capable of variable refresh rate, such that it can refresh upon each new complete frame from the GPU, giving dynamic V-Sync to the monitor based on GPU frames, not the other way around.

April 24, 2013 | 12:07 PM - Posted by Randomoneh (not verified)

Hey, even John Carmack is talking about it here. Search for "rolling shutter" inside the article.

Hope you'll reply with your views. Thanks.

April 24, 2013 | 01:25 AM - Posted by derz

Such a thorough analysis. Good job!

April 24, 2013 | 03:21 AM - Posted by Ryan Shrout


April 24, 2013 | 02:29 AM - Posted by Fishbait

Man Ryan, those video comparisons show a difference of night and day. I congratulate you on doing something incredible, you made an impact in the industry. I am very impressed!

April 24, 2013 | 03:23 AM - Posted by Ryan Shrout

Thanks, it was really annoying to be accused of bias or making stuff up, but I think now that AMD is starting to address it directly, we are more or less vindicated.  

April 24, 2013 | 03:47 AM - Posted by mdgreat (not verified)

But how wrong do you think it is of them to release a card that they know has problems? You asked them to delay the release and they released it anyway, knowing what they know. You can sugar coat it any way you want, but don't you think that's a little corrupt of them, knowing some people will buy those cards and have no clue they're buying something that doesn't perform as advertised?

April 24, 2013 | 04:44 AM - Posted by Anonymous (not verified)

Well, there's nothing really wrong with the hardware. It appears that AMD will be able to fix the issues in due time. If you assume that's true, then people will be able to get the performance that the cards are truly capable sometime in the future. I'm not saying it's completely the right thing to do, but hopefully the kind of person buying a $1000 graphics card does their due diligence and knows what they're getting into.

Let's not forget though that GPUs can be used for more than just gaming. When used for compute applications, the driver problems that AMD has is a non-issue.

April 25, 2013 | 03:54 PM - Posted by Anonymous (not verified)

Their compute drivers and software are crap too, so you're wrong.

April 28, 2013 | 12:06 PM - Posted by Revdarian (not verified)

-.- if AMD's compute drivers are crap, then i don't want to know what are Nvidia's because as of late, Nvidia gets trounced on OpenCL computing.

April 24, 2013 | 09:35 PM - Posted by Anonymous (not verified)

You were always vindicated, except the part about sitting on the info for a full year, giving the amd fanboys the most gigantic break they could ever have hoped to acquire.

Rest well, those of us without the quite severe mental problems already knew all about this years ago, and there are those who will not ever come clean, in their minds, and face reality, and their number is legion.
Remember one of the Ten is - do not falsely accuse thy neighbor - so this is a much more common occurrence than most of us would like to recognize.

I congratulate you for actually forcing the change upon the stubborn you know whoms (amd and their entourage of fanboys)... which by the way, will soon be singing ANOTHER TUNE.

That new sirens song, will be glory and praise on high, to the fps gods, and new rash of verdetrol addiction will no doubt descend upon this and all other similar websites.

Then, though you won't often get direct praise, it will be clear that your "harsh and hate filled bias against amd!!!" will have produced the results the little raging card moaners declared to be the case near serveral years ago - "superior amd hardware!!!" as the fps tales will indeed, as the preliminary shows, loft the broken runted zero dropper card(s) into the lead.

Oh happy days, that will come, after END OF LIFE for amd.
*snicker, it couldn't be sicker amd*

Yes, when the amd cards are finally EOL, the drivers will be "fixed for the moment" and the endless worshipful amd slave will slobber all over you, profusely, in gratitiude.
I'm certain you can hardly wait.

April 24, 2013 | 10:42 PM - Posted by CompetitionFanBoy (not verified)

I'm sure only one graphics card manufacturer will produce the greatest cards and be driven to develop big things, because you know monopolies always turn out the best for the consumer.

April 25, 2013 | 03:57 PM - Posted by Anonymous (not verified)

Don't worry, we have seen that no matter how bad amd gets, a thousand fanboys are screaming for they are the very best.

This latest fail has been years in the running, and this very site sat on it for nearly a year, then was harshly accused of bias for nVidia, since the amd fanboys are so clueless and lazy and don't read, or if they do, their emotions immediately replace any common sense they could muster with the utmost effort.

Yes, you don't need to worry, amd made certain it had loads of hate filled raging amd fanboys, and that brainwashing is not wearing off, ever.

May 27, 2013 | 11:53 AM - Posted by innocent bystander (not verified)

Wow such vehemence... I am reminded of high school and watching the awkward dweeb giggle at his own uncomfortably dull insults. I wonder what horrors you suffered at the hands of an AMD fanboy to instill you with such vitriol. What ever the injustice it has inspired you with a remarkably irrational hatred... Without AMD, nvidia would be selling all their cards at similar premium to their new GTX 780. I personally prefer Nvidia because of their drivers but lately, price and the great game bundles have me thinking of giving AMD a chance for a generation. This is precisely why AMD being healthy is a obvious benefit to all consumers. But regardless you will have plenty of opportunities to hate on AMD and AMD users since now all three major consoles have big red technology under the hood...

June 12, 2013 | 04:33 PM - Posted by Anonymous (not verified)

DUDE, relax. It's just a video card manufacturer. It's not this serious. Not even a little bit.

April 24, 2013 | 04:29 AM - Posted by tekk (not verified)

i read a small tidbit from the techpowerup 7990 review that mentioned these drivers will only be in windows 8 flavor. is that true?

April 24, 2013 | 05:49 AM - Posted by Ryan Shrout

No, that is not correct.  When AMD sent this to press it was initially Windows 8 only but they sent a Windows 7 version on Friday/Saturday.

The fix will definitely be for both Win7 and Win8!

April 24, 2013 | 04:38 AM - Posted by Nigel (not verified)

Great work again Ryan! I was wondering if the prototype driver has improved frame latency for single cards as well?

April 24, 2013 | 11:29 AM - Posted by Ryan Shrout

No, I would say not.  The algorithm that AMD has created for this prototype is very specific to multi-GPU and to single monitor configurations.

April 24, 2013 | 05:04 AM - Posted by Sublym3 (not verified)

Top stuff Ryan

Those videos are (as mentioned above) night and day. AMD must be kicking themselves they didn't see this sooner.

And if I am reading the graphs right they didn't lose any performance in the FPS metric either.

When these drivers are official and work with Eyefinity I will buy the best AMD card available.

Though it is still very worrying that AMD drivers take so long to adapt to their new hardware, have they even updated the memory side of the drivers for GCN yet?

April 24, 2013 | 09:44 PM - Posted by Anonymous (not verified)

Unfortunately. the "newest amd card" will always have the latest renewed and unprecedented set of multiple driver fiasco problems.

Go for EOL amd, then at least the gigantic industrially misshapen driver iron has been pressing away multiple flattenings and rewrinklings into that amd game card fabric, and by chance with a very stiff shot of starch under your collar you'll be fine grimacing and bearing it.

April 25, 2013 | 12:44 PM - Posted by Wild Thing (not verified)

OMG have look at this bloke...
Dude you need some medical help it seems.
Is that you Lonbjerb?

Nice work on the review thanx Ryan.

April 25, 2013 | 04:02 PM - Posted by Anonymous (not verified)

If stating the truths we have all seen is considered a mental problem for you, then you of course are a raging fanboy in denial.

As I've told you fools many times before, the ONLY way we have ever seen amd improve their drivers is with strong criticism from review sites - unfortunately the last round of harshness from hardocp with terry mekdon in there concerning crossfire has turned out to be this lousy dropped and runted frames "fix".

So yes, even I was wrong when I praised hardocp for finally complaining about crossfire failure and amd "fixing it".

Instead, "the fix was in".

These are facts, not "blokes opinions", and certainly not omg look at this bloke crap that you spewed.

MAN UP and face the facts, like a man, not a crybaby fanboy filled with lies.

April 25, 2013 | 04:13 PM - Posted by Anonymous (not verified)

Unfortunately. the "newest amd card" will always have the latest renewed and unprecedented set of multiple driver fiasco problems.


If stating the truths we have all seen is considered a mental problem for you, then you of course are a raging fanboy in denial.

If you're going to present that as a fact, you'd best back that up with references, or you'd be no better than any fanboy. You're stating that every release of a new AMD card (or lets say, new architecture) is plagued with driver troubles, and implying that such is not the case with NVidia cards, right? If it's as bad as you say, you'd have no problems finding sources that tell it like that.

May 8, 2013 | 12:01 AM - Posted by PClover (not verified)

Running crossfire setup with RadeonPro, everything is fine.
Dude, go with nVi, you pay higher price and get better support. We save our bucks with our AMD and a little tweak, it's fair and square.

April 24, 2013 | 05:15 AM - Posted by Mountainlifter

A small error in the last sentence on the Far Cry 3 page, Ryan. It should be "followed by the GTX 690" , not titan.

April 24, 2013 | 11:31 AM - Posted by Ryan Shrout


April 24, 2013 | 07:46 AM - Posted by rrr (not verified)

More fantastic job, PCPer is now my goto sites for GPU reviews, maybe alongside with [H].

April 24, 2013 | 10:47 AM - Posted by Nathan (not verified)

Will the fixes AMD is applying here also apply to 6xxx crossfire setups?

April 24, 2013 | 11:25 AM - Posted by Josh Walrath

I would imagine so, it is a completely software driven utility, so it should be able to be ported to previous CrossFire based cards.

April 24, 2013 | 09:46 PM - Posted by Anonymous (not verified)

lol- with very different architectures... by the ghost town cat amd team...

May I have 10,000 puts on that prospective with 100x leverage please.

April 24, 2013 | 11:29 AM - Posted by Ryan Shrout

I think eventually that might be the case, but I don't think it is yet.  This driver we tested was VERY specific in its hardware support because it is so early in development.

April 25, 2013 | 04:52 AM - Posted by Nilbog

It would be very interesting to see 6900 results whenever you get a new version

April 24, 2013 | 11:06 AM - Posted by Martin Trautvetter

Thanks for putting in the extra work and compiling the videos, the differences are startling and impressive!

April 24, 2013 | 12:11 PM - Posted by drbaltazar (not verified)

If you install as per amd instruction?everything is way worst then if you install the Microsoft way (a LA compatibility) like clean install)ms not speaking to amd or vice versa,,methodology whise?ya!again corp trying to outsmart os maker!

April 24, 2013 | 12:20 PM - Posted by Randomoneh (not verified)

Hey Ryan, is it possible for you to upload the XML files so we could take a look at times in ms for every frame?

April 26, 2013 | 03:23 PM - Posted by Ryan Shrout

Yeah, I think this is something we can start to include for readers.

April 27, 2013 | 01:25 PM - Posted by Randomoneh

Splendid! I hope I was clear: I wanted FCAT / RivaTuner Statistics Server timings, not FRAPS timings.

I think your Indiegogo project works so well exactly because you are willing to take advice / requests from us.

Thank you for that.

April 24, 2013 | 12:21 PM - Posted by drbaltazar (not verified)

Isn't the value set to 7.8 microsecond on most motherboard?if so sweat spot would be what 10 12 microsecond for Intel(sorry my processor math is very rusted might need Intel for accurate value.

April 24, 2013 | 01:06 PM - Posted by MarkT (not verified)

These are the articles that set pcper apart from everybody else.

The articles that put in the hard work and analysis are the articles that get u avid readers.

Ryan and crew keep it up!

April 24, 2013 | 01:24 PM - Posted by ATIMarcps (not verified)

When out this driver?

April 24, 2013 | 09:48 PM - Posted by Anonymous (not verified)

Late June, July, Aug...

April 24, 2013 | 01:44 PM - Posted by yasamoka (not verified)

Amazing article. Thorough, detailed, excellent read! Thanks!

Have 2x 7970s here, I use VSync all the time so no microstutter. I am currently downloading these drivers to test.

Thumbs up AMD as well!

April 26, 2013 | 03:24 PM - Posted by Ryan Shrout

Thank you!

April 24, 2013 | 05:03 PM - Posted by Anonymous (not verified)

Impressive article, great improvement from AMD, my 7970 CF config might yet have a bright future to look forward to. Wish i could get my hands into that prototype driver right now :P

April 24, 2013 | 08:32 PM - Posted by JOE_E

hey ryan do you know if the runt frame problem exists with the older generation ati/amd cards in crossfirex? i have a system with 2 5770 cards in crossfire and i had a lot of issues while playing games, and i was just wondering if maybe that was the problem the whole time.

April 24, 2013 | 09:52 PM - Posted by Anonymous (not verified)

YES, of course it does. DUH. AMD sucks, and is a foaming at the mouth evil corp, unlike nVidia, the exact opposite of what the monkees preached to everyone for years on end.

In any case there's no chance the data won't come out, but the reviewers are busy catching up on the very current right now, so a bit more patience and full of the horrors will be seen, mark my words, there is no chance it isn't coming.

April 26, 2013 | 03:25 PM - Posted by Ryan Shrout

Honestly not sure.  It's something I would like to test but I don't know if we have the bandwidth for it at this point in the cycle.

April 24, 2013 | 08:48 PM - Posted by RedNekTek (not verified)

Any chance that you'll get to frame rating 3-way/Quad Crossfire/SLI? I would love to see the variance and improvements that could be achieved. Especially for someone interested in 7990x2+ configurations.

April 25, 2013 | 12:05 AM - Posted by Jonathan Llamas (not verified)

I feel honored that we're on the eve of having a new and much better way of measuring performance.
Ryan, I don't doubt that eventually you'll be known as the father of frame rating.

Keep up the awesome work, and ignore the bias comments.

April 25, 2013 | 04:17 AM - Posted by Anonymous (not verified)

What about 120Hz monitors? With a 120Hz output I suppose results should be better, since shorter frames would be visualized in a single refresh cycle instead of creating tearing.

April 25, 2013 | 04:48 AM - Posted by Nilbog

Have you asked them about all the tearing?

Obviously the fluidity of the game play is significantly improved, however it seems like tearing is much worse.

BF3 was tearing the entire time it seemed.

April 25, 2013 | 08:39 AM - Posted by Randomoneh

What do you mean "tearing is much worse"?
"Tearing" is essential part of game playing experience. More tears - better experience.

April 25, 2013 | 09:32 PM - Posted by Nilbog

I mean download the BF3 vid and watch for tearing at 50% and 20% speeds. It's bad, and I would definitely argue that tearing like in this example degrades the game experience.

April 27, 2013 | 07:19 AM - Posted by Revdarian (not verified)

Well, the thing is that you either have vsync on preventing the tearing or you have max framerate.

The issue with vsync is that your performance will have to be in direct relation to your displays frequency, so with a standard 60hz display your fps would either be 60, 30, 20, 15, 12, 10... and with a 120hz display you would have 120, 60, 40, 30, 24, 20, 17.14, 15, 10.91, 10 ....

With vsync on, WYSIWYG: the shown and perceived framerate is identical, as a full frame is shown each time, so no "runt" will ever exist.

After saying all this, I now notice that you didn't properly understand the problem discussed with multi-GPU setups. Look at the "monitor output by dual GPUs" graphic and notice that the frame-metered solution doesn't show a full frame either, by its nature; what it does is cut the amount of frame shown from each GPU to be approximately the same size, and this homogenization of frame time is what gives fluidity to the experience.

April 25, 2013 | 05:52 AM - Posted by Fatal (not verified)

would this be same for Hd 7970 crossfire? if so i'm going upgrade to them xD

April 26, 2013 | 03:25 AM - Posted by Anonymous (not verified)

Kudos due where its due. Great review.

Unlike HardOCP, who didn't even mention the prototype driver and slammed the 7990 in its current driver state, at least PCPer mentioned the future fix, and they even went further, showing results in a separate review. That's about as fair as you can be.

Even though this driver will be years late for non-vsync CrossFire users (vsync + RadeonPro DFC fixed this issue long ago for me), better late than never for those who game without that configuration. One of NVIDIA's biggest multi-GPU advantages is about to disappear, as results even this early are very encouraging.

April 29, 2013 | 06:34 PM - Posted by jonhgian (not verified)

I think that you didn't understand the issue very well. First of all, it's not only a CrossFire issue; the new drivers will fix everything about frame generation after the frame is put into the frame buffer, so it will be a single-GPU improvement as well. The v-sync function will also work better and benefit, as it is always related to the frame buffer, from which vsync pulls frames.

April 26, 2013 | 06:03 AM - Posted by Vbs (not verified)

What was the average load power consumption with the prototype driver?

I would expect it to be lower, since a metered gpu setup could be doing only 50-60% of the work of an unmetered setup, as shown in the first page of the article.

April 26, 2013 | 03:26 PM - Posted by Ryan Shrout

I didn't test exactly, but it should be the same.  We are rendering the same number of frames, they are just spaced differently.

April 27, 2013 | 09:41 AM - Posted by Vbs (not verified)

You are rendering fewer frames, because the metered GPU will try not to render any frames that would be runts (in a best-case scenario).

In your example image on page 1, the unmetered GPU renders ~6 frames per draw time vs. ~4 frames on the metered GPU.

Instead of having both GPUs at 100% in an unmetered setup, the metered setup works more in tandem, which should mean less power consumption. :)

April 28, 2013 | 06:08 PM - Posted by bystander (not verified)

Since the prototype driver is not affecting Fraps FPS much at all, it is not affecting the total frames rendered much at all, and in turn, power usage is going to be hardly affected.

Frame metering doesn't require many adjustments to work, at least in its current state, so I doubt there is much of a difference in power usage.

April 29, 2013 | 06:40 AM - Posted by Vbs (not verified)

The prototype driver is affecting "Observed FPS".

The difference between drivers in "Observed FPS" can be used as an estimate of the number of discarded or runt frames. The only way to eliminate those is to have one GPU wait until the other is almost done rendering a frame, a "tandem" setup. If one GPU is waiting, it's not consuming power.

You guys are looking at frame averages (fps) without looking at the frame distribution underneath. The prototype driver achieves the same fps numbers doing *less* work than the normal driver, since its *efficiency* is higher.

Looking at the frame-variance charts, we can estimate how long each GPU waits with the prototype driver -- e.g., for BF3, each will wait around 10 ms. In the same chart, we can see that the normal driver keeps both GPUs at 100% all the time, which is why frame times vary ~[0, 20] ms.

So not only does the normal driver spend time rendering frames that won't be seen, it renders them inconsistently.

Bottom line: from the moment a driver introduces any kind of time delay into the pipeline, there *has* to be reduced power usage, since a watt equals a joule per second.

April 30, 2013 | 01:19 AM - Posted by bystander (not verified)

You should be using total FPS from Fraps when talking about the work the GPU does. It doesn't matter whether a frame is a runt or not; the GPU still rendered the whole frame. It just gets overwritten by the second GPU's image. In terms of power usage, Fraps gives the more accurate picture.

In terms of what we find useful, that is where Observable FPS comes into play.
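The distinction being argued here, total FPS (all rendered frames) versus observed FPS (frames that actually contribute visible scanlines), can be sketched roughly like this. The runt threshold and the scanline numbers are illustrative, not the actual FCAT values:

```python
# Rough sketch of the FCAT-style split between total FPS and observed FPS:
# every rendered frame counts toward total FPS (and toward GPU work), but
# frames occupying fewer scanlines than a runt threshold are excluded from
# observed FPS, since the viewer never meaningfully sees them.

RUNT_THRESHOLD = 21  # scanlines; illustrative cutoff only

def fps_stats(scanlines_per_frame, capture_seconds):
    total = len(scanlines_per_frame)
    runts = sum(1 for h in scanlines_per_frame if 0 < h < RUNT_THRESHOLD)
    drops = sum(1 for h in scanlines_per_frame if h == 0)
    observed = total - runts - drops
    return total / capture_seconds, observed / capture_seconds

# One second of capture where every other frame is a runt a few lines tall:
heights = [540, 8, 540, 8, 540, 8, 540, 8]
total_fps, observed_fps = fps_stats(heights, capture_seconds=1.0)
print(total_fps, observed_fps)  # 8.0 4.0
```

All eight frames were rendered (and consumed GPU power), but only four were meaningfully displayed.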

April 30, 2013 | 06:39 AM - Posted by Vbs (not verified)

Yes, you are correct! :) The unmetered vs metered pic on the first page of the article is misleading and led me to believe there could be more delays introduced than there really are.

As you said, and as shown by the total FPS from Fraps, both setups are rendering the same number of frames (so, doing the same work); the metered setup just has a different "starting point" (see my comment below).

April 29, 2013 | 07:42 AM - Posted by Vbs (not verified)

Still, I guess most of the time the power consumption of the two (metered vs. unmetered) will be equal, since when rendering at a constant framerate both GPUs will be at 100% usage, with one starting to render when the other's frame is ~50% complete.

Only when there are more complex scenes and the framerate starts dipping (more time to render each frame) is there a need to pause one of the GPUs until the other is at ~50% frame completion. The reverse case is also an interesting conundrum. :)

April 27, 2013 | 08:04 AM - Posted by fffrantz

You guys have been doing such impressive work. Hands down...

April 29, 2013 | 01:09 PM - Posted by JustinW (not verified)

Are you guys able to get 120 Hz through your testing setup with reduced blanking?

April 30, 2013 | 01:23 AM - Posted by bystander (not verified)

From what I understand, saving the data from a 60 Hz run requires a RAID 0 setup with SSDs; otherwise the storage can't keep up. I believe THG required 5 SSDs in RAID 0 just to store the data fast enough to analyze it later.

I'm guessing there are technical roadblocks to analyzing 120 Hz like this. The FCAT capture card may not be capable of taking in data that fast either.
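A back-of-the-envelope calculation shows why the storage is the bottleneck. Assuming uncompressed 1080p capture at 3 bytes per pixel (the exact FCAT capture format may differ):

```python
# Sustained write rate needed to store an uncompressed video capture,
# which is why FCAT rigs reportedly need RAID 0 SSD arrays.

def capture_rate_mib_s(width, height, bytes_per_px, fps):
    """Uncompressed capture data rate in MiB per second."""
    return width * height * bytes_per_px * fps / 2**20

print(round(capture_rate_mib_s(1920, 1080, 3, 60)))   # 356 MiB/s at 60 Hz
print(round(capture_rate_mib_s(1920, 1080, 3, 120)))  # 712 MiB/s at 120 Hz
```

Roughly 356 MiB/s sustained at 60 Hz and double that at 120 Hz, which is beyond what a single SATA SSD of that era could sustain.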

May 1, 2013 | 10:10 AM - Posted by bystander (not verified)

Well, they managed to do it with a 4K display, so it should be possible on the storage side, but the FCAT card still has to be able to handle 120hz.

May 1, 2013 | 02:17 PM - Posted by JustinW (not verified)

Interesting, thanks.

May 7, 2013 | 03:28 AM - Posted by Anonymous (not verified)

Is there any way to get a hold of the prototype driver?

May 20, 2013 | 06:35 PM - Posted by Davron (not verified)

I don't want to give you a big head or anything, but might you have helped AMD with a vital bit of debugging/testing that they never thought of? What does this do to 3x 7970s? I also want to know if they will backport the update to 6870/6970/6990s if this is possible. If you can do anything else like this to give them more data to help out AMD card users, we will be eternally grateful, though I'm not sure NVidia will be happy about it.

May 31, 2013 | 10:18 AM - Posted by krikman696 (not verified)

I've read another review covering the 7990's performance, FCAT, the prototype driver, etc. Basically the same thing, although there was something I noticed.
In their benchmarks the hardware (Fraps) FPS was also noticeably higher with the prototype driver compared to the basic Catalyst on the 7990. While for the observed FPS this difference is easily understandable, I found no explanation for the extra Fraps FPS.
The prototype driver just "arranges" the frames in a smoother way, so why does it appear to produce more frames?

May 31, 2013 | 10:28 AM - Posted by krikman696 (not verified)

The review is on Tom's Hardware. You can see that there are a lot of differences in frame rates (both observed and Fraps) between the 7990's prototype driver and the normal one.

June 11, 2013 | 11:40 AM - Posted by TwinShadow (not verified)

Is it possible to compare a video of a 690 vs. a 7990?

September 12, 2013 | 10:59 AM - Posted by mike2411 (not verified)

I installed my 7990 last night and went straight to BF3... the frame rates sucked! I was expecting much more from this card. I've read that it's a driver issue and that the driver to fix these issues should have been released already. Is it out? How do I fix this so the card works better than the GTX 570 I used to have?
