
ASUS Radeon R9 290X DirectCU II Graphics Card Review

Author: Ryan Shrout
Manufacturer: ASUS

Testing Setup and Frame Rating Info

Testing Configuration

The specifications for our testing system haven't changed much.

Test System Setup
CPU: Intel Core i7-3960X Sandy Bridge-E
Motherboard: ASUS P9X79 Deluxe
Memory: Corsair Dominator DDR3-1600 16GB
Hard Drive: OCZ Agility 4 256GB SSD
Sound Card: On-board
Graphics Cards: ASUS R9 290X DirectCU II 4GB, AMD Radeon R9 290X 4GB, NVIDIA GeForce GTX 780 Ti 3GB, NVIDIA GeForce GTX 780 3GB
Graphics Drivers: AMD 13.11 V9.5, NVIDIA 331.80
Power Supply: Corsair AX1200i
Operating System: Windows 8 Pro x64


What you should be watching for

  1. ASUS R9 290X vs Reference R9 290X - This is the most important and obvious comparison to make.  How much faster is the ASUS R9 290X DirectCU II card than the reference models, given its 50 MHz higher clock speed and much more consistent clock behavior?
     
  2. ASUS R9 290X vs GTX 780 Ti - With this improved performance, does the ASUS card help AMD reclaim the performance crown for single-GPU graphics cards?

Frame Rating: Our Testing Process

If you aren't familiar with it, you should probably do a little research into our testing methodology, as it is quite different from others you may see online.  Rather than using FRAPS to measure frame rates or frame times, we are using a secondary PC to capture the output from the tested graphics card directly, and then using post processing on the resulting video to determine frame rates, frame times, frame variance and much more.

This amount of data can be pretty confusing if you attempt to read it without the proper background, but I strongly believe that the results we present paint a much more thorough picture of performance than other options.  So please, read up on the full discussion about our Frame Rating methods before moving forward!
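To make the core idea concrete, here is a minimal Python sketch of how overlay-based capture analysis works in principle: every rendered frame carries a colored bar from a known palette along the edge of the screen, and counting how many scanlines each color occupies in the captured video reveals where frames begin and end, and which ones are runts or drops. The palette, geometry and color matching below are illustrative assumptions, not FCAT's actual implementation.

```python
import numpy as np

# Hypothetical overlay palette; FCAT cycles a known sequence of colors,
# but these exact values are illustrative assumptions.
OVERLAY_COLORS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0)]

def overlay_runs(captured_frame):
    """captured_frame: HxWx3 RGB array from the capture card.
    Returns (color_index, scanline_count) runs found in the overlay
    column, top to bottom; each run corresponds to one rendered frame."""
    column = captured_frame[:, 0, :]  # leftmost pixel column
    runs = []
    for pixel in column:
        # match this scanline to the nearest palette color
        idx = min(range(len(OVERLAY_COLORS)),
                  key=lambda i: sum((int(c) - p) ** 2
                                    for c, p in zip(pixel, OVERLAY_COLORS[i])))
        if runs and runs[-1][0] == idx:
            runs[-1][1] += 1
        else:
            runs.append([idx, 1])
    return [(i, n) for i, n in runs]
```

A run only a few scanlines tall is a runt; a palette color missing from the expected sequence entirely is a dropped frame.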

While there are literally dozens of files created for each “run” of benchmarks, there are several resulting graphs that FCAT produces, as well as several more that we are generating with additional code of our own.

If you don't need the example graphs and explanations below, you can jump straight to the benchmark results now!

 

The PCPER FRAPS File

While the graphs above are produced by the default version of the scripts from NVIDIA, I have modified and added to them in a few ways to produce additional data for our readers.  The first file shows a subset of the data from the RUN file (detailed below): the average frame rate over time as defined by FRAPS, though we are combining all of the GPUs we are comparing into a single graph.  This basically emulates the data we have been showing you for the past several years.
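To illustrate what this graph encodes, here is a minimal sketch, assuming a plain list of per-frame times in milliseconds as FRAPS would record them; the function and the one-second window are ours, not the actual scripts:

```python
def fps_over_time(frame_times_ms, window_s=1.0):
    """Average FPS over consecutive windows of roughly `window_s` seconds."""
    fps_series = []
    elapsed = 0.0
    frames = 0
    for ft in frame_times_ms:
        elapsed += ft / 1000.0
        frames += 1
        if elapsed >= window_s:
            fps_series.append(frames / elapsed)
            elapsed = 0.0
            frames = 0
    return fps_series
```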

 

The PCPER Observed FPS File

This graph takes a different subset of data points and plots them similarly to the FRAPS file above, but this time we are looking at the “observed” average frame rates, shown as the blue bars in the RUN file (detailed below).  This takes out the dropped and runt frames, giving you the performance metric that actually matters – how many frames are actually being shown to the gamer to improve the animation sequences.

As you’ll see in our full results on the coming pages, a big difference between the FRAPS FPS graph and the Observed FPS graph indicates cases where the gamer is likely not getting the full benefit of the hardware investment in their PC.
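In code terms, the observed frame rate boils down to filtering the frame list before counting. The sketch below assumes each frame is tagged with the number of scanlines it occupied in the capture; the 21-scanline runt cutoff is an assumption for illustration, not a measured threshold:

```python
RUNT_THRESHOLD_SCANLINES = 21  # assumed cutoff; a real analysis would tune this

def observed_frames(frames):
    """frames: list of (frame_time_ms, visible_scanlines) tuples.
    Drops (0 scanlines) and runts fall below the threshold and are removed."""
    return [(t, lines) for (t, lines) in frames
            if lines >= RUNT_THRESHOLD_SCANLINES]

def observed_fps(frames, span_s):
    """Observed average FPS over a run lasting `span_s` seconds."""
    return len(observed_frames(frames)) / span_s
```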

 

The PLOT File

The primary file that is generated from the extracted data is a plot of calculated frame times including runts.  The numbers here represent the amount of time each frame appears on the screen for the user; a “thinner” line across the time span represents frame times that are consistent and thus should produce the smoothest animation for the gamer.  A “wider” line, or one with a lot of peaks and valleys, indicates a lot more variance and is likely caused by a lot of runts being displayed.
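For readers who want to reproduce this style of graph from their own frame time data, a bare-bones version looks something like this; the data layout (a dict mapping card names to lists of frame times) is a hypothetical convenience, not our actual file format:

```python
import matplotlib.pyplot as plt

def plot_frame_times(frame_times):
    """frame_times: dict mapping a card name to a list of frame times (ms)."""
    for card, times in frame_times.items():
        plt.plot(range(len(times)), times, label=card, linewidth=0.5)
    plt.xlabel("Frame number")
    plt.ylabel("Frame time (ms)")
    plt.legend()
    plt.show()
```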

 

The RUN File

While the two graphs above show combined results for a set of cards being compared, the RUN file shows the results for a single card in a particular test.  It is in this graph that you can see interesting data about runts, drops, average frame rate and the actual frame rate of your gaming experience.

For tests that show no runts or drops, the data is pretty clean.  This is the familiar frames-per-second-over-time graph that has become the standard for performance evaluation on graphics cards.

A test that does have runts and drops will look much different.  The black bar labeled FRAPS indicates the average frame rate over time that traditional testing would show if you counted the drops and runts in the equation – as the FRAPS FPS measurement does.  Any area in red is a dropped frame – the wider the red area, the more colored bars from our overlay were missing in the captured video file, indicating the gamer never saw those frames in any form.

The wide yellow area represents runts: the thin bands of color in our captured video that we have determined do not add to the animation of the image on the screen.  The larger the yellow area, the more often those runts are appearing.

Finally, the blue line is the measured FPS over each second after removing the runts and drops.  We are going to be calling this metric the “observed frame rate” as it measures the actual speed of the animation that the gamer experiences.

 

The PERcentile File

Scott introduced the idea of frame time percentiles months ago, but now that we have different data using direct capture as opposed to FRAPS, the results might be even more telling.  In this case, FCAT is showing percentiles not by frame time but instead by instantaneous FPS.  This will tell you the minimum frame rate that will appear on the screen at any given percent of time during our benchmark run.  The 50th percentile should be very close to the average total frame rate of the benchmark, but as we creep closer to 100% we see how the frame rate is affected.

The closer this line is to being perfectly flat, the better, as that would mean we are running at a constant frame rate the entire time.  A steep decline on the right hand side tells us that frame times are varying more and more frequently and might indicate potential stutter in the animation.
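Expressed as code, the percentile curve is straightforward: convert each frame time to an instantaneous FPS value, sort from best to worst, and read off the minimum frame rate sustained over any given fraction of the run. This sketch is our reading of the output, not FCAT's actual implementation:

```python
def fps_percentiles(frame_times_ms):
    """Map percentile (0-100) to the minimum instantaneous FPS achieved
    over that fraction of the run's frames."""
    inst_fps = sorted((1000.0 / t for t in frame_times_ms), reverse=True)
    n = len(inst_fps)
    return {p: inst_fps[min(n - 1, n * p // 100)] for p in range(0, 101)}
```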

 

The PCPER Frame Time Variance File

Of all the data we are presenting, this is probably the one that needs the most discussion.  In an attempt to create a new metric for gaming and graphics performance, I wanted to try to find a way to define stutter based on the data sets we had collected.  As I mentioned earlier, we can define a single stutter as a variance level between t_game and t_display. This variance can be introduced in t_game, t_display, or on both levels.  Since we can currently only reliably test the t_display rate, how can we create a definition of stutter that makes sense and that can be applied across multiple games and platforms?

We define a single frame variance as the difference between the current frame time and the previous frame time – how consistent the timing of the two frames presented to the gamer is.  However, as I found in my testing, plotting the value of this frame variance is nearly a perfect match to the data presented by the minimum FPS (PER) file created by FCAT.  To be more specific, stutter is only perceived when there is a break from the previous animation frame rates.

Our current running theory for a stutter evaluation is this: find the current frame time variance by comparing the current frame time to the running average of the frame times of the previous 20 frames.  Then, by sorting these frame time variances and plotting them in percentile form, we can get an interesting look at potential stutter.  Comparing the frame times to a running average, rather than just to the previous frame, should prevent potential problems from legitimate performance peaks or valleys found when moving from a highly compute intensive scene to a lower one.
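A minimal sketch of that calculation, assuming a plain list of frame times in milliseconds (our reading of the method described above, not the actual analysis code):

```python
from collections import deque

def frame_time_variance(frame_times_ms, window=20):
    """For each frame, the absolute difference between its frame time and
    the running average of the previous `window` frame times, sorted for
    a percentile-style plot."""
    history = deque(maxlen=window)
    variances = []
    for ft in frame_times_ms:
        if len(history) == window:
            running_avg = sum(history) / window
            variances.append(abs(ft - running_avg))
        history.append(ft)
    return sorted(variances)
```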

While we are still trying to figure out if this is the best way to visualize stutter in a game, we have seen enough evidence in our game play testing, and in comparing the above graphic to other data generated through our Frame Rating system, to be reasonably confident in our assertions.  So much so, in fact, that I am going to call this data the PCPER ISU – beer fans will appreciate the acronym: International Stutter Units.

To compare these results, you want to see a line that is as close to the 0ms mark as possible, indicating very little frame time variance when compared to a running average of previous frames.  There will be some inevitable incline as we reach the 90+ percentile, but that is expected with any game play sequence that varies from scene to scene.  What we do not want to see is a sharper upward line, which would indicate higher frame variance (ISU) and could be an indication that the gamer sees microstuttering and hitching problems.

December 18, 2013 | 01:34 PM - Posted by Daniel Meier Nielsen (not verified)

Damn, did not expect it to actually do this well. Nice review Ryan.

December 18, 2013 | 01:35 PM - Posted by Imre (not verified)

Awesome:ooo

December 18, 2013 | 01:36 PM - Posted by Imre (not verified)

Please Jesus:((

December 19, 2013 | 08:07 AM - Posted by StewartGraham (not verified)

Jesus can't help you.

December 18, 2013 | 01:42 PM - Posted by OneAgainstOne (not verified)

Exactly, let's wait till these actually come out and see what Nvidia is going to do to counter this, because it could cause a Ti drop to reasonable pricing. Thanks Ryan as always!!

December 21, 2013 | 03:29 AM - Posted by nobodyspecial (not verified)

NV doesn't have to respond with anything. You can buy an MSI 780 Ti Gaming card for $699 that is 1020/1085, so a FREE OC is already priced at REF MSRP. It won all benchmarks in hardwarecanucks' review of this 290X card except for Dirt Showdown, which nobody plays (I say that because it hasn't sold 100K copies yet according to vgchartz, which is what it takes to show up on their charts – total failure). Also it won everything without OCing, which nets another 10%, as it runs up to 1225 as they show.

Tried to put a link in to hardwarecanucks but it got caught as spam. I'm sure anyone can get there on their own anyway (newegg below also).

You should be testing at MSI speeds since you can get an OC card for ref pricing. You are testing an OC card here (the ASUS 290X) vs. a REF 780 Ti and asking if they took the crown? NOPE. Because you can get an OC 780 Ti for $699 already that easily wins.

The EVGA REF model is now $680 also on newegg once in the cart. More expensive than 290x maybe, but comes with gsync, 3 AAA games, cooler etc.

December 21, 2013 | 01:01 PM - Posted by Anonymous (not verified)

You really are an Nvidia fanboy. This 290X beats the 780 Ti through and through. Not only does it run slightly cooler, the frame rate is the exact same. Oh wait, it's $150 cheaper, and if Nvidia fanboys can't realize that Nvidia does need to respond with a price drop, they will be seriously hurting. I know fanboys like you will always keep them in business though, because you're paying for an overpriced piece of crap. I don't know where you get that the EVGA REF card is cooler, but it isn't. Look at the 4 degrees difference. Also AMD comes with the Never Settle bundles, so you have zero point there with the AAA titles. G-sync is pointless. Drivers are far better on AMD now. What's that? Oh wait... games like BF4, where AMD dominates Nvidia in every shape and form, and this is all before Mantle. Is Nvidia in the PS4 or Xbox? Nope. If Mantle does well, is every game from here on out going to adopt it? Yep. So basically Nvidia has an overpriced card that does the same things as the 290X but for way more money and... no, that's pretty much it. Have a nice day.

December 21, 2013 | 03:19 PM - Posted by Anonymous (not verified)

Fanboy, lol, methinks you can't read graphs – there's only one where it comes out on top. Then you have the 780 Ti DirectCU II, which eats your AMD fanboy ass.

December 22, 2013 | 12:18 PM - Posted by snook

This is the first 290X non-reference card out. If you think there aren't going to be faster non-reference 290Xs released, you've lost it.

enjoy the crow when it's served.

December 22, 2013 | 01:10 PM - Posted by Anonymous (not verified)

Yes, kudos to ASUS for fixing a broken product. Ryan, put it up against the Gigabyte 780 Ti GHz Edition or the EVGA SC ACX; I would be interested in the results for non-ref vs non-ref.
Cheers

December 22, 2013 | 01:07 AM - Posted by Anonymous (not verified)

>g-sync is pointless
What?
You do know what g-sync does right?

December 23, 2013 | 11:54 PM - Posted by Niabureth (not verified)

I'll second what Snook said. Until G-sync gets mainstream (meaning it would be fully compatible with most displays out there), which won't happen anytime soon, I don't really see the point of it. Most people won't be able to use it. And it's not exactly cheap. Don't get me wrong, G-sync is great and all, but it will take some time before it's really a valid argument for going green.

December 22, 2013 | 12:29 PM - Posted by snook

it doesn't come with g-sync. it comes with the ability to use g-sync. huge difference. you get to pay extra for an aftermarket kit (per ryan, limited) or buy a g-sync enabled monitor.

last podcast points out the g-sync limits also. many more than I thought it would have.

December 23, 2013 | 11:42 PM - Posted by Niabureth (not verified)

OMG, you are actually getting really upset about this review, and getting into a defensive position for Nvidia? *The fanboy is strong in this one*

December 18, 2013 | 01:44 PM - Posted by Anonymous (not verified)

How much more FPS do you get with the sticker ?
&
Which color combination gives the most FPS ?

December 18, 2013 | 02:19 PM - Posted by D1RTYD1Z619

Actually blue stickers run cooler, giving you better overclocks; it's science and junk. I have a friend in the "know" that says Asus is saving the blue stickers for the Rev. 2 models.

December 18, 2013 | 02:25 PM - Posted by Anonymous (not verified)

Speed holes! Get out that pickaxe, or shotgun, and make some holes, at least on those reference SKUs!

December 18, 2013 | 03:53 PM - Posted by N3n0 (not verified)

Shutup Homer.

December 22, 2013 | 01:11 PM - Posted by Homer, the poet (not verified)

Shut up, pleb.

December 23, 2013 | 11:56 PM - Posted by Niabureth (not verified)

Damn! Can't wait for those.

December 18, 2013 | 02:01 PM - Posted by Anonymous (not verified)

how could AMD have not come up with a similarly performing solution for their flagship cards?

AMD have cut their engineering workforce down to almost the bare bones to stay alive. Maybe AMD should go with a flagship non-reference design partner and hand off the costs of thermal engineering to that partner, until AMD rehires/recommits more of its engineering assets towards thermal engineering and cooling/sound design. Yes, the parts only cost $20 more, but the engineering costs much more than the materials, and if your engineering workforce is not there, or not available, it is better to get a flagship partner and utilize that partner's engineering workforce.

Alms for the cooling design challenged.

December 18, 2013 | 02:03 PM - Posted by JohnGR (not verified)

Great. But I think most of us have known this for over a month now, since many people used third party coolers on the 290X and had excellent results.
This must be the only time ever that the same GPU that lost the performance crown took it back with just a cooler change.

March 1, 2014 | 03:02 PM - Posted by nashathedog (not verified)

It's shocking how dumb some people are....

December 18, 2013 | 02:49 PM - Posted by Brett from Australia (not verified)

Very comprehensive review, Ryan, and you have made some good points. Why AMD could not have done a better job with these cards when they came out of the gate is worrying. These cards do create a lot of heat and as such should have better built-in cooling.

December 18, 2013 | 02:59 PM - Posted by envygamer (not verified)

FAWK YEA am selling off my 290 ref!

December 18, 2013 | 03:07 PM - Posted by mikeymike (not verified)

AMD shut up and take my money!!!

December 18, 2013 | 03:08 PM - Posted by Anonymous (not verified)

Do you run your Skyrim tests with the official HD Textures DLC?

December 18, 2013 | 04:41 PM - Posted by Ryan Shrout

It's the DLC textures but no mods.

December 21, 2013 | 04:31 PM - Posted by BattMarn (not verified)

Ryan, I know it makes it harder to compare your results with other websites, but I don't know ANYONE who runs Skyrim on PC with no mods.

If you do what TekSyndicate do and run all their tests on a heavily-modded Skyrim, the game will push the GPU further. And so long as none of the mods are ENBs or anything (so stick to texture and lighting mods), the performance should scale with texture quality, etc.

Might also make Skyrim more relevant as a benchmark, since in most of your latest reviews you've said how pretty much any new GPU can run Skyrim maxed out.

December 31, 2013 | 05:54 PM - Posted by Anonymous (not verified)

Personally, I wouldn't mind seeing these cards put through their paces with an ENB and DDOF. I noticed on one review - I think an Anandtech review - the AMD cards seemed to react worse to DDOF in Bioshock Infinite than Nvidia ones, and I'd very much like to see if that is a game-specific aberration or a trend.

The beauty of modded Skyrim as a benchmark is that you can test the cards for different kinds of advanced graphics/lighting features from mods and then compare it to vanilla Skyrim as a control.
