ASUS Radeon R9 290X DirectCU II Graphics Card Review

Author: Ryan Shrout
Manufacturer: ASUS

Cooling and Clock Benefits to DirectCU II

The Dual BIOS Switch and Its Effects

Just like the reference models of the R9 290X, the ASUS DirectCU II model has a BIOS switch on the top that toggles between two BIOS profiles.  With the switch in the default position (away from the output connectors) the card is in Performance mode; with the switch in the opposite position (towards the output connectors) the card is in Quiet mode.  Sound familiar?  It might, but the behaviors are very different than those of the first 290X cards.

Here in Performance mode, you'll notice that the AMD Catalyst Control Center doesn't list a target GPU temperature and also defaults to a maximum fan speed of 50%.  Don't be scared though!  These fans are MUCH quieter than the reference card's.

In the quiet mode you have the same settings that we saw previously with a target GPU temperature of 95C and a maximum fan speed of 40%. 

Clock Speed Variance - Is it fixed??

First things first, let's look at how the clock speeds vary on the ASUS R9 290X DirectCU II.  To see this over an extended period of time, we are looking at 25 minutes of consistent, looped gameplay in Metro: Last Light.  This is the same testing loop utilized in our discussion of the 290X variance problems last month.  We'll be comparing this new custom R9 290X from ASUS to the reference card we received from AMD initially, as well as the Sapphire R9 290X with the stock cooler that we purchased ourselves.

For this test, all three cards are in their default state.  That means the press sampled and Sapphire R9 290X are in Quiet mode while the ASUS R9 290X is in Performance mode.  More on that below.
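Gathering clock-over-time data like this boils down to polling the GPU clock at a fixed interval for the length of the run.  Here is a minimal sketch in Python of that idea; the `read_clock` callable is a hypothetical stand-in for whatever logging tool actually reads the clock, not a real API:

```python
import time

def sample_clocks(read_clock, duration_s=1500, interval_s=1.0, sleep=time.sleep):
    """Poll a clock-reading callable once per interval for duration_s
    seconds and return a list of (elapsed_seconds, mhz) samples."""
    samples = []
    elapsed = 0.0
    while elapsed < duration_s:
        samples.append((elapsed, read_clock()))
        sleep(interval_s)
        elapsed += interval_s
    return samples

def average_clock(samples):
    """Average clock speed in MHz across all samples."""
    return sum(mhz for _, mhz in samples) / len(samples)
```

A 25 minute run at one sample per second yields the 1500 data points behind a graph like the one below, and `average_clock` produces the kind of run-long averages quoted later in this article.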

Well, look at that - the ASUS R9 290X is quite the champ!  Throughout our 25 minute session of Metro: Last Light it stays at the 1050 MHz frequency pretty much the whole time, with only the occasional dip into the 600-700 MHz range when the level reloads.

What does this mean?  The custom cooled R9 290X DirectCU II from ASUS is the first R9 290X we have seen to consistently run at its rated frequency and as a result it will be among the fastest cards we have ever tested!

To put it in perspective, the average clock speed of the ASUS card was 1047.57 MHz over the 25 minute period, while the press sampled reference card was only hitting 930 MHz.  The Sapphire card, at an 869 MHz average clock, is 17% slower than the ASUS card; the press sampled card is 11% slower.  Those are large, tangible, and hard-to-swallow results for buyers of reference R9 290X models.
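Those percentages fall straight out of the averages; a quick check of the arithmetic using the numbers quoted above:

```python
asus_avg = 1047.57      # ASUS DirectCU II average clock, MHz
press_avg = 930.0       # press sampled reference card average, MHz
sapphire_avg = 869.0    # retail Sapphire reference card average, MHz

def pct_slower(card_mhz, baseline_mhz):
    """How much slower card_mhz is than baseline_mhz, in percent."""
    return (1.0 - card_mhz / baseline_mhz) * 100.0

print(round(pct_slower(sapphire_avg, asus_avg)))  # -> 17
print(round(pct_slower(press_avg, asus_avg)))     # -> 11
print(round(pct_slower(1000.0, 1050.0)))          # -> 5 (rated-clock gap)
```

The last line is the 5% rated-clock delta discussed next: 1000 MHz "up to" clocks versus the ASUS card's 1050 MHz default.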

I do realize that the default clock of the ASUS DC2 card is 1050 MHz, while both the press sampled and Sapphire reference cards are rated at "up to" 1000 MHz, and that results in a 5% delta out of the gate.  Still, it is clear from the clock speed comparison graph above that the stock blower cooler was never up to the task of keeping the Hawaii GPU in the thermal envelope it needed to be in.

Keep in mind that these are totally different cooler designs: the ASUS card is able to maintain a 1050 MHz clock speed with a fan speed of just 1600 RPM or so, though it does have TWO fans.  How does that affect sound levels?

As it turns out, the DirectCU II cooler in Performance mode is significantly quieter than the reference cooler in Quiet mode.  Think about that for just a second.  Okay, let's move on.

The custom built ASUS R9 290X is a full 6 dBA lower in our noise testing than the reference R9 290X and even comes in under the sound levels of the GTX 780 Ti.  It's still louder than the original GTX 780, which runs at a lower fan speed, but it's pretty close.
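For context on what a 6 dBA gap means physically, here is the generic decibel arithmetic (this is standard acoustics math, not data from our sound meter):

```python
def spl_pressure_ratio(delta_db):
    """Sound pressure ratio implied by a difference in dB SPL."""
    return 10 ** (delta_db / 20.0)

def spl_power_ratio(delta_db):
    """Sound power (intensity) ratio implied by a difference in dB."""
    return 10 ** (delta_db / 10.0)

# A 6 dB reduction roughly halves sound pressure and cuts
# radiated sound power to about one quarter.
print(round(spl_pressure_ratio(6), 2))  # -> 2.0
print(round(spl_power_ratio(6), 2))     # -> 3.98
```

In other words, the ASUS cooler is radiating on the order of a quarter of the acoustic power of the reference blower, which is why the difference is so obvious at the test bench.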

In my testing, the ASUS R9 290X provided about as good a sound experience as any high end graphics card we have tested in the past year.  This is exactly where the AMD R9 290X should have been the whole time.

Another huge benefit of the ASUS DirectCU II is lower temperatures.  Not only does this card run faster and quieter than the reference design, it does all of that while keeping the GPU 18C cooler than the reference cooler does.  This is nothing to balk at or dismiss; this is a serious improvement by ASUS.

Finally, does all of this come at the cost of power consumption?  Not really - the ASUS DC2 actually drew a few watts less power than the reference card in our testing under full load.

How does the Quiet Mode change things?

Though we did all of our primary testing with the switch in its default state (Performance mode), Quiet mode has some interesting characteristics worth noting.

Sound levels in Quiet mode are, well, quieter (as you would expect).  The drop from 37.1 dBA to 34.1 dBA isn't as big as the move from the reference 290X to the ASUS custom card, but the difference is still noticeable.

The clock speeds reported over the same 25 minute period are interesting.  For most of the recorded period, the ASUS R9 290X DirectCU II in Quiet mode is able to maintain the same 1050 MHz clock speed as it does in Performance mode.  Between the 550 and 850 second marks, though, there is some obvious throttling that occurs, with clock speeds dropping to nearly 950 MHz.  Even though that will definitely cause performance drops, it is nothing close to the 830 MHz clocks we saw in the Sapphire reference cooler testing last month.

The clock speed does eventually restabilize at 1050 MHz though for reasons you'll see below.

Even with the clock speed variance above, the average clock speed of the ASUS DC2 card in Quiet mode is only 8 MHz lower over the entire 25 minute testing period.

In order to hit just 34.1 dBA in Quiet mode, the ASUS card only briefly breaches the 1200 RPM mark, at about the 950 second point.  It then drops again and seems to find a stable spot at 1180 RPM or so.

How can the ASUS card maintain the same (or nearly the same) performance levels with lower noise in Quiet mode?  By allowing the GPU to hit higher temperatures.  In Performance mode our GPU hits the 76-77C mark towards the end of its 25 minute test run, while in Quiet mode ASUS allows the GPU to reach 94C.  It is not a coincidence that this happens in the same 650-850 second window where we saw the clock speed come down and the fan speed quickly ramp up.  The DC2 card then finds the best spot for it to maintain 1050 MHz: a 92C temperature and ~1200 RPM fan speed.  This is AMD PowerTune at work!
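The equilibrium-seeking behavior described above can be sketched as a simple feedback loop.  Below is a toy Python simulation of PowerTune-style control; the thermal coefficients, step sizes, and starting values are all invented for illustration and are not AMD's actual algorithm:

```python
def powertune_step(clock, fan, temp,
                   temp_target=94, clock_max=1050, fan_min=1000, fan_max=2000):
    """One control tick: update temperature from a crude thermal model,
    then adjust fan speed first and clock speed second."""
    # Toy thermal model: heat scales with clock, cooling with fan speed.
    temp = max(30.0, temp + clock * 0.01 - fan * 0.008)
    if temp > temp_target:
        if fan < fan_max:
            fan = min(fan_max, fan + 100)       # ramp the fan before throttling
        else:
            clock = max(300, clock - 50)        # fan maxed out: drop the clock
    elif temp < temp_target - 2:
        if clock < clock_max:
            clock = min(clock_max, clock + 50)  # recover clock headroom
        else:
            fan = max(fan_min, fan - 50)        # at rated clock: relax the fan
    return clock, fan, temp

# Run the loop from a cool start with a Quiet-mode-style 94C target.
clock, fan, temp = 1050, 1000, 40.0
for _ in range(300):
    clock, fan, temp = powertune_step(clock, fan, temp)
```

In this toy model the fan hunts for the speed that balances the GPU's heat output, holding the rated 1050 MHz while the temperature hovers near the target - the same sort of equilibrium the DC2 card finds at ~92C and ~1200 RPM.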

This creates an interesting choice for the gamer - you can basically get the same 1050 MHz clock speeds from the ASUS R9 290X DirectCU II by either allowing the GPU to hit 94C and getting lower noise levels OR by keeping the GPU cooler (around 77C) with higher noise levels.  The key this time is that performance is nearly identical in both cases and that both the Quiet and Performance defaults have dramatically improved noise levels compared to the AMD R9 290X reference "quiet" mode.

December 18, 2013 | 04:34 PM - Posted by Daniel Meier Nielsen (not verified)

Damn, did not expect it to actually do this well. Nice review Ryan.

December 18, 2013 | 04:36 PM - Posted by Imre (not verified)

Please Jesus:((

December 19, 2013 | 11:07 AM - Posted by StewartGraham (not verified)

Jesus can't help you.

December 18, 2013 | 04:42 PM - Posted by OneAgainstOne (not verified)

Exactly, let's wait till these actually come out and see what Nvidia is going to do to counter this, because it could cause a Ti drop to reasonable pricing. Thanks Ryan as always!!

December 21, 2013 | 06:29 AM - Posted by nobodyspecial (not verified)

NV doesn't have to respond with anything. You can buy an MSI 780ti gaming card for $699 that is 1020/1085. So a FREE OC is already priced at REF MSRP. It won all benchmarks at hardwarecanucks review of this 290x card except for Dirt Showdown which nobody plays (I say that because it hasn't sold 100K copies yet according to vgchartz which is what it takes to show up on their charts, total failure). Also it won everything without OCing, which nets another 10% as it runs up to 1225 as they show.

Tried to put a link in to hardwarecanucks but caught for spam. I'm sure anyone can get there on their own anyway (newegg below also).

You should be testing at MSI speeds since you can get an OC card for ref pricing. You are testing an OC card here (290X asus) vs. REF 780ti and asking if they took the crown? NOPE. Because you can get OC 780ti for $699 already that easily wins.

The EVGA REF model is now $680 also on newegg once in the cart. More expensive than 290x maybe, but comes with gsync, 3 AAA games, cooler etc.

December 21, 2013 | 04:01 PM - Posted by Anonymous (not verified)

You really are a Nvidia fanboy. This 290x beats the 780 TI through and through. Not only does it run slightly cooler, the frame rate is the exact same. OH wait, it's $150 cheaper and if Nvidia fanboys can't realize that Nvidia does need to respond with a price drop, they will be seriously hurting. I know fanboys like you will always keep them in business though because you're paying for an overpriced piece of crap. I don't know where you get that the EVGA Ref card is cooler, but it isn't. Look at the 4 degrees difference. Also AMD comes with the Never Settle Bundles, so you have zero point there with the AAA titles. Gsync is pointless. Drivers are far better on AMD now. What's that? OH wait...Games like BF4 AMD dominates Nvidia in every shape and form and this is all before mantle. Is Nvidia in the PS4 or Xbox? Nope. If Mantle does well, is every game from here on out going to adopt it? Yep. So basically Nvidia has an overpriced card that does the same things as the 290x but for way more money and....No, that's pretty much it. Have a nice day.

December 21, 2013 | 06:19 PM - Posted by Anonymous (not verified)

fanboy, lol me think you can't read graphs, only one where it comes out on top. Then you have the 780ti direct cu2 which eats your AMD fanboy ass

December 22, 2013 | 03:18 PM - Posted by snook

This is the first 290X non-reference card out. If you think there aren't going to be faster non-reference 290Xs released, you've lost it.

enjoy the crow when it's served.

December 22, 2013 | 04:10 PM - Posted by Anonymous (not verified)

Yes kudos to ASUS for fixing a broken product. Ryan , put it up against the Gigabyte 780ti Ghz Edition or the EVGA SC ACX, would be interested in the results for non ref v non ref

December 22, 2013 | 04:07 AM - Posted by Anonymous (not verified)

>g-sync is pointless
You do know what g-sync does right?

December 24, 2013 | 02:54 AM - Posted by Niabureth (not verified)

I'll second what Snook said. Until G-sync gets mainstream, (meaning it would be fully compatible with most displays out there) which won't happen anytime soon, I don't really see the point of it. Most people won't be able to use it. And it's not exactly cheap. Don't get me wrong, G-sync is great and all, but it will take some time before it's really a valid argument for going green.

December 22, 2013 | 03:29 PM - Posted by snook

it doesn't come with g-sync. it comes with the ability to use g-sync. huge difference. you get to pay extra for an aftermarket kit (per ryan, limited) or buy a g-sync enabled monitor.

last podcast points out the g-sync limits also. many more than I thought it would have.

December 24, 2013 | 02:42 AM - Posted by Niabureth (not verified)

OMG, you are actually getting really upset about this review, and getting into a defensive position for Nvidia? *The fanboy is strong in this one*

December 18, 2013 | 04:44 PM - Posted by Anonymous (not verified)

How much more FPS do you get with the sticker ?
Which color combination gives the most FPS ?

December 18, 2013 | 05:19 PM - Posted by D1RTYD1Z619

Actually blue stickers run cooler giving you better overclocks, its science and junk. I have a friend in the "know" that says Asus is saving the blue stickers for the Rev. 2 models.

December 18, 2013 | 05:25 PM - Posted by Anonymous (not verified)

speed holes, get out that pickaxe, or shotgun, and make some Holes, at least on those reference SKUs!

December 18, 2013 | 06:53 PM - Posted by N3n0 (not verified)

Shutup Homer.

December 22, 2013 | 04:11 PM - Posted by Homer, the poet (not verified)

Shut up, pleb.

December 24, 2013 | 02:56 AM - Posted by Niabureth (not verified)

Damn! Can't wait for those.

December 18, 2013 | 05:01 PM - Posted by Anonymous (not verified)

how could AMD have not come up with a similarly performing solution for their flagship cards?

AMD have cut their engineering workforce down to almost the bare bones to stay alive. Maybe AMD should go with a flagship non-reference design partner, and forgo the costs of thermal engineering to that partner, until AMD rehires/recommits more of its engineering assets towards thermal engineering and cooling/sound design. Yes the parts only cost $20 more, but the engineering costs much more than the materials, and if your engineering workforce is not there, or not available, it is better to get a flagship partner and utilize that partner's engineering workforce.

Alms for the cooling design challenged.

December 18, 2013 | 05:03 PM - Posted by JohnGR (not verified)

Great. But I think most of us knew it for over a month now when many used third party coolers on 290X and had excellent results.
This must be the only time ever that the same gpu that lost the performance crown took it back with just a cooler change.

March 1, 2014 | 06:02 PM - Posted by nashathedog (not verified)

It's shocking how dumb some people are....

December 18, 2013 | 05:49 PM - Posted by Brett from Australia (not verified)

very comprehensive review Ryan, and you have made some good points. Why AMD could not have done a better job with these cards when they came out of the gate is worrying. These cards do create a lot of heat and as such should have better built-in cooling.

December 18, 2013 | 05:59 PM - Posted by envygamer (not verified)

FAWK YEA am selling off my 290 ref!

December 18, 2013 | 06:07 PM - Posted by mikeymike (not verified)

AMD shut up and take my money!!!

December 18, 2013 | 06:08 PM - Posted by Anonymous (not verified)

Do you run your Skyrim tests with the official HD Textures DLC?

December 18, 2013 | 07:41 PM - Posted by Ryan Shrout

It's the DLC textures but no mods.

December 21, 2013 | 07:31 PM - Posted by BattMarn (not verified)

Ryan, I know it makes it harder to compare your results with other websites, but I don't know ANYONE who runs Skyrim on PC with no mods.

If you do what TekSyndicate do and run all their tests on a heavily-modded Skyrim, the game will push the GPU further. And so long as none of the mods are ENBs or anything, (so stick to texture and lighting mods) the performance should scale with texture quality, etc.

Might also make Skyrim more relevant as a benchmark, since in most of your latest reviews you've said how pretty much any new GPU can run Skyrim maxed out.

December 31, 2013 | 08:54 PM - Posted by Anonymous (not verified)

Personally, I wouldn't mind seeing these cards put through their paces with an ENB and DDOF. I noticed on one review - I think an Anandtech review - the AMD cards seemed to react worse to DDOF in Bioshock Infinite than Nvidia ones, and I'd very much like to see if that is a game-specific aberration or a trend.

The beauty of modded Skyrim as a benchmark is that you can test the cards for different kinds of advanced graphics/lighting features from mods and then compare it to vanilla Skyrim as a control.
