Frame Rating: AMD Radeon R9 290X CrossFire and 4K Preview Testing

Author: Ryan Shrout
Manufacturer: AMD

Battlefield 3

Battlefield 3 (DirectX 11)


Battlefield 3™ leaps ahead of its time with the power of Frostbite 2, DICE's new cutting-edge game engine. This state-of-the-art technology is the foundation on which Battlefield 3 is built, delivering enhanced visual quality, a grand sense of scale, massive destruction, dynamic audio and character animation utilizing ANT technology as seen in the latest EA SPORTS™ games.

Frostbite 2 now enables deferred shading, dynamic global illumination and new streaming architecture. Sounds like tech talk? Play the game and experience the difference!

Our Settings for Battlefield 3

Here is our testing run through the game, for your reference.

At 2560x1440, even though the R9 290X is faster than the GTX 780 in the single-card results, when we double up the cards to CrossFire and SLI the GTX 780s take a small performance lead. There also appear to be a few more "hiccups" in the CrossFire results that would indicate some stutter. Frame times are more consistent in the GTX 780 SLI results, though the orange line representing the R9 290Xs in CrossFire is within reason as well.


At 4K, the R9 290X pair in CrossFire pulls ahead of the GTX 780s across the board in our observed frame rate graphs.  The only questionable point is the frame times graph, where the orange line clearly shows more frame time variance than the blue line of the SLI setup.  Still, it should be noted right away that before today's release of the R9 290X, there was no AMD CrossFire solution that would work correctly at 4K as we see here.

There are no dropped frames, no runt frames, etc. to be found in our first result! 
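For readers wondering how numbers like observed frame rate, runt counts, and frame time variance fall out of the captured data, here is a minimal sketch of the idea in Python. This is not our actual Frame Rating tooling; the 21-scanline runt threshold and the sample capture at the bottom are assumptions purely for illustration.

```python
# Sketch of deriving observed FPS, runts, drops and frame time variance from
# per-frame data captured off the display output. NOT the real Frame Rating
# pipeline; the runt threshold and sample numbers are illustrative assumptions.

RUNT_THRESHOLD = 21  # frames shorter than this many scanlines count as runts (assumed)

def analyze(frame_times_ms, frame_heights):
    """frame_times_ms: per-frame display intervals in milliseconds.
       frame_heights:  scanlines each frame occupied in the captured output."""
    runts = sum(1 for h in frame_heights if 0 < h < RUNT_THRESHOLD)
    drops = sum(1 for h in frame_heights if h == 0)

    # Observed FPS only credits frames that meaningfully contributed to the image.
    useful = [t for t, h in zip(frame_times_ms, frame_heights) if h >= RUNT_THRESHOLD]
    observed_fps = 1000.0 * len(useful) / sum(frame_times_ms)

    # Frame time variance: how far consecutive frame times swing apart.
    swings = [abs(b - a) for a, b in zip(useful, useful[1:])]
    avg_variance_ms = sum(swings) / len(swings) if swings else 0.0
    return observed_fps, runts, drops, avg_variance_ms

# Hypothetical 5-frame capture: four full frames and one runt sliver.
fps, runts, drops, var = analyze([16.2, 17.1, 15.8, 30.4, 16.5], [420, 405, 415, 12, 410])
print(f"observed FPS ~{fps:.1f}, runts={runts}, drops={drops}, variance ~{var:.1f} ms")
```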


October 24, 2013 | 12:09 AM - Posted by Alpha1Omega (not verified)

Good read. Can't wait to see if the OEMs can come up with a better, quieter cooler.

October 24, 2013 | 12:35 AM - Posted by Anonymous (not verified)

Shill on pcper, shill on. Keep those nvidia checks rolling in.

October 24, 2013 | 01:27 AM - Posted by Anonymous (not verified)

AMD needs to repair the problems with its older 7000 series video cards rather than introduce another new and more expensive AMD video card that uses no bridge for CrossFire. I have been duped by AMD CrossFire technology as an owner of their crapware.

October 24, 2013 | 01:47 AM - Posted by Anonymous (not verified)

Can you blame him?

Nvidia basically used him as a propaganda tool with FCAT throughout this whole time, and now I'm sure they will ask him to test Nvidia only on a G-Sync system, which makes FCAT obsolete.

He's got to milk it for all it's worth in the meantime.

October 24, 2013 | 01:55 AM - Posted by Anonymous (not verified)

AMD's drivers have always been crap. Truth hurts. I've owned both brands. The reality is AMD needs to fix their driver design team.

October 24, 2013 | 07:39 AM - Posted by Timmeh (not verified)

Keep on trolling, Anon.

October 24, 2013 | 01:56 AM - Posted by Scyy (not verified)

Did you even read the article? It talks about how AMD has fixed most of the issues it had with CrossFire on Eyefinity...

October 25, 2013 | 09:40 PM - Posted by BlackDove (not verified)

What's really funny is that AMD's newest and best GPU doesn't work properly in CrossFire, uses significantly more power, gets considerably hotter, is louder, and performs worse in AMD-optimized games than Nvidia's old 780.

This card is also ridiculously expensive for what it is.

October 24, 2013 | 02:46 AM - Posted by Funkatronis

Great articles, I'm glad to see AMD stepping it up. Right now I have an Nvidia GTX 690, so I'm going to wait and see what the GPUs late next year have in store. If AMD can keep pace with Nvidia in next year's card battle, I may jump the fence again.

Any rumor on a single-PCB dual R9 290X?

October 24, 2013 | 10:47 AM - Posted by Ryan Shrout

I wouldn't expect that in any reasonable time frame.

October 24, 2013 | 03:33 AM - Posted by Dr Evil

Good job once again. Now let's see if we'll get those fixes for my 7990 as well... I'm hoping/praying that they didn't want you to publish these results yet, because their full lineup solution is just around the corner :)

October 24, 2013 | 04:34 AM - Posted by RTR (not verified)

Do the 4K CF tests look OK because AMD fixed multi-display, or because the 290X supports 4K as one monitor?

October 24, 2013 | 04:56 AM - Posted by Dr Evil

It's not about the 290X supporting 4K as one monitor. The monitor itself drives the panel with two controllers, as there isn't any display out there that can run 4K at 60 Hz with just one display controller. Next year, or 2015 at the latest, such controllers should become available, and after that the 290X can drive the display as one monitor.
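To put rough numbers on that, the pixel rate at 4K60 is simply width x height x refresh; the sketch below compares the full panel to the two 1920x2160 tiles that MST monitors of the PQ321Q style expose. It ignores blanking intervals and doesn't quote any specific controller spec; the numbers are purely illustrative.

```python
# Back-of-envelope: raw pixel rate a display controller must push at 4K60,
# ignoring blanking intervals. Illustrates why 2013-era tiled (MST) monitors
# split the panel into two halves, each fed by its own controller.

def pixel_rate_mpx(width, height, refresh_hz):
    return width * height * refresh_hz / 1e6

full_4k60 = pixel_rate_mpx(3840, 2160, 60)  # whole panel driven as one surface
half_tile = pixel_rate_mpx(1920, 2160, 60)  # one tile of a PQ321Q-style display

print(f"4K @ 60 Hz as a single surface: ~{full_4k60:.0f} Mpx/s")
print(f"Each 1920x2160 half-tile:       ~{half_tile:.0f} Mpx/s per controller")
```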

October 24, 2013 | 05:52 AM - Posted by RTR (not verified)

OK, thank you.

So we can consider the CF problem fixed as long as you don't play DX9 games... nice.

October 24, 2013 | 05:12 AM - Posted by BennJ (not verified)

Thank you for your review :) Great work as usual.

Are the 13.11 V5 drivers compatible with the 7000 series? Can you run some tests?

October 24, 2013 | 10:47 AM - Posted by Ryan Shrout

Worth a shot!

October 24, 2013 | 09:17 AM - Posted by userbeneficiary (not verified)

Thanks for the review...
5760x1080 is the resolution I want to know more details about and compare;
I also didn't find in any review so far the recommended power supply for the R9 290X.

For now... it is my opinion that this card is a 7 out of 10, and it will be problematic in many systems because of the temps and power consumption. No matter how good a system it is in, it's just too hot; summer global warming will have a blast, for sure.

Also... I believe the core MHz fluctuation will be a problem for non-water-cooled systems.

AMD is catching up to Nvidia in FPS, but FPS is not all that is important in my opinion, and the next generation from Nvidia is around the corner... so... nice try, AMD.

Your move Nvidia!

October 30, 2013 | 10:22 AM - Posted by Wade B (not verified)

I just read that Nvidia is dropping the price of the GTX 780 by $150. Still, I'm looking hard at AMD once again. It feels like competition is brewing again.

October 24, 2013 | 09:33 AM - Posted by Squijji (not verified)

Serious kudos for being the first site with a proper Crossfire review. The results are very encouraging; let's hope this is a good indication for the future.

October 24, 2013 | 11:38 AM - Posted by lima (not verified)

Did you wait for the Kepler cards to "heat up" before you ran benchmarks after Nvidia introduced Boost?

October 24, 2013 | 12:40 PM - Posted by Anonymous (not verified)

I had a feeling that Tom's Hardware wasn't showing 1440p resolutions on purpose (slightly short of an elite award, I would say), and it looks like, until a driver update from AMD, Nvidia is still superior in the current generation and at the new WQHD IPS/PLS monitor resolution of 1440p (these 1080p benchmarks are in line with Tom's Hardware), at a higher framerate, making the 780 the best mate for the overclockable Korean monitors, with the lowest frame variance providing the superior overall solution.

Downscaling a larger-than-viewable resolution is nice but ultimately unusable in competitive gaming, which needs frame speed over resolutions too high for in-game textures. I would like to see benchmarks at very low resolutions like 720p for truly competitive, ultimate-speed comparisons at low graphics settings. Crysis 3 isn't going to look any better at 4K over 1440p; it's just going to run a lot slower (unless screen size is your thing, you aren't going to get much more pixel density than 27" WQHD, which already has near-invisible pixels anyway).

Maybe the next AMD update (barring some soon-to-be-released, unforeseen driver; if such a driver is released I hope PC Perspective tests it against the latest Nvidia driver) will be the thing to look for, but for now it's time to look at what Nvidia does with its 8 series. If I don't see enough power there I'm probably going to switch.

That Asus monitor is $4,000 last I checked, and really I think people should wait on 4K until 4K OLED comes down from the stratosphere. Then there will be 4K content and you will have an appropriate black level and a ridiculous true contrast ratio. I notice a little improvement with 4K-remastered Blu-rays on WQHD, but it's not worth the price of initial/prototype products unless you're some rich guy with a need to show off and impress colleagues and investors (and it wouldn't prove you have much in the way of brains, or that you are a true videophile).

October 24, 2013 | 12:41 PM - Posted by haroldf (not verified)

Hi Ryan, congrats... :-)
In your opinion, does the loss of the CrossFire bridge mean that the old 7000 series will always have issues and limits because of the bridge, or will the situation (drivers) keep improving anyhow?
And will this XDMA be adopted just for some particular resolutions and setups? "AMD claims that this new XDMA interface was designed for Eyefinity and UltraHD resolutions"

Will the old 7000 series be fine with CrossFire at 1920x1080 in any case, or is this a way for AMD to admit that the old technology has insurmountable limits?

October 24, 2013 | 12:58 PM - Posted by haroldf (not verified)

Last question... please.
Is CrossFire technology also related to a graphics engine's features, concept and innovation (and engine release date; e.g. Unreal Engine 3 was first released about 10 years ago), or are these things not related and is it a drivers topic only?

October 24, 2013 | 01:32 PM - Posted by Panta

Temperatures?

October 24, 2013 | 02:00 PM - Posted by Ryan Shrout

Check out the full review here:

October 24, 2013 | 09:44 PM - Posted by Panta


I wonder how it will perform in an average PC case on a hot day in CrossFire with all that heat. Would it still perform better than a 780/Titan?

October 24, 2013 | 02:53 PM - Posted by Anonymous (not verified)

I'm not totally sure, as I don't follow the Unreal Engine forums, but I think it's updated, or at least most of the big titles using Unreal 3 modify the engine and add their own drivers and binaries to make a new modified engine, each different from the others. I know the configuration files are all different, and the more recent ones like Borderlands 2 and BioShock Infinite have radically different cvars and game code functions.

October 24, 2013 | 03:08 PM - Posted by haroldf (not verified)

OK, thanks, but I really didn't ask whether it's updated or not (anyway, is the core modified or only minor parts?), but whether the engine is even related to multi-GPU performance and smoothness... ;-) Sometimes in patch release notes from developers I see, e.g., "multi-GPU optimizations" or something like that... so the question arises.

October 24, 2013 | 03:11 PM - Posted by castlefox (not verified)

How does this card work in Linux?

October 24, 2013 | 03:14 PM - Posted by brett from Australia (not verified)

Very good write-up Ryan. Comprehensive review showing some impressive numbers there, this GPU looks to have the goods.

October 24, 2013 | 10:27 PM - Posted by Havor (not verified)

Love the review. I held off buying the Titan to see what AMD's A-game would be, and/or what it would do to prices, and I can say I am happily surprised!

Just one question: why the 4K testing? I know the ASUS PQ321Q is a hell of a monitor, but come on, a $3,500 monitor is not something a lot of people are going to buy.

Why not a more real-world setup of 5760x1080, or preferably 1200? You said it was no problem using Eyefinity in your test setup with the capture card.

The reason I say this is that I think there probably will be real-world differences between 6M pixels and 8M, and for at least the coming 2 years, 5760x1080/1200 will be by far the most used Eyefinity gaming resolution.

I just don't see the benefit of using the PQ321Q for Eyefinity testing over a three-monitor setup, other than that it takes up less space on your test bench. ^_^

October 25, 2013 | 01:37 AM - Posted by Dr Evil

There are plenty of 4K TVs on the market and their market penetration is accelerating quickly. 60 Hz support will soon be there across the board, so 4K gaming is not a pipe dream anymore, and IMO it will become more relevant than 3-display gaming.

October 25, 2013 | 11:28 AM - Posted by Havor (not verified)

I am not saying 4K gaming is not coming, but it will always stay a niche, as 32" 4K monitors are going to be the new 30", and I am saying it's at least 2 years away before they will be, and by then we will have new and improved hardware that replaces the 290X.

So IMHO, for now it's more or less irrelevant. With the next-gen cards it's probably something you want to have tested, but I still think more gamers want an Eyefinity setup for $500 than a 32" 4K for $1,000~2,000.

But that's just me!

October 25, 2013 | 11:50 AM - Posted by Dr Evil

I want to get a 65" 4K TV for $3,000-4,000 with good 60 Hz support, and that is NOT far away, and IMO it is very much preferable over 3 small displays. Yes, it's also more expensive, but a large screen is good for many other things as well, for example movies, so the higher price can be somewhat justified.

October 26, 2013 | 12:18 AM - Posted by Havor (not verified)

How are you going to fit a 65" on your desk?

For gaming I'd rather have the benefit of the peripheral vision you get from Eyefinity, like in the real world, as games are built to resemble real life.

All games use a more or less 2D+ world where the third dimension is used to go up and down hills and buildings, but you're still moving more or less horizontally.

So what's the benefit of having a high-def 16:9 window, even on a 32" 4K, if you can have a 48:9 or 48:10 view (3x16=48), the same way your eyes work in the real world?

October 26, 2013 | 03:38 AM - Posted by Dr Evil

It won't be on a desk, but on a TV stand. I have dedicated space for it. With regard to your example, think of 4K not as 16:9 but as 32:18. Remember I'm talking about a 65" TV, which is essentially four 32" 1080p monitors with no bezels covering the screen.

The high resolution of 4K allows you to move very close to the screen, giving a very large horizontal and vertical field of view and a massive perceived screen size. Watch the 65" from 4-5 feet away and it's about the same as the first few rows in a modern movie theater. That is enough horizontal view IMO, at least until a high-res Oculus Rift, where 4K will also be the ultimate sweet spot a few years down the line.

edit: Just wanted to point out further that the Oculus Rift uses a 16:10 screen at the moment, so covering your field of view is not about the screen's aspect ratio, but about how that screen is placed in your field of view.
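As a back-of-envelope check on the viewing-distance argument, the horizontal field of view of a flat screen follows from basic trigonometry. The sketch below compares a 65" 16:9 TV at an assumed 4 feet with three 27" panels at an assumed 2 feet; it ignores bezels and the inward angling of side monitors (which would widen the Eyefinity figure further), so treat the numbers as rough illustrations only.

```python
import math

def hfov_degrees(diag_in, aspect_w, aspect_h, distance_in, panels=1):
    """Horizontal field of view of `panels` flat screens side by side,
       viewed head-on from `distance_in` inches (bezels/curvature ignored)."""
    width = diag_in * aspect_w / math.hypot(aspect_w, aspect_h)
    return math.degrees(2 * math.atan(panels * width / (2 * distance_in)))

# Assumed viewing distances, purely for illustration.
print(f"65-inch 16:9 TV at 4 ft:   ~{hfov_degrees(65, 16, 9, 48):.0f} deg horizontal")
print(f"3x 27-inch panels at 2 ft: ~{hfov_degrees(27, 16, 9, 24, panels=3):.0f} deg horizontal")
```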

October 26, 2013 | 02:26 PM - Posted by snook

Ryan: "R9 290X is the best single GPU available."
random RERE: "shill on, keep that Nvidia money coming in"
snook: someone needs to google "shill".

Something I feel is being overlooked: the 290X was downclocking to ~840 MHz and was still beating the Titan and 780 with, correct me if I'm wrong, Boost 2.0 working. They had an average of about ~200 MHz over the 290X. That is a seriously brutal result for Nvidia.

Currently setting the money aside for my George Foreman grill,
errrr, R9 290X.

Thanks for the videos and articles, PCPer.

October 27, 2013 | 06:11 AM - Posted by Fergdog (not verified)

I don't think it makes sense to compare clock speeds if at the higher clock speeds the Titan/780 are still consuming a fair bit less power. The 290X can't have the clock speeds of the 780/Titan or else it would melt but then again it doesn't need to. You have to take more into account than clock speed to compare efficiency.

October 27, 2013 | 06:13 AM - Posted by Fergdog (not verified)

It is strange that the 290X consumes more power with a lower clock speed and a smaller die, though. Then again, the 780 isn't using nearly its full die, so the size of the die that's actually operating is similar to the 290X's. Must be the bigger memory controller, etc. I don't know.

October 27, 2013 | 07:56 PM - Posted by Roger Barrett (not verified)

I think sometimes the Green Team/Red Team thing gets a bit out of hand. Personally I want both companies to make good products so that I always have a good choice.

Think what it would be like if either company stopped making GPUs. The remaining company would simply double or treble prices and put just enough effort in to stay comfortably ahead of Intel's integrated graphics (which would not be that hard).

I also think that most people underestimate how hard it is to continually produce good GPUs. Every time the process nodes shrink there is a new set of design parameters waiting to trip them up. In the current generation, AMD are obviously running hot due to leakage issues, but Nvidia have also had duds with most of their part designs. Of their original set of parts, only GK104 was successful, and thankfully for them it was really quite good.

October 30, 2013 | 03:57 PM - Posted by Particle (not verified)

I know this is kind of out in left field in 2013, but could you do a quick test to see if older generation cards like 6970s pace their frames properly with the betas we've been given these last few months? Much of the current focus is understandably on the 7000 series, but many of us are still interested despite using older hardware. :)

November 5, 2013 | 11:58 PM - Posted by Anonymous (not verified)

Although there is some intelligent insight here, this has to be the most biased website I have ever visited. There is no real substance to the claims made here, and it just seems like you guys are talking out of your a**, only informed by what Nvidia tells you.

Excellent journalism guys!

November 6, 2013 | 05:48 PM - Posted by Jack Parkinson (not verified)

The problem with the 780s at 4K is the 3GB of VRAM; why do you think the Titans have 6GB? 4GB of VRAM is about right for 4K, but if you cranked everything up to the max in Crysis 3 (8xMSAA) then that 4GB wouldn't be enough. I know this for a fact, as on my 5760x1080 setup, which is basically 3K, Crysis 3 maxed out (8xMSAA) uses a massive 4.7GB of VRAM.
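For a sense of scale on why MSAA at these resolutions chews through VRAM, here is a very rough estimate of a single multisampled color+depth render target. It assumes 8 bytes per sample and ignores G-buffers, textures, compression and driver overhead, so real-world usage (like the ~4.7GB figure above) will be far higher; the numbers are illustrative only.

```python
# Very rough back-of-envelope for one 8xMSAA color+depth render target.
# Assumes 4 bytes color + 4 bytes depth per sample; ignores G-buffers,
# textures, compression and driver overhead, so real usage is far higher.

def msaa_target_mb(width, height, samples, bytes_per_sample=8):
    return width * height * samples * bytes_per_sample / (1024 ** 2)

for name, w, h in [("5760x1080 (triple 1080p)", 5760, 1080),
                   ("3840x2160 (4K)", 3840, 2160)]:
    print(f"{name}: ~{msaa_target_mb(w, h, 8):.0f} MB for one 8xMSAA target")
```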

November 10, 2013 | 07:33 PM - Posted by drbaltazar (not verified)

Great job PCPer! Can't believe the amount of work required to pull a fast one on AMD! (grin) I guess AMD didn't feel they were ready. UHD looked close to ready for a PCPer review to me. As others have mentioned, though, I wonder about the 7xxx series. Anyhow, I suspect Windows has a big say in some way in how gaming gear performs. The main issue these days is all the widening of pipes, like multithreading, message signaled interrupts, various caches, etc., and it doesn't take much to mess everything up. And with MS just having adopted the timing fix a couple of months back (invariant TSC), a lot of cache won't be needed anymore. Anyhow! Keep up the good work, guys.
