8 GB Variants of the R9 290X Coming This Month

Subject: General Tech, Graphics Cards | November 5, 2014 - 12:56 PM |
Tagged: radeon, R9 290X, R9, amd, 8gb

With AMD’s current range of R9 290X cards sitting at 4 GB of memory, listings for an 8 GB version have appeared at an online retailer. As far back as March, Sapphire was rumored to be building an 8 GB variant. Those rumors were supposedly quashed last month by AMD and Sapphire. However, AMD has since confirmed the existence of the new additions to the series. Pre-orders have appeared online and are said to be shipping out this month.


Image Credit: Overclockers UK

With 8 GB of GDDR5 memory and price tags between $480 and $520, these new additions, as expected, do not come cheap. The 4 GB versions of the R9 290X run about $160 less at the same retailer, so is the extra memory worth the premium at this stage? For people using a single 1080p monitor, the answer is likely no. For those with multi-screen setups, or with deep enough pockets to own a 4K display, however, the benefits may begin to justify the cost. Even at 4K, though, a single 8 GB R9 290X may not provide the best experience; a CrossFire setup, being less reliant on per-GPU speed, would benefit more from the 8 GB bump.
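As a rough illustration of why higher resolutions push memory requirements up, here is a back-of-envelope sketch. The buffer count and bytes-per-pixel figures are illustrative assumptions, not measurements from any particular game:

```python
# Back-of-envelope VRAM arithmetic: how display resolution scales the
# footprint of the basic color buffers alone. Assumes 32-bit (4-byte)
# pixels and triple buffering -- illustrative values, not game data.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Rough size of the swap chain color buffers, in MB."""
    return width * height * bytes_per_pixel * buffers / 1024**2

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
for name, (w, h) in resolutions.items():
    print(f"{name}: ~{framebuffer_mb(w, h):.0f} MB")  # 4K prints ~95 MB
```

The swap chain itself is small; the point is that 4K is exactly four times the pixel count of 1080p, and every resolution-dependent render target (G-buffers, shadow maps, post-processing buffers) scales with it, on top of the higher-resolution textures such displays invite.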

AMD’s 8 GB R9 290X cards are currently available for pre-order: a reference version for £299.99 + VAT (~$480) and a Vapor-X version for £324.99 + VAT (~$520). They are slated to ship later this month.
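For reference, the dollar figures above follow from the pre-VAT sterling prices at roughly the late-2014 exchange rate. A minimal sketch, assuming a rate of about $1.60 per pound (an approximation, not a quoted figure from the retailer):

```python
# Sketch of the GBP-to-USD conversion behind the quoted prices.
# RATE is an assumed ~late-2014 exchange rate; the listed prices
# exclude UK VAT (20%), matching the "+ VAT" in the article.

RATE_USD_PER_GBP = 1.60  # assumed
UK_VAT = 0.20

def to_usd(gbp, include_vat=False):
    price = gbp * (1 + UK_VAT) if include_vat else gbp
    return price * RATE_USD_PER_GBP

print(round(to_usd(299.99)))  # reference card, ex VAT -> 480
print(round(to_usd(324.99)))  # Vapor-X, ex VAT -> 520
```

UK buyers would pay VAT on top, so the sticker prices in-store run about 20% higher than the dollar equivalents quoted here.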



November 5, 2014 | 01:20 PM - Posted by nathanddrews

A $200 premium for 4GB extra RAM on an EOL GPU? Good luck.

November 5, 2014 | 03:05 PM - Posted by YTech

$200 sounds about right for 4GB of GDDR5 RAM.

Although, I do agree with you Nathanddrews. At such a high premium, an 8GB GPU with a short EOL doesn't seem worth it today.

Plus, Andre DeCoste suggests Crossfire will provide better performance for this particular GPU architecture. Double-NO on this one.

November 5, 2014 | 05:07 PM - Posted by Scott Michaud

The point was more that 8GB is not expected to provide much help for 4K/multi-monitor with just a single GPU -- you are probably compute-limited, too. It is when you start getting 2, 3, or 4 GPUs that you will more likely be memory-limited (at those resolutions).

The other direction is for games that require 6 or 8GB of VRAM because they are unoptimized ports from systems with 8GB of shared memory, but relatively low processing power (compared to an R9 290X).

November 7, 2014 | 10:50 AM - Posted by obababoy

Scott, well said. I totally disagree with Ryan. He thinks that just because he didn't receive an 8GB card to review, it won't be needed. If Shadow of Mordor used 3.8GB of VRAM on my 1080p monitor, and, like you said, console ports are easier now that consoles use x86 and have 8GB of memory, how is this not mandatory from here on out?

November 7, 2014 | 02:04 PM - Posted by aparsh335i (not verified)

It's not mandatory because we don't all play console ports...
If you don't play console ports then why would you want to pay extra for something you don't need?

At the same time, if you mainly do play console ports, then AMD offering this 8GB card is fine, I guess. My take is that they are going after the uneducated crowd that thinks memory = performance. The type of guys who don't know the model # of their video card but know how many GB of VRAM it has.

The #1 game in the entire world is not a console port and does not need 8gb of GDDR5 vRAM.

November 10, 2014 | 07:59 AM - Posted by obababoy

What I am getting at is that vendors are going to mirror game design now that the consoles are x86 and have 8GB of memory. It now costs less (mainly in time) to make a game for all three systems because the architectures are very similar, so ported games will happen often. Sure, there are PC-only titles, but what about Crysis 3? Console port? I think you speak for the minority of PC gamers in saying you don't play "console ports"..

btw, what is your #1 game haha.

November 5, 2014 | 03:49 PM - Posted by Anonymous (not verified)

Where do you get that the GPU is end of life? Last time I checked, which was about 2 min ago, the next series of GPUs is not due out for at least 4 or 5 more months. It will be EOL once AMD has stopped making them, not until then.

November 5, 2014 | 01:22 PM - Posted by Daniel Masterson (not verified)

Ya, that's some BS. I will wait until 4K is the standard and 8GB is the standard as well, maybe in like 2 or 3 years?

November 5, 2014 | 01:27 PM - Posted by MarkT (not verified)

A markup like this is the most BS I've ever seen; they should just cancel the product.

November 5, 2014 | 04:12 PM - Posted by Ophelos

But you'll be paying around $700 for the GTX 980 8GB versions.

November 5, 2014 | 03:00 PM - Posted by annoyingmouse (not verified)

Are they getting a clock speed bump or something to help justify the price premium?

November 5, 2014 | 04:05 PM - Posted by arbiter

Doubt it; the GPU runs hot and power-hungry as it is. I think most 290-series GPUs only get about an extra 100MHz over base, to maybe 1150 or so. It would take massive binning of them.

November 5, 2014 | 04:17 PM - Posted by Ophelos

Arbiter, I really hate how people like you always bitch about something running too hot or being power hungry, when in fact the AMD cards are not running as hot as you think they might be. I remember over 10 years ago when it was super easy to fry a graphics card, and hell, those things used a ton of power too, but no one bitched about it at all..

But now we've got annoying a-holes like yourself who always try to find the bad in something just because they can.

So the next time you wanna bitch about something running too hot or being power hungry, just go QQ in a corner somewhere so the rest of the world doesn't have to listen to it... Geez, such annoying people these days.

November 6, 2014 | 11:28 AM - Posted by annoyingmouse (not verified)

That seems like a disproportionate amount of hate directed at a rather uncontroversial comment.

November 5, 2014 | 06:23 PM - Posted by JohnGR

That £100 difference in price between a 4GB 290X and an 8GB 290X will probably be closer to $100-$150 than $160 or more.
In Europe, thanks to high taxes and many other factors, euro and pound prices usually map much more directly to dollar prices than the exchange rates would suggest.

November 5, 2014 | 08:45 PM - Posted by alkarnur

I can play Shadow of Mordor with textures on High on my venerable 7970, which requires the 3GB of VRAM the 7970 has.

Now textures can be set to Ultra (6GB of VRAM required) without having to subject oneself to the daily robbery that is Titan card pricing.

November 6, 2014 | 03:32 PM - Posted by arbiter

I ran the game on Ultra with the HD pack on a 4GB card and it ran fine. Yeah, it used all 4GB of RAM on the card, but the framerate still stayed at 60fps.

November 5, 2014 | 09:09 PM - Posted by Terje_P (not verified)

They have already been out in limited quantities.

Here is a review:
http://www.computerbase.de/2014-05/sapphire-amd-radeon-vaporx-r9-290x-oc...

November 6, 2014 | 08:40 AM - Posted by Anonymous (not verified)

Here's another. Slower than the 970 at 1080p, 5% faster at 4K. 14% faster at 4K than the 4GB OC 290X. Still $100 more than the 970. Meh.

http://www.tomshardware.com/reviews/sapphire-vapor-x-r9-290x-8gb,3977-6....

November 6, 2014 | 03:50 PM - Posted by arbiter

Yeah, when rumors of the 900 series started, AMD should have dropped an 8GB card. Kinda seems like a desperate move toward Ultra HD resolutions, where their cards still have the muscle.

November 9, 2014 | 02:30 AM - Posted by Anonymous (not verified)

Smart move.

The 900 series has a limited bus once you get into higher resolutions, AA, and multi-monitor setups. If the trend of PC ports slapping on huge amounts of textures to make the PC version stand out continues, then you'll need more memory to compensate for the crappy port job that's sure to be there. Less swapping means smoother gameplay with more features activated.

Sapphire showed 8GB R9 290s at CeBIT
http://www.anandtech.com/show/7860/cebit-2014-sapphire-shows-two-r9-290x...

At the very least they were planned, and this allows AMD to move Hawaii chips and highlight their advantages in those areas.
