The AMD Radeon RX Vega Review: Vega 64, Vega 64 Liquid, Vega 56 Tested

Author: Ryan Shrout
Manufacturer: AMD

Clocks, Power Consumption, Overclocking

Before we dive head-first into the world of benchmarks, it is important to look at how the technology and cooling capability that AMD built around the RX Vega graphics cards turned out. Let’s start with a look at how the measured clock speeds of the three new cards, the Vega 64, Vega 64 Liquid, and Vega 56, compare.

The Vega 64 Liquid has the highest sustained clock speeds, averaging 1663 MHz over the course of our Unigine Heaven loop. The standard Vega 64 averages about 150 MHz lower at 1513 MHz, and the Vega 56 brings up the rear with an average clock speed of 1433 MHz. If we compare that to the rated clocks from AMD, which we now know are supposed to represent the “typical” clocks users will see while gaming, all three are lower than expected. The Vega 64 Liquid is rated at 1677 MHz, with our result just behind that by 14 MHz. That difference is within a reasonable margin.

The standard Vega 64 card was rated at 1546 MHz, and our 1513 MHz result is 33 MHz lower. The Vega 56 is rated at 1471 MHz, but in our testing we saw it 38 MHz lower. AMD is closer than it has been, and I still think the move to reporting a “typical” clock speed is a welcome change. Hopefully over time the company can tweak this process to rate cards more accurately.
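
As a quick sanity check, here is a minimal Python sketch that reproduces the measured-versus-rated deltas using only the figures quoted above (the numbers come from this review; nothing else is assumed):

```python
# Measured average clocks vs. AMD's rated "typical" clocks (MHz),
# taken from the paragraphs above.
cards = {
    "Vega 64 Liquid": (1663, 1677),
    "Vega 64":        (1513, 1546),
    "Vega 56":        (1433, 1471),
}

for name, (measured, rated) in cards.items():
    delta = rated - measured
    print(f"{name}: {measured} MHz measured vs {rated} MHz rated "
          f"({delta} MHz / {delta / rated:.1%} shortfall)")
```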

A delta of 150 MHz between the air- and liquid-cooled RX Vega 64 cards is a substantial difference and one that will without a doubt result in performance increases. Though the price is $200 higher, the liquid cooler can keep the power-hungry and hot Vega 64 GPU under control and maintain clock speeds that are nearly 10% faster on average.

Detailed Power Consumption Testing

One of our architectural concerns when testing the Radeon Vega Frontier Edition was power consumption. While we don’t expect things to change much for the RX Vega 64, I was very curious to see how the new Vega 56 would stack up.

If you don’t know how we measure power, check one of our previous reviews. Our method measures power delivered directly to the graphics card through the PCIe bus and the auxiliary ATX power connections. This is not a basic “at the wall” measurement.
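
To illustrate the idea (a simplified sketch, not our actual capture tooling): total board power is the sum of volts times amps across every rail feeding the card. The rail readings below are made-up sample values.

```python
# Board power = sum of V*I over the PCIe slot rails plus each auxiliary
# power connector. These sample readings are purely illustrative.
rail_samples = {
    "slot_12V":  (12.1, 4.9),    # (volts, amps)
    "slot_3.3V": (3.3, 1.1),
    "aux_8pin":  (12.2, 12.4),
    "aux_6pin":  (12.2, 5.8),
}

board_power_w = sum(volts * amps for volts, amps in rail_samples.values())
print(f"Total board power: {board_power_w:.0f} W")  # ~285 W with these samples
```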

In both The Witcher 3 and Rise of the Tomb Raider, there is a distinct difference between the RX Vega GPUs and the GeForce Pascal-based products. The RX Vega 64 Liquid uses 350 watts in its default configuration while the standard RX Vega 64 consumes around 290 watts. The Vega 56 more or less hits its expected 210-watt TDP.

On the NVIDIA side of things, the primary competition for the RX Vega 64 is the GTX 1080, which uses 170-180 watts under a full gaming load. The GTX 1070, the target of the new RX Vega 56, consumes around 140 watts in both games here. The GTX 1080 Ti sits right at its 250-watt power draw rating, but it will outperform the RX Vega 64 and GTX 1080 by a wide margin.

Power consumption differences illustrate the relative efficiency of these GPUs. Assuming the standard RX Vega 64 matches performance with the GTX 1080, the Pascal GPU can offer similar performance at 37% less power. While enthusiasts and gamers are well known for their ability to overlook power draw, running at lower power means you can run at lower temperatures, lower noise levels, and possibly fit the product in smaller spaces. The RX Vega 56 uses 50% more power than the GTX 1070 (put another way, the GTX 1070 draws 33% less), so the same comparison would apply if performance is equal. We’ll dive more into that on our performance analysis pages.
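
Those percentages fall out of simple arithmetic on the round-number draws quoted in this section (exact values shift a point or so depending on the wattage you plug in):

```python
# Efficiency comparisons from the gaming-load draws quoted above (watts).
vega64, gtx1080 = 290, 180
vega56, gtx1070 = 210, 140

print(f"GTX 1080 vs Vega 64: {(vega64 - gtx1080) / vega64:.1%} less power")
print(f"Vega 56 vs GTX 1070: {(vega56 - gtx1070) / gtx1070:.1%} more power")
print(f"GTX 1070 vs Vega 56: {(vega56 - gtx1070) / vega56:.1%} less power")
```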

Overclocking RX Vega

Overclocking these three cards produced very different experiences. Below are the graphs of my peak overclocks and the resulting clock speeds for each.

Starting at the flagship level with the RX Vega 64 Liquid, I was only able to squeeze another 15 MHz out of the card. In fact, increasing the clock speed by even 1% in the Wattman slider would result in a crash or a black screen, even with the temperature target maxed out at 70C and the power target slider moved to +50%. Therefore, the differences you see for the RX Vega 64 Liquid in the graphs are the result of adjusting the power slider only – pretty disappointing. Clearly AMD has pushed the Vega 10 GPU to its limits already to get the Vega 64 Liquid to its current performance levels.

The standard RX Vega 64 card was able to run at 3% higher clock speeds in Wattman, resulting in an average clock speed that is 86 MHz higher than stock settings. This includes a +50% power target adjustment.

Overclocking on RX Vega 56 - Note that GPU-Z still reports the clock speed as the MAXIMUM, not the "typical." This will be updated in the future, I'm told.

The RX Vega 56 saw the biggest jump, accepting a 7.5% clock speed increase with the same +50% power target shift, resulting in a 94 MHz increase in average clock speed. Though the card remained stable even moving the clock slider to +10%, clock speeds never seemed to go above what we show here.

Another interesting trait I found, which may be a result of the current GPU-Z implementation for Vega, is that the clock speed steppings when overclocked were as small as 2-3 MHz. At stock settings, though, we found those steps to be much larger, on the order of 100 MHz in some cases! I don’t know what in the design would cause this kind of change, but it would be interesting if slightly overclocking the card allowed for less variance in frame times by lowering the clock speed steppings between states.
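
For anyone who wants to check this on their own card, the stepping sizes can be pulled out of a GPU-Z sensor log by diffing the distinct clock states. A hypothetical sketch (the sample trace below is invented for illustration):

```python
# Estimate clock "stepping" sizes from a logged clock trace: collect the
# distinct clock states and diff them. The trace below is invented.
trace_mhz = [1401, 1536, 1536, 1628, 1630, 1633, 1630, 1628, 1536, 1633]

states = sorted(set(trace_mhz))
steps = [hi - lo for lo, hi in zip(states, states[1:])]
print("clock states (MHz):", states)
print("step sizes   (MHz):", steps)   # mixes ~100 MHz and 2-3 MHz steps
```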

The power results of this overclocking deserve discussion as well. The RX Vega 64 Liquid card was pushing 440 watts with the power slider turned up, even though the performance increases we saw were near zero. The RX Vega 56 (grey line) moves the 210-watt card up to 310 watts, near the 50% increase we would expect from the power slider change on a product with headroom.

The air-cooled RX Vega 64 had problems similar to the Vega Frontier Edition in regard to temperature throttling. Leaving the fan settings untouched, we saw frequent and regular throttling from 340 watts of power draw down to 200 watts, which caused clock speed dips and noticeable stutter. By manually moving the fan speed up to 3400 RPM, which is very loud, the card could run unthrottled at those settings, providing more consistent performance and lower power consumption to boot (roughly 10 watts).

Though a single sample of each does not provide enough information for me to make any strong statements on the overclockability of the RX Vega family, my assertion is that the RX Vega 56 will have a good amount of headroom thanks to its out-of-box power target of 210 watts, even though it is essentially the same GPU and thermal solution as the RX Vega 64. The air-cooled Vega 64 has headroom to reach clocks nearing the liquid-cooled option if you are willing to put up with the noise levels of a very loud blower fan. (Partner cards with better coolers might work around this.) As for the RX Vega 64 Liquid, it seems that AMD is already pushing the Vega 10 GPU to near its peak capabilities, so I don’t expect much headroom for it even with the fancy cooler design. That 150 MHz delta between the air and liquid versions seems to be near the pinnacle of Vega 10 in its current state.


August 14, 2017 | 09:04 AM - Posted by Henry (not verified)

So what is the mining performance?

August 14, 2017 | 09:06 AM - Posted by Ryan Shrout

It's not good as of today. ~35 MH/s.

August 14, 2017 | 09:24 AM - Posted by Hishnash (not verified)

Did you manage to find newly optimized mining code? Mining code is typically tuned for each GPU, and given there are new instruction sets just for hashing on this card, using code compiled for the RX 580 is not a good benchmark.

August 14, 2017 | 09:40 AM - Posted by Ken Addison

We used Claymore Dual 9.8, which is optimized for Vega, as well as ethminer.

August 14, 2017 | 10:42 AM - Posted by Prodeous

That is really good news... hope it stays this low, giving us gamers and Blender (GPGPU) users a chance to actually get one.

August 14, 2017 | 11:14 AM - Posted by Ryan Shroud (not verified)

This card is a dog. More than a year late. A power guzzling POS. Only an AMD fankid would buy this trash.

August 14, 2017 | 11:34 AM - Posted by DocSwag (not verified)

Or someone like me who has a freesync monitor.

August 14, 2017 | 12:19 PM - Posted by Just an Nvidia User (not verified)

Yeah, AMD zealots will still defend it, saying incomplete drivers, etc. They had over a year longer to come out with a product that is overclocked near maximum (to just match the 1080, more or less). Definitely pushed way beyond the efficiency sweet spot of the architecture. It needed HBM2 or consumption would have been worse, or maybe it couldn't have matched the 1080 at all.

Could have delayed things a little longer and polished it but crap (GCN is outdated) doesn't shine up very well. LOL

August 15, 2017 | 06:39 PM - Posted by StripesOnLinux (not verified)

You are a rather daft one not to see that even Nvidia's SM/shader-based designs have not changed much generation to generation, while AMD's GCN keeps getting its share of improvements. What AMD lacks currently is the funding to come up with a mobile-first base GPU micro-arch design like Nvidia began with Maxwell's mobile-first design, which is different from Nvidia's compute/professional designs. AMD lacked the funding for 2 separate base designs, and nobody can change that fact.

So Nvidia's basic SM/shader designs are similar to AMD's GCN/NCU shader designs in that tweaks/features are added each "generation". Just because Nvidia's marketing uses an entirely different famous scientist's name does not imply any radically new GPU micro-arch; it is simply a completely different name for each "new" generation.

And the GTX 1080 Ti's 88 ROPs are directly responsible for those higher FPS metrics compared to AMD's compute-heavy Vega 64 design. So looking at Vega 56's 210 watts and 64 ROPs (the same number as the GTX 1080's 64 ROPs and Vega 64's 64 ROPs) should tell you that if Vega 56 had just a little more compute stripped out, maybe it could be clocked higher and get damn close to the 1080's performance metrics. Vega 56 with some AIO water cooling may do just that and get damn near the GTX 1080 in performance.

GCN/shader design is not outdated; it's just a name, same as for Nvidia and its shader designs. And all AMD really needs to do with Vega is keep removing excess compute to get a "mobile first" design, because that is all Nvidia really did with Maxwell.

AMD's Vega GPUs are power hungry because shaders/cores are power-hungry animals, same as Nvidia's shader cores. And to make a shader core able to run at those higher clocks requires deeper ALU/INT/FP pipelines, so both AMD and Nvidia have to use more transistors in their high-clock-speed GPU designs.

GCN has nothing to do with shader counts or shader counts relative to ROPs/TMUs, and that's where Nvidia gets its lead: by stripping out shader cores and keeping the ROP/TMU-to-total-shader-core ratios in check, providing just enough shaders to serve the ROP/TMU flow and no more than that.

Vega is not a gaming-only GPU design; it's a compute design also. And some of that extra compute helps with DX12/Vulkan games that are designed to make use of the extra shader resources on Vega. It helps with coin mining also, and that's a GPU sold, for whatever reason, toward increasing AMD's revenues to match those Zen/Epyc revenues that are about to dwarf AMD's gaming-only focused market/revenues in size!

I'm all for GPU compute usage putting gaming-only usage of GPUs on the back burner, and with DX12/Vulkan that back burner may still net increased gaming performance, with compute becoming necessary for gaming alongside those ROPs/TMUs!

August 14, 2017 | 12:41 PM - Posted by Power (not verified)

Or someone like me wants open source driver.

August 16, 2017 | 12:53 PM - Posted by Jefffrey Hramika (not verified)

Who cares? Nvidia, even with the Founders Edition tax and G-Sync monitor prices, is still a better value for the performance.

August 16, 2017 | 07:34 PM - Posted by Just an Nvidia User (not verified)

Especially since the news is now that if you want a non-pack Vega (the standalone black-shroud card), it will be $100 more. The original price was limited to launch only, and prices increased to the new MSRP after an hour when initial supplies ran out.

Not a good value at all now. Fanboys and miners, get your wallets out.

As of now, with the new mining driver from AMD, Vega 56 can get up to 41 MH/s in Ethereum.

August 28, 2017 | 11:49 PM - Posted by Streetguru2 (not verified)

Vega 56 is the only one worth buying and only at $399, so it wins for now but no one can actually buy one. 1070s are still well over $400 right?

You should also 100% undervolt Vega to improve performance. Hopefully AIB cards are actually available as well in September, it's going to make Vega look far better, not sure why AMD still bothers to release terrible blower cards that make their cards look really bad.

August 14, 2017 | 03:17 PM - Posted by MineAndDine (not verified)

No matter the GPU video memory size the Ethereum mining folks may require for the hashes/hash tables in the future, Vega's HBCC/HBC (using the HBM2 as a last-level cache) and 512TB of virtual memory addressing will be hard to trip up with any hash table sizes larger than the HBM2's actual VRAM size on any Vega SKU! Vega can spill its HBM2 cache over into/out of main system memory via its HBCC/HBC technology to fill up system DRAM space, and do so with relatively little degradation in hashing performance. The mining code will love that HBCC/HBC-HBM2 IP in short order, so get Vega while you can; stocks will be in short supply.

August 14, 2017 | 12:23 PM - Posted by Just an Nvidia User (not verified)

Who started the rumor that it was 70-100mh? Probably someone who had AMD stock and wanted to unload it before it tanked. Do we have to wait two years for this to materialize as well? Driver magic. LOL

August 14, 2017 | 04:21 PM - Posted by DaFutureIsPlasticSays (not verified)

Not really true; Vega only needs to get its MH/s-per-watt metric a little better, and undervolting/underclocking will fix that power usage metric PDQ!

They have already been tweaking on Vega FE, those Miner 49ers with the software chops to take advantage of Vega's new IP, and that software development process takes a little time! But in short order those mining hash browns will be as delicious as gamer tears!

August 14, 2017 | 04:47 PM - Posted by Anonymouse (not verified)

How do you think the HBCC will improve performance with new drivers?

August 16, 2017 | 09:12 AM - Posted by ThatzSomeComputeYouGotThere (not verified)

This entire article is a good read on Vega's compute performance, and it shows Vega 64 beating Quadro in some interesting ways; consumer Vega offers a 1/16 DP-to-FP32 compute ratio vs Pascal's 1/32. And that NVIDIA Quadro P6000 SKU has 24GB of video memory.

[See the LuxMark chart]

"Well, well, well. Would you look at this? AMD’s Radeon RX Vega 64 falls just behind the Radeon Pro WX 7100 in Cinebench, but it soars past the same GPU in both LuxMark and V-Ray. In fact, I should stress that your eyes don’t deceive you: Vega 64 really does beat the entire lineup in LuxMark. That includes beating out the $5,000 Quadro P6000."
[See page 2 titled:"Rendering: Autodesk 3ds Max, Cinebench, LuxMark & V-Ray Benchmark"](1)

At the Techgage website concerning mining potential the author states:

"Hot damn. I feel like these kinds of gains are those that AMD should promote out the wazoo. So many reviews posted today are likely to paint a rough picture of this card’s gaming performance, but on the other side of the fence, compute performance on Vega quite simply kicks ass. The results here may be able to give an impression of Vega 64’s future mining performance. Mining benchmarks you’ll see around the web in other launch reviews will show an edge over a top-end NVIDIA card. I would not be surprised if AMD optimizes its driver sometime in the future to vastly improve mining performance. Vega should technically be better than the 30MH/s you’ll see reported today, based on all of the compute performance seen here."(1)

[Go to the page titled "Sandra: Cryptography, Science, Finance & Bandwidth" on page 5 of the article]

(1)

"A Look At AMD’s Radeon RX Vega 64 Workstation & Compute Performance"

"by Rob Williams on August 14, 2017 in Graphics & Displays"

https://techgage.com/article/a-look-at-amds-radeon-rx-vega-64-workstatio...

August 14, 2017 | 09:07 AM - Posted by Searching4Sasquatch (not verified)

All that hype and waiting was for THIS? Gamers have had this performance (or faster) for more than a year already. I can't wait to see how long it will take the AMD fanboyz to start up with "Just wait until Navi!"

August 14, 2017 | 09:41 AM - Posted by Benjamins (not verified)

Well, this is not good for the GPU market; it means Nvidia doesn't have to cut prices or release new cards.

August 14, 2017 | 10:13 AM - Posted by CB

+1

I think this is what frustrates me the most. Nvidia won't be forced to compete because AMD failed.

I guess expecting AMD to be competitive with both Intel and Nvidia is too much to ask...

August 16, 2017 | 03:30 PM - Posted by IdoNotSeeFailureAtALL (not verified)

Go look at Vega's compute benchmarking against even a Titan XP, with the Titan XP even having more VRAM, and you can see where your worried assessment is wrong. Gaming is not going to be AMD's most profitable usage for the Vega GPU micro-arch, and AMD's Epyc sales/revenues will eclipse AMD's gaming-only sales in short order. [See the Techgage website references in some of the posts on this page for some interesting Vega 64 compute benchmarking results against even the Quadro P6000 24GB SKU.]

AMD does not even need to outright beat Nvidia in the top-end DX11 gaming metrics, as AMD's mainstream Polaris and now high-end Vega SKUs are really showing their form in the DX12/Vulkan-optimized titles. But gaming markets are not really as interesting as the CPU/GPU compute and AI markets, where, no matter HBM2's current costs, those SKUs will command the highest markups, and that professional market will get the cherry-picked Vega dies with the best thermal metrics.

You should also note that those cherry-picked Vega 10 dies will be clocked lower in the professional GPU SKUs, and that Vega 10 will do very nicely on that compute performance/watt metric, what with Vega 10's massive number of shaders providing plenty of DP FP and SP FP GFlops, and FP16/8-bit GFlops also! Pascal's DP FP rate is only 1/32 its SP FP rate while Vega 10's DP FP rate is 1/16 its SP FP rate!

AMD's management told Raja that they needed a compute-focused Vega GPU micro-arch to complement AMD's Zen/Epyc CPU micro-arch, and that's what Raja produced; Vega games pretty well also. AMD still stands to gain performance on its Vega SKUs now that the Vulkan/DX12 graphics APIs are being used for more and more gaming titles. So look for Nvidia to be left, like Intel, scrambling. Only in Nvidia's case, it will be to get that compute ability back into its consumer GPU SKUs. That AMD/Nvidia async-compute debate will be raging again with each and every new fully DX12/Vulkan-optimized title that becomes available.

The only market where AMD maybe needs to create a mobile-first, Maxwell-like GPU micro-arch is the mobile/laptop market, where battery power needs conserving and thermal headroom is lacking. But AMD's Epyc revenues will by themselves give AMD enough to fund AMD/RTG's creation of a more energy-efficient Vega mobile GPU micro-arch variant.

Nvidia's JHH is about to be able to raise prices, as AMD appears to want to set a range of wholesale prices it charges its retail channel partners. This is so AMD can benefit more from the spot demand pricing that the retailers have mostly been profiting from at AMD's expense! So Nvidia's prices are going up, as are AMD's, until the miners have their needs met! And AMD has R&D and other costs to amortize as quickly as possible, so AMD will have more pricing latitude later when GPU demand pricing relaxes and GPU prices fall.

August 14, 2017 | 11:17 AM - Posted by zme-ul

To the contrary, it's very good for the market, because it will be the first time in a very long time that GPUs will have a longer life span, and that's very good for the consumer considering the price premiums we had to pay for Pascal.

also, nVidia expressed no intention to launch consumer Volta this year - taken from the recent earnings call

August 15, 2017 | 11:26 AM - Posted by Too-Old-Yet-Too-Young (not verified)

Question for people who were around during this time: was it really better in the late 90's, when the new GPU you bought would be outdated and obsolete by year's end, or now, when a three-year-old GPU is still regarded as high end and quite capable?

Certainly as a reviewer, having new stuff out every 6 months that dwarfs everything out previously is great and a lot of fun, but I'd imagine that to a gamer, it's frustrating.

August 15, 2017 | 06:20 PM - Posted by Anonymously Anonymous (not verified)

It's much better now for the consumer when you buy something and it lasts a long time.
I got a 7970 back at the early part of 2012(reference design card) and ran it over 4 years for 1080p gaming until I got the funds to get a 1440p, 144hz VRR monitor and compatible GPU. Otherwise I'd still be running it, still a great card for 1080p gaming.

August 16, 2017 | 05:49 PM - Posted by rgray318 (not verified)

Exactly this; now NVIDIA themselves are saying they probably won't be releasing Volta for gamers until late 2018 (unless AMD drivers can drastically improve Vega, of course). If we all thought Vega was late, lol, just wait for Volta; NVIDIA will rewrite the book on being late with their next architecture. No sense at all dumping millions of dollars into a new GTX architecture that you don't even need. Thanks to AMD's lack of extreme performance and high power consumption, NVIDIA can keep asking multiple thousands of dollars for their pro Volta cards and keep the aging 10XX series around for much, much longer. Not only that, think about this: what if NVIDIA decides to just drop the process node down on their current Pascal, say down to 12nm or 10nm, and keep the same prices? LOL, NVIDIA knows how to win and knows how to keep the market anti-consumer as long as they want. AMD is being pushed into being anti-consumer, and not only that, it looks to me like AMD is pushing compute performance over anything else (good for them, actually). If it weren't for the fact we need to limit anti-consumer practices I would be purchasing the 1080, but I can't do that knowing AMD is the one who needs the money more in order to try and keep competition strong. It's all about having competition in the market space. Without competition between companies, we are all going to be BIG time losers. That absolutely can not be allowed, PERIOD.
Here is the old Volta roadmap (before it was changed) which says 2016, lol. Now it's going to be late 2018, especially for us gamers. All thanks to AMD not having enough money to pull off a decent Vega architecture for gaming. http://cdn.wccftech.com/wp-content/uploads/2017/04/05791082-photo-nvidia...

August 14, 2017 | 11:57 AM - Posted by Just an Nvidia User (not verified)

Not long at all; look at a post below. Wait for Navi. LOL

Vega leaks showed it to be an underperformer compared to Pascal. No big surprise here.

There is no disappointment in all of AMD land, for there are plenty of fanboys that will still buy one of these.

Hurry up and buy RX Vega now at today's prices (as much as Nvidia cards) but with less performance per watt (over 33%). Those fine wine drivers might get you there in 2 years, but by then the card will be 50% off retail.

If you've got to have Vega, wait a few months until they rebate it some.

August 14, 2017 | 01:04 PM - Posted by quest4glory

Most people don't buy the 1080 Ti anyway. So, guess what? You have a situation where gamers can now choose from competitive products in all ranges below the high-end 1080 Ti.

They did not have that choice before today.

Prices will probably go down as a result of competition.

Competition is almost *never* a bad thing.

August 14, 2017 | 09:11 AM - Posted by CB

Gold Award? Really? Is that the PC Perspective participation trophy?

Newegg is already charging $100 more than MSRP.

Complete effing fail.

I can't believe I actually considered replacing my 1080.

August 14, 2017 | 09:11 AM - Posted by Ryan Shrout

The Vega 56 got the award, and nothing has been IN stock yet!

August 14, 2017 | 09:18 AM - Posted by CB

https://www.newegg.com/Product/Product.aspx?Item=N82E16814202300&cm_re=r...

64's in stock as of posting.... $599.99 (Read in Josh doing the voice thing)

Will be interesting to see how much markup is on 56's when they release later this month.

August 14, 2017 | 09:33 AM - Posted by GPU Fan (not verified)

Those $599 cards are the Packs (see the free games listed).

It's $499 without the Packs here.

August 14, 2017 | 03:02 PM - Posted by MineAndDine (not verified)

Those Packs will not matter to those Mad Hatter miners, so look on eBay for those very Packs' parts not needed for mining to go up for bid. There are even Nvidia fans looking to get a Vega 64 SKU to sell at enough markup to the Mad Hatter miners to afford the hardened Nvidia fan a GTX 1080 Ti on that little bit of wheeling and dealing.

It's a mad, mad mining dash in that mad, mad GPU coin mining world, and it may even net Nvidia itself some extra indirect sales via some Vega back-room markup deals from Nvidia fans with their Vega purchasing reservations already in place, making enough money off of Vega to get their GTX 1080 Ti fix filled for less money in that sort of Mad Hatter mining market dealing.

August 14, 2017 | 09:20 AM - Posted by terminal_addict (not verified)

Newegg is showing Out of Stock for all but one part. Did they already sell out, or have they just not received them yet?

August 14, 2017 | 09:25 AM - Posted by Crono (not verified)

They sold out. Probably within a minute, but definitely within 5 minutes.

August 14, 2017 | 12:45 PM - Posted by Clmentoz (not verified)

What are your thoughts on a dual RX Vega 56 on one PCIe card variant, with maybe not as high a core clock, and the two RX Vega dies communicating via the Infinity Fabric protocol rather than a PCIe protocol method? The Infinity Fabric on the Zen CPU cores appears to support more cache coherency capability among the Zen processor cores across dies/CCX units. So imagine the Infinity Fabric and 2 Vega 56 dies on the same PCIe card, or in a dual-GPU MCM type of arrangement, with the GPUs having a more unified HBC/HBM2 cache-sharing arrangement managed by the GPU dies' HBCC controllers, linked up logically via the Infinity Fabric IP.

August 14, 2017 | 01:53 PM - Posted by Vector3 (not verified)

So, Vega 56 gets the gold...

What does Vega 64 get? Bronze? Did it even make it to the medal rounds?

August 14, 2017 | 01:40 PM - Posted by Spunjji

I call bullshit on your post. What were you going to replace your 1080 with (stop making Freud seem correct)? There is no serious reason to consider replacing a card that good.

August 15, 2017 | 12:02 AM - Posted by Thatman007

Vega 56 is complete trash. 300W power consumption and terrible performance. Wonder how much AMD had to pay to get that award.

August 15, 2017 | 08:37 AM - Posted by Anonymous156 (not verified)

Vega 56 used around 200w. Might want to get your eyes checked... I suppose the 1070 also has terrible performance then...

August 14, 2017 | 09:21 AM - Posted by remc86007

Is the Vega 56 overclock being artificially held back? I don't understand why it can't at least match the average clocks of the Vega 64 (air). It seems, if anything, it would clock a bit higher than the 64. Was AMD worried about their $400 part being within 3% or so of their $500 part when overclocked?

Ryan, if you have the time, please do more Vega 56 overclock testing, especially once partner boards are available!

August 14, 2017 | 01:14 PM - Posted by remc86007

Answering my own question after reading the answer elsewhere. Apparently the 56 is hard-limited in the BIOS to 300W draw. If that limit can be removed (hopefully in partner boards), we could be looking at 1080 performance for $400.

August 31, 2017 | 01:43 PM - Posted by Stefem

At more than 300W :)

August 14, 2017 | 09:28 AM - Posted by Humanitarian

Ugh, so much for Newegg and Amazon selling the 64 air at 499.

Literally no reason to buy at their 599 price.

August 14, 2017 | 09:38 AM - Posted by Joseph Taylor (not verified)

So Newegg Canada has Vega RX 64 priced $200-300 CAD above AIB GTX 1080. AMD likes to talk a good game about prioritizing gamers above miners but they would sell effectively 0 cards (in Canada at least) when they are priced like 1080Ti if it weren't for miners.

August 14, 2017 | 09:43 AM - Posted by Dark_wizzie

The 6 on the Witcher 3 results page, at the end of that page where it says Vega is 6% faster than the 1070, doesn't have the percentage sign.

Your review will now be 0.000000000001% better once it's fixed! :P

August 14, 2017 | 10:53 AM - Posted by Ryan Shrout

LOL will fix!

August 14, 2017 | 09:49 AM - Posted by Searching4Sasquatch (not verified)

https://media.giphy.com/media/26n6Ocpiegpl3wXCg/giphy.gif

August 14, 2017 | 09:56 AM - Posted by Lucidor

The comparison table on the first page has the wrong memory bandwidth figure listed for Vega 56.

August 14, 2017 | 10:55 AM - Posted by Ryan Shrout

Fixed! 409.6 GB/s it is.

August 14, 2017 | 10:02 AM - Posted by Anon-y-mous (not verified)

Price/performance leadership, are you serious? The street price for Vega 56 will be much higher, and if we go by MSRP then the GTX 1070 wins, since it is $349 vs $399. Add the higher energy costs to that and you get a mediocre price/performance ratio for gaming.

August 14, 2017 | 11:36 AM - Posted by TrollFinder (not verified)

Provide a link to a single $350 GTX 1070 available and in stock on Amazon.

August 14, 2017 | 01:31 PM - Posted by CB

"Provide a link to a single $399 Vega 56 available and in stock on Amazon."

FIFY

August 14, 2017 | 01:44 PM - Posted by Jeremy Hellstrom

Show me a link to buy an Amazonian Loch Ness monster for tree fiddy.

August 14, 2017 | 03:34 PM - Posted by Just an Nvidia User (not verified)

Mining has inflated 1070 prices. However, you can get a 1080 cheaper than a 1070 or Vega 64 now: $540.

https://www.amazon.com/Gigabyte-GeForce-Graphic-Computer-Graphics/dp/B07...

August 14, 2017 | 06:37 PM - Posted by Anonymously Anonymous (not verified)

Amazon is disappointing, but I did find one on ebay for $3.50 at time of writing this comment:

https://www.ebay.com/p/Water-Horse-Crusoe-Loch-Ness-Monster-Toy-Plush-6-...

August 14, 2017 | 10:27 AM - Posted by Alamo

It's probably the worst GPU release from AMD in a long time.
How can a 480mm² die get a Gold Award for barely matching a competitor with a die 170mm² smaller, a year later?
Raja needs to put down the bottle and the cigar; Vega was 100% his design after all. I wonder if he will survive this, because IMO the board should fire him.

August 14, 2017 | 11:21 AM - Posted by zme-ul

I seriously doubt it's anything but a redesign of Fiji and that's a redesign of Hawaii - GCN finally showing its limits

August 14, 2017 | 11:35 AM - Posted by DocSwag (not verified)

Look at the anandtech review, a lot of work was put into getting higher clock speeds (hence the much higher clocks than Polaris)

August 14, 2017 | 12:14 PM - Posted by Alamo

This was not a simple die shrink of Fiji; they changed a lot of stuff, important stuff that should yield significantly more performance than Fiji, but they ended up with worse perf.
The power draw I get; from Polaris and Ryzen it was clear that GloFo's node isn't up to the task for high frequency, where power just skyrockets once it reaches a certain point.
But what I do not get is the GPU core efficiency; it doesn't scale as it should. It's like Fiji all over again, as if AMD doesn't know how to optimize workloads past 2.5k stream processors.

August 15, 2017 | 12:42 PM - Posted by James

GPUs are very sensitive to software optimization. If we had DX12 based games (designed from the ground up, not hacked in) with proper optimization for Vega, I suspect Vega cards would outperform the Nvidia competition by a significant margin. Unfortunately, optimization will be targeted at the current market leader, helping them maintain a near monopoly position. Nvidia has always seemed to have a policy of there will be another GPU in 6 months, so they are not very forward looking and they don't care how their cards perform in a year or two. In fact, it is better if their cards perform worse in a year or two so that loyal Nvidia customers will buy a new Nvidia card to replace the obsolete one. Given the stagnation in process tech, I suspect that people will be keeping their cards a lot longer than previous generations. The move back to 16-bit could be a big thing that Nvidia cards don't have yet (great for planned obsolescence). This doesn't help AMD sell cards now unfortunately. It looks like the trolls are out more in force for the GPU release than they are for the cpu release though. Interesting. Maybe they need to kill it quick before anyone optimizes software for it. I personally would probably buy a Vega 56 if it wasn't for the inflated prices.

August 15, 2017 | 04:42 AM - Posted by Power (not verified)

Please, take into account the area of memory dies.

August 15, 2017 | 08:21 AM - Posted by Alamo

it is 484mm² for the gpu die alone not including the hbm2 stack, that is why it's so weird to see this kind of performance.

August 14, 2017 | 11:11 AM - Posted by Anonymous2 (not verified)

If I may be constructively critical, the average FPS performance summary tables are all very confusing, alternating between V64 Air, V64 Water, and V56, and the cards compared against change within each table.

I suggest consolidating these summary tables into one giant table on its own page as an overall performance summary.

August 14, 2017 | 11:14 AM - Posted by Anonymous1234321 (not verified)

Sorry for my English level; I am a novice.

Ryan Shrout and Brian Maltavino, please answer my query if this information is already known. Thank you in advance:

1. I want to know, do the Vega cards have the Draw Stream Binning Rasterizer enabled on this driver version?

2. Are the Primitive Shader features enabled?

3. I want to estimate the potential improvement of the card. Thank you for all your hard work on the review and the in-depth analysis of the technology.

I will purchase via your link when the card gets stock.

August 14, 2017 | 11:16 AM - Posted by Ryan Shroud (not verified)

***edited by Jeremy ***

Start acting like you are old enough to have been potty trained, the bunch of you.

August 14, 2017 | 10:09 PM - Posted by quest4glory

"the bunch of you."

Seriously?

August 15, 2017 | 01:40 PM - Posted by Jeremy Hellstrom

Yes, the thread I killed was worse than the usual crap spewed out in the comments.

August 14, 2017 | 11:34 AM - Posted by PJ (not verified)

It looks like they will be competing on price until drivers improve, unless someone is interested in 4K gaming, in which case the Vega 64 is the clear leader. If the Vega 56 is actually available for purchase in September, and not sold out to miners, then that will be the one I go with to replace a GTX 970.

August 14, 2017 | 12:04 PM - Posted by Just an Nvidia User (not verified)

Vega 64 does not lead in 4k gaming. That crown is held by Nvidia's 1080ti. 1080 is not for 4k gaming really so by extension neither is Vega 64. These cards are best at 1440 resolution. It's why the subjective tests they ran while showing off Vega 64 against a 1080 ran in 1440. Look up a few reviews that show 4k results along with frame times. Good luck with your Vega 56. Still not as good as a 1070 in performance per watt but much better than Vega 64 is.

August 14, 2017 | 01:21 PM - Posted by Clmentoz (not verified)

With the 1080 Ti's 88 ROPs against the Vega 64's 64 ROPs, that 4K gaming lead should belong to the GTX 1080 Ti. Now, with more DX12/Vulkan gaming and maybe more gaming features depending on Vega's extra compute, things may age differently.

And no one should focus on DX11 in the future more than they currently do, to an excess, as DX12 and Vulkan are in the process of taking over. So the question is whether game developers will begin to make more use of any extra compute for gaming effects over the next year and longer, before the Navi and Volta SKUs arrive.

There is also the question of FP16 and more AAA games taking advantage of FP16 in order to do more at 4K.

No reasonable person expects the RX Vega 64 to beat the GTX 1080 Ti, with that 88-ROP lead over any of the current Vega 64 and 56 offerings, but Nvidia lacks as much available compute compared to AMD, and currently the miners may like that more!

But the games makers are eyeing GPU compute, and they are also eyeing Vega's HBCC/HBC IP and that IP's ability to make 512TB of addressable texture/mesh data space available, with the HBM2 acting as a last-level cache that leverages all of the PC system's regular DRAM and NVM memory paging pools, for an effective VRAM addressing space of 512TB on any Vega micro-arch based discrete GPU SKU.

So there is this Vega HBCC/HBC and HBM2 IP, the FP16 IP (rapid packed math), and the new DX12/Vulkan APIs able to do more gaming effects with a GPU's available compute. And there is even the Vega GPU micro-arch's support for the Infinity Fabric, and the potential for maybe a dual Vega 56 on one PCIe card variant that can use the Infinity Fabric IP to make those 2 Vega 56 dies act as one bigger die logically, similar to the way the Infinity Fabric does for the Zen CCX units/Zeppelin dies on TR/Ryzen and Epyc SKUs.
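
For what it's worth, the 512TB figure quoted above is consistent with the 49-bit virtual address space AMD has described for Vega's HBCC (assuming that 49-bit figure):

```python
# Vega's HBCC is described by AMD as using 49-bit virtual addressing,
# which is exactly where the 512 TB number comes from.
address_bits = 49
print(f"2**{address_bits} bytes = {2**address_bits / 2**40:.0f} TB")  # 512 TB
```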

August 14, 2017 | 11:40 AM - Posted by WhyMe (not verified)

Do we know if AMD has enabled tile-based rendering? IIRC, rumors said it was disabled in early testing/drivers.

August 14, 2017 | 10:20 PM - Posted by DocSwag (not verified)

I am relatively certain it is enabled.

August 14, 2017 | 12:25 PM - Posted by Clmentoz (not verified)

"The AMD Radeon RX Vega Review: Vega 64, Vega 64 Liquid, Vega 56 Tested"

Really the title should read: The AMD Radeon RX Vega Preliminary Review(with lots of DX11): Vega 64, Vega 64 Liquid, Vega 56 Tested.

And yes, the power metrics on the Vega 64 are going to be up there, given the RX Vega 64's greater shader/core count!

RX Vega 64's:

Shading Units: 4096
TMUs: 256
ROPs: 64
Compute Units: 64
Pixel Rate: 98.94 GPixel/s
Texture Rate: 395.8 GTexel/s
Floating-point performance: 12,665 GFLOPS

And the RX Vega 56's:

Shading Units: 3584
TMUs: 224
ROPs: 64
Compute Units: 56
Pixel Rate: 94.14 GPixel/s
Texture Rate: 329.5 GTexel/s
Floating-point performance: 10,544 GFLOPS

And look at the RX Vega 64's TMU and ROP counts relative to the RX Vega 56's. If only AMD would design its GPU micro-archs in such a way as to be able to strip out only the excess compute, rather than any ROP/TMU units, it could match Nvidia's power usage metrics more closely. That said, Vega 56's remaining TMU/ROP capability is rather close to the Vega 64's TMU/ROP capability.

The GTX 1080's:

Shading Units: 2560
TMUs: 160
ROPs: 64
SM Count: 20
Pixel Rate: 110.9 GPixel/s
Texture Rate: 277.3 GTexel/s
Floating-point performance: 8,873 GFLOPS

The GTX 1080 is closer to the RX Vega 56's feature set, but Nvidia has stripped out more compute and clocked its SKUs higher. So if AMD could strip out more of Vega 56's compute and get the clocks up there, then the RX Vega 56 could probably get closer. If AMD could do the same for its Vega 64 SKUs, stripping out only the compute, then those clocks could be high enough to maybe get the Vega 64 closer to the range of the GTX 1080 Ti.

The GTX 1080 Ti's:

Shading Units:3584
TMUs: 224
ROPs: 88
SM Count: 28
Pixel Rate: 139.2 GPixel/s
Texture Rate: 354.4 GTexel/s
Floating-point performance: 11,340 GFLOPS

And looking at the GTX 1080 Ti's ROP count lead (88 ROPs) over the RX Vega 64 (64 ROPs) and 56 (64 ROPs), we can see why Nvidia can flip those FPSes out there. I think that if the RX Vega 56 can be overclocked higher, it may get closer to the GTX 1080 than the RX Vega 64 will be able to get to the GTX 1080 Ti, given the Ti's 88-ROP lead over the RX Vega 64's 64 ROPs.

And we can see why Nvidia can remain ahead of AMD in the DX11 gaming-only benchmarks: it's those ROPs, and less compute getting in the way of higher clocks for Nvidia. The jury is still out on Vega's DX12 gaming performance across a larger sample of fully DX12-optimized titles.

AMD and the games makers have the usual gaming/driver tweaking to do, but more DX12-optimized titles are available this time around. AMD with Navi needs to totally decouple its compute from its ROP/TMU functionality, in such a way that AMD can come up with a base GPU micro-arch where it can fuse off/power gate only compute, while leaving in just enough ROP/TMU resources to flip out the frames and win the FPS contest.

That said, I'd rather have the extra compute over any FPS overkill above a 60-140 FPS low/high metric for an optimal 4K gaming target with settings turned down. There need to be more FPS variability gaming benchmarks available also.

Rumors of dual Vega GPUs on a single PCIe card abound also, so maybe a dual RX Vega 56 with less compute and more ROP/TMU capability sounds interesting. And with AMD's Infinity Fabric support in its Vega GPU micro-arch based products, maybe the 2 Vega dies on one PCIe card communicating over the Infinity Fabric can be made to act more like a single monolithic GPU core design to the graphics API/driver software. That prospect has me excited!
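
As a side note, the throughput figures listed in the comment above can be reproduced from the unit counts and the rated boost clock. A quick sketch for the Vega 64 numbers (FMA counted as 2 ops per clock; the 1/16 FP64 ratio per AMD's consumer Vega spec):

```python
# Reproducing the Vega 64 throughput figures from unit counts and clock.
shaders, tmus, rops = 4096, 256, 64
clock_ghz = 1.546  # rated boost clock

print(f"Pixel rate:   {rops * clock_ghz:.2f} GPixel/s")   # ROPs x clock
print(f"Texture rate: {tmus * clock_ghz:.1f} GTexel/s")   # TMUs x clock
print(f"FP32: {2 * shaders * clock_ghz:,.0f} GFLOPS")     # 2 ops (FMA) x shaders x clock
print(f"FP64: {2 * shaders * clock_ghz / 16:,.0f} GFLOPS (1/16 rate)")
```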

August 14, 2017 | 01:50 PM - Posted by Hood

All those words and basically said nothing; sad. Someone needs to start a referral service, like "Ask Gary", but for AMD victims. Professional help may be required before they can move past this over-hyped disappointment, and get on with their lives.

August 14, 2017 | 01:52 PM - Posted by Jeremy Hellstrom

What is an Ask Gary?

August 14, 2017 | 02:27 PM - Posted by Clmentoz (not verified)

And yet you are so personally offended by someone showing some actual ROP/TMU figures and doing a little bit of simple inductive and deductive reasoning in an attempt at getting at some of Nvidia's ROP advantages: Nvidia using less compute and more ROPs to get those frames flung out there at all costs, to win the FPS/power usage metrics in mostly DX11 titles. All while there are now some interesting DX12/Vulkan title results at some other websites, with Vega doing well (in DX12/Vulkan) in spite of some ROP disadvantages or clock speed disadvantages against Nvidia's GTX 1080/1070 SKUs.

And you can only attack the number of words, or the TechPowerUp database GPU figures that I used in comparing and contrasting the ROP/TMU/compute resource differences and how they affect the ability to fling frames out there for that FPS metric contest.

I personally think AMD did better, and will do better, focusing on the professional CPU/GPU markets for the mad revenues that those markets will provide, and that AMD's Vega micro-arch based professional GPU accelerator/AI accelerator performance is more necessary for AMD's revenue growth than any gaming-only performance metrics. And AMD's Vega SKUs are great at competing with Nvidia's GTX 1080 and GTX 1070 SKUs for DX12/Vulkan gaming, and not so bad for the DX11 gaming that is on its way out.

So that "sad" applies more to your taking offense at a little closer look at things for gaming's future rather than that DX11-only past. And what is really sad is that you cannot offer any constructive rebuttal other than expressing your "sadness". Now that just leaves the question of your true sadness, which has more to do with your lack of ability to offer any cogent rebuttal to what was posted.

August 14, 2017 | 03:52 PM - Posted by Just an Nvidia User (not verified)

By the time DX11 is gone and DX12 takes over, Vega 64 will be in a dirt nap too. It's meaningless to compare the future to what is today. DX11 just works, while DX12 requires lazy and overworked programmers to code everything video-card-related into it. That costs more time and money.

Compute overkill isn't needed in games, I'm sorry to say.

If AMD starts fusing two Vega cores onto one die, I'm going to start buying stock in electric companies and become a millionaire.

August 14, 2017 | 04:47 PM - Posted by DaFutureIsPlasticSays (not verified)

Vega has a lot of compute and who the hellz cares about gaming only metrics, the real Epyc money is in the Professional CPU/GPU accelerator/AI markets for AMD, and compute matters more there than any FPS metric!

Gamers can get over themselves they do not matter at all in the larger scheme of things for both AMD's and Nvidia's future revenue growth potential!

August 14, 2017 | 12:26 PM - Posted by a22 (not verified)

so which version are we getting for the vega 64 air cooled? the ones with black plastic shroud or the brushed metal limited edition ones?

August 14, 2017 | 12:41 PM - Posted by Anonymously Anonymous (not verified)

The limited edition is limited; once they are gone, only the black shrouds will be left, that is, until the AIBs come out with their custom coolers. At that point, who cares about the reference designs.

August 14, 2017 | 12:45 PM - Posted by quest4glory

Now someone needs to do 4X Crossfire testing with Vega on Threadripper. I know more than 2X is diminishing returns, or even negative scaling in some cases, but it's one of the things AMD has that Nvidia doesn't as of the Pascal generation.

August 14, 2017 | 01:26 PM - Posted by AnonCannon (not verified)

Hi Ryan - any intelligence on availability? Looks like the card is only meant for reviewers... going by today's launch :)

August 14, 2017 | 01:52 PM - Posted by odizzido2 (not verified)

Would have been interesting to see some ryzen based testing as well. Maybe you guys could do a follow up review with that?

August 14, 2017 | 02:18 PM - Posted by MaxAsks (not verified)

How do Vega 56 and Vega 64 compare to Vega FE in professional workstation tasks? If half the VRAM doesn't crush it too hard, Titan XP performance at half price would be sweet (for those who aren't in it for hardcore gaming).

August 14, 2017 | 03:37 PM - Posted by Jeremy Hellstorm (not verified)

Take off, ya hosers.

August 14, 2017 | 05:28 PM - Posted by Geforcepat (not verified)

What a bunch of sellouts. Gold, really? Neither card deserves an award, much less gold.

August 15, 2017 | 07:34 PM - Posted by Cellar Door

Ryan - you guys are killing me with the fonts on these graphs, are they for ants?

August 14, 2017 | 06:56 PM - Posted by skysaberx8x

I'm sorry if I missed it but what was the clock speed of the gtx 1080 and 1070 or were they at stock speed?

August 14, 2017 | 07:26 PM - Posted by DemDualzIFconnected56s (not verified)

From Anandtech's Vega review:

"Connecting the memory controllers to the rest of the GPU – and the various fixed function blocks as well – is AMD’s Infinity Fabric. The company’s home-grown technology for low-latency/low-power/high-bandwidth connections, this replaces Fiji’s unnamed interconnect method. Using the Infinity Fabric on Vega 10 is part of AMD’s efforts to develop a solid fabric and then use it across the company; we’ve already seen IF in use on Ryzen and Threadripper, and overall it’s a lot more visible in AMD’s CPUs than their GPUs. But it’s there, tying everything together.

On a related note, the Infinity Fabric on Vega 10 runs on its own clock domain. It’s tied to neither the GPU clock domain nor the memory clock domain. As a result, it’s not entirely clear how memory overclocking will fare on Vega 10. On AMD’s CPUs a faster IF is needed to carry overclocked memory. But since Vega 10’s IF connects a whole lot of other blocks – and outright adjust the IF’s clockspeed based on the workload need (e.g. video transcoding requires a fast VCE to PCIe link), it’s not as straightforward as just overclocking the HBM2. Though similarly, HBM1 overclocking wasn’t very straightforward either, so Vega 10 is not a great improvement in this regard."

Interesting, so maybe that Infinity Fabric on Vega could span a single PCIe card on a dual Vega 56 card to take on 4K! More of this GPU-based IF technology/IP needs to be looked at for maybe dual RX Vega 56 PCIe card SKUs to come!
Whether that IF can cross over GPU dies like it crosses over CPU dies is the big question to be looked at.

August 14, 2017 | 07:39 PM - Posted by Giantmonkey101 (not verified)

Vega gets its biggest performance gains from an HBM2 overclock, not by touching its already high core clocks, especially on the liquid version, so I'm confused as to why you didn't touch the memory speeds?? That's like a 10% performance increase across the board with little impact on power consumption, @#% really??

August 15, 2017 | 12:39 AM - Posted by i7Baby

I think I'll just stick with my 2 x R9 Nanos.

There doesn't seem to be much to gain by going to Vega.

August 15, 2017 | 04:49 AM - Posted by Martin (not verified)

Did you re-test Vega FE or are you using the results from previous testing?

I guess, the actual question here is whether all the changes made for RX Vega also apply to Vega FE or not?

August 15, 2017 | 08:24 AM - Posted by Anonymouson (not verified)

VEGA is good only in Canada, Greenland, Norway, Sweden and Russia, but there is a rumor that VEGA will be banned from the global market because of global warming....

August 15, 2017 | 04:58 PM - Posted by Just an Nvidia User (not verified)

That wouldn't be wise to have all the Vega cards up north. Might detach a few giant icebergs from the northern glacier with all that excess heat.

August 15, 2017 | 07:01 PM - Posted by StripesOnLinux (not verified)

Ha ha ha, that's funny, but I'm all for a consumer GPU tax to maybe fund those 2 Toshiba/Westinghouse nuclear reactors in Georgia. So let's TAX consumer GPUs worldwide and get some thorium reactors online, and that should offset global warming. Let's make Toshiba/Westinghouse finish the job at break-even costs and use a consumer GPU tax to do so. Tax Toshiba's NAND production also. Bitcoin mining GPUs should be double taxed, and any bitcoin farms should be inspected to make sure they are taxed enough to cover the global warming impacts of coin mining.

They could even build thorium reactors at all the recently closed nuke plants and use the thorium reactors to burn up all that spent nuclear fuel and generate power in the process, until there is little to no waste remaining that would need to be shipped off site and stored for hundreds of years.

August 16, 2017 | 07:53 PM - Posted by Just an Nvidia User (not verified)

I'd be for the GPU tax you describe, but with one caveat: the company with the least efficiency pays more; the company that makes more efficient cards should be penalized less. A petrol tax of a few cents would probably net more revenue, however.

August 16, 2017 | 09:55 PM - Posted by CorpTaxOnProfitsOnly (not verified)

No, that GPU tax is paid by the consumer at the point of sale, and all consumer GPUs need a global warming tax. No business or professional GPU tax, as those GPUs are used for productive purposes. Petrol is already taxed. All gaming/mining GPUs need to be taxed because that usage is not as necessary as, say, GPUs used for medical research, engineering, and other real productive usage of CPUs/GPUs.

AMD's Vega GPUs, if they are underclocked, can be closer to Nvidia's efficiency levels, and AMD's Vega GPUs clocked lower and used for compute workloads can even beat Nvidia in the compute performance/watt metrics.

So only consumer gamers and coin miners should get to pay the GPU tax, because those usages are not essential and use plenty of power and tax the grid. Add some CPU taxes there also for non-professional CPU usage.

You do not tax companies on anything but net profits, and it's the companies' stockholders that get to pay capital gains taxes. Taxing net profits encourages companies to invest more of the profits back into product development, and that also creates more jobs.

I also believe in a military draft lottery based on finding those military-age gamers with a propensity toward FPS/military game usage, where everyone of military service age across the whole population gets a single draft number and a single capsule to be tossed into the military draft lottery barrel.

And then each FPS/military game purchased nets that gamer an extra required capsule, added to increase that person's chances of getting some real FPS/military gaming experience, without the chance of respawn if they are taken out!

Special methods should be used to find and reward the FPS/military high-score earners with some extra capsules earned into the draft lottery barrel, and an even greater chance of getting some really high resolution war games experience as the proud property of U-Sam's Government Issue FPS/wargaming team.

August 16, 2017 | 09:57 PM - Posted by CorpTaxOnProfitsOnly (not verified)

remove double post, s p a m filter be crazy!

August 17, 2017 | 12:53 PM - Posted by Jeremy Hellstrom

I would strongly suggest you stop blaming me for your mistakes.

August 17, 2017 | 03:21 PM - Posted by DoubleDoublePlusBuggyOOcode (not verified)

Through no fault of you own, the entire software stack with which respect yours/others websites runs on is buggy from the Drupal, Open Source CMS framework, on down into the call stack into the IE/Other browsers and deeper down into the windows OS software frameworks and that includs the frameworks that any web provided Spam filter(service) is based on(open source and proprietary). And that's just the nature of the complex software/hardware state machine designs that the entire computing industry is based on from since forever with respect to computing systems and that buggy software state of affaris that can never be proven entirely correct.

That said, I do get your /S.

But it appears that M$ has over-tweaked its Windows base (and derived) textbox class objects, and that's as buggy as hell by any reasonable software standard. So double posts are par for the course, and sometimes it's a failure of the spam filter software stack and sometimes of other software stacks, especially where web-based posting functionality and the transactional atomics of software framework systems are concerned.

Just try posting a reply on your website to a post on the second or higher page, if your forum posts/replies run to multi-page lengths; that is buggy with respect to users' PC OS systems and the Drupal/script/PHP/DB/other systems software stacks as well.

But thank goodness the latest M$ cumulative IE 11 security updates have fixed some of the problems with long-running ad scripts totally borking the browser, because that was a real problem for a few months there, and IE11 is not the best way to browse by any stretch of the imagination.

There is no better example of why Linus Torvalds insists on using C and not C++ for the Linux kernel than looking at some of the Windows OS frameworks (mostly C++) and how buggy they are. Linus has the right idea, but even C is not without its issues.

August 17, 2017 | 04:00 PM - Posted by Jeremy Hellstrom

I am the spam filtre, there is no automatic filter in place.

August 17, 2017 | 05:20 PM - Posted by SpaaaaatzTwoTone (not verified)

Currently "You" are, but not always and there is that captcha thingy that freaks out and drupal nurples from time to time also, along with the spam filter service borking.

And NOW we know that you are in fact a Turing-complete AI, Jeremy! How's your brother Max doing? I hear that he went into a VR bar and got so row-hammered that he had to be cold-booted.

August 18, 2017 | 01:35 PM - Posted by Jeremy Hellstrom

I have been since we moved to Drupal years ago. There can be flakiness with double posts, especially on iThings, but that is a different issue.

Max is hanging out with a bad crowd now; he and Enzo are trying to convince Dot to start a video site.

August 15, 2017 | 10:41 AM - Posted by Irishgamer01

No cards available at launch.
Checked all my usual sites.
I have been at work and checked constantly yesterday and today, a couple of times an hour, across 5 different sites in Ireland/UK.

The pre-order prices have an extra 100 added?
No, it's not supposed to be any better at mining than a Fury, so what's going on?

I wouldn't be getting one except that I have a widescreen FreeSync monitor.
Might hook my 1080 up, see what it's like, and just get another one.

August 16, 2017 | 07:46 PM - Posted by Just an Nvidia User (not verified)

The extra $100 was added by AMD, as the rebate was limited to the initial batch of Vega 64 cards. What's this now, AMD gouging its customers? I thought they were the good guys.

August 15, 2017 | 07:54 PM - Posted by TomH (not verified)

Any word on what monitor outputs are on these cards? The only thing that will make me want a new video card is 4K+ resolution with 120+ Hz refresh rate support.

August 16, 2017 | 07:59 PM - Posted by Just an Nvidia User (not verified)

Three DisplayPort 1.4 ports and one HDMI 2.0. Supposedly you can have 120 Hz support at 4K with DP 1.4.
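
For what it's worth, a quick back-of-the-envelope check backs that up, assuming 8 bits per channel and ignoring blanking overhead (a hypothetical Python sketch, not anything from the card's documentation):

    # Rough check that 4K @ 120 Hz fits within DisplayPort 1.4 bandwidth.
    # Assumes 24 bits/pixel (8 bpc) and ignores blanking, which CVT-R2
    # reduced-blanking timings keep small.
    pixels_per_second = 3840 * 2160 * 120        # ~995 Mpx/s
    payload_gbps = pixels_per_second * 24 / 1e9  # ~23.9 Gbit/s

    # DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s raw = 32.4 Gbit/s,
    # ~25.92 Gbit/s effective after 8b/10b encoding.
    hbr3_effective_gbps = 4 * 8.1 * (8 / 10)

    print(f"4K120 needs ~{payload_gbps:.1f} Gbit/s")
    print(f"HBR3 carries ~{hbr3_effective_gbps:.2f} Gbit/s")
    print("fits:", payload_gbps < hbr3_effective_gbps)

So 4K120 at 8 bpc squeaks in without compression; 10 bpc HDR at the same refresh rate would not.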

August 15, 2017 | 11:30 PM - Posted by qwerty (not verified)

Vega interests me a lot from the technical side. How much of this translates into additional performance I don't know, but it will have an impact. Currently so many features of Vega aren't enabled, and a lot of the stuff, even when enabled, will do diddly squat for most games out now.

But off the top of my head, the things not enabled include HBCC. That may not affect performance exactly, but 512 TB of addressable VRAM is just mind-blowing.
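
If the 512 TB figure sounds arbitrary, it appears to come from Vega's 49-bit virtual address space; a one-line Python check confirms the math:

    # 49-bit virtual address space, as AMD describes for Vega's HBCC.
    print(2 ** 49 / 2 ** 40, "TiB")  # -> 512.0 TiB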

Primitive shaders could be a massive bump in performance.

Rapid packed math is basically black magic to me, but as I understand it, it's essentially hyperthreading for a GPU. If that's wrong, please correct me.

August 31, 2017 | 02:03 PM - Posted by Stefem

FP16 packed math has nothing to do with hyperthreading; it's just combining two 16-bit operations on a single FP32 ALU, doubling throughput at the cost of reduced precision. It's not a magic bullet, though: there was a reason the industry moved from FP16 to FP32 15 years ago. 16 bits aren't enough for most graphics work.
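
A minimal sketch of the precision half of that trade-off, using NumPy's float16 (the doubled-throughput half is hardware behavior on the GPU's ALUs and can't be reproduced host-side):

    import numpy as np

    # FP16 has a 10-bit mantissa, so above 2048 not every integer is
    # representable; small contributions simply vanish.
    print(np.float16(2048.0) + np.float16(1.0))  # -> 2048.0, the +1 is lost

    # FP32 handles the same sum exactly.
    print(np.float32(2048.0) + np.float32(1.0))  # -> 2049.0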

August 16, 2017 | 12:04 AM - Posted by Anonymousdfdf3 (not verified)

A GOLD award for Vega? You are joking, right? It's a piece of hot shit, a total technological failure, a regression in almost every possible way. More than two generations behind the competition now, and priced like garbage. You guys are tarnishing your reputation with that nonsense. AMD doesn't need your charity awards.

August 16, 2017 | 07:52 PM - Posted by GinormousDaftsNot (not verified)

Go read that Techgage article linked to in many posts on this forum thread! Vega 64 is a compute monster for thousands less than that Quadro P6000 24GB! Even the Titan Xp falls to Vega on some workstation compute workloads.

AMD's stockholders know that Vega is a winner in the professional compute/AI market, which counts for more in higher-margin revenue than any gaming-only market will produce. And Nvidia's JHH knows this about the compute/AI markets too!

Vega 64 is right up there with the GTX 1080 in gaming in spite of all that extra compute; ditto for the Vega 56, with a little more compute stripped out but still enough ROP/TMU resources to closely match the GTX 1080's. So the Vega 56 will beat the GTX 1070 and can most likely be overclocked to get nearer to the GTX 1080 in gaming performance.

And the miners and pro markets love all the extra compute that the Vega 10 GPU microarchitecture can spare, and that's money in AMD's bank, the same as any gaming-only revenue!

Money Be Money Sonny!

August 31, 2017 | 02:15 PM - Posted by Stefem

You mean the many posts made by the same guy under different names? :)

Compute monster... really? Vega has about the same theoretical compute power as GP102, but in practice, due to lower compute core occupancy, it won't even match it.
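
The rough numbers behind that "same theoretical compute" claim, using the vendors' published boost clocks (sustained clocks differ in practice, as this review's clock measurements show):

    # Peak FP32 = shaders x 2 ops/clock (fused multiply-add) x clock.
    def peak_tflops(shaders, clock_mhz):
        return shaders * 2 * clock_mhz * 1e6 / 1e12

    print(f"RX Vega 64: {peak_tflops(4096, 1546):.1f} TFLOPS")  # ~12.7
    print(f"Titan Xp  : {peak_tflops(3840, 1582):.1f} TFLOPS")  # ~12.2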

August 16, 2017 | 03:28 AM - Posted by the friendly neighborhood creep (not verified)

A gold award for matching yesteryear's performance? Wow, PCPer... what happened to you guys? *sigh* I remember when you used to have integrity.

August 16, 2017 | 07:53 PM - Posted by GinormousDaftsNot (not verified)

see GinormousDaftsNot's reply to Anonymousdfdf3!

August 18, 2017 | 05:50 PM - Posted by the friendly neighborhood creep (not verified)

"Go read that Techgage article linked to in many posts on this forum thread!"

So they awarded the gold for someone else's article?

"Vega 64 is a compute monster for thousands less than that Quadro P6000 24GB! Even the Titan Xp falls to Vega on some workstation compute workloads."

Wasn't the Frontier Edition made for that purpose? This is a gaming card, you putz. Nobody gives a crap about compute in gaming cards; they should be judged on their GAMING performance. Goddamn intellectually bankrupt shills.

August 16, 2017 | 06:33 AM - Posted by Trey Long (not verified)

Vega is OK. A custom-cooled, overclocked 1070 or 1080 will easily beat Vega out of the box and use less power. The higher production costs might limit availability.

August 16, 2017 | 07:54 PM - Posted by GinormousDaftsNot (not verified)

see GinormousDaftsNot's reply to Anonymousdfdf3!

August 18, 2017 | 05:44 PM - Posted by doomcrazy (not verified)

Why does this review show such a large difference between RX Vega 64 and Vega Frontier Edition when this Witcher 3 video shows that they're essentially identical?

https://www.youtube.com/watch?v=RNXGr-8jcnE

August 25, 2017 | 02:05 AM - Posted by Snoozie (not verified)

I just picked one up in Akihabara.
Winter is coming. This will power my games and make a good space heater.
