The Radeon RX Vega 64 at 4K

Subject: Graphics Cards | October 20, 2017 - 04:30 PM |
Tagged: amd, RX VEGA 64, 4k

[H]ard|OCP updated their benchmarking suite with several new games and have published a review of AMD's Vega 64 focusing on 4K performance.  The race between the GTX 1080 and Vega 64 is quite close, with many benchmarks showing less than a 10% difference in performance.  Neither card comes close to the GTX 1080 Ti, which remains the only card that can truly handle 4K gaming with graphics options on high or ultra.  For 1440p performance, the GTX 1080 is better overall, but the Vega 64 is still a very strong contender.
Pop over for a look at the detailed results.


"Does the AMD Radeon RX Vega 64 play games well at 4K resolution? What game settings work best at 4K, and how does it compare to GeForce GTX 1080 and GeForce GTX 1080 Ti? Ten games are tested, new and old, DX11, DX12, and Vulkan at playable game settings and pushed to the max in this all out 4K brawl."


Source: [H]ard|OCP

October 20, 2017 | 11:35 PM - Posted by Godrilla

One would think that with such performance AMD would try to improve multi-GPU scaling to be competitive, especially with SLI lacking there. The 1070 Ti is going to be a big headache for AMD as well. If the 1080 is superior to the Vega 64, then the 1070 Ti will probably match it at the $429 price point, albeit with rumored locked clocks.

October 21, 2017 | 01:59 AM - Posted by MoreTweakingIncomingForRED (not verified)

With the GTX 1080 Ti's 88 ROPs and pixel rate of 139.2 GPixel/s compared to the Vega 64's 64 ROPs and pixel rate of 104.3 GPixel/s, the GTX 1080 Ti should be beating the Vega 64 on the amount of FPS pushed out. AMD's texture fill rates are higher, though, with the Vega 64's texture rate of 417.3 GTexel/s compared to the GTX 1080 Ti's 354.4 GTexel/s.
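The fill-rate figures quoted here are just unit count times clock speed. A quick sketch that reproduces them (the boost clocks below are the reference specs and are my assumption; the comment only gives the resulting rates):

```python
# Fill rate = unit count * clock. The clocks used here are the reference
# boost clocks (assumed); they reproduce the GPixel/GTexel figures quoted.

def fill_rate(units, clock_mhz):
    """Return fill rate in G-ops/s for a given unit count and clock (MHz)."""
    return units * clock_mhz / 1000.0

# GTX 1080 Ti: 88 ROPs, 224 TMUs, ~1582 MHz boost (assumed)
print(fill_rate(88, 1582))   # pixel rate   ~139.2 GPixel/s
print(fill_rate(224, 1582))  # texture rate ~354.4 GTexel/s

# RX Vega 64: 64 ROPs, 256 TMUs, ~1630 MHz boost (assumed)
print(fill_rate(64, 1630))   # pixel rate   ~104.3 GPixel/s
print(fill_rate(256, 1630))  # texture rate ~417.3 GTexel/s
```

Note that Vega 64's TMU advantage (256 vs 224) plus its higher clock is what yields the higher texture rate despite the ROP deficit.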

I'd like to see more Vega testing as time passes, because there is a lot of new Vega IP that has not been fully programmed for in games/game engines, namely the primitive shaders and 16-bit rapid packed math. Forza 7 and other DX12-optimized titles are probably not yet taking advantage of all the new features in the Vega GPU microarchitecture, so it's more fine wine until some Vega refreshes and then Navi arrives. CF and SLI are dead for the new DX12/Vulkan APIs, as that GPU load-balancing act is now on the game engine/developer side and not on the GPU driver side anymore.

So the game engine development companies can now compete with each other for the best graphics-API-managed GPU load balancing built into their respective products. Vega has only been on the market a short time, and Forza 7's performance on Vega is not too shabby; there is plenty of new Vega IP to tweak for going forward.

AMD is still selling plenty of Vega GPUs for whatever usage, so those unit sales mean more economy-of-scale production cost efficiencies as the HBM2 makers recoup their tooling, production, and R&D expenses with each stack of HBM2 sold. So maybe once SK Hynix can get more of its HBM2 into AMD's hands, pricing will come down even more as the HBM2 suppliers get into more direct competition with each other.

I'm wanting to see what Vega 11 will do for the mainstream GPU market, with some RX 570/580 replacements and more affordable 4GB HBM2-based options that can make good use of Vega's HBCC/HBC IP (HBM2 used as cache), which creates virtual VRAM larger than the HBM2's capacity by paging out to system DIMM-based DRAM while using the HBM2 as a last-level GPU cache (HBC). So: a replacement for the Polaris-based SKUs with Vega updates and better gaming performance for AMD's mainstream line, HBM2 on mainstream SKUs for the first time, and GPUs fabbed on that 12nm process tweak, able to clock a little higher while using less power.
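The HBCC/HBC idea described above is essentially demand paging, with the HBM2 acting as a last-level cache in front of a larger virtual VRAM pool backed by system DRAM. A toy sketch of that behavior (the class name, page size, and LRU policy are illustrative assumptions on my part, not AMD's actual implementation):

```python
from collections import OrderedDict

# Toy model of HBM2 acting as a last-level cache (HBC) in front of a
# larger virtual VRAM pool backed by system DRAM. Illustrative only:
# the real HBCC manages hardware pages, not Python dict entries.

PAGE_SIZE_MB = 2

class HighBandwidthCache:
    def __init__(self, hbm2_capacity_mb, virtual_vram_mb):
        self.capacity_pages = hbm2_capacity_mb // PAGE_SIZE_MB
        self.virtual_pages = virtual_vram_mb // PAGE_SIZE_MB
        self.cache = OrderedDict()   # pages currently resident in HBM2
        self.evictions = 0

    def touch(self, page):
        """Access a page; evict the LRU page to system DRAM if HBM2 is full."""
        assert page < self.virtual_pages, "access beyond virtual VRAM"
        if page in self.cache:
            self.cache.move_to_end(page)    # hit: mark most recently used
            return "hit"
        if len(self.cache) >= self.capacity_pages:
            self.cache.popitem(last=False)  # miss: evict LRU page to DRAM
            self.evictions += 1
        self.cache[page] = True
        return "miss"

# 4GB of HBM2 caching a 16GB virtual VRAM pool, as the comment imagines
hbc = HighBandwidthCache(hbm2_capacity_mb=4096, virtual_vram_mb=16384)
print(hbc.touch(0))  # miss (first touch pages it into HBM2)
print(hbc.touch(0))  # hit  (now resident in HBM2)
```

The point of the design: as long as a game's working set fits in the HBM2, misses (and the DRAM traffic they cause) stay rare, so a 4GB card can behave like a much larger one.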

October 21, 2017 | 11:46 AM - Posted by msroadkill612

"...mainstream GPU market, with some RX 570/580 replacements and more affordable 4GB HBM2-based options that can make good use of Vega's HBCC/HBC IP (HBM2 used as cache), which creates virtual VRAM larger than the HBM2's capacity by paging out to system DIMM-based DRAM while using the HBM2 as a last-level GPU cache (HBC)."


AMD has the means to make a 4GB GPU punch far above its weight.

I don't believe there is a problem with GPU supply, only HBM2. So, given that AMD is not in the business of selling RAM, halving the included HBM2 doubles GPU sales.

Still, it isn't exactly raining DRAM at the moment either.

You omit perhaps the most exciting part of Vega's HBCC, which is that it can use NVMe arrays, as well as DIMMs, to extend GPU cache.

Most would scoff at the notion of using storage to simulate RAM, but note that we are still adjusting to the jump from 100 MB/s SATA HDDs, to 500 MB/s SATA SSDs, to 3,200 MB/s NVMe SSDs.

Now that we have widely affordable NVMe RAID on Threadripper, these speeds can be multiplied by 6x or more.
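The back-of-envelope bandwidth math implied here is simple multiplication; a sketch under the idealizing assumption that a RAID-0 stripe scales linearly with drive count (real controller and filesystem overhead is ignored):

```python
# Rough sequential-bandwidth figures from the comment's storage
# generations, plus an idealized NVMe RAID-0 stripe (overhead ignored).
SATA_HDD_MBPS = 100
SATA_SSD_MBPS = 500
NVME_SSD_MBPS = 3200

def raid0_bandwidth(per_drive_mbps, drives):
    """Idealized RAID-0 stripe: aggregate bandwidth scales with drive count."""
    return per_drive_mbps * drives

# Six NVMe drives striped, as on a Threadripper NVMe RAID setup
print(raid0_bandwidth(NVME_SSD_MBPS, 6))  # 19200 MB/s, i.e. ~19.2 GB/s
```

That ~19 GB/s is still an order of magnitude below HBM2 bandwidth, which is why such storage makes sense only as a far tier behind the HBM2 cache, not as a VRAM substitute.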

October 21, 2017 | 02:46 PM - Posted by MainStreamPortableLife (not verified)

Yes, it can use GPU-managed paged memory out to an SSD, but are games really going to need more VRAM than what the system DRAM on the DIMMs provides? For the most part, gaming PCs are going to have 16GB of DRAM installed plus the 4GB of HBM2 (HBC) cache that the GPU will feed from, so it's more than likely that any paging to SSDs will be small. It's a slightly different matter if the SSD can be on the GPU's PCIe card itself, reducing the need to go out over the system PCIe links to any SSD attached to the motherboard.

If the mainstream Radeon GPUs get that 4GB of HBM2 used as HBC, then that plus, say, 12GB of the system's 16GB as virtual VRAM paged out to DIMM-based system DRAM should be enough for 4K gaming.

I'm very interested in seeing AMD develop a fully silicon-interposer-based APU with at least 4-6 Zen+/Zen 2 cores and a larger number of Vega/Navi nCUs, so a gaming APU could maybe match an RX 570 in gaming performance and fit into a Mac Mini/NUC-sized mini-PC case at around 65 watts for the APU.

If you look at current large monolithic GPU die designs, those SKUs are already subdivided into modular, separately functioning CUs/SMs wired up via an on-die fabric. So it would not be hard for AMD or others to split that into dies/chiplets and move the entire coherent fabric onto the interposer's silicon, including any active fabric/traces, management circuitry, and power delivery etched into the interposer, and then attach the modular GPU dies/chiplets to that active interposer. The same can be done for the CPU cores, with a single 4- or 6-core CPU die, the GPU dies/chiplets, and the HBM2/HBM3 stacks all hosted on the active interposer.

So really, I'm wanting a portable mini desktop that I can take along when traveling, with a foldable keyboard and a USB-powered portable monitor for on-the-road use. OEM laptops are just not going to be as configurable as a standard mini-desktop motherboard (AM4 or other), and a mini desktop system with plenty of APU processing/graphics power could easily fit into a backpack for travel. Maybe even some form of battery pack, with the device able to be set to a lower TDP by the user/OS for longer battery life.

A portable mini-desktop motherboard/case standard is what is really needed, with users able to build their own systems and buy a standardized battery pack to attach for use where plug-in power is not available. Gaming laptops are nice, but they are way overpriced and really lacking in cooling and user upgradability. Those Intel NUC types of form-factor systems with socketed motherboards are what I want, but with more AM4 options and users able to update/build their own systems from standardized components, with no device OEMs needed and no OEM gimping.

October 21, 2017 | 01:49 PM - Posted by renz

"I'd like to see more Vega testing as time passes, because there is a lot of new Vega IP that has not been fully programmed for in games/game engines, namely the primitive shaders and 16-bit rapid packed math. Forza 7 and other DX12-optimized titles are probably not yet taking advantage of all the new features in the Vega GPU microarchitecture, so it's more fine wine until some Vega refreshes and then Navi arrives."

don't count on it.

October 21, 2017 | 03:07 PM - Posted by MainStreamPortableLife (not verified)

Why not? AMD's GPU support is getting better, and game engine makers improve their engines all the time to remain competitive for the game makers that purchase their SDKs and services.

Now that Vega is out there in developers' hands, there will be continued improvements over time until the next GPU microarchitecture arrives. Navi is going to be more about creating modular GPU dies and increasing yields, and less about adding IP features that are already present in Vega. So Navi will have scalable dies that AMD can use to create all of its low-end, mainstream, and high-end/flagship GPU SKUs: a little more microarchitectural improvement over Vega, plus scalable GPU designs all reaching the market in the same time frame instead of mainstream one year and flagship the next.

AMD's revenue issues are in the past, with Zen/Epyc producing enough revenue in the server market alone to keep AMD in the black. Those Project 47 supercomputers-in-a-cabinet each require 80 Vega 10-based Radeon Instinct MI25 GPUs, and that, along with the Radeon Pro WX SKUs, is where the best Vega 10 die bins are going, at a much higher markup than any gamers or miners can afford to pay. So the professional Radeon/Vega GPU market will fund the majority of AMD's Vega driver improvements, including any middleware that is good at multi-GPU load balancing via the Vulkan open-standard graphics API.
Look at the Radeon ProRender Blender plug-in that AMD has just made available; that's only the beginning.

October 21, 2017 | 11:45 AM - Posted by bria5544

With nearly double the power draw of the 1080 and an inability to even maintain parity, I really think the Vega 64 is a poor choice for anyone looking for a GPU. Vega 56 is a decent choice for the money, but the upcoming 1070 Ti might just make that a poor choice as well. Only time and benchmarks will tell.

October 21, 2017 | 03:36 PM - Posted by MainStreamPortableLife (not verified)

Vega 56 is closer to the GTX 1080 in ROP resources, and Vega 56's texture fill rate is higher than the GTX 1080's, so maybe at 12nm, with the Vega SKUs updated on that tweaked process node, Vega 56's power usage can be brought more in line with Nvidia's.

It's also a known fact that DX12 and Vulkan games will be making more use of that extra compute on AMD's Vega SKUs. So that's something to consider in spite of the GTX 1070 Ti's release. Just go over to TechPowerUp's GPU database and look up Vega 56's ROP fill rates and its much higher texture fill rates, even compared to the GTX 1080's, and if a Vega 56 refresh at 12nm comes out with slightly higher clocks, then that's a different game again to consider.

It's Vega GPU demand that is keeping Vega's prices high, and that's not set by the gaming market alone, as GPUs for compute in the professional markets have been popularized by none other than Nvidia. Now AMD is profiting from its GPU compute in the consumer markets, just as Nvidia has been making mad profits in the professional GPU markets. AMD will also be earning mad markup revenues with its Instinct MI25 and Radeon Pro WX/SSG SKUs in the professional compute/AI markets.

"Vega 64 is a poor choice for anyone looking for a GPU"
No, not really, if that "anyone" wants to do some compute workloads; the Vega 64 beats the Quadro P6000 and Titan Xp in some very interesting GPU compute benchmarks for thousands of dollars less. Just go and read this(1).

(1) "A Look At AMD's Radeon RX Vega 64 Workstation & Compute Performance"

October 22, 2017 | 06:51 AM - Posted by psuedonymous

"It's also a known fact that DX12 and Vulkan games will be making more use of that extra compute on AMD's Vega SKUs"

We've been hearing variants on this theme since Polaris (or indeed, since the first GCN cards and Mantle), but it has yet to actually materialise.

October 22, 2017 | 10:14 AM - Posted by bria5544

You can spout off all the theoreticals you like; if the hardware benchmarks slower right now, then it deserves to be ranked where it is. Both Vega 64 and 56 are almost always slower than the 1080, and both have a much higher power draw. Vega parts are already on a 14nm process, so a drop to 12nm will hardly matter at all for power and heat. Nvidia's parts use a 16nm process and are much more power- and heat-efficient. AMD dropped the ball with Vega, and we can only hope they figure out how to tame this new beast of a chip they've created by the time Navi comes out. We deserve a GPU, not a space heater.

October 22, 2017 | 11:58 AM - Posted by GamingBumpkinzTearz (not verified)

Vega 10's compute performance is real, and those Vega 10 dies in the Radeon Instinct MI25s and the Vega 10-based Radeon Pro WX 9100s will earn AMD plenty. Gaming does not really matter as much to the larger world out there, and Nvidia sure is not focusing only on gaming either.

AMD dropped no ball with Vega; GPUs are about the professional markets for both AMD and Nvidia. That beast of a Vega 10 die is not really for gamers only, and the miners are not complaining about AMD's extra compute, so their money is as good as any gamer's to both AMD and Nvidia.
Raja designed Vega for the professional markets first, and the gaming market gets something that performs well against the GTX 1080, with Vega getting better as the DX12/Vulkan ecosystem comes online and Vega's IP becomes more utilized in games.

If you need the ego gratification of having the most FPS flung out there, then fine, pay for Nvidia's GPUs, but AMD's Vega GPUs can game just fine and are so good for compute that the Vega SKUs, like the Polaris GPUs before them, are selling above MSRP because of demand. So AMD is not having problems selling Vega to a wider-than-gaming market, producing larger per-unit revenues from the professional markets, while the coin-mining market adds to unit sales along with any gaming sales for Vega 64 and Vega 56 in the consumer market.

There will be a Vega refresh at 12nm, plus Vega 11 and Vega 20 (for the professional market, with double-precision FP at 1/2 the FP32 rate), so the Vega GPU microarchitecture is not a failure by any stretch of the imagination.

Vega will get the tweaks it needs from AMD, and all of that will be funded by professional market sales! That market's GPU driver/graphics API optimization path will net Vega GPUs a constant rate of improvement until Navi arrives, and gamers will be on the receiving end of whatever Vega tweaks are generated for the professional market's needs, so Vega's improvements are assured, gaming market or not!

Get over yourselves, gaming-only bumpkins, because GPUs have uses besides gaming.

October 23, 2017 | 07:44 AM - Posted by Anonymouss (not verified)

You are wrong, mate. The professional market is for FirePro or the Frontier Edition.
Vega 56 and 64 were designed for gaming and have been advertised for gaming, but they failed in every aspect. Who is going to make custom Vega cards? No one?

November 2, 2017 | 05:32 AM - Posted by Anon1233456 (not verified)

FirePro yes, FE no, lol.

Plenty of professionals buy "consumer" Radeons and GTXes all the time. The sheer performance makes it worth it versus the $1,000+ more expensive, worse-performing workstation version of the card. A few features, like 10-bit color in Photoshop, are just locked out in the drivers on the "consumer" cards for the most part, unless you need the higher-binned chips.

November 2, 2017 | 05:43 AM - Posted by Anon1233456 (not verified)

Umm, it's not really a "theory" that Vega is newer and more future-proof than the current GTX lineup, or that Vega's performance is due to bad drivers.

Just look at the Ryzen launch: great CPUs, terrible launch pains even close to six months after launch. The benchmarks will definitely get better; the only question is by how much and how soon. Even half-broken, Ryzen was kicking ass within a few months. What actual proof do you have that Vega won't do the same?

If that's still not enough, look at how the open-source drivers beat AMD's on day one, before people even really started developing them, or how AMD missed that underclocking actually increases performance in games. It's the drivers that are total garbage, not the hardware, and drivers are fixable.

November 2, 2017 | 05:20 AM - Posted by Anon1233456 (not verified)

Really, both arguments here are half true. Yes, Vega isn't the gaming powerhouse people wanted it to be; at this point that is undeniable.

What is also undeniable, and rarely talked about, is that it is much more versatile than the current GTX lineup in general. However, the drivers weren't optimized, it was marketed wrong, and it has had a terrible launch for gamers.

Vega itself is fine, especially looking forward, but it's really more of a general graphics powerhouse than a "gaming" card. So AMD is kind of deceitful selling it strictly as a "gaming card" and then expecting people to buy it before the product is optimized.


A lot of people will opt to use it over the professional-grade card because its price-to-performance is better at a more reasonable price (just as they do with the GTX lineup), gaining some of Vega's benefits at the cost of game performance.

Generally a Radeon or GTX will win out on pure performance but will be missing minor features in professional software, like 10-bit color in Adobe apps. Unless you're doing billboards for print or top-end cinema video, you really don't need those features, as long as you buy the right monitor and a color calibrator.

Copy and paste similar workarounds for other professional areas; gaming cards are good for work at their price points. Plus, most professionals know it's just an artificial driver lockout for many professional features anyway.


Professional applications are going to like Vega more than the current GTX lineup on average. Photoshop and Final Cut are easy examples where Vega makes more sense than the 1070 and 1080, and it arguably offers better price-to-performance than the 1080 Ti. There is also expanding use of graphics cards in other applications you might not normally notice, so much so that Vega will most likely be featured in the next iMacs and Mac Pros with Metal 2. Unless Windows magically makes DirectX make more sense for software outside of games, that's not going to change.


Even the current Radeon drivers are barely optimized (a real failure of AMD trying to play catch-up). You can easily look at how the open-source drivers on Linux outperformed all other gaming cards on day one while the AMD-provided drivers were performing like total garbage, below even low-end GTX cards, and then apply that to the Windows and macOS performance to a lesser degree. It's obvious that the drivers are still a major issue holding the card back, never mind the new features in Vega. The "fine wine" is real, but it's actually a negative thing, because the drivers should be complete already if AMD is going to market to "gamers."

Read up on the performance gains from underclocking in general as another example of how poorly optimized they are.


The drivers will get a massive overhaul regardless, yes because of the workstation cards, but also because Vega is in AMD's mobile and desktop CPU lineup, an area where AMD is actually beating Intel right now.

Also, Vega has better Vulkan and OpenCL support than the GTX line, period. So Linux, macOS, Android, and pretty much any other OS on the planet besides Windows (like the PS4 and Nintendo platforms) are naturally going to prefer Vega over a GTX.

Those platforms actually have to go out of their way to support the existing GTX cards they already use, at least compared to Radeon cards. Plus, from a profit view, it makes no sense to support a graphics card company that preferentially supports one competitor in any hardware lineup (i.e., Microsoft).


The Good:

If you DON'T HAVE A GRAPHICS CARD and can get a Vega at the retail price of its respective competing card, they're not a bad pick, particularly if you have a FreeSync monitor. They can be a great pick if you have a specialized workflow and also want to game, or are using any OS that is not Windows. They are also much more "future-proof" than the GTX lineup, which is already getting old.

The Bad:

If you HAVE A CARD, you may as well wait to see what AMD or Nvidia do next; a more budget-oriented Radeon or, much later, a more future-proof GTX would be better. If you can't make use of the extra features and don't mind upgrading to a new GPU sooner, the GTX makes more sense for strictly gaming performance. DON'T BUY at the inflated AMD prices.

Plus, for the really optimized AMD drivers, you're going to be waiting quite a while (I bet much longer than in previous years), since they're launching so many totally new lineups this year, and issues with things like drivers seem to be a problem with all of them.

February 13, 2018 | 04:08 PM - Posted by Anonimoose (not verified)

$400 for a GPU is my limit; anything over that is overkill and overpriced. Even most AAA games are not programmed as well as people think they are; the illusion of choice kicks in when you have wads of money to blow.
