
The AMD Radeon Vega Frontier Edition 16GB Liquid-Cooled Review

Manufacturer: AMD

Specifications and Design

Just a couple of short weeks ago we looked at the Radeon Vega Frontier Edition 16GB graphics card in its air-cooled variant. The results were interesting – gaming performance proved to fall somewhere between the GTX 1070 and the GTX 1080 from NVIDIA’s current generation of GeForce products. That is below many of the estimates from players in the market, including media, fans, and enthusiasts. But before we get to the RX Vega product family that is targeted at gamers, AMD has another data point for us to look at with a water-cooled version of the Vega Frontier Edition. At a $1500 MSRP, which we shelled out ourselves, we are very interested to see how it changes the face of performance for the Vega GPU and architecture.

Let’s start with a look at the specifications of this version of the Vega Frontier Edition, which will be…familiar.

| | Vega Frontier Edition (Liquid) | Vega Frontier Edition | Titan Xp | GTX 1080 Ti | Titan X (Pascal) | GTX 1080 | TITAN X | GTX 980 | R9 Fury X |
|---|---|---|---|---|---|---|---|---|---|
| GPU | Vega | Vega | GP102 | GP102 | GP102 | GP104 | GM200 | GM204 | Fiji XT |
| GPU Cores | 4096 | 4096 | 3840 | 3584 | 3584 | 2560 | 3072 | 2048 | 4096 |
| Base Clock | 1382 MHz | 1382 MHz | 1480 MHz | 1480 MHz | 1417 MHz | 1607 MHz | 1000 MHz | 1126 MHz | 1050 MHz |
| Boost Clock | 1600 MHz | 1600 MHz | 1582 MHz | 1582 MHz | 1480 MHz | 1733 MHz | 1089 MHz | 1216 MHz | - |
| Texture Units | ? | ? | 224 | 224 | 224 | 160 | 192 | 128 | 256 |
| ROP Units | 64 | 64 | 96 | 88 | 96 | 64 | 96 | 64 | 64 |
| Memory | 16GB | 16GB | 12GB | 11GB | 12GB | 8GB | 12GB | 4GB | 4GB |
| Memory Clock | 1890 MHz | 1890 MHz | 11400 MHz | 11000 MHz | 10000 MHz | 10000 MHz | 7000 MHz | 7000 MHz | 1000 MHz |
| Memory Interface | 2048-bit HBM2 | 2048-bit HBM2 | 384-bit G5X | 352-bit | 384-bit G5X | 256-bit G5X | 384-bit | 256-bit | 4096-bit (HBM) |
| Memory Bandwidth | 483 GB/s | 483 GB/s | 547.7 GB/s | 484 GB/s | 480 GB/s | 320 GB/s | 336 GB/s | 224 GB/s | 512 GB/s |
| TDP | 300 watts / ~350 watts | 300 watts | 250 watts | 250 watts | 250 watts | 180 watts | 250 watts | 165 watts | 275 watts |
| Peak Compute | 13.1 TFLOPS | 13.1 TFLOPS | 12.0 TFLOPS | 10.6 TFLOPS | 10.1 TFLOPS | 8.2 TFLOPS | 6.14 TFLOPS | 4.61 TFLOPS | 8.60 TFLOPS |
| Transistor Count | ? | ? | 12.0B | 12.0B | 12.0B | 7.2B | 8.0B | 5.2B | 8.9B |
| Process Tech | 14nm | 14nm | 16nm | 16nm | 16nm | 16nm | 28nm | 28nm | 28nm |
| MSRP (current) | $1499 | $999 | $1200 | $699 | $1200 | $599 | $999 | $499 | $649 |
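The peak compute and memory bandwidth figures in the table follow directly from the clocks: FP32 throughput is two FLOPs (one fused multiply-add) per shader per cycle, and bandwidth is the bus width times the effective per-pin data rate. A quick sketch of the arithmetic (note that the Vega figure is quoted at the 1600 MHz boost clock, so which clock a vendor quotes changes the result):

```python
def peak_fp32_tflops(shaders: int, clock_ghz: float) -> float:
    # 2 FLOPs per shader per clock (one fused multiply-add)
    return 2 * shaders * clock_ghz / 1000

def bandwidth_gbs(bus_bits: int, effective_gbps: float) -> float:
    # bytes per effective transfer (bus width / 8) times the data rate
    return bus_bits / 8 * effective_gbps

print(peak_fp32_tflops(4096, 1.600))  # Vega FE at boost -> ~13.1 TFLOPS
print(peak_fp32_tflops(2560, 1.607))  # GTX 1080 at base -> ~8.2 TFLOPS
print(bandwidth_gbs(2048, 1.89))      # 2048-bit HBM2 at 1.89 Gbps -> ~484 GB/s
```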

The base specs remain unchanged, and AMD lists the same memory frequency and even GPU clock rates across both models. In practice, though, the liquid-cooled version runs at higher sustained clocks and can overclock a bit more easily as well (more details later). What does change with the liquid-cooled version is a usable BIOS switch on top of the card that allows you to move between two distinct power draw states: 300 watts and 350 watts.


First, it’s worth noting this is a change from the “375 watt” TDP that this card was listed at during the launch and announcement. AMD was touting a 300-watt and a 375-watt version of Frontier Edition, but it appears the company backed off a bit on that, erring on the side of caution to avoid breaking any of the specifications of PCI Express (board slot or auxiliary connectors). Even more concerning is that AMD chose to have the default state of the switch on the Vega FE Liquid card at 300 watts rather than the more aggressive 350 watts. AMD claims this is to avoid any problems with lower quality power supplies that may struggle to deliver slightly over 150 watts of power draw (and the resulting current) through the 8-pin power connections. I would argue that any system that is going to install a $1500 graphics card can and should be prepared to provide the necessary power, but for the professional market, AMD leans towards caution. (It’s worth pointing out that the RX 480 power issues that may have prompted this internal decision making were more problematic because they impacted the power delivery through the motherboard, while the 6- and 8-pin connectors are generally much safer to exceed the ratings on.)
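For context on those limits: the PCI Express specification allows 75 watts through the x16 slot and 150 watts through each 8-pin auxiliary connector, so this connector layout has an in-spec ceiling of 375 watts, exactly the originally announced TDP. A rough budget sketch using those spec values:

```python
SLOT_W = 75        # PCIe x16 slot power limit
EIGHT_PIN_W = 150  # limit per 8-pin auxiliary connector

ceiling = SLOT_W + 2 * EIGHT_PIN_W  # 375 W for a slot-plus-two-8-pin card
for mode_w in (300, 350):           # the two BIOS switch states
    print(f"{mode_w} W mode leaves {ceiling - mode_w} W of in-spec headroom")
```

The 350-watt state still fits under that ceiling on paper; the point above is that AMD left margin rather than running right at the 375-watt edge.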

Even without clock speed changes, the move to water cooling should result in better and more consistent performance by removing the overheating concerns that surrounded our first Radeon Vega Frontier Edition review. But let’s dive into the card itself and see how the design process created a unique liquid cooled solution.


The Radeon Vega Frontier Edition Liquid-Cooled Card


The liquid cooled card shares dimensions with the air-cooled card, though without an integrated blower fan, the likeness stops there. The color scheme is reversed, with a yellow brushed metal body and blue accents and illumination. The top Radeon logo and the blue R cube on the end light up in blue, and as I stated on Twitter, I really hate blue LEDs. They are just uncomfortable to my eyes and I know I’m not the only one. Otherwise, the design of this card is just as sexy as the first Vega FE we looked at.


It still requires a pair of 8-pin power connections to run, and the liquid cooling tubing and radiator power cabling come from the front of the card. There is plenty of length to the tubing and cabling, allowing for installation in nearly any chassis configuration.


On the back is a full-cover backplate with an exposed area for the GPU tach, a set of LEDs that defaults to blue and indicates the GPU workload of the card. The blue on these is particularly piercing…


Internally we have a unique liquid cooler design. On the left is the pump and block covering the GPU and HBM2 stacks, with a blue block covering the power delivery on the card as well. Liquid flows in from the top into the GPU block, through the GPU block outlet on the upper right, down through the VRM cooling, around to the far left, and then back out to the radiator.


This unit on the right is part of the diaphragm pump design that makes this card interesting. Think of this as a flexible reservoir with a high-tension spring to create pressure back into the system. A diaphragm pump works with one-way check valves and reciprocating diaphragm material to create alternating areas of pressure and vacuum. The T-split you see at the top of the primary pump allows the liquid stored in the overflow area to maintain reliable usage of the cooler through the course of natural evaporation of fluid. This is very similar to the kinds of pumps used in fish tanks and artificial hearts, and likely a more expensive solution than was found on the Radeon Pro Duo or Fury X, as an attempt to correct the deficiencies of older generations (noise, reliability).
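As a toy illustration of the principle (a sketch only, not AMD’s actual pump geometry): the diaphragm alternately expands and compresses a small chamber, and the two one-way check valves turn that oscillation into net flow in a single direction.

```python
def fluid_moved_ml(stroke_ml: float, frequency_hz: float, seconds: float) -> float:
    """Net fluid moved by an idealized diaphragm pump."""
    moved = 0.0
    for _ in range(int(frequency_hz * seconds)):
        # Expansion stroke: chamber pressure drops, the inlet check valve
        # opens, the outlet valve stays shut, and fluid is drawn in.
        drawn_in = stroke_ml
        # Compression stroke: pressure rises, the inlet valve shuts, the
        # outlet valve opens, and that same volume is pushed downstream.
        moved += drawn_in
    return moved

# A hypothetical 1 ml stroke at a ~60 Hz cycle would move 60 ml/s:
print(fluid_moved_ml(stroke_ml=1.0, frequency_hz=60, seconds=60) / 1000, "L/min")
# -> 3.6 L/min
```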

This kind of cooler design was only made possible by the extended PCB of the Vega Frontier Edition, either by design or as a happy accident. The noise made by this pump is very different from that of traditional AIO coolers we have used in the office, more of a “gurgle” than any kind of “whine”. It’s more muted than the Radeon Pro Duo or Fury X, that’s for certain.


July 17, 2017 | 02:46 AM - Posted by Your Name (not verified)

LOL, maybe my next video card will gurgle when it gets excited!

July 17, 2017 | 02:57 AM - Posted by Your Name (not verified)

Then I read the review. My next card will not be gurgling.

July 17, 2017 | 04:40 PM - Posted by Dante (not verified)

Waaahhh, I'm angry because there's a thing I can buy.

July 20, 2017 | 05:21 PM - Posted by Your Name (not verified)

I'm not angry. I was trying to be a bit funny.


July 17, 2017 | 04:09 AM - Posted by Johan (not verified)

AMD made it clear that RX will be faster in games. The FE was designed as a professional card that does have gaming drivers as well. It might be a mistake to assume that RX will perform the same.

I cannot prove it, yet it cannot be disproven either. The answer is that we will have to wait and see. We still might be very surprised when RX arrives. It is not too long from now. This LC version is surely impressive. I would have liked to see the pro performance; sad to not see it here, not even one test.

July 17, 2017 | 05:57 AM - Posted by KARMAAA (not verified)

Ah yes, we will see, but what will AMD fans say once it does exactly the same as FE in gaming?

July 17, 2017 | 08:16 AM - Posted by Clmentoz (not verified)

If RX Vega is between the GTX 1080 and the GTX 1080 Ti, that's good enough for most. If you want the top FPS performance and do not care about the compute, then by all means go with Nvidia and part with more than a few of your Benjamins. But Nvidia has no x86 license to speak of, and all those Radeon Pro WX (formerly FirePro branded) GPU SKUs and Radeon Instinct MI25 AI GPU SKUs will go very well package-priced with the Zen/Epyc workstation/server/HPC CPU SKUs.

So Lisa and Raja can both laugh all the way to the bank with that Zen-Epyc/Vega(Radeon WX/Instinct) combo deals and those mad professional market revenues that those professional SKUs will produce.

July 17, 2017 | 10:11 PM - Posted by renz

having no x86 license is not really a problem for nvidia. for professional solutions, price is not the #1 metric to look for, unlike with regular consumers. the most important metric is always whether you can do the job or not. that's why nvidia is able to sell their solutions at such high prices to begin with.

also Vega 10 lacks FP64 performance. so for the HPC class of machine, even if they end up using an AMD CPU, the GPU portion will still very likely end up with nvidia tesla due to the need for massive FP64 performance.

July 17, 2017 | 11:58 PM - Posted by Clmentoz (not verified)

Nvidia's not having any x86 license means that AMD can and will package price its Epyc SKUs with its Radeon Pro WX/Radeon Instinct MI25/other AI SKUs. And Nvidia can only price its GPU SKUs. So Nvidia cannot offer any CPU/GPU package deals, and that includes any server/workstation motherboard deals with AMD and its motherboard partners for added Epyc CPU, Radeon Vega WX/Instinct GPU, and motherboard package deals.

Now Nvidia has some of its Power9 customers, but there is also AMD's founding membership in the OpenCAPI standards group created by IBM and partners. So AMD will also support OpenCAPI with its GPUs and be able to get some GPU business with any Power9 OpenPower licensees; the Power9 SKUs all support IBM's/the OpenCAPI group's OpenCAPI coherent interface standard. But currently most of the HPC/server/workstation market is based around the x86 32/64-bit ISA, and Epyc sales will get AMD some extra Radeon Pro WX/Radeon Instinct sales.

You are going to have to reference your statement with some form of GPU double precision FP comparison tables.

The Peak Double Precision Compute Performance of 819 GFLOPS(?) for Vega FE according to Reddit, but Vega FE is not a Radeon Pro branded WX SKU, so who knows about any professional compute products that AMD may have at the moment. And again Wikipedia's Quadro DP numbers are lacking, and you do not at least provide some Quadro FP figures to back up your statement.

What about the Radeon Instinct MI25 and AI/inferencing workloads? And there will be Epyc/Vega systems using Radeon Vega WX 9100 SKUs for double precision workloads, and the clock rates on any professional cards will be lower.

I wish someone would publish a definitive table of GPU double precision FP metrics for all relevant GPUs, and Wikipedia appears to be in the business of collecting donations and spaffing the money on non-essential uses. If the Radeon Vega WX 9100 has the same FP figures (don't know) then maybe the cost will be lower than any GP100-based Quadro, or there will most likely be dual Vega designs, who knows. But currently, without some DP FP GPU comparison tables that list all of the most relevant GPUs and their respective true DP floating point metrics, it's hard to say. And not all HPC workloads require double precision FP, including the AI/inferencing workloads.

Steve Burke over at GamersNexus has just published some undervolting benchmarks, showing some improvement with an air-cooled Vega FE undervolted, etc.

See article titled:

"Fixing Vega FE: Undervolting to Improve Performance & Power Draw"

July 18, 2017 | 11:27 AM - Posted by renz

"Nvidia's not having any x86 license means that AMD can and will package price its Epyc SKUs with It's Radeon Pro WX/Radeon Instinct mi25/other AI SKUs"

as i said, it is not a problem. professional clients look at the solution first, not the package. the whole package is useless if AMD cannot provide the solution needed by the professional client.

"You are going to have to refrence your statment with some form of GPU double percision FP comparsion tables."

this is directly from AMD page:

https://instinct.radeon.com/en-us/product/mi/radeon-instinct-mi25/

AMD directly states that MI25 FP64 is rated at 1/16 of the card's FP32 (798 Gflops). the info was there all along on AMD's product page.
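As an aside, that 1/16 ratio also lines up with the 819 GFLOPS Reddit figure quoted earlier in the thread, assuming the 13.1 TFLOPS FP32 peak from the spec table above:

```python
fp32_tflops = 13.1                     # Vega FE peak FP32 from the spec table
fp64_gflops = fp32_tflops / 16 * 1000  # 1/16-rate FP64, per AMD's MI25 page
print(round(fp64_gflops))              # -> 819, matching the figure cited above
```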

July 18, 2017 | 09:41 PM - Posted by Clmentoz (not verified)

The MI25 is for AI/inferencing workloads, so the 24.6 TFLOPS FP16 metric is what is going to be used for those AI/inferencing workloads; try and have the figures and such listed in your original reply. Also, do you know of any full tabular listings on the web that explicitly list all the makers' GPUs' FP64 numbers? Because the Wikipedia folks need to be called out on their crappy GPU listings. And what about any reviews that start out comparing Nvidia's ROP/TMU counts with AMD's ROP/TMU counts, because that's mostly never done. And Nvidia, what about Nvidia's top rated FP64 SKUs, where are those FP64 numbers, Wikipedia? AMD does need to develop some tensor processor IP of its own, as Nvidia has added that to its professional Volta offerings.

Professional clients will price compare for their entire server systems, and any clients doing AI/inferencing, or other workloads, will always look at the total cost of the hardware and the total cost of ownership, power usage included.

So Nvidia not having any direct pricing influence over the CPU hardware part puts Nvidia at a disadvantage compared to AMD. To add to that, AMD can and will package price its Radeon WX/Instinct brand of professional GPU products in some package offerings with its Epyc professional CPU offerings, so that's an advantage for many professional solutions that Nvidia does not have, and I'll include the AMD server motherboard pricing, with AMD negotiating some better package pricing with its motherboard partners to seal the deals.

All the servers have to have CPUs, so AMD has that x86 advantage in a market that uses mostly the x86 32/64 bit ISA, and no Nvidia offerings there, ditto for server motherboards for Epyc server/HPC/workstation systems. Nvidia can only directly price its GPU SKUs while AMD has direct control over CPU and GPU pricing, and some control over its motherboard pricing for its server platforms.

AMD also has all of its SeaMicro server business IP, which it can now license to any of the server OEMs that make servers based on any Epyc SKUs. AMD may have shut down its SeaMicro server builder business, but that IP remains for licensing to any of AMD's partners that create complete server systems based on AMD's Epyc/other IP.

It's going to be AMD working with the server OEMs to price all of a server's processor hardware parts, and AMD has a triple threat there: CPU, GPU, motherboard (chipsets on the Epyc SOC). Those 128 PCIe lanes that Epyc 1P and 2P processors offer come via the Zeppelin dies, so that's a savings for MB makers right there.

The Infinity Fabric supported on the Epyc and Vega processors is similar in scope to what Nvidia offers with its NVLink technology, and neither AMD nor Intel will be using NVLink for any CPU-to-GPU coherent interfacing, and Intel does not use the Infinity Fabric. So that will make AMD Epyc CPUs the only choice for x86-based systems that have any workloads where the CPU needs to communicate with coherency to any Vega WX/Instinct based GPU SKUs.

Nvidia's NVLink is only being used for limited Power9 based systems, and Power9 is also certified for OpenCAPI (derived from IBM's CAPI coherent protocol) usage/interfacing. AMD is a founding member of the OpenCAPI Consortium along with IBM/others, so there will be AMD server CPU and professional GPU support for the OpenCAPI coherent interface, including for any Power9 based interfacing with AMD's GPU accelerator products.

Epyc sales alone look to be more profitable for AMD than any GPU-only markets, if you look at AMD's past Opteron market share figures. And Epyc is beating Intel in integer workloads, and AMD has its Vega GPUs for any HPC FP workloads where Epyc has an AVX disadvantage compared to Intel. AMD probably limited Epyc's FP/AVX with power savings in mind for the majority of server workloads that do not need that AVX ability. AMD has its Vega GPU accelerator SKUs to offer up for any heavy FP number crunching workloads, so that's what will make Epyc systems usage plausible for the HPC market.

July 18, 2017 | 11:05 PM - Posted by renz

stop moving the goal post. in my original post i was talking about HPC (supercomputers), not machines specific to machine learning/AI only. for this kind of machine massive FP64 performance is a must due to the various kinds of workloads being run on it. so the Vega 10 GPU will never be used in this kind of machine due to its limited FP64 performance. and then you said this:

"The Peak Double Precision Compute Performance of 819 GFLOPS(?) for Vega FE according to Reddit but Vega FE is not a Radeon Pro Branded WX SKU, so who knows about any professional compute products that AMD may have at the moment"

that's why i pointed directly to the MI25 page showing its FP64 performance. i know you want to say that the MI25 is for machine learning/AI, so its FP64 might be being held back on purpose by AMD. sadly that's not the case at all. tell me, what benefit would AMD have from limiting the FP64 performance on the MI25? if it had massive FP64 performance then they could directly compete with GP100 not just in AI but in true HPC machines as well.

you know what? you're only living in your own world and just want to talk about what you want. even when people try to guide you in the right direction you simply choose to be ignorant about it. get a life buddy.

July 19, 2017 | 08:11 PM - Posted by Clmentoz (not verified)

No, you do not much care about Nvidia's or AMD's GPUs, and the goal post is the same: Nvidia does not have any server grade CPU IP with which to get package deal sales like AMD will be able to offer. Any potential Epyc customer for AMD is a potential GPU customer, and Nvidia cannot tell Intel how to price Intel's server SKUs. AMD can and will offer any of its Epyc customers Epyc/Radeon Pro/Radeon Instinct MI25/other GPU package pricing discounts, and even motherboard discounts. So any of AMD's server OEM partners can offer their potential customers deals that Nvidia does not have the ability to match.

We already know that AMD/RTG professional WX/Instinct SKUs will be lower priced than Nvidia's Quadro SKUs, and AMD having full pricing control of its Epyc server SKUs, which are much more affordable than any of Intel's offerings, is going to get Radeon WX/Instinct into a lot of Epyc server systems over the next few years.

It's going to be that Total Cost of Ownership that gets plenty of Epyc paired with Radeon WX/Instinct GPU sales for AMD: CPUs, GPUs, and the motherboards in the server/workstation/HPC systems, via AMD's pricing latitude trifecta that Nvidia cannot match. Those teraflops/dollar and teraflops/watt figures will be looked over, and professional grade GPUs are underclocked/undervolted to reach the optimum TFLOPS/watt metrics, so overclocking ability is not a factor in the professional markets.

GamersNexus has done its first round of Radeon Vega FE (air cooled) undervolting/underclocking testing, and hopefully they will also do some professional workloads testing for the Radeon Vega FE; the FE was already winning many professional workloads benchmarks relative to the competition.

And did you find any of the Nvidia Quadro DP FP figures? They are hard to find in any comprehensive tabular format online.

July 23, 2017 | 01:20 AM - Posted by Lieutenant Tofu (not verified)

I assume Wikipedia spends (at least part of) their donations on the costs of running the servers. Unless I'm misunderstanding you, I can't think of why it would be Wikipedia's job to collect and publish GPU performance numbers. Anyone (you or anybody) could do that.

Pages about computer hardware do sometimes lag behind, I agree- sometimes still stating that a certain CPU/GPU/etc. is the performance leader in some area when that hasn't been the case for years- or they'll lack more up-to-date info on newer products. Perhaps Wikipedia pays some employees to keep sections up to date, but that's purely speculation on my part.

July 17, 2017 | 08:35 AM - Posted by Anon-y-mous (not verified)

They will say "wait for drivers!".

July 17, 2017 | 11:13 AM - Posted by Voice of Reason (not verified)

You must be some kind of stupid buddy. I have no bias for either side, but for you to be such a little fuck to waste your time commenting on this just to be negative, is utterly retarded.

Your mother should've raised a more intelligent, respectful, and less wasteful individual and you should be ashamed of yourself.

If you knew you weren't even going to give the product a chance then why click on this page?

Check yourself in the future.

July 17, 2017 | 11:39 AM - Posted by Anomynuosrex (not verified)

Someone found some cookies in their cornflakes this morning, didn't they? The previous comment is not only harmless, it's quite funny regarding not only AMD but game developers too. No Man's Sky, Mass Effect Andromeda, as well as some more recent BIG TITLE games, are having issues with drivers and hardware compatibility. Also, is your name irony?

July 23, 2017 | 01:22 AM - Posted by Lieutenant Tofu (not verified)

Um, I think the expression is "did someone piss in your cornflakes". I tried to find a previous use of the expression "cookies in [one's] cornflakes" on Google without success.

Unless you're trying to eat healthier, I don't see the harm in a cookie or two turning up there.


July 17, 2017 | 01:42 PM - Posted by flippityfloppit...

Lol, one of these types.

July 17, 2017 | 02:28 PM - Posted by Anonymouse (not verified)

You're the one with the issues, dude. Look at what he said, and look at your response, seriously. Take out whatever giant object is firmly lodged in your ass.

July 17, 2017 | 09:49 PM - Posted by SuperkoopaTrooper (not verified)

Wow projection much?

July 23, 2017 | 01:28 AM - Posted by Lieutenant Tofu (not verified)

Probably shouldn't let that remark get so under your skin. It's a common fanboy argument. Whether gaming-centric software improves Vega's gaming performance or not remains to be seen. The comment that bothered you seems to imply it won't, but there's no evidence to support it, either.

I'm disappointed to see ANOTHER reputable PC hardware site focus on gaming performance and not what the card's actually meant for. Everyone would be going "WTF, mate?" if the RX Vega or a GTX card were being reviewed only with workstation or enterprise compute software.

Furthermore, all this focus on game framerates on a card NOT intended for gaming is giving people the impression that this is (guaranteed) the gaming performance to expect from the RX SKUs and that the software is somehow "broken" and RX Vega's software has to "fix it". In actuality, the only reason for concern would be if RX Vega significantly outperformed Frontier Edition and the drivers were not updated to improve its "Game Mode" retroactively.

July 23, 2017 | 01:38 AM - Posted by Lieutenant Tofu (not verified)

Oh, to clarify: I don't particularly care if one company is ahead or the other is- just that there's competition to drive innovation, higher efficiency and lower prices.

Currently, I'm expecting performance beating the 1080 but probably not the 1080ti when the actual RX Vega gaming cards come out, with performance gradually improving with driver updates in the long-term. It seems pretty obvious that power efficiency is not as good as nVidia's GTX cards right now, though. I'm not expecting that to change with different software.

I'm planning an mATX Ryzen 5 1600-based build this year (Mini-ITX if an AM4 ITX board comes out that I particularly like) and will be using whatever is the best card for my money in performance/$. The card I pick also can't have an excessively loud cooler. Aside from that, whether the card has an AMD or an nVidia GPU doesn't particularly concern me.

I would *like* for that best-bang-for-your-buck card I choose to support FreeSync, though, due to the exorbitant cost of G-Sync monitors. Using HDMI or DVI and VSync will leave performance on the table.

July 23, 2017 | 01:40 AM - Posted by Lieutenant Tofu (not verified)

and no, I'm not expecting nVidia GPUs to support FreeSync anytime soon, limiting my choices to AMD. But I'll take DVI/HDMI and Vsync and spend a little more on, say, a 1060 instead of a 1050/ti if the performance just isn't there on AMD's side. That way I'll have some extra FPS for Vsync to eat into without harming my gameplay experience.

July 17, 2017 | 07:58 AM - Posted by Clmentoz (not verified)

Hopefully there will be some time for more Vega FE testing on games once the RTM Radeon RX Vega gaming drivers are released. But Vega FE is not for gaming but for development of games and other software (FE is not too shabby on non-gaming compute and non-gaming graphics workloads). And as usual with AMD (at least until the Epyc revenues start rolling in) it will take some time for the drivers and gaming engine/gaming software ecosystem to get fully optimized for the new Vega GPU micro-arch.

The Green Meanies can laugh all they want, but AMD has done a great job with its new Zen CPU micro-arch, and Vega FE does great on those non-gaming workloads. If RX Vega is between the GTX 1080 and GTX 1080 Ti at release then AMD/RTG did their job and more than kept their promise for a flagship offering. And AMD created Zen and Vega with a fraction of the R&D funding that is/was available to Intel/Nvidia.

AMD's Ryzen SKUs are selling well, and AMD's Polaris RX 470/570 and RX 480/580 SKUs are consistently sold out, for whatever gaming/mining reasons, and those GPU/CPU sales are still producing the revenues for AMD/RTG to be back big time, For Real, this time around.

July 17, 2017 | 09:27 AM - Posted by psuedonymous

"Pro cards are bad at gaming" seems to be a popular myth, but not one that has been true for many GPU generations. At best, you could claim that pro GPUs are clocked lower than consumer ones, but this is not the case with Vega FE (as it clocks as hard as it can within its thermal and power envelopes). Nor is it hobbled by ECC, as the Vega FE does not support in-die ECC, nor does it have any application certification (unlike a FirePro or Quadro).

I would be willing to bet that, clock for clock, RX Vega will perform the same as Vega FE does. I would also bet that RX Vega is not going to magically achieve higher clocks than Vega FE for the same power.

July 17, 2017 | 11:11 AM - Posted by Clmentoz (not verified)

Pro cards are bad at gaming, relatively speaking, compared to gaming cards if that FPS metric is your only goal. And Pro Graphics SKUs are not tuned for any FPS usage first and foremost like gaming cards are. Gaming cards can get by with some quick and dirty math libraries that are unacceptable to use for any pro workloads where accuracy is of prime importance, ditto for the graphics fidelity and error-free rendering needed for the Pro Graphics workloads, or even compute workloads accelerated on the GPU.

You will have to pair that Professional GPU card with an Actual workstation grade/pro CPU and motherboard to get that full level of ECC/driver certified support for any Real workstation platform. So Gaming GPUs are better for gaming and Professional GPUs are better for professional error-free usage. And some issues of public safety come into play if the workstation platform is used for any structural engineering workloads/etc., where things need to be as error-free as one has the Liability Insurance to cover. And any errors that may lead to lawsuits/criminal penalties need to be kept to a minimum.

July 17, 2017 | 04:14 PM - Posted by biohazard918

This is not true. Linus did a video on this comparing a Quadro to an equivalent gaming card. They performed exactly the same in games. With a few exceptions it is the exact same silicon. There may be differences in binning or TDP, but when compared apples to apples they are exactly the same in games.

Quadro m6000 vs titan xp

https://www.youtube.com/watch?v=LC_sx6A5Wko

July 17, 2017 | 05:58 PM - Posted by Clmentoz (not verified)

The Titan XP is not a gaming-only focused SKU; it's in the same usage/marketing class as the Radeon Pro FE, so it's not marketed for pure gaming-only workloads. The Titan XP has 3840:240:96 Shader Processors : Texture Mapping Units : Render Output Units, and the GTX 1080 Ti 3584:224:88. The Quadro M6000 is based on the Maxwell GPU micro-arch, so that's not the same silicon as the Titan XP, and the Quadro M6000 comes with 12 and 24 GB video memory options and has 3072 CUDA cores.

The Quadro P6000 is the SKU with the 3840 CUDA cores, but thanks to Wikipedia's crappy non-standard way of listing the Quadro specs there are no TMU/ROP figures for the Quadro SKUs. And looking at Linus's video, it is in fact a Maxwell SKU that's supposed to be giving a Titan XP a run for the money! So what is up with that? It's hard to know without those Quadro TMU/ROP numbers for both the M6000 and the P6000.

If you are going to list a video card comparison in the future, at least list the two cards' Shader Processor, Texture Mapping Unit, and Render Output Unit counts and the proper SKU IDs, because the ratio of ROP count to TMU/shader processor counts can be used to gauge a GPU's likely FPS advantages. And what's the version of the graphics API, DX##, used for the benchmarks, or even the DX## feature level support, Pro versus Semi-Pro Nvidia SKU?

AMD's Vega FE is most certainly not going to be able to fling the frames out there without the ROPs with which to do that, but there is plenty of extra compute in Vega for those better professional workloads that may just produce some of the revenues (Epyc revenues will help AMD more) to allow AMD to afford to engineer a gaming-only focused GPU design/designs.

Nvidia is always going to be able to get higher FPS than AMD; just look at the ROP ratios on the Nvidia SKUs. Look at the number of ROPs on the Vega FE (64): that matches the GTX 1080's 64 ROPs, while the 1080 Ti has 88 ROPs. The Radeon Pro FE is heavy on the compute/shader side at 4096 shader cores compared to Nvidia's lesser compute resources. AMD needs a gaming-focused SKU that gives up some compute/shaders and adds a better ROP count, but that will have to wait until AMD has the revenues to engineer a gaming-only focused GPU series.
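Putting rough numbers on that ROP argument using the spec table above (a simple peak-fillrate sketch; real throughput also depends on bandwidth, blending, and drivers):

```python
def peak_fillrate_gpix(rops: int, boost_ghz: float) -> float:
    # peak pixel fillrate: pixels written per clock times clock speed
    return rops * boost_ghz

print(peak_fillrate_gpix(64, 1.600))  # Vega FE     -> 102.4 Gpix/s
print(peak_fillrate_gpix(64, 1.733))  # GTX 1080    -> ~110.9 Gpix/s
print(peak_fillrate_gpix(88, 1.582))  # GTX 1080 Ti -> ~139.2 Gpix/s
```

By this crude measure Vega FE's pixel throughput sits beside the GTX 1080 and well below the 1080 Ti, consistent with where the gaming results have landed.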

The miners love AMD's Polaris GPUs for that extra compute reason alone; miners do not care about gaming, but AMD has to focus Vega's extra compute on the professional market where the revenues are to be made. Look at where Nvidia gets the revenues for R&D: it's in that professional market. The Vega FE is doing well just getting the FPS it's currently getting with the number of ROPs that it has. The Vega FE has a way higher compute/shader count than Nvidia at the consumer/prosumer level. Unless AMD can somehow turn those primitive shaders into programmed ROPs, AMD's RX Vega is bound to land somewhere between the GTX 1080 and GTX 1080 Ti on most DX11 benchmarks. With DX12 gaming still in the first stages of gaming/games maker adoption, there is going to be an extended period of time required to work up any Vega micro-arch based SKU's software/driver ecosystem support.

If you can afford that $5000+ Quadro, maybe that M6000 is a good deal used for your gaming needs, and the P6000 is probably even more costly. I do think that comparing a Pro SKU to a non-pro or semi-pro SKU is maybe not so apples to apples, looking at the M6000's 12GB and 24GB VRAM options.

There are so many new IP features in AMD's Vega GPU micro-arch that it's going to take some extra months of testing while AMD gets its drivers up to par on Vega, Vega FE, RX Vega, and any Radeon Pro WX (formerly FirePro branded real Pro SKUs). If that Quadro M6000 is beating a Titan XP then maybe there is a difference in the Maxwell (GM200/GM204) compute ability in addition to the extra VRAM on the Quadro SKU, but there are too many unknowns to prove your point, and Linus needs to list the full testing specs in a readable format.

July 17, 2017 | 08:18 PM - Posted by Jake Tapper (not verified)

Linus has long been known to be a dumbass in regards to computing; he's good at making videos (actually his staff is) and that's about it really.

July 17, 2017 | 10:33 PM - Posted by renz

did you actually watch the video? all that long comment is useless if you did not watch the actual video first. the comparison is not with the titan XP but with the maxwell titan X. both the Quadro M6000 and Titan X have similar specs except for the amount of VRAM and the Quadro's capability to support ECC. and in games (crysis 3) both cards produce the same FPS.

"look at where Nvidia gets the Revenues for R&D it's in that professional market."

nope. the majority of nvidia's revenue still comes from their gaming business. their professional revenue has indeed been increasing in the last few quarters, but the major contributor (over 50%) is still the gaming segment alone.

http://www.anandtech.com/show/11361/nvidia-announces-earnings-for-q1-fy-...

July 18, 2017 | 12:48 AM - Posted by Clmentoz (not verified)

No, and it's up to you to back up your statement with readable text/HTML article based references; videos are TL;DW. And it looks like Nvidia's gaming revenues as a percentage of their total business are on a relative downward trend (but the PC market is in decline) relative to Nvidia's non-gaming revenues, which are increasing, and JHH sure spends a lot of on-stage time talking about markets other than gaming.

I never said that Nvidia's gaming revenues were not relevant; it's Nvidia's professional market focused earnings growth and research that is giving Nvidia more billions in revenues to justify the investment in specialized professional GPUs and gaming-only tuned GPU SKUs that have less compute and use less power.

Look at the trends for Nvidia's professional market sales and look at the wild swings in the gaming market revenues relative to Nvidia's professional market revenues. JHH is not dumb; he is looking towards Nvidia's future, and that's a future not dependent on the fickle gaming-only market.
For Nvidia that's a +186% year-over-year data center revenue improvement and a +50.5% improvement for automotive over the same period. Gaming market revenue swings and the PC market's decline are not good news for either AMD or Nvidia for their respective gaming earnings growth over the long term.

AMD has so little revenue in the professional markets currently. But the Epyc CPU sales revenues will probably surpass any of AMD's GPU revenues in short order and give AMD the funds to invest in RTG, which will allow RTG to design some gaming-only focused GPU SKUs with those HIGH ROP/TMU numbers and less compute to save on power. AMD will also have to continue with the Radeon WX/Radeon Instinct focused investments to get more of that professional GPU market's additional revenue stream on top of any gaming market revenues. Gaming only will not keep anyone in business.

July 18, 2017 | 07:27 AM - Posted by psuedonymous

"Pro cards are bad at gaming, relatively speaking, compared to gaming cards if that FPS metric is your only goal. And Pro Graphics SKUs are not tuned for any FPS usage first and formost like gaming cards are. "

There is no magical hardware tuning, at the same clocks on the same die, the performance will be the same. Even in cases where drivers are used to enforce more conservative code paths, merely switching drivers is sufficient to eliminate this disparity.

"You will have to pair that Professional GPU card with an Actual workstation grade/pro CPU and motherboard to get that full level of ECC/driver certified support for any Real workstation platform. So Gaming GPUs are better for gaming and Professional GPU are better for professional error free usage. "

This is not the case with the Vega FE, as it lacks both driver certification AND ECC. No in-cache ECC, no in-memory ECC.

July 19, 2017 | 08:37 PM - Posted by Clmentoz (not verified)

I was talking about Pro GPUs having to be paired with pro CPUs/MBs for a fully certified Real Professional grade system, and not the Vega FE, which is why your statement is off target.

There will not, I repeat, NOT be any structural engineering firms using the Vega FE or the TITAN X/whatever. So what are you on about? I think that you need to re-read the post that you have replied to, as you have not understood it.

"There is no magical hardware tuning, at the same clocks on the same die, the performance will be the same. Even in cases where drivers are used to enforce more conservative code paths, merely switching drivers is sufficient to eliminate this disparity. " (Not Sure what GPU SKU you are talking about I was talking about the Real Pro GPUs)

You really have very little idea how ECC works in hardware, and how the chipset drivers and the memory protocols manage ECC in the firmware of both GPUs and CPUs that have ECC turned on/ECC abilities. And all of that is done in a manner transparent to any user-space gaming code. Error correction in the hardware takes extra time and adds latency on any type of processor.

The RX Vega is going to be [at launch/introduction] between the GTX 1080 and the GTX 1080 Ti, and AMD will have its flagship SKU. As usual, RX Vega will have to go through the nominal AMD level of driver revisions after release to get any higher levels of gaming performance [that Fine Wine thingy that AMD does because of its relatively underfunded driver teams], and the Epyc revenues will help with that problem the next time around.

July 23, 2017 | 01:48 AM - Posted by Lieutenant Tofu (not verified)

I'm no expert in the area, but I think a common reason for a Pro graphics card to underperform in games is that its drivers are optimized for image accuracy in things like heavy rendering jobs where you want an exact and clear image, not a less-accurate one taking shortcuts to churn out 60+ rendered images back-to-back in a second (like a game situation.)

I'd expect any performance improvements in RX Vega's drivers to be retrofitted into the "Game Mode" of FE down the road.

I'm guessing drivers for cards like Quadro/FirePro are more mature, since I agree they perform fine in games.

I wonder if FE is doing any geometry culling- which is crucial to a decent, sometimes even playable, framerate in games. I mainly work with older game engines, but in my development experience, no culling can tank framerates.

July 18, 2017 | 11:40 AM - Posted by Gas (not verified)

It's the exact same specs. Why would it perform drastically different?

July 17, 2017 | 07:34 AM - Posted by unacom

Neat. But here's the big question: how well does it do for visualization in SolidWorks?

July 17, 2017 | 08:32 AM - Posted by I don't have one (not verified)

That's exactly what I was thinking.

"Where are all the professional software suites? Where's Adobe, where's AutoDesk, AutoCAD, SolidWorks? Where's Blender, 3DMax, Maya? Where's all the ray tracing and image detection suites?"

This review pitches the equivalent of a Quadro card at a host of games. Not that there's anything wrong with that, but the actual usage of this variant of Vega is oriented towards professional work, so where are the professional software benches?

July 17, 2017 | 09:30 AM - Posted by psuedonymous

Vega FE is not comparable to Quadros or FirePros, as it lacks application certification.

July 17, 2017 | 10:52 AM - Posted by Clmentoz (not verified)

Vega FE will work with the professional software just fine for most usage. So Blender and the other software will mostly work. If you are doing production work on a tight deadline then you may want the Radeon Pro WX (formerly branded FirePro) cards with the full hardware/driver certification and such. But you are also going to need the fully certified Epyc workstation SKUs for more error-free production, where errors can cost you a job/further business if that job is not done by the deadline because of production errors. Ditto for any Nvidia/Intel professional workloads where if it's not done correctly and on time that's someone's A$$.

So for learning and development work the Radeon Vega FE is about the same as the Radeon Pro Duo in its workload usage metrics, with the Vega FE performing better on some workloads than the Radeon Pro Duo.

July 17, 2017 | 06:23 PM - Posted by DaKrawnik

^This guy knows his stuff.

July 23, 2017 | 10:48 AM - Posted by Just an Nvidia User (not verified)

Yeah, so much so that maybe Clementoz is a paid employee of AMD.

AMD must not have any magical drivers for RX Vega either, because it is failing to impress in Budapest and Portland so far against what is most likely a stock 1080. A blind test, no framerates, in BF1. Is this some idea of a joke?

If RX were better they would be killing the media with "leaks" by now showing its shining performance.

July 17, 2017 | 04:34 PM - Posted by TheCatOfWar (not verified)

Yeah but let's be real, 90% of people looking forward to Vega could not care less and have probably never heard of most of those programs. Most people are only here in hope that it'll shed some light on the gaming performance of the RX.

Though the absence is somewhat perplexing given that the air-cooled version had been tested in them by many reviewers.

July 17, 2017 | 06:24 PM - Posted by DaKrawnik

^This.

July 23, 2017 | 01:49 AM - Posted by Lieutenant Tofu (not verified)

EXACTLY.

July 17, 2017 | 04:59 AM - Posted by Anonymouss (not verified)

I think Vega will be the same as the R9 Fury series was. Nobody will buy this; it is just a waste of money and time for AMD.

July 17, 2017 | 05:07 AM - Posted by SarathGM (not verified)

LOL... Nvidia fanboys don't even know what they are saying. Actually the Vega FE card is not a gaming card; you all know that and are still whining like children. Feeling the heat of the Vega gaming card which is yet to be released?

July 17, 2017 | 05:15 AM - Posted by Anonymously2 (not verified)

I think we will all feel the heat. The Vega family are hot, hot, hot power-hungry disappointments.

July 17, 2017 | 05:24 AM - Posted by Snippermanden (not verified)

ooohh I see, it is not a "gaming card", so they left out the flux condensator that would somehow turn the card into a "gaming card". Silly me! I'll bet my house, wife, and kids that Vega will disappoint all but the most hardcore fanboys.

July 17, 2017 | 06:47 AM - Posted by Odizzido2 (not verified)

Wow you bet hardcore. Well, I use both AMD and nvidia, currently an nvidia card, so I don't think you can consider me an AMD fanboy. Let your family know I might be moving in. BTW any of your family members play path of exile?

July 17, 2017 | 05:33 AM - Posted by Hypetrain (not verified)

It doesn't have the crippling 4GB-only HBM the Furies had - so if priced right (with this perf/watt) and if the custom air coolers can handle it well enough, it will find enough buyers who don't care about the power consumption that much.

Still, AMD is in a tough spot if this is the final performance of the chip and there are no drivers coming up to rectify that. Then the hardware will be just too expensive to sell at a competitive price.

July 17, 2017 | 08:42 AM - Posted by Clmentoz (not verified)

The Polaris/Vega (Radeon Pro WX and Radeon Instinct MI25) SKU sales and corresponding Zen/Epyc sales for the professional markets will obviate the need for AMD to rely on consumer/gaming revenues alone for both GPUs and CPUs. So AMD may have some pricing latitude to price its consumer Vega SKUs at very close to break-even pricing to gain/maintain flagship GPU market share. AMD's Polaris based RX 470/480 and RX 570/580 SKUs are sold out, for whatever gaming/mining reasons, so AMD's GPU revenues are not hurting at the moment for lack of sales.

The Epyc sales are just beginning, and the indications are that Epyc's revenues will get AMD back into the server/workstation/HPC market in a better way than even AMD's Opteron SKUs did/could, above even Opteron's best market metrics from back in those days.

July 17, 2017 | 09:23 AM - Posted by looncraz (not verified)

Has anyone ever actually been crippled by Fury's 4GB of HBM? Because I sure haven't.

July 17, 2017 | 02:06 PM - Posted by Anonymoussss (not verified)

Fury X owner here. Yes, unfortunately, when I moved to 1440p you can't run a lot of newer games at max settings with only 4GB.

July 23, 2017 | 01:50 AM - Posted by Lieutenant Tofu (not verified)

Hi! My name is "Nobody!" I had a 280X 3GB and loved it. It was the best choice for my money at the time I built my i5 Devil's Canyon machine.

Sweeping generalizations ahoy in here

July 17, 2017 | 05:20 AM - Posted by Hypetrain (not verified)

Radeon Vega Murloc Edition *gurgle*

July 23, 2017 | 01:51 AM - Posted by Lieutenant Tofu (not verified)

My Corsair AIO CPU cooler sounded a little like that, but you could barely hear it at all under the (also quiet) GPU and 140mm case fans. I'll take hearing the liquid move around in there over the shrill whiny noise of small-diameter blower fans.

July 17, 2017 | 07:25 AM - Posted by Clmentoz (not verified)

"While I know that testing the Frontier Edition in ONLY gaming is a bit of a faux pas, much of our interest was in using this product to predict what AMD is going bring to gamers later this month. It is apparent now that if the clocks are in the 1600 MHz range consistently, rather than dipping into 1400 MHz and below states as we found with the air cooler at stock, Vega in its current state can be competitive with the GeForce GTX 1080. That’s an upgrade over where it stood before – much closer to GTX 1070 performance."

Raja said that this was not a gaming card, and it's not. So now, on to some more compute/non-gaming graphics rendering/video workloads testing, etc. Now that you have these cards it would be nice to do the other non-gaming workloads testing.

That includes some PCI pass-through testing with Ryzen/Threadripper (better yet Epyc also) and KVM, with some virtualized GPU testing across several different running VM/OS instances all sharing one virtualized physical GPU subdivided into logical GPUs (Nvidia and Intel mostly restrict this to their Pro SKUs only). Also some dual and triple Vega FE testing (you have 3 Vega FEs to play with) on maybe Threadripper (not the best solution), or maybe that Epyc 7401P 24 core ($1075); I'm seriously looking at that 7401P single socket workstation SKU and its 128 lanes of PCIe goodness. Maybe some Radeon Pro WX testing also, if AMD can provide any samples.

Also, for any folks that may do Vega FE compute/non-gaming 3D rendering (animation) workloads: can you try some undervolting and/or underclocking of the Vega FE while testing for power usage? If it can be done without adversely affecting performance, then Vega FE will be useful for some hours-long 3D animation rendering workloads.

$1075 is not so bad for 24 Epyc 1P server/workstation cores and you already have the semi-pro Vega FE for some video and other workload testing. How hard will it be to get any Epyc motherboard samples from the motherboard makers or even some Epyc CPU samples for testing?

P.S. If RX Vega can get to GTX 1080 levels of performance then at least AMD will have its flagship SKU. But until AMD gets the revenue stream (most likely from Epyc sales), AMD/RTG will not have the funds to produce a gaming-only focused GPU SKU with loads of ROPs/less compute, designed to spit out the FPS like Nvidia is able to with its massive revenue stream.

AMD, and its RTG, only had the funds to create that modular Zeppelin server die (used across its Ryzen/Threadripper/Epyc platforms) and only enough funds to try and produce a GPU micro-arch that would be great for professional/compute/AI workloads while at least getting AMD a "Flagship" gaming SKU to go along with its current Polaris based mainstream gaming SKUs.

Navi will be RTG's first all-Raja-managed design, as the previous GCN designs were started before RTG was created. Navi (modular GPU die/s) is going to give AMD the ability to create some offerings/introductions (within a few months' time) for a full range of Navi micro-arch based GPUs: low end, mainstream, and flagship, in a similar manner to what AMD has offered with the Zen/Zeppelin die based Ryzen/Threadripper/Epyc SKUs that cover a complete product range from desktop to server/HPC. No more mainstream GPU designs one year and flagship GPU designs the next, because with Navi's modular design AMD will have a scalable/modular and smaller GPU die and great wafer/die yields to create GPUs for all of its gaming markets, ditto for professional Navi modular die based designs for the compute/AI markets. Going modular/scalable on that Infinity Fabric worked out great for AMD's Zen micro-arch based CPU offerings.

July 17, 2017 | 09:06 AM - Posted by peter j connell (not verified)

"the diaphragm pump design that makes this card interesting. Think of this is as a flexible reservoir with a high-tension spring to create pressure back into the system. A diaphragm pump works with one-way check valves and reciprocating diaphragm material to create alternating areas of pressure and vacuum.

.... maintain reliable usage of the cooler through the course of natural evaporation of fluid. This is very similar the kinds of pumps used in fish tanks and artificial hearts, likely a more expensive solution than was found on the Radeon Pro Duo or Fury X"
....

"The noise made by this pump is very different than traditional AIO coolers we have used in the office, more of a “gurgle” than any kind of “whine”."

Sounds like a) they are doing something new and better w/ LIQUID COOLING, & b) quiet is something many would pay a premium for.

July 17, 2017 | 01:25 PM - Posted by Allyn Malventano

I wouldn't consider it groundbreaking. I still found it louder than other coolers, but the high pitch whine was replaced with a ~60Hz diaphragm type pump sound. It's just a different type of pump. Time will tell if it is more reliable, etc.

July 17, 2017 | 09:57 AM - Posted by DinoBuaya (not verified)

Doubles as a nice space heater for the winter; maybe by Christmas AMD can hope to make some sales. I can just picture it now, AMD's marketing going: "Game like a pro while keepin' it cozy for Yuletide!"

July 17, 2017 | 10:09 AM - Posted by Hameedo (not verified)

So all of those performance graphs show the OC'ed Water Cooled version? Why not test it @1600MHz stock?

July 17, 2017 | 01:29 PM - Posted by Allyn Malventano

The overclocked results are on the overclocked results page.

July 17, 2017 | 02:35 PM - Posted by T1beriu Mr. Santa (not verified)

Have you been to the overclock page? There are no results there, just frequency scaling.

I see that almost all charts, except some of GTA V, show WC 350W 1712 MHz (!!!). Is that a mistake? Is the card at the stock 350W BIOS with stock 1600 MHz, or is it a real OC at 1712 for most of the tests?

Are the tables at the end of every test for 300W stock clocks, 350W stock clocks, or the 1712 MHz OC?

July 19, 2017 | 10:35 PM - Posted by Allyn Malventano

There was a typo in the charts. Configuration for the non-OC tests was with the switch in the 350W position.

July 20, 2017 | 05:11 AM - Posted by T1beriu Mr. Santa (not verified)

Ok. So I see you haven't updated the charts yet or even put a big red disclaimer at the beginning of the review... Good job.

I think that's hurting your readers, because you have no OC benchmarks posted, so people assume what they're seeing in the charts are real OC tests, thus concluding that OC-ing over stock 300W, from 1382/1440 to 1712 MHz, is useless.

July 17, 2017 | 10:11 AM - Posted by Fedc (not verified)

Oh man, the Threadripper deals just keep getting sweeter from AMD. AMD's got them Zeppelin Die/Wafer yields up into the stratosphere and out into space and far to the stars.

Now if Threadripper is rated for a 180 watt (TDP) cooling solution, then Threadripper should overclock nicely with the included water cooling solution.

"AMD to Include AIO Liquid Coolers with Ryzen Threadripper Processors"

Go to the techpowerup website (article); the S P and the A M filter is mad about other websites' references lately!

July 17, 2017 | 11:10 AM - Posted by Salick27 (not verified)

Does this mean you have 2 air cooled and 1 water cooled variant on hand? If so, you should plug them into a system with a i7-7900x and see if you can dim the lights.

July 17, 2017 | 11:31 AM - Posted by Spunjji

One-thousand internet points for you, sir.

July 17, 2017 | 11:25 AM - Posted by DanGer (not verified)

So this is a pro card, and as you admit it is a faux pas to bench it for gaming. Seeing that it is designed for pro use and Nvidia pro solutions range from 3k to 5k with less performance, how can you possibly take issue with the $1500 price tag???

This is a pro card. It is the best performing pro card in price/value and performance in the market today. Period. It is a runaway win for its purpose. It is disappointing to me that game geek reviewers can't evaluate this card for what it is instead of what they want it to be.

July 17, 2017 | 11:53 AM - Posted by quest4glory

It's a prosumer card. It's not a pro card. Think of this as a less potent TITAN X with a liquid cooled design, as opposed to those expensive professional cards you refer to. It's not the same thing: the drivers aren't certified for pro apps (which is what you're actually paying for), they tend to have single slot designs for at least some of the pro cards, etc.

July 17, 2017 | 01:22 PM - Posted by Allyn Malventano

AMD has stated that the driver code is the same when used for gaming.
This is the same GPU that will be present in the gaming part.
Not hard to connect the dots. Sure there will be some further optimizations, but don't expect a miracle, regardless of how many times you've been promised one in the past.

July 17, 2017 | 04:50 PM - Posted by DanGer (not verified)

They also said that this should not be compared to the RX version and that the RX version will be a faster gamer. If it is the same hw, it seems the difference is drivers. Obviously the gaming drivers are not ready, hence this is not a good example of real gaming marks. Maybe when the game optimized drivers are out this will get updated. Drivers matter.

July 23, 2017 | 01:59 AM - Posted by Lieutenant Tofu (not verified)

Well, it seems to be the same faux pas all the PC hardware press is making. It's basically just throwing raw meat to the feuding packs of fanboy gamers the card's not even intended for. Disappointing.

July 17, 2017 | 11:27 AM - Posted by 1Shingo5 (not verified)

Is that a reservoir at the back of the card with 1 tube going to it?

July 17, 2017 | 11:51 AM - Posted by quest4glory

In part, but read the article.

July 17, 2017 | 11:44 AM - Posted by quest4glory

A beefy air cooler should be able to compete with the liquid cooler at 1600 MHz (thinking a 2.5 slot design like we see on many 1080 Tis.)

If the RX Vega product isn't much more than this, or is identical to this silicon with similar driver performance, the board partners need to come in around $479 for their custom designs for this to make any significant impact on GTX 1080 sales at current market prices.

Ideally it would be a $400 GPU, but that's probably not going to happen.

July 17, 2017 | 12:26 PM - Posted by T1beriu Mr. Santa (not verified)

I see that almost all charts, except some of GTA V, show WC 350W 1712 MHz. Is that a mistake? Is the card at the stock 350W BIOS with stock 1600 MHz, or is it a real OC at 1712 for most of the tests?

Are the tables at the end of every test for 300W stock clocks, 350W stock clocks, or the 1712 MHz OC?

July 17, 2017 | 01:10 PM - Posted by LegoGuy23

But is it really a membrane pump, though?

GamersNexus's teardown showed only one fluid tube connecting the water jackets with the tank on the right.
This would imply, then, that the water pump is in fact on the main water block, like they thought, and that the tank on the right is simply a membrane tank to provide constant pressure, like the expansion tank atop your water heater.
This would make sense, as it would allow for constant pressure even as some water permeates through the tubing and evaporates.

July 17, 2017 | 01:19 PM - Posted by Allyn Malventano

It has a hum similar to a fish tank aerator pump. The pump is over the block and the additional tank is as you describe. You need an expansion volume for this type of pump to operate efficiently in a closed loop. I don't suspect leakage is that much of an issue on modern closed loop coolers, so the primary purpose would be maintaining pressure during the suction stroke of the pump.

July 17, 2017 | 08:11 PM - Posted by LegoGuy23

Makes sense.
Thanks for the reply!

July 17, 2017 | 07:06 PM - Posted by quest4glory

They (GN) didn't do a "teardown," they commented on pictures someone sent to them from the Internet.

July 17, 2017 | 08:12 PM - Posted by LegoGuy23

That's a fair point.

July 17, 2017 | 02:21 PM - Posted by Mr.Gold (not verified)

1382 MHz to 1712 MHz and we see a ~10% bump in results? And sometimes not even 4%...

If you bump your GPU clock 25% but only see a 4% improvement, then either:
-the GPU is bandwidth limited
-the drivers are limiting the card

On the same note: could we reduce the clock rate by 25% and only lose ~10% performance? Maybe not even 4%? That could drop power usage to well under 200W. It seems possible AMD could make a decent 1070 competitor then?
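
For anyone who wants to check that math, here is a minimal back-of-the-envelope sketch in Python, using the clock and performance figures cited in the comment above (illustrative only; the actual gains vary per game):

```python
# Rough clock-scaling check using the figures cited above (illustrative only).
base_clock = 1382.0   # MHz, stock base clock
oc_clock = 1712.0     # MHz, overclocked liquid-cooled figure
perf_gain = 0.10      # ~10% observed improvement (sometimes as low as 4%)

clock_gain = oc_clock / base_clock - 1.0        # ~0.239, i.e. ~24% more clock
scaling_efficiency = perf_gain / clock_gain     # fraction of the clock gain realized

print(f"Clock gain: {clock_gain:.1%}")                  # ~23.9%
print(f"Scaling efficiency: {scaling_efficiency:.1%}")  # ~42%
# Scaling well below 100% suggests a bottleneck other than core clock,
# e.g. memory bandwidth or driver overhead, as the comment speculates.
```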

July 23, 2017 | 02:00 AM - Posted by Lieutenant Tofu (not verified)

nVidia's GPUs don't seem to scale linearly with clock speed anymore, either.

July 17, 2017 | 02:34 PM - Posted by Chris Barrett (not verified)

Ryan,

I just want to say that I think the graphs are a little hard to read. It might be a good time to update the way you display this information. Thank you for providing great coverage of new technology.

July 17, 2017 | 03:51 PM - Posted by Maksiedn12r (not verified)

Too bad NVIDIA hasn't taken the AIO route like AMD for their GPUs, especially when heat plays a big role with boost.

July 17, 2017 | 04:03 PM - Posted by LongTimePCperReader (not verified)

After reading this review of the WC Vega FE, I am fairly sure AMD is hitting a wall with their GPUs. The fact that AMD is slapping an AIO water cooler on their "pro-sumer" card tells me their ability to provide efficient cards is out the window. Granted, the air-cooled version is a bit of a *fart in the wind* situation.
After reading many reviews of the Vega FE (both air and WC), AMD's drivers are once again the main culprit in their fairly poor GPU gaming performance. IMO, the review needed professional/computational benchmarks. Other than that, seeing AMD's new high-end GPU falling below a 1080 is a bit concerning. I have lost a lot of hope in AMD's RX Vega lineup of GPUs. But I'll remain optimistic and ultimately hope AMD steps up to the plate with their upcoming gaming-oriented GPUs. Good review as always. Thanks for posting the reviews, Mr. Shrout, and take care (same goes for the rest of the PCPer crew).

July 17, 2017 | 05:31 PM - Posted by bria5544

The only reason anyone would buy this or the air-cooled model is a fanatical hatred of Nvidia. Even if you went crazy and bought the Titan Xp, you'd still get better performance at a lower cost than the liquid-cooled card. I'm not sure what AMD's game is here, but it doesn't look good. The release drivers had better produce some impressive gains or they've wasted 2+ years of engineering time with this "new" architecture. I will continue to recommend the Polaris parts to anyone with a limited budget who asks. As things stand now, I can't do the same for Vega.

July 17, 2017 | 06:04 PM - Posted by DaKrawnik

AMD
Another
Major
Disappointment.

July 23, 2017 | 02:01 AM - Posted by Lieutenant Tofu (not verified)

Yeah, Ryzen/EPYC have been soooo disappointing

July 17, 2017 | 10:17 PM - Posted by #650cores (not verified)

Something does not seem right with this launch; going to reserve my judgment until the gaming cards launch.

Ryan, can your team run some High Bandwidth Cache tests, or is that feature disabled?

July 17, 2017 | 10:26 PM - Posted by John Pombrio

From Gamers Nexus:
Screaming “it’s not a gaming card!” from the front page of the internet doesn’t change the fact that, in actuality, AMD does intend for Vega: FE to sit wedged between gaming and production. This is something the company has confirmed with GamersNexus. It also does not change the fact that the card clearly exhibits driver inadequacies, and it is not on the press to “wait” to review an incomplete product – it’s on the company to ship a complete device which is representative of its marketing. We agree, and have published, that this is not a device best used for gaming. We’re not saying it is a gaming device, and have shown some production workloads where Vega: FE does well. What we’re saying is that, having now spent a week on various aspects of this device, giving AMD a free pass by repeatedly copying and pasting “it’s not a gaming card” into comment boxes isn’t going to encourage the company to fix its inadequacies. The drivers are inadequate to an extent that makes it difficult to tell whether the hardware is any good.

July 18, 2017 | 01:24 AM - Posted by Clmentoz (not verified)

Maybe GamersNexus could provide some commercial bank funding to AMD, or ask its readers to purchase AMD stock, but Nvidia and JHH are seeing mad revenue growth in the data center and even the automotive markets, and AMD probably should focus on the Epyc and Radeon Pro WX/Radeon Instinct markets until it can design a gaming-only-focused GPU microarchitecture!

AMD's Epyc server SKUs need matching AMD GPU accelerator/AI products with even more compute, and ROPs/TMUs be damned; those data center revenues and margins are where the growth is for Nvidia, and they are where AMD is also going to get the most revenue growth over the next few years, from Epyc even more than from any Radeon Pro WX/Radeon Instinct or gaming-only GPU sales.

When those Epyc revenues arrive, AMD can divert some funding over to RTG so it can design some gaming-only-focused SKUs. And Navi is going to bring that same Zen/Zeppelin-die economy of scale to AMD's GPU development process and allow AMD to go back to providing yearly updates across all of its GPU lines and markets, mainstream to flagship, and even mobile GPUs, where AMD really lacks a presence. Zen/Vega APUs will help there also.

JHH over at Nvidia sure has spent extra time talking up other markets, and it's those GPUs for data center/automotive where Nvidia is getting its largest Y/Y revenue growth figures.

GamersNexus has just published (7/18) some new undervolting/etc. benchmarks for the Vega FE, and there are some better gaming and power usage figures to be had. But AMD does need to hire more software/driver engineers, and the Epyc revenue stream will provide some much-needed funding for that and other R&D. AMD still needs to plow all its revenues back into the business, including forgoing any dividend payouts. Let the AMD shareholders earn via equity over the next few years; it's reinvest-like-mad time for AMD.

July 23, 2017 | 02:04 AM - Posted by Lieutenant Tofu (not verified)

With AI and other "Big Data"-type applications really taking off, coupled with nVidia withholding its newest architectures from gamers in favor of releasing datacenter-oriented devices with logically huge price tags, it makes a lot of sense for AMD to position Vega for non-gaming use first (to me, at least). Unfortunately, that leads to lots of high-pitched squealing on the Internet... and not from the professional crowd.

July 18, 2017 | 01:17 AM - Posted by bbf

Yep, AMD fanbois out in force for this article.

I've purchased AMD video cards for the last few generations and was waiting to see if Vega would leapfrog Pascal... and it looks like it won't unless AMD really does some magic with their drivers.

I like reading about magic in fantasy books, but don't believe in it in real life... so after the first Vega benchmarks came out, I decided to fork over $700 for a 1080 Ti.

I am now back with the green team... and am very satisfied with the card. I don't have to create a custom resolution to support my Cinema 4K monitor, it just works, unlike with my AMD/ATI graphics card...

I've put my money where my mouth is and committed to a 1080 Ti. I really doubt that the highest end RX Vega gaming card will come close to 60fps gaming on a 4K monitor with decent quality settings.

July 18, 2017 | 02:19 AM - Posted by jdismjaokdas (not verified)

I bet you that AMD's line about "RX Vega will be better than Vega Frontier in gaming" is just empty words. Nobody builds a different GPU for gaming and compute, so the only way RX Vega could be better in gaming is to have more CUs, which I highly doubt.
Also, RX Vega is not yet on the market because HBM2 is very buggy and costs a lot, so AMD decided (and rightfully so) that it is better to launch the more expensive "compute" cards for professionals first and then, when HBM2 production starts to ramp up (which hasn't happened yet), launch the same graphics card under a different name, with 8GB of VRAM, the same 300W TDP, and the same close-to-but-not-quite GTX 1080 performance.
If I were them, I would sue Micron and whoever else works on making HBM2, because they cost AMD an awful lot of time, money, and market share. I would also fire all the engineers in the power-optimization department, because consuming 300W for performance comparable to a 180W GTX 1080 is just lame. GlobalFoundries might have something to do with that, but as we saw with Ryzen, the 14nm process from GF has the potential to be very energy efficient.
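
As a quick sanity check on those power claims, here is a minimal sketch in Python, using the board-power figures quoted in this thread and assuming roughly equal gaming performance (which matches the review's findings):

```python
# Rough perf-per-watt comparison using the power figures quoted above.
# Assumes roughly equal gaming performance, so the ratio of board powers
# approximates the efficiency gap (illustrative only).
vega_fe_stock = 300    # watts, default BIOS state
vega_fe_high = 350     # watts, high-power BIOS state
gtx_1080 = 180         # watts, reference TDP

print(f"Stock BIOS: ~{vega_fe_stock / gtx_1080:.2f}x the power of a GTX 1080")  # ~1.67x
print(f"High-power BIOS: ~{vega_fe_high / gtx_1080:.2f}x")                      # ~1.94x
# Hence the "almost 2x the power" complaints elsewhere in this thread.
```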

July 18, 2017 | 05:10 AM - Posted by Nima63 (not verified)

Read the review again. It needs 350W and a water-cooling solution just to get close to a reference 1080 with a blower-type cooler. Unbelievable, right?

July 23, 2017 | 02:07 AM - Posted by Lieutenant Tofu (not verified)

I haven't seen anything from AMD to indicate that they'd be using a different GPU for RX Vega cards. Different memory, software, and power-delivery choices on the reference cards, though, sure.

July 18, 2017 | 04:06 AM - Posted by gamer (not verified)

OMG!
Well, AMD really made a useless, earth-scorching... 'thing'.

It's still crystal clear that even the GTX 1070 is much better than ANY version of AMD's Vega GPUs.

I wonder why PCPer didn't include overclocked cards and the higher-speed 11Gbps memory version in the test. For example, since the Vega was a liquid-cooled, overclocked version, the GTX 1080 tested should have been something like the ZOTAC GeForce GTX 1080 AMP Extreme+ (11Gbps GDDR5X), and it's clear that card beats the liquid-cooled Vega.

And we can't look at 3DMark scores alone; we MUST check efficiency, and there AMD Vega is the lousiest in many years, the same way the Fury X was.

If the 165W GTX 1070 loses only a little to the liquid-cooled Vega monster, that tells you everything about Vega's old, lousy tech.

It's crystal clear that with those numbers and specs, no variant of AMD Vega can or should get a good review score; they are that lousy. The score is average, not even close to 'recommended' or 'good buy'.

2.5-3 stars.

AMD has made a lousy GPU that is also very expensive. I wonder a lot what the heck AMD was thinking when it made, let alone released, such an inefficient, dated, and frankly slow (for its power draw per FPS) GPU?!
Yeah, its TDP is like something from 1990!

It's clear its value will drop quickly, and its resale value even quicker... if anyone really buys it.

I wouldn't put that earth-eater in my PC even for free.

Shame on you, AMD. Why offer people this kind of... 'thing'?!

July 23, 2017 | 02:18 AM - Posted by Lieutenant Tofu (not verified)

> Yeah, its TDP is like something from 1990!
Um, no, not even close. TDPs were never that high in 1990 because the CPUs in use then weren't nearly as powerful.

You're complaining about its cost? This isn't the gaming card. Pro cards are expensive. AMD's recent demonstration in Budapest indicates they're going for a lower price point than nVidia (but performance might not be as good).

Seeing as I typically run games on medium-high settings at 1080p (or 720p), a card that performs close to a 1070 but is cheaper would be great for me. I concur the power consumption seems really high, though.

July 23, 2017 | 02:18 AM - Posted by Lieutenant Tofu (not verified)

... that and GPUs didn't exist in 1990. Vector processors, yes. 2D graphics cards? Yes. GPUs? ... absolutely not

July 18, 2017 | 12:00 PM - Posted by Activate: AMD (not verified)

The number of fanboys attempting to give AMD a free pass on Vega's performance is laughable. When Nvidia pulled the same crap with "Titan" and tried justifying charging $1000 for a GPU because it was a "pro" card, those same fanboys saw right through it. Call this chip what it is and stop sugar-coating it: it's AMD's lame attempt at a Titan, charging a massive early-adopter tax and hiding behind the guise of "pro" branding. The fact that it's not branded as an RX card means absolutely nothing. If anything, AMD should be ashamed that they are releasing a $1500 GPU with such lame drivers.

Drivers aren't going to make up for the fact that this card consumes 2x as much power as a 1080, even if they do eventually close some of the gap. The only way RX Vega could perform substantially better than FE Vega would be to add more shaders; otherwise we're talking <10% improvements from drivers in all likelihood. At the end of the day this card is too hot and too slow to compete with cards Nvidia has had out for well over a year.

I'm sick of paying Nvidia's bullshit prices, but if this is AMD's best shot I really have no choice.

July 18, 2017 | 12:50 PM - Posted by Clmentoz (not verified)

Well, that was crap for gamers, but look where Nvidia is seeing the greatest revenue growth. Some folks used that Titan X semi-pro SKU for its extra compute (and saved thousands over Quadro prices), the same way some will use the Vega FE for its extra compute in non-gaming workloads. Nvidia has the extra data center market revenues, as well as gaming market revenues, to afford a gaming-only-focused line of GPUs with the ROP/TMU counts to really fling out the FPS at the cost of any compute that is unnecessary for gaming-only workloads.

If I were AMD, I would not worry about gaming as much and would instead focus on the fully professional, data-center-focused Vega GPU/AI SKUs to go along with the Epyc data-center-focused CPU SKUs. That data center market is where both AMD and Nvidia will see the greatest revenue growth potential for their GPUs and CPUs (AMD only). The PC and gaming business is a relatively mature market without much growth potential; it's just AMD and Nvidia competing for a share of a relatively fixed-size, stagnant market that also suffers wild demand swings. Gaming does not have enough revenue growth to support both AMD's and Nvidia's needs, and they both have to look toward the data center/AI/automotive/semi-custom console markets for any revenue growth.

Gamers are going to have to give up on their illogical sense of entitlement, because the gaming-only GPU market is not enough for both AMD and Nvidia to survive on as businesses.

July 18, 2017 | 06:00 PM - Posted by Photonboy

The DRIVERS are fairly up-to-date, so any optimizations that can be done apply to NEWER game titles. The only thing that could affect performance ACROSS THE BOARD is if there is some specific hardware change that is messing things up with the current state of drivers.

But... this isn't like the RYZEN CPU issues, where code jumping between CCXes or other issues can affect performance. VEGA is a slightly tuned (AFAIK) version of the prior GPU, so I'm not sure what we can expect.

Plus, if there was some "game changing" update to be had with how code is manipulated it should affect non-gaming applications too so it seems unlikely AMD would launch VEGA-FE with some crippling bug in the drivers.

*So, I think sitting just below the GTX1080 on AVERAGE is what we can expect, with the cards remaining HOT and having throttling issues in many instances which may cause more STUTTER on average than the NVidia cards.

With the cost of HBM2 memory, AMD's in a tough spot and frankly will probably resort to cherry-picked benchmarks and "magic" promises like the High Bandwidth Cache Controller, etc.

I was rooting for AMD, but unfortunately they seem to have severe engineering difficulties getting the HEAT LOW and the FREQUENCY HIGH (I'm aware that frequency vs pipeline is a separate issue but the fact remains the chips put out too much heat and thus have cooling issues).

**Raja claimed that the RX-480 GPU was at lower frequencies because the GPU was initially meant for mobile, and that was why scaling up the frequency was problematic. Okay. Fine. So I had HIGH HOPES that they would get the heat/frequency issue sorted out when working with the fabrication engineers. But 350W to get similar results to a GTX1080? Almost 2x the power? What happened?

July 23, 2017 | 02:20 AM - Posted by Lieutenant Tofu (not verified)

Raja Koduri said in a Reddit AMA that his software team "wishes [it] were true" that Vega was a revision of Polaris. He said it's a new GPU architecture and the first able to utilize Infinity Fabric. I don't know all the ways it's different, though. I've always taken more of an interest in CPUs than GPUs.

July 19, 2017 | 01:07 AM - Posted by Rhodopsin (not verified)

Blue LEDs (and bluish light in general) cause rhodopsin-mediated photoreversal, which physically damages your eyes. This effect is wavelength- and spectral-power-distribution dependent, NOT intensity dependent.

July 19, 2017 | 01:43 PM - Posted by Jeremy Hellstrom

I did not know that, interesting info.
