AMD Radeon RX Vega 64 and Vega 56 Specs, Prices, Power Detailed

Manufacturer: AMD

RX Vega is here

Though we are still a couple of weeks from availability and benchmarks, today we finally have the details on the Radeon RX Vega product line. That includes specifications, details on the clock speed changes, pricing, some interesting bundle programs, and how AMD plans to attack NVIDIA through performance experience metrics.

There is a lot going on today and not enough time to cover it all, so I’m going to defer a story on the architectural revelations that AMD made to media this week and instead focus on what I think most of our readers will want to know. Let’s jump in.

Radeon RX Vega Specifications

Though the leaks have been frequent and getting closer to reality, as it turns out AMD was in fact holding back quite a bit of information about the positioning of RX Vega for today. Radeon will launch the Vega 64 and Vega 56 today, with three different versions of the Vega 64 on the docket. Vega 64 uses the full Vega 10 chip with 64 CUs and 4096 stream processors. Vega 56 will come with 56 CUs enabled (get it?) and 3584 stream processors.

Pictures of the various product designs have already made it out to the field including the Limited Edition with the brushed anodized aluminum shroud, the liquid cooled card with a similar industrial design, and the more standard black shroud version that looks very similar to the previous reference cards from AMD.

|  | RX Vega 64 Liquid | RX Vega 64 Air | RX Vega 56 | Vega Frontier Edition | GTX 1080 Ti | GTX 1080 | TITAN X | GTX 980 | R9 Fury X |
|---|---|---|---|---|---|---|---|---|---|
| GPU | Vega 10 | Vega 10 | Vega 10 | Vega 10 | GP102 | GP104 | GM200 | GM204 | Fiji XT |
| GPU Cores | 4096 | 4096 | 3584 | 4096 | 3584 | 2560 | 3072 | 2048 | 4096 |
| Base Clock | 1406 MHz | 1247 MHz | 1156 MHz | 1382 MHz | 1480 MHz | 1607 MHz | 1000 MHz | 1126 MHz | 1050 MHz |
| Boost Clock | 1677 MHz | 1546 MHz | 1471 MHz | 1600 MHz | 1582 MHz | 1733 MHz | 1089 MHz | 1216 MHz | - |
| Texture Units | 256 | 256 | 256 | 256 | 224 | 160 | 192 | 128 | 256 |
| ROP Units | 64 | 64 | ? | 64 | 88 | 64 | 96 | 64 | 64 |
| Memory | 8GB | 8GB | 8GB | 16GB | 11GB | 8GB | 12GB | 4GB | 4GB |
| Memory Clock | 1890 MHz | 1890 MHz | 1600 MHz | 1890 MHz | 11000 MHz | 10000 MHz | 7000 MHz | 7000 MHz | 1000 MHz |
| Memory Interface | 2048-bit HBM2 | 2048-bit HBM2 | 2048-bit HBM2 | 2048-bit HBM2 | 352-bit G5X | 256-bit G5X | 384-bit | 256-bit | 4096-bit (HBM) |
| Memory Bandwidth | 484 GB/s | 484 GB/s | 410 GB/s | 484 GB/s | 484 GB/s | 320 GB/s | 336 GB/s | 224 GB/s | 512 GB/s |
| TDP | 345 watts | 295 watts | 210 watts | 300 watts | 250 watts | 180 watts | 250 watts | 165 watts | 275 watts |
| Peak Compute | 13.7 TFLOPS | 12.6 TFLOPS | 10.5 TFLOPS | 13.1 TFLOPS | 10.6 TFLOPS | 8.2 TFLOPS | 6.14 TFLOPS | 4.61 TFLOPS | 8.60 TFLOPS |
| Transistor Count | 12.5B | 12.5B | 12.5B | 12.5B | 12.0B | 7.2B | 8.0B | 5.2B | 8.9B |
| Process Tech | 14nm | 14nm | 14nm | 14nm | 16nm | 16nm | 28nm | 28nm | 28nm |
| MSRP (current) | $699 | $499 | $399 | $999 | $699 | $599 | $999 | $499 | $649 |
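
The peak compute and memory bandwidth rows in that table can be sanity-checked from the other specs. Here is a minimal sketch of that math (my own arithmetic, not AMD's; it assumes two FP32 operations per stream processor per clock and uses the boost/typical clocks for the Vega parts, while the GeForce figures in the table appear to be computed from base clocks):

```python
def peak_tflops(stream_processors, clock_mhz):
    # 2 FLOPs (one fused multiply-add) per stream processor per clock
    return stream_processors * 2 * clock_mhz * 1e6 / 1e12

def bandwidth_gb_s(bus_width_bits, data_rate_mhz):
    # bus width in bits x effective data rate, divided by 8 bits per byte
    return bus_width_bits * data_rate_mhz * 1e6 / 8 / 1e9

print(peak_tflops(4096, 1677))      # RX Vega 64 Liquid -> ~13.7 TFLOPS
print(peak_tflops(3584, 1471))      # RX Vega 56        -> ~10.5 TFLOPS
print(bandwidth_gb_s(2048, 1890))   # 1.89 GHz HBM2     -> ~484 GB/s
print(bandwidth_gb_s(2048, 1600))   # 1.6 GHz HBM2      -> ~410 GB/s
```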

If you are a frequent reader of PC Perspective, you have already seen our reviews of the Vega Frontier Edition air-cooled and liquid-cooled cards, so some of this is going to look very familiar. Looking at the Vega 64 first, we need to address the biggest change to the performance ratings between the RX and FE versions of the Vega architecture. When we listed the “boost clock” of the Vega FE cards, and really any Radeon cards previous to RX Vega, we were referring to the maximum clock speed of the card in its out-of-box state. This was counter to the method that NVIDIA used for its “boost clock” rating, which pointed towards a “typical” clock speed that the card would run at in a gaming workload. Essentially, the NVIDIA method was giving consumers a more realistic look at how fast the card would be running, while AMD was marketing the theoretical peak with perfect thermals and perfect workloads. This, to be clear, never happened in practice.

With the RX Vega cards and their specifications, the “boost clock” is now a typical clock rate. AMD has told me that this is what they estimate the average clock speed of the card will be during a typical gaming workload with a typical thermal and system design. This is great news! It means that gamers will have a more realistic indication of performance, both theoretical and expected, and the listings on retailer and partner sites will be accurate. It also means that just looking at the spec table above will give you the impression that the performance gap between Vega FE and RX Vega is smaller than it will be in testing. (This is, of course, assuming AMD’s claims are true; I haven’t tested it myself yet.)

The air-cooled Vega 64, in both Limited Edition and standard versions, will have a base clock of 1247 MHz and a boost clock of 1546 MHz. The base clock is more than 100 MHz lower than the Vega FE air-cooled card, which is perhaps troubling, but the boost clock looks like it is 54 MHz lower than the Frontier Edition. However, if AMD’s move to a typical/average clock rating for RX Vega holds true, that is HIGHER than the 1440 MHz average clock rate that we observed with the Vega FE card last month. That ~100 MHz could give the RX Vega a performance advantage of 7-8% over the Vega Frontier Edition.

The liquid-cooled RX Vega, which will not be a limited edition according to AMD, has a base clock of 1406 MHz (much higher than the air-cooled card, as expected) and a boost clock of 1677 MHz. If that lives up to the claims, it is significant, as it would put its typical clock 16-17% higher than what we observed in our first air-cooled Vega FE testing. That would give it enough of a performance boost to push up past the GeForce GTX 1080 in nearly all of the comparisons we performed.
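
As a rough way to quantify those two estimates, here is a back-of-the-envelope sketch (my own math, assuming performance scales roughly linearly with average clock speed and using the ~1440 MHz average we measured on the air-cooled Vega FE):

```python
VEGA_FE_AVG_CLOCK_MHZ = 1440  # average clock we observed on the air-cooled Vega FE

def clock_uplift_pct(typical_clock_mhz, baseline_mhz=VEGA_FE_AVG_CLOCK_MHZ):
    # assumes performance scales roughly linearly with average clock speed
    return (typical_clock_mhz / baseline_mhz - 1) * 100

print(f"RX Vega 64 air:    +{clock_uplift_pct(1546):.1f}%")   # ~ +7.4%
print(f"RX Vega 64 liquid: +{clock_uplift_pct(1677):.1f}%")   # ~ +16.5%
```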

Memory speed and bandwidth for the HBM2 implementation are unchanged from the Frontier Edition at 1.89 GHz and 484 GB/s of total bandwidth. The RX Vega product family will have 8GB of memory, half that of the Vega Frontier Edition. NVIDIA’s GeForce GTX 1080 has 8GB of memory, though the 1080 Ti has 11GB. Clearly, the high cost of HBM2 memory factored into this decision, and though AMD will be touting the benefits of the HBCC and its ability to address other memory more easily, that feature will need work with developers to be properly implemented and to impact gaming performance.

The power draw of the Vega FE cards was telling for the results we see on RX Vega. The Vega 64 liquid cooled will hit 345 watts, the air-cooled version hits 295 watts, and the Vega 56 comes in at a cool 210 watts. These are power-hungry cards, and the clock speeds necessary to take on the GTX 1080 have pushed the Vega design outside of its most efficient operating range.

The Vega 56 might be the most interesting product on this slide though – it was a surprise release and we don’t have a comparable professional card to lean on for expectations. With a 12.5% drop in shader count and lower clock speeds, this product will sit at a noticeably lower performance level than the Vega 64. The HBM2 memory is running at 1.6 GHz rather than 1.89 GHz, bringing memory bandwidth down to 410 GB/s. With a price of $399, it will be taking on the incredibly popular GeForce GTX 1070.

Though I don’t have hardware yet, the clock speeds and power rating of the RX Vega 56 indicate that it will be a massive overclocker. You should not be surprised if gamers can push the RX Vega 56 up to the same clock speeds as the RX Vega 64, or higher, getting a substantial performance boost. Obviously, that will increase power consumption up to near Vega 64 levels, but if the cooler can keep up, we should have no issues.

Pricing is going to be an interesting discussion. The base price of the RX Vega 64 is $499 and the Vega 56 is $399. That positions the Vega 64 against the GTX 1080 (in a world where prices are back to normal in the GPU market) and the Vega 56 against the GTX 1070. The Limited Edition Vega 64 will be $599 and the liquid-cooled Vega 64 will be $699, both sold as part of Radeon Packs. What are those, you ask?

Radeon Packs – More Gaming, Less Mining

One of the big things that AMD has been trying to address, much at the behest of the gaming community, is how to get graphics cards into the hands of gamers rather than the miners who drive up prices. Though a sale is a sale to both AMD and NVIDIA, there is a danger to the long-term vitality of the PC gaming market if these trends continue. I wrote about these risks on MarketWatch back in June, and those risks remain real. AMD would also like to get users more invested in the other hardware ecosystems that promote the Radeon brand, including FreeSync displays and even the Ryzen processor and motherboard platform. AMD built the Radeon Packs idea as a way to hit both angles.

The simplest way to understand the packs is that gamers that buy the RX Vega 64 in a Radeon Pack will receive two free games (that differ by region but are Wolfenstein II and Prey in the US), a $200 discount on a 3440x1440 100 Hz FreeSync display and a $100 discount on a Ryzen CPU + motherboard bundle. The purchase of the display and the CPU+MB are optional, and are not required to complete the sale, but the discounts are only offered at the time of purchase. The games will come with the card purchase, regardless.

The RX Vega 64 will have two packs available: Black and Aqua. The Aqua Pack is for the liquid-cooled version of the RX Vega 64 and prices the card itself at $699. The Black Pack is for the RX Vega 64 air-cooled card and comes with a $599 price tag and covers both the Limited Edition card and the standard black shroud card. Finally, the Red Pack is for the RX Vega 56 air-cooled card and is $499.

You’re probably wondering about the apparent price increase on the RX Vega 56 and the RX Vega 64 standard card; in the packs they are both priced $100 higher than the stand-alone units. First, this is the only way to get the Limited Edition RX Vega 64 card, and it will not be sold at $499 individually. Once the limited-edition cards are sold out, the standard card will take its place in the Black Pack. The liquid-cooled card is also only going to be available at $699 and as part of the Radeon Aqua Pack.

For that added $100 you are getting two free games and $300 in potential discounts on other hardware. If you already have a platform you are happy with, or a monitor you like, and don’t want to upgrade them, you do not have to take advantage of those discounts. Instead, you will be paying $100 more for the fancier card and the two bundled PC titles.
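
To make that math concrete, here is a small sketch of the pack economics using the prices above (the value you assign to the two bundled games is your own call; the figure in the example is just a placeholder):

```python
PACK_PREMIUM = 100       # every Radeon Pack is $100 over the stand-alone card price
MONITOR_DISCOUNT = 200   # optional FreeSync display discount, at time of purchase only
CPU_MB_DISCOUNT = 100    # optional Ryzen CPU + motherboard discount, at time of purchase only

def pack_net_premium(take_monitor=False, take_cpu_mb=False, games_value=0):
    """Net extra cost of buying via a Radeon Pack versus the stand-alone card."""
    savings = (MONITOR_DISCOUNT if take_monitor else 0) \
            + (CPU_MB_DISCOUNT if take_cpu_mb else 0) \
            + games_value
    return PACK_PREMIUM - savings

print(pack_net_premium())                                      # 100: games only, valued at $0
print(pack_net_premium(games_value=80))                        # 20: if the two games are worth ~$80 to you
print(pack_net_premium(take_monitor=True, take_cpu_mb=True))   # -200: both hardware discounts used
```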

AMD’s intent is two-fold. First, they want to get more users to upgrade to FreeSync displays (and Ryzen systems, of course) as it helps them in relative comparisons to NVIDIA hardware today and incentivizes gamers to stay in the Radeon ecosystem. AMD also hopes that this helps to deter miners from buying these cards because of their higher price and bundle complications. I personally don’t feel that miners will be deterred by any price change that wouldn’t also scare away all PC gamers and leave AMD in a nearly impossible spot. The only thing that COULD have hindered the purchase of cards by cryptocurrency miners is if AMD had required the packs to be purchased along with the discounted hardware, but that would be unfair to the gamers that might have already invested in the AMD ecosystem.

Another side benefit that might be occurring as well with this bundle system is to help resellers like Newegg refocus on the gaming customer. Selling a graphics card is typically a low margin sale and resellers care very little if the card ends up in the hands of a dedicated PC gamer itching to play Prey or a miner putting it on a shelf to earn Ethereum 24 hours a day. But if you can tie other, higher cost products to the sale of the RX Vega, that raises ASPs (average selling prices) and could give Newegg/Amazon/etc. a reason to target and hold product back for these types of sales.

As for on-sale and review dates, all AMD has told us thus far is “in August.” It’s possible that more details will emerge on the preorder and sale dates from the launch event this evening (starting as this story goes live), so we’ll update if so.

Update: AMD has told us that RX Vega will be on sale, and reviews will go live, on August 14th!

Experience Based Performance Testing

For our RX Vega pre-briefing, AMD took an interesting turn in talking about performance metrics and comparisons to the competition. Rather than showing average frame rates in comparison to the GTX 1080, or even the 99th percentile frame time data we have come to expect from the new faces helping to run the product and marketing teams, we are instead getting performance comparisons based on FreeSync ranges and 99th percentile minimum frame rates.

The idea is this: a gamer who buys an RX Vega should be concerned with getting a smooth gaming experience. AMD showed data from two high-end resolutions, 3440x1440 and 4K, both with variable refresh windows based on the associated FreeSync display.

Based on this slide, AMD is telling us that the GeForce GTX 1080 has 99th percentile minimum frame rates in the range of 45 to 78 FPS across a range of titles including Ashes, BF1, Hitman, and more. At the same settings, the Radeon RX Vega 64 has 99th percentile minimum frame rates in the range of 53 to 76 FPS. Based on AMD’s testing, the GTX 1080 dips lower in some of these games than the RX Vega 64. If you are using a display with a 48-100 Hz variable refresh rate, going below 48 FPS means you are outside the VRR range (which, to be clear, is only true on FreeSync monitors, not G-Sync) and that equates to a “poor” gaming experience.
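
AMD’s “good experience” criterion boils down to a simple check: does a game’s 99th percentile minimum frame rate stay at or above the bottom of the display’s variable refresh window? A minimal sketch of that test, using made-up per-title numbers purely for illustration (AMD only published ranges, not per-game figures):

```python
VRR_WINDOW_HZ = (48, 100)  # the 3440x1440 FreeSync display AMD references

# Hypothetical per-title 99th percentile minimum frame rates (FPS) for one card.
# Illustrative numbers only; AMD published ranges, not per-title data.
results = {
    "Battlefield 1": 62,
    "Hitman": 55,
    "Ashes of the Singularity": 45,
}

low, high = VRR_WINDOW_HZ
for game, fps in results.items():
    verdict = "good experience" if fps >= low else "below VRR range (poor experience)"
    print(f"{game}: {fps} FPS 99th percentile minimum -> {verdict}")
```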

As you look at the numbers for that, AMD is pointing out that the GTX 1080 has a good experience in five of the six games, while the RX Vega 64 has a good experience in all six.

Looking at the same testing process for 4K, with a FreeSync display range of 40-60 Hz, we can again see the ranges that AMD measures.

And again, the Vega 64 performs at higher than 40 FPS in all games while the GTX 1080 does so in only four of six.

So... clearly this is not a standard or established performance measurement practice. But it is interesting. AMD is basically telling the community that as long as you meet minimum performance metrics across the range of games you care about, and you have a display capable of making that work, why do you need to know about the average frame rates? There is some merit to that argument, but it ignores some critical pieces of the reason we benchmark.

First, benchmarks are used to measure performance in today’s games, but they are also used to help predict relative performance for future games, higher resolutions, or more intense quality settings. When two cards can both cross 60 FPS for a “good” experience on a particular display, the amount OVER that mark is a good indicator of your ability to move to a different monitor, go to a higher resolution, increase anti-aliasing, etc. If you could pick between two cards at $499, one that runs BF1 at 65 FPS at 2560x1440 and another that runs at 75 FPS, both with comparable frame time consistency, the faster option would have advantages for a gamer’s future upgrade path.

Second, AMD’s testing ignores the fact that NVIDIA G-Sync users don’t have to worry about whether or not their display integrates LFC (low frame rate compensation). All G-Sync displays have an effective minimum frame rate of 0 FPS (meaning they don’t regress to tearing or stutter below some panel threshold). So the GTX 1080, even when it runs below the 40 FPS mark in the scenario AMD shows above, will not stutter, meaning those users still have a “good” gaming experience.
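
For readers unfamiliar with how low frame rate compensation works, here is a simplified sketch of the frame-multiplication idea (a conceptual illustration only, not NVIDIA’s or AMD’s actual algorithm):

```python
# When the game frame rate drops below the panel's minimum refresh, repeat each
# frame enough times that the effective refresh rate lands back in the panel's range.
PANEL_MIN_HZ = 40

def effective_refresh(game_fps):
    if game_fps >= PANEL_MIN_HZ:
        return game_fps, 1                 # refresh simply tracks the game
    multiplier = 2
    while game_fps * multiplier < PANEL_MIN_HZ:
        multiplier += 1
    return game_fps * multiplier, multiplier

print(effective_refresh(35))   # (70, 2): 35 FPS shown twice per frame at 70 Hz
print(effective_refresh(15))   # (45, 3): 15 FPS shown three times per frame at 45 Hz
```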

Closing Thoughts

AMD’s decision to promote performance in this new and slightly confusing manner is interesting, and it makes us more eager than ever to really put the RX Vega 64 through its paces. We have a unique position here at PC Perspective because we have already seen much of the potential of the Vega GPU with the Frontier Edition reviews. As is always the case with launch information, we wait for our own testing to confirm or counter the claims made by every hardware vendor – this is no different.

The pricing of the Radeon RX Vega 64 and Vega 56 cards, setting aside the complexity of the Radeon Packs, is exactly where I stated they would need to be to remain competitive with the NVIDIA GeForce cards already in the market. At $499, and with higher clock speeds than the Frontier Edition, the air-cooled RX Vega 64 should be a strong performance competitor to the GTX 1080. Yes, it will be drawing more power; there is no magic here today to change that. The RX Vega 56 could be the better part in relation to the GeForce GTX 1070, with the potential for impressive overclocking and a $399 price point.

If you want to get one of the $599 limited edition RX Vega 64 cards, be on the lookout immediately. I was told they will only last until early Q4 and were intended to hold the market until partners like ASUS can get their custom-cooled designs on virtual and physical shelves. The liquid-cooled cards at $699 are priced much higher than the stock designs, and once the limited edition air-cooled cards are gone, there will be a $200 gap between the “reference” air-cooled cards and the liquid-cooled card. I’m curious whether the performance difference will justify that price gap long term.

That’s all we can dive into today. Until we get cards in our hands for real-world testing and performance evaluation, this is as complete a picture of the AMD Radeon RX Vega 64 and Vega 56 as we can give you.


July 30, 2017 | 10:36 PM - Posted by Vector3 (not verified)

OK... so... package deals are the only way to sell Vega sanely.

July 30, 2017 | 11:01 PM - Posted by Joseph Taylor (not verified)

Basically. When you are selling an inferior product for the same price as your competitor's cards from May 2016, you have to get stupid and start talking about qualitative junk like you can't tell the difference between the two.

July 30, 2017 | 11:48 PM - Posted by Kundica (not verified)

Actually, Raja was talking about providing "user experience" in qualitative ways over a year ago here on PCPer's stream.

July 31, 2017 | 12:02 AM - Posted by Clmentoz (not verified)

Inferior How?

Let's talk about Nvidia's consumer SKUs and how inferior they are for compute workloads, and a coin mining sale is just as revenue-producing for AMD as a gaming GPU sale. So AMD's management answers to AMD's stockholders, who will be happy with any type of revenue/revenue growth from any Ryzen/Polaris/Vega/Epyc/WX Vega/Radeon Instinct sales for whatever workloads.

Nvidia's CEO sure is talking about, and getting, mad revenue growth from automotive revenues (non-gaming revenue increases) and professional GPU/AI revenue increases. Nvidia's best revenue GROWTH figures quarter to quarter and year on year are coming from non-gaming sources, with Nvidia's massive gaming GPU revenues not showing much growth over the same periods. AMD needs to focus, like JHH, on those professional markets, and new markets (automotive/other markets) with the mad revenue growth potential and the best revenue growth figures.

July 31, 2017 | 09:07 AM - Posted by Just an Nvidia User (not verified)

Yes, because a gaming card does not need compute out the wazoo. Pro cards, however, do. Video cards should do video card things like tessellation. A 1070, when properly configured, is the best mining card in performance per watt. Maybe not the cheapest, but with Ether prices falling and mining getting harder with more miners in the market, efficiency is where it's at right now.

AMD lacks the resources to segregate their cards like Nvidia can. If AMD didn't have console dominance and hadn't made compute a big push in DX12, their cards would be further behind. It was a good move with their buddy Microsoft to push this garbage out. AMD's survival depended on it. Nvidia did not see the big picture when they were forced out of the console market.

It's called diversity. Nvidia took a Tegra chip that cost a lot of R&D and spun it into a fully open new market they created themselves (Drive PX). Tegra is a great chip and was unfairly forced out of the phone/tablet market by dominant "competitors". Nvidia is ahead of the game and AMD is likely too far behind to do much about that.

The server market, however, is big enough for AMD to take a slice of.

July 31, 2017 | 10:52 AM - Posted by NotMartinTrautsomething (not verified)

AMD will make more off of its CPU operation than its GPU operation, and maybe by Navi AMD can engineer specialized modular GPU chiplets/dies that are tuned towards only gaming workloads. So why were you not there with the commercial bank lending power to lend AMD the half billion dollars it would need to have such Nvidia-like options for specialized gaming and compute lines of GPU tape-outs?

Microsoft is the one funding the design of its semi-custom APU designs from AMD, so AMD has to budget from what funding its customer/s are willing to provide for their custom designs.

AMD's Epyc server CPU SKUs need a GPU compute/AI dance partner, especially for the HPC market, where the Zen micro-arch lacks large AVX unit power relative to Intel. AMD is smart in designing the Zen micro-arch for the majority of the server market's non-AVX needs while having that GPU IP to pair with Zen/Epyc CPU SKUs for any heavy FP lifting done on Vega FP accelerators for the HPC market that needs the FP power. Vega complements Epyc in the HPC systems room, and even in the standard server/cloud server room where AI/inferencing workloads need to be run.

Nvidia's Drive PX is part of its automotive market, and that market is growing in revenues faster than revenues are increasing in any of Nvidia's consumer gaming markets. So yes, AMD needs to focus on the server CPU/pro GPU markets and the medical markets, slot machine markets, embedded markets, and automotive markets also. Consumer gaming revenues are rather fixed, with Nvidia and AMD both competing for that relatively stagnant, low-growth GPU/gaming slice of the same revenue pie.

July 31, 2017 | 02:48 PM - Posted by Aparsh335i (not verified)

"a coin mining sale is just as revenue producing for AMD as a gaming GPU sale" Actually you are incorrect. A coin mining sale is a card that runs 24/7 and is likely to break in less than a year and require warranty work. Warranties cost money. They have jacked the prices up here because of mining and warranty estimations based on the mining. If another crash comes then the market if flooded with cards on the cheap and AMD then has to dump their price, has happened a few times in the past 10 years.

July 31, 2017 | 05:19 PM - Posted by PawnageNot (not verified)

Coin mining cards are undervolted and underclocked to save on power usage; if a card used for mining goes poof after one year of mining then there really was something wrong with that defective GPU SKU. And AMD's/the AIB makers' engineering/warranty actuaries have card duty rating/lifetime statistics worked down to a science for the GPU ASICs. So the cost of a few replacements is figured into the total cost of production. Ever wonder why overclocking instantly voids any processor's warranty? Now it's mostly up to the AIBs to warranty their cards, so that's not going to affect AMD unless there is a large ASIC defect rate due to a proven engineering defect on an AMD ASIC part.

All GPU/processor SKUs have samples randomly taken off of the production lines that are tested 24/7 by AMD, and from those testing results the data is used to figure out MTBF statistics and estimate any additional warranty replacement cost figures to add to the wholesale pricing of any processor ASIC SKU.

Processor life cycle testing and warranty period estimation is an engineering sub-field unto itself, and every processor maker and AIB maker has warranty engineering/actuarial teams to figure that extra cost into the prices of each processor SKU, making sure all the products are priced in such a way as to cover any replacement costs of defective parts.

And undervolting and underclocking using the AMD-provided WattMan tool (inside of the warranty-covered range of voltages and clock speeds) actually extends the life of most GPUs used for coin mining. It's the heat/repeated switching that degrades a transistor over time; underclocking reduces the switching and also reduces the heat generated, and undervolting really cuts down on the leakage metrics, and less leakage means less heat as well.

July 31, 2017 | 08:18 PM - Posted by AndreSpumante (not verified)

The reason they are doing this is that all of their cards got bought by miners, which shot the price of AMD cards upwards of $200 more than their normal price. I believe this has happened twice due to cryptocurrency miners. AMD GPUs are the best for mining.

They want to make sure gamers can buy the cards, so they are selling these specific ones for more money but with the ability to use discounts toward other hardware like a mobo and monitor. If you're a gamer this is supposed to appeal to you, I guess, and therefore create some kind of way for gamers to actually get a fair cost on the card before crypto-miners buy them all and the price goes up.

The reason AMD cards are expensive is not because AMD raised the price; it's because crypto-miners have bought so many that suppliers had to raise the price in order not to sell out (supply and demand), and so they could obviously make money as well.

August 2, 2017 | 12:28 AM - Posted by comcrap (not verified)

T flops bro T flops !

July 30, 2017 | 10:48 PM - Posted by mLocke

What is up with the text placement on these slides? AMD forgot how to use align tool?

July 30, 2017 | 10:57 PM - Posted by Joseph Taylor (not verified)

It is abundantly clear AMD can't compete with Nvidia anymore. I didn't want to believe it until now but it's hard to make lemonade out of this lemon.

July 31, 2017 | 12:12 AM - Posted by Destroy all trolls (not verified)

Joseph Taylor:

https://en.wikipedia.org/wiki/Charles_Taylor_(Liberian_politician)#Verdict

July 31, 2017 | 04:11 PM - Posted by Hishnash (not verified)

Not at all in the pro line; the SSG and WX are both going to outsell Nvidia's offerings very well. Most pro workloads are limited by GPU I/O these days. While it is possible to write a GPU kernel that streams in data as you need it, this is very hard if you:

a) have user interaction, so the data you will need soon is not predictable

b) have lots of random access over a large data set. This covers most data modeling/simulation workloads, where the data against which you are testing is basically always larger than any VRAM, so just to evaluate your model you need to constantly read that data back in to the GPU.

In the work I have done writing CUDA kernels, the most complicated part has been this memory management (pulling in data), and it has also been the part that has had the most effect on performance.

Both the ability to map the entire system memory and drives to the GPU, so that you just manage one memory pointer and the GPU pulls in data when required, and the addition in the SSG of the 2TB of on-board NVMe will be game changing for the entire industry.

July 30, 2017 | 11:04 PM - Posted by #650cores (not verified)

The bundle program is a massive joke. PCPer, please run some tests with the HBC; really looking forward to seeing how the architecture will address 512TB of system memory.

July 30, 2017 | 11:47 PM - Posted by Clmentoz (not verified)

Vega's HBCC/HBC and its ability to utilize regular system DRAM and any virtual/paged memory on SSD/hard drive, using that for texture data/mesh data/other data with that 512TB of GPU memory addressing ability, needs some very deep-dive reviewing.

This has more implications for discrete mobile Vega micro-arch based SKUs that may have only 4GB or less of HBM2 memory. So Vega's HBCC/HBC using any available HBM2 as a last-level cache is really the IP that makes Vega stand out relative to AMD's previous GPU micro-arch designs. Any Vega discrete mobile SKU will be able to support texture/mesh/other data sizes that are larger than the GPU's available HBM2 size, with the HBCC/HBC subsystems in Vega treating the HBM2 like a last-level GPU cache and, in the background, swapping the excess texture/mesh/other data in and out of the HBM2 cache so the GPU can work mostly from HBM2 at no loss of effective bandwidth efficiency.

July 31, 2017 | 02:20 AM - Posted by MoInfoTricklesOut (not verified)

TechPowerUp (1) is filling in some missing info, but these slides confirm that AMD is in fact using the HBM2 as a last-level cache: (a) HBM2 as last-level cache, and (b) how the HBCC manages the HBC pages/paging.

(a) slide: HBM2 use as a last level cache.

https://www.techpowerup.com/reviews/AMD/Vega_Microarchitecture_Technical...

(b) Slide: Page-based memory with and without an HBCC

https://www.techpowerup.com/reviews/AMD/Vega_Microarchitecture_Technical...

Note: all slides are part of article (1); more info will be added as TechPowerUp gets it, so this is by no means the complete info, but it's more complete than before RX Vega's release.

(1)

AMD Vega Microarchitecture Technical Overview

https://www.techpowerup.com/reviews/AMD/Vega_Microarchitecture_Technical...

July 31, 2017 | 09:13 AM - Posted by Just an Nvidia User (not verified)

For those that are concerned about the HBC: does it phone home to AMD servers to give you that massive amount of 512TB, or is that the maximum it can use if you have that absurd amount available in your system? If it does contact AMD, what does it share about your system with them?

July 31, 2017 | 11:03 AM - Posted by NotMartinTrautsomething (not verified)

The FUD is strong with that very daft remark, and the G-force is heavy on the dark side of the snooping force with the Green Team's middleware supplied with GPU SKUs and forced on its users. Be sure to be logged in to Nvidia's cloud for all your driver update needs; it's BOHICA every time, directly from JHH to you!

The HBC is in fact simply the HBM2 on Vega, and the HBM2 is treated like a last-level cache by Vega's HBCC to more efficiently manage video memory pools larger than the HBM2's capacity, with any larger texture/mesh/other data pools efficiently swapped between the HBM2 last-level cache and system RAM or SSD/hard drive virtual memory swap space.

July 31, 2017 | 02:29 PM - Posted by Just an Nvidia User (not verified)

Nope, I don't use GeForce Experience or Win10. So that inflated number is basically BS and more of a misrepresentation than RAMgate on the 970s. Marketing FUD from AMD at its finest.

July 31, 2017 | 05:41 PM - Posted by PawnageNot (not verified)

Could you take that fanboy stuff over to ESPN? GPUs are a high-technology scientific undertaking, and if you need to be associated with an outright winner in order to feed your fragile ego, do it over at ESPN!

Nvidia is not a football team and neither is AMD! Your butthurt for your imaginary sports franchise in the form of Nvidia is rather tiring. Vega should be right there with the GTX 1080 in performance metrics and better in the compute metrics, so go and hug a GTX 1080 Ti; it's got better FPS performance than the current Vega SKUs. But keep your eyes out for any Vega improvements and any Vega refreshes, because AMD's GPU SKUs do get a little more performance improvement over time.

I suggest that you get a big gold chain and hang the GTX 1080 Ti off of that chain for some bling value, as you appear to want some form of bragging rights over any gameplay functionality that any Nvidia SKU may offer. AMD has its flagship offering, so there can be no more complaints from the AMD fanboy side with the same fragile ego issues as you.

Go with those AMD fanboys and start a flag football league with a Red Team and a Green Team and get your winning-fix needs met there on a weekly basis.

July 31, 2017 | 06:39 PM - Posted by Just an Nvidia User (not verified)

You must be another one of those butthurt compute elitists that make their rounds online. Don't like what you read? Avert your eyes. Have fun with your heater; hope you live somewhere cold. If not, you need to chill out.

August 1, 2017 | 07:28 PM - Posted by MadTheMaxx (not verified)

Hey, if the money's better in the non-gaming GPU markets then the gamers will just have to suck it up. No normal person cares about having only the top frame rates. They only care about frame rates lower than 30-45, or any frame rate variability that is noticeable in the game's play. If the game is playable and the gaming experience stays smooth and responsive, then that's good enough. Everything else is overkill.

All you gaming-only basement dwellers, if you want AMD to compete with Nvidia in the gaming-only focused GPU market then you better get some AMD stock and invest in AMD's future, even if you are only buying Nvidia GPUs. Because if Nvidia has no competition in the gaming market like you all claim, the prices of your Nvidia gaming-only focused GPU SKUs will most certainly be going up. That is what will happen without AMD in that game, even partway, with AMD's all-around gaming/compute GPU offerings that game pretty well.

All gaming technology is elitist, because you gaming fanboys (win/loss obsessed) are not college material, and gaming uses more than its share of PhDs from across the maths and sciences. Take yourself on over to ESPN; there you will get your definite win/lose fix many times every week.

It takes revenues for Nvidia to afford to create those specialized gaming-only focused GPU SKUs, revenues that fund the R&D, and revenues that AMD currently does not have! But the Epyc/Radeon Pro WX/Radeon Instinct sales may solve AMD's revenue problems before too long. The coin miners' color of money is the same as any gamer's color of money to AMD's investors, and the retailers sure are upping their margins on some RX 400/500 series sales for all that mad hashing going on.

Vega is here with tons of compute and some GTX 1080 competition on the high-FPS metrics, and maybe even on the average-lows FPS variability metric for some smoother gameplay! Let the benchmarking begin.

July 30, 2017 | 11:05 PM - Posted by Joseph Taylor (not verified)

Also I'm waiting for the tech media to start asking AMD some hard questions about what went so wrong with this product instead of dancing around the issue. There is a chasm between AMD and Nvidia in terms of performance per watt and performance per mm2 now where there was only a relatively small delta with Polaris.

July 30, 2017 | 11:30 PM - Posted by Clmentoz (not verified)

Nothing went wrong with Vega, as AMD cannot afford to prioritise gaming-only usage metrics on its Vega micro-arch designs. AMD currently lacks the revenues, and the R&D funding that those revenues provide, compared to Nvidia's billions of dollars more in revenues.

You will have to wait for AMD's full professional Epyc revenues to start rolling in to give AMD the necessary funding to fully fund RTG's need for a gaming-only (stripped of compute) focused base design, to placate (some) gamers' need for that FPS metric at all costs.

What went wrong with Vega was that the revenues were not there yet to provide RTG with the necessary funding to tape out many different Vega micro-arch variants focused on a single gaming market usage. AMD's Epyc workstation/server/HPC CPU SKUs need a Vega compute/AI dancing partner in that high-margin, high-revenue-producing professional market.

SIGGRAPH is about the professional markets first and foremost, and RTG is targeting those markets in a doubled-down fashion, with gaming having to wait until the revenues are there for any GPU SKUs tailored to the gaming-only market.

August 1, 2017 | 06:49 AM - Posted by psuedonymous

"AMD can not afford to prioritise gaming only usage metrics on its Vega Micro-Arch designs."

Vega IS the gaming-centric design! They chopped out the FP64 units so it's not much good for sim workloads, and it lacks ECC support so the pro market will be wary of using it for production.

August 1, 2017 | 08:07 PM - Posted by MadTheMaxx (not verified)

Yes, but that just affects the DP FP acceleration workloads. All that other virtualization and platform security, and even the HBCC HBC/HBM2 IP, is ready-made for working on large in-memory data sets that are localized in system RAM or HBM2 cache and/or paged memory on the GPU's PCIe-card-based SSD/s, for the server/HPC/workstation markets' workloads! And some of that Vega IP plays well across both gaming and the professional markets. The Vega Radeon Pro WX SKUs just demoed at SIGGRAPH have all that ECC turned on. And sure, AMD changes up the DP FP to SP FP unit ratios on its cards; so does Nvidia. And still, with those DP FP ratios reduced, AMD's total SP counts are higher relative to Nvidia's, so don't try that with me.

P.S. The JEDEC standard for all HBM2 includes ECC support, but it is up to the device ODM/OEM to include support for ECC in the firmware/PCIe card's hardware.

There are other Vega micro-arch based "WX" SKUs scheduled to be released in the professional markets, and then Navi will take things all modular and scalable for GPUs. So AMD will be able to scale up its GPUs like it scales up its Zen/Zeppelin dies across its consumer Ryzen/Threadripper and its professional Epyc SKUs, using that very same modular Zeppelin die design.

Look at all these Vega variants consumer and pro:

"AMD Vega 10, Vega 11, Vega 12 and Vega 20 confirmed by EEC"

https://videocardz.com/71280/amd-vega-10-vega-11-vega-12-and-vega-20-con...

July 30, 2017 | 11:16 PM - Posted by Clmentoz (not verified)

Still the big ? on the ROP counts for the RX Vega 56 SKUs? ROPs and FPS are directly related, and the RX Vega 56's ROP counts need to be filled in before the night is over.

I'd also like to see the Shader:TMU:ROP figures on a per-Next-Compute-Unit basis and whether that is limited by any factors for any Vega variants.

AMD needs to get the whitepapers on Vega's HBCC/HBC, as they relate to HBM2, system RAM, and paged VM to SSD/hard drive, published. Ditto for any whitepapers on the Vega primitive shader programmability for games/graphics/compute usage on the new Vega micro-arch.

After RX Vega's and Vega FE's release, please do not forget to cover any SIGGRAPH Radeon Pro WX (Vega-based) or Radeon Instinct (Vega-based) news. Also, will there be any AMD product roadmap update news coming out of SIGGRAPH 2017, including any juicy new AMD Zen/Ryzen/Vega APU news?

July 31, 2017 | 12:13 AM - Posted by Voldenuit (not verified)

I like the idea of bundle discounts on freesync monitors and Ryzen CPUs, but frankly, claiming that the R9 Fury X has higher minimum frame rates than a 1080 at 4K is... dubious at best, and makes me wonder if they are fudging the rest of their results.

July 31, 2017 | 03:40 AM - Posted by WSJ-SJW (not verified)

AMD did the same when they showed that the Fury X was faster than the 980ti. Benchmarks cherry picked to the max.

July 31, 2017 | 02:28 AM - Posted by Hakuren

Technically Vega (whatever) is just a compute card which can run games at an acceptable level.

However, while compute is important outside gaming, for a gaming card it's a total waste of time. It's bonkers that a card produced on a smaller process draws about 40% more power vs. the card which is designed (performance-wise) as its main rival (1080 FE) - ignoring the small aspect of being more expensive too. Unless you are a total fanboy and have a nuclear power plant at your disposal, the vast difference in power cost is totally unacceptable.

Living in a place where electricity is quite expensive, a 1080 running at full tilt 24/7 will cost me about the equivalent of ~35 USD/month. Vega will nearly double that. That's ludicrous.

Once again AMD proved that they know a thing or two about how to make good CPUs, but they can't compete with nVidia.

July 31, 2017 | 03:05 AM - Posted by NoFUDzOrDudsPlease (not verified)

Show us your math, and rates by the kilowatt-hour, and remember that the TDP number is only there for the cooling solution's maximum needed ability to dissipate heat; the actual power usage will be lower for most workloads.

So until the games are run on actual RX Vega SKUs and that power usage metric is actually metered, your figures do not add up. There is so much new IP in Vega that any benchmarking for power usage and gaming/other workload metrics will have to be redone as the games/other software begin to make use of Vega's latest IP. That HBCC/HBC (actually the HBM2 is the HBC last-level cache) will save power, and users will have software control to enable the HBCC/HBC features on Vega depending on the workload or game. So look for gaming profiles and more games tuned for that HBCC/HBC HBM2 cache to yield better results for power usage/gaming performance.

Vega and that FineWine(TM) process, done up the same as before, and more of the games engineered for DX12/Vulkan - the process continues. Let the benchmarking begin, for both gaming and compute workloads on Vega.

July 31, 2017 | 09:41 AM - Posted by Just an Nvidia User (not verified)

It is approximately $0.25 per kilowatt-hour where he is, but the key is full tilt 24/7 max output.

Even if you only use $10 worth of gaming electricity a month, the Vega will cost you $15-20. Over a 2-year span it's going to cost $120-$240 more. This is assuming electric rates stay the same. If you game more or keep your card 3 years or more, you can practically buy a new card with the savings.

Efficiency should be included in the total cost to own, but most people are only fixated on MSRP.

You would have to be a total fanboy to buy Vega. Similar performance at best, and it needs a FreeSync monitor to be fluid. Also, the test system had to have a Ryzen CPU, penalizing the Nvidia 1080. They could have thrown a cheaper Intel into the Nvidia system and it would have had better results for Nvidia and worse for AMD. Change the game to an Nvidia or neutral game instead of BF1, which is an AMD title, and see what happens.

Look how much more wattage that FreeSync monitor is going to run you over a regular one. I bought a 4K monitor last year and compared it to a FreeSync one similar in specs. The FreeSync one was $50 cheaper, but I bought the other one. My monitor uses 35 watts max and the FreeSync one was over 90. Maybe it was the brands used, or the FreeSync one had more brightness, but some of the extra has to be from FreeSync. Possibly newer monitors don't have this disparity, but one must always consider efficiency because prices generally inflate over time, not decrease.

My advice to AMD fanboys is to wait for independent reviews and make an informed purchase. And also wait a few months, because AMD will probably knock some more off the MSRP, especially if a Volta/Pascal refresh enters the market.

July 31, 2017 | 11:09 AM - Posted by Fubar

"Even if you only use $10 worth of gaming a month the Vega will cost you $15"

Excuse me? Let's do some math. For example, one games 4 hours a day, the GTX 1080 uses 220W of electricity and Vega uses 300W (and the electricity is REALLY expensive):

GTX 1080: 4 h x 0.22 kW x $0.25/kWh x 30 days = $6.60
VEGA: 4 h x 0.3 kW x $0.25/kWh x 30 days = $9.00

So the difference is $2.40 per month or $28.80 a year. In 5 years' time you'd make up the price gap, by then using a graphics card totally abandoned by Nvidia. And this is not even taking into account the price difference in monitors, which also favors AMD (FreeSync).

"Maybe it was brands used or freesync one had more brightness but some extra has to be from freesync."

You could have just linked those two monitors or told the product names so that people could compare. But it really is not possible since the monitors exist only in your mind.

So your logic is that anyone buying any other brand than the one YOU favor so dearly and clearly is a fanboy. Something about pot and kettle comes to mind.

July 31, 2017 | 01:16 PM - Posted by Shawn Heus (not verified)

"GTX 1080 uses 220W of electricity and Vega uses 300w" The GTX 1080 uses 180W and real testing shows most run a little lower than that. So a stock 1080 uses just a little more than half the power vs a stock Vega 64.

July 31, 2017 | 02:55 PM - Posted by Just an Nvidia User (not verified)

The water-cooled Vega is up to 435 watts max according to AMD, which is more than double the 1080, and the real consumption remains to be seen - hence a range of 50% to 100% more, since no particular model was specified. Maybe it was the water-cooled one they used for the demonstrations. Was it revealed?

I don't BS; I found the info where I bought it, Micro Center. I was off: mine was 39 watts and under. The FreeSync one was 95 watts. It's why I didn't buy it.
Mine

http://www.microcenter.com/product/437546/B286HK_28_4K_UHD_Monitor

Freesync one

http://www.microcenter.com/product/457904/U2879VF_28_FreeSync_4K_LED_Mon...

I'd expect an apology but I've learned to expect nothing from an AMD fanboy but insults and misinformation.

Gamers don't game for only 4 hrs a day LOL. Taking a minimalist approach I see.

What is known is AMD's penchant for underreporting wattage to appear better against Nvidia. The recent RX 480 springs to mind, using a six-pin connector instead of the 8-pin it needed. They wired it internally for 8-pin. This led to overdraw of the PCI Express spec. AMD reported it as 125 watts. Then clarified GPU only. Then it was 150. Nope, still measured at like 166 watts by reputable sites, and max draw at over 220.

So 300 isn't very likely for Vega while you use the absolute max for the 1080. 180 + 20% allowed by overclocking software would be 216 for the 1080. 300 stated watts for Vega + the 50% power limit allowed by AMD in software is 450 watts max. Curious that you underestimate AMD's consumption.

August 1, 2017 | 08:26 PM - Posted by MadTheMaxx (not verified)

220W (GPU TDP) on the RX Vega 64, 290W with board power included, and 350W for the Aqua board (water pump) according to GamersNexus.

"RX Vega 64 & 56 Power Consumption, Price, & Threadripper Spacers "

http://www.gamersnexus.net/news-pc/3004-rx-vega-64-and-vega-56-power-spe...

July 31, 2017 | 11:21 AM - Posted by NotMartinTrautsomething (not verified)

Not really a fanboy, just someone who needs a consumer GPU SKU that's good for gaming and compute, as well as non-gaming rendering workloads, at a damn affordable price.

Who cares about gaming only and that FPS-only metric once things can be kept in the 60 to 140 refresh rate range (90 and above for VR)? This round of Vega and GTX 1080 competition will be more about the higher minimum average frame rates than about the highest-FPS metric only. So for those that waste too much of their time gaming, that's on them, but some folks do more than game with their GPUs the majority of the time, most folks having a life and other responsibilities.

That Vega HBCC/HBC-HBM2 is going to provide for some madly large texture/high-polygon mesh model scenes on the PC for Blender 3D/other 3D animation software usage, never having to worry about running out of video RAM space because of any video RAM size limitations on any Vega micro-arch based GPU SKUs. Wait for the discrete mobile Vega GPU SKUs with, say, only 4GB of HBM2 cache; Vega's HBCC/HBC-HBM2 IP is really going to shine in the discrete mobile market, Zen/Vega APUs included.

July 31, 2017 | 03:44 AM - Posted by WSJ-SJW (not verified)

How could AMD fall so far behind? Their high-end card until 2019 will only deliver 1080 class performance at ridiculous power consumption levels.

July 31, 2017 | 06:46 PM - Posted by Hishnash (not verified)

But not in the pro space; that SSG is going to kill it in the workstation market.

July 31, 2017 | 04:40 AM - Posted by Humanitarian

No actual performance figures prior to the release date then? Not very consumer friendly; a stark contrast to the CPU side of things.

July 31, 2017 | 04:49 AM - Posted by Kal

Trend: AMD GPUs since the R9 290 Series - if not the generation before - all have this habit of being over-spec'd cards which get ripped by lesser nvidia cards.
Nvidia's architectures always seem to be more refined as opposed to brute-forcing extra cores on to the die of the GPU.
Why do VEGA cards remind me of Fury X? Fury X Gen 2? That's how it looks to me. I'm willing to bet these cards get outpaced if not remain identical in performance to a GTX 980Ti/1070 - leaving the GTX 1080 cards to hold their price and their status.

New trend: Alongside brute-force specs, AMD seems to be on the TERAFLOP XBOX bandwagon as if tflop rating has ever been in their favour. Flashback Fury X: 8TFLOPs - ripped by the 5.9TFLOP 980Ti.

AMD really needs to work on the efficiency of their GPUs because honestly, I'm the least bit impressed by these VEGA cards.
If they want to tout TFLOP ratings as their performance status then they should be treated as such, and if history's a reminder of the aforementioned Fury card, they're going to need much more than 10 TFLOPS as their starting point.

If nvidia drops the price of the GTX 1070 below its current price - which in some places is already a touch lower than the starter Vega card, then it's a wrap for AMD. Nvidia's dominance over the GPU market is one governed by respect and reliability for long-term support & performance - despite the controversy and conspiracies of "Planned obsolescence"

Once nvidia wrap their heads around DX12 and how to use it to their advantage, AMD's latest features and optimizations with Vega will mean little. AMD Cards can only [keep up] with nvidia counter-parts under DX12 titles, switch back to DX11 and nvidia's GPU efficiency truly shines, outweighing the competition. All I'm saying is - as an educated guess - next-gen nvidia cards are going to be absolute monsters. And if AMD can't beat 'em on price, which let's face it, is the only way they have been besting the competition in both CPU and GPU markets - then nvidia have nothing to worry about.

Really disappointed with the reveal of these cards, BUT I will await benchmarks before my thoughts go further. The way things look for gaming is that anybody running a GTX 970, which most still are, has little reason to upgrade. In all honesty, I want to see 4K benchmarks as the new standard - starting with these cards. Should benchmarks at lower resolutions be a better calling for optimal gaming, then why bother? You can game at 4K High settings on a GTX 970 - if you accept the ignorance-free truth that high to ultra settings, in most titles, have little visual payoff which justifies the performance hit. These Vega cards, for the sake of their own respect, need to be 4K viable as a standard, or they're just not worth the time.

July 31, 2017 | 05:23 AM - Posted by Anonymous36457736 (not verified)

The chart still has conflicting information with the article.

July 31, 2017 | 06:14 AM - Posted by isaackorp (not verified)

I will wait for the benchmarks, but the price seems to indicate that it will fall below the 1080 Ti.

July 31, 2017 | 07:11 AM - Posted by Anonymoussssss (not verified)

So did AMD confirm the exact die measurements ?

July 31, 2017 | 08:11 AM - Posted by ChangWang

That Aqua pack is calling my name!

July 31, 2017 | 10:58 AM - Posted by PJ (not verified)

Looks like the perfect card to replace my 970. Will wait to see what MSI does with it though.

July 31, 2017 | 12:04 PM - Posted by PetaFlopsTillYaDrop (not verified)

"AMD Project 47

NVIDIA is selling turnkey deep-learning and HPC boxes that combine its Tesla HPC cards with Intel Xeon processors, so researchers don't have to bother about which system to build. AMD wants a slice of that market, and hence unveiled Project 47, marketed as AMD P47 Petaflop Rack, promising 1000 TFLOP/s of compute power in a standard size rack.

The P47 combines twenty (20) AMD EPYC 7601 32-core processors, with eighty (80!) Radeon Vega Instinct GPUs, and twenty Mellanox-made 100 Gb InfiniBand cards, and 10 TB of Samsung NVMe storage, to belt out 1 PFLOP/s of peak single-precision compute performance, 2 PFLOP/s of half-precision (FP16) performance, at 30 GFLOPS per watt single-precision. The company didn't reveal pricing, however we expect it to be on par with that of a luxury sedan."..."(1)

(1)

"Everything AMD Launched Today: A Summary"

https://www.techpowerup.com/235665/everything-amd-launched-today-a-summary

August 2, 2017 | 08:37 PM - Posted by Anonymous28 (not verified)

As soon as someone (like EK) builds a full size water block for this Vega 64 PCB & lists it for purchase I'll be all over this vid card!

August 5, 2017 | 06:53 PM - Posted by orvtrebor

The 56 looks to be the winner with that TDP and price point. Personally I won't even consider a 300W+ TDP card.

August 5, 2017 | 10:46 PM - Posted by Daniel644 (not verified)

POWER DRAW and TDP are 2 different things; in the article you say

"The power draw of the Vega FE cards was telling for the results we see on RX Vega. The Vega 64 liquid cooled will hit 345 watts, the air-cooled version hits 295 watts, and the Vega 56 comes in at a cool 210 watts."

Those are the TDP numbers, NOT the power draw. If that were the power draw then these cards would be literally nothing but space heaters; you MUST draw MORE power than the TDP, because TDP is the amount of HEAT, measured in watts, that it puts out. If you aren't drawing MORE power than the amount of heat you are creating then you have no power to actually do things with the card. As a PC-based website you really should understand the difference between TDP and power draw.

August 11, 2017 | 12:11 PM - Posted by Thatman007

No matter how much you justify AMD's insane benchmarks by calling them "interesting" they're still BS. Nobody cares about minimum 99% framerate or whatever crap you want to call it.
