Considering a move to a high-powered Vega-tarian lifestyle?

Subject: Graphics Cards | August 14, 2017 - 03:49 PM |
Tagged: vega 64 liquid, vega 64, vega 56, rx vega, radeon, amd

The reviews of AMD's two and a half new cards are in, and they have a lot to say about AMD's current focus for GPU development.  They have not gone green with this new architecture; but be honest with yourself about how much you think about the environment when absorbed in a gaming session on a 4K monitor.  The Vega 64 and 56 do require far more energy than Pascal cards and do produce more noise; however, keep in mind that third-party air cooling or a better radiator may help mitigate the issue. 

The real question is the price.  While there will be some challenges with the two Vega 64 cards, the Vega 56 is certainly a competitor to the GTX 1070.  If the mining craze dies down to the point where the prices of these two cards approach MSRP, AMD offers a compelling choice for those who also want a new monitor.  FreeSync displays sell at a significantly lower price than comparable G-Sync displays, even before you start to look at the new bundle program AMD has introduced. 

Since we know you have already been through Ryan's review, perhaps you would be interested in what our framerating friends over at The Tech Report thought.  If not, there are plenty of other reviews below.


"AMD's long-awaited Radeon RX Vega 64 and RX Vega 56 graphics cards are finally ready to make their way into gamers' hands. We go hands-on to see how they perform."

Here are some more Graphics Card articles from around the web:


August 14, 2017 | 04:07 PM - Posted by DaFutureIsPlasticSays (not verified)

More DX12/Vulkan testing/benchmarking will show more interesting RX Vega results! There is also an AMD Vega whitepaper link provided by one of the Tech Report review's posters where that same graphic is used, so read that PDF for sure.

Vega is out there now, if you can snag one before the miners get it, or before the Nvidia fans grab a Vega to upsell to the miners so they can get a GTX 1080 Ti and have their Nvidia fix satiated.

Supplies are limited on Vega with all the mining insanity going on! Anyway, AMD's stockholders are happy, as a sale is a sale. AMD is positioned to make way more off of its Epyc sales and professional/AI GPU sales anyway. AMD is going to have a larger market cap than Nvidia sooner than your average gaming-only folks think!

August 14, 2017 | 04:38 PM - Posted by edwinjamesmiller36

"AMD is going to have a larger market cap than Nvidia sooner than your average gaming only folks Think!"

So you are thinking the stock price is going from $12.44 a share today to $102.00 a share sometime soon. Bullish. Very, very bullish.

August 14, 2017 | 04:54 PM - Posted by pessimistic_observer (not verified)

Maybe he's thinking Nvidia will drop some and they'll meet in the middle; also highly unlikely.

August 14, 2017 | 06:25 PM - Posted by GamingDaftsNOT (not verified)

Market cap and share price can vary, depending on whether there are stock splits and larger total numbers of outstanding shares, etc.! And Intel's market cap is currently $170.57B while its stock price ($36.34 after hours) is lower than Nvidia's stock price ($168.40), with Nvidia's market cap at $99.49B!

So what part of corporate valuation theory does the average gamer NOT understand, as exemplified by these rather daft posts!

Just go look at AMD's past market cap back when it was only competing with Intel in the CPU markets and AMD's Opteron server chips were getting their highest market share numbers. So AMD has its Epyc CPUs and Vega GPU SKUs for the pro markets where GPU compute counts. Nvidia mostly has its GPU sales (little SOC sales), but AMD now has Zen/Epyc plus Vega Radeon Pro WX and Radeon Instinct SKUs to go along with its Epyc revenues. And who cares if the miners buy up all of AMD's consumer output, as that money is good also.

So look for AMD's other markets becoming bigger than AMD's gaming markets, and keep an eye on AMD's market cap when those Epyc SKUs take more server market share away from Intel! Nvidia's P/E ratios are different than AMD's also, but that means Nvidia may be overvalued at times.

August 15, 2017 | 01:11 PM - Posted by edwinjamesmiller36

Market cap = (share price) X (number of outstanding shares)

so if the stock splits (let's say 2 for 1), the share price is halved but the market cap remains the same.

Intel $170.57 Billion with 4.8 Billion shares outstanding

Nvidia $99.49 Billion with 592 Million shares outstanding

AMD $12.4 Billion with 954 Million shares outstanding
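The identity above can be sanity-checked in a few lines. A minimal sketch using the AMD share price and share count quoted in this thread (the figures are approximate and drift with the market; only the arithmetic matters here):

```python
def market_cap(share_price, shares_outstanding):
    # Market cap = (share price) x (number of outstanding shares)
    return share_price * shares_outstanding

def stock_split(share_price, shares_outstanding, ratio=2):
    # A ratio-for-1 split divides the price and multiplies the share count
    return share_price / ratio, shares_outstanding * ratio

# AMD at the $12.44/share price quoted above, ~954M shares outstanding
amd_cap = market_cap(12.44, 954e6)  # roughly $11.9B

# A hypothetical 2-for-1 split changes price and share count...
price, shares = stock_split(12.44, 954e6)

# ...but leaves the valuation itself unchanged
assert market_cap(price, shares) == amd_cap
```

Which is why a split by itself cannot close the gap to Nvidia's market cap; only the price-times-shares product matters.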

While AMD does have some good products coming out now which will certainly help them, they also have a lot of debt that is maturing in the next year or two. And their high debt load leaves little room for error.

While AMD looks to be on the runway for higher valuations (and thus market cap), I do not see it on a rocket launchpad for stratospheric valuations.

August 15, 2017 | 03:48 PM - Posted by lensCapsEnMoreSVP (not verified)

What about AMD's market cap at its historical HIGH figure? I searched around the Internet for an hour, but most of the online historical data only goes back 10 years. And I really need to calculate AMD's high-point market cap in today's/constant dollars just to get any idea of AMD's potential with its Epyc products maybe getting AMD back to its Opteron levels of server market share.

What is your estimate of the amount of revenue increase that AMD will be able to obtain (in dollars) for each additional percentage point of Epyc server market share gained?

I found this article after I Googled the phrase: "amd historical server market share numbers"
But Google's search AI is so damned intent on overriding my query with results that only pertain to current data.

And I did find this article from Tom's Hardware titled:

"AMD x86 server market share tops 22% - report
by THG Reporting Team April 25, 2006 at 7:31 PM "

So I'm really looking at Epyc sales in a positive way in the CPU/server and workstation markets, and those historical revenue figures will have to be converted to constant-dollar figures also. I'm also thinking that each Epyc system sale is a potential sale for AMD to also make some Radeon Pro WX/Radeon Instinct sales, with AMD's server system partners being able to sweeten the offer by package-pricing AMD's Radeon professional GPU SKUs with any purchase of AMD's Epyc CPU SKUs for potential customers that may want/need more FP number-crunching capabilities or AI/inferencing capabilities.

Sorry for no direct links to articles but the S P A M filter at times does not allow it!

August 15, 2017 | 04:00 PM - Posted by lensCapsEnMoreSVP (not verified)

P.S. I do realize that stock splits do not change valuation, but a lower per-share cost after a stock split can encourage more small investors to purchase the stock (outside of using a mutual fund), because the smaller purchase increments allowed by a lower share price make it more reasonable for a small investor to affordably and gradually increase their holdings in a company's stock, and that can lead to some increases in a company's valuation over time.

August 15, 2017 | 04:27 PM - Posted by lensCapsEnMoreSVP (not verified)

"While AMD does have some good products coming out now which will certainly help them they also have a lot of debt that is maturing in the next year or two. And their high debt load leaves little room for error."

AMD did some debt restructuring, including using some stock-option methods and other methods to restructure and pay down debts! And AMD's higher revenues will encourage lenders to extend more loans in the longer term, because others' debt payments are earnings for the lenders. As long as AMD has its potential for revenue growth, those revenues will service AMD's debts, and any AMD profits represent the excess revenue remaining after all the bills are paid. So lenders are mostly concerned with AMD's revenues/revenue growth and how that allows AMD to manage its debt load. The more revenue growth, the lower AMD's debt load will become relative to AMD's overall company valuation/debt ratio.

From marketrealist dot com [sorry for no direct link, S/filter]:

"How Debt Restructuring Strengthened AMD’s Balance Sheet"

By Paige Tanner
Oct 31, 2016 5:24 pm EDT

"Debt restructuring gives AMD much-needed financial flexibility"

"In the previous part of the series, we saw that Advanced Micro Devices (AMD) has reduced its expensive debt significantly and pushed a major portion of it to 2026. The fact that the company has no term debt maturing before 2019 gives it strong financial flexibility for the next two years."

August 14, 2017 | 05:12 PM - Posted by Mr.Gold (not verified)

Vega 56 is actually a good product, but late to the party.

Only question is, how much does FP16 help gaming?
It's said that FP16 doesn't consume much more power, just delivers 2x the throughput.

So it's possible that the Vega 56 gets to GTX 1080 level, and so would consume about the same amount of power while gaming.
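The 2x claim is just peak-rate arithmetic. A back-of-the-envelope sketch (the stream-processor count is Vega 56's published spec; the boost clock is an approximation, and these are theoretical peaks, not game performance):

```python
def peak_tflops(stream_processors, clock_ghz, fp16_packed=False):
    # Each stream processor retires one FMA (2 FLOPs) per clock;
    # Vega's packed math doubles the per-clock rate for FP16 operands.
    flops_per_clock = 2 * (2 if fp16_packed else 1)
    return stream_processors * flops_per_clock * clock_ghz / 1000.0

# Vega 56: 3584 stream processors, ~1.47 GHz boost (approximate)
fp32 = peak_tflops(3584, 1.47)                    # ~10.5 TFLOPS
fp16 = peak_tflops(3584, 1.47, fp16_packed=True)  # ~21.1 TFLOPS

# Packed FP16 is exactly twice the FP32 peak at the same clock
assert fp16 == 2 * fp32
```

How much of that doubled peak shows up in a real game depends entirely on how much of the shader work can tolerate 16-bit precision.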

The new Wolfenstein should give us the full picture of what Vega can do for gaming.

Now, the fact that even Vega 56 can be faster in Blender than a 1080 Ti is a nice feature for non-gamers.

Not a home run... but Vega56 is simply late.

August 14, 2017 | 07:08 PM - Posted by Jeremy Hellstrom

Launching 12 months ago would have been much better, but not doable.  Still, the Vega 56 is impressive compared to AMD's last gen.

August 15, 2017 | 04:57 PM - Posted by lensCapsEnMoreSVP (not verified)

The rumored Vega Nano has me wanting to wait. And there is a great implication with that Infinity Fabric(IF) IP included in Vega if you go and read the AnandTech Vega review.

There is the potential for dual Vega GPU dies on a single PCIe card to be wired together over the IF rather than through any PCIe-based method. Each Vega GPU die has the ability to have its caches wired together in a manner similar to the way two Zen/Zeppelin dies are wired up, with the same coherent cache communication/sync methods, to make, say, two Vega 56 dies on a single PCIe card appear as one monolithic GPU die, with the two Vega dies' HBCCs and HBC/HBM2 memory pools connected up, UMA style, to work as one logical GPU. This IF usage foreshadows Navi types of functionality using the IF protocol to tie two Vega dies together (via IF glue) to act as one logical GPU.

This will NOT be any CrossFire-based dual-GPU method on a single PCIe card for a linked dual-GPU type of arrangement; one need only look at the Infinity Fabric on Zen/Threadripper and Zen/Epyc to see how that IF technology works on Zen CPUs across 2 or more Zen/Zeppelin dies.

August 15, 2017 | 06:05 PM - Posted by Aparsh335i (not verified)

I agree. It's very late! It's August 2017 and GTX 1080 came out May 2016...They are releasing cards that are similar in performance while consuming more power, a year later. Their only saving grace is that they do mine well, so they will sell. For the guys doing perf/$ for purely gaming it's not ideal and they are losing. I'm a fan of the Ryzen CPUs and glad to see the comeback, but AMD's products are currently losing on both fronts in terms of perf/$. I know this is good for some people that use their comp for things other than gaming, or game and stream simultaneously.

August 15, 2017 | 10:50 PM - Posted by GamersSoFullOfThemselvesFFS (not verified)

That's because Vega is for compute more than only gaming. Look at the Radeon Pro WX/Vega-based SKUs and this Radeon Instinct™ MI25 marketing copy from AMD's website:

"64 Compute Units each with 64 Stream Processors.
The Radeon Instinct™ MI25 server accelerator has 64 Compute Units, each consisting of 64 stream processors, for a total of 4,096 stream processors and is based on the next generation “Vega” architecture with a newly designed compute engine built on flexible new compute units (nCUs) allowing 16-bit, 32-bit and 64-bit processing at higher frequencies to supercharge today’s emerging dynamic workloads. The Radeon Instinct MI25 provides superior single-precision performance and flexibility for the most demanding compute intensive parallel machine intelligence and deep learning applications in an efficient package." [count how may times it says compute in that marketing copy]

The Vega die, like the Zen/Zeppelin die, is designed for the professional market first, for AI and also some FP-acceleration workloads, in the same manner that the Zen Ryzen/Threadripper SKUs come from that server-first Zeppelin die design binned into consumer parts. So Vega 64 and Vega 56 are based on a Vega GPU micro-architecture that was designed for compute/AI first!

The Vega 10 lower bins are going to be used for gaming SKUs, and the cherry Vega 10 dies with the best thermal attributes are going to be used for branded Radeon WX/Instinct professional SKUs, with the consumer market getting the Vega dies that can game fine but are not so good at the thermals. And that's the way it will be until the Vega wafers/dies on GF's 14 nm process start coming off the lines with enough cherry dies to meet the professional markets' demands, with more cherry dies remaining to be sent down to the steerage levels of the gaming market, because the gaming market will not pay the high margins that the professional markets will.

Gamers are complaining profusely about all of AMD's hype, but that hype was meant for the professional markets more than for the fickle consumer/gaming-only markets.
And hyping Vega for compute/AI is what AMD did. It's just that the gaming enthusiast websites latched onto AMD's early professional-market hype train, and that excess coverage led gamers into believing that Vega should be the second coming for gaming.

Gamers need to lay off of Raja and his overseas design teams, as Raja did what AMD management told him to do and designed Vega for the professional market where Nvidia, by the way, is making some mad-markup billions of dollars of revenue. The GPU/AI market has not even really gotten into gear yet for the kinds of AI revenue potential that will be had by anyone with an AI-enabled processor: GPU, TPU (Tensor Processing Unit), or whatever AI processor offering.

Vega being designed for compute, as were previous AMD GPU micro-architectures, just so happens to make it popular for coin mining, and when the mining software/drivers are fixed/tuned for the new Vega micro-architecture, Vega is going to be just as MIA from store shelves as the Polaris SKUs became when the coin value spiked and led to another in a series of GPU gold-rush purchases to mine for that pot of gold at the end of a hashing algorithm.

Vega was never intended to compete with Nvidia's mobile-first gaming GPU designs on that power metric, because it is the teraflops/watt metric at lower clocks that matters more than the frames-per-watt/FPS metric that is only important to gamers, with their unwillingness to pay any mad-markup pricing. Just you wait until JHH sees AMD moving Vega MSRPs higher; JHH will top that, because Nvidia's fan base will fork up more for that FPS-only gaming metric on GPU SKUs that are so stripped of compute.

Money Talks!

August 14, 2017 | 05:29 PM - Posted by Mr.Book

Looking to finally upgrade from my HD7950. However, the VEGA pricing in Canada is out to lunch relative to a competing nVidia solution. Average GPU pricing in CAD is as follows, and doesn't include discounts or deals:

-Vega56 (pricing unknown)
-Vega64 Air ($850)
-Vega64 Liquid ($1000)
-GTX 1070 ($550)
-GTX 1080 ($710)
-GTX 1080Ti ($960)
-RX580 8gb ($390) (for reference)

Really interested to see where Vega56 lands in terms of pricing, and how close it inches to 1080 (non-Ti) pricing.

August 14, 2017 | 07:07 PM - Posted by Jeremy Hellstrom

Unless you are totally dead in the water, I would really suggest waiting for the current mining craze to die off before getting a new GPU ... or you could live dangerously and go looking at eBay for a cheap one being dumped because someone upgraded to one with a better hash rate.

August 15, 2017 | 10:30 AM - Posted by Mr.Book

No immediate rush. However, the HD7950 is getting long in the tooth, especially after moving to a 1440p screen.

It's been 2 1/2 years, what's another few months. Have had great success with the Gigabyte Windforce series.

Have a great day Jeremy.

August 15, 2017 | 04:39 AM - Posted by John H (not verified)

Used or new, a 980 Ti is still a good choice. It was never really a miner's card, and an OC model will generally meet or beat a 1070; thus, Vega 56 perf with similar power draw.

August 15, 2017 | 11:19 AM - Posted by Mr.Book

980Ti is all but gone from the retail channels. 1070 price point in the upper $400's would be nice, but I'm sure it's wishful thinking.

As an aside, I have a difficult time enjoying gaming on nVidia cards. To me, their cards exhibit a symptom that I refer to as "nano-stutter". It's the slightest scene hiccup that behaves as a tiny pause, occurring throughout the game session, regardless of in-game FPS and GPU used. This is very apparent when moving around in first-person shooters, although it affects all genres.

Because of this, I've stuck with ATi, with which I've not been able to replicate the nano-stutter issue.

August 15, 2017 | 11:59 AM - Posted by James

Are you using a relatively low-performance CPU? AFAIK, the Nvidia driver uses a bit more CPU resources than AMD's drivers, due to there being more scheduling resources in hardware on the AMD parts.

August 15, 2017 | 12:19 PM - Posted by Mr.Book

In my experience, it has been CPU/infrastructure/OS independent. Intel 'K' parts only: 2700K, 3770K, 6700K with 8 or 16GB+ RAM. No OC'ing.

Comparable ATi GPU on the same system would not exhibit the issue. I know it's strange, and I have no explanation for it.

August 15, 2017 | 02:45 PM - Posted by Just an Nvidia User (not verified)

Maybe it's related to older Nvidia cards only, as you have an HD 7950. Stop trying to cast shade on today's Nvidia cards. I have a 4GB 760 and play all sorts of games, possibly more modern than you with your ATI card. I have not witnessed this stutter you speak of.

August 15, 2017 | 03:22 PM - Posted by Mr.Book

Luckily for you, and other shareholders who may be reading, my experience with a product will not negatively impact nVidia's share price. Carry on.

August 16, 2017 | 03:01 PM - Posted by Anonymously Anonymous (not verified)

Still plenty of 980 Tis around that are new, but the prices are still like they were way before the 10-series launched.

Possibly your CPU? I've been running a 980ti along with my 2600k for over a year and it's been a great experience.

August 16, 2017 | 06:44 PM - Posted by John H (not verified)

Ditto here, 4.6 GHz 2600k and 980TI OC (~ 1.45 GHz) - the old 2600K is still a good match for this card.

August 15, 2017 | 02:30 AM - Posted by Johan Steyn (not verified)

This might interest quite a few to see what Vega is really about:

August 15, 2017 | 11:07 AM - Posted by Mr.Book

It's exciting times watching AMD/Radeon group right the ship. They are doing more with less as they position themselves for the future (3-5 years?).

Thanks for this!

August 15, 2017 | 01:38 PM - Posted by Jeremy Hellstrom

I did already post a link to that in my post.

August 15, 2017 | 12:15 PM - Posted by James

Unfortunately, GPUs are very sensitive to optimization. I suspect Vega 64 would beat just about everything else if the software were properly optimized for it. The 16-bit support is interesting also. Since Nvidia cards don't have it yet, games making use of it could perform significantly better on the AMD parts. While you can't use 16-bit for everything, it can probably be used for quite a bit. In addition to double the flops, it might require significantly less bandwidth also. This isn't anything new, though.

August 16, 2017 | 11:07 AM - Posted by NUMsMunchEnCrunch (not verified)

Read this article at Techgage; it's all about some compute(1):

And pay close attention to some of those crypto-hashing benchmarks on page 5: Titled "Sandra: Cryptography, Science, Finance & Bandwidth".

and the author states:

"Hot damn. I feel like these kinds of gains are those that AMD should promote out the wazoo. So many reviews posted today are likely to paint a rough picture of this card’s gaming performance, but on the other side of the fence, compute performance on Vega quite simply kicks ass. The results here may be able to give an impression of Vega 64’s future mining performance. Mining benchmarks you’ll see around the web in other launch reviews will show an edge over a top-end NVIDIA card. I would not be surprised if AMD optimizes its driver sometime in the future to vastly improve mining performance. Vega should technically be better than the 30MH/s you’ll see reported today, based on all of the compute performance seen here." (1)

Vega 64's Sandra (AES256 + SHA2-256 cryptography) scores are ahead of every other SKU (including the Quadro P6000 24GB SKU) but a little behind the Titan Xp (12GB). But in the Sandra (AES256 + SHA2-512 cryptography) benchmarks, Vega 64 owns the lead by a wide, wide margin.

That Vega HBCC/HBC with only 8GB of HBM2 needs to be looked at also, to see if any hashing/crypto workloads are being helped, or could be helped even more, by Vega's ability to utilize HBM2 like a last-level cache and leverage regular system memory for better hash-table efficiencies with that HBCC/HBC page-caching mode on. Vega is doing a lot better even with its smaller amount of HBM2 compared to the more expensive SKUs from Nvidia that can cost more than 5 times as much at retail!


"A Look At AMD’s Radeon RX Vega 64 Workstation & Compute Performance"

"by Rob Williams on August 14, 2017 in Graphics & Displays"

August 16, 2017 | 11:27 AM - Posted by NUMsMunchEnCrunch (not verified)

And just as I finish posting the above, this(1) story drops!

I think that AMD needs to stop setting a hard MSRP on its GPU SKUs and needs to begin using a market pricing offset where AMD's GPU wholesale pricing can float up and down based on demand. AMD needs to begin providing a wholesale pricing range to its retail channel partners, indexed to what market demand is doing to actual retail pricing. So AMD's wholesale prices can go higher, and then it can be up to the retailers to eat some of that increase in wholesale pricing or pass it on to the consumer.

AMD can also institute a rebate policy on a one-per-consumer basis, and that will be the way to make the coin miners pay more, with AMD netting some of the benefits of the higher retail demand pricing that is going on, rather than only the retailers, who do nothing to help AMD cover its R&D and other development expenses for GPU production!


"AMD Releases Beta Driver Specifically Geared for Blockchain Compute"
