Podcast #481 - NVIDIA TITAN V Deep Learning, NVIDIA EULA Changes, and more!

Subject: General Tech | December 28, 2017 - 11:43 AM |
Tagged: video, titan v, seasonic, nvidia, gtx 1080 ti, asus, amd, 850W, podcast

PC Perspective Podcast #481 - 12/27/17

Join us for discussion on NVIDIA TITAN V deep learning, NVIDIA EULA Changes, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Ryan Shrout, Josh Walrath, Ken Addison

Peanut Gallery: Alex Lustenberg

Program length: 1:21:32

Podcast topics of discussion:
  1. Week in Review:
  2. News items of interest:
    1. 1:09:00 NVIDIA EULA Reprise
  3. Picks of the Week:
  4. Closing/outro


Video News

December 28, 2017 | 03:53 PM - Posted by Ipoopwhenifart (not verified)

Thumbnail caption:

"This is my pussy"

December 28, 2017 | 04:41 PM - Posted by CatsAreSlipperyWhenWet (not verified)

Mrs Slocombe has all sorts of problems with that!


December 28, 2017 | 06:56 PM - Posted by Jeremy Hellstrom

Mr Humphries doesn't https://youtu.be/6b6c8JuWzWI?t=12s

December 28, 2017 | 07:20 PM - Posted by Power (not verified)

I expect Nvidia to forbid playing AMD sponsored titles.

December 30, 2017 | 07:06 PM - Posted by MustReadForManAndKitty (not verified)

Very in-depth article, and those chip foundry top-10 lists for the world market and for the Chinese market are also very interesting. Lots of juicy information in one article makes this a must-read for anyone interested in the world's chip-fab market!

"Moore’s Law is still viable, but it’s evolving. At each node, process cost and complexity are skyrocketing, so now the cadence for a fully scaled node has extended from 18 months to 2.5 years or longer. In addition, fewer foundry customers can afford to move to advanced nodes. In total, the average IC design cost for a 16nm/14nm chip is about $80 million, compared to $30 million for a 28nm planar device, according to Gartner. In comparison, it will cost $271 million to design a 7nm chip, according to the firm."(1)

"Foundry Challenges in 2018

Growth will remain steady, but it’s getting harder and more expensive to move to the next nodes." (1)


January 1, 2018 | 03:00 PM - Posted by BubbaNeedBigBucksToGameCauseOfCompute (not verified)

Nvidia will be forced to define what "Data Center" means if it is taken to court over any licensing. Academic institutions can probably get a limited waiver from Nvidia, but for that government-grant-funded work, if the system is too large it's going to have to be a Tesla/Quadro system.

Any engineering/medical usage will require the end user to get the full professional SKUs, which have the validated ECC, warranty/driver validation, and pro hardware features needed to keep production workloads as error-free as possible. That includes any academic institution doing engineering analysis or medical research that has to meet professional standards.

Consumer SKUs get one-year warranties; the professional GPU SKUs get 3+ years of warranty support and the extended software/driver support that comes with professional SKUs.

Remember that the consumer market gets most of its R&D and development costs underwritten by professional-market markups, because consumer-SKU customers are unwilling to pay the necessary R&D costs. So data centers can write off the cost of professional GPU SKUs on their taxes, while consumers cannot write off GPU purchases. Coin mining is probably not a valid business expense yet, and consumer purchases are generally not eligible unless the hardware is bought by someone with an individual business license.

Nvidia is going to be hard pressed to raise its consumer prices, given its large share of the consumer/gaming GPU market, but AMD needs to raise its consumer GPU prices because AMD's extra compute is driving greater demand from bitcoin mining. Raising its wholesale prices is necessary so AMD can afford to fully fund its driver development teams. AMD has less to lose, market-share-wise, by raising its GPU MSRPs higher to earn the proper markup to fund driver development for its consumer GPU SKUs.

In the future, as AMD's professional GPU sales increase along with its Epyc CPU revenues, AMD will have the funding from those professional sales to underwrite proper driver/development teams, but it needs to raise its wholesale prices now to get the revenue increases to fully fund its driver teams.

Just look at all of the DX9-DX10 gaming that was affected by AMD's latest driver release; that is why AMD needs to get its revenues higher from consumer-market wholesale price increases. AMD needs to forgo trying to get more consumer GPU market share currently and focus on quality, while the bitcoin mining boom continues to allow AMD to sell out of its Polaris and Vega GPU SKUs. So AMD really needs to ramp up its wholesale prices and forget about taking any consumer GPU market share from Nvidia in the short term. AMD needs to focus on its driver teams and get that part of the business up to speed, or none of its GPU hardware will be worth a damn for gaming.

AMD can still sell 80 Vega 10 die-based professional MI25 SKUs for every Project 47 supercomputer cabinet sold by AMD and its Project 47 OEM/partner. And that's also 20 32-core Epyc server CPUs/motherboards inside one Project 47 single-cabinet supercomputer, with many users getting more than one cabinet. So Vega 10 is a great success in the compute/AI markets, as Vega 20 will be, and AMD can again introduce some dual-GPU-on-one-PCIe-card SKUs for the compute/AI markets and get more compute in less area.

Vega may not have the ROPs to compete with the GTX 1080 Ti in gaming workloads, but Vega has the extra shaders for compute usage that are seeing the consumer Vega 64/56 SKUs sell for higher prices than Nvidia's consumer SKUs, because of that extra compute ability compared to any Pascal-based GPU SKUs.

AMD will not be able to please gamers without fully staffed driver teams working on legacy graphics API gaming issues while a different driver team works on getting games tweaked for the latest DX12/Vulkan API features. So AMD needs to forgo any market-share increases for gaming GPUs, raise its MSRPs by at least 20-30%, raise its wholesale prices by at least 40%, and get some of that coin-mining revenue so AMD can properly fund its driver development.

If AMD (Advanced Mining Devices) can earn the revenues from coin mining and keep that demand high, then AMD's margins can continue to rise, and that's what will see AMD's share price rise faster and get AMD the necessary capital to fully fund its driver teams. Nvidia is too large to defeat overnight, and AMD's driver problems will just continue to lose AMD gaming sales, in spite of any GPU hardware features that may be great for gaming. So quality gaming drivers mean that AMD's GPUs should be priced the same as Nvidia's GPUs, and the mining demand for AMD's GPUs means that wholesale prices need to be higher for AMD.

AMD's consumer Raven Ridge APUs stand a better chance of winning AMD back GPU sales via integrated graphics, and AMD will need to fund the hell out of its APU driver team, because older-game usage will be higher on APUs than on high-end gaming rigs where everyone is playing the latest games anyway. So APUs are where AMD needs to focus the most driver-team resources, for APIs going back at least 10 years: DX9/10/11 titles as well as DX12/Vulkan titles. AMD's focus needs to be on getting back its professional CPU server market share and on its driver teams across all of its markets. Vega 10 is going to be more successful in the professional compute/AI markets, and now in compute (coin mining) for non-gaming usage on consumer GPUs as well. Most new revenue needs to go to the driver teams across all of AMD's markets, because without proper drivers the hardware is worthless.

GPUs will continue to cost more if demand stays high, and there is nothing consumers can do about that if GPUs continue to sell out for compute usage as well as gaming usage.

You can thank AMD for its work on graphics APIs and HBM/HBM2, and you can thank Nvidia for creating the professional-market demand for GPUs that's even helping AMD make more GPU sales in the professional markets, along with its Epyc server/workstation/HPC CPU SKUs. But the cost of GPUs will continue to trend higher as the demand for compute usage on GPUs overtakes the demand for gaming-only usage.

January 1, 2018 | 04:50 PM - Posted by LooksLikeVegaOnThatEMIBmodule (not verified)

From the direct link provided in the reddit post the Intel listing:

"Intel® Core™ i7-8809G Processor

3.1 GHz

8 MB [Cache]


100W Target Package TDP

Two channels DDR4-2400

Radeon RX Vega M GH Graphics, Intel® HD Graphics 630"(2)

So that's HBM2 plus dual channels of regular DDR4 DRAM, and hopefully that HBCC/HBM2 IP on Vega working as well!


"[News] i7 8809G appears on Intel's website with Vega graphics!"



"Compare Unlocked Processors" [The SKU shows up in this listing of unlocked processors.]

