PCPer Mailbag #36 - 3/23/2018

Subject: Editorial | March 23, 2018 - 09:00 AM |
Tagged: video, pcper mailbag, Josh Walrath

It's time for the PCPer Mailbag, our weekly show where Ryan and the team answer your questions about the tech industry, the latest and greatest GPUs, the process of running a tech review website, and more!

On today's show:

00:49 - User-replaceable batteries?
04:06 - Block GPUs from crypto mining?
07:52 - Custom game resolutions?
11:26 - What makes AIB GPUs special?
15:01 - Spectre patch for Ryzen?
16:27 - Security flaw disclosure windows?
20:04 - Ryzen CPU coolers?
23:14 - Thunderbolt 3 royalty free?
24:54 - SSDs replacing RAM?
27:19 - Gamepad and flight stick reviews?
27:59 - VR graphics and upscaling?

Want to have your question answered on a future Mailbag? Leave a comment on this post or in the YouTube comments for the latest video. Check out new Mailbag videos each Friday!

Be sure to subscribe to our YouTube Channel to make sure you never miss our weekly reviews and podcasts, and please consider supporting PC Perspective via Patreon to help us keep videos like our weekly mailbag coming!

Source: YouTube

March 23, 2018 | 09:01 AM - Posted by Anonymously Anonymous (not verified)

thumbnail captions:

"BBBBBBBRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRRPPPPPPPPPPPPPPP!!!!!!!!!!"

March 23, 2018 | 11:29 AM - Posted by TB3IsTheNewFireWireBecauseOfwhatIntelDid (not verified)

Thunderbolt 3 royalty free! Intel tried for too long with that TB lock-in that requires Intel processors for OEMs, and that hasn't worked out too well for TB/TB3 adoption.

And now there is/has been USB 3.1 Gen 1 (5 Gb/s bandwidth, AKA USB 3.0) and USB 3.1 Gen 2 (10 Gb/s), and there is that newer USB-IF standard called USB 3.2 that takes two USB 3.1 Gen 2 (10 Gb/s) channels/controllers, marries them via link aggregation, and uses that to get a USB 3.2 total aggregate bandwidth of 20 Gb/s.

The only maker using TB/TB3 across all its PC/laptop lines is Apple, and Apple now allows (white-lists) TB3 external GPU boxes for gaming/graphics. So Apple users can now use their MacBooks attached to an external GPU for gaming/graphics.

Intel really screwed the pooch trying to use TB/TB3 for some vendor lock-in/vendor tie-ins. And really, how many laptop OEMs are even going to provide the necessary 4 PCIe 3.0 lanes that it takes to drive a 40 Gb/s TB3 controller at maximum bandwidth? Some laptop OEMs offer TB3 hanging off of only 2 PCIe 3.0 lanes, and that's not enough PCIe connectivity.

Things may change once PCIe 4.0/5.0 arrives! But laptops, which need the bandwidth the most yet usually have the most limited numbers of PCIe lanes, never get the newest PCIe standard before desktop PCs, which always get the latest PCIe standard first.

March 23, 2018 | 06:07 PM - Posted by tts (not verified)

A hardware approach to preventing GPUs from being used for mining would be to introduce single-bit errors every now and again into the work done by the hardware. The errors wouldn't be visible in game, and mining would be effectively sabotaged on those cards.

No drivers needed, and it'd probably be an "easy" change to implement.

AMD or NV could always release a mining-specific GPU (kind of like you mentioned for NV, though AMD did something similar too) that didn't have this "feature" to keep miners happy.

A downside of this approach would be that GPGPU stuff would also be sabotaged, of course, but that, unfortunately, didn't really seem to take off in a big way on the desktop. Researchers generally won't be affected since they'll just buy the proper GPGPU-specific cards (i.e. FirePro, Tesla).
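For what it's worth, the reason even rare bit errors would kill mining is the avalanche property of cryptographic hashes: a proof-of-work share is only valid if the exact digest meets the target, so one flipped input bit produces a completely different digest. A minimal Python sketch (the header/nonce values here are made up purely for illustration):

```python
import hashlib

def sha256d(data: bytes) -> bytes:
    # Bitcoin-style double SHA-256 of the input.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

header = b"example-block-header"        # hypothetical header bytes
nonce = (123456).to_bytes(4, "little")  # hypothetical found nonce

good = sha256d(header + nonce)

# Simulate the proposed hardware fault: flip one bit of the nonce.
flipped = bytes([nonce[0] ^ 0x01]) + nonce[1:]
bad = sha256d(header + flipped)

# The two digests share nothing useful; the corrupted share fails validation.
diff_bits = sum(bin(a ^ b).count("1") for a, b in zip(good, bad))
print(good != bad, diff_bits)
```

In a rendered frame the same single-bit error is one slightly-off pixel for one frame; in a hash it invalidates the entire result, which is exactly the asymmetry the commenter is counting on.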

March 23, 2018 | 08:25 PM - Posted by TheRealityIsThatSupplyAndDemandSetsPricing (not verified)

Ray tracing on the GPU is a GPGPU/compute-intensive sort of task, so you'd screw that up also. No, let's have demand set the pricing, and the GPU makers can start pricing the dies higher so the AIBs and retailers are not earning more off the GPU than the maker/designer of that GPU.

The GPU makers can then be the ones to offer a rebate to the gamer by only allowing one, or two, rebates per household. That way the GPU's designer/maker can control the GPU's cost to the gaming consumer (via rebates), with the miners having no chance of getting much in the way of rebates for purchases of more than 2 GPUs.

Really, GPUs need a hefty tax to cover the extra stress on the power grid that their use imposes, even on the power bills of non-gaming folks. Anyone running a GPU mining operation should pay a much higher power rate than any normal home power user. There are places that make use of a lot of cheaper hydro-power where households have seen their power rates spike because coin mining operations locate in those areas where the power rates are lowest. So even the non-basement-dwellers are being hurt by higher power rates due to the extra demand that miners are putting on the power grid.

Flagship gaming GPUs need to get a bigger, fatter tax, with mainstream and lower-end discrete GPUs getting a lower tax and SOCs/APUs an even lower tax. Hell, just tax the GPU by the power it is estimated to consume and use that money for power plant and power distribution construction/grid improvements.

Any miners need to have permits on those large mining farms, and they get taxed extra if they are not using their own solar or wind power capacity to do the mining. Miners should have to pay higher rates for grid-provided power also, way higher than regular home power users.

GPUs are never going to be low priced again; that ship has sailed, and that's a fact of life now. The Glorious PC Gaming Master Race just got alien-invaded by the Glorious GPU Mining Uber Race with some very deep pockets, and that PC Master Race now has to pay some uber-high prices for those GPUs.

Also: miners do not like mining-only GPUs, as mining-only GPU SKUs can only be resold for mining. Miners would rather their GPUs be able to be resold to gamers as well as other miners, should the miner who owns all those GPUs have to, or want to, get out of the mining business if mining no longer proves profitable.

March 23, 2018 | 09:25 PM - Posted by tts (not verified)

Ray tracing wouldn't be screwed up. It's an inherently noisy and statistical approach to doing lighting, at least the way it's being proposed to be implemented.

The really high precision ray tracing stuff is still offline rendering only.

The problem with arguing for simplistic Econ 101 solutions here is that the situation is much, much more complex than that. None of the OEMs are willing to blow out production for a fad bubble that could pop at any time. Everyone saw what happened when AMD and the AIBs produced too many R9 290s a few years back, and they don't want to be stuck with the excess inventory or having to firesale it.

Rebates won't work as a limiting incentive since the GPU miners are perfectly willing to fake personal information or use straw buyers.

Electric companies have no means to identify GPU miners by power usage either BTW so they won't know who to tax and the GPU miners will set up in other locales that won't tax them anyways since laws vary greatly per state/country.

There is also no reason to believe that current GPU prices are the new norm. Prices were much lower a year or so ago; they went up for various real-world reasons, and they can go down again for various other real-world reasons.

Most miners care primarily about mining and not card resale value, BTW. It's a nice bonus to recoup their hardware outlay costs, but all the sane ones know that by the time they go to sell, market conditions can easily have driven the resale value down, since the reasons they'll be selling are the same as everyone else's, causing a sales glut.

March 24, 2018 | 12:55 PM - Posted by MotherNatureIsNoisyAndStatisticalAtLowSampleRatesAlso (not verified)

"Ray tracing wouldn't be screwed up. It's an inherently noisy and statistical approach to doing lighting. At least the way it's being proposed to be implemented."

Computer ray tracing is only inherently noisy and statistical at low sample/ray-interaction settings, and it's the same for light rays in the natural world if you take the time to look at those famous double-slit experiments. So that's just not the case with respect to intentionally screwing up the math units on consumer GPUs. Intentionally screwing up the math units on consumer GPUs just to placate gamers who only game with their GPUs is really unfair to those Blender 3D/other users who cannot afford the professional GPU SKUs either.
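The low-sample-count noise both posters are arguing about is ordinary Monte Carlo behavior: the standard error of the estimate shrinks roughly as 1/√N, so results are grainy at few samples and smooth at many. A toy Python illustration, using a π estimate as a stand-in for a pixel's light integral (purely illustrative, not renderer code):

```python
import math
import random

def estimate_pi(samples: int, seed: int = 42) -> float:
    # Fraction of random points landing inside the unit quarter-circle, times 4.
    rng = random.Random(seed)
    hits = sum(1 for _ in range(samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / samples

# Error drops roughly as 1/sqrt(N): ~100x more samples for ~10x less noise.
for n in (100, 10_000, 1_000_000):
    print(f"{n:>9} samples -> estimate {estimate_pi(n):.4f} (true {math.pi:.4f})")
```

The same scaling is why real-time ray tracers shoot few rays per pixel and then denoise, rather than waiting for the estimate to converge on its own.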

I'm sure that the professional GPU users, whether SMBs or individual/home-business GPU users, can afford those professional GPU SKUs and can also write them off as a business expense. But students with graphics compute needs (ray tracing/other graphics compute) who are learning, or independent games developers, really like AMD's GPUs because of their non-intentionally-borked compute for those ray tracing/ray interaction calculations accelerated on AMD's loads of available GPU shader cores. AMD's shader-heavy consumer designs are popular with more than gamers, and that's good for AMD's bottom line and will give AMD the revenue growth to maybe consider/afford (GPU tape-outs cost $$$$$$$$+) a gaming-only focused GPU design.

I think that gamers need to realise that they are now and forever going to be competing for GPUs with the miners and the other GPGPU folks who cannot afford the professional-level GPU SKUs, and that any attempts by AMD/Nvidia to illegally set MSRP (GPU makers cannot interfere with retail pricing) will not happen. Any intentional borking of the math units of any consumer GPUs is going to reduce AMD's/others' GPU sales, and AMD's BOD and shareholders will not allow that, what with their power to remove CEOs and anyone else that the BOD and, really, the stockholders think is not holding up to their fiduciary duty to those shareholders that own AMD.

AMD sells GPUs to a lot of different folks currently, and gamers are free to form investment groups and invest in AMD as a group. If it provides enough investment, the investor group could actually get a seat on AMD's Board of Directors and steer AMD towards creating a line of gaming-only focused GPUs with fewer shader cores and loads of ROPs to really fling the frame rates out there for that ePeen pathological-satisfaction sort of gaming need.

Vega 56 has the same number of shader cores and TMUs as the GTX 1080 Ti, and it's only with Vega 56's 64 ROPs compared to the GTX 1080 Ti's 88 ROPs that Vega 56 falls behind the Ti in pixel fill rates. Vega clock rates are a little behind the Ti's clock rates also. So AMD could really rework that Vega 56 design (new base die tape-out) with the same number of shaders/TMUs, 88+ ROPs, and higher clocks, and then that Vega tape-out could beat the GTX 1080 Ti in the pixel fill rates that directly equate to higher FPS. AMD's Vega micro-arch is already very efficient; it's just that those Vega 10 base-die-based Vega 56/64 SKUs are so shader heavy. And for ray tracing and DX12/Vulkan, the more shader cores the better; miners love all the shader cores they can get their hands on.
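The fill-rate argument here is just ROPs × clock. A quick sketch of the arithmetic, where the ROP counts come from the comment and the boost-clock figures are my own approximate assumptions for reference cards, not numbers from the thread:

```python
def peak_fill_gpixels(rops: int, clock_ghz: float) -> float:
    # Theoretical peak pixel fill rate in Gpixels/s: ROPs times core clock.
    return rops * clock_ghz

vega56     = peak_fill_gpixels(64, 1.47)  # Vega 56, ~1.47 GHz boost (assumed)
gtx1080ti  = peak_fill_gpixels(88, 1.58)  # GTX 1080 Ti, ~1.58 GHz boost (assumed)
vega_88rop = peak_fill_gpixels(88, 1.47)  # the hypothetical 88-ROP Vega re-spin

print(f"Vega 56:           {vega56:.0f} Gpix/s")
print(f"GTX 1080 Ti:       {gtx1080ti:.0f} Gpix/s")
print(f"88-ROP Vega (hyp): {vega_88rop:.0f} Gpix/s")
```

On these assumed clocks the ROP bump alone closes most but not all of the gap, which is why the comment also asks for higher clocks or more than 88 ROPs.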

If I were on AMD's BOD I'd be breathing down the CEO's/underlings' necks to get those margins above 45% ASAP by AMD raising its per-GPU-die wholesale pricing to the AIBs and OEMs. In the real world, if you cannot pay you cannot play, and in a free-market economy supply and demand must not be interfered with except for valid legal and regulatory reasons.

And mining operations are businesses, and that ability to rapidly liquidate a stock of GPUs to a larger used-resale market at a higher price, due to there being such great gaming and mining demand, is most definitely on any miner's mind if they have any sort of business acumen.
Miners cannot care about any gamers' suffering, and it's that gaming addiction that makes many gamers such fools with their disposable income. For any miners looking to unload their GPUs at the highest resale prices, the more jonesing gaming addicts the better, if the coin prices fall too low for there to be any justification for remaining in the coin mining business.

There are miners that like to get in low, mine enough coins to pay off the GPUs, then sell high and flip those GPUs out for sometimes more than they cost the miner new, and get out before the coin prices fall too low. And then rinse and repeat if possible.

March 24, 2018 | 06:55 PM - Posted by tts (not verified)

"Computer Ray Tracing is only inherently noisy and statistical at low sample/ray interaction settings"

It's going to be exactly that way for PC in-game ray tracing for some time. Doing very high quality ray tracing is still going to be too expensive. That is why they're only using ray tracing for part of the rendering process, too. They'll still be using traditional renderers to do most of the work because they have to.

I'd also point out that getting a ray tracing operation wrong every second or so isn't going to be visible to the naked eye. You'll literally not be able to see it, since it'll only be a pixel or 2 off (depends on how exactly things are handled in the renderer) for a frame every now and again.

"I'm sure that the professional GPU users, whether SMBs or individual/home-business GPU users, can afford those professional GPU SKUs, and can also write them off as a business expense. But students..."

You realize there are FirePro cards that sell for as little as $200 new, right? You don't have to spend $1000+ for a FirePro W8100 if you just want to doodle around and test stuff out as an amateur developer. Used cards are available too if you want them. Refurb Nvidia Keplers are out for less than $400 as well.

"I think that gamers need to realise that they are now and forever going to be competing for GPUs with the miners"

The future will not be exactly like the past, and mining as it currently exists is dependent on a lot of things not changing at all. If governments continue to ban it and/or regulate it, which is highly likely, you're going to see a big shift by default, for instance.

"gamers are free to form investment groups"

Hahahha are you serious?! You might as well argue for them to all go out and win lotto tickets to afford GPUs.

"So AMD could really rework that Vega 56 design(New Base Die tape-out)"

The problem with Vega 56 and all their current GPUs is that the uarch they're based on was getting old 2 years ago, and they really needed to have a replacement ready to go instead of releasing Vega. On top of that, GCN wasn't really quite the right fit for the gaming PC market; it was more focused on GPGPU, which never really took off, and HBM turned out to be a bad bet. It takes years to do a new uarch, though, and AMD didn't have the R&D resources, so they're stuck with GCN all the way for now. Even Navi will be based on GCN, so don't expect anything amazing there either.

"If I were on AMD's BOD I'd be breathing down the CEO's/underlings' necks to get those margins above 45% ASAP by AMD raising its per-GPU-die wholesale pricing to the AIBs and OEMs."

Hahahahaha, you'd be booted off the BoD fast with that game plan. Why? Because AIBs and OEMs are only interested in paying the minimum for AMD GPUs, because they know they're not that good right now vs the competition's in terms of both total performance and perf/watt. They have to compete on price; they have no choice!! AMD can't just force everyone to pay nearly 50% more for no reason at all. Even Intel couldn't do something like that back in the days when they were throwing their weight around to get OEMs to buy P4s!! If Intel has to compete on price, even in shady back-room-deal situations, why in the world would you expect AMD to somehow strong-arm the OEMs and AIBs into paying nearly 50% more for nothing?!!

"Mining operations are businesses"

Then they know about wear and tear on hardware leading to price depreciation, plus market sales conditions, right? It's a really nice idea to be able to sell your GPU for exactly what you paid for it, but everyone with half a brain knows that is unlikely for what are now in some cases 3-year-plus-old GPUs that won't perform all that great in newer games, that have been run hard, often flashed with custom mining BIOSes and in need of reflashing, and that are selling into a flooded market since everyone else will be trying to sell off too.

You're basically hoping for a Greater Fool to come along and buy your stuff at an inflated price for BS reasons. And everyone who knows how to run a business knows you can't rely on a Greater Fool scenario to keep yourself solvent.

March 24, 2018 | 08:10 PM - Posted by NoBigWorrysForAMDGoingForward (not verified)

"It's going to be exactly that way for PC in-game ray tracing for some time. Doing very high quality ray tracing is still going to be too expensive. That is why they're only using ray tracing for part of the rendering process, too. They'll still be using traditional renderers to do most of the work because they have to."

Blender 3D/other software ray tracing for graphics is more important than gaming for some; no one but gamers cares about gaming on Vega. FPS is not even on my radar! Only gamers have to worry about that FPS nonsense, so let gamers worry about traditional renderers and that crappy gaming image quality. Graphics arts folks and animators like ray tracing on AMD's consumer variants too, just like miners like hashing on Vega's shader cores.

The "FirePro" branding does not exist any longer; it's Radeon Pro "WX" now, with that "WX" denoting professional graphics/compute. And AMD's WX-branded Vega-based SKUs with 4096 shader cores are sure not costing only $200.

"Then they know about wear and tear on hardware leading to price depreciation, plus market sales conditions, right? It's a really nice idea to be able to sell your GPU for exactly what you paid for it, but everyone with half a brain knows that is unlikely for what are now in some cases 3-year-plus-old GPUs that won't perform all that great in newer games, that have been run hard, often flashed with custom mining BIOSes and in need of reflashing, and that are selling into a flooded market since everyone else will be trying to sell off too.

You're basically hoping for a Greater Fool to come along and buy your stuff at an inflated price for BS reasons. And everyone who knows how to run a business knows you can't rely on a Greater Fool scenario to keep yourself solvent.
"

Mining GPUs are usually under-volted and run for maximum hashes/watt, and are nowhere near as stressed as gaming GPUs. So that's really not a problem for adding to the degradation of any mining GPUs. AMD will continue to sell GPUs to anyone who will buy them, and gamers can just get in there and pay what that market demand will bring as far as GPU pricing; the more demand, the higher the prices will be if production (supply) cannot keep up with demand.

There are no greater fools on this planet, nor will there ever be in this planet's history, a greater fool than the bog-standard gamer.

"Hahahahaha, you'd be booted off the BoD fast with that game plan. Why? Because AIBs and OEMs are only interested in paying the minimum for AMD GPUs, because they know they're not that good right now vs the competition's in terms of both total performance and perf/watt. They have to compete on price; they have no choice!! AMD can't just force everyone to pay nearly 50% more for no reason at all. Even Intel couldn't do something like that back in the days when they were throwing their weight around to get OEMs to buy P4s!! If Intel has to compete on price, even in shady back-room-deal situations, why in the world would you expect AMD to somehow strong-arm the OEMs and AIBs into paying nearly 50% more for nothing?!!"

Some MINING interests are buying dies and contracting out their own in-house AIB sorts of configurations, so AMD can sell there at a higher margin. AMD sells more Vega 10 dies to the professional market for AMD's Radeon Pro WX 9100s and its Radeon Instinct MI25 compute/AI cards. That Project 47 petaflops supercomputer-in-a-cabinet uses 20 Epyc 32-core CPUs/SP3 MBs and 80 Vega 10 base-die-based Radeon Instinct MI25s per cabinet.

The miners and gamers, Blender 3D folks also, get to fight over the lower-binned Vega 10 dies that don't make the binning grade to become MI25s or WX 9100s, and AMD gets a proper mark-up for its Vega 10 die-based professional compute/AI GPU accelerator SKUs. Those AIB/OEM partners are very ready to pay AMD a higher markup for their Vega 10 dies, as the miners will pay an even greater markup just to get Vega 64's 4096 shader cores. Until the Vega GPU demand dries up, there can be greater margins to be had by AMD for its top Vega consumer GPU dies. AMD needs to get those margins above 45% to attract more investors.

There will be no gimping of Vega shaders to placate gamers! AMD will not be doing that, or need to worry about any flagship SKUs, as that's not where the most GPU revenues come from. AMD has its Zen/Zeppelin die-based CPUs to save the company's bacon, along with Raven Ridge APUs and professional Vega-based GPUs to complement Epyc in the server rooms and AI/inferencing farms. AMD can take its time and just keep humoring the flagship gaming freaks. AMD's Vega discrete GPUs are always out of stock waiting on the next resupply run, so that ball is in AMD's court.

March 25, 2018 | 05:07 AM - Posted by tts (not verified)

*looks at wall of nonsense text*

*walks out of thread while laughing*

March 25, 2018 | 11:31 AM - Posted by MadnessSpaffsEloquentMaybeNot (not verified)

And now you empathise with how others feel about that totally ridiculous statement:

"A hardware approach to preventing GPUs from being used for mining would be to introduce single-bit errors every now and again into the work done by the hardware. The errors won't be visible in game and mining would be effectively sabotaged on those cards."

GPUs are not the domain of only gamers, and there can be no moves made to gimp any GPUs to placate gamers.

[The Trolling starts Here]

Gamers are asking too much without putting up the money it takes for AMD to orient its GPUs towards only-gaming usage. AMD had funds for only one Vega 10 base die tape-out at the time. What a load of utter nonsense from these self-entitled gamers, gamers who will not even pay a proper markup, to expect such treatment from an AMD with so little market share in the discrete gaming GPU marketplace.
Gamers need to start hoping that Epyc/Radeon Pro WX and Radeon Instinct MI25 sales will take off and take precedence, so AMD can get those margins above 45% and attract more long-term investment/investors. More investment means more money for AMD to Nvidia-up (Nvidia has so many billions more dollars) its gaming-focused GPU offerings.

Gamers, stop your crying for new GPU micro-archs every year, and maybe start asking for more ROPs from AMD in a new Vega tape-out with at least 88 ROPs to match the GTX 1080 Ti's 88 ROPs and high pixel fill rates. Vega 56 is two-thirds of the way there to matching the GTX 1080 Ti, as Vega 56 has the exact same number of shader cores and TMUs as the GTX 1080 Ti. So all AMD has to do is bump up Vega 56's ROP count from 64 to 88 and Vega 56 will match the GTX 1080 Ti in resources.
Hell, AMD could use 96 ROPs and match GP102's full complement of ROPs, and that new Vega tape-out could beat the GTX 1080 Ti at that pixel fill rate metric that directly translates to higher FPS metrics.

Navi's GPU micro-arch is not going to be about that many changes relative to Vega's micro-arch. Navi is going to be modular like Zen/Zeppelin and allow AMD to very quickly and efficiently create a top-to-bottom line of GPU SKUs, and easily allow AMD to create many Navi variants, via those modular/scalable Navi dies/chiplets, to cover the whole GPU market of consumer GPUs, from low end to mainstream to flagship, and professional compute/AI also.

[More Trolling on unrelated subject of interest follows]

Really, gamers, AMD's explicit primitive shaders are still there to be made use of by games/gaming-engine developers, while implicit primitive shaders, via some driver-middleware layers, are still a work in progress for some open source developers; AMD has limited ->$<-. Implicit primitive shaders are for legacy games/gaming engines that are not designed to take advantage of Vega's explicit primitive shaders. AMD fanboys (nutjobs red or green), stop fudding up your own Red Team r/amd posts with your wholesale ignorance of just what explicit primitive shaders are and how the games/gaming-engine developers can make use of that Vega primitive shader IP in new games/gaming engines.

That's a rather large and expensive task, writing the driver/middleware shader code translation layers to translate legacy gaming/gaming-engine shader kernels over implicitly to primitive shader kernels to make use of Vega's explicit primitive shader hardware. If you have half a billion dollars to give to AMD/others, then maybe that implicit primitive shader project can be restarted on your dime.

[OK off that subject and on to more Trolling about something else]

I'd rather that AMD delay Navi for the consumer markets by about 6 months and work on fixing its Raven Ridge APU driver issues and getting some new Vega GPU micro-arch based tape-outs with more ROPs and higher pixel fill rates. AMD needs to provide its laptop OEM clients with some driver team support funded by AMD, so those Raven Ridge APU SKUs can get more driver updates from the laptop OEMs. This is what Nvidia does with its laptop OEM partners, and AMD needs to get that laptop driver support help to laptop OEMs; otherwise AMD will not get laptop/Raven Ridge sales, no matter how good the Raven Ridge/Vega integrated graphics is.

[More trolling below about: the fair and free markets that are not so fair and free (CPU and GPU markets), what with all those fat brown envelopes so full of incentives going around, and/or coercive acts towards forcing vertical integration of third-party OEMs' gaming branding under that big GPU monopoly's control in the GPU market.]

The way things are going in the PC/laptop market, with AMD on the receiving end of abuse from both Nvidia and Intel, AMD may just have to commission its own in-house laptop designs from some sub-contractor. That way AMD can maybe get some laptop designs that use generic graphics drivers instead of those OEM-customized graphics drivers that the OEMs never fully support. Maybe the laptop OEMs are being incentivized by some BIG CPU monopoly market-share holder to not provide proper and timely driver support on AMD/Raven Ridge based OEM laptops, and that's a bit more insidious than that single-memory-channel borking or putting AMD's Raven Ridge APUs in inferiorly designed/cooled laptop hardware. The big GPU monopoly is up to the same sorts of skulduggery now against AMD as that big CPU monopoly was way back when, and still is, to a lesser degree, right now.

[It's a Wall-O-Text, all while on a soapbox and wearing a tinfoil hat, with this diatribe also trying to be informative (somewhat). It's trollformative (trolling and informative) by design.]

March 25, 2018 | 01:46 PM - Posted by WhyMe (not verified)

https://www.youtube.com/watch?v=6gLMSf4afzo

March 26, 2018 | 11:16 AM - Posted by TheMetaphysicalTerrorOfTheModularScalableGPUsDIEs (not verified)

That's not getting watched; use words to convey meanings!

Memes don't work so well here, as it's not really as bad as WCCFTech, most of the time. But "why" you indeed, and do not feed me that load! And do not invite trolling by posting absurd things like:

"A hardware approach to preventing GPUs from being used for mining would be to introduce single-bit errors every now and again into the work done by the hardware. The errors won't be visible in game and mining would be effectively sabotaged on those cards."

AMD would never intentionally do anything to reduce the accuracy of its hardware just to placate gamers. A miner's money, or a graphics arts (non-gaming graphics) user's money, is just as good as any gamer's money to a business like AMD. AMD is in business to sell CPUs and GPUs to any and all markets, and not to be like Nvidia, so totally dependent on gamers that Nvidia is willing to risk fines for its current market tactics to protect that dependency on gamers for over 50% of Nvidia's overall revenues.

Zen is what has saved AMD's bacon, and just like AMD's Opteron revenues back in the day, Zen/Epyc will allow AMD to live long and make mad revenue gains year on year for some years to come. Stop whining, you flagship GPU freaks, and wait for AMD's revenues to return, as flagship gaming GPUs are not where the revenues are for any GPU maker. It's the professional GPU markets, with their proper markups, that are the only rational way to turn a profit and create the revenues that allow any GPU maker to create flagship gaming GPU SKUs in the first place; flagship gaming GPUs are a loss leader for AMD, and to a lesser degree for Nvidia also.

AMD has the technology already in Vega to beat Nvidia, but AMD does not currently have the money to spend on as many costly base die tape-outs as Nvidia can afford.

Look at Vega 56: it's already almost a GTX 1080 Ti, with Vega 56 having the same number of shader cores and TMUs as the Ti. Take Vega 56 and give it 88 ROPs (the Ti has 88 ROPs) instead of Vega 56's current complement of 64 ROPs, and Vega 56 will match the GTX 1080 Ti in ROP count also. Let's give that new Vega SKU a little more than 88 ROPs, and then it will be able to compete in pixel fill rates without having to have clock rates as high as the GTX 1080 Ti's. Let's rebuild Vega 56 as a new Vega die tape-out with 96 ROPs, and maybe a few more shader cores too if needed, and that's getting into full GP102 territory.

Come to think of it, maybe AMD will offer a dual Vega 56 variant on a single PCIe card and use the Infinity Fabric to tie/interface the 2 Vega 56 dies to each other into a single larger logical GPU/processor.
AMD could do this dual-Vega-56-dies-on-a-single-PCIe-card arrangement, interfaced with the Infinity Fabric, as a prelude to Navi! Navi will be doing the same thing with the Infinity Fabric, using smaller, more modular Navi dies/chiplets. That would be easy for AMD to accomplish if there were enough binned Vega 10 base dies available.

Nvidia FEARS any modular Navi dies/chiplets, Zen/Zeppelin-like, sort of modular processor-die arrangement for GPUs, where AMD can quickly make an entire line of GPUs, low end to flagship, out of a bunch of smaller modular Navi GPU dies/chiplets at much higher die/wafer yields. Nvidia is in metaphysical terror of AMD's Navi, and it's not because of Navi's GPU micro-arch being that much different from Vega's. Nvidia is in metaphysical terror of Navi because a Navi modular, scalable GPU design represents the same sort of competition to Nvidia as AMD's Zen/Zeppelin die represents to Intel! And we all know how Zen/Zeppelin in the form of Threadripper so fustigated Intel's high-margin HEDT SKU pricing in the CPU market.

Look at Nvidia's current market tactics and see the metaphysical terror at Nvidia at the thought of a modular/scalable Navi GPU die and what that modular Zen/Zeppelin CPU die did to Intel's high-margin HEDT revenues. Nvidia is in an OMG, OMFG sort of mode thinking about what happened to Intel's high-margin HEDT CPU mark-ups when Threadripper's modular/scalable Zen/Zeppelin came online. And if Navi is that same Zen/Zeppelin sort of modular/scalable line of GPUs, then, well, what will happen to Nvidia if the miners are not able to eat up all of AMD's modular/scalable Navi die production? And currently the Zen/Zeppelin dies come off the diffusion lines at GF with such high yields that AMD has not too much trouble there.

Nvidia's CEO looks at the idea of a possible Navi modular/scalable GPU die in the same context as that AMD Zen/Zeppelin die, and it's more than just like Wile E. Coyote looking up at that big red boulder coming down at him and his little umbrella. No, it's more like that moon coming at Scratchy, but it's not really the Earth's moon; it's that big black hole that swallows up any and all high margins, and Nvidia is all about its high margins if it's about anything at all. The Horror!

March 26, 2018 | 01:08 PM - Posted by WhyMe (not verified)

I don't have a clue what you said, as I'm not about to read the drivel and walls of text you insist on subjecting everyone to.

The comments section of PCPer is meant to be about commenting on the actual article. It's not the place for you to stand on your soapbox and lecture everyone; if you can't say it in a couple of sentences you should take it to the forums.

March 26, 2018 | 02:58 PM - Posted by HitTheScienceBooksGamersMaybeTheOtherRsAlso (not verified)

Ah ha ha ha! You are beside yourself with that reply. And still letting the Inform-A-Troll lead. And gaming has become such a laughing-stock part of the larger technology world. That long, meandering drivel takes a back seat to the short, very daft, and disingenuous drivel that comes via your hunt-and-peck gobbledygook.

[Trolling for Dafty Ducks! Hook, Line, and Sinker. Starts here]

Gamers are the most self-entitled bunch of whiners on the face of the earth. And please do not post links without including the link's (video, article, or otherwise) TITLE in your post; do you expect folks to read your mind?

Mainstream gaming, not flagship gaming, is the only way that most of the little Eric fatazz basement dwellers will have to game. The Glorious PC Master Race does not even have deep enough pockets to afford anything but mainstream.
How about those Vega sales figures? And those Epyc CPU revenues will dwarf any gaming revenues for AMD.

Troll-ra-loll-ra-loll-ral,
Troll-ra-loll-ra-li,
Troll-ra-loll-ra-loll-ral,
Hush, now don't you cry!

Troll-ra-loll-ra-loll-ral,
Troll-ra-loll-ra-li,
Troll-ra-loll-ra-loll-ral,
That's a Trollish lullaby.

Now, all you flagship GPU freaks, pony up $500 million and maybe AMD can make your ePeen a little bigger. Those tape-outs and mask sets are damn costly, and so are all those validation steps. The future is modular and scalable, with that wonder glue to tie things up right nicely for the ones that have the IP!

March 24, 2018 | 01:43 PM - Posted by Rick0502 (not verified)

Always a great listen, keep it up.

Can I make you an offer? How about I help you clean up your office and you let me keep some stuff? Win, win, win? As a bonus, if I do 30 pull-ups you throw in something extra?

March 25, 2018 | 04:19 PM - Posted by Josh Walrath

I might consider this...

March 26, 2018 | 11:37 AM - Posted by TheMetaphysicalTerrorOfTheModularScalableGPUsDIEs (not verified)

Whisky is the better trade for stuff. Really good wobble water! [Scratches his throat, Whisky]

March 26, 2018 | 01:01 AM - Posted by Kangaroo Mail Bag (not verified)

Why are there no external monitors like the Surface Studio's? When the Surface Studio first came out, everyone agreed that the hardware was sub-par and the screen was exceptional. Apart from panel cost, is there a reason the market is going for ultrawides instead of the more usable 3:2 aspect ratio?
