PCPer Mailbag #13 - 10/13/2017

Subject: Editorial | October 13, 2017 - 09:00 AM |
Tagged: video, Ryan Shrout, pcper mailbag, pcper

It's Friday, which means it's time for PC Perspective's weekly mailbag, our video show where Ryan and team answer your questions about the tech industry, the latest and greatest hardware, the process of running a tech review website, and more!

Here's what you'll find on today's show:

00:32 - Successor to ATX design standard?
02:42 - 64-bit vs. 32-bit Windows gaming performance?
04:03 - What comes after Windows 10?
05:33 - How to save SLI and CrossFire?
07:59 - How does a CPU/GPU go from wafer to shipped product?
10:00 - The maturity of Ryzen since launch?
13:54 - Windows 7 security updates with Kaby Lake?
16:11 - Comparing new CPUs to older generations?
18:14 - Did Intel see Ryzen's good performance coming?
22:09 - Node shrinks and power usage?
24:21 - Gone fishin'?

Be sure to subscribe to our YouTube Channel to make sure you never miss our weekly reviews and podcasts, and please consider supporting PC Perspective via Patreon to help us keep videos like our weekly mailbag coming!

Source: YouTube

October 13, 2017 | 09:27 AM - Posted by JohnGR

Developing a product doesn't mean marketing a product. They could have had 6-core mainstream processors since the Broadwell era but never felt the need to market them, so they never got out of the lab. Intel does have a large R&D budget to create products that will never see the light of day, but which offer them the chance to get some things done well before they feel the need to market such a product.

14- to 18-core Skylake-X models probably weren't on Intel's roadmap before Threadripper. If they were, motherboard manufacturers would probably have had better-prepared motherboards from day one; they would have known what was coming and that they would need to support it. There were also some leaks in the past of 6-core models running on Z270 motherboards. Who can say those weren't 6-core Kaby Lake models that never saw the light of day, and were instead tweaked and rebranded as Coffee Lake models needing a new chipset?

Tinfoil theories? Maybe. Or maybe not. A company like Intel that can spend billions on R&D SHOULD have backup plans running in the background, in parallel with retail products.

October 13, 2017 | 08:41 PM - Posted by Dark_wizzie

When I asked the question I was thinking... Okay, maybe Intel didn't make a nimble ninja pivot and release something from nothing. But even the public knew Ryzen was coming years ago. Last time around, Intel was willing to sell us a crappy 10-core for $1,700. Now all of a sudden mainstream breaks the 4-core barrier and HEDT goes up to 18? I think it's crazy to think such a rapid increase in core count had nothing to do with Ryzen. It just seems too coincidental.

October 14, 2017 | 05:13 AM - Posted by JohnGR

It would have been strange for Intel not to be able to persuade one of the motherboard manufacturers to do an unofficial test run of an early Ryzen system. What they probably didn't know were the final frequencies and prices of Ryzen processors, and they probably only learned about Threadripper a couple of months before AMD's announcement. In Threadripper's case, probably even the motherboard manufacturers didn't know about its role as an HEDT alternative. I wouldn't be surprised if AMD had been using EPYC chips to test that platform as a professional single-socket system, not as an AMD HEDT platform, surprising everyone.

October 13, 2017 | 10:16 AM - Posted by Anonymously Anonymous (not verified)

thumbnail caption:

"Allyn, what do you mean by 'come pull my finger'? "

October 13, 2017 | 10:25 AM - Posted by Benjamins (not verified)

When are we going to see the X399 NVMe RAID tests?

October 13, 2017 | 02:59 PM - Posted by Allyn Malventano

I've been testing X399 RAID. It hasn't been consistent enough to publish on yet. We're working with AMD to get to the bottom of the issue.

October 14, 2017 | 05:20 AM - Posted by JohnGR

[Humor ON] I have a friend who only feels good with Intel CPUs and systems. Every time he touches an AMD system, that system will either have stability problems or performance problems. Even systems that I used without any problems become problematic in his hands. He has never had a good experience with AMD. I mean, he feels better with a Core 2 Duo than with a quad-core Phenom II. Maybe it's the same case? [/Humor OFF]

October 16, 2017 | 12:25 PM - Posted by Allyn Malventano

I hope so, because I wouldn't wish what this is doing on anyone else...

October 13, 2017 | 11:41 AM - Posted by CFandSLIareVeryDead (not verified)

"How to save SLI and CrossFire?" They're dead, Jim!

It's all in the hands of the gaming industry to make use of the Vulkan/DX12 APIs and the GPU makers' respective to-the-metal DX12/Vulkan drivers. Explicit multi-GPU in the API (DX12/Vulkan) is now going to be the method used to load-balance between two or more GPUs.

So the game engine makers will have to be the ones to create, in their engine SDKs, the necessary software tools and middleware for multi-GPU. Maybe the entire gaming industry can create a standard multi-GPU software toolchain/middleware package that all the engine SDKs can make use of, but I'd expect the engine makers will compete with each other over which engine has the best multi-GPU (via DX12/Vulkan) support.

So as the engine makers and game makers begin to offer more of their own DX12/Vulkan-enabled, engine-based games, multi-GPU gaming support will begin to take hold.

DX12 and Vulkan both support to-the-metal GPU drivers, so the engine makers can do multi-GPU on their own. Graphics-API-managed multi-GPU load balancing has a lot of potential: it lets game developers, via the game engines, put every GPU in a PC/laptop to work on gaming graphics or gaming compute. That includes integrated graphics, which might be used to accelerate game physics while the discrete GPU(s) handle the graphics. But it's going to take time to develop the software/engine ecosystem that can make use of any GPU plugged into a laptop/PC for gaming graphics or gaming compute. Khronos, the industry group responsible for Vulkan, is developing more tools and is also merging OpenCL into its Vulkan(1) standard.

(1)

Breaking: OpenCL Merging Roadmap into Vulkan

https://www.pcper.com/reviews/General-Tech/Breaking-OpenCL-Merging-Roadm...

October 13, 2017 | 12:21 PM - Posted by Streetguru

Why does no one know what a bottleneck means for gaming?

Would you agree with the simplification that:
CPUs bottleneck refresh rate (frame rate)
GPUs bottleneck resolution

Since you can scale your GPU load down to 480p at lowest settings to find its max fps, but you can't do much to make your CPU faster in the same way.

October 13, 2017 | 12:38 PM - Posted by GrumpyCat (not verified)

Can you please add a DAW (Digital Audio Workstation) to your benchmark suite?

October 14, 2017 | 02:45 AM - Posted by dtkflex

Are there any FreeSync 2 monitors available for sale yet, or any OLED monitors, since there are already phones, TVs, and laptops with OLED screens?
Where are all the HDR monitors that were shown off at the last CES?
Or are they just so unaffordable that I haven't been looking in the right places (OEM-only or enterprise-class devices)?

October 14, 2017 | 02:40 PM - Posted by Anonymousss (not verified)

Ryan you did answer my question, thanks. I know it wasn't very well phrased.
