Podcast #438 - Vulkan, Logitech G213, Ryzen Preorders, and more!

Subject: Editorial | February 23, 2017 - 05:16 PM |
Tagged: vulkan, ryzen, qualcomm, Qt, mesh, g213, eero, corsair, bulldog

PC Perspective Podcast #438 - 02/23/17

Join us for Vulkan one year later, the Logitech G213 keyboard, eero home mesh networking, Ryzen preorders, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Allyn Malventano, Ken Addison, Josh Walrath, Jeremy Hellstrom

Program length: 0:58:01

Podcast topics of discussion:
  1. Week in Review:
  2. News items of interest:
  3. Hardware/Software Picks of the Week
    1. Allyn: SS64.com - Nifty programmer's reference for scripting, web, db
    2. Ken: Dell refurbished XPS 13
  4. Closing/outro
 


The Qt Company Announces Qt 3D Studio

Subject: General Tech | February 20, 2017 - 09:46 PM |
Tagged: vulkan, Qt, nvidia

NVIDIA has just donated their entire DRIVE Design Studio to The Qt Company, who will form it into Qt 3D Studio. This product will be a visual editor for 3D user interfaces, where layers of 2D and 3D objects can be created, animated, and integrated into C++ applications. It will take them a little while to clean it up for public consumption, but it will eventually be available under the commercial / open-source dual-license that users of Qt are accustomed to.

If you’re not familiar with the Qt Framework, then, basically, think of a cross-platform, open-source alternative to the .NET Framework, although it is based on unmanaged C++. (It also competes with GTK+. This isn’t a major point, but I would like it to be clear that it’s not a two-person race between one proprietary and one open-source player.) When AMD updated their graphics drivers to Crimson Edition and flaunted huge speed-ups, it was mostly because they switched the control panel's UI framework from .NET to Qt.

As an aside, The Qt Company joined the Khronos Group on the day that Vulkan launched, which was almost exactly a year ago, and they are actively working on integrating the API into their framework. Combined with today’s announcement, it’s not hard to imagine how much easier it will be, someday, to create efficient and beautiful UIs.

Update: Speaking of which, The Qt Company is apparently planning to release Vulkan support with Qt 5.10.
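
If that pans out, the natural shape of it would be something like the following: a minimal, hypothetical sketch of a Vulkan-backed Qt window. The class names (QVulkanInstance, QVulkanWindow) and API are my assumption, since Qt's Vulkan support had not shipped at the time of writing.

    // Hypothetical sketch only: QVulkanInstance / QVulkanWindow are assumed
    // names for Qt's planned Vulkan wrappers, not a shipped API.
    #include <QGuiApplication>
    #include <QVulkanInstance>
    #include <QVulkanWindow>

    int main(int argc, char *argv[])
    {
        QGuiApplication app(argc, argv);

        QVulkanInstance instance;              // would wrap VkInstance creation
        if (!instance.create())
            qFatal("Failed to create Vulkan instance");

        QVulkanWindow window;                  // a QWindow presenting via a Vulkan swapchain
        window.setVulkanInstance(&instance);
        window.resize(1280, 720);
        window.show();

        return app.exec();
    }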

Manufacturer: PC Perspective

Living Long and Prospering

The open fork of AMD’s Mantle, the Vulkan API, was released exactly one year ago with, as we reported, a hard launch. That meant public (but not main-branch) drivers for developers, a few public SDKs, a proof-of-concept patch for The Talos Principle, and, of course, the ratified specification. This set the API up to find success right out of the gate, and we can now look back over the year since.

khronos-2017-vulkan-alt-logo.png

Thor's hammer, or a tempest in a teapot?

The elephant in the room is DOOM. This game has successfully integrated the API and it uses many of its more interesting features, like asynchronous compute. Because the API is designed in a sort-of “make a command, drop it on a list” paradigm, the driver is able to select commands based on priority and available resources. AMD’s products got a significant performance boost, relative to OpenGL, catapulting their Fury X GPU up to the enthusiast level that its theoretical performance suggested.
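
To make that "command list" idea concrete, here is a rough sketch (my own illustration, not anything from id Software) of the core Vulkan pattern: record commands into a command buffer, then hand the whole buffer to a queue and let the driver schedule it.

    // Minimal sketch of Vulkan's record-then-submit model. Assumes the command
    // buffer, queue, and fence were created elsewhere; error checking omitted.
    #include <vulkan/vulkan.h>

    void recordAndSubmit(VkCommandBuffer cmd, VkQueue queue, VkFence fence)
    {
        VkCommandBufferBeginInfo beginInfo = {};
        beginInfo.sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO;
        vkBeginCommandBuffer(cmd, &beginInfo);

        // ... record draws here, or vkCmdDispatch() for async compute work ...

        vkEndCommandBuffer(cmd);

        // Drop the finished command buffer onto a queue. The driver decides when,
        // and on which hardware engine, the commands actually execute.
        VkSubmitInfo submitInfo = {};
        submitInfo.sType = VK_STRUCTURE_TYPE_SUBMIT_INFO;
        submitInfo.commandBufferCount = 1;
        submitInfo.pCommandBuffers = &cmd;
        vkQueueSubmit(queue, 1, &submitInfo, fence);
    }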

Mobile developers have been picking up the API, too. Google, who is known for banishing OpenCL from their Nexus line and challenging OpenGL ES with their Android Extension Pack (later integrated into OpenGL ES with version 3.2), has strongly backed Vulkan. The API was integrated as a core feature of Android 7.0.

On the engine and middleware side of things, Vulkan is currently “ready for shipping games” as of Unreal Engine 4.14. It is also included in Unity 5.6 Beta, which is expected for full release in March. Frameworks for emulators are also integrating Vulkan, often just to say they did, but sometimes to emulate the quirks of these systems’ offbeat graphics co-processors. Many other engines, from Source 2 to Torque 3D, have also announced or added Vulkan support.

Finally, for the API itself, The Khronos Group announced (pg 22 from SIGGRAPH 2016) areas that they are actively working on. The top feature is “better” multi-GPU support. While Vulkan, like OpenCL, allows developers to enumerate all graphics devices and target each one individually with work, it doesn’t have certain mechanisms, like the ability to directly ingest output from one GPU into another. They haven’t announced a timeline for this.
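
For reference, the "enumerate and target individually" part is already straightforward in Vulkan today; a quick sketch of it is below (plain Vulkan calls, no vendor extensions). What is still missing is a standard way to share work and results between the devices you get back.

    // Sketch: list every Vulkan-capable GPU in the system. Each VkPhysicalDevice
    // can then get its own VkDevice, queues, and command buffers independently.
    #include <vulkan/vulkan.h>
    #include <vector>

    std::vector<VkPhysicalDevice> listGpus(VkInstance instance)
    {
        uint32_t count = 0;
        vkEnumeratePhysicalDevices(instance, &count, nullptr);      // query count

        std::vector<VkPhysicalDevice> gpus(count);
        vkEnumeratePhysicalDevices(instance, &count, gpus.data());  // fill handles
        return gpus;
    }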

Vulkan is not extinct; in fact, it might be about to erupt

Subject: General Tech | February 15, 2017 - 06:29 PM |
Tagged: vulkan, Intel, Intel Skylake, kaby lake

The open Vulkan API just received a big birthday present from Intel, which added official support for it on Skylake and Kaby Lake processors under Windows 10.  We have seen adoption of this API from a number of game engine designers: Unreal Engine and Unity have both embraced it, the latest DOOM release was updated to support Vulkan, and there is even a Nintendo 64 renderer that runs on it.  Ars Technica points out that both AMD and NVIDIA have been supporting this API for a while and that we can expect to see Android implementations of this close-to-the-metal solution in the near future.

khronos-2016-vulkanlogo2.png

"After months in beta, Intel's latest driver for its integrated GPUs (version 15.45.14.4590) adds support for the low-overhead Vulkan API for recent GPUs running in Windows 10. The driver supports HD and Iris 500- and 600-series GPUs, the ones that ship with 6th- and 7th-generation Skylake and Kaby Lake processors."

Here is some more Tech News from around the web:

Tech Talk

Source: Ars Technica

WebKit Proposal for WebGPU

Subject: General Tech | February 9, 2017 - 03:46 AM |
Tagged: webkit, webgpu, metal, vulkan, webgl

Apple’s WebKit team has just announced their proposal for WebGPU, which competes with WebGL to provide graphics and GPU compute to websites. Being from Apple, it is based on the Metal API, so it has a lot of potential, especially as a Web graphics API.

Okay, so I have mixed feelings about this announcement.

apple-2017-webkit-logo.png

First, and most concerning, is that Apple has attempted to legally block open standards in the past. For instance, when The Khronos Group created WebCL based on OpenCL, which Apple owns the trademark and several patents to, Apple shut the door on extending their licensing agreement to the new standard. If the W3C considers Apple’s proposal, they should be really careful about what legal control they allow Apple to retain.

From a functionality standpoint, though, this is very interesting. With the aforementioned death of WebCL, as well as the sluggish progress of WebGL compute shaders, there’s a lot of room to use one (or more) GPUs in a system for high-end compute tasks. Even if you are not interested in gaming in a web browser (although many people are, especially if you count the market that Adobe Flash dominated for the last ten years), you might want to GPU-accelerate photo and video tasks. Having an API that allows this would be very helpful going forward, although, as stated, others are working on it, like The Khronos Group with WebGL compute shaders. On the other-other hand, an API that allows explicit multi-GPU would be even more interesting.

Further, it sounds like they’re even intending to ingest byte-code, like what DirectX 12 and Vulkan are doing with DXIL and SPIR-V, respectively, but it currently accepts shader code as a string and compiles it in the driver. This is interesting from a security standpoint, because it obfuscates what GPU-executed code consists of, but that’s up to the graphics and browser vendors to figure out... for now.

So when will we see it? No idea! There’s an experimental WebKit patch, which requires the Metal API, and an API proposal... a couple blog posts... a tweet or two... and that’s about it.

So what do you think? Does the API sound interesting? Does Apple’s involvement scare you? Or does getting scared about Apple’s involvement annoy you? Comment away! : D

Source: WebKit

NVIDIA Releases Vulkan Developer 376.80 Beta Drivers

Subject: Graphics Cards | February 3, 2017 - 10:58 PM |
Tagged: nvidia, graphics drivers, vulkan

On February 1st, NVIDIA released a new developer beta driver that fixes a couple of issues with their Vulkan API implementation. Despite what some sites have been reporting, you should not download it to play games that use the Vulkan API, like DOOM. In short, it is designed for developers, not end users. The goal is to provide correct results when software interacts with the driver, not the best gaming performance or anything like that.

khronos-2016-vulkanlogo2.png

In a little more detail, it looks like 376.80 implements the Vulkan 1.0.39.1 SDK. This update addresses two issues with accessing devices and extensions, under certain conditions, when using the 1.0.39.0 SDK. 1.0.39.0 was released on January 23rd, and thus it will not even be a part of current video games. Even worse, this driver, like most graphics drivers aimed at software developers, is based on the older GeForce 376 branch, so it won’t even have NVIDIA’s most recent fixes and optimizations. NVIDIA does this so they can add or change the features that Vulkan developers require without needing to roll in patches every time they make a "Game Ready" optimization or something. There is no reason to use this driver unless you are developing Vulkan applications and want to try out the new extensions. It will eventually make it to end users... when it's time.
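
If you are one of those developers, the usual first step is simply asking the driver what it exposes before relying on anything new; a small sketch of that check follows (which extension you look for is up to you).

    // Sketch: check whether the installed Vulkan driver exposes a given
    // instance extension, e.g. before enabling it at vkCreateInstance time.
    #include <vulkan/vulkan.h>
    #include <cstring>
    #include <vector>

    bool hasInstanceExtension(const char *name)
    {
        uint32_t count = 0;
        vkEnumerateInstanceExtensionProperties(nullptr, &count, nullptr);

        std::vector<VkExtensionProperties> props(count);
        vkEnumerateInstanceExtensionProperties(nullptr, &count, props.data());

        for (const auto &p : props)
            if (std::strcmp(p.extensionName, name) == 0)
                return true;
        return false;
    }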

If you want to develop software using Vulkan’s bleeding-edge features, then check out NVIDIA’s developer portal to pick up the latest drivers. Basically everyone else should use 378.49 or its 378.57 hotfix.

Source: NVIDIA

DirectX Intermediate Language Announced... via GitHub

Subject: Graphics Cards | January 28, 2017 - 02:19 AM |
Tagged: microsoft, DirectX, llvm, dxil, spir-v, vulkan

Over the holidays, Microsoft published the DirectX Shader Compiler on GitHub. The interesting part is that it compiles HLSL into DirectX Intermediate Language (DXIL) bytecode, which can be ingested by GPU drivers and executed on graphics devices. The reason this is interesting is that DXIL is based on LLVM, which might start to sound familiar if you have been following along with The Khronos Group and their announcements regarding Vulkan, OpenCL, and SPIR-V.

As it turns out, they were on to something, and Microsoft is working on a DirectX analogue of it.

microsoft-2015-directx12-logo.jpg

The main advantage of LLVM-based bytecode is that you can eventually support multiple languages (and the libraries of code developed in them). When SPIR-V was announced with Vulkan, the first thing that came to my mind was compiling to it from HLSL, which would be useful for existing engines, as they are typically written in HLSL and transpiled to the target platform when used outside of DirectX (like GLSL for OpenGL). So, in Microsoft’s case, it makes sense that they start there (since they own the thing), but I doubt that is the end goal. The most seductive outcome for game engine developers would be single-source C++, but there are a lot of steps between here and there.

Another advantage, albeit to a lesser extent, is that you might be able to benefit from performance optimizations, both on the LLVM / language side as well as on the driver’s side.

According to their readme, the minimum supported target will be HLSL Shader Model 6. This is the most recent shading model, and it introduces some interesting instructions, typically for GPGPU applications, that allow multiple GPU threads to interact, like balloting. Ironically, while DirectCompute and C++ AMP don’t seem to be too popular, this would nudge DirectX 12 toward being a fairly competent GPU compute API.

DXIL support is limited to Windows 10 Build 15007 and later, so you will need to either switch one (or more) workstation(s) to Insider, or wait until it launches with the Creators Update (unless something surprising holds it back).

Unity 5.6 Beta Supports Vulkan API

Subject: General Tech | December 22, 2016 - 11:54 PM |
Tagged: pc gaming, Unity, vulkan

One of the most popular video game engines, Unity, has released a beta for Unity 5.6, which will be the last version of Unity 5.x. This release pushes Vulkan into full support on both desktop and mobile, which actually beats Unreal Engine 4 on the desktop side of things. Specifically, Vulkan is available for the Android, Windows, Linux, and Tizen operating systems. Apple users should be happy that this version also updates Metal for iOS and macOS, but Apple is still preventing vendors from shipping Vulkan drivers, so you really shouldn’t feel too happy.

unity-logo-rgb.png

At Unity’s Unite 2016 keynote, the company claimed about 30-60% better performance with the new API “out of the box”. I do find this statement slightly odd, though, because Unity doesn’t really provide much access to “the box” without expensive source-code up-sells. For what I assume is the majority of projects, the deepest involvement with the engine internals is buying and activating a plug-in, and Vulkan would be a kind-of crappy thing to hide behind a paywall.

I mentioned that this will be the last Unity 5.x version. While the difference between a major and a minor version number tends to be just marketing these days, Unity is changing their major version to align with the year that it belongs to. Expect future versions, starting with a beta version in April, to be numbered 2017.x.

Unity 5.6 comes out of beta in March.

Source: Unity

Vulkan 1.0, OpenGL 4.5, and OpenGL ES 3.2 on a console

Subject: Systems, Mobile

A few days ago, sharp eyes across the internet noticed that Nintendo’s Switch console had been added to lists of conformant hardware at The Khronos Group. Vulkan 1.0 was the eye-catcher, although the other tabs also claim conformance with OpenGL 4.5 and OpenGL ES 3.2. The device is not listed as compatible with OpenCL, although that does not really surprise me for a single-GPU gaming system; the other three APIs have compute shaders designed around the needs of game developers. So the Nintendo Switch conforms to the latest standards of the three most important graphics APIs that a gaming device should use -- awesome.

But what about performance?

In other news, Eurogamer / Digital Foundry and VentureBeat uncovered information about the hardware. It will apparently use a Tegra X1, based on second-generation Maxwell, that is under-clocked relative to what we see in the Shield TV. When docked, the GPU will be able to reach 768 MHz on its 256 CUDA cores. When undocked, this will drop to 307.2 MHz (although the system can utilize this mode while docked, too). This puts the performance at ~315 GFLOPs when mobile, pushing up to ~785 GFLOPs when docked.

You might compare this to the Xbox One, which runs at ~1310 GFLOPs, and the PlayStation 4, which runs at ~1840 GFLOPs. That alone puts the Nintendo Switch somewhat behind, but the difference is even greater than it looks. The FLOP calculation for Sony and Microsoft is 2 x Shader Count x Frequency, but the calculation for Nintendo’s Switch is 4 x Shader Count x Frequency. FMA accounts for the factor of two, but the extra factor of two in Nintendo’s case...

Yup, the Switch’s performance rating is calculated as FP16, not FP32.
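
Working the numbers out with the leaked clocks, the 256 CUDA cores, and the usual two operations per FMA:

  Docked, FP16:    4 x 256 x 768 MHz   = ~786 GFLOPs
  Docked, FP32:    2 x 256 x 768 MHz   = ~393 GFLOPs
  Undocked, FP16:  4 x 256 x 307.2 MHz = ~315 GFLOPs

The Xbox One and PlayStation 4 figures above are FP32 (2 x 768 x 853 MHz and 2 x 1152 x 800 MHz, by the commonly cited shader counts and clocks), so in like-for-like 32-bit throughput the gap is roughly twice what the headline numbers suggest.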

nintendo-2016-switch-gpu.png

Snippet from an alleged leak of what Nintendo is telling developers.
If true, it's very interesting that FP16 values are being discussed as canonical.

Reducing shader precision down to 16-bit is common for mobile devices. It takes fewer transistors to store and operate on half-precision values, and the accumulated error is muted by the fact that you’re viewing the result on a small mobile screen. The Switch isn’t always a mobile device, though, so it will be interesting to see how this reduction in lighting and shading precision affects games on your home TV, especially in titles that don’t follow Nintendo’s art styles. That said, shaders can still use 32-bit values, but then you are cutting your throughput for those instructions in half when you are already somewhat behind your competitors.

As for the loss of performance when undocked, it shouldn’t be too much of an issue if Nintendo pressures developers to hit 1080p when docked. If that’s the case, the lower resolution, 720p mobile screen will roughly scale with the difference in clock.

Lastly, there are a bunch of questions surrounding Nintendo’s choice of operating system: basically, all the questions. It’s being developed by Nintendo, but we have no idea what they forked it from. NVIDIA supports the Tegra SoC on both Android and Linux, it would be legal for Nintendo to fork either one, and Nintendo could have just asked for drivers even if NVIDIA didn’t already support the platform in question. Basically, anything is possible from the outside, and I haven’t seen any solid leaks from the inside.

The Nintendo Switch launches in March.

LibRetro Vulkanizes PlayStation

Subject: General Tech | December 4, 2016 - 07:42 PM |
Tagged: pc gaming, vulkan, libretro

About half a year ago, LibRetro added Vulkan support to their Nintendo 64 renderer. This allowed them to do things like emulate the console’s hardware rasterization in software, as an asynchronous shader, circumventing limitations they ran into when trying to emulate the console’s offbeat GPU through their OpenGL path.

universal-2016-crashbandicoot.png

Image Credit: Universal Interactive via Wikipedia

They have now turned their sights (“for the lulz”) to the original PlayStation, creating a Vulkan back-end for emulators like Beetle PSX.

The fairly long blog post discusses, in detail, how the PlayStation is designed, making it an interesting read for anyone curious. One point that I found particularly interesting is how the video memory is configured as a single, 1 MB, 2D array (1024x512x16-bit). At the time, texture resolutions were quite small, and frame buffers ranged between 256x224 and 640x480, so that’s a lot of room to make a collage out of your frame and all the textures in the scene, but it’s still odd to think about a console imposing such restrictions now that we’re spoiled by modern GPUs.
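
For what it's worth, the math checks out: 1024 x 512 pixels at 16 bits (2 bytes) each is 1,048,576 bytes, exactly 1 MB, and that single block has to hold the frame buffers and every texture currently in use.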

In terms of performance, the developer claims that modern GPUs can handle 8K resolutions with relative ease, and four-digit frame rates at lower resolutions.