Vulkan 1.0, OpenGL 4.5, and OpenGL ES 3.2 on a console
A few days ago, sharp eyes across the internet noticed that Nintendo’s Switch console had been added to the lists of conformant hardware at The Khronos Group. Vulkan 1.0 was the eye-catcher, although the other tabs also claim conformance with OpenGL 4.5 and OpenGL ES 3.2. The device is not listed as compatible with OpenCL, although that does not really surprise me for a single-GPU gaming system; the other three APIs have compute shaders designed around the needs of game developers. So the Nintendo Switch conforms to the latest standards of the three most important graphics APIs for a gaming device -- awesome.
But what about performance?
In other news, Eurogamer / Digital Foundry and VentureBeat uncovered information about the hardware. It will apparently use a Tegra X1, based on second-generation Maxwell, that is under-clocked from what we see on the Shield TV. When docked, the GPU will be able to reach 768 MHz on its 256 CUDA cores. When undocked, this drops to 307.2 MHz (although the system can utilize this mode while docked, too). This puts the performance at ~315 GFLOPs in mobile mode, pushing up to ~785 GFLOPs when docked.
You might compare this to the Xbox One, which runs at ~1310 GFLOPs, and the PlayStation 4, which runs at ~1840 GFLOPs. This puts the Nintendo Switch somewhat behind both, although the difference is even greater than it looks. The FLOP figures for Sony and Microsoft are calculated as 2 x Shader Count x Frequency, but the calculation for Nintendo’s Switch is 4 x Shader Count x Frequency. FMA accounts for one factor of two, but the extra factor of two in Nintendo’s case... ...
Yup, the Switch’s performance rating is calculated as FP16, not FP32.
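For reference, the arithmetic works out like this. This is a quick sketch using the clock speeds reported above; all figures are estimates, not official Nintendo specs:

```python
# Rough peak-throughput math for the Switch's Tegra X1 GPU,
# using the leaked clocks. All values are estimates.
CUDA_CORES = 256

def gflops(cores, mhz, ops_per_core_per_clock):
    return cores * mhz * ops_per_core_per_clock / 1000.0

# FP32: one fused multiply-add (2 ops) per core per clock.
docked_fp32 = gflops(CUDA_CORES, 768.0, 2)   # ~393 GFLOPs
mobile_fp32 = gflops(CUDA_CORES, 307.2, 2)   # ~157 GFLOPs

# FP16: the Tegra X1's Maxwell cores can pack two half-precision
# values per lane, doubling throughput again (factor of 4 total).
docked_fp16 = gflops(CUDA_CORES, 768.0, 4)   # ~786 GFLOPs
mobile_fp16 = gflops(CUDA_CORES, 307.2, 4)   # ~315 GFLOPs
```

The FP16 results line up with the ~315 and ~785 GFLOPs figures above, while the FP32 numbers are exactly half, which is what makes the comparison against the consoles' FP32 ratings misleading.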
Snippet from an alleged leak of what Nintendo is telling developers.
If true, it's very interesting that FP16 values are being discussed as canonical.
Reducing shader precision down to 16-bit is common for mobile devices. It takes fewer transistors to store and operate on half-precision values, and accumulated error will be masked by the fact that you’re viewing it on a mobile screen. The Switch isn’t always a mobile device, though, so it will be interesting to see how this reduction in lighting and shading precision will affect games on your home TV, especially in titles that don’t follow Nintendo’s art styles. That said, shaders could still use 32-bit values, but then you are cutting your performance for those instructions in half, when you are already somewhat behind your competitors.
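To get a feel for why precision matters, here is a toy illustration (my own example, not anything from Nintendo's documentation): accumulating a small lighting contribution a thousand times in half precision drifts noticeably compared to single precision, because FP16 rounds on every addition.

```python
import numpy as np

# Toy example: add a small contribution 1000 times.
# Half precision rounds after every addition, so error accumulates.
step16 = np.float16(0.001)
acc16 = np.float16(0.0)
for _ in range(1000):
    acc16 = np.float16(acc16 + step16)

step32 = np.float32(0.001)
acc32 = np.float32(0.0)
for _ in range(1000):
    acc32 = np.float32(acc32 + step32)

# acc32 lands essentially on 1.0; acc16 falls visibly short.
print(float(acc16), float(acc32))
```

On a phone screen a few percent of drift in a lighting term is invisible; on a 50-inch TV it can show up as banding or shifted highlights, which is the tradeoff being discussed here.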
As for the loss of performance when undocked, it shouldn’t be too much of an issue if Nintendo pressures developers to hit 1080p when docked. If that’s the case, the lower resolution, 720p mobile screen will roughly scale with the difference in clock.
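The math behind that claim is a simple back-of-the-envelope check, using the clocks reported above:

```python
# Clock ratio between docked and undocked GPU modes.
clock_ratio = 768.0 / 307.2                   # 2.5x

# Pixel-count ratio between 1080p (TV) and 720p (mobile screen).
pixel_ratio = (1920 * 1080) / (1280 * 720)    # 2.25x
```

The 2.5x clock increase slightly outpaces the 2.25x increase in pixels to render, so a game that holds its frame rate at 720p undocked should, to a first approximation, hold it at 1080p docked with a small margin to spare.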
Lastly, there are a bunch of questions surrounding Nintendo’s choice of operating system: basically, all the questions. It’s being developed by Nintendo, but we have no idea what they forked it from. NVIDIA supports the Tegra SoC on both Android and Linux, it would be legal for Nintendo to fork either one, and Nintendo could have simply asked for drivers even if NVIDIA didn’t already support the platform in question. Basically, anything is possible from the outside, and I haven’t seen any solid leaks from the inside.
The Nintendo Switch launches in March.
Subject: Systems | November 30, 2016 - 05:17 PM | Jeremy Hellstrom
Tagged: msi, aegis ti, gaming pc, vr ready
Depending on which model you order, the MSI Aegis Ti PC will have an i7-6700K or i5-6600K and a pair of either GTX 1080s or 1070s. The model which shipped to TechPowerUp for testing sported a pair of M.2 Samsung 950 PROs and 32GB of DDR4-2400, along with the i7-6700K and GTX 1080s of course. The unique-looking enclosure is VR Ready, in that it has USB and HDMI ports on the front to let you easily attach your VR goggles, and it is more than powerful enough to drive said goggles at high settings. If you would prefer to spend $3000 on a pre-configured gaming rig with some interesting features as opposed to building one yourself, pop over for a look at the full review.
"MSI sent us their latest fully featured PC, the Aegis Ti, to take a look at. This PC departs from the "traditional box" design in a big way and is ready to support not just one but two GTX 1080s! It's VR ready, including an HDMI port in front and dual M.2 drives, which can be configured in RAID, making it ready for whatever you want to throw at it."
Here are some more Systems articles from around the web:
- Cyberpower Infinity X55 VX @ Kitguru
- DinoPC Mayhem P2 GTX 1080 @ eTeknix
- AWD-IT Aura (GTX 1070) @ Kitguru
- Freshtech Solutions Aerocool DS230 GTX 1050 Ti @ eTeknix
A Holiday Project
A couple of years ago, I performed an experiment around the GeForce GTX 750 Ti graphics card to see if we could upgrade basic OEM, off-the-shelf computers to become competent gaming PCs. The key to this potential upgrade was that the GTX 750 Ti offered a great amount of GPU horsepower (at the time) without the need for an external power connector. Lower power requirements on the GPU meant that even the most basic of OEM power supplies should be able to do the job.
That story was a success, both in terms of the result in gaming performance and the positive feedback it received. Today, I am attempting to do that same thing but with a new class of GPU and a new class of PC games.
The goal for today’s experiment remains pretty much the same: can a low-cost, low-power GeForce GTX 1050 Ti graphics card that also does not require any external power connector offer enough gaming horsepower to upgrade current shipping OEM PCs to "gaming PC" status?
Our target PCs for today come from Dell and ASUS. I went into my local Best Buy just before the Thanksgiving holiday and looked for two machines that varied in price and relative performance.
| | Dell Inspiron 3650 | ASUS M32CD-B09 |
| --- | --- | --- |
| Processor | Intel Core i3-6100 | Intel Core i7-6700 |
| Memory | 8GB DDR4 | 12GB DDR4 |
| Graphics Card | Intel HD Graphics 530 | Intel HD Graphics 530 |
| Storage | 1TB HDD | 1TB Hybrid HDD |
| Power Supply | 240 watt | 350 watt |
| OS | Windows 10 64-bit | Windows 10 64-bit |
| Total Price | $429 (Best Buy) | $749 (Best Buy) |
The specifications of these two machines are relatively modern for OEM computers. The Dell Inspiron 3650 uses a modest dual-core Core i3-6100 processor with a fixed clock speed of 3.7 GHz. It has a 1TB standard hard drive and a 240 watt power supply. The ASUS M32CD-B09 PC has a quad-core HyperThreaded processor with a 4.0 GHz maximum Turbo clock, a 1TB hybrid hard drive and a 350 watt power supply. Both of the CPUs share the same Intel integrated graphics, the HD Graphics 530. You’ll see in our testing that not only is this integrated GPU unqualified for modern PC gaming, but it also performs quite differently based on the CPU it is paired with.
Subject: Systems, Mobile | November 27, 2016 - 04:25 PM | Scott Michaud
Tagged: virtual boy, RISC, Nintendo, nec
I was one of the lucky kids who got a Virtual Boy, which was actually quite fun for nine-year-old me. It wasn’t beloved by the masses, but when you’re in a hotel, moving across the country, you best believe I’m going to punch that Teleroboxer cat in the head, over and over. It was quite an interesting piece of technology, despite its crippling flaws.
To see for yourself, Ben Heck published a full disassembly, with his best-guess explanations. He then performs a repair by 3D printing a clamp to put pressure on a loose ribbon connector.
From a performance standpoint, the Virtual Boy was launched with a 32-bit NEC RISC processor, clocked at 20 MHz. Keep in mind that, one, this is a semi-mobile, battery-powered device and, two, it launched around the same time as the original Pentium processor reached 120 MHz. The RAM setup is... unclear. I’m guessing PlanetVB accidentally wrote MB and KB to refer to “megabit” (Mb) and “kilobit” (kb) instead of “megabyte” and “kilobyte”, meaning the Wikipedia listing of 128KB VRAM, 128KB DRAM, and 64KB WRAM is accurate. The cartridge could also address up to an additional 16MB of RAM, meaning that specific titles could load as much as they need, albeit at a higher BOM cost. Shipped titles maxed out at 8KB of cartridge-expanded RAM, though.
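The bit-to-byte conversion behind that guess is simple enough to check, assuming the original listings really were in megabits:

```python
# 8 bits per byte; 1 Kb = 1024 bits, 1 Mb = 1024 Kb.
def megabits_to_kilobytes(mbit):
    return mbit * 1024 / 8

# Reading a "1 MB" VRAM or DRAM listing as 1 megabit matches
# Wikipedia's 128 KB figure; "0.5 MB" WRAM matches 64 KB.
vram_kb = megabits_to_kilobytes(1)     # 128 KB
wram_kb = megabits_to_kilobytes(0.5)   # 64 KB
```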
Ben Heck’s video will be part of a series, where he will try to make it smaller and head-mounted.
Subject: Systems | November 26, 2016 - 04:21 PM | Scott Michaud
Tagged: Samsung, Lenovo
thebell, a Korean news outlet and sister site of ZDNet Korea, published a rumor that Samsung was in talks to sell their PC business to Lenovo. While I’m struggling with the Google Translate from Korean, it sounds like this would be caused by Samsung selling their printing business to HP, leading to the company divesting from related markets, too. This news was picked up by the American ZDNet and, some time after, Samsung released a statement outright denying the rumor: “The rumor is not true.”
So, as far as we know, Samsung is staying in the PC market.
Since it was a clear denial, not a decline to comment, this probably means that the rumor is either completely false or, if it’s based on a kernel of truth, very early or very small. It seems likely, though, that Lenovo would want to buy up pretty much anyone’s PC business at this point, if the price is right. As for Samsung selling? I could see it having been discussed behind the scenes to some level of seriousness, although that’s exactly what hoaxes prey upon. Again, as far as we know, Samsung will keep their PC business, and there isn’t really anything concrete to say otherwise.
Subject: Graphics Cards, Systems | November 10, 2016 - 11:44 AM | Ryan Shrout
Tagged: VR, rift, Oculus, atw, asynchronous timewarp, asynchronous spacewarp, asw
Oculus has announced that, as of today, support for Asynchronous Spacewarp is available and active for all users who install the 1.10 runtime. Announced at the Oculus Connect 3 event in October, ASW promises to complement the existing Asynchronous Timewarp (ATW) technology to improve the VR experience on lower-performance systems that might otherwise stutter.
A quick refresher on Asynchronous Timewarp is probably helpful. ATW was introduced to help alleviate the impact of missed frames on VR headsets, and its development started back with the Oculus DK2 headset. By shifting the image on the VR headset, without input from the game engine, based on relative head motion that occurred AFTER the last VR pose was sent to the game, timewarp presents a more accurate image to the user. While this technology was first used as a band-aid for slow frame rates, Oculus felt confident enough in its advantages that it is enabled for all frames of all applications on the Rift, regardless of frame rate.
ATW moves the entire frame as a whole, shifting it only based on relative changes to the user’s head rotation. New Asynchronous Spacewarp attempts to shift objects and motion inside of the scene by generating new frames to insert in between “real” frames from the game engine when the game is running in a 45 FPS state. With a goal of maintaining a smooth, enjoyable and nausea-free experience, Oculus says that ASW “includes character movement, camera movement, Touch controller movement, and the player's own positional movement.”
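The cadence is easier to see laid out. This is a hypothetical sketch of the timing, not Oculus's implementation: at a 90 Hz refresh with the engine held at 45 FPS, every other vsync interval is filled with an extrapolated frame.

```python
# Sketch of the 45 -> 90 FPS cadence ASW targets: the runtime
# alternates real engine frames with extrapolated ("synth") ones
# so the headset still sees a new image every ~11.1 ms.
REFRESH_HZ = 90
VSYNC_MS = 1000.0 / REFRESH_HZ  # ~11.1 ms per displayed frame

def frame_schedule(n_vsyncs, engine_fps=45):
    frames = []
    for i in range(n_vsyncs):
        # The engine delivers a real frame every other vsync at 45 FPS.
        kind = "real" if i % (REFRESH_HZ // engine_fps) == 0 else "synth"
        frames.append((round(i * VSYNC_MS, 1), kind))
    return frames

for t, kind in frame_schedule(6):
    print(f"{t:5.1f} ms  {kind}")
```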
To many of you that are familiar with the idea of timewarp, this might sound like black magic. Oculus presents this example on their website to help understand what is happening.
Seeing the hand with the gun in motion, ASW generates a frame that continues the animation of the gun to the left, tricking the user into seeing a continuation of the motion. When the next actual frame is presented just after, the gun will likely have moved slightly further, and the pattern repeats.
You can notice a couple of things about ASW in this animation example, however. If you look just to the right of the gun barrel in the generated frame, there is an artificial stretching of the pixels; the wheel looks like something out of Dr. Strange. That said, this is likely an effect that would not be noticeable in real time and should not impact the user experience dramatically. And, as Oculus would tell us, it is better than the alternative of simply missing frames and animation changes.
Some ASW interpolation changes will be easier than others, thanks to secondary data being available. For example, with the Oculus Touch controller, the runtime will know how much the player’s hand has moved, and thus how much the object being held has moved, and can better estimate the new object location. Positional movement has the same advantage. If a developer has properly implemented the different layers of abstraction for Oculus and its runtime, separating out backgrounds from cameras from characters, etc., then the new frames being created are less likely to have significant distortions.
I am interested in how this new feature affects the current library of games on PCs that do in fact drop below that 90 FPS mark. In October, Oculus was on stage telling users that the minimum spec for VR systems was dropping from requiring a GTX 970 graphics card to a GTX 960. This clearly expands the potential install base for the Rift. Will the magic behind ASW live up to its stated potential without an abundance of visual artifacts?
In a blog post on the Oculus website, they mention some other specific examples of “imperfect extrapolation.” Rapid brightness changes, object disocclusion trails (an object moving out from behind another object), repeated patterns, and head-locked elements (that aren’t designated as such in the runtime) can all cause distracting artifacts in the animation if not balanced and thought through. Oculus isn’t telling game developers to go back and modify their titles, but instead to "be mindful of their appearance."
Oculus does include a couple of recommendations for developers looking to optimize quality for ASW: use locked layers, use real time rather than frame count for animation steps, and provide easily adjustable image quality settings. It’s worth noting that this new technology is enabled by default as of runtime 1.10 and only starts working once a game drops below the 90 FPS line. If your title stays over 90 FPS, then you get the advantages of Asynchronous Timewarp without the potential issues of Asynchronous Spacewarp.
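Put another way, the runtime's decision boils down to a simple threshold. This is my paraphrase of the behavior described above, not actual Oculus code:

```python
def runtime_mode(measured_fps, refresh_hz=90):
    # At or above the native refresh rate, only ATW reprojection runs.
    if measured_fps >= refresh_hz:
        return "ATW only"
    # Below it, the app is held to half rate and ASW fills the gaps.
    return "ATW + ASW, app at {} FPS".format(refresh_hz // 2)
```

So a title averaging 95 FPS sees no change, while one averaging 70 FPS is pinned to 45 FPS and synthesized up to 90.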
The impact of ASW will be interesting to see. For as long as Oculus has been around, they have trumpeted the need for 90 FPS to ensure a smooth gaming experience free of headaches and nausea. With ASW, that requirement, in theory, drops to 45 FPS, though with the caveats mentioned above. Many believe, as do I, that this new technology was built to help Microsoft partner with Oculus to launch VR on the Scorpio Xbox console coming next year. Because the power of that new hardware will still lag behind the recommended specification from both Oculus and Valve for VR PCs, something had to give. The result is a new “minimum” specification for Oculus Rift gaming PCs and a level of performance that makes console-based integrations of the Rift possible.
Subject: Systems | November 9, 2016 - 03:31 PM | Jeremy Hellstrom
Tagged: VR, vive, rift, Oculus, htc, build guide, amd
Neoseeker embarked on an interesting project recently: building a VR-capable system which costs less than the VR headset it will power. We performed a similar feat this summer, with a rig that at the time cost roughly $900. Neoseeker took a different path, using AMD parts to keep the cost low while still providing the horsepower required to drive a Rift or Vive. They tested their rig on The Lab, Star Wars: Trials on Tatooine, and Waltz of the Wizard, finding the performance smooth and, most importantly, not creating the need for any dimenhydrinate. There are going to be some games this system struggles with, but at a total cost under $700 this is a great way to experience VR even if you are on a budget.
"Team Red designed this system around their very capable Radeon RX 480 8GB video card and the popular FX-6350 Vishera 6-Core CPU. The RX 480 is obviously the main component that will not only be leading the dance, but also help drive the total build cost down thanks to its MSRP of $239. At the currently listed online prices, the components for system will cost around $660 USD in total after applicable rebates."
Here are some more Systems articles from around the web:
- Intel Kaby Lake Linux Testing With MSI's Cubi 2 Mini PC @ Phoronix
- MSI Aegis Ti (GTX 1080 SLI) Gaming PC @ Kitguru
- Gigabyte BRIX i7A-7500 @ Kitguru
- Freshtech Solutions Project 7 GTX 1080 Gaming PC @ eTeknix
Subject: Systems, Mobile | November 6, 2016 - 07:00 AM | Scott Michaud
Tagged: Nintendo, nes, Cortex A7, arm, Allwinner
It looks like Peter Brown, Senior Reviews Editor at GameSpot, received an NES Classic and promptly disassembled it for a single photo. From there, users on Reddit searched the component model numbers and compiled specifications. According to their research, the system (unless Nintendo made multiple, interchangeable models) is based on an Allwinner R16 SoC, which has four ARM Cortex A7 cores and an ARM Mali 400 MP2 GPU. Attached to this is 256MB of DDR3 RAM and 512MB of flash.
Image Credit: Peter Brown
Thankfully, the packaging of each chip has quite large, mostly legible branding, so it's easy to verify.
In terms of modern phone technology, this is about the bottom of the barrel. The Allwinner R16 should be roughly comparable to the Raspberry Pi 2, only that system has about four times the RAM of Nintendo’s. This is not a bad thing, of course, because its entire goal is to emulate a device that was first released in 1983 (in Japan), albeit at a higher resolution. Not all of the games will be free for them to include, either. Mega Man 2, PAC-MAN, Final Fantasy, Castlevania 1 and 2, Ninja Gaiden, Double Dragon II, Bubble Bobble, Tecmo Bowl, Super C, and Galaga are all from third-party publishers, who will probably need some cut of sales.
Users are claiming that it doesn't look like it could be updated. Counting the ports, it doesn't look like there's any way in, but I could be wrong. That said, I never expected it to be upgradeable so I guess that's that?
The NES Classic Edition goes on sale on November 11th for $59.99 USD MSRP.
Subject: General Tech, Displays, Systems | November 3, 2016 - 07:01 AM | Scott Michaud
Update November 3rd @ 2:20pm: As noted in the comments, the video and article are from 2014. As I said in the article, the concept was teased at Adobe MAX, but I must have found an old source and misread the date. I've also embedded the new video just below.
Original post below
Adobe MAX started yesterday, and Dell used it as a venue to announce their Smart Desk concept. While it draws comparisons with Microsoft's Surface Studio, especially with their dial-based input accessory, it's unclear where the similarities stop. For instance, while they promote how it uses “Dell Precision workstation performance,” they don't explicitly state that it is a PC itself. Unlike the Surface Studio, it might be a peripheral to be paired with a full desktop, which its thin profile suggests, unless it requires a specific device that's just not pictured.
I mean, it would be possible to fit a laptop into a twenty-some-inch tablet that's designed to permanently sit on a desk, but, unless the software requires deep OS integration, you would think that going the Wacom route would be a win for both parties. Powering the hardware wouldn't be an issue, but you would still need to use slower-for-the-price laptop components to dissipate the heat and fit in such a small volume. If it does contain a PC, it would be running Windows 10, too, because that was clearly shown on the secondary UltraSharp 27 monitor attached to it. On the other hand, the interface, while nothing about it excludes being a complex driver for everyday desktops, is the sort of thing that a company would do if they're shipping it as a full PC.
We'll know more in the future as Dell spills the beans (and probably develops a marketable product to have beans spilled over). What would you be more interested in? An all-in-one or a peripheral?
Subject: Systems | October 26, 2016 - 04:31 PM | Sebastian Peak
Tagged: workstation, nvidia, microsoft, Intel, GTX 980M, GTX 965M, desktop, DCI-P3, core i7, core i5, all-in-one, AIO, 4000x3500
Microsoft has announced their first all-in-one PC with the Surface Studio, and it looks like Apple has some serious competition on their hands in the high-end AIO workstation space. Outfitted with the highest resolution display this side of Cupertino, 6th-generation Intel Skylake processors, and discrete NVIDIA graphics, there is plenty of power for most users (though gamers will clearly be looking elsewhere). Make no mistake, this new AIO from Microsoft is not going to replace a standard desktop for most people due to the $2999+ price tag, but for creative professionals and other workstation users it is a compelling option.
"Expanding the Surface family, Surface Studio is a new class of device that transforms from a workstation into a powerful digital canvas, unlocking a more natural and immersive way to create on the thinnest LCD monitor ever built. With a stunning ultra-HD 4.5K screen, Surface Studio delivers 63 percent more pixels than a state-of-the-art 4K TV. Surface Studio works beautifully with pen, touch and Surface Dial — a new input device designed for the creative process that lets you use two hands on the screen to compose and create in all new ways."
The star of the show is the 28-inch PixelSense display, which boasts a massive 4500x3000 resolution for a pixel density of 192 ppi, and the taller 3:2 aspect ratio will be welcomed by some users as well. Microsoft is using 10-bit panels for this premium AIO offering, and color reproduction should be outstanding with the Surface Studio thanks to "individually color calibrated" displays. Another advantage for creative customers is the display's multi-touch capability and 1024 pressure-level Surface Pen, which makes this a very nice option for digital artists - especially at 28 inches/192 ppi.
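Microsoft's headline numbers check out, assuming the diagonal is exactly 28 inches (which would explain the small rounding difference from the quoted 192 ppi):

```python
import math

# The Surface Studio's 28-inch, 4500x3000 PixelSense panel.
w, h, diag_in = 4500, 3000, 28.0

ppi = math.hypot(w, h) / diag_in           # ~193 ppi, vs. the quoted 192

surface_pixels = w * h                     # 13.5 million pixels
uhd_pixels = 3840 * 2160                   # ~8.3 million pixels
extra = surface_pixels / uhd_pixels - 1    # ~0.63, i.e. "63 percent more"
```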
Touchscreen desktops need display placement flexibility to be useful, and here Microsoft has a "zero gravity" hinge to allow for easy movement. The design looks stable thanks to a pair of arms connecting the display to the base, and this lower half is what actually houses the PC components. What's inside? Here's a look at the official specs:
- Screen: 28” PixelSense™ Display
- Resolution: 4500 x 3000 (192 PPI)
- Color settings: Adobe sRGB and DCI-P3, individually color calibrated
- Touch: 10 point multi-touch
- Aspect Ratio: 3:2
- Supports Pen input and the Zero Gravity Hinge
- Processor: 6th Generation Intel® Core™ i5 or i7
- Memory: 8GB, 16GB, or 32GB RAM
- i5 Intel 8GB: NVIDIA® GeForce® GTX 965M 2GB GDDR5 memory
- i7 Intel 16GB: NVIDIA® GeForce® GTX 965M 2GB GDDR5 memory
- i7 Intel 32GB: NVIDIA® GeForce® GTX 980M 4GB GDDR5 memory
- Rapid Hybrid Drive options: 1TB or 2TB
- Connections & expansions:
- 4 x USB 3.0 (one high power port)
- Full-size SD™ card reader (SDXC compatible)
- Mini DisplayPort
- Headset jack
- Compatible with Surface Dial on-screen interaction
- 1 Gigabit Ethernet port
- Cameras, video and audio:
- Windows Hello face sign-in camera
- 5.0 MP camera with 1080p HD video (front)
- Autofocus camera with 1080p HD video (rear)
- Dual microphones
- Stereo 2.1 speakers with Dolby® Audio™ Premium
- 3.5 mm headphone jack
- Wi-Fi: 802.11ac Wi-Fi wireless networking, IEEE 802.11 a/b/g/n compatible
- Bluetooth: Bluetooth 4.0 wireless technology
- Xbox Wireless built-in
- TPM chip for enterprise security
- Enterprise-grade protection with Windows Hello face sign-in
- Warranty: 1-year limited hardware warranty
- Display: 637.35 mm x 438.90 mm x 12.5 mm (25.1” x 17.3” x 0.5”)
- Base: 250.00 mm x 220.00 mm x 32.2 mm (9.8” x 8.7” x 1.3”)
- Product weight: 9.56 kg max (21 lbs max)
The Surface Studio is currently available for pre-order at Microsoft.com with prices ranging from $2999 to $4199, depending on configuration.