Subject: Graphics Cards | March 14, 2016 - 08:41 PM | Ryan Shrout
Tagged: video, live, capsaicin, amd
Subject: General Tech | March 14, 2016 - 05:51 PM | Sebastian Peak
Tagged: wireless vr headset, vr headset, VR, virtual reality, Sulon Q, FX-8800P, amd fx, amd
AMD is powering the world's first truly self-contained VR solution, the Sulon Q, a wireless headset with a powerful computer built in.
AMD has partnered with Sulon Technologies, a startup based in Toronto, to produce this new headset, which seems to have the potential to disrupt the fledgling VR market. The idea is simple and unique: unlike existing designs that require a VR-ready PC (Oculus Rift, HTC Vive) or the latest smartphone (GearVR) to work, the Sulon Q VR headset incorporates a full gaming PC inside the headset, allowing for the first actually wireless experience in this young technology's existence.
As Ars Technica notes in their post on the Sulon Q this morning:
"According to the announcement, that 'wear and play' untethered design makes the Sulon Q quite different from competition like the Oculus Rift or SteamVR-powered HTC Vive, which both need a relatively high-end PC to actually generate the images on the headset. With the Sulon Q, the Windows 10 PC hardware is built into the unit, including an expected four-core AMD FX-8800P processor with a Radeon R7 graphics card."
Who wouldn't want to wear an entire PC on their head? Thermal (and other health) concerns aside, just what sort of hardware is under the hood (so to speak)? According to the report published at VideoCardz this morning, it will offer a new AMD FX processor (the FX-8800P) and overall specs that look like they belong more to a gaming laptop than a VR headset.
(Quoting directly from the report on VideoCardz via this Reddit post):
- Experiences: VR, AR, and spatial computing
- Ergonomics: Lightweight, comfortable, ergonomically designed all-in-one tether-free form factor
- Processors: AMD FX-8800P processor at up to 35W with Radeon R7 Graphics leveraging AMD’s Graphics Core Next architecture; 4 compute cores and 8 GPU cores unlocked through Heterogeneous System Architecture (HSA); Sulon Spatial Processing Unit (SPU)
- Memory: 8 GB DDR3
- Storage: 256 GB SSD
- Display: 2560×1440 OLED display at 90 Hz; 110-degree field of view
- Audio: 3D spatial audio powered by GenAudio’s AstoundSound® technology; built-in 3.5 mm audio jack; custom spatially-optimized Sulon Q earbuds; dual noise-cancelling embedded microphones
- Tracking: Sulon Spatial Processing Unit combining real-time machine vision technologies and mixed reality spatial computing for real-time environment mapping and tracking from the inside out, dynamic virtualization for VR/AR fusion, and gesture recognition
- Sensors: Accelerometer, gyroscope, magnetometer, SPU
- Software: Microsoft Windows® 10; “Project Dragon” application for spatial computing; AMD LiquidVR technologies to ensure smooth and responsive VR and AR experiences
- Peripherals: Wireless keyboard and mouse provided in box; any other Windows 10-compatible controllers and joysticks
- Connectivity: WiFi 802.11ac + Bluetooth 4.1, 2x USB 3.0 Type A, Micro HDMI out
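For a sense of scale, the display spec alone implies a substantial rendering load for the integrated Radeon R7. A quick back-of-the-envelope sketch (the per-eye split is an assumption on my part, since the spec lists a single 2560×1440 panel shared between two eyes):

```python
# Rough rendering load implied by the Sulon Q spec sheet above.
# The per-eye split is an assumption, not a published figure.

WIDTH, HEIGHT, REFRESH_HZ = 2560, 1440, 90

pixels_per_frame = WIDTH * HEIGHT
pixels_per_second = pixels_per_frame * REFRESH_HZ
per_eye_width = WIDTH // 2  # assumed: panel split down the middle

print(f"{pixels_per_frame:,} pixels/frame, {pixels_per_second / 1e6:.0f} Mpix/s")
print(f"assumed per-eye resolution: {per_eye_width}x{HEIGHT} at {REFRESH_HZ} Hz")
```

That works out to roughly 332 million pixels per second, which is why the specs read more like a gaming laptop than a headset accessory.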
A video for the Sulon Q is also up on YouTube this morning:
The two biggest questions that accompany any new hardware announcement - how much will it cost, and when will it be available - have not been answered just yet. We'll await further information as GDC has just begun, but it seems safe to say that 2016 will be focused heavily on VR.
Podcast #390 - ASUS Z170 Sabertooth Mk1, Corsair Carbide 400C, more about Windows Store Games, and more!
Subject: General Tech | March 10, 2016 - 07:10 PM | Ken Addison
Tagged: podcast, video, asus, z170 sabertooth, corsair, carbide 400c, Windows Store, uwp, dx12, amd, nvidia, directflip, 16.3, 364.47, 364.51, SFX, Seagate, OCP, NVMe
PC Perspective Podcast #390 - 03/10/2016
Join us this week as we discuss the ASUS Z170 Sabertooth Mk1, Corsair Carbide 400C, more about Windows Store Games, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store (audio only)
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:12:32
Week in Review:
News items of interest:
Hardware/Software Picks of the Week
Ryan: Windows 10 Domain Networking and Shares
Subject: Graphics Cards | March 10, 2016 - 06:27 PM | Sebastian Peak
Tagged: XConnect, thunderbolt 3, radeon, graphics card, gpu, gaming laptop, external gpu, amd
AMD has announced their new external GPU technology called XConnect, which leverages support from the latest Radeon driver to support AMD graphics over Thunderbolt 3.
The technology showcased by AMD is powered by Razer, who partnered with AMD to come up with an expandable solution that supports up to 375W GPUs, including R9 Fury, R9 Nano, and all R9 300 series GPUs up to the R9 390X (there is no liquid cooling support, and the R9 Fury X isn't listed as being compatible). The notebook in AMD's marketing material is the Razer Blade Stealth, which offers the Razer Core external GPU enclosure as an optional accessory. (More information about these products from Razer here.) XConnect is not tied to any vendor, however; this is "generic driver" support for GPUs over Thunderbolt 3.
AMD has posted this video with the head of Global Technical Marketing, Robert Hallock, to explain the new tech and show off the Razer hardware:
The exciting part has to be the promise of an industry standard for external graphics, something many have hoped for. Not everyone will produce a product exactly like Razer's, since there is no requirement to provide a future upgrade path in a larger enclosure like this, but the important thing is that Thunderbolt 3 support is built into the newest Radeon Crimson drivers.
Here are the system requirements for AMD XConnect from AMD:
- Radeon Software 16.2.2 driver (or later)
- 1x Thunderbolt 3 port
- 40Gbps Thunderbolt 3 cable
- Windows 10 build 10586 (or later)
- BIOS support for external graphics over Thunderbolt 3 (check with system vendor for details)
- Certified Thunderbolt 3 graphics enclosure configured with supported Radeon R9 Series GPU
- Thunderbolt firmware (NVM) v.16
The announcement introduces all sorts of possibilities. How awesome would it be to see a tiny solution with an R9 Nano powered by, say, an SFX power supply? Or what about a dual-GPU enclosure (possibly requiring 2 Thunderbolt 3 connections?), or an enclosure supporting liquid cooling (and the R9 Fury X)? The potential is certainly there, and with a standard in place we could see some really interesting products in the near future (or even DIY solutions). It's a promising time for mobile gaming!
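For context on why Thunderbolt 3 is up to the job, here is a rough bandwidth comparison against the PCIe 3.0 x4 link a desktop GPU would otherwise tunnel over. The figures come from the public PCIe 3.0 and Thunderbolt 3 specs; real-world throughput will be lower:

```python
# Back-of-the-envelope comparison of the Thunderbolt 3 link rate vs. the
# usable bandwidth of a PCIe 3.0 x4 connection. Spec-sheet numbers only;
# actual throughput depends on controller overhead and shared traffic.

PCIE3_GTS_PER_LANE = 8.0      # GT/s per PCIe 3.0 lane
PCIE3_ENCODING = 128 / 130    # 128b/130b line-encoding efficiency
TB3_LINK_GBPS = 40.0          # total Thunderbolt 3 link rate

def pcie3_bandwidth_gbps(lanes: int) -> float:
    """Usable PCIe 3.0 bandwidth in Gbps after line encoding."""
    return lanes * PCIE3_GTS_PER_LANE * PCIE3_ENCODING

x4 = pcie3_bandwidth_gbps(4)
print(f"PCIe 3.0 x4: {x4:.1f} Gbps ({x4 / 8:.2f} GB/s usable)")
print(f"Thunderbolt 3: {TB3_LINK_GBPS:.0f} Gbps link (shared with DisplayPort/USB traffic)")
```

In other words, a 40 Gbps Thunderbolt 3 link has headroom for the roughly 31.5 Gbps a PCIe 3.0 x4 GPU connection can actually use, which is what makes a generic external GPU driver plausible at all.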
Subject: Graphics Cards | March 10, 2016 - 12:42 AM | Scott Michaud
Tagged: amd, radeon, graphics drivers, vulkan, dx12, DirectX 12
New graphics drivers from AMD have just been published, and it's a fairly big release. First, Catalyst 16.3 adds Vulkan support to main-branch drivers, which they claim is conformant to the 1.0 specification. The Khronos Group website still doesn't list AMD as conforming, but I assume they will be added shortly (rather than there being some semantic distinction between “conformant” and “fully conformant”). This is great for the platform, as we are still in the launch window of DirectX 12.
Performance has apparently increased significantly as well. This is especially true in the DirectX 12 title Gears of War Ultimate Edition: AMD claims that the Fury X will see up to a 60% increase in that title, and the R9 380 will gain up to 44%. It's unclear how much that translates to in real-world performance, especially in terms of the stutter and jank that apparently plague that game.
The driver also has a few other interesting features. One that I don't quite understand is the “Power Efficiency Toggle”, which supposedly “allows the user to disable some power efficiency optimizations”. I would assume that means keeping your GPU up-clocked under certain conditions, but I don't believe that was much of an issue for the last few generations. That said, the resolved issues section claims that some games were choppy because of core clock fluctuation, and lists this option as the solution, so maybe it was. It is only available on “select” Radeon 300 GPUs and the Fury X. That is, the Fury X specifically, not the regular Fury or the Nano. I expect Ryan will be playing around with it in the next little while.
Last of the main features, the driver adds support for XConnect, AMD's new external graphics standard. It requires a BIOS that supports external GPUs; AMD lists the Razer Blade Stealth as one such system. Also noteworthy: Eyefinity can now be enabled with just two displays, and display scaling can be set per-game. I avoid configuring drivers per-application myself, even for my Wacom tablet, but that's probably great for those who do.
As a final note: the Ashes of the Singularity 2.0 benchmark now supports DirectFlip.
If you have a recent AMD GPU, grab the drivers from AMD's website.
Subject: Graphics Cards | March 9, 2016 - 08:30 PM | Scott Michaud
Tagged: ubuntu, graphics drivers, graphics driver, amd
AMD has been transitioning their kernel driver from the closed-source fglrx to the open-source AMDGPU driver that was announced last year. This forms the base that both closed and open user-mode drivers will utilize. For the upcoming Ubuntu 16.04 LTS, Canonical has decided to deprecate fglrx and remove it from the system upon upgrade. Users can then choose to install an AMDGPU-based one, or reinstall the Radeon driver. That will need to be done without Canonical's support, though.
It makes sense that they would choose Ubuntu 16.04 to pull the plug. This is the version that Canonical will be maintaining for the next five years, which could become a headache given that AMD has spent the last year trying to move away from fglrx. AMDGPU is a much safer target as the years roll forward. On the other hand, GPUs prior to Fiji will not have the luxury of choosing, because AMD still hasn't announced AMDGPU support for GCN 1.0 and 1.1. (Update March 9th @ 6pm: Fixed typo)
Subject: Graphics Cards | March 3, 2016 - 08:00 PM | Ryan Shrout
Tagged: uwp, radeon, dx12, amd
AMD's Robert Hallock, frequenter of the PC Perspective live streams and a favorite of the team here, is doing an AMAA on reddit today. While you can find some excellent information and views from Robert in that Q&A session, two particular answers stood out to me.
Asked by user CataclysmZA: Can you comment on the recent developments regarding Ashes of the Singularity and DirectX 12 in PC Perspective and Extremetech's tests? Will changes in AMD's driver to include FlipEx support fix the framerate issues and allow high-refresh monitor owners to enjoy their hardware fully? http://www.pcper.com/reviews/General-Tech/PC-Gaming-Shakeup-Ashes-Singularity-DX12-and-Microsoft-Store
Answer from Robert: We will add DirectFlip support shortly.
Well, there you have it. This is the first official notice I have from AMD that it is in fact its driver that was causing the differences in behavior between Radeon and GeForce cards in Ashes of the Singularity last week. It appears that a new driver will be incoming (sometime) that will enable DirectFlip / FlipEx, allowing exclusive full screen modes in DX12 titles. Some of our fear of the unknown can likely be resolved - huzzah!
Ashes of the Singularity wouldn't enter exclusive full screen mode on AMD Radeon hardware.
Another question also piqued my interest:
Asked by user CataclysmZA: Can you comment on how FreeSync is affected by the way games sold through the Windows Store run in borderless windowed mode?
Answer from Robert: This article discusses the issue thoroughly. Quote: "games sold through Steam, Origin [and] anywhere else will have the ability to behave with DX12 as they do today with DX11."
While not exactly spelling it out, this answer seems to indicate that for the time being, AMD doesn't think FreeSync will work with Microsoft Store sold games in the forced borderless windowed mode. NVIDIA has stated that G-Sync works in some scenarios with the new Gears of War (a Universal Windows Platform app), but it seems they too have issues.
As more information continues to come in, from whatever sources we can validate, I'll keep you updated!
Clockspeed Jump and More!
On March 1st AMD announced the availability of two new processors as well as more information on the A10 7860 APU.
The two new units are the A10-7890K and the Athlon X4 880K. These are both Kaveri-based parts, though the Athlon of course has the GPU portion disabled. Product refreshes for the past several years have followed a far different schedule than in the days of yore. Remember when the Phenom II series and the competing Core 2 series would get clockspeed updates yearly, if not every half year, with a slightly faster top-end performer to garner top dollar from consumers?
Things have changed, for better or worse. We have so far seen two clockspeed bumps for the Kaveri/Godavari-based APUs. Kaveri was first introduced over two years ago with the A10-7850K and its lower-end derivatives. The 7850K has a clockspeed that ranges from 3.7 GHz to a maximum of 4 GHz with boost, and its GPU portion is clocked at 720 MHz. This is a 95 watt TDP part built on GLOBALFOUNDRIES' 28 nm HKMG process.
The new top-end A10-7890K is clocked at 4.1 GHz base and 4.3 GHz max. The GPU receives a significant boost in performance with a clockspeed of 866 MHz. The combined CPU and GPU clockspeed increases push the total compute performance of the part past 1 TFLOPS. It features the same dual-module/quad-core Godavari design as well as the same 8 GCN compute units. The interesting part here is that the APU does not exceed the 95 watt TDP it shares with the older and slower 7850K. It is also a step up from last year's refresh, the A10-7870K, which is clocked 200 MHz slower on the CPU portion but shares the 866 MHz GPU speed. This APU is fully unlocked, so a user can easily overclock both the CPU and GPU cores.
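The 1 TFLOPS figure roughly checks out if you assume single-precision FMA peak rates. The per-clock FLOP counts below are my assumptions (2 FLOPs per GCN shader per clock, ~8 SP FLOPs per Steamroller core per clock), not AMD-published numbers:

```python
# Rough check of the ">1 TFLOPS" claim for the A10-7890K using
# single-precision peak rates. Per-clock FLOP counts are assumptions
# based on GCN FMA throughput and the Steamroller FlexFPU, not
# figures AMD has published for this part.

GPU_CUS = 8                 # GCN compute units
SHADERS_PER_CU = 64         # shaders per compute unit
GPU_CLOCK_GHZ = 0.866
CPU_CORES = 4
CPU_BOOST_GHZ = 4.3

gpu_gflops = GPU_CUS * SHADERS_PER_CU * GPU_CLOCK_GHZ * 2   # FMA = 2 FLOPs/clock
cpu_gflops = CPU_CORES * CPU_BOOST_GHZ * 8                  # assumed 8 SP FLOPs/core/clock
total_gflops = gpu_gflops + cpu_gflops

print(f"GPU: {gpu_gflops:.0f} GFLOPS, CPU: {cpu_gflops:.0f} GFLOPS, "
      f"total: {total_gflops:.0f} GFLOPS")
```

Under these assumptions the GPU contributes roughly 887 GFLOPS and the CPU around 138 GFLOPS, landing just over the 1 TFLOPS mark.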
The Athlon X4 880K is still based on the Godavari family rather than the Carrizo update that the X4 845 uses. It again retains the 95 watt TDP rating of the previous Athlon X4 CPUs. Previously the X4 860K was the highest-clocked unit at 3.7 to 4.0 GHz, but the 880K raises that to 4.0 to 4.2 GHz; a 300 MHz gain in base clock is pretty significant, as is stretching the boost ceiling to 4.2 GHz. The Godavari modules retain their full amount of L2 cache, so the 880K has 4 MB available to it. These parts are very popular with budget enthusiasts and gaming builds, as they are extremely inexpensive, perform at an acceptable level, and have free overclocking thrown in.
Earlier this week Samsung formally made a couple of announcements for new monitors due out this spring. The CF591 and CF390 range in size from 23 to 27 inches, mating a 1920x1080 resolution with an 1800R curvature and an attractive design. Even better news for gamers, all of the monitors in these two series will offer AMD's variable refresh rate technology known as FreeSync over HDMI.
The specifications of the monitors are interesting in their own light. The CF390 will be available in both 23.5-in and 27-in varieties, with a 1920x1080 resolution on a VA panel, a 4ms response time rating and a maximum brightness of 250 nits. The VA technology allows for solid viewing angles and color reproduction though all of them are limited to a 60Hz maximum refresh rate. The CF591 monitor is only available in a 27-in variety, shares almost all of the same traits, but sheds the glossy black design for a silver and white color option.
The CF390 features only VGA (D-Sub) and HDMI inputs, while the CF591 offers VGA, dual HDMI, and a single DisplayPort connection as well. Only the CF591 allows for audio input through a 3.5mm connection.
The supposed value of HDMI-based FreeSync is ubiquity and lower cost. Unfortunately, we don't have any pricing information from Samsung on either the CF390 or CF591 monitors, leaving a big question mark for AMD Radeon gamers that might be looking for a new display. Also, while the CF390 directly benefits from the addition of HDMI support on FreeSync, the CF591 still has a DisplayPort connection, meaning the value of HDMI-based FreeSync is lessened.
The 60Hz maximum refresh rate is disappointing in a world where 75Hz, 90Hz, even 165Hz monitors are being released left and right. Will AMD's driver-based frame doubling technology work on these displays? I have an inquiry in to AMD to verify, but it might be difficult given the VA panels' minimum refresh rate. To be fair to AMD and Samsung, though, these aren't marketed as gaming monitors, just monitors that happen to have a very gaming-friendly option.
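For reference, the driver-based frame doubling in question works by repeating frames when the game falls below the panel's minimum refresh, so the effective refresh rate lands back inside the variable refresh window. A minimal sketch of that idea; the 40-60 Hz window is hypothetical, chosen to show why a narrow VRR range is a problem:

```python
# Sketch of the frame-multiplication idea behind driver-side low-framerate
# compensation: below the panel's minimum refresh, repeat each frame until
# the resulting rate re-enters the VRR window. Window values here are
# illustrative, not Samsung's actual specs.
import math

def refresh_for(frame_rate_fps, vrr_min, vrr_max):
    """Refresh rate the driver would run the panel at, or None if impossible."""
    if frame_rate_fps >= vrr_min:
        return min(frame_rate_fps, vrr_max)   # in range: refresh tracks frame rate
    multiplier = math.ceil(vrr_min / frame_rate_fps)
    target = frame_rate_fps * multiplier      # each frame shown `multiplier` times
    return target if target <= vrr_max else None  # narrow windows can't always comply

print(refresh_for(25, vrr_min=40, vrr_max=60))  # frame doubling lands at 50 Hz
print(refresh_for(35, vrr_min=40, vrr_max=60))  # doubling overshoots 60 Hz: no fit
```

Note that the scheme generally needs a maximum refresh at least twice the minimum; with a narrow window there are frame rates where no whole-number multiplier fits, which is exactly why the panels' minimum refresh rate matters here.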
Both of these monitors look pretty sexy though; we need to see and test them in person to see if the image quality and FreeSync performance meet our expectations. Hopefully we'll be able to do so soon, but until then, let's hope that Samsung is able to release these at very competitive prices to help drive down the cost of VRR.
Things are about to get...complicated
Earlier this week, the team behind Ashes of the Singularity released an updated version of its early access game, which updated its features and capabilities. With support for DirectX 11 and DirectX 12, and adding in multiple graphics card support, the game featured a benchmark mode that got quite a lot of attention. We saw stories based on that software posted by Anandtech, Guru3D and ExtremeTech, all of which had varying views on the advantages of one GPU or another.
That isn’t the focus of my editorial here today, though.
Shortly after the initial release, a discussion began around results from the Guru3D story that measured frame time consistency and smoothness with FCAT, a capture-based testing methodology much like the Frame Rating process we use here at PC Perspective. In a post on ExtremeTech, Joel Hruska claims that the results and conclusion from Guru3D are wrong because FCAT's capture method assumes the display output matches what the user actually experiences. Maybe everyone is wrong?
First a bit of background: I have been working with Oxide and the Ashes of the Singularity benchmark for a couple of weeks, hoping to get a story that I was happy with and felt was complete, before having to head out the door to Barcelona for the Mobile World Congress. That didn’t happen – such is life with an 8-month old. But, in my time with the benchmark, I found a couple of things that were very interesting, even concerning, that I was working through with the developers.
FCAT overlay as part of the Ashes benchmark
First, the initial implementation of the FCAT overlay, which Oxide should be PRAISED for including since we don't have (and likely won't have) a universal DX12 variant, was implemented incorrectly, with duplicated color swatches that made the results from capture-based testing inaccurate. I don't know if Guru3D used that version for its FCAT testing, but I was able to get updated EXEs of the game from the developer in order to get the overlay working correctly. Once that was corrected, I found yet another problem: an issue of frame presentation order on NVIDIA GPUs that likely has to do with asynchronous shaders. Whether that issue is on the NVIDIA driver side or the game engine side is still being investigated by Oxide, but it's interesting to note that this problem couldn't have been found without a proper FCAT implementation.
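For readers unfamiliar with how the overlay is actually consumed, here is a toy version of what a capture-based tool does with it: each rendered frame draws the next color from a known repeating sequence along one screen edge, and the analysis scans each captured refresh top to bottom, splitting it into per-frame slices. The color palette and runt threshold below are illustrative, not the real FCAT values:

```python
# Toy version of capture-based overlay analysis (FCAT / Frame Rating).
# Each game frame stamps the next color from a known repeating palette;
# the analyzer splits one captured refresh into contiguous color runs.
# Palette and runt threshold are illustrative, not FCAT's real values.

SEQUENCE = ["red", "green", "blue", "white"]  # hypothetical repeating palette

def slice_frames(scanline_colors, runt_threshold=20):
    """Split one captured refresh into (color, height, is_runt) runs."""
    runs = []
    for color in scanline_colors:
        if runs and runs[-1][0] == color:
            runs[-1][1] += 1
        else:
            runs.append([color, 1])
    # a "runt" is a frame occupying only a sliver of the refresh
    return [(c, h, h < runt_threshold) for c, h in runs]

# A 1080-line refresh containing two full frames and one 8-line runt:
capture = ["red"] * 600 + ["green"] * 472 + ["blue"] * 8
print(slice_frames(capture))
```

The failure mode of duplicated color swatches is clear from this sketch: if two consecutive frames stamp the same color, their runs merge and the analyzer undercounts frames, which is why the overlay bug invalidated capture-based results.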
With all of that under the bridge, I set out to benchmark this latest version of Ashes and DX12 to measure performance across a range of AMD and NVIDIA hardware. The data showed some abnormalities, though. Some results just didn’t make sense in the context of what I was seeing in the game and what the overlay results were indicating. It appeared that Vsync (vertical sync) was working differently than I had seen with any other game on the PC.
For the NVIDIA platform, tested using a GTX 980 Ti, the game seemingly randomly starts up with Vsync on or off, with no clear indicator of the cause, despite the in-game settings being set how I wanted them. But the Frame Rating capture data was still usable, as I expected – Vsync being enabled doesn't mean you can't examine the results with capture-based methods, and I have written stories on what Vsync-enabled captured data looks like and what it means as far back as April 2013. Obviously, to get the best and most relevant data from Frame Rating, setting vertical sync off is ideal. Running into more frustration than answers, I moved over to an AMD platform.
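One way to sanity-check captured data for this kind of behavior is to test whether frame times quantize to multiples of the refresh interval, which is the telltale signature of Vsync. A minimal sketch, with an assumed 60 Hz panel and tolerance:

```python
# Heuristic for spotting Vsync in frame-time data: with Vsync on at 60 Hz,
# frame times collapse onto multiples of ~16.7 ms. The refresh rate and
# tolerance here are assumptions for illustration.

REFRESH_MS = 1000 / 60  # ~16.67 ms per refresh at 60 Hz

def looks_vsynced(frame_times_ms, tolerance_ms=1.0):
    """True if every frame time sits near a nonzero multiple of the refresh interval."""
    for ft in frame_times_ms:
        quantized = round(ft / REFRESH_MS) * REFRESH_MS
        if quantized == 0 or abs(ft - quantized) > tolerance_ms:
            return False
    return True

print(looks_vsynced([16.7, 16.6, 33.4, 16.7]))   # quantized pattern: Vsync likely on
print(looks_vsynced([12.1, 14.8, 19.3, 11.2]))   # free-running: Vsync likely off
```

A check like this is how you can tell from the data alone, rather than the in-game setting, which mode a run actually executed in.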