To the Max?
Much of the PC enthusiast internet, including our comments section, has been abuzz with “Asynchronous Shader” discussion. Normally I would explain what the feature is and then outline the issues that surround it, but I would like to swap that order this time. Basically, the Ashes of the Singularity benchmark utilizes Asynchronous Shaders in DirectX 12, but Oxide disables the feature (by Vendor ID) on NVIDIA hardware. They say that this is because, while the driver reports compatibility, “attempting to use it was an unmitigated disaster in terms of performance and conformance”.
AMD's Robert Hallock claims that NVIDIA GPUs, including Maxwell, cannot support the feature in hardware at all, while all AMD GCN graphics cards do. NVIDIA has yet to respond to our requests for an official statement, although we haven't poked every one of our contacts yet. We will certainly update and/or follow up if we hear from them. For now, though, we have no idea whether this is a hardware or software issue. Either way, it seems to be more than just politics.
So what is it?
Simply put, Asynchronous Shaders allow a graphics driver to cram workloads into portions of the GPU that are idle but not otherwise accessible to the current workload. For instance, if a graphics task is hammering the ROPs, the driver would be able to toss an independent physics or post-processing task into the shader units alongside it. Kollock from Oxide Games used the analogy of HyperThreading, which allows two CPU threads to be executed on the same core at the same time, as long as the core has the capacity for both.
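As a toy illustration of the scheduling idea (with hypothetical numbers, and not any real graphics API), here is a sketch of how overlapping an independent compute pass with a graphics pass shortens the frame:

```python
# Toy model of asynchronous shading (not a real GPU API):
# if a compute task is independent of the graphics task, it can
# overlap with the graphics pass instead of running after it.

def frame_time_serial(graphics_ms, compute_ms):
    # Without async shaders, the GPU runs the passes back to back.
    return graphics_ms + compute_ms

def frame_time_async(graphics_ms, compute_ms, idle_fraction):
    # With async shaders, the compute pass can hide inside the
    # shader-unit idle time of the graphics pass; any remainder
    # still runs afterward.
    hidden = min(compute_ms, graphics_ms * idle_fraction)
    return graphics_ms + (compute_ms - hidden)

# Hypothetical frame: 12 ms of graphics, 4 ms of compute,
# shader units idle 25% of the graphics pass.
serial = frame_time_serial(12.0, 4.0)            # 16.0 ms
overlapped = frame_time_async(12.0, 4.0, 0.25)   # 13.0 ms
```

The saving is bounded by the idle time, which is exactly why the measurement discussed below matters.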
Kollock also notes that compute is becoming more important in the graphics pipeline, and that it is possible to bypass the graphics pipeline altogether. The fixed-function bits may never go away, but it's possible that at least some engines will skip them entirely, maybe even Oxide's own engine, several years down the road.
But, as always, you cannot get unlimited performance just by reducing waste. You are always bound by the theoretical limits of your components, and you cannot optimize past them (short of changing the workload itself). The interesting part is that you can measure this: you can absolutely observe how long a GPU sits idle and represent it as a percentage of a time span (typically a frame).
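That measurement can be sketched in a few lines; the intervals below are made up, but the percentage math is exactly the claim above:

```python
# Sketch of the measurement described above: given the busy
# intervals of a GPU unit within one frame, how much of the
# frame was it idle? Intervals are (start_ms, end_ms) pairs,
# assumed non-overlapping; the numbers are hypothetical.

def idle_percentage(frame_ms, busy_intervals):
    busy = sum(end - start for start, end in busy_intervals)
    return 100.0 * (frame_ms - busy) / frame_ms

# A 16.7 ms frame where the shader units were busy for two stretches:
frame = 16.7
busy = [(0.0, 9.0), (11.0, 15.0)]
print(round(idle_percentage(frame, busy), 1))  # prints 22.2
```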
And, of course, game developers profile GPUs from time to time...
According to Kollock, he has heard of some console developers getting up to 30% increases in performance using Asynchronous Shaders. Again, this is on console hardware, so the amount may increase or decrease on the PC. In an informal chat with a developer at Epic Games (so a massive grain of salt is required), his late-night, ballpark, “totally speculative” guesstimate was that, on the Xbox One, the GPU could theoretically accept a maximum of ~10-25% more work in Unreal Engine 4, depending on the scene. He also said that memory bandwidth gets in the way, which Asynchronous Shaders would be fighting against. It is something that they are interested in and investigating, though.
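The bandwidth caveat can be expressed as a back-of-the-envelope model; the fractions below are entirely hypothetical, and the point is just that extra async work is capped by the tighter of the two limits:

```python
# Toy model of the bandwidth caveat: extra async work is limited
# both by how idle the shader units are and by how much memory
# bandwidth the frame has left over. All fractions are hypothetical.

def async_headroom(idle_alu_fraction, spare_bandwidth_fraction):
    # Extra work can only be absorbed up to the tighter of the two limits.
    return min(idle_alu_fraction, spare_bandwidth_fraction)

# Shader units idle 25% of the frame, but only 10% of memory
# bandwidth is unused: bandwidth becomes the bottleneck.
print(async_headroom(0.25, 0.10))  # prints 0.1
```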
This is where I speculate on drivers. When Mantle was announced, I looked at its features and said “wow, this is everything that a high-end game developer wants, and everything that a graphics driver developer does not want to be responsible for”. From the OpenCL-like multiple-GPU model taking much of the QA out of SLI and CrossFire, to the memory and resource binding management, this should make graphics drivers so much easier to write.
It might not be free, though. Graphics drivers might still have a bunch of games to play to make sure that work is stuffed through the GPU as tightly packed as possible. We might continue to see “Game Ready” drivers in the coming years, even though much of that burden has shifted to the game developers. On the other hand, maybe these APIs will level the whole playing field and let all players focus on chip design and efficient ingestion of shader code. As always, painfully always, time will tell.
Subject: Systems, Mobile | September 2, 2015 - 06:00 AM | Sebastian Peak
Tagged: V Nitro, Skylake, NVMe, nvidia, notebook, mu-mimo, laptop, IFA 2015, geforce, aspire V, acer
Acer’s updated V Nitro notebook series has been announced; the notebooks have received the newest Intel mobile processors and have been updated with the latest connectivity and some advanced wireless tech.
The Aspire V 13
"The refreshed Aspire V Nitro Series notebooks and Aspire V 13 support the latest USB 3.1 Type-C port, while 'Black Edition' Aspire V Nitro models support Thunderbolt 3, which brings Thunderbolt to USB Type-C at speeds up to 40Gbps. All models include Qualcomm VIVE 2x2 802.11ac Wi-Fi with Qualcomm MU | EFX MU-MIMO technology."
MU-MIMO devices are just starting to hit the market and the tech promises to eliminate bottlenecks when multiple devices are in use on the same network – with compatible adapters/routers, that is.
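A toy airtime model (with made-up numbers, not real 802.11ac math) shows why serving several clients simultaneously helps:

```python
# Toy airtime model: a single-user router time-slices between
# clients, while a MU-MIMO router can transmit to several at once,
# so per-client throughput doesn't divide by the full client count.
# The link rate and client counts below are hypothetical.

def per_client_throughput(link_mbps, clients, mu_groups=1):
    # mu_groups = how many clients can be served simultaneously.
    slices = -(-clients // mu_groups)  # ceiling division: time slices needed
    return link_mbps / slices

# Four clients sharing a 400 Mbps link:
single_user = per_client_throughput(400, 4)           # 100.0 Mbps each
mu_mimo = per_client_throughput(400, 4, mu_groups=2)  # 200.0 Mbps each
```

In practice the gain depends on antenna counts, client capabilities, and channel conditions, hence the "with compatible adapters/routers" caveat.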
The Aspire V 15 Nitro
What kind of hardware will be offered? Here’s a brief overview:
- 6th Gen Intel Core processors
- Up to 32GB DDR4 system memory
- NVIDIA GeForce graphics
- (SATA) SSD/SSHD/HDD storage options
- Touchscreen option added for the 15-inch model
Additionally, the “Black Edition” models offer a 4K 100% Adobe RGB display option, NVIDIA GeForce GTX 960M up to 4GB, NVMe SSDs, and something called “AeroBlade” thermal exhaust, which Acer said has “the world’s thinnest metallic blades of just 0.1mm thin, which are stronger and quieter”.
The Aspire V 17 Nitro
Pricing will start at $599 for the V Nitro 13, $999 for the V Nitro 15, and $1099 for the V Nitro 17. All versions will be available in the U.S. in October.
Subject: Systems, Mobile | September 2, 2015 - 03:00 AM | Sebastian Peak
Tagged: nvidia, notebooks, Lenovo, laptops, Intel Skylake, Intel Braswell, IFA 2015, ideapad 500S, ideapad 300S, ideapad 100S, Ideapad, gtx, APU, amd
Lenovo has unveiled their reinvented ideapad (now all lowercase) lineup at IFA 2015 in Berlin, and the new laptops feature updated processors including Intel Braswell and Skylake, as well as some discrete AMD and NVIDIA GPU options.
At the entry-level price-point we find the ideapad 100S which does not contain one of the new Intel chips, instead running an Intel Atom Z3735F CPU and priced accordingly at just $189 for the 11.6” version and $259 for the 14” model. While low-end specs (2GB RAM, 32GB/64GB eMMC storage, 1366x768 screen) aren’t going to blow anyone away, these at least provide a Windows 10 alternative to a Chromebook at about the same cost, and to add some style Lenovo is offering the laptop in four colors: blue, red, white, and silver.
Moving up to the 300S we find a 14” laptop (offered in red, black, or white) with Intel Pentium Braswell processors up to the quad-core N3700, and the option of a FHD 1920x1080 display. Memory and storage options will range up to 8GB DDR3L and up to either 256GB SSD or 1TB HDD/SSHD. At 0.86" thick the 300S weighs 2.9 lbs, and prices will start at $479.
A lower-cost ideapad 300, without the “S” and with more basic styling, will be available in sizes ranging from 14” to 17” and prices starting between $399 and $549 for their respective models. A major distinction will be the inclusion of both Braswell and Intel 6th Gen Skylake CPUs, as well as the option of a discrete AMD GPU (R5 330M).
Last we have the ideapad 500S, available in 13.3”, 14”, and 15.6” versions. With Intel 6th Gen processors up to Core i7 like the 300S, these also offer optional NVIDIA GPUs (GTX 920M for the 13.3", 940M for the 14"+) and up to FHD screen resolution. Memory and storage options range up to 8GB DDR3L and up to either 256GB SSD or 1TB HDD/SSHD, and the 500S is a bit thinner and lighter than the 300S, with the 13.3” version 0.76” thick and 3.4 lbs, moving up to 0.81” and 4.6 lbs with the 15.6” version.
A non-S version of the ideapad 500 will also be available, and this will be the sole AMD CPU representative with the option of an all-AMD solution powered by up to the A10-7300 APU, or a combination of R7 350M graphics along with 6th Gen Intel Core processors. 14” and 15” models will be available starting at $399 for the APU model and $499 with an Intel CPU.
All of the new laptops ship with Windows 10 as Microsoft’s newest OS arrived just in time for the back-to-school season.
Subject: Graphics Cards | August 31, 2015 - 07:19 PM | Scott Michaud
Tagged: nvidia, graphics drivers, geforce, drivers
Unlike last week's 355.80 Hotfix, today's driver is fully certified by both NVIDIA and Microsoft (WHQL). According to users on GeForce Forums, this driver includes the hotfix changes, although I am still seeing a few users complain about memory issues under SLI. The general consensus seems to be that a number of bugs were fixed, and that driver quality is steadily increasing. This is also a “Game Ready” driver for Mad Max and Metal Gear Solid V: The Phantom Pain.
NVIDIA's GeForce Game Ready 355.82 WHQL Mad Max and Metal Gear Solid V: The Phantom Pain drivers (inhale, exhale, inhale) are now available for download at their website. Note that Windows 10 drivers are separate from Windows 7 and Windows 8.x ones, so be sure to not take shortcuts when filling out the “select your driver” form. That, or just use GeForce Experience.
Subject: Graphics Cards | August 27, 2015 - 05:23 PM | Scott Michaud
Tagged: windows 10, nvidia, geforce, drivers, graphics drivers
While GeForce Hotfix driver 355.80 is not certified, or even beta, I know that a lot of our readers have issues with SLI in Windows 10. Especially in games like Battlefield 4, memory usage would expand until, apparently, a crash occurred. Since I run a single GPU, I have not experienced this issue and so cannot comment on what happens. I just know that it was very common in the GeForce forums and in our comment section, so it was probably a big problem for many users.
If you are not experiencing this problem, then you probably should not install this driver. This is a hotfix that, as stated above, was released outside of NVIDIA's typical update process. You might experience new, unknown issues. Affected users, on the other hand, have the choice to install the fix now, which could very well be stable, or wait for a certified release later.
You can pick it up from NVIDIA's support site.
Subject: General Tech | August 27, 2015 - 12:59 PM | Ken Addison
Tagged: podcast, video, Nixeus, vue24, freesync, gsync, amd, r9 nano, Fiji, asus, PB258Q, qualcomm, snapdragon 820, nvidia
PC Perspective Podcast #364 - 08/27/2015
Join us this week as we discuss the Nixeus Vue 24 FreeSync Monitor, AMD R9 Nano leaks, GPU Marketshare and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Allyn Malventano, Jeremy Hellstrom, Josh Walrath, and Sebastian Peak
Program length: 1:22:36
Subject: Graphics Cards | August 24, 2015 - 03:43 PM | Jeremy Hellstrom
Tagged: nvidia, moba, maxwell, gtx 950, GM206, geforce, DOTA 2
It is more fun testing at the high end, and the number of MOBA gamers here at PCPer could be described as very sparse, to say the least. Perhaps you are a MOBA gamer looking to play on a 1080p screen, have less than $200 to invest in a GPU, and feel that Ryan somehow missed a benchmark that is important to you. One of the dozens of reviews linked below is likely to have covered the game or specific feature you are looking for. They also represent the gamut of cards available at launch from a wide variety of vendors, both stock and overclocked models. If you just want a quick refresher on the specifications and on what has happened to the pricing of already released models, The Tech Report has handy tables for you to reference here.
"For most of this summer, much of the excitement in the GPU market has been focused on pricey, high-end products like the Radeon Fury and the GeForce GTX 980 Ti. Today, Nvidia is turning the spotlight back on more affordable graphics cards with the introduction of the GeForce GTX 950, a $159.99 offering that promises to handle the latest games reasonably well at the everyman's resolution of 1080p."
Here are some more Graphics Card articles from around the web:
- ASUS GeForce GTX 950 STRIX Graphics Card Review @ Techgage
- MSI GTX 950 Gaming 2G @ Modders-Inc
- EVGA GeForce GTX 950 FTW Edition Video Card Review @ HiTech Legion
- Nvidia GTX 950 Round-Up Review: Three Cards Go Head to Head @ eTeknix
- NVIDIA GeForce GTX 950 Roundup featuring ASUS and MSI @ Neoseeker
- Palit GTX 950 2GB StormX Dual @ Kitguru
- GeForce GTX 950 @ HardwareHeaven
- Asus STRIX Gaming GTX 950 2GB DC2 OC @ Kitguru
- NVIDIA Introduces the GeForce GTX 950 for MOBA Gamers and Shares the GeForce Experience @ OCC
- ZOTAC GeForce GTX 950 AMP! Edition 2 GB @ techPowerUp
- Gigabyte GeForce GTX 950 OC 2 GB @ techPowerUp
- EVGA GeForce GTX 950 SSC 2 GB @ techPowerUp
- ASUS GTX 950 STRIX OC 2 GB @ techPowerUp
- Nvidia GeForce GTX 950 @ Legion Hardware
- The NVIDIA GTX 950 Review @ Hardware Canucks
- Asus, MSI, EVGA GTX 950 Review @ OCC
- The Extreme Cases Where A Sub-$200 NVIDIA GPU Can Beat A $550+ AMD R9 Fury On Linux @ Phoronix
- NVIDIA's GeForce GTX 950 Is A $150+ Bargain For Linux Gamers @ Phoronix
- The NVIDIA GPUs Delivering The Best Performance Per Watt & Per Dollar For Linux Gamers @ Phoronix
- The ASUS GTX 980 Ti STRIX OC Review @ Hardware Canucks
- Inno3D iChill GeForce GTX 980 4GB Ultra Review @ NikKTech
- HIS R7 370 IceQ X2 OC 2GB Video Card Review @ Madshrimps
- ASUS R9 390X STRIX DirectCU III 8G OC @ [H]ard|OCP
Subject: Graphics Cards | August 21, 2015 - 11:30 AM | Sebastian Peak
Tagged: PC, nvidia, Matrox, jpr, graphics cards, gpu market share, desktop market share, amd, AIB, add in board
While we reported recently on the decline of overall GPU shipments, a new report out of Jon Peddie Research covers the add-in board segment to give us a look at the desktop graphics card market. So how are the big two (sorry Matrox) doing?
(Table from the report: GPU supplier market share for this quarter, last quarter, and last year.)
The big news is of course a drop in market share for AMD of 4.5% quarter-to-quarter, and down to just 18% from 37.9% last year. There will be many opinions as to why their share has been dropping in the last year, but it certainly didn't help that the 300-series GPUs are rebrands of 200-series, and the new Fury cards have had very limited availability so far.
The graph from Mercury Research illustrates what is almost a mirror image, with NVIDIA gaining 20% as AMD lost 20%, for a 40% swing in overall share. Ouch. Meanwhile (not pictured), Matrox didn't have a statistically meaningful quarter but still managed to appear on the JPR report with 0.1% market share (somehow) last quarter.
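For what it's worth, the arithmetic checks out; here is a quick sketch using only the figures quoted in this article, with the swing derived by assuming (as the mirror-image graph suggests) that NVIDIA gained roughly what AMD lost:

```python
# Checking the article's numbers: AMD fell 4.5 points
# quarter-to-quarter to 18%, and held 37.9% a year ago.

amd_now = 18.0
amd_qtr_drop = 4.5
amd_last_quarter = amd_now + amd_qtr_drop  # 22.5%
amd_last_year = 37.9

amd_yoy_loss = amd_last_year - amd_now   # ~19.9 points lost by AMD
total_swing = 2 * amd_yoy_loss           # ~39.8 points: the "40% swing"
print(amd_last_quarter, round(total_swing, 1))
```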
The desktop market isn't actually suffering quite as much as the overall PC market, and specifically the enthusiast market.
"The AIB market has benefited from the enthusiast segment PC growth, which has been partially fueled by recent introductions of exciting new powerful (GPUs). The demand for high-end PCs and associated hardware from the enthusiast and overclocking segments has bucked the downward trend and given AIB vendors a needed prospect to offset declining sales in the mainstream consumer space."
But not all is well, considering that the overall add-in board attach rate with desktops "has declined from a high of 63% in Q1 2008 to 37% this quarter". This is indicative of the industry's overall trend toward integrated GPUs, with AMD APUs and Intel processor graphics, as illustrated by this graphic from the report.
The year-to-year numbers show an overall drop of 18.8%, and even with their dominant 81.9% market share, NVIDIA has still seen their shipments decrease by 12% this quarter. These trends seem to indicate a gloomy future for discrete graphics in the coming years, but for now we in the enthusiast community will continue to keep it afloat. It would certainly be nice to see some gains from AMD soon to keep things interesting, which might help bring prices down from their current lofty $400 - $600 mark for flagship cards.
Another Maxwell Iteration
The mainstream end of the graphics card market is about to get a bit more complicated with today’s introduction of the GeForce GTX 950. Based on a slightly cut down GM206 chip, the same used in the GeForce GTX 960 that was released almost 8 months ago, the new GTX 950 will fill a gap in the product stack for NVIDIA, resting right at $160-170 MSRP. Until today that next-down spot from the GTX 960 was filled by the GeForce GTX 750 Ti, the very first iteration of Maxwell (we usually call it Maxwell 1) that came out in February of 2014!
Even though that is a long time to go without refreshing the GTX x50 part of the lineup, NVIDIA was likely hesitant to do so based on the overwhelming success of the GM107 for mainstream gaming. It was low cost, incredibly efficient and didn’t require any external power to run. That led us down the path of upgrading OEM PCs with GTX 750 Ti, an article and video that still gets hundreds of views and dozens of comments a week.
The GTX 950 has some pretty big shoes to fill. I can tell you right now that it uses more power than the GTX 750 Ti, and it requires a 6-pin power connector, but it does so while increasing gaming performance dramatically. The primary competition from AMD is the Radeon R7 370, a Pitcairn GPU that is long in the tooth and missing many of the features that Maxwell provides.
And NVIDIA is taking a secondary angle with the GTX 950 launch: targeting the MOBA players (DOTA 2 in particular) directly and aggressively. With the success of this style of game over the last several years, and the impressive $18M+ purse for the largest DOTA 2 tournament just behind us, there isn't a better area of PC gaming to be going after today. But are the tweaks and changes to the card and software really going to make a difference for MOBA gamers, or is it just marketing fluff?
Let’s dive into everything GeForce GTX 950!
Subject: General Tech, Displays | August 13, 2015 - 06:51 PM | Jeremy Hellstrom
Tagged: nvidia, oculus rift, gameworks vr
NVIDIA has announced the availability of the beta version of their GameWorks VR. As mentioned on this podcast, until now your GPU has treated the Oculus as a secondary monitor, but with this update your graphics driver will talk directly to the Oculus as a separate device, which should help greatly with latency and with the development of the tricks and treats yet to be discovered when programming for this type of interface.
NVIDIA's GameWorks VR, as well as AMD's LiquidVR, will provide a platform for developers to program for the Oculus Rift as well as the competing products from other companies. The new beta SDK from NVIDIA has been updated to support VR SLI and is compatible with the new 350.60 Game Ready drivers. Programmers working with the Maxwell architecture will benefit from Multi-Res Shading, which should increase the performance of current programs. Follow the links if you are interested in developing for Oculus; otherwise, wait patiently for the day you can pre-order one.