Subject: Graphics Cards | March 9, 2016 - 08:30 PM | Scott Michaud
Tagged: ubuntu, graphics drivers, graphics driver, amd
AMD has been transitioning their kernel driver from the closed-source fglrx to the open-source AMDGPU driver that was announced last year. This forms the base that both closed and open user-mode drivers will utilize. For the upcoming Ubuntu 16.04 LTS, Canonical has decided to deprecate fglrx and remove it from the system upon upgrade. Users can then choose to install an AMDGPU-based one, or reinstall the Radeon driver. That will need to be done without Canonical's support, though.
It makes sense that they would choose Ubuntu 16.04 to pull the plug. This is the version that Canonical will be maintaining for the next five years, which would make supporting fglrx a headache when AMD has spent the last year trying to move away from it. AMDGPU is a much safer target as the years roll forward. On the other hand, GPUs prior to Fiji will not have the luxury of choosing, because AMD still hasn't announced AMDGPU support for GCN 1.0 and 1.1. (Update March 9th @ 6pm: Fixed typo.)
Subject: Graphics Cards | March 9, 2016 - 04:55 PM | Scott Michaud
Tagged: nvidia, graphics drivers
The last couple of days were not too great for software patches. Microsoft released a Windows 10 update that breaks 5K monitors, and NVIDIA's driver bug, mentioned in the last post, was bigger than they realized. It turns out that the issue is not isolated to multiple monitors, but rather had something to do with choosing “Express Install” in the setup screen.
In response, NVIDIA has removed 364.47 from their website. For those who want “Game Ready” drivers for games like “The Division,” NVIDIA has provided a 364.51 beta driver that supposedly corrects this issue. People on the forums still claim to have problems with this driver, but nothing has been confirmed yet. It's difficult to tell whether other issues exist with the drivers, whether users are having unrelated issues that are attributed to the drivers, or if it's just a few hoaxes. (Update on March 9th @ 12:41pm: Still nothing confirmed, but one of our commenters claims to have experienced issues personally.) If you are concerned, then you can roll back to 362.00.
Fortunately for me, I chose to clean install 364.47 and have not had any issues with it. I asked a representative from NVIDIA on Twitter whether I should upgrade to 364.51, and he said that a few other bugs were fixed but I shouldn't bother.
If you managed to properly install 364.47, then you should be fine staying there.
Subject: General Tech, Graphics Cards | March 9, 2016 - 03:18 AM | Ryan Shrout
Tagged: video, polygon.com, ben kuchera, VR, htc, vive, Oculus, rift
During our 12-hour live streaming event cleverly titled "Streaming Out Loud", we invited Ben Kuchera from Polygon.com to stop in and talk about a subject he is very passionate about: virtual reality. Ben has been a VR enthusiast since the beginning, getting a demo of the first Rift prototype from John Carmack himself. He was able to bring the HTC Vive Pre unit over to the office for some show and tell, answering questions about his experiences so far, hardware requirements, and much more.
Subject: Graphics Cards | March 7, 2016 - 11:15 PM | Scott Michaud
Tagged: vulkan, nvidia, graphics drivers, game ready
This new driver from NVIDIA brings Vulkan support to their current, supported branch. This is particularly interesting for me, because the Vulkan branch used to pre-date fixes for Adobe Creative Cloud, which meant that things like “Export As...” in Photoshop CC didn't work and After Effects CC would crash. The driver is also WHQL-certified, and it rolls in all of the “Game Ready” fixes and optimizations that have been released since ~October, most of which are new to Vulkan's branch.
... This is going to be annoying to temporarily disable...
Speaking of which, the GeForce 364.47 driver is itself classified as “Game Ready.” The four titles optimized with this release are Tom Clancy's The Division, Need for Speed, Hitman, and Ashes of the Singularity. If you are interested in playing those games, then this driver is what NVIDIA recommends that you use.
Note that an installation bug has been reported, however. When installing with multiple monitors, NVIDIA suggests that you disable all but one during the setup process, but you can safely re-enable them after. For me, with four monitors and a fairly meticulous desktop icon layout, this was highly annoying, but something I've had to deal with over time (especially the last two, beta Vulkan drivers). It's probably a good idea to close all applications and screenshot your icons before running the installer.
Subject: Graphics Cards | March 4, 2016 - 09:48 PM | Sebastian Peak
Tagged: PCIe power, PCI Express, nvidia, GTX 950 2G, gtx 950, graphics card, gpu, geforce, asus, 75W
ASUS has released a new version of the GTX 950 called the GTX 950 2G, and the interesting part isn't what's been added, but what was taken away; namely, the PCIe power requirement.
When NVIDIA announced the GTX 950 (which Ryan reviewed here), it carried a TDP of 90W, which prevented it from running without a PCIe power connector. The GTX 950 was (seemingly) the replacement for the GTX 750, which didn't require anything beyond motherboard power via the PCIe slot, and the same held true for the more powerful GTX 750 Ti. Without the need for PCIe power, the GTX 750 Ti became our (and many others') default recommendation for turning any PC into a gaming machine (an idea we just happened to cover in depth here).
Here's a look at the specs from ASUS for the GTX 950 2G:
- Graphics Engine: NVIDIA GeForce GTX 950
- Interface: PCI Express 3.0
- Video Memory: GDDR5 2GB
- CUDA Cores: 768
- Memory Clock: 6610 MHz
- Memory Interface: 128-bit
- Engine Clock
  - Gaming Mode (Default): GPU Boost Clock 1190 MHz, GPU Base Clock 1026 MHz
  - OC Mode: GPU Boost Clock 1228 MHz, GPU Base Clock 1051 MHz
- Interface: HDMI 2.0, DisplayPort, DVI
- Power Consumption: Up to 75W, no additional PCIe power required
- Dimensions: 8.3 x 4.5 x 1.6 inches
Whether this model has any relation to the rumored "GTX 950 SE/LP" remains to be seen (and, other than power, this card appears to have stock GTX 950 specs), but the option of adding a GPU without concern over power requirements makes this a very attractive upgrade proposition for older builds or OEM PCs, depending on cost.
The full model number is ASUS GTX950-2G, and a listing is up on Amazon, though seemingly only a placeholder at the moment. (Link removed. The listing was apparently for an existing GTX 950 product.)
Subject: Graphics Cards | March 3, 2016 - 08:00 PM | Ryan Shrout
Tagged: uwp, radeon, dx12, amd
AMD's Robert Hallock, frequenter of the PC Perspective live streams and a favorite of the team here, is doing an AMAA on reddit today. While you can find some excellent information and views from Robert in that Q&A session, two particular answers stood out to me.
Asked by user CataclysmZA: Can you comment on the recent developments regarding Ashes of the Singularity and DirectX 12 in PC Perspective and Extremetech's tests? Will changes in AMD's driver to include FlipEx support fix the framerate issues and allow high-refresh monitor owners to enjoy their hardware fully? http://www.pcper.com/reviews/General-Tech/PC-Gaming-Shakeup-Ashes-Singularity-DX12-and-Microsoft-Store
Answer from Robert: We will add DirectFlip support shortly.
Well, there you have it. This is the first official notice I have seen from AMD that its driver was in fact causing the differences in behavior between Radeon and GeForce cards in Ashes of the Singularity last week. It appears that a new driver is incoming (sometime) that will enable DirectFlip / FlipEx, allowing exclusive full screen modes in DX12 titles. Some of our fear of the unknown can likely be resolved - huzzah!
Ashes of the Singularity wouldn't enter exclusive full screen mode on AMD Radeon hardware.
Another question also piqued my interest:
Asked by user CataclysmZA: Can you comment on how FreeSync is affected by the way games sold through the Windows Store run in borderless windowed mode?
Answer from Robert: This article discusses the issue thoroughly. Quote: "games sold through Steam, Origin [and] anywhere else will have the ability to behave with DX12 as they do today with DX11."
While not exactly spelling it out, this answer seems to indicate that for the time being, AMD doesn't think FreeSync will work with Microsoft Store sold games in the forced borderless windowed mode. NVIDIA has stated that G-Sync works in some scenarios with the new Gears of War (a Universal Windows Platform app), but it seems they too have issues.
As more information continues to come in, from whatever sources we can validate, I'll keep you updated!
Subject: Graphics Cards | March 3, 2016 - 04:54 PM | Ryan Shrout
Tagged: uwp, uwa, universal windows platform, microsoft, full screen, dx12, DirectX 12
With all of the debate and discussion that followed the second release of Ashes of the Singularity's DX12 benchmark mode, including questions about full screen capabilities on AMD hardware and the impact the Microsoft Store and Universal Windows Platform would have on PC gaming, we went to the source of the debate to try and get some feedback. Microsoft was willing to talk about the issues that arose from this most recent storm, though honestly what it is willing to say on the record today is limited.
When asked specifically about the UWP and PC games made available on the Windows 10 Store, Microsoft reiterated its desire to work with gamers and the community to find what works.
“UWP (Universal Windows Platform) allows developers to create experiences that are easily deployed across all Windows 10 devices, from PCs to tablets to phones to Xbox One. When it comes to a UWP game on Windows 10 PCs, we’re early in our journey. We’re listening to the feedback from the community – multiple GPUs, SLI, crossfire, v-sync, etc. We’re embracing the feedback and working to ensure gamers on Windows 10 have a great experience. We’ll have more to discuss in the coming months.” – a Microsoft spokesperson
It's good to know that Microsoft is listening to the media and gamers and seems willing to make changes based on feedback. It remains to be seen, though, which of this feedback gets implemented and in what time frame.
Universal Windows Platform
One particular fear for some gamers is that Microsoft would attempt to move to the WDDM compositing model not just for games sold in the Windows Store, but for all games that run on the OS. I asked Microsoft directly:
To answer your question, can we assume that those full screen features that work today with DX12 will work in the future as well – yes.
This should ease the worries of people thinking the very worst for Windows and DX12 gaming going forward. As long as DX12 allows for games to enter into an exclusive full screen mode, like the FlipEx option we discussed in a previous story, games sold through Steam, Origin and anywhere else will have the ability to behave with DX12 as they do today with DX11.
Windows 10 Store
I have some meetings set up with various viewpoints on this debate for GDC in a couple of weeks, so expect more then!
Subject: Graphics Cards | March 2, 2016 - 10:30 PM | Sebastian Peak
Tagged: nvidia, geforce, game ready, 362.00 WHQL
The new Far Cry game is out (Far Cry Primal), and for NVIDIA graphics card owners this means a new GeForce Game Ready driver. The 362.00 WHQL certified driver provides “performance optimizations and a SLI profile” for the new game, and is now available via GeForce Experience as well as the manual driver download page.
(Image credit: Ubisoft)
The 362.00 WHQL driver also supports the new Gears of War: Ultimate Edition, which is a remastered version of the 2007 PC version of the game that includes Windows 10-only enhancements such as 4K resolution support and unlocked frame rates. (Why these "need" to be Windows 10 exclusives can be explained by checking the name of the game’s publisher: Microsoft Studios.)
(Image credit: Microsoft)
Here’s a list of what’s new in version 362.00 of the driver:
- Added beta support on GeForce GTX GPUs for external graphics over Thunderbolt 3. Supported GPUs include all GTX 900 series cards, the Titan X, and the GeForce GTX 750 and 750 Ti.
- As of Windows 10 November Update, Fermi GPUs now use WDDM 2.0 in single GPU configurations.
For multi-GPU configurations, WDDM usage is as follows:
- In non-SLI multi-GPU configurations, Fermi GPUs use WDDM 2.0. This includes configurations where a Fermi GPU is used with Kepler or Maxwell GPUs.
- In SLI mode, Fermi GPUs still use WDDM 1.3.
Application SLI Profiles
Added or updated the following SLI profiles:
- Assassin's Creed Syndicate - SLI profile changed (with driver code as well) to make the application scale better
- Bless - DirectX 9 SLI profile added, SLI set to SLI-Single
- DayZ - SLI AA and NVIDIA Control Panel AA enhance disabled
- Dungeon Defenders 2 - DirectX 9 SLI profile added
- Elite Dangerous - 64-bit EXE added
- Hard West - DirectX 11 SLI profile added
- Metal Gear Solid V: The Phantom Pain - multiplayer EXE added to profile
- Need for Speed - profile EXEs updated to support trial version of the game
- Plants vs Zombies Garden Warfare 2 - SLI profile added
- Rise of the Tomb Raider - profile added
- Sebastien Loeb Rally Evo - profile updated to match latest app behavior
- Tom Clancy's Rainbow Six: Siege - profile updated to match latest app behavior
- Tom Clancy's The Division - profile added
- XCOM 2 - SLI profile added (including necessary code change)
The "beta support on GeForce GTX GPUs for external graphics over Thunderbolt 3" is certainly an interesting addition, and one that could eventually lead to external solutions for notebooks, coming on the heels of AMD teasing its own standardization of external GPUs.
The full Release 361 (GeForce 362.00) notes can be viewed here (warning: PDF).
Subject: Graphics Cards, Processors | February 29, 2016 - 11:48 PM | Scott Michaud
Tagged: tesla motors, tesla, SoC, Peter Bannon, Jim Keller
When we found out that Jim Keller has joined Tesla, we were a bit confused. He is highly skilled in processor design, and he moved to a company that does not design processors. Kind of weird, right? There are two possibilities that leap to mind: either he wanted to try something new in life, and Elon Musk hired him for his general management skills, or Tesla wants to get more involved in the production of their SoCs, possibly even designing their own.
Now Peter Bannon, who was a colleague of Jim Keller's at Apple, has been hired by Tesla Motors. Chances are slim that both of them were independently interested in an abrupt career change that just happened to lead them to the same company. So it appears that Tesla Motors wants experienced chip designers in house. What for? We don't know. This is a lot of talent just to look over the shoulders of NVIDIA and other SoC partners to make sure Tesla has the upper hand in negotiations. Jim Keller is at Tesla as their “Vice-President of Autopilot Hardware Engineering.” We don't know what Peter Bannon's title will be.
And then, if Tesla Motors does get into creating its own hardware, we wonder what it will do with it. The company has a history of open development and releasing patents (etc.) to the public. That said, SoC design is a highly patent-encumbered field, depending on what specifically they are doing, which we have no idea about.
Things are about to get...complicated
Earlier this week, the team behind Ashes of the Singularity released an updated version of its early access game, which updated its features and capabilities. With support for DirectX 11 and DirectX 12, and adding in multiple graphics card support, the game featured a benchmark mode that got quite a lot of attention. We saw stories based on that software posted by Anandtech, Guru3D and ExtremeTech, all of which had varying views on the advantages of one GPU or another.
That isn’t the focus of my editorial here today, though.
Shortly after the initial release, a discussion began around results from the Guru3D story that measured frame time consistency and smoothness with FCAT, a capture-based testing methodology much like the Frame Rating process we have here at PC Perspective. In that post on ExtremeTech, Joel Hruska claims that the results and conclusion from Guru3D are wrong because FCAT's capture methods assume that the output matches what the user actually experiences. Maybe everyone is wrong?
First a bit of background: I have been working with Oxide and the Ashes of the Singularity benchmark for a couple of weeks, hoping to get a story that I was happy with and felt was complete, before having to head out the door to Barcelona for the Mobile World Congress. That didn’t happen – such is life with an 8-month old. But, in my time with the benchmark, I found a couple of things that were very interesting, even concerning, that I was working through with the developers.
FCAT overlay as part of the Ashes benchmark
First, the initial FCAT overlay, which Oxide should be PRAISED for including since we don't have, and likely won't have, a universal DX12 variant, was implemented incorrectly, with duplicated color swatches that made the results from capture-based testing inaccurate. I don't know if Guru3D used that version to do its FCAT testing, but I was able to get some updated EXEs of the game from the developer in order to get the overlay working correctly. Once that was corrected, I found yet another problem: an issue of frame presentation order on NVIDIA GPUs that likely has to do with asynchronous shaders. Whether that issue is on the NVIDIA driver side or the game engine side is still being investigated by Oxide, but it's interesting to note that this problem couldn't have been found without a proper FCAT implementation.
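To see why duplicated swatches are fatal to this kind of analysis, here is a toy sketch of the idea behind capture-based testing: each rendered frame is tagged with the next color in a repeating palette, and the captured output is walked to classify what the display actually showed. The four-color palette and the classification rules below are simplified stand-ins, not FCAT's actual sequence or tooling.

```python
# Hypothetical 4-color palette standing in for the overlay's real sequence.
PALETTE = ["red", "green", "blue", "white"]

def analyze_capture(captured_colors):
    """Classify each captured display frame by its overlay color:
    'new'    - next color in sequence: a new frame was presented
    'repeat' - same color as before: the old frame was shown again
    'skip'   - sequence jumped: one or more frames never hit the display
    """
    events = []
    prev_idx = None
    for color in captured_colors:
        idx = PALETTE.index(color)
        if prev_idx is None:
            events.append("new")
        elif idx == prev_idx:
            events.append("repeat")
        elif idx == (prev_idx + 1) % len(PALETTE):
            events.append("new")
        else:
            events.append("skip")
        prev_idx = idx
    return events

# A capture of red, green, green, white: one repeated frame, then a skip.
print(analyze_capture(["red", "green", "green", "white"]))
```

The whole method hinges on each frame getting a unique next-in-sequence color: if the overlay itself duplicates a swatch, a "repeat" becomes indistinguishable from two distinct frames, which is exactly how the buggy overlay skewed results.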
With all of that under the bridge, I set out to benchmark this latest version of Ashes and DX12 to measure performance across a range of AMD and NVIDIA hardware. The data showed some abnormalities, though. Some results just didn’t make sense in the context of what I was seeing in the game and what the overlay results were indicating. It appeared that Vsync (vertical sync) was working differently than I had seen with any other game on the PC.
For the NVIDIA platform, tested using a GTX 980 Ti, the game seemingly randomly starts up with Vsync on or off, with no clear indicator of what is causing it, despite the in-game settings being set how I wanted them. But the Frame Rating capture data was still working as I expected; just because Vsync is enabled doesn't mean you can't analyze the results in capture formats. I have written stories on what Vsync-enabled captured data looks like, and what it means, as far back as April 2013. Obviously, to get the best and most relevant data from Frame Rating, setting vertical sync off is ideal. Running into more frustration than answers, I moved over to an AMD platform.
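That randomly-engaging Vsync can be spotted crudely from frame times alone: with vertical sync on, present-to-present intervals snap to whole multiples of the display's refresh interval. This is only a toy heuristic under that assumption, not the Frame Rating capture pipeline:

```python
def looks_vsynced(frame_times_ms, refresh_ms=16.667, tol=0.5):
    """Heuristic Vsync check: every frame interval should sit within
    `tol` ms of a whole multiple of the refresh interval (16.667 ms
    for a 60 Hz panel). The tolerance is an arbitrary choice here."""
    def near_multiple(t):
        n = round(t / refresh_ms)
        return n >= 1 and abs(t - n * refresh_ms) <= tol
    return all(near_multiple(t) for t in frame_times_ms)

# Intervals snapping to 1x and 2x the 60 Hz refresh suggest Vsync on;
# scattered sub-refresh intervals suggest it is off.
print(looks_vsynced([16.7, 16.6, 33.3]))  # multiples of 16.667 ms
print(looks_vsynced([9.2, 11.5, 8.8]))    # unconstrained frame times
```

A run that flips between passing and failing this check across launches, with identical in-game settings, would match the behavior described above.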