Subject: Graphics Cards | November 7, 2016 - 06:00 PM | Scott Michaud
Tagged: nvidia, graphics drivers
Update, November 7th @ 5:25pm EST:
First, NVIDIA gave Ryan their official statement, which I included below verbatim.
GeForce Experience collects data to improve the application experience; this includes crash and bug reports as well as system information needed to deliver the correct drivers and optimal settings. NVIDIA does not share any personally identifiable information collected by GeForce Experience outside the company. NVIDIA may share aggregate-level data with select partners, but does not share user-level data. The nature of the information collected has remained consistent since the introduction of GeForce Experience 1.0. The change with GeForce Experience 3.0 is that this error reporting and data collection is now being done in real-time.
They also pointed to their GeForce Experience FAQ.
It sounds like there's a general consensus, both from NVIDIA and even their harshest critics, that the telemetry only affects GeForce Experience, not the base driver. I still believe that there should be a more granular opt-out that still allows access to GeForce Experience, the way web browsers and Visual Studio prompt with a checkbox during install. Still, if this concerns you (and, as with Windows 10, it might not, and that's okay), you can remove GeForce Experience.
Also, GamersNexus once again did a very technical breakdown of the situation. I think they made an error, though: they claimed to have recorded traffic "for about an hour", which may not have included the once-per-day reporting time from Windows Task Scheduler. (My image below suggests that, at least on my system, it monitors once per hour but reports at 12:25pm and at user login.) I reached out to them on Twitter for clarification, but it looks like they may have just captured GeForce Experience's typical traffic.
Update, November 7th @ 7:15pm EST: Heard back from GamersNexus. They did check at the Windows Task Scheduler time as well, and they claim that they didn't see anything unusual. They aren't finished with their research, though.
Original news, posted November 6th @ 4:25pm EST, below.
Over the last day, users have found NVIDIA Telemetry Monitor added to Windows Task Scheduler. We currently don't know what it is or exactly when it was added, but we do know its schedule. When the user logs in, it runs an application that monitors... something... once every hour while the computer is active. Then, once per day (at just after noon on my PC) and once on login, it runs an application that reports that data, which I assume means sends it to NVIDIA.
Before we begin: NVIDIA (or anyone) should absolutely not be collecting data from personal devices without clearly explaining the bounds and giving a clear option to disable it. Lots of applications, from browsers to software development tools, include crash and error reporting, but they usually, and rightfully, ask you to opt in. Microsoft is receiving a lot of crap for this practice in Windows 10, even with their “Basic” option, and, while most of those points are nonsense, there are grounds for some concern.
I've asked NVIDIA if they have a statement regarding what it is, what it collects, and what their policy will be for opt-in and opt-out. I haven't received a response yet, because I sent it less than an hour ago on a weekend, but we'll keep you updated.
Subject: Graphics Cards | November 5, 2016 - 08:19 PM | Scott Michaud
Tagged: linux, DOTA 2, valve, nvidia, vulkan, opengl
Phoronix published interesting benchmark results for OpenGL vs Vulkan on Linux, across a wide spread of thirteen NVIDIA GPUs. Before we begin, the CPU they chose was an 80W Intel Xeon E3-1280 v5, which fits somewhere between the Skylake-based Core i7-6700K and Core i7-6700 (no suffix). You may think that a Xeon v5 would be based on Broadwell, but, for some reason, Intel chose the E3-1200 v5 series to be based on Skylake. Regardless, the choice of CPU will come into play.
They will apparently follow up this article with AMD results.
A trend arose throughout the whole article. At 1080p, everything, from the GTX 760 to the GTX 1080, was rendering at ~101 FPS on OpenGL and ~115 FPS on Vulkan. The obvious explanation is that the game is 100% CPU-bound on both APIs, but Vulkan is able to relax the main CPU thread enough to squeeze out about 14% more frames.
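That gain is easy to sanity-check with the article's rounded numbers (~101 FPS on OpenGL, ~115 FPS on Vulkan); these are the article's approximate figures, not new data:

```python
# Back-of-the-envelope check of the CPU-bound gain described above,
# using the article's approximate frame rates.
opengl_fps = 101
vulkan_fps = 115

gain = (vulkan_fps - opengl_fps) / opengl_fps
print(f"Vulkan delivers about {gain:.0%} more frames")  # ~14%
```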
The thing is, the Xeon E3-1280 v5 is about as high-end a mainstream CPU as you can get. It runs the most modern architecture and can reach clocks up to 4 GHz on all cores. DOTA 2 can get harsh on the CPU when a lot of units are on screen, but this result is surprisingly low. Then again, I don't have any experience running DOTA 2 benchmarks, so maybe it's a known thing, or maybe even a Linux-version thing?
Moving on, the results get more interesting at 4K. In GPU-bound scenarios, NVIDIA's OpenGL driver shows a fairly large performance advantage. Basically every GPU up to the GTX 1060 runs at a higher frame rate in OpenGL; Vulkan only pulls ahead on the GTX 1070 and GTX 1080, where OpenGL hits that 101 FPS ceiling and Vulkan climbs a little above it.
Again, it will be interesting to see how AMD fares against this line of products, both in Vulkan and OpenGL. Those results will apparently come “soon”.
Subject: General Tech | November 4, 2016 - 05:33 PM | Scott Michaud
Tagged: epic games, valve, htc, vr funhouse, nvidia
In early September, we posted about a VR game jam coming to Hamburg, Germany, hosted by Epic Games, NVIDIA, HTC, and Valve. The companies wanted to increase the amount of content available, so, with the release of the VR Funhouse mod kit, they rented a boat, docked it really well, and let indie developers do their thing around the clock. Seven teams of three to five participated, and the public was invited to play around with the results.
Most of the entries deviated from the literal fun-house theme to some extent. Probably the most original game is one where users play a kid in a candy store, trying to evade detection while gorging on sweet, sweet candy. Go figure, it's called Kid in a Candy Store. The closest to the literal interpretation of the theme is Beer Beer Beer and Sausages, where you serve carnival food, with real beer and mustard fluid simulations.
Two of the games, Beer Beer Beer and Sausages and Waiter Wars, are available for free on the VR Funhouse Steam Workshop page. I'm not sure what happened to the rest. The Unreal Engine post seems to suggest that they are supposed to be here, but maybe some of the teams are looking to polish theirs up a little first.
Subject: Graphics Cards | November 2, 2016 - 07:10 PM | Jeremy Hellstrom
Tagged: pascal, nvidia, GTX1070, GTX1060, GTX 1080, fail, evga, ACX 3.0
Checklist time, readers. Do you have the following:
- A GTX 1060/1070/1080
- Which is from EVGA
- With an ACX 3.0 cooler
- With one of the model numbers above
If not, make like Bobby McFerrin.
If so, you have a reason to be concerned, and EVGA offers their apologies and, more importantly, a fix. EVGA's tests, which emulate the ones performed at Tom's Hardware, show that the temperature of the PWM and memory was only marginally within spec. That is a fancy way of saying that, in certain circumstances, the PWM was running just short of causing a critical thermal incident, also known as catching on fire and letting out the magic smoke. They claim that this was because their testing focused on GPU temperature and the lowest acoustic levels possible, and did not involve measuring the heat produced on the memory or the VRM, which is, as they say, a problem.
You have several choices of remedy from EVGA; please remember that you should reach out directly to their support, not NVIDIA's. You can try requesting a refund from the store you purchased the card at, but your best bet is EVGA.
The first option is a cross-ship RMA: contact EVGA as a guest or with your account to set one up, and they will ship you a replacement card with a new VBIOS that does not have this issue. You won't need to send yours back until the replacement arrives.
You can flash to the new VBIOS, which adjusts the fan-speed curve to ensure that your fans run above 30% and provide sufficient cooling to additional portions of the card. Your card will be louder, but it will also be less likely to commit suicide in dramatic fashion.
Lastly, you can request a thermal pad kit, which EVGA suggests is unnecessary but certainly sounds like a good idea, especially as it is free, although it requires you to sign up for an EVGA account. Hopefully, in the spare seconds currently available to the team, we can get our hands on an ACX 3.0-cooled Pascal card with the VBIOS update and thermal pads so we can verify this for you.
This issue should not have happened, and it does reflect badly on certain aspects of EVGA's testing. Their response, on the other hand, has been very appropriate: if you are affected, you can get a replacement card with no hassle or fix the issue yourself. Any cards shipped, though not necessarily purchased, after Nov. 1st will have the new VBIOS, so be careful if you are shopping for a new EVGA Pascal card.
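To picture what the VBIOS flash above does, here is a minimal sketch of a fan curve with a duty-cycle floor. The curve points and the `fan_duty` helper are invented for illustration; EVGA has not published their actual curve values in this post.

```python
# Hypothetical illustration of the kind of change the new VBIOS makes:
# the fan curve gets a floor so the fan never idles below 30% duty cycle.
# The curve points below are made up, not EVGA's actual values.

OLD_CURVE = [(30, 0), (50, 20), (70, 55), (85, 100)]  # (temp degC, fan %)
MIN_DUTY = 30  # floor enforced by the updated VBIOS

def fan_duty(temp_c, curve, floor=0):
    """Linearly interpolate fan duty from a curve, clamped to a floor."""
    if temp_c <= curve[0][0]:
        duty = curve[0][1]
    elif temp_c >= curve[-1][0]:
        duty = curve[-1][1]
    else:
        for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
            if t0 <= temp_c <= t1:
                duty = d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
                break
    return max(duty, floor)

# At idle (40 degC) the old curve would run the fan at 10%; the fix keeps it at 30%.
print(fan_duty(40, OLD_CURVE))             # old behavior: 10.0
print(fan_duty(40, OLD_CURVE, MIN_DUTY))   # new behavior: 30
```

The trade-off is exactly what EVGA describes: a higher minimum duty cycle means more noise at idle, but the VRM and memory stay cooler in all conditions.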
Subject: Graphics Cards | November 2, 2016 - 04:01 PM | Scott Michaud
Tagged: nvidia, graphics drivers
The release of NVIDIA's GeForce 375.57 graphics drivers wasn't the smoothest. It introduced a few bugs into the package, likely due to all of the games that were coming out at the time. One issue introduced artifacts in animated GIFs, which could produce seconds' worth of black blotches. This was supposed to be fixed in the next WHQL driver, but it slipped. Since the next WHQL driver looks to be a couple of weeks out, NVIDIA released a hotfix.
The driver also fixes “occasional flicker on high refresh rate monitors”. I'm not sure how old this bug is. I've heard some people complain about it with recent drivers, but Allyn and I have noticed weird snowy flickers for several months now. (Allyn actually took slow motion video of one occurrence back in May.) I guess we'll see if this is the same issue.
You can pick up 375.76 Hotfix from NVIDIA's CustHelp.
Subject: Graphics Cards | November 1, 2016 - 11:57 AM | Ryan Shrout
Tagged: video, rx 480, radeon, nvidia, multi-gpu, gtx 1060, geforce, dx12, deus ex: mankind divided, amd
Last week, a new update was pushed out to Deus Ex: Mankind Divided that made DX12 a part of the mainline build and also integrated early multi-GPU support under DX12. I wanted to quickly see what kind of scaling it provided, as we still have very few proof points on the benefit of running more than one graphics card with games utilizing the DX12 API.
As it turns out, the current build and driver combination only shows scaling on the AMD side of things. NVIDIA still doesn't have DX12 multi-GPU support enabled at this point for this title.
- Test System:
- Core i7-5960X
- X99 MB + 16GB DDR4
- AMD Radeon RX 480 8GB (driver 16.10.2)
- NVIDIA GeForce GTX 1060 6GB (driver 375.63)
Not only do we see great scaling in terms of average frame rates, but, using PresentMon for frame time measurement, we also see that the frame pacing is consistent and provides the user with a smooth gaming experience.
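For readers unfamiliar with the tool, PresentMon logs a timestamp for each presented frame, and frame pacing is judged from the deltas between consecutive timestamps: consistent deltas mean smooth pacing. A minimal sketch, with invented timestamps:

```python
# A minimal sketch of how frame pacing can be judged from PresentMon-style
# data. The timestamps below are invented for illustration; they roughly
# correspond to a steady ~60 FPS stream.
present_times_ms = [0.0, 16.6, 33.4, 50.1, 66.7, 83.2, 100.0]

# Frame times are the deltas between consecutive presents.
frame_times = [b - a for a, b in zip(present_times_ms, present_times_ms[1:])]
mean = sum(frame_times) / len(frame_times)
variance = sum((ft - mean) ** 2 for ft in frame_times) / len(frame_times)

print(f"average frame time: {mean:.2f} ms")          # ~16.67 ms -> ~60 FPS
print(f"frame time std dev: {variance ** 0.5:.2f} ms")  # low = smooth pacing
```

A high average frame rate with a large standard deviation would still feel stuttery, which is why we look at pacing and not just the average.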
Subject: Graphics Cards | October 29, 2016 - 11:45 PM | Scott Michaud
Tagged: nvidia, gtx 1070, vbios
So apparently I completely missed this news for over a week. It's probably something that our readers would like to know, though, because it affects the stability of GTX 1070 cards. Video RAM chips are purchased from a variety of vendors, and they should ideally be interchangeable. It turns out that, while NVIDIA seems to ship their cards with Samsung memory, some partners have switched to Micron GDDR5 modules.
According to DigitalTrends, the original VBIOS installed in these graphics cards cannot raise the voltage for the Micron modules quickly enough, so they could improperly store data. This reminds me of when I had a 7900 GT, which apparently had issues with the voltage regulators feeding the VRAM, leading to interesting failures when the card got hot, like random red, green, and blue dots scattered across the screen, even during POST.
Anywho, AIB vendors have been releasing updated VBIOSes through their websites. DigitalTrends listed EVGA, Gainward, and Palit, but progress has been made since then. I found updates at ASUS, released a couple of days ago, that claim to fix Micron memory stability, but it looks like Gigabyte and MSI are still MIA. The best idea is to run GPU-Z and, if Micron produced your GDDR5 memory, check your vendor's website for a new VBIOS.
It's a pain, but this sort of issue goes beyond driver updates.
Subject: Graphics Cards | October 29, 2016 - 08:08 PM | Scott Michaud
Tagged: nvidia, graphics drivers
Yesterday, which was a Friday, NVIDIA released updated graphics drivers for Titanfall 2, Call of Duty: Infinite Warfare, Call of Duty: Modern Warfare Remastered, Skyrim Special Edition, Obduction, and Dishonored 2. While it kind-of missed Skyrim Special Edition by a day-and-a-bit, the GeForce 375.70 drivers seem stable enough in my testing, although a couple of issues that were introduced in 375.57 are still ongoing. I've been using them with a GeForce GTX 1080 (and a secondary GTX 670) for a little over a day, and I haven't yet seen an issue.
As for the known bugs, while neither affects me, they could be a bother to some. First, Folding@Home is allegedly reporting incorrect results, which NVIDIA is currently investigating. Second, and probably more severe, certain animated GIFs have quite severe artifacting. It's almost like, for the first handful of seconds, instead of seeing the frame difference composited over the first frame, you see it composited over a black frame. This can be worked around by disabling hardware acceleration (or using a different browser -- Firefox seems okay) until NVIDIA can release another driver. The good news is that it's already been fixed internally; they just couldn't ship it with 375.70.
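A toy model of how that kind of artifact can happen: many GIF frames are stored as partial updates meant to be composited over the previous frame, so a decoder that composites them over black instead leaves the untouched regions black. The `composite` helper and pixel values here are invented for illustration, not NVIDIA's actual decode path.

```python
# Toy model of the GIF bug described above. Frames after the first are often
# partial updates; pixels left transparent should show the previously
# composited frame. Compositing over black instead blacks out those regions.
BLACK = 0

def composite(base, delta):
    """Apply a delta frame over a base frame (None = keep the base pixel)."""
    return [b if d is None else d for b, d in zip(base, delta)]

first_frame = [7, 7, 7, 7]
delta = [None, 9, None, None]  # only pixel 1 changes in frame 2

correct = composite(first_frame, delta)  # untouched pixels keep their color
buggy = composite([BLACK] * 4, delta)    # untouched pixels go black
print(correct, buggy)
```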
Feel free to download 375.70 at NVIDIA's website (or GeForce Experience)... or wait for a later release if GIFV support in certain applications (like Google Chrome) or donating resources to Folding@Home are important to you. One of the “Game Ready” titles for this driver (Dishonored 2) won't be released until mid-November, though, so it might be a little while.
Subject: General Tech | October 27, 2016 - 12:19 PM | Ryan Shrout
Tagged: z850, x50, video, tegra, switch, surface studio, Samsung, qualcomm, podcast, Optane, nvidia, Nintendo, microsoft, Intel, gtx 1050, Fanatec, evga, acer, 960 PRO, 5G
PC Perspective Podcast #422 - 10/27/16
Join us this week as we discuss the Samsung 960 Pro, Fanatec racing gear, an Acer UltraWide projector, Optane leaks, MS Surface Studio and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Allyn Malventano, Josh Walrath, Jeremy Hellstrom
Program length: 1:47:11
- Join our spam list to get notified when we go live!
- Fragging Frogs VLAN 14
- Week in Review:
- Today’s episode is brought to you by Harry’s! Use code PCPER at checkout!
- News items of interest:
- 1:00:50 GTX 1050 and 1050Ti
- 1:05:30 Intel Optane (XPoint) First Gen Product Specifications Leaked
- 1:11:20 Microsoft Introduces Surface Studio AiO Desktop PC
- 1:21:45 Microsoft Windows 10 Creators Update Formally Announced
- 1:25:25 Qualcomm Announces Snapdragon X50 5G Modem
- 1:31:55 NVIDIA Tegra SoC powers new Nintendo Switch gaming system
- Hardware/Software Picks of the Week
- Ryan: Chewbacca Hoodie
- Jeremy: The Aimpad R5 is actually much cooler than I thought
- Josh: Solid for the price. Get on special!
- Allyn: Factorio
Subject: Systems | October 26, 2016 - 04:31 PM | Sebastian Peak
Tagged: workstation, nvidia, microsoft, Intel, GTX 980M, GTX 965M, desktop, DCI-P3, core i7, core i5, all-in-one, AIO, 4000x3500
Microsoft has announced their first all-in-one PC with the Surface Studio, and it looks like Apple has some serious competition on their hands in the high-end AIO workstation space. Outfitted with the highest resolution display this side of Cupertino, 6th-generation Intel Skylake processors, and discrete NVIDIA graphics, there is plenty of power for most users (though gamers will clearly be looking elsewhere). Make no mistake, this new AIO from Microsoft is not going to replace a standard desktop for most people due to the $2999+ price tag, but for creative professionals and other workstation users it is a compelling option.
"Expanding the Surface family, Surface Studio is a new class of device that transforms from a workstation into a powerful digital canvas, unlocking a more natural and immersive way to create on the thinnest LCD monitor ever built. With a stunning ultra-HD 4.5K screen, Surface Studio delivers 63 percent more pixels than a state-of-the-art 4K TV. Surface Studio works beautifully with pen, touch and Surface Dial — a new input device designed for the creative process that lets you use two hands on the screen to compose and create in all new ways."
The star of the show is the 28-inch PixelSense display, which boasts a massive 4500x3000 resolution for a pixel density of 192 ppi, and the taller 3:2 aspect ratio will be welcomed by some users as well. Microsoft is using 10-bit panels for this premium AIO offering, and color reproduction should be outstanding with the Surface Studio thanks to "individually color calibrated" displays. Another advantage for creative customers is the display's multi-touch capability and 1024 pressure-level Surface Pen, which makes this a very nice option for digital artists - especially at 28 inches/192 ppi.
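Those figures check out with some quick math; the computed density lands at about 193 ppi, a hair above Microsoft's rounded 192, presumably due to rounding in the quoted diagonal:

```python
# Quick check of the display math: pixel density from the 4500x3000 panel
# on a 28-inch diagonal, and the pixel count relative to a 3840x2160 4K TV.
w, h, diag_in = 4500, 3000, 28

ppi = (w ** 2 + h ** 2) ** 0.5 / diag_in
extra = (w * h) / (3840 * 2160) - 1

print(f"{ppi:.0f} ppi")                     # ~193 vs the quoted ~192
print(f"{extra:.0%} more pixels than 4K")   # ~63%, matching Microsoft's claim
```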
Touchscreen desktops need display placement flexibility to be useful, and here Microsoft has a "zero gravity" hinge to allow for easy movement. The design looks stable thanks to a pair of arms connecting the display to the base, and this lower half is what actually houses the PC components. What's inside? Here's a look at the official specs:
- Screen: 28” PixelSense™ Display
- Resolution: 4500 x 3000 (192 PPI)
- Color settings: Adobe sRGB and DCI-P3, individually color calibrated
- Touch: 10 point multi-touch
- Aspect Ratio: 3:2
- Pen enabled, with Zero Gravity Hinge
- Processor: 6th Generation Intel® Core™ i5 or i7
- Memory: 8GB, 16GB, or 32GB RAM
- i5 Intel 8GB: NVIDIA® GeForce® GTX 965M 2GB GDDR5 memory
- i7 Intel 16GB: NVIDIA® GeForce® GTX 965M 2GB GDDR5 memory
- i7 Intel 32GB: NVIDIA® GeForce® GTX 980M 4GB GDDR5 memory
- Rapid Hybrid Drive options: 1TB or 2TB
- Connections & expansions:
- 4 x USB 3.0 (one high power port)
- Full-size SD™ card reader (SDXC compatible)
- Mini DisplayPort
- Headset jack
- Compatible with Surface Dial on-screen interaction*
- 1 Gigabit Ethernet port
- Cameras, video and audio:
- Windows Hello face sign-in camera
- 5.0 MP camera with 1080p HD video (front)
- Autofocus camera with 1080p HD video (rear)
- Dual microphones
- Stereo 2.1 speakers with Dolby® Audio™ Premium
- 3.5 mm headphone jack
- Wi-Fi: 802.11ac Wi-Fi wireless networking, IEEE 802.11 a/b/g/n compatible
- Bluetooth: Bluetooth 4.0 wireless technology
- Xbox Wireless built-in
- TPM chip for enterprise security
- Enterprise-grade protection with Windows Hello face sign-in
- Warranty: 1-year limited hardware warranty
- Display: 637.35 mm x 438.90 mm x 12.5 mm (25.1” x 17.3” x 0.5”)
- Base: 250.00 mm x 220.00 mm x 32.2 mm (9.8” x 8.7” x 1.3”)
- Product weight: 9.56 kg max (21 lbs max)
The Surface Studio is currently available for pre-order at Microsoft.com with prices ranging from $2999 to $4199, depending on configuration.