Subject: Graphics Cards | November 1, 2016 - 03:57 PM | Ryan Shrout
Tagged: video, rx 480, radeon, nvidia, multi-gpu, gtx 1060, geforce, dx12, deus ex: mankind divided, amd
Last week a new update was pushed out to Deus Ex: Mankind Divided that made DX12 part of the mainline build and also integrated early multi-GPU support under DX12. I wanted to quickly see what kind of scaling it provided, as we still have very few proof points on the benefit of running more than one graphics card with games utilizing the DX12 API.
As it turns out, the current build and driver combination only shows scaling on the AMD side of things. NVIDIA still doesn't have DX12 multi-GPU support enabled at this point for this title.
- Test System
- Core i7-5960X
- X99 MB + 16GB DDR4
- AMD Radeon RX 480 8GB
- Driver: 16.10.2
- NVIDIA GeForce GTX 1060 6GB
- Driver: 375.63
Not only do we see great scaling in terms of average frame rates, but using PresentMon for frame time measurement we also see that the frame pacing is consistent and provides the user with a smooth gaming experience.
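For anyone who wants to run a similar check at home, here is a minimal sketch of how a capture like this can be summarized. It assumes PresentMon's CSV log (specifically its MsBetweenPresents column) and a hypothetical file name:

```python
# Minimal sketch: summarize a PresentMon capture (hypothetical "capture.csv").
# Assumes the MsBetweenPresents column that PresentMon writes to its CSV log.
import csv
import statistics

with open("capture.csv", newline="") as f:
    frame_times = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

avg_ms = statistics.mean(frame_times)
p99_ms = sorted(frame_times)[int(0.99 * len(frame_times)) - 1]  # approx. 99th percentile

print(f"Average FPS: {1000 / avg_ms:.1f}")
print(f"99th percentile frame time: {p99_ms:.2f} ms")
# Consistent pacing shows up as a 99th percentile close to the average frame
# time; a large gap between the two indicates stutter even at a healthy FPS.
```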
Subject: Graphics Cards | October 30, 2016 - 03:45 AM | Scott Michaud
Tagged: nvidia, gtx 1070, vbios
So apparently I completely missed this news for over a week. It's probably something that our readers would like to know, though, because it affects the stability of GTX 1070 cards. Video RAM chips are purchased from a variety of vendors, and they should ideally be interchangeable. It turns out that, while NVIDIA seems to ship their cards with Samsung memory, some partners have switched to Micron GDDR5 modules.
According to DigitalTrends, the original VBIOS installed in these graphics cards cannot deliver voltage to the Micron modules quickly enough, so they would improperly store data. This reminds me of when I had a 7900 GT, which apparently had issues with the voltage regulators feeding the VRAM, leading to interesting failures when the card got hot, like random red, green, and blue dots scattered across the screen, even during POST.
Anywho, AIB vendors have been releasing updated VBIOSes through their websites. DigitalTrends listed EVGA, Gainward, and Palit, but progress has been made since then. I've found updates at ASUS that were released a couple of days ago, which claim to fix Micron memory stability, but it looks like Gigabyte and MSI are still MIA. The best idea is to run GPU-Z and, if Micron produces your GDDR5 memory, check your vendor's website for a new VBIOS.
It's a pain, but this sort of issue goes beyond driver updates.
Subject: Graphics Cards | October 30, 2016 - 12:08 AM | Scott Michaud
Tagged: nvidia, graphics drivers
Yesterday, which was a Friday, NVIDIA released updated graphics drivers for Titanfall 2, Call of Duty: Infinite Warfare, Call of Duty: Modern Warfare Remastered, Skyrim Special Edition, Obduction, and Dishonored 2. While it kind of missed Skyrim Special Edition by a day and a bit, the GeForce 375.70 drivers seem stable enough in my testing, although a couple of issues that were introduced in 375.57 are still ongoing. I've been using them with a GeForce GTX 1080 (and a secondary GTX 670) for a little over a day, and I haven't yet seen an issue.
As for the known bugs, while neither affects me, they could be a bother to some. First, Folding@Home is allegedly reporting incorrect results, which NVIDIA is currently investigating. Second, and probably more severe, certain animated GIFs show quite severe artifacting. It's almost as if, for the first handful of seconds, each frame's difference is composited over a black frame rather than over the first frame. This can be worked around by disabling hardware acceleration (or using a different browser -- Firefox seems okay) until NVIDIA can release another driver. The good news is that it's already been fixed internally; they just couldn't ship it with 375.70.
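To illustrate what that kind of compositing bug looks like, here is a minimal, self-contained sketch (pure Python, with hypothetical one-dimensional "frames"; None marks an unchanged pixel in a delta frame):

```python
# Delta frames should be applied over previously decoded content,
# not over a black frame -- the artifact described above.

def composite(base, delta):
    """Apply a delta frame; None pixels mean 'unchanged'."""
    return [d if d is not None else b for b, d in zip(base, delta)]

first = [10, 20, 30, 40]                # first full frame
deltas = [[None, 25, None, None],       # only pixel 1 changes
          [None, None, 35, None]]       # only pixel 2 changes

# Correct decode: accumulate each delta over what came before.
frame = first
for d in deltas:
    frame = composite(frame, d)
print(frame)                            # [10, 25, 35, 40]

# Buggy decode (the reported artifact): composite over black instead.
black = [0, 0, 0, 0]
print(composite(black, deltas[0]))      # [0, 25, 0, 0] -- mostly black output
```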
Feel free to download 375.70 at NVIDIA's website (or GeForce Experience)... or wait for a later release if GIFV support in certain applications (like Google Chrome) or donating resources to Folding@Home are important to you. One of the “Game Ready” titles for this driver (Dishonored 2) won't be released until mid-November, though, so it might be a little while.
Subject: General Tech | October 27, 2016 - 04:19 PM | Ryan Shrout
Tagged: z850, x50, video, tegra, switch, surface studio, Samsung, qualcomm, podcast, Optane, nvidia, Nintendo, microsoft, Intel, gtx 1050, Fanatec, evga, acer, 960 PRO, 5G
PC Perspective Podcast #422 - 10/27/16
Join us this week as we discuss the Samsung 960 Pro, Fanatec racing gear, an Acer UltraWide projector, Optane leaks, MS Surface Studio and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Allyn Malventano, Josh Walrath, Jeremy Hellstrom
Program length: 1:47:11
- Join our spam list to get notified when we go live!
- Fragging Frogs VLAN 14
- Week in Review:
- Today’s episode is brought to you by Harry’s! Use code PCPER at checkout!
- News items of interest:
- 1:00:50 GTX 1050 and 1050Ti
- 1:05:30 Intel Optane (XPoint) First Gen Product Specifications Leaked
- 1:11:20 Microsoft Introduces Surface Studio AiO Desktop PC
- 1:21:45 Microsoft Windows 10 Creators Update Formally Announced
- 1:25:25 Qualcomm Announces Snapdragon X50 5G Modem
- 1:31:55 NVIDIA Tegra SoC powers new Nintendo Switch gaming system
- Hardware/Software Picks of the Week
- Ryan: Chewbacca Hoodie
- Jeremy: The Aimpad R5 is actually much cooler than I thought
- Josh: Solid for the price. Get on special!
- Allyn: Factorio
Subject: Systems | October 26, 2016 - 08:31 PM | Sebastian Peak
Tagged: workstation, nvidia, microsoft, Intel, GTX 980M, GTX 965M, desktop, DCI-P3, core i7, core i5, all-in-one, AIO, 4000x3500
Microsoft has announced their first all-in-one PC with the Surface Studio, and it looks like Apple has some serious competition on their hands in the high-end AIO workstation space. Outfitted with the highest resolution display this side of Cupertino, 6th-generation Intel Skylake processors, and discrete NVIDIA graphics, there is plenty of power for most users (though gamers will clearly be looking elsewhere). Make no mistake, this new AIO from Microsoft is not going to replace a standard desktop for most people due to the $2999+ price tag, but for creative professionals and other workstation users it is a compelling option.
"Expanding the Surface family, Surface Studio is a new class of device that transforms from a workstation into a powerful digital canvas, unlocking a more natural and immersive way to create on the thinnest LCD monitor ever built.1 With a stunning ultra-HD 4.5K screen, Surface Studio delivers 63 percent more pixels than a state-of-the-art 4K TV. Surface Studio works beautifully with pen, touch and Surface Dial — a new input device designed for the creative process that lets you use two hands on the screen to compose and create in all new ways."
The star of the show is the 28-inch PixelSense display, which boasts a massive 4500x3000 resolution for a pixel density of 192 ppi, and the taller 3:2 aspect ratio will be welcomed by some users as well. Microsoft is using 10-bit panels for this premium AIO offering, and color reproduction should be outstanding with the Surface Studio thanks to "individually color calibrated" displays. Another advantage for creative customers is the display's multi-touch capability and 1024 pressure-level Surface Pen, which makes this a very nice option for digital artists - especially at 28 inches/192 ppi.
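As a quick sanity check on that density figure, the arithmetic works out (a minimal Python sketch using only the numbers quoted above):

```python
# Verify the Surface Studio's quoted pixel density from its resolution and size.
import math

width_px, height_px, diagonal_in = 4500, 3000, 28
diagonal_px = math.hypot(width_px, height_px)  # ~5408 pixels corner to corner
print(f"{diagonal_px / diagonal_in:.0f} ppi")  # ~193 ppi, in line with the quoted 192 ppi
```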
Touchscreen desktops need display placement flexibility to be useful, and here Microsoft has a "zero gravity" hinge to allow for easy movement. The design looks stable thanks to a pair of arms connecting the display to the base, and this lower half is what actually houses the PC components. What's inside? Here's a look at the official specs:
- Screen: 28” PixelSense™ Display
- Resolution: 4500 x 3000 (192 PPI)
- Color settings: sRGB and DCI-P3, individually color calibrated
- Touch: 10 point multi-touch
- Aspect Ratio: 3:2
- Pen enabled
- Zero Gravity Hinge
- Processor: 6th Generation Intel® Core™ i5 or i7
- Memory: 8GB, 16GB, or 32GB RAM
- Core i5, 8GB RAM: NVIDIA® GeForce® GTX 965M with 2GB GDDR5 memory
- Core i7, 16GB RAM: NVIDIA® GeForce® GTX 965M with 2GB GDDR5 memory
- Core i7, 32GB RAM: NVIDIA® GeForce® GTX 980M with 4GB GDDR5 memory
- Rapid Hybrid Drive options: 1TB or 2TB
- Connections & expansions:
- 4 x USB 3.0 (one high power port)
- Full-size SD card reader (SDXC compatible)
- Mini DisplayPort
- Headset jack
- Compatible with Surface Dial on-screen interaction
- 1 Gigabit Ethernet port
- Cameras, video and audio:
- Windows Hello face sign-in camera
- 5.0 MP camera with 1080p HD video (front)
- Autofocus camera with 1080p HD video (rear)
- Dual microphones
- Stereo 2.1 speakers with Dolby® Audio™ Premium
- 3.5 mm headphone jack
- Wi-Fi: 802.11ac Wi-Fi wireless networking, IEEE 802.11 a/b/g/n compatible
- Bluetooth: Bluetooth 4.0 wireless technology
- Xbox Wireless built-in
- TPM chip for enterprise security
- Enterprise-grade protection with Windows Hello face sign-in
- Warranty: 1-year limited hardware warranty
- Display: 637.35 mm x 438.90 mm x 12.5 mm (25.1” x 17.3” x 0.5”)
- Base: 250.00 mm x 220.00 mm x 32.2 mm (9.8” x 8.7” x 1.3”)
- Product weight: 9.56 kg max (21 lbs max)
The Surface Studio is currently available for pre-order at Microsoft.com with prices ranging from $2999 to $4199, depending on configuration.
Subject: Graphics Cards | October 25, 2016 - 05:21 PM | Jeremy Hellstrom
Tagged: pascal, nvidia, msi, GTX 1050 Ti, gtx 1050, GP107
The Guru of 3D tested MSI's GeForce GTX 1050 and 1050 Ti, with MSRPs of $109 and $139 respectively. The non-Ti version has the lowest texture mapping unit count of this generation but a higher GPU frequency than the Ti model; it also has the smallest amount of memory at 2GB, though at least the memory is clocked the same in both models. DirectX 12 testing offers variable results: in many games the two are bookends to the RX 460, with the GTX 1050 a bit slower and the 1050 Ti a bit faster, but this does not hold true in all games. DirectX 11 results were more favourable for this architecture; both cards climbed in the rankings, with the 1050 Ti offering acceptable performance. Check out their full review here.
"Last week Nvidia announced the GeForce GTX 1050 series, with two primary models. In this article we'll review the MSI GeForce GTX 1050 and 1050 Ti Gaming X, two graphics cards aimed at the budget minded consumer. We say budget minded as these cards are very affordable and positioned in an attractive 109 and 139 dollar (US) segment."
Here are some more Graphics Card articles from around the web:
- Sub-$150 Pascal: NVIDIA GeForce GTX 1050 & GTX 1050 Ti Review @ Techgage
- The NVIDIA GTX 1050 Ti & GTX 1050 Review @ Hardware Canucks
- MSI GTX 1050 Gaming X 2G Review @ OCC
- MSI GTX 1050 Ti Gaming X 4 GB @ techPowerUp
- MSI GTX 1050 Gaming X 2 GB @ techPowerUp
- Nvidia's GeForce GTX 1060 @ The Tech Report
- Gigabyte GTX 1070 Xtreme Gaming 8 GB @ techPowerUp
- MSI RX 470 Gaming X 8G Review @ OCC
- The XFX Radeon RX 470 RS Black Edition @ Tech ARP
Subject: Graphics Cards | October 24, 2016 - 01:21 PM | Jeremy Hellstrom
Tagged: 375.63, nvidia, geforce 375.57
After the many issues reported by NVIDIA users, GeForce 375.63 has been released, which should ameliorate the problems encountered with animated GIFs and various games. It is WHQL certified, just as the last one was, but hopefully this version will show improvements. Let us know in the comments if you continue to see driver issues.
NVIDIA was beaten to the punch by AMD this particular cycle; today marks the release of the GeForce 375.57 driver, with new profiles for BF1, Civ VI, and Titanfall 2, as well as VR support updates for Eagle Flight and Serious Sam VR: The Last Hope. If you haven't signed up for GeForce Experience, you can still grab the drivers here.
Game Ready Drivers provide the best possible gaming experience for all major new releases, including Virtual Reality games. Prior to a new title launching, our driver team is working up until the last minute to ensure every performance tweak and bug fix is included for the best gameplay on day-1.
Provides the optimal experience for Battlefield 1, Civilization VI, and Titanfall 2
Game Ready VR
Provides the optimal VR experience for Eagle Flight and Serious Sam VR: The Last Hope
Subject: Graphics Cards | October 22, 2016 - 10:49 PM | Scott Michaud
Tagged: nvidia, graphics drivers
Before this driver was released, NVIDIA employees were claiming that it was difficult to get their drivers through Microsoft's WHQL certification. It is a busy time of year, with the holiday gaming and hardware rush in full swing, so there was likely a backlog until Microsoft could return the signed graphics driver. It also seems like the GeForce 375.57 drivers could have used a little more time in NVIDIA's QA department.
At the GeForce Forums, users are complaining about a variety of issues. Ironically, there seems to be a bunch of them claiming that Battlefield 1 is crashing and otherwise being buggy. I haven't installed the game yet, so I cannot contribute my own experiences to it, one way or the other. I have seen some issues myself, though. For instance, I can confirm that tiles in the Windows 10 Start Menu lock up the entire panel if you attempt to move them. NVIDIA acknowledges a handful of issues with Windows 10 on their forums, and they plan a hotfix driver soon (which I'm guessing cannot be applied on PCs running Anniversary Edition clean installs that have secure boot enabled, because of Microsoft's kernel mode driver changes -- thankfully, I'm guessing that applies to very few people).
One issue that seems localized to me, though, is StarCraft II. Since I installed the driver (and granted I installed several things that night, like the CUDA SDK) it fails to launch about three-quarters of the time. Could be unrelated, but it should give you an idea about how broad the issues seem to be. Other users are complaining about GIFV corruption, for instance.
Best to roll back and wait for the next WHQL driver (unless hotfix users give glowing praise).
Subject: Processors, Mobile | October 20, 2016 - 03:40 PM | Ryan Shrout
Tagged: Nintendo, switch, nvidia, tegra
It's been a hell of a 24 hours for NVIDIA and the Tegra processor. A platform that many considered dead in the water, after it failed to find its way into smartphones or an appreciable number of consumer tablets, has had two major design wins revealed. First, it was revealed that NVIDIA is powering the new fully autonomous driving system in the Autopilot 2.0 hardware implementation in Tesla's current Model S, X, and upcoming Model 3 cars.
Now, we know that Nintendo's long rumored portable and dockable gaming system called Switch is also powered by a custom NVIDIA Tegra SoC.
We don't know much about the hardware that gives the Switch life, but NVIDIA did post a short blog with some basic information worth looking at. Based on it, we know that the Tegra processor powering this Nintendo system is completely custom and likely uses Pascal-architecture GPU CUDA cores, though we don't know how many, or how powerful it will be. It will likely exceed the performance of the Nintendo Wii U, which managed only 0.35 TFLOPS from 320 AMD-based stream processors. How much faster, we just don't know yet.
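As a back-of-the-envelope check on that Wii U figure, assuming the commonly cited 550 MHz GPU clock (not confirmed in the post above) and the usual 2 FLOPs per stream processor per clock:

```python
# Rough theoretical FP32 throughput for the Wii U GPU: shaders x 2 FLOPs x clock.
shaders = 320
clock_hz = 550e6  # assumed clock speed; not stated in the post above
print(f"{shaders * 2 * clock_hz / 1e12:.2f} TFLOPS")  # ~0.35 TFLOPS, matching the quoted figure
```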
On the CPU side we assume that this is built using an ARM-based processor, most likely off-the-shelf core designs to keep things simple. Basing it on custom designs like Denver might not be necessary for this type of platform.
Nintendo has traditionally used custom operating systems for its consoles, and that seems to be what is happening with the Switch as well. NVIDIA mentions a couple of times how much work the technology vendor put into custom APIs, custom physics engines, new libraries, etc.
The Nintendo Switch’s gaming experience is also supported by fully custom software, including a revamped physics engine, new libraries, advanced game tools and libraries. NVIDIA additionally created new gaming APIs to fully harness this performance. The newest API, NVN, was built specifically to bring lightweight, fast gaming to the masses.
We’ve optimized the full suite of hardware and software for gaming and mobile use cases. This includes custom operating system integration with the GPU to increase both performance and efficiency.
The system itself looks pretty damn interesting, with the ability to switch (get it?) between a docked-to-your-TV configuration and a mobile one with attached or wireless controllers. Check out the video below for a preview.
I've asked both NVIDIA and Nintendo for more information on the hardware side, but these guys tend to be tight-lipped about the custom silicon going into console hardware. Hopefully one or the other is excited enough to tell us about the technology so we can have some interesting specifications to discuss and debate!
UPDATE: A story on The Verge claims that Nintendo "took the chip from the Shield" and put it in the Switch. This is more than likely completely false; the Shield is a significantly dated product and that kind of statement could undersell the power and capability of the Switch and NVIDIA's custom SoC quite dramatically.
Subject: Graphics Cards | October 20, 2016 - 12:08 AM | Scott Michaud
Tagged: amd, nvidia, gtx 1060, rx 480, dx12, dx11, battlefield 1
Battlefield 1 is just a few days from launching. In fact, owners of the Deluxe Edition had the game unlock yesterday. It's interesting that multiple publishers are using early unlock dates as a special-edition bonus these days, including Microsoft's recent Windows Store releases. I'm not going to say interesting in a good or bad way, though, because I'll leave that up to the reader to decide.
Anywho, DigitalFoundry is doing their benchmarking thing, and they wanted to see what GPU could provide a solid 60FPS when everything is maxed out (at 1080p). They start off with a DX12-to-DX12 comparison between the GTX 1060 and the RX 480. This is a relatively fair comparison, because the 3GB GTX 1060 and the 4GB RX 480 both come in at about $200, while upgrading to 6GB for the 1060 or 8GB for the 480 bumps each respective SKU up to the ~$250 price point. In this test, NVIDIA has a few dips slightly below 60 FPS in complex scenes, while AMD stays above that beloved threshold.
They also compare the two cards in DX11 and DX12 mode, with both using a Skylake-based Core i5 CPU. In this test, AMD's card saw a nice increase in frame rate when switching to DirectX 12, while NVIDIA had a performance regression in the new API. This raises two questions, one of which is potentially pro-NVIDIA, and the other, pro-AMD. First, if NVIDIA's card were allowed to use DirectX 11, would the original test show the GTX 1060 as more competitive against the DX12-running RX 480? This brings me to the second question: what would the user see? A major draw of Mantle-based graphics APIs is that the application has more control over traditionally driver-level tasks. Would 60 FPS in DX12 be smoother than 60 FPS in DX11?
I don't know. It's something we'll need to test.
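When we do, frame-time percentiles are one way to put a number on "smooth". A minimal sketch with synthetic data (both hypothetical traces average 60 FPS, but only one paces its frames evenly):

```python
# Two frame-time traces with the same average FPS but very different pacing.
import statistics

smooth = [16.7] * 100                    # steady ~60 FPS
stutter = [13.0, 13.0, 13.0, 27.8] * 25  # same average, uneven delivery

for name, trace in (("smooth", smooth), ("stutter", stutter)):
    avg_ms = statistics.mean(trace)
    p99_ms = sorted(trace)[int(0.99 * len(trace)) - 1]  # approx. 99th percentile
    print(f"{name}: {1000 / avg_ms:.0f} FPS average, 99th percentile {p99_ms:.1f} ms")
```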