Subject: Graphics Cards | March 18, 2016 - 01:59 PM | Jeremy Hellstrom
Tagged: msi, GTX 980 Ti, MSI GTX 980 Ti GOLDEN Edition, nvidia, factory overclocked
Apart from the golden fan and HDMI port, MSI's 980 Ti GOLDEN Edition also comes with a moderate factory overclock: 1140MHz base, 1228MHz boost, and 7GHz memory, with an observed in-game frequency of 1329MHz. [H]ard|OCP managed to push those to a 1290MHz base, 1378MHz boost, and 7.8GHz memory, with the card hitting 1504MHz in game. That overclock produced noticeable gains in many games and pushed the card close to the performance of [H]'s overclocked MSI 980 Ti LIGHTNING. The LIGHTNING proved to be the better card in terms of both graphical and thermal performance; however, it is also more expensive than the GOLDEN and does not have quite the same aesthetics, if that is important to you.
"Today we evaluate the MSI GTX 980 Ti GOLDEN Edition video card. This video card features a pure copper heatsink geared towards faster heat dissipation and better temps on air than other air cooled video cards. We will compare it to the MSI GTX 980 Ti LIGHTNING, placing the two video cards head to head in an overclocking shootout. "
Here are some more Graphics Card articles from around the web:
- Gigabyte GeForce GTX 980Ti Xtreme @ eTeknix
- ASUS GeForce GTX 980 Ti Matrix 6 GB @ techPowerUp
- 4 Weeks with NVIDIA TITAN X SLI at 4K Resolution @ [H]ard|OCP
- NVIDIA GeForce GT 710: Trying NVIDIA's Newest Sub-$50 GPU On Linux @ Phoronix
Shedding a little light on Monday's announcement
Most of our readers should have some familiarity with GameWorks, which is a series of libraries and utilities that help game developers (and others) create software. While many hardware and platform vendors provide samples and frameworks, taking the brunt of the work required to solve complex problems, this is NVIDIA's branding for their suite of technologies. Their hope is that it pushes the industry forward, which in turn drives GPU sales as users see the benefits of upgrading.
This release, GameWorks SDK 3.1, contains three complete features and two “beta” ones. We will start with the first three, each of which targets a portion of the lighting and shadowing problem. The last two, which we will discuss at the end, are the experimental ones and fall under the blanket of physics and visual effects.
The first technology is Volumetric Lighting, which simulates the way light scatters off dust in the atmosphere. Game developers have been approximating this effect for a long time. In fact, I remember a particular section of Resident Evil 4 where you walk down a dim hallway that has light rays spilling in from the windows. Gamecube-era graphics could only do so much, though, and certain camera positions show that the effect was just a translucent, one-sided, decorative plane. It was a cheat that was hand-placed by a clever artist.
GameWorks' Volumetric Lighting goes after the same effect, but with a much different implementation. It looks at the generated shadow maps and, using hardware tessellation, extrudes geometry from the unshadowed portions toward the light. These bits of geometry accumulate, with deeper volumes translating into a stronger highlight. Also, since it is hardware tessellated, it probably has a smaller impact on performance: the GPU only needs to store enough information to generate the geometry, rather than storing (and updating) geometry data for every possible light shaft, and it needs to keep those shadow maps around anyway.
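The additive idea described above can be sketched in a few lines. This is a toy model of the principle only, with an assumed exponential scattering formula and made-up coefficients; it is not NVIDIA's actual implementation.

```python
import math

# Hypothetical sketch of the principle behind the volumetric highlight:
# unshadowed regions are extruded toward the light, and the per-pixel
# depth of that extruded volume determines the light-shaft brightness.
# The model and numbers are illustrative assumptions, not NVIDIA's code.

def shaft_intensity(volume_depth, scattering_coeff=0.5):
    """Deeper volumes of lit 'dusty air' scatter more light toward the
    eye; a simple exponential model saturates as depth grows."""
    return 1.0 - math.exp(-scattering_coeff * volume_depth)

# A pixel looking through no lit volume gets no highlight, and intensity
# grows (sub-linearly) with the depth of lit air it traverses.
for depth in (0.0, 1.0, 4.0):
    print(f"depth {depth}: intensity {shaft_intensity(depth):.2f}")
```

The saturating curve is why stacking many thin extruded slices works: each slice adds a little brightness, and very deep volumes approach a fully "fogged" highlight rather than blowing out.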
Even though this effect seemed independent of render method, since it basically just adds geometry to the scene, I asked whether it was locked to deferred rendering. NVIDIA said that it should be unrelated, as I suspected, which is good for VR: forward rendering is easier to anti-alias, which makes the uneven pixel distribution (after lens distortion) appear smoother.
Subject: Graphics Cards | March 11, 2016 - 05:03 PM | Sebastian Peak
Tagged: rumor, report, pascal, nvidia, HBM2, gtx1080, GTX 1080, gtx, GP104, geforce, gddr5x
We are expecting news of the next NVIDIA graphics card this spring, and as usual whenever an announcement is imminent we have started seeing some rumors about the next GeForce card.
(Image credit: NVIDIA)
Pascal is the name we've all been hearing about, and along with this next-gen core we've been expecting HBM2 (second-gen High Bandwidth Memory). This makes today's rumor all the more interesting, as VideoCardz is reporting (via BenchLife) that a card called either the GTX 1080 or GTX 1800 will be announced, using the GP104 GPU core with 8GB of GDDR5X - and not HBM2.
The report also claims that NVIDIA CEO Jen-Hsun Huang will have an announcement for Pascal in April, which leads us to believe a shipping product based on Pascal is finally in the works. Taking in all of the information from the BenchLife report, VideoCardz has created this list to summarize the rumors (taken directly from the source link):
- Pascal launch in April
- GTX 1080/1800 launch on May 27th
- GTX 1080/1800 has GP104 Pascal GPU
- GTX 1080/1800 has 8GB GDDR5X memory
- GTX 1080/1800 has one 8pin power connector
- GTX 1080/1800 has 1x DVI, 1x HDMI, 2x DisplayPort
- First Pascal board with HBM would be GP100 (Big Pascal)
Rumored GTX 1080 Specs (Credit: VideoCardz)
The alleged single 8-pin power connector with this GTX 1080 would place the power limit at 225W, though it could very well require less power. The GTX 980 is only a 165W part, with the GTX 980 Ti rated at 250W.
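The 225W figure follows from simple power-budget arithmetic, sketched below with the standard PCI Express limits (75 W from the slot, 150 W from an 8-pin connector); actual draw can be much lower, as the GTX 980's 165W rating shows.

```python
# Power-budget arithmetic behind the rumored GTX 1080's 225 W ceiling:
# the PCIe slot delivers up to 75 W and one 8-pin connector up to 150 W.
PCIE_SLOT_W = 75
EIGHT_PIN_W = 150

board_power_limit = PCIE_SLOT_W + EIGHT_PIN_W
print(board_power_limit)  # 225
```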
As always, only time will tell how accurate these rumors are. VideoCardz points out that "BenchLife stories are usually correct," though they are skeptical of this report based on the GTX 1080 name (even though it would follow the current GeForce naming scheme).
Podcast #390 - ASUS Z170 Sabertooth Mk1, Corsair Carbide 400C, more about Windows Store Games, and more!
Subject: General Tech | March 10, 2016 - 02:10 PM | Ken Addison
Tagged: podcast, video, asus, z170 sabertooth, corsair, carbide 400c, Windows Store, uwp, dx12, amd, nvidia, directflip, 16.3, 364.47, 364.51, SFX, Seagate, OCP, NVMe
PC Perspective Podcast #390 - 03/10/2016
Join us this week as we discuss the ASUS Z170 Sabertooth Mk1, Corsair Carbide 400C, more about Windows Store Games, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store (audio only)
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:12:32
Week in Review:
News items of interest:
Hardware/Software Picks of the Week
Ryan: Windows 10 Domain Networking and Shares
Subject: Graphics Cards, Systems | March 10, 2016 - 11:38 AM | Sebastian Peak
Tagged: zotac, zbox, VR, SFF, nvidia, mini-pc, MAGNUS EN980, liquid cooling, GTX980, GTX 980, graphics, gpu, geforce
ZOTAC is teasing a new mini PC "ready for virtual reality" leading up to Cebit 2016, happening later this month. The ZBOX MAGNUS EN980 supplants the EN970 as the most powerful version of ZOTAC's gaming mini systems, and will come equipped with no less than an NVIDIA GeForce GTX 980.
(Image via Guru3D)
Some questions remain ahead of a more formal announcement, and foremost among them is the version of the system's GTX 980. Is this the full desktop variant, or the GTX 980m? It seems to be the former, if we can read into the "factory-installed water-cooling solution", especially if that pertains to the GPU. In any case this will easily be the most powerful mini-PC ZOTAC has released, as even the current MAGNUS EN970 doesn't actually ship with a GTX 970 as the name would imply; rather, a GTX 960 handles discrete graphics duties according to the specs.
The MAGNUS EN980's GTX 980 GPU - mobile or not - will make this a formidable gaming system, paired as it is with a 6th-gen Intel Skylake CPU (the specific model was not mentioned in the press release; the current high-end EN970 with discrete graphics uses the Intel Core i5-5200U). Other details include support for up to four displays via HDMI and DisplayPort, USB 3.0 and 3.1 Type-C inputs, and built-in 802.11ac wireless.
We'll have to wait until Cebit (which runs from March 14 - 18) for more details. Full press release after the break.
Subject: Graphics Cards | March 9, 2016 - 11:55 AM | Scott Michaud
Tagged: nvidia, graphics drivers
The last couple of days were not too great for software patches. Microsoft released a Windows 10 update that breaks 5K monitors, and NVIDIA's driver bug, mentioned in the last post, was bigger than they realized. It turns out that the issue is not isolated to multiple monitors, but rather had something to do with choosing “Express Install” in the setup screen.
In response, NVIDIA has removed 364.47 from their website. For those who want “Game Ready” drivers for games like “The Division,” NVIDIA has provided a 364.51 beta driver that supposedly corrects this issue. People on the forums still claim to have problems with this driver, but nothing has been confirmed yet. It's difficult to tell whether other issues exist with the drivers, whether users are having unrelated issues that are attributed to the drivers, or if it's just a few hoaxes. (Update on March 9th @ 12:41pm: Still nothing confirmed, but one of our commenters claims to have experienced issues personally.) If you are concerned, then you can roll back to 362.00.
Fortunately for me, I chose to clean install 364.47 and have not had any issues with them. I asked a representative from NVIDIA on Twitter whether I should upgrade to 364.51, and he said that a few other bugs were fixed but I shouldn't bother.
If you managed to properly install 364.47, then you should be fine staying there.
Subject: Graphics Cards | March 7, 2016 - 06:15 PM | Scott Michaud
Tagged: vulkan, nvidia, graphics drivers, game ready
This new driver for NVIDIA brings Vulkan support to their current, supported branch. This is particularly interesting for me, because the Vulkan branch used to pre-date fixes for Adobe Creative Cloud, which meant that things like “Export As...” in Photoshop CC didn't work and After Effects CC would crash. They are also WHQL-certified, and they roll in all of the “Game Ready” fixes and optimizations that were released since ~October, which would be mostly new for Vulkan's branch.
... This is going to be annoying to temporarily disable...
Speaking of which, the GeForce 364.47 driver is itself classified as “Game Ready.” The four titles optimized with this release are: Tom Clancy's The Division, Need For Speed, Hitman, and Ashes of the Singularity. If you are interested in playing those games, then this driver is what NVIDIA recommends that you use.
Note that an installation bug has been reported, however. When installing with multiple monitors, NVIDIA suggests that you disable all but one during the setup process, but you can safely re-enable them after. For me, with four monitors and a fairly meticulous desktop icon layout, this was highly annoying, but something I've had to deal with over time (especially the last two, beta Vulkan drivers). It's probably a good idea to close all applications and screenshot your icons before running the installer.
Subject: Graphics Cards | March 4, 2016 - 04:48 PM | Sebastian Peak
Tagged: PCIe power, PCI Express, nvidia, GTX 950 2G, gtx 950, graphics card, gpu, geforce, asus, 75W
ASUS has released a new version of the GTX 950 called the GTX 950 2G, and the interesting part isn't what's been added, but what was taken away; namely, the PCIe power requirement.
When NVIDIA announced the GTX 950 (which Ryan reviewed here) it carried a TDP of 90W, which prevented it from running without a PCIe power connector. The GTX 950 was (seemingly) the replacement for the GTX 750, which didn't require anything beyond motherboard power via the PCIe slot, and the same held true for the more powerful GTX 750 Ti. Without the need for PCIe power, that GTX 750 Ti became our (and many others') default recommendation to turn any PC into a gaming machine (an idea we just happened to cover in depth here).
Here's a look at the specs from ASUS for the GTX 950 2G:
- Graphics Engine: NVIDIA GeForce GTX 950
- Interface: PCI Express 3.0
- Video Memory: GDDR5 2GB
- CUDA Cores: 768
- Memory Clock: 6610 MHz
- Memory Interface: 128-bit
- Engine Clock
- Gaming Mode (Default) - GPU Boost Clock: 1190 MHz, GPU Base Clock: 1026 MHz
- OC Mode - GPU Boost Clock: 1228 MHz, GPU Base Clock: 1051 MHz
- Interface: HDMI 2.0, DisplayPort, DVI
- Power Consumption: Up to 75W, no additional PCIe power required
- Dimensions: 8.3 x 4.5 x 1.6 inches
Whether this model has any relation to the rumored "GTX 950 SE/LP" remains to be seen (and other than power, this card appears to have stock GTX 950 specs), but the option of adding a GPU without concern over power requirements makes this a very attractive upgrade proposition for older builds or OEM PCs, depending on cost.
The full model number is ASUS GTX950-2G, and a listing is up on Amazon, though seemingly only a placeholder at the moment. (Link removed. The listing was apparently for an existing GTX 950 product.)
Subject: Graphics Cards | March 2, 2016 - 05:30 PM | Sebastian Peak
Tagged: nvidia, geforce, game ready, 362.00 WHQL
The new Far Cry game is out (Far Cry Primal), and for NVIDIA graphics card owners this means a new GeForce Game Ready driver. The 362.00 WHQL certified driver provides “performance optimizations and a SLI profile” for the new game, is now available via GeForce Experience, as well as the manual driver download page.
(Image credit: Ubisoft)
The 362.00 WHQL driver also supports the new Gears of War: Ultimate Edition, a remastered version of the 2007 PC release that includes Windows 10-only enhancements such as 4K resolution support and unlocked frame rates. (Why these "need" to be Windows 10 exclusives can be explained by checking the name of the game’s publisher: Microsoft Studios.)
(Image credit: Microsoft)
Here’s a list of what’s new in version 362.00 of the driver:
- Added Beta support on GeForce GTX GPUs for external graphics over Thunderbolt 3. GPUs supported include all GTX 900 series, Titan X, and GeForce GTX 750 and 750Ti.
- As of Windows 10 November Update, Fermi GPUs now use WDDM 2.0 in single GPU configurations.
For multi-GPU configurations, WDDM usage is as follows:
- In non-SLI multi-GPU configurations, Fermi GPUs use WDDM 2.0. This includes configurations where a Fermi GPU is used with Kepler or Maxwell GPUs.
- In SLI mode, Fermi GPUs still use WDDM 1.3.
Application SLI Profiles
Added or updated the following SLI profiles:
- Assassin's Creed Syndicate - SLI profile changed (with driver code as well) to make the application scale better
- Bless - DirectX 9 SLI profile added, SLI set to SLI-Single
- DayZ - SLI AA and NVIDIA Control Panel AA enhance disabled
- Dungeon Defenders 2 - DirectX 9 SLI profile added
- Elite Dangerous - 64-bit EXE added
- Hard West - DirectX 11 SLI profile added
- Metal Gear Solid V: The Phantom Pain - multiplayer EXE added to profile
- Need for Speed - profile EXEs updated to support trial version of the game
- Plants vs Zombies Garden Warfare 2 - SLI profile added
- Rise of the Tomb Raider - profile added
- Sebastien Loeb Rally Evo - profile updated to match latest app behavior
- Tom Clancy's Rainbow Six: Siege - profile updated to match latest app behavior
- Tom Clancy's The Division - profile added
- XCOM 2 - SLI profile added (including necessary code change)
The "beta support on GeForce GTX GPUs for external graphics over Thunderbolt 3" is certainly an interesting addition, and one that could eventually lead to external graphics solutions for notebooks, coming on the heels of AMD teasing its own standardization of external GPUs.
The full release 361 (GeForce 362.00) notes can be viewed here (warning: PDF).
Things are about to get...complicated
Earlier this week, the team behind Ashes of the Singularity released an update to its early access game, expanding its features and capabilities. With support for both DirectX 11 and DirectX 12, plus newly added multi-GPU support, the game includes a benchmark mode that got quite a lot of attention. We saw stories based on that software posted by Anandtech, Guru3D, and ExtremeTech, all of which had varying views on the advantages of one GPU or another.
That isn’t the focus of my editorial here today, though.
Shortly after the initial release, a discussion began around results from the Guru3D story that measured frame time consistency and smoothness with FCAT, a capture-based testing methodology much like the Frame Rating process we have here at PC Perspective. In a post on ExtremeTech, Joel Hruska claims that the results and conclusion from Guru3D are wrong, because the FCAT capture method assumes the measured output matches what the user actually experiences. Maybe everyone is wrong?
First a bit of background: I have been working with Oxide and the Ashes of the Singularity benchmark for a couple of weeks, hoping to get a story that I was happy with and felt was complete, before having to head out the door to Barcelona for the Mobile World Congress. That didn’t happen – such is life with an 8-month old. But, in my time with the benchmark, I found a couple of things that were very interesting, even concerning, that I was working through with the developers.
FCAT overlay as part of the Ashes benchmark
First, the initial implementation of the FCAT overlay, which Oxide should be PRAISED for including since we don't have (and likely won't have) a universal DX12 variant, was implemented incorrectly, with duplicated color swatches that made the results from capture-based testing inaccurate. I don’t know if Guru3D used that version to do its FCAT testing, but I was able to get some updated EXEs of the game from the developer in order to get the overlay working correctly. Once that was corrected, I found yet another problem: an issue of frame presentation order on NVIDIA GPUs that likely has to do with asynchronous shaders. Whether that issue is on the NVIDIA driver side or the game engine side is still being investigated by Oxide, but it’s interesting to note that this problem couldn’t have been found without a proper FCAT implementation.
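The duplicated-swatch problem is easier to see with a toy version of what capture-based analysis does. FCAT reads a colored bar that the overlay draws on each rendered frame, then counts contiguous runs of each color in the captured output to determine how long each frame was on screen. The sketch below is an illustration of that principle only, not the real tool:

```python
from itertools import groupby

def frame_runs(scanline_colors):
    """Group a capture's per-scanline overlay colors into contiguous runs,
    returning (color, scanlines_on_screen) per rendered frame."""
    return [(color, len(list(group))) for color, group in groupby(scanline_colors)]

# Correct overlay: every frame gets a distinct color, so runs separate cleanly.
ok = ["red"] * 3 + ["lime"] * 5 + ["blue"] * 2
print(frame_runs(ok))     # [('red', 3), ('lime', 5), ('blue', 2)]

# Buggy overlay: duplicated swatches merge two frames into one run, so a
# capture-based analysis under-counts frames and mis-measures their times.
buggy = ["red"] * 3 + ["red"] * 5 + ["blue"] * 2
print(frame_runs(buggy))  # [('red', 8), ('blue', 2)]
```

With duplicated colors, two distinct frames are indistinguishable from one long frame, which is exactly why the capture-based numbers came out wrong until Oxide fixed the overlay.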
With all of that water under the bridge, I set out to benchmark this latest version of Ashes and DX12 to measure performance across a range of AMD and NVIDIA hardware. The data showed some abnormalities, though. Some results just didn’t make sense in the context of what I was seeing in the game and what the overlay results were indicating. It appeared that Vsync (vertical sync) was working differently than I had seen with any other game on the PC.
For the NVIDIA platform, tested using a GTX 980 Ti, the game seemingly randomly starts up with Vsync on or off, with no clear indicator of what was causing it, despite the in-game settings being set how I wanted them. But the Frame Rating capture data was still working as I expected; just because Vsync is enabled doesn't mean you can't analyze the results in capture formats. I have written stories on what Vsync-enabled captured data looks like, and what it means, as far back as April 2013. Obviously, to get the best and most relevant data from Frame Rating, setting vertical sync off is ideal. Running into more frustration than answers, I moved over to an AMD platform.