Podcast #296 - NVIDIA's 337.50 Driver Improvements, Corsair H105, Intel Haswell Refresh details and more!

Subject: General Tech | April 17, 2014 - 02:58 PM |
Tagged: podcast, video, nvidia, 337.50, corsair, H105, amd, Intel, haswell, devil's canyon

PC Perspective Podcast #296 - 04/17/2014

Join us this week as we discuss NVIDIA's 337.50 Driver Improvements, Corsair H105, Intel Haswell Refresh details and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath and Allyn Malventano

Program length: 1:25:06
 

Be sure to subscribe to the PC Perspective YouTube channel!!

 

 

Author:
Manufacturer: NVIDIA

SLI Testing

Let's see if I can start this story without sounding too much like a broken record compared to the news post I wrote late last week on the subject of NVIDIA's new 337.50 driver. In March, while attending the Game Developers Conference to learn about the upcoming DirectX 12 API, I sat down with NVIDIA to talk about changes coming to its graphics driver that would affect current users with shipping DX9, DX10, and DX11 games.

As I wrote then:

What NVIDIA did want to focus on with us was the significant improvements that have been made to the efficiency and performance of DirectX 11.  When NVIDIA is questioned as to why they didn’t create their own Mantle-like API if Microsoft was dragging its feet, they point to the vast improvements possible, and already made, with existing APIs like DX11 and OpenGL. The idea is that rather than spend resources on creating a completely new API that needs to be integrated into a totally unique engine port (see Frostbite, CryEngine, etc.), NVIDIA has instead improved the performance, scaling, and predictability of DirectX 11.

NVIDIA claims that these fixes are not game-specific and will improve performance and efficiency for a lot of GeForce users. Even if that is the case, we will only really see these improvements surface in titles that have addressable CPU limits or on very low-end hardware, similar to how Mantle works today.

In truth, this is something that both NVIDIA and AMD have likely been doing all along, but NVIDIA has renewed purpose with the pressure that AMD's Mantle has placed on them, at least from a marketing and PR point of view. It turns out that the driver that starts to implement all of these efficiency changes is the recent 337.50 release, and on Friday I wrote up a short story that tested a particularly good example of the performance changes, Total War: Rome II, with a promise to follow up this week with additional hardware and games. (As it turns out, the Rome II results are...an interesting story. More on that on the next page.)

slide1.jpg

Today I will be looking at a seemingly random collection of gaming titles, running on a reconfigured test bed we had in the office, in an attempt to get some idea of the overall robustness of the 337.50 driver and its advantages over the 335.23 release that came before it. Does NVIDIA have solid ground to stand on when it comes to the capabilities of current APIs versus what AMD is offering today?

Continue reading our analysis of the new NVIDIA 337.50 Driver!!

NVIDIA GeForce Driver 337.50 Early Results are Impressive

Subject: Graphics Cards | April 11, 2014 - 03:30 PM |
Tagged: nvidia, geforce, dx11, driver, 337.50

UPDATE: We have put together a much more comprehensive story based on the NVIDIA 337.50 driver that includes more cards and more games while also disputing the Total War: Rome II results seen here. Be sure to read it!!

When I spoke with NVIDIA after the announcement of DirectX 12 at GDC this past March, a lot of the discussion centered around a pending driver release that promised impressive performance advances with current DX11 hardware and DX11 games. 

What NVIDIA did want to focus on with us was the significant improvements that have been made to the efficiency and performance of DirectX 11.  When NVIDIA is questioned as to why they didn’t create their own Mantle-like API if Microsoft was dragging its feet, they point to the vast improvements possible, and already made, with existing APIs like DX11 and OpenGL. The idea is that rather than spend resources on creating a completely new API that needs to be integrated into a totally unique engine port (see Frostbite, CryEngine, etc.), NVIDIA has instead improved the performance, scaling, and predictability of DirectX 11.

09.jpg

NVIDIA claims that these fixes are not game-specific and will improve performance and efficiency for a lot of GeForce users. Even if that is the case, we will only really see these improvements surface in titles that have addressable CPU limits or on very low-end hardware, similar to how Mantle works today.

Lofty goals to be sure. This driver was released last week and I immediately wanted to test and verify many of these claims. However, a certain other graphics project kept me occupied most of the week and then a short jaunt to Dallas kept me from the task until yesterday. 

To be clear, I am planning to look at several more games and card configurations next week, but I thought it was worth sharing our first set of results. The test bed in use is the same as our standard GPU reviews.

Test System Setup
CPU: Intel Core i7-3960X (Sandy Bridge-E)
Motherboard: ASUS P9X79 Deluxe
Memory: Corsair Dominator DDR3-1600 16GB
Hard Drive: OCZ Agility 4 256GB SSD
Sound Card: On-board
Graphics Cards: NVIDIA GeForce GTX 780 Ti 3GB, NVIDIA GeForce GTX 770 2GB
Graphics Drivers: NVIDIA 335.23 WHQL, 337.50 Beta
Power Supply: Corsair AX1200i
Operating System: Windows 8 Pro x64

The most interesting claims from NVIDIA were spikes as high as 70%+ in Total War: Rome II, so I decided to start there. 

First up, let's take a look at the SLI results for the GTX 780 Ti, NVIDIA's flagship gaming card.

TWRome2_2560x1440_OFPS.png

TWRome2_2560x1440_PER.png

TWRome2_2560x1440_PLOT.png

With this title running at the Extreme preset, the GTX 780 Ti SLI configuration jumps from an average frame rate of 59 FPS to 88 FPS, an increase of 48%! Frame rate variance does increase a bit with the faster average frame rate, but it stays within the limits of smoothness, if only barely.

Next up, the GeForce GTX 770 SLI results.

TWRome2_2560x1440_OFPS.png

TWRome2_2560x1440_PER.png

TWRome2_2560x1440_PLOT.png

Results here are even more impressive, as the pair of GeForce GTX 770 cards running in SLI jumps from an average of 29.5 FPS to 51 FPS, an increase of 72%!! Even better, this occurs without any increase in frame rate variance; in fact, the blue line of the 337.50 driver actually performs better in that respect.

All of these tests were run with the latest patch for Total War: Rome II, and I did specifically ask NVIDIA if there were any differences in the SLI profiles between these two drivers for this game. I was told absolutely not - this just happens to be the poster child example of the changes NVIDIA has made with this DX11 efficiency push.
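
The variance observations above come from frame time data rather than FPS averages alone. As a rough illustration of the percentile math behind charts like these, here is a minimal, hypothetical sketch (the log values are made up, and the actual Frame Rating capture pipeline is considerably more involved):

```python
import statistics

def frame_time_summary(frame_times_ms, percentiles=(50, 90, 95, 99)):
    """Summarize a frame time log (milliseconds per frame).

    A large gap between the tail percentiles and the median indicates
    frame-to-frame variance (stutter) even when the average FPS looks good.
    """
    ordered = sorted(frame_times_ms)
    n = len(ordered)
    summary = {f"p{p}_ms": ordered[min(n - 1, n * p // 100)] for p in percentiles}
    summary["avg_fps"] = 1000.0 / statistics.fmean(frame_times_ms)
    return summary

# Hypothetical capture: mostly ~11 ms frames with two spikes.
log = [11.4, 11.2, 11.9, 11.3, 24.8, 11.5, 11.1, 11.6, 11.3, 18.2]
print(frame_time_summary(log))
```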

Of course, not all games are going to see performance improvements like this, or even improvements that are measurable at all. Just as we have seen with other driver enhancements over the years, different hardware configurations, image quality settings, and even the scenes used to test each game will shift the deltas considerably. I can tell you already, based on some results I have (but am holding for my story next week), that performance improvements in other games range from <5% up to 35%+. While those aren't reaching the 72% level we saw in Total War: Rome II above, these kinds of experience changes from a driver update are impressive to see.
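
For anyone checking the math, the percentages quoted throughout are simple relative changes in average frame rate. A minimal sketch using the rounded Rome II averages from the charts above (the results land within a point of the quoted 48% and 72%, which were presumably computed from unrounded data):

```python
def fps_delta(old_fps: float, new_fps: float) -> float:
    """Relative change in average frame rate, as a percentage."""
    return (new_fps - old_fps) / old_fps * 100.0

# Rounded averages from the Total War: Rome II runs.
print(f"GTX 780 Ti SLI: {fps_delta(59.0, 88.0):+.0f}%")  # ~+49%
print(f"GTX 770 SLI:    {fps_delta(29.5, 51.0):+.0f}%")  # ~+73%
```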

Even though we are likely looking at the "best case" for NVIDIA's 337.50 driver changes with the Rome II results here, clearly there is merit behind what the company is pushing. We'll have more results next week!

Podcast #295 - AMD Radeon R9 295X2, AMD AM1 Socket SoCs, Building a 1080P Gaming PC for under $550 and more

Subject: General Tech | April 10, 2014 - 02:25 PM |
Tagged: podcast, video, amd, 295x2, AM1, Plextor M6e, nvidia, 337.50, GFE

PC Perspective Podcast #295 - 04/10/2014

Join us this week as we discuss the AMD Radeon R9 295X2, AMD AM1 Socket SoCs, Building a 1080P Gaming PC for under $550 and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath and Allyn Malventano

 
This podcast is brought to you by Cooler Master and the CM Storm Pulse-R Gaming Headset!
 
Program length: 1:22:06
 
  1. Week in Review:
  2. 0:51:18 This podcast is brought to you by Cooler Master and the CM Storm Pulse-R Gaming Headset!
  3. News items of interest:
    1. 1:03:10 NAB News
  4. Hardware/Software Picks of the Week:
    1. Jeremy: Move over mineral oil, 3M's Novec
    2. Allyn: For those with too many tabs in Chrome - OneTab
  5. Closing/outro

Be sure to subscribe to the PC Perspective YouTube channel!!

 

 

Congratulations, your SHIELD has been upgraded to the next level

Subject: General Tech | April 9, 2014 - 02:59 PM |
Tagged: gaming, nvidia, shield

If you own NVIDIA's SHIELD, you may have noticed an over-the-air update notification recently, and you should take advantage of it.  Legit Reviews has assembled a look at the features you get from this update, from broader access to GameStream, both on your SHIELD and from any PC with a modern NVIDIA GPU, to a new OS, Android 4.4.2 KitKat.  Check out how well NVIDIA implemented these updates in the full article.

nvidia-shield-deck-6-645x363.jpg

"The NVIDIA SHIELD is potentially the next evolution of mobile gaming. The SHIELD in a nutshell is a blend of mobile android devices, PC gaming, and console gaming. When the NVIDIA SHIELD first came out last year, it was carrying a price tag of $299, eventually that dropped to $249 and that remains the current price for the SHIELD. Though for right now through the end of April NVIDIA has lowered the price to $199 to celebrate the latest and greatest over the air update that we announced last week here."

Here is some more Tech News from around the web:

Gaming

Some NVIDIA R337.50 Driver Controversy

Subject: General Tech, Graphics Cards | April 8, 2014 - 06:44 PM |
Tagged: nvidia, geforce, drivers

NVIDIA's GeForce 337.50 Driver was said to address performance when running DirectX 11-based software. Now that it is out, multiple sources are claiming the vendor-supplied benchmarks are exaggerated or simply untrue.

nvidia-337-sli.png

ExtremeTech compiled benchmarks from Anandtech and BlackHoleTec.

Going alphabetically, Anandtech tested the R337.50 and R331.xx drivers with a GeForce GTX 780 Ti, finding a double-digit increase in BioShock: Infinite and Metro: Last Light, and basically zero improvement in GRID 2, Rome II, Crysis: Warhead, Crysis 3, and Company of Heroes 2. Adding a second GTX 780 Ti into the mix helped matters, with a 76% increase in Rome II and about 9% in most of the other titles.

BlackHoleTec is next. Testing a mid-range but overclocked GeForce GTX 760 between the R337.50 and R335.23 drivers, they found slight improvements (1-3 FPS), except in Battlefield 4 and Skyrim (the latter is not DX11, to be fair), which saw a slight reduction in performance (about 1 FPS).

ExtremeTech, finally, published one benchmark of its own, but it did not compare between drivers. All it really shows is CPU scaling with AMD GPUs.

Unfortunately, I do not have any benchmarks of my own to present because I am not a GPU reviewer, nor do I have a GPU testbed. Ironically, the launch of the Radeon R9 295X2 video card might have lessened the number of benchmarks available for NVIDIA's driver, who knows?

If it is true, and R337.50 does basically nothing in a single-GPU setup, I am not exactly sure what NVIDIA was hoping to accomplish. Of course someone was going to test it and publish their results. The point of the driver update was apparently to show how having a close relationship with Microsoft can lead to better PC gaming products now and in the future. That can really only be the story if you have something to show. Now, at least I expect, we will probably see more positive commentary about Mantle - at least when people are not talking about DirectX 12.

If you own a GeForce card, I would still install the new driver, especially if you have an SLI configuration. Scaling to a second GPU does see measurable improvements with Release 337.50, and even for a single-card configuration, it certainly should not hurt anything.

Source: ExtremeTech

NVIDIA Will Present Global Impact Award And $150,000 Grant To Researchers At GTC 2015

Subject: General Tech | April 8, 2014 - 05:03 PM |
Tagged: research, nvidia, GTC, gpgpu, global impact award

During the GPU Technology Conference last month, NVIDIA introduced a new annual grant called the Global Impact Award. The grant awards $150,000 to researchers using NVIDIA GPUs to study issues with worldwide impact, such as disease research, drug design, medical imaging, genome mapping, urban planning, and other "complex social and scientific problems."

NVIDIA Global Impact Award.png

NVIDIA will be presenting the Global Impact Award to the winning researcher or non-profit institution at next year's GPU Technology Conference (GTC 2015). Individual researchers, universities, and non-profit research institutions that are using GPUs as a significant enabling technology in their research are eligible for the grant. Both third-party and self-nominations (.doc form) are accepted, with the nominated candidates evaluated on several factors, including the level of innovation, the social impact, and the current state of the research and its effectiveness in approaching the problem. Submissions for nominations are due by December 12, 2014, with the finalists being announced by NVIDIA on March 13, 2015. NVIDIA will then reveal the winner of the $150,000 grant at GTC 2015 (April 28, 2015).

The researcher, university, or non-profit firm can be located anywhere in the world, and the grant money can be assigned to a department, an initiative, or a single project. The massively parallel nature of modern GPUs makes them ideal for many types of research with scalable projects, and I think the Global Impact Award is a welcome incentive to encourage the use of GPGPU in applicable research projects. I am interested to see what the winner will do with the money and where the research leads.

More information on the Global Impact Award can be found on the NVIDIA website.

Source: NVIDIA

NVIDIA 337.50 Driver and GeForce Experience 2.0 Released

Subject: General Tech, Graphics Cards | April 7, 2014 - 09:01 AM |
Tagged: nvidia, geforce experience, directx 11

We knew that NVIDIA had an impending driver update providing DirectX 11 performance improvements. Launched today, 337.50 still claims significant performance increases over the previous 335.23 version. What was a surprise is GeForce Experience 2.0. This version allows both ShadowPlay and GameStream to operate on notebooks. It also allows ShadowPlay to record your Windows desktop, and apparently stream it to Twitch (but not on notebooks), and it enables Battery Boost, discussed previously.

nvidia-shadowplay-desktop.png

Personally, I find desktop streaming to be the headlining feature, although I rarely use laptops (and much less for gaming). It is especially useful for OpenGL titles, for games that run in windowed mode, and for the occasional screencast without paying for Camtasia or tinkering with CamStudio. If I were to make a critique, and of course I will, I would like the option to select which monitor gets recorded; the current behavior records the primary monitor, as far as I can tell.

I should also mention that, in my testing, "shadow recording" is not supported unless you are recording a fullscreen game. I'm guessing that NVIDIA believes its users would prefer that their desktops not be recorded until manually started and likewise stopped; it seems like it had to have been a conscious decision. It does limit the feature's usefulness for OpenGL or windowed games, however.

This driver also introduces GameStream for devices outside of your home, as discussed in the SHIELD update.

nvidia-337-sli.png

This slide shows SLI improvements, driver to driver, for the GTX 770 and the GTX 780 Ti.

As for the performance boost, NVIDIA claims up to 64% faster performance in single-GPU configurations and up to 71% faster in SLI. It will obviously vary on a game-by-game and GPU-by-GPU basis. I do not have any benchmarks of my own to share, besides the few examples provided by NVIDIA. That said, it is a free driver: if you have a GeForce GPU, download it. It does complicate matters if you are deciding between AMD and NVIDIA, however.

Source: NVIDIA

GTC 2014: NVIDIA Launches Iray VCA Networked Rendering Appliance

Subject: General Tech, Graphics Cards | April 1, 2014 - 04:42 PM |
Tagged: VCA, nvidia, GTC 2014

NVIDIA launched a new visual computing appliance called the Iray VCA at the GPU Technology Conference last week. This new piece of enterprise hardware uses full GK110 graphics cards to accelerate the company’s Iray renderer, which is used to create photorealistic models in various design programs.

NVIDIA IRAY VCA.jpg

The Iray VCA is a licensed appliance that pairs NVIDIA hardware with NVIDIA software. On the hardware side of things, the Iray VCA is powered by eight graphics cards, dual processors (unspecified, but likely Intel Xeons based on their usage in last year’s GRID VCA), 256GB of system RAM, and a 2TB SSD. Networking hardware includes two 10GbE NICs, two 1GbE NICs, and one InfiniBand connection. In total, the Iray VCA features 20 CPU cores and 23,040 CUDA cores. The GPUs are based on the full GK110 die and are paired with 12GB of memory each.

Even better, it is a scalable solution: companies can add additional Iray VCAs to the network, and the appliances reportedly accelerate the Iray renders running on designers’ workstations transparently. NVIDIA reports that a single Iray VCA is approximately 60 times faster than a Quadro K5000-powered workstation. Further, according to NVIDIA, 19 Iray VCAs working together amount to 1 PetaFLOPS of compute performance, which is enough to render photorealistic simulations using 1 billion rays with up to hundreds of thousands of bounces.
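
NVIDIA's 1 PetaFLOPS figure for 19 appliances implies roughly 52 TFLOPS of single-precision compute per VCA, which is plausible for eight full GK110 GPUs. Here is a back-of-the-envelope check; note that the boost clock below is an assumed value, since NVIDIA did not publish clock speeds for the VCA's GPUs:

```python
# Back-of-the-envelope peak single-precision throughput for an Iray VCA
# cluster. A Kepler CUDA core can retire one fused multiply-add
# (2 FLOPs) per clock.
CUDA_CORES_PER_VCA = 23_040      # 8 GPUs x 2,880 cores (full GK110)
FLOPS_PER_CORE_PER_CLOCK = 2     # one FMA per core per clock
ASSUMED_BOOST_CLOCK_HZ = 1.1e9   # hypothetical ~1.1 GHz; not published

per_vca = CUDA_CORES_PER_VCA * FLOPS_PER_CORE_PER_CLOCK * ASSUMED_BOOST_CLOCK_HZ
cluster = 19 * per_vca
print(f"Per VCA: {per_vca / 1e12:.1f} TFLOPS")   # ~50.7 TFLOPS
print(f"19 VCAs: {cluster / 1e15:.2f} PFLOPS")   # ~0.96 PFLOPS
```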

DSC01431.JPG

The Iray VCA enables some rather impressive real-time renders of 3D models with realistic physical properties and lighting. The models are light simulations that use ray tracing, global illumination, and other techniques to show photorealistic models using up to billions of rays of light. NVIDIA is positioning the Iray VCA as an alternative to physical prototyping, allowing designers to put together virtual prototypes that can be iterated on and changed at significantly lower cost and in less time.

DSC01447.JPG

Iray itself is NVIDIA’s GPU-accelerated photorealistic renderer, and the technology is used in a number of design software packages. The Iray VCA is meant to further accelerate the renderer by throwing massive amounts of parallel processing hardware at this resource-intensive problem over the network (the Iray VCAs can be installed at a data center or kept on site). Initially, the Iray VCA will support 3ds Max, Catia, Bunkspeed, and Maya, but NVIDIA is working on supporting all Iray-accelerated software with the VCA hardware.

GTC 2014 IRAY VCA Renders Honda Car Interior In Real Time.jpg

The virtual prototypes can be sliced and examined, and they can even be placed in real-world environments by importing HDR photos. Jen-Hsun Huang demonstrated this by (virtually) placing Honda’s vehicle model on the GTC stage.

DSC01450.JPG

In fact, one of NVIDIA’s initial partners for the Iray VCA is Honda, which is currently beta testing a cluster of 25 Iray VCAs to refine styling designs for cars and their interiors based on initial artistic work. Honda Research and Development System Engineer Daisuke Ide was quoted by NVIDIA as stating that “Our TOPS tool, which uses NVIDIA Iray on our NVIDIA GPU cluster, enables us to evaluate our original design data as if it were real. This allows us to explore more designs so we can create better designs faster and more affordably.”

The Iray VCA (PDF) will be available this summer for $50,000. The sticker price includes the hardware, Iray license, and the first year of updates and maintenance. This is far from consumer technology, but it is interesting technology that may be used in the design process of your next car or other major purchase.

What do you think about the Iray VCA and NVIDIA's licensed hardware model?

Podcast #293 - NVIDIA Titan-Z, ASUS ROG Poseidon 780, News from OculusVR and more!

Subject: General Tech | March 27, 2014 - 02:42 PM |
Tagged: W9100, video, titan z, poseidon 780, podcast, Oculus, nvidia, GTC, GDC

PC Perspective Podcast #293 - 03/27/2014

Join us this week as we discuss the NVIDIA Titan-Z, ASUS ROG Poseidon 780, News from OculusVR and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath and Allyn Malventano

 
This podcast is brought to you by Cooler Master and the CM Storm Pulse-R Gaming Headset!
 
Program length: 1:19:03
  1. Week in Review:
    1. 0:10:45 Microsoft's DirectX 12 (Live Blog)
  2. 0:37:07 This podcast is brought to you by Cooler Master and the CM Storm Pulse-R Gaming Headset
  3. News items of interest:
  4. Hardware/Software Picks of the Week:
    1. Josh: Certainly not a Skype Connection to the Studio
  5. Closing/outro

Be sure to subscribe to the PC Perspective YouTube channel!!