NVIDIA's new Tesla M40 series

Subject: General Tech | November 11, 2015 - 06:12 PM |
Tagged: nvidia, Tesla M40, neural net, JetsonTX1

There are a lot of terms tossed about, such as AI research and machine learning, which refer to the work of designing neural nets by feeding huge amounts of data into an architecture capable of forming and weighting connections, in an attempt to create a system that can process that input in a meaningful way.  You might be familiar with some of the more famous experiments, such as Google's Deep Dream and Wolfram's Language Image Identification Project.  As you might expect this takes a huge amount of computational power, and NVIDIA has just announced the Tesla M40 accelerator card for training deep neural nets.  It is fairly low powered at 50-75W of draw, and NVIDIA claims it will be able to deal with five times more simultaneous video streams than previous products.  Along with it comes the Hyperscale Suite software, designed specifically to work on the new hardware, which Jen-Hsun Huang comments on over at The Inquirer.
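
To put "training a neural net" in concrete terms, below is about the smallest possible example of the idea: a toy two-layer network fitted to XOR with plain NumPy and gradient descent. It is purely illustrative and has nothing to do with NVIDIA's actual software stack; real deep-learning workloads run GPU-accelerated frameworks on cards like the M40, but the loop of forward pass, error gradient and weight update is the same idea scaled up enormously.

    # Toy sketch of "training a neural net": a tiny two-layer network
    # fitted to XOR with gradient descent. Purely illustrative; real
    # workloads use GPU frameworks on hardware like the Tesla M40.
    import numpy as np

    rng = np.random.default_rng(0)

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1, b1 = rng.normal(scale=0.5, size=(2, 8)), np.zeros(8)   # input -> hidden
    W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)   # hidden -> output

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    lr = 0.5
    for step in range(5000):
        # Forward pass: the network's current predictions.
        h = np.tanh(X @ W1 + b1)
        p = sigmoid(h @ W2 + b2)

        # Backward pass: gradients of the squared error w.r.t. each layer.
        d_out = (p - y) * p * (1 - p)
        d_hid = (d_out @ W2.T) * (1 - h ** 2)

        # Gradient-descent update: the "weighting of connections".
        W2 -= lr * (h.T @ d_out)
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_hid)
        b1 -= lr * d_hid.sum(axis=0)

    print(np.round(p, 2))  # predictions should approach [[0], [1], [1], [0]]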

At the end of the presentation he also mentioned the tiny Jetson TX1 SoC.  It has a 256-core Maxwell GPU capable of 1 TFLOPS, a 64-bit ARM A57 CPU and 4GB of memory, and it communicates via Ethernet or Wi-Fi, all on a card 50 x 87 mm (2 x 3.4 in) in size.  It will be available at $300 when it is released some time early next year.
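
As a rough sanity check on that 1 TFLOPS figure, a bit of back-of-the-envelope arithmetic helps; note that the ~1 GHz clock and the half-precision interpretation below are my assumptions rather than anything from the announcement.

    # Rough peak-throughput arithmetic for the Jetson TX1's Maxwell GPU.
    # The ~1 GHz clock and the FP16 interpretation are assumptions for
    # illustration; the announcement itself only quotes "1 TFLOPS".
    cuda_cores = 256
    gpu_clock_hz = 1.0e9        # assumed ~1 GHz GPU clock
    fp32_ops_per_core = 2       # one fused multiply-add per clock = 2 FLOPs

    fp32_peak = cuda_cores * fp32_ops_per_core * gpu_clock_hz
    fp16_peak = fp32_peak * 2   # two packed FP16 ops per FP32 lane

    print(f"FP32 peak: {fp32_peak / 1e12:.2f} TFLOPS")   # ~0.51 TFLOPS
    print(f"FP16 peak: {fp16_peak / 1e12:.2f} TFLOPS")   # ~1.02 TFLOPS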

hyperscale-datacentre-nvidia-540x334.jpeg

"Machine learning is the grand computational challenge of our generation. We created the Tesla hyperscale accelerator line to give machine learning a 10X boost. The time and cost savings to data centres will be significant."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

NVIDIA Releases Driver 358.91 for Fallout 4, Star Wars Battlefront, Legacy of the Void

Subject: Graphics Cards | November 9, 2015 - 01:44 PM |
Tagged: nvidia, geforce, 358.91, fallout 4, Star Wars, battlefront, starcraft, legacy of the void

It's a huge month for PC gaming with the release of Bethesda's Fallout 4 and EA's Star Wars Battlefront likely to take up hours and hours of your (and my) time in the lead up to the holiday season. NVIDIA just passed over links to its latest "Game Ready" driver, version 358.91.

bethesda-2015-fallout4-official-2.jpg

Fallout 4 is going to be impressive graphically

Here's the blurb from NVIDIA directly:

Continuing to fulfill our commitment to GeForce gamers to have them Game Ready for the top Holiday titles, today we released a new Game Ready driver.  This Game Ready driver will get GeForce Gamers set-up for tomorrow’s release of Fallout 4, as well as Star Wars Battlefront, StarCraft II: Legacy of the Void. WHQLed and ready for the Fallout wasteland, driver version 358.91 will deliver the best experience for GeForce gamers in some of the holiday’s hottest titles.

Other than learning that NVIDIA considers "WHQLed" to be a verb now, this is good news for PC gamers looking to dive into the world of Fallout or take up arms against the Empire on the day of release. I honestly believe that these kinds of software updates and frequent driver improvements timed to major game releases are one of the biggest advantages that GeForce gamers have over Radeon users, though I hold out hope that the red team will get on the same cadence with one Raja Koduri in charge.

You can also find more information from NVIDIA about configuration with its own GPUs for Fallout 4 and for Star Wars Battlefront on GeForce.com.

Source: NVIDIA

NVIDIA Confirms Clock Speed, Power Increases at High Refresh Rates, Promises Fix

Subject: Graphics Cards | November 6, 2015 - 04:05 PM |
Tagged: ROG Swift, refresh rate, pg279q, nvidia, GTX 980 Ti, geforce, asus, 165hz, 144hz

Last month I wrote a story that detailed some odd behavior with NVIDIA's GeForce GTX graphics cards and high refresh rate monitors, in particular with the new ASUS ROG Swift PG279Q that has a rated 165Hz refresh rate. We found that when running this monitor at 144Hz or higher refresh rate, idle clock speeds and power consumption of the graphics card increased dramatically.

The results are much more interesting than I expected! At 60Hz refresh rate, the monitor was drawing just 22.1 watts while the entire testing system was idling at 73.7 watts. (Note: the display was set to its post-calibration brightness of just 31.) Moving up to 100Hz and 120Hz saw very minor increases in power consumption from both the system and monitor.

powerdraw.png

But the jump to 144Hz is much more dramatic – idle system power jumps from 76 watts to almost 134 watts – an increase of 57 watts! Monitor power only increased by 1 watt at that transition though. At 165Hz we see another small increase, bringing the system power up to 137.8 watts.

When running the monitor at 60Hz, 100Hz and even 120Hz, the GPU clock speed sits comfortably at 135MHz. When we increase from 120Hz to 144Hz though, the GPU clock spikes to 885MHz and stays there, even at the Windows desktop. According to GPU-Z the GPU is running at approximately 30% of the maximum TDP.

We put NVIDIA on notice with the story and followed up with emails that included more information from other users, as well as additional testing completed after the story was posted. The result: NVIDIA has confirmed the issue exists and has a fix incoming!

In an email we got from NVIDIA PR last night: 

We checked into the observation you highlighted with the newest 165Hz G-SYNC monitors.
 
Guess what? You were right! That new monitor (or you) exposed a bug in the way our GPU was managing clocks for GSYNC and very high refresh rates.
 
As a result of your findings, we are fixing the bug which will lower the operating point of our GPUs back to the same power level for other displays.
 
We’ll have this fixed in an upcoming driver.

This actually supports an oddity we found before: we noticed that the PG279Q at 144Hz refresh was pushing GPU clocks up pretty high while a monitor without G-Sync support at 144Hz did not. We'll see if this addresses the entire gamut of experiences that users have had (and have emailed me about) with high refresh rate displays and power consumption, but at the very least NVIDIA is aware of the problems and working to fix them.

I don't have confirmation of WHEN I'll be able to test out that updated driver, but hopefully it will be soon, so we can confirm the fix works with the displays we have in-house. NVIDIA also hasn't confirmed what the root cause of the problem is - was it related to the clock domains as we had theorized? Maybe not, since this was a G-Sync specific display issue (based on the quote above). I'll try to weasel out the technical reasoning for the bug if we can and update the story later!
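
In the meantime, if you want to check your own setup - or verify the fix once that driver ships - polling the GPU's reported clock and board power while the desktop is idle is enough to see the behavior. Here is a minimal sketch; it assumes the nvidia-smi utility that ships with the driver is on your PATH, and GPU-Z will show the same readings if you prefer a GUI.

    # Minimal sketch: sample the graphics clock and board power at idle.
    # Assumes the nvidia-smi utility installed with the driver is on the PATH.
    import subprocess
    import time

    QUERY = ["nvidia-smi",
             "--query-gpu=clocks.gr,power.draw",
             "--format=csv,noheader"]

    for _ in range(10):
        # Example output line: "885 MHz, 68.52 W"
        print(subprocess.check_output(QUERY, text=True).strip())
        time.sleep(5)

On an unaffected setup the graphics clock should settle around 135MHz at the desktop; if it sticks at 885MHz with nothing running, you are seeing the same issue we did.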

NVIDIA Promoting Game Ready Drivers with Giveaway

Subject: Graphics Cards | November 4, 2015 - 09:01 AM |
Tagged: nvidia, graphics drivers, geforce, game ready

In mid-October, NVIDIA announced that Game Ready drivers would only be available through GeForce Experience with a registered email address, which we covered. Users are able to opt out of NVIDIA's mailing list, though. NVIDIA said that this would provide early access to new features, chances to win free hardware, and the ability to participate in the driver development process.

nvidia-geforce.png

Today's announcement follows up on the “win free hardware” part. The company will be giving away $100,000 worth of prizes, including graphics cards up to the GeForce GTX 980 Ti, game keys, and SHIELD Android TV boxes. To be eligible, users need to register with GeForce Experience and use it to download the latest Game Ready driver.

Speaking of Game Ready drivers, the main purpose of this blog post is to share the list of November/December games that are in this program. NVIDIA pledges to have optimized drivers for these titles on or before their release date:

  • Assassin's Creed: Syndicate
  • Call of Duty Black Ops III
  • Civilization Online
  • Fallout 4
  • Just Cause 3
  • Monster Hunter Online
  • Overwatch
  • RollerCoaster Tycoon World
  • StarCraft II: Legacy of the Void
  • Star Wars Battlefront
  • Tom Clancy's Rainbow Six Siege
  • War Thunder

As has been the case recently, NVIDIA also plans to get every Game Ready driver certified through Microsoft's WHQL driver certification program.

Source: NVIDIA

Podcast #373 - Samsung 950 Pro, ASUS ROG Swift PG279Q, Steam Link and more!

Subject: General Tech | October 29, 2015 - 03:22 PM |
Tagged: podcast, video, Samsung, 950 PRO, NVMe, asus, ROG Swift, pg279q, g-sync, nvidia, amd, steam, steam link, valve

PC Perspective Podcast #373 - 10/29/2015

Join us this week as we discuss the Samsung 950 Pro, ASUS ROG Swift PG279Q, Steam Link and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Josh Walrath, Allyn Malventano, and Sebastian Peak

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

 

Testing GPU Power Draw at Increased Refresh Rates using the ASUS PG279Q

Subject: Graphics Cards, Displays | October 24, 2015 - 04:16 PM |
Tagged: ROG Swift, refresh rate, pg279q, nvidia, GTX 980 Ti, geforce, asus, 165hz, 144hz

In the comments to our recent review of the ASUS ROG Swift PG279Q G-Sync monitor, a commenter by the name of Cyclops pointed me in the direction of an interesting quirk that I hadn’t considered before. According to reports, the higher refresh rates of some panels, including the 165Hz option available on this new monitor, can cause power draw to increase by as much as 100 watts on the system itself. While I did say in the review that the larger power brick ASUS provided with it (compared to last year’s PG278Q model) pointed toward higher power requirements for the display itself, I never thought to measure the system.

To set up a quick test I brought the ASUS ROG Swift PG279Q back to its rightful home in front of our graphics test bed, connected an EVGA GeForce GTX 980 Ti (with GPU driver 358.50) and chained both the PC and the monitor up to separate power monitoring devices. While sitting at a Windows 8.1 desktop I cycled the monitor through different refresh rate options and recorded the power draw from both meters after giving the system 60-90 seconds to idle out.

powerdraw.png

The results are much more interesting than I expected! At 60Hz refresh rate, the monitor was drawing just 22.1 watts while the entire testing system was idling at 73.7 watts. (Note: the display was set to its post-calibration brightness of just 31.) Moving up to 100Hz and 120Hz saw very minor increases in power consumption from both the system and monitor.

But the jump to 144Hz is much more dramatic – idle system power jumps from 76 watts to almost 134 watts – an increase of 57 watts! Monitor power only increased by 1 watt at that transition though. At 165Hz we see another small increase, bringing the system power up to 137.8 watts.
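
A quick bit of arithmetic on those wall-socket readings (using the approximate 76 and 134 watt figures quoted above) shows just how lopsided the 120Hz to 144Hz step is compared to the others:

    # Idle system power at the wall per refresh rate, from the readings above
    # (the 120 Hz and 144 Hz values are the approximate figures quoted).
    idle_system_watts = {60: 73.7, 120: 76.0, 144: 134.0, 165: 137.8}

    rates = sorted(idle_system_watts)
    for prev, curr in zip(rates, rates[1:]):
        delta = idle_system_watts[curr] - idle_system_watts[prev]
        pct = 100.0 * delta / idle_system_watts[prev]
        print(f"{prev} Hz -> {curr} Hz: +{delta:.1f} W ({pct:.0f}%)")
    # 60 Hz -> 120 Hz: +2.3 W (3%)
    # 120 Hz -> 144 Hz: +58.0 W (76%)
    # 144 Hz -> 165 Hz: +3.8 W (3%)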

Interestingly, we did find that the system would repeatedly jump to as much as 200+ watts of idle power draw for 30 seconds at a time and then drop back down to the 135-140 watt area for a few minutes. It was repeatable and very measurable.

So, what the hell is going on? A look at GPU-Z clock speeds reveals the source of the power consumption increase.

powerdraw2.png

When running the monitor at 60Hz, 100Hz and even 120Hz, the GPU clock speed sits comfortably at 135MHz. When we increase from 120Hz to 144Hz though, the GPU clock spikes to 885MHz and stays there, even at the Windows desktop. According to GPU-Z the GPU is running at approximately 30% of the maximum TDP.

Though details are sparse, it seems pretty obvious what is going on here. The pixel clock and the GPU clock are connected through the same domain and are not asynchronous. The GPU needs to maintain a certain pixel clock in order to support the required bandwidth of a particular refresh rate, and based on our testing, the idle clock speed of 135MHz doesn’t give the pixel clock enough throughput to power anything more than a 120Hz refresh rate.
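
As a ballpark check on that theory, you can estimate the pixel clock each refresh rate demands. The blanking figures in the sketch below are generic reduced-blanking assumptions, not the PG279Q's actual timings, so treat the numbers as rough orders of magnitude rather than exact values.

    # Ballpark pixel-clock requirement for 2560x1440 at several refresh rates.
    # H_BLANK and V_BLANK are assumed reduced-blanking values, not the
    # PG279Q's real timings; the results are rough estimates only.
    H_ACTIVE, V_ACTIVE = 2560, 1440
    H_BLANK, V_BLANK = 160, 40

    for refresh_hz in (60, 120, 144, 165):
        pixel_clock_hz = (H_ACTIVE + H_BLANK) * (V_ACTIVE + V_BLANK) * refresh_hz
        print(f"{refresh_hz:>3} Hz -> ~{pixel_clock_hz / 1e6:.0f} MHz pixel clock")
    #  60 Hz -> ~242 MHz
    # 120 Hz -> ~483 MHz
    # 144 Hz -> ~580 MHz
    # 165 Hz -> ~664 MHz

Whatever the exact relationship between the display pipeline and the core clock domain turns out to be, the bandwidth the card has to sustain keeps climbing with refresh rate, and 144Hz appears to be the point where the 135MHz idle state can no longer keep up.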

refreshsetup.jpg

Pushing refresh rates of 144Hz and higher causes a surprising increase in power draw

The obvious question here though is why NVIDIA would need to go all the way up to 885MHz in order to support the jump from 120Hz to 144Hz refresh rates. It seems quite extreme and the increased power draw is significant, causing the fans on the EVGA GTX 980 Ti to spin up even while sitting idle at the Windows desktop. NVIDIA is aware of the complication, though it appears that a fix won’t really be in order until an architectural shift is made down the road. With the ability to redesign the clock domains available to them, NVIDIA could design the pixel and GPU clock to be completely asynchronous, increasing one without affecting the other. It’s not a simple process though, especially in a processor this complex. We have seen Intel and AMD correctly and effectively separate clocks in recent years on newer CPU designs.

What happens to a modern AMD GPU like the R9 Fury with a similar test? To find out we connected our same GPU test bed to the ASUS MG279Q, a FreeSync enabled monitor capable of 144 Hz refresh rates, and swapped the GTX 980 Ti for an ASUS R9 Fury STRIX.

powerdrawamd1.png

powerdrawamd2.png

The AMD Fury does not demonstrate the same phenomenon that the GTX 980 Ti does when running at high refresh rates. The Fiji GPU runs at the same static 300MHz clock rate at 60Hz, 120Hz and 144Hz, and the power draw on the system only inches up by 2 watts or so. I wasn't able to test 165Hz refresh rates on the AMD setup, so it is possible that at that threshold the AMD graphics card would behave differently. It's also true that the NVIDIA Maxwell GPU is running at less than half the clock rate of AMD Fiji in this idle state, and that may account for the difference in pixel clock behavior we are seeing. Still, the NVIDIA platform draws slightly more power at idle than the AMD platform, so advantage AMD here.

For today, know that if you choose to use a 144Hz or even a 165Hz refresh rate on your NVIDIA GeForce GPU you are going to be drawing a bit more power and will be less efficient than expected even just sitting in Windows. I would bet that most gamers willing to buy high end display hardware capable of those speeds won’t be overly concerned with 50-60 watts of additional power draw, but it’s an interesting data point for us to track going forward and to compare AMD and NVIDIA hardware in the future.

Are NVIDIA and AMD ready for SteamOS?

Subject: Graphics Cards | October 23, 2015 - 03:19 PM |
Tagged: linux, amd, nvidia, steam os

Steam Machines powered by SteamOS are due to hit stores in the coming months, and to get the best performance you need to make sure that the GPU inside the machine plays nicely with the new OS.  To that end Phoronix has tested 22 GPUs: 15 NVIDIA cards ranging from a GTX 460 straight through to a TITAN X, and seven AMD cards from an HD 6570 through to the new R9 Fury.  Part of the reason fewer AMD cards made it into the testing stems from driver issues which prevented some models from functioning properly.  They tested BioShock Infinite, both Metro games, CS:GO and one of Josh's favourites, DiRT Showdown.  The performance results may not be what you expect and are worth checking out in full.  Phoronix also included cost-to-performance findings for budget-conscious gamers.

image.php_.jpg

"With Steam Machines set to begin shipping next month and SteamOS beginning to interest more gamers as an alternative to Windows for building a living room gaming PC, in this article I've carried out a twenty-two graphics card comparison with various NVIDIA GeForce and AMD Radeon GPUs while testing them on the Debian Linux-based SteamOS 2.0 "Brewmaster" operating system using a variety of Steam Linux games."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: Phoronix
Subject: Displays
Manufacturer: ASUS

Specifications

It's hard to believe that it has only been 14 months since the release of the first ASUS ROG Swift, the PG278Q, back in August of 2014. It seems like lifetimes have passed, with drama circling around other G-Sync panels, the first release of FreeSync screens, and a second generation of FreeSync panels that greatly improves overdrive. Now we sit in the middle of the second full wave of G-Sync screens. A lot can happen in this field if you blink.

The PG278Q was easily the best G-Sync monitor on the market for quite a long time. It offered performance, features and quality that very few other monitors could match, and it did it all while including support for NVIDIA's G-Sync variable refresh rate technology. If you are new to VRR tech and want to learn about G-Sync, you can check out our original editorial or an in-depth interview with NVIDIA's Tom Petersen. In short: letting the panel's refresh rate track the frame rate of the game avoids the usual V-sync trade-off between screen tearing and stutter or judder.
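
For those completely new to the concept, a toy example makes the fixed-refresh problem obvious: with V-sync on a fixed 60Hz display, a finished frame has to wait for the next 16.7 ms refresh boundary, so a frame that takes even slightly longer than that is held on screen for a full extra refresh. The frame times below are made up purely for illustration.

    import math

    # Toy illustration of fixed refresh vs. variable refresh rate (VRR).
    # Hypothetical render times; with V-sync on a fixed 60 Hz panel a frame
    # is shown only at the next 16.7 ms boundary, while a VRR panel simply
    # refreshes whenever the frame is ready.
    REFRESH_INTERVAL_MS = 1000 / 60

    for render_ms in (14.0, 18.0, 15.5, 21.0, 16.0):
        shown_ms = math.ceil(render_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
        print(f"render {render_ms:4.1f} ms -> fixed 60 Hz: {shown_ms:4.1f} ms, "
              f"VRR: {render_ms:4.1f} ms")

The 18 ms and 21 ms frames land a full refresh late on the fixed display (hence the stutter), while the VRR panel shows each frame as soon as it is done.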

icon.jpg

But a lot has changed since ASUS released the PG278Q, including higher quality monitors from the likes of Acer and BenQ. ASUS showed off some new G-Sync-ready displays at CES, but that was way back in January of 2015 - more than 10 months ago! The PG279Q was the most interesting to us then and remains that way today. There are some impressive specifications on the table, including a 27-in 2560x1440 IPS screen for improved color reproduction and viewing angles, a 165Hz maximum refresh rate, and the best build quality we have seen on a gaming monitor to date.

This time ASUS has a lot more competition to deal with, but can the ROG Swift PG279Q reignite ASUS's standing as the best G-Sync monitor provider? What kind of experience do you get for a $799 monitor today?

Continue reading our review of the ASUS ROG Swift PG279Q 165Hz 2560x1440 27-in IPS G-Sync Monitor!!

Gigabyte GTX 980 WATERFORCE Liquid-Cooled Graphics Card

Subject: Graphics Cards | October 21, 2015 - 07:18 AM |
Tagged: water cooling, nvidia, liquid cooled, GTX 980 WATERFORCE, GTX 980, GPU Water Block, gigabyte, AIO

Gigabyte has announced the GeForce GTX 980 WATERFORCE water-cooled graphics card, and this one is ready to go out of the box thanks to an integrated closed-loop liquid cooler.

wf01.png

In addition to full liquid cooling, the card - model GV-N980WAOC-4GD - also features "GPU Gauntlet Sorting", meaning that each card has a binned GTX 980 core for better overclocking performance.

"The GTX 980 WATERFORCE is fitted with only the top-performing GPU core through the very own GPU Gauntlet Sorting technology that guarantees superior overclocking capabilities in terms of excellent power switching and thermal efficiency. Only the strongest processors survived can be qualified for the GTX 980 WATERFORCE, which can fulfill both gaming enthusiasts’ and overclockers’ expectations with greater overclocking headroom, and higher, stable boost clocks under heavy load."

wf02.png

The cooling system for the GTX 980 WATERFORCE begins with a full-coverage block that cools the GPU, RAM and power delivery without the need for any additional fan for board components. The tubes carrying liquid to the radiator are 45 cm FEP, which Gigabyte says "effectively prevent...leak(s) and fare a lower coolant evaporation rate", and the system is connected to a 120 mm radiator.

Gigabyte says both the fan and the pump offer low noise output, and claims that this cooling system allows the GTX 980 WATERFORCE to "perform up to 38.8% cooler than the reference cooling" for cool and quiet gaming.

wf03.png

The WATERFORCE card also features two DVI outputs (the reference design has a single dual-link DVI) in addition to the standard three DisplayPort 1.2 and single HDMI 2.0 outputs of a GTX 980.

Pricing and availability have not been announced.

Source: Gigabyte

NVIDIA Releases Share Beta, Requires GFE for Future Beta Driver Downloads

Subject: Graphics Cards | October 15, 2015 - 12:01 PM |
Tagged: nvidia, geforce experience, beta drivers

NVIDIA just released a new driver, version 358.50, with an updated version of GeForce Experience that brings about some interesting changes to the program. First, let's talk about the positive changes, including beta access to the updated NVIDIA Share utility and improvements in GameStream.

As we first detailed with the release of the GeForce GTX 950, NVIDIA is making some impressive additions to the ShadowPlay portion of GeForce Experience, along with a rename to NVIDIA Share.

gfe-14.jpg

The idea is to add functionality to the ShadowPlay feature, including an in-game overlay to control the settings and options for local recording, and even an in-overlay editor and previewer for your videos. This allows the gamer to view, edit, snip and then upload those completed videos to YouTube directly, without ever having to leave the game. (Though you’ll obviously want to pause it before going through that process.) Capture and “Instant Replay” support is now capable of 4K / 60 Hz capture and upload as well – nice!

Besides added capability for the local recording portion of Share, NVIDIA is also adding some new features to the mix. NVIDIA Share will now allow point-to-point stream sharing, giving you the ability to send a link to a friend that they can open in a web browser to watch the game you are playing with very low latency. You could use this as a way to show your friend that new skill you learned in Rocket League, to try to convince them to pick up their own copy, or even just as a social event. It supports voice communication, for the ability to talk smack if necessary.

gfe-other.jpg

But it goes beyond just viewing the game – this point-to-point streaming allows the remote player to take over the controls to teach the local gamer something new or to finish a difficult portion of the game you might be stuck on. And if the game supports local multiplayer, you can BOTH play, as the remote gaming session emulates a second Xbox / SHIELD controller attached to the system! This does have a time limit of 1 hour, as a means to persuade game developers and publishers not to throw a hissy-fit.

The demo I saw recently was very impressive and it all worked surprisingly well out of the box.

gfe-5.jpg

Fans of NVIDIA's local-network GameStream might enjoy the upgrade to streaming games at 4K 60 FPS - as long as you have an NVIDIA SHIELD Android TV device connected to a 4K-capable TV in your home. Clearly this will make the visual presentation of your games on your television more impressive than ever, and NVIDIA has also added support for 5.1-channel surround sound pass-through.

There is another change coming with this release of GFE that might turn some heads, surrounding the frequently updated "Game Ready" drivers NVIDIA puts out for specific game launches. These drivers have been a huge part of NVIDIA's success in recent years, as the day-one experience for GeForce users has been better than it is on AMD in many instances. It is vital for drivers and performance to be optimal on the day of a game's release, as enthusiast gamers are the ones going through the preloading process and midnight release timings.

gfe-13.jpg

Future "Game Ready" drivers will no longer be made available through GeForce.com and instead will ONLY be delivered through GeForce Experience. You'll also be required to have a validated email address to get the downloads for beta drivers - though NVIDIA admitted to me that you would be able to opt out of the mailing list anytime after signing up.

NVIDIA told media that this method of driver release lays the groundwork for future plans, and that gamers would get early access to new features, chances to win free hardware and the ability to take part in the driver development process like never before. Honestly though, this is a way to get users to sign up for a marketing mailing list that serves some specific purpose going forward. Not all mailing lists are bad, obviously (have you signed up for the PC Perspective Live! Mailing List yet?!?), but there are bound to be some raised eyebrows over this.

gfe-4.jpg

NVIDIA says that more than 90% of its driver downloads today already come through GeForce Experience, so changes to the user experience should be minimal. We'll wait to see how the crowd reacts, but I imagine once we get past the initial shock of the changeover to this system, the rollouts will be fast, clean and simple. But dammit - we fear change.

Source: NVIDIA