Report: NVIDIA Preparing GeForce GTX 980MX and 970MX Mobile GPUs

Subject: Graphics Cards | January 19, 2016 - 10:31 AM |
Tagged: rumor, report, nvidia, GTX 980MX, GTX 980M, GTX 970MX, GTX 970M, geforce

NVIDIA is reportedly preparing faster mobile GPUs based on Maxwell, with a GTX 980MX and 970MX on the way.

img-01.jpg

The new GTX 980MX would sit between the GTX 980M and the laptop version of the full GTX 980, with 1664 CUDA cores (compared to 1536 with the 980M), 104 Texture Units (up from the 980M's 96), a 1048 MHz core clock, and up to 8 GB of GDDR5. Memory speed and bandwidth will reportedly be identical to the GTX 980M at 5000 MHz and 160 GB/s respectively, with both GPUs using a 256-bit memory bus.

The GTX 970MX represents a similar upgrade over the existing GTX 970M, with the CUDA core count increased from 1280 to 1408, Texture Units up from 80 to 88, and 8 additional raster devices (56 vs. 48). Both the 970M and 970MX use 192-bit GDDR5 clocked at 5000 MHz and are available with the same 3 GB or 6 GB frame buffer options.

WCCFtech prepared a chart to demonstrate the differences between NVIDIA's mobile offerings:

| Model | GTX 980 (Laptop) | GTX 980MX | GTX 980M | GTX 970MX | GTX 970M | GTX 965M | GTX 960M |
|------------------|------------------|-----------|----------|-----------|----------|----------|----------|
| Architecture | Maxwell | Maxwell | Maxwell | Maxwell | Maxwell | Maxwell | Maxwell |
| GPU | GM204 | GM204 | GM204 | GM204 | GM204 | GM204 | GM107 |
| CUDA Cores | 2048 | 1664 | 1536 | 1408 | 1280 | 1024 | 640 |
| Texture Units | 128 | 104 | 96 | 88 | 80 | 64 | 40 |
| Raster Devices | 64 | 64 | 64 | 56 | 48 | 32 | 16 |
| Clock Speed | 1218 MHz | 1048 MHz | 1038 MHz | 941 MHz | 924 MHz | 950 MHz | 1097 MHz |
| Memory Bus | 256-bit | 256-bit | 256-bit | 192-bit | 192-bit | 128-bit | 128-bit |
| Frame Buffer | 8 GB GDDR5 | 8/4 GB GDDR5 | 8/4 GB GDDR5 | 6/3 GB GDDR5 | 6/3 GB GDDR5 | 4 GB GDDR5 | 4 GB GDDR5 |
| Memory Frequency | 7008 MHz | 5000 MHz | 5000 MHz | 5000 MHz | 5000 MHz | 5000 MHz | 5000 MHz |
| Memory Bandwidth | 224 GB/s | 160 GB/s | 160 GB/s | 120 GB/s | 120 GB/s | 80 GB/s | 80 GB/s |
| TDP | ~150W | 125W | 125W | 100W | 100W | 90W | 75W |

These new GPUs will reportedly be based on the same Maxwell GM204 core, and TDPs are apparently unchanged at 125W for the GTX 980MX, and 100W for the 970MX.
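
As a quick sanity check on the chart's bandwidth figures, peak GDDR5 bandwidth follows directly from the bus width and the effective data rate. Below is a minimal sketch in Python; the per-GPU numbers come from the chart above, while the formula itself is just the standard bus-width-times-data-rate calculation, not anything published by NVIDIA or WCCFtech.

```python
# Peak GDDR5 bandwidth = (bus width in bits / 8 bytes) * effective data rate.
# Bus widths and effective memory clocks are taken from the chart above.
def gddr5_bandwidth_gbs(bus_width_bits, effective_mhz):
    bytes_per_transfer = bus_width_bits / 8            # bits -> bytes
    return bytes_per_transfer * effective_mhz / 1000   # MB/s -> GB/s

gpus = {
    "GTX 980 (Laptop)": (256, 7008),
    "GTX 980MX / 980M": (256, 5000),
    "GTX 970MX / 970M": (192, 5000),
    "GTX 965M / 960M":  (128, 5000),
}

for name, (bus_bits, mem_mhz) in gpus.items():
    print(f"{name}: {gddr5_bandwidth_gbs(bus_bits, mem_mhz):.0f} GB/s")

# Prints 224, 160, 120, and 80 GB/s, matching the chart's bandwidth row.
```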

We will await any official announcement.

Source: WCCFtech

CES 2016: NVIDIA talks SHIELD Updates and VR-Ready Systems

Subject: Graphics Cards, Shows and Expos | January 5, 2016 - 09:39 PM |
Tagged: vr ready, VR, virtual reality, video, Oculus, nvidia, htc, geforce, CES 2016, CES

Other than the in-depth discussion of the Drive PX 2 and its push into autonomous driving, NVIDIA didn't have much news to report at CES. We stopped by the company's suite and got a few updates on SHIELD and the VR Ready program, which certifies systems that meet the minimum recommended specifications for a solid VR experience.

For the SHIELD, NVIDIA is bringing Android 6.0 Marshmallow to the device, with new features like shared storage and the ability to customize the home screen of the Android TV interface. Nothing earth-shattering; all of it is part of the standard 6.0 rollout.

The VR Ready program from NVIDIA will validate notebooks, systems, and graphics cards that have enough horsepower to meet the minimum performance level for a good VR experience. At this point, the specs essentially match what Oculus has put forth: a GTX 970 or better on the desktop, and a GTX 980 (the full desktop part, not the 980M) on mobile.

Other than that, Ken and I took in some of the more recent VR demos, including Epic's Bullet Train on the final Oculus Rift and Google's Tilt Brush on the latest iteration of the HTC Vive. Both were incredibly impressive, though the Everest demo that simulates a portion of the mountain climb was the one that really made me feel like I was somewhere else.

Check out the video above for more impressions!

Coverage of CES 2016 is brought to you by Logitech!

Follow all of our coverage of the show at http://pcper.com/ces!

Source: NVIDIA

NVIDIA Updates GeForce Experience with Beta Features

Subject: Graphics Cards | December 16, 2015 - 08:12 AM |
Tagged: nvidia, geforce experience, geforce

A new version of GeForce Experience was published yesterday. It is classified as a beta, which I'm guessing means NVIDIA wanted to release it before the holidays without being on the hook for fixing post-launch issues during the break: ship it as a beta, and users can simply roll back if something doesn't work. On the other hand, NVIDIA is suggesting that beta releases will be a recurring theme under its new "Early Access" program.

It has a few interesting features, though. First, it has a screenshot function that connects with Imgur. Steam's F12 function is pretty good for almost any title, but there are times when you don't want to register a game with Steam, so a second option is welcome. They also added an overlay for controlling your stream, rather than just an indicator icon.

nvidia-2015-geforce-experience-streamstuff.png

They added the ability to choose the Twitch ingest server, which is the server the broadcaster connects to in order to deliver the stream into Twitch's back-end. I haven't used ShadowPlay for a while, but previously you needed to use whichever server GeForce Experience chose; if it wasn't the best connection (say, across the continent), you basically had to deal with it. OBS and OBS Studio have had these features for a while, of course. To be clear: NVIDIA is playing catch-up to open-source software in this area.

The last feature to be mentioned might just be the most interesting, though. A while ago, we mentioned that NVIDIA wanted to enable online co-op through a GameStream-like service. That feature has now arrived as GameStream Co-op. The viewer can watch, take over your input, or register as a second gamepad. It requires a GeForce GTX 650 (or 660M) or higher.

GeForce Experience Beta is available now.

Source: NVIDIA

Report: NVIDIA To Cut GeForce GTX 900 Series GPU Prices

Subject: Graphics Cards | November 30, 2015 - 03:48 PM |
Tagged: rumor, report, price cut, nvidia, GTX 980, GTX 970, gtx 960, geforce

A report published by TechPowerUp suggests NVIDIA will soon be cutting prices across their existing GeForce lineup, with potential price changes reaching consumers in time for the holiday shopping season.

GeForce-GTX-690-image03.jpg

So what does this report suggest? The GTX 980 drops to $449, the GTX 970 goes to $299, and the GTX 960 goes to $179. These are pretty consistent with some of the sale or post-rebate prices we’ve seen of late, and such a move would certainly even things up somewhat between AMD and NVIDIA with regard to cost. Of course, we could see an answer from AMD in the form of a price reduction from their R9 300-series or Fury/Nano. We can only hope!

We’ve already seen prices come down on several GPUs during various Black Friday sales, but the potential for a permanent price cut makes for interesting speculation, if nothing else. Not to disparage the source, but there is no substantive evidence directly pointing to a plan by NVIDIA to lower prices on these 900-series cards; still, it would make sense given the competition from AMD at various price points.

Here’s to lower prices going forward.

Source: TechPowerUp

NVIDIA Releases Driver 358.91 for Fallout 4, Star Wars Battlefront, Legacy of the Void

Subject: Graphics Cards | November 9, 2015 - 01:44 PM |
Tagged: nvidia, geforce, 358.91, fallout 4, Star Wars, battlefront, starcraft, legacy of the void

It's a huge month for PC gaming with the release of Bethesda's Fallout 4 and EA's Star Wars Battlefront likely to take up hours and hours of your (and my) time in the lead up to the holiday season. NVIDIA just passed over links to its latest "Game Ready" driver, version 358.91.

bethesda-2015-fallout4-official-2.jpg

Fallout 4 is going to be impressive graphically

Here's the blurb from NVIDIA directly:

Continuing to fulfill our commitment to GeForce gamers to have them Game Ready for the top Holiday titles, today we released a new Game Ready driver.  This Game Ready driver will get GeForce Gamers set-up for tomorrow’s release of Fallout 4, as well as Star Wars Battlefront, StarCraft II: Legacy of the Void. WHQLed and ready for the Fallout wasteland, driver version 358.91 will deliver the best experience for GeForce gamers in some of the holiday’s hottest titles.

Other than learning that NVIDIA considers "WHQLed" to be a verb now, this is good news for PC gamers looking to dive into the world of Fallout or take up arms against the Empire on the day of release. I honestly believe that these kinds of software updates and frequent driver improvements timed to major game releases are one of the biggest advantages GeForce gamers have over Radeon users, though I hold out hope that the red team will get on the same cadence with one Raja Koduri in charge.

You can also find more information from NVIDIA about configuration with its own GPUs for Fallout 4 and for Star Wars Battlefront on GeForce.com.

Source: NVIDIA

NVIDIA Confirms Clock Speed, Power Increases at High Refresh Rates, Promises Fix

Subject: Graphics Cards | November 6, 2015 - 04:05 PM |
Tagged: ROG Swift, refresh rate, pg279q, nvidia, GTX 980 Ti, geforce, asus, 165hz, 144hz

Last month I wrote a story that detailed some odd behavior with NVIDIA's GeForce GTX graphics cards and high refresh rate monitors, in particular with the new ASUS ROG Swift PG279Q that has a rated 165Hz refresh rate. We found that when running this monitor at 144Hz or higher refresh rate, idle clock speeds and power consumption of the graphics card increased dramatically.

To recap those results: at a 60Hz refresh rate, the monitor was drawing just 22.1 watts while the entire testing system idled at 73.7 watts. (Note: the display was set to its post-calibration brightness of just 31.) Moving up to 100Hz and 120Hz saw very minor increases in power consumption from both the system and monitor.

powerdraw.png

But the jump to 144Hz is much more dramatic – idle system power jumps from 76 watts to almost 134 watts – an increase of 57 watts! Monitor power only increased by 1 watt at that transition though. At 165Hz we see another small increase, bringing the system power up to 137.8 watts.

When running the monitor at 60Hz, 100Hz and even 120Hz, the GPU clock speed sits comfortably at 135MHz. When we increase from 120Hz to 144Hz though, the GPU clock spikes to 885MHz and stays there, even at the Windows desktop. According to GPU-Z the GPU is running at approximately 30% of the maximum TDP.

We put NVIDIA on notice with the story and followed up with emails that included more information from other users, as well as additional testing completed after the story was posted. The result: NVIDIA has confirmed the issue exists and has a fix incoming!

In an email we got from NVIDIA PR last night: 

We checked into the observation you highlighted with the newest 165Hz G-SYNC monitors.
 
Guess what? You were right! That new monitor (or you) exposed a bug in the way our GPU was managing clocks for GSYNC and very high refresh rates.
 
As a result of your findings, we are fixing the bug which will lower the operating point of our GPUs back to the same power level for other displays.
 
We’ll have this fixed in an upcoming driver.

This actually supports an oddity we found before: we noticed that the PG279Q at 144Hz refresh was pushing GPU clocks up pretty high while a monitor without G-Sync support at 144Hz did not. We'll see if this addresses the entire gamut of experiences that users have had (and have emailed me about) with high refresh rate displays and power consumption, but at the very least NVIDIA is aware of the problems and working to fix them.

I don't have confirmation of WHEN I'll be able to test out that updated driver, but hopefully it will be soon, so we can confirm the fix works with the displays we have in-house. NVIDIA also hasn't confirmed what the root cause of the problem is - was it related to the clock domains as we had theorized? Maybe not, since this appears to be a G-Sync-specific display issue (based on the quote above). I'll try to weasel out the technical reasoning for the bug if I can and update the story later!

NVIDIA Promoting Game Ready Drivers with Giveaway

Subject: Graphics Cards | November 4, 2015 - 09:01 AM |
Tagged: nvidia, graphics drivers, geforce, game ready

In mid-October, NVIDIA announced that Game Ready drivers would only be available through GeForce Experience with a registered email address, a change we covered at the time. Users are able to opt out of NVIDIA's mailing list, though. The company said the program would provide early access to new features, chances to win free hardware, and the ability to participate in the driver development process.

nvidia-geforce.png

Today's announcement follows up on the “win free hardware” part. The company will be giving away $100,000 worth of prizes, including graphics cards up to the GeForce GTX 980 Ti, game keys, and SHIELD Android TV boxes. To be eligible, users need to register with GeForce Experience and use it to download the latest Game Ready driver.

Speaking of Game Ready drivers, the main purpose of this blog post is to share the list of November/December games that are in this program. NVIDIA pledges to have optimized drivers for these titles on or before their release date:

  • Assassin's Creed: Syndicate
  • Call of Duty Black Ops III
  • Civilization Online
  • Fallout 4
  • Just Cause 3
  • Monster Hunter Online
  • Overwatch
  • RollerCoaster Tycoon World
  • StarCraft II: Legacy of the Void
  • Star Wars Battlefront
  • Tom Clancy's Rainbow Six Siege
  • War Thunder

As has been the case recently, NVIDIA also plans to get every Game Ready driver certified through Microsoft's WHQL driver certification program.

Source: NVIDIA

Testing GPU Power Draw at Increased Refresh Rates using the ASUS PG279Q

Subject: Graphics Cards, Displays | October 24, 2015 - 04:16 PM |
Tagged: ROG Swift, refresh rate, pg279q, nvidia, GTX 980 Ti, geforce, asus, 165hz, 144hz

In the comments to our recent review of the ASUS ROG Swift PG279Q G-Sync monitor, a commenter by the name of Cyclops pointed me in the direction of an interesting quirk that I hadn’t considered before. According to reports, the higher refresh rates of some panels, including the 165Hz option available on this new monitor, can cause power draw to increase by as much as 100 watts on the system itself. While I did say in the review that the larger power brick ASUS provided with it (compared to last year’s PG278Q model) pointed toward higher power requirements for the display itself, I never thought to measure the system.

To set up a quick test, I brought the ASUS ROG Swift PG279Q back to its rightful home in front of our graphics test bed, connected an EVGA GeForce GTX 980 Ti (with GPU driver 358.50), and chained both the PC and the monitor up to separate power monitoring devices. While sitting at the Windows 8.1 desktop, I cycled the monitor through the different refresh rate options and recorded the power draw from both meters after giving the system 60-90 seconds to idle out.

powerdraw.png

The results are much more interesting than I expected! At 60Hz refresh rate, the monitor was drawing just 22.1 watts while the entire testing system was idling at 73.7 watts. (Note: the display was set to its post-calibration brightness of just 31.) Moving up to 100Hz and 120Hz saw very minor increases in power consumption from both the system and monitor.

But the jump to 144Hz is much more dramatic – idle system power jumps from 76 watts to almost 134 watts – an increase of 57 watts! Monitor power only increased by 1 watt at that transition though. At 165Hz we see another small increase, bringing the system power up to 137.8 watts.

Interestingly, we did find that the system would repeatedly jump to as much as 200+ watts of idle power draw for 30 seconds at a time and then drop back down to the 135-140 watt range for a few minutes. It was repeatable and very measurable.

So, what the hell is going on? A look at GPU-Z clock speeds reveals the source of the power consumption increase.

powerdraw2.png

When running the monitor at 60Hz, 100Hz and even 120Hz, the GPU clock speed sits comfortably at 135MHz. When we increase from 120Hz to 144Hz though, the GPU clock spikes to 885MHz and stays there, even at the Windows desktop. According to GPU-Z the GPU is running at approximately 30% of the maximum TDP.

Though details are sparse, it seems pretty obvious what is going on here. The pixel clock and the GPU clock are connected through the same domain and are not asynchronous. The GPU needs to maintain a certain pixel clock in order to support the required bandwidth of a particular refresh rate, and based on our testing, the idle clock speed of 135MHz doesn’t give the pixel clock enough throughput to power anything more than a 120Hz refresh rate.
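
For a rough sense of the numbers, the required pixel clock is simply the total pixels per frame (active resolution plus blanking) multiplied by the refresh rate. Here is a back-of-the-envelope sketch in Python; the blanking figures are an assumption based on typical reduced-blanking timings, not the PG279Q's actual measured timings.

```python
# Approximate pixel clock required for 2560x1440 at various refresh rates.
# Assumes roughly 2720 x 1480 total pixels per frame (reduced-blanking-style
# timings); the panel's real blanking intervals may differ slightly.
H_TOTAL, V_TOTAL = 2720, 1480   # 2560x1440 active area plus assumed blanking

for refresh_hz in (60, 100, 120, 144, 165):
    pixel_clock_mhz = H_TOTAL * V_TOTAL * refresh_hz / 1_000_000
    print(f"{refresh_hz:3d} Hz -> ~{pixel_clock_mhz:.0f} MHz pixel clock")

# Roughly 242 MHz at 60 Hz, 580 MHz at 144 Hz, and 664 MHz at 165 Hz.
# If the pixel clock really is tied to the GPU clock domain as theorized
# above, the 135 MHz idle state clearly can't feed a 144 Hz panel.
```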

refreshsetup.jpg

Pushing refresh rates of 144Hz and higher causes a surprising increase in power draw

The obvious question here, though, is why NVIDIA would need to go all the way up to 885MHz to support the jump from 120Hz to 144Hz refresh rates. It seems quite extreme, and the increased power draw is significant, causing the fans on the EVGA GTX 980 Ti to spin up even while sitting idle at the Windows desktop. NVIDIA is aware of the complication, though it appears that a real fix won't arrive until an architectural shift is made down the road. With the ability to redesign the clock domains, NVIDIA could make the pixel and GPU clocks completely asynchronous, increasing one without affecting the other. It's not a simple process though, especially in a processor this complex. We have seen Intel and AMD correctly and effectively separate clocks on newer CPU designs in recent years.

What happens to a modern AMD GPU like the R9 Fury with a similar test? To find out we connected our same GPU test bed to the ASUS MG279Q, a FreeSync enabled monitor capable of 144 Hz refresh rates, and swapped the GTX 980 Ti for an ASUS R9 Fury STRIX.

powerdrawamd1.png

powerdrawamd2.png

The AMD Fury does not demonstrate the same phenomenon the GTX 980 Ti does when running at high refresh rates. The Fiji GPU runs at the same static 300MHz clock rate at 60Hz, 120Hz, and 144Hz, and system power draw only inches up by 2 watts or so. I wasn't able to test a 165Hz refresh rate on the AMD setup, so it is possible the AMD graphics card would behave differently at that threshold. It's also true that the NVIDIA Maxwell GPU is running at less than half the clock rate of AMD's Fiji in this idle state, and that may account for the difference in pixel clock behavior we are seeing. Still, the NVIDIA platform draws slightly more power at idle than the AMD platform, so advantage AMD here.

For today, know that if you choose to use a 144Hz or even a 165Hz refresh rate on your NVIDIA GeForce GPU you are going to be drawing a bit more power and will be less efficient than expected even just sitting in Windows. I would bet that most gamers willing to buy high end display hardware capable of those speeds won’t be overly concerned with 50-60 watts of additional power draw, but it’s an interesting data point for us to track going forward and to compare AMD and NVIDIA hardware in the future.

NVIDIA Releases 358.50 WHQL Game Ready Drivers

Subject: Graphics Cards | October 7, 2015 - 01:45 PM |
Tagged: opengl es 3.2, nvidia, graphics drivers, geforce

The GeForce Game Ready 358.50 WHQL driver has been released so users can perform their updates before the Star Wars Battlefront beta goes live tomorrow (unless you already received a key). As with every “Game Ready” driver, NVIDIA ensures that the essential performance and stability tweaks are rolled in to this version, and tests it against the title. It is WHQL certified too, which is a recent priority for NVIDIA. Years ago, “Game Ready” drivers were often classified as Beta, but the company now intends to pass their work through Microsoft for a final sniff test.

ea-2015-battlefront.jpg

Another interesting addition to this driver is the inclusion of the OpenGL 2015 ARB extensions and OpenGL ES 3.2. Previously, to use OpenGL ES 3.2 on the PC, if you wanted to develop software against it for instance, you needed a separate driver release, which NVIDIA has offered since the spec debuted at SIGGRAPH. It has now been rolled into the main, public driver. The mobile devs who use their production machines to play Battlefront rejoice, I guess. It might also be useful for developers, at Mozilla or Google for instance, who want to create pre-release implementations of future WebGL specs.
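
For what that means in practice: with ES support rolled into the public driver, a developer can request an OpenGL ES 3.2 context on a desktop GPU directly, rather than installing a separate developer driver. Here is a minimal sketch using the pyGLFW bindings; the package choice is ours, and whether a given driver and GPU actually hand back an ES 3.2 context is an assumption you would need to verify.

```python
# Request an OpenGL ES 3.2 context on a desktop GPU via GLFW.
# Requires the pyGLFW bindings (pip install glfw) and a driver that
# exposes OpenGL ES 3.2 on the desktop.
import glfw

if not glfw.init():
    raise RuntimeError("GLFW failed to initialize")

glfw.window_hint(glfw.CLIENT_API, glfw.OPENGL_ES_API)
glfw.window_hint(glfw.CONTEXT_VERSION_MAJOR, 3)
glfw.window_hint(glfw.CONTEXT_VERSION_MINOR, 2)

window = glfw.create_window(640, 480, "OpenGL ES 3.2 smoke test", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("Driver did not provide an OpenGL ES 3.2 context")

glfw.make_context_current(window)
major = glfw.get_window_attrib(window, glfw.CONTEXT_VERSION_MAJOR)
minor = glfw.get_window_attrib(window, glfw.CONTEXT_VERSION_MINOR)
print(f"Got OpenGL ES {major}.{minor} context")

glfw.terminate()
```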

Source: NVIDIA

Microsoft Surface Book 2-in-1 with Skylake with NVIDIA Discrete GPU Announced

Subject: Mobile | October 6, 2015 - 02:38 PM |
Tagged: video, surface book, surface, Skylake, nvidia, microsoft, Intel, geforce

Along with the announcement of the new Surface Pro 4, Microsoft surprised many with the release of the new Surface Book 2-in-1 convertible laptop. Sharing much of the same DNA as the Surface tablet line, the Surface Book adopts a more traditional notebook design while still adding enough to the formula to produce a unique product.

book-3.jpg

The pivotal part of the design (no pun intended) is the new hinge, a "dynamic fulcrum" design that looks great and also (supposedly) will be incredibly strong. The screen / tablet attachment mechanism is called Muscle Wire and promises secure attachment as well as ease of release with a single button.

An interesting aspect of the fulcrum design is that, when closed, the Surface Book screen and keyboard do not actually touch near the hinge. Instead there is a small gap in this area. I'm curious how this will play out in real-world usage - it creates a natural angle for using the screen in its tablet form, but it may also end up "catching" coins, pens, and other things between the two sections.

book-5.jpg

The 13.5-in screen has a 3000 x 2000 resolution (3:2 aspect ratio obviously) with a 267 PPI pixel density. Just like the Surface Pro 4, it has a 10-point touch capability and uses the exclusive PixelSense display technology for improved image quality.
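
That 267 PPI figure checks out against the panel size and resolution: pixel density is just the diagonal pixel count divided by the diagonal in inches. A quick verification in Python (the arithmetic is ours, not Microsoft's):

```python
import math

# Surface Book display: 3000 x 2000 pixels on a 13.5-inch diagonal.
width_px, height_px, diagonal_in = 3000, 2000, 13.5

diagonal_px = math.hypot(width_px, height_px)   # ~3606 pixels corner to corner
ppi = diagonal_px / diagonal_in

print(f"Aspect ratio: {width_px / height_px:.2f}:1")   # 1.50:1, i.e. 3:2
print(f"Pixel density: {ppi:.0f} PPI")                 # ~267 PPI
```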

While most of the hardware is included in the tablet portion of the device, the keyboard dock includes some surprises of its own. You get a pair of USB 3.0 ports, a full-size SD card slot, and a proprietary SurfaceConnect port for an add-on dock. Most interestingly, you'll find an optional discrete GPU from NVIDIA, an as-yet-unidentified GeForce GPU with 1GB (??) of memory. I have sent inquiries to Microsoft and NVIDIA for details on the GPU, but haven't heard back yet. We think it is a 30 watt GeForce GPU of some kind (judging by the power adapter differences), but I'm more interested in how the GPU changes both battery life and performance.

UPDATE: Just got official word from NVIDIA on the GPU, but unfortunately it doesn't tell us much.

The new GPU is a Maxwell based GPU with GDDR5 memory. It was designed to deliver the best performance in ultra-thin form factors such as the Surface Book keyboard dock. Given its unique implementation and design in the keyboard module, it cannot be compared to a traditional 900M series GPU. Contact Microsoft for performance information.

book02.jpg

Keyboard and touchpad performance looks to be impressive as well, with a full glass trackpad integration, backlit keyboard design and "class leading" key switch throw distance.

The Surface Book is powered by Intel Skylake processors, available in both Core i5 and Core i7 options, but does not offer Core m-based parts or Iris graphics. The only integrated GPU on offer is the Intel HD 520.

book-4.jpg

Microsoft promises "up to" 12 hours of battery life on the Surface Book, though that claim was made with the Core i5 / 256GB / 8GB configuration option; no discrete GPU included. 

book-1.jpg

Pricing on the Surface Book starts at $1499 but can reach as high as $2699 with the maximum performance and storage capacity options. 

Source: Microsoft