NVIDIA Quietly Releases the GeForce GTX 965M Mobile GPU

Subject: Graphics Cards | January 7, 2015 - 10:51 AM |
Tagged: nvidia, notebook, mobile graphics, mobile gpu, GeForce 965M

With zero fanfare, NVIDIA has released a new mobile graphics chip today: the GeForce GTX 965M.


Based on the 28nm Maxwell GM204 core and positioned just below the existing GTX 970M, the new GTX 965M has 1024 CUDA cores (compared to the 970M's 1280) and a narrower 128-bit memory interface (vs. 192-bit on the 970M). Its base clock is slightly faster at 944 MHz, plus unspecified Boost headroom.

Compared with the flagship GTX 980M, which boasts 1536 CUDA cores and a 256-bit GDDR5 interface, the new GTX 965M will be a significantly lower performer, but NVIDIA is marketing it for 1080p mobile gaming. At a lower cost to OEMs, the 965M should help create some less expensive 1080p gaming notebooks as the new GPU is adopted.


The chip features proprietary NVIDIA Optimus and Battery Boost support, and is GameStream, ShadowPlay, and GameWorks ready.


Specs from NVIDIA:

  • CUDA Cores: 1024
  • Base Clock: 944 MHz + Boost
  • Memory Clock: 2500 MHz
  • Memory Type: GDDR5
  • Memory Interface Width: 128-bit
  • Memory Bandwidth: 80 GB/sec
  • DirectX API: 12
  • OpenGL: 4.4
  • OpenCL: 1.1
  • Display Resolution: Up to 3840x2160
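The quoted 80 GB/sec can be sanity-checked from the memory clock and bus width. A minimal sketch, assuming the listed 2500 MHz memory clock corresponds to an effective 5 Gbps per pin (double data rate), which is how these mobile spec sheets are typically quoted:

```python
# Sanity-check the quoted memory bandwidth from the spec list above.
# Assumption (not stated in the specs): the 2500 MHz memory clock
# doubles to an effective 5 Gbps per pin.

memory_clock_mhz = 2500                                # quoted memory clock
effective_gbps_per_pin = memory_clock_mhz * 2 / 1000   # 5.0 Gbps per pin
bus_width_bits = 128                                   # memory interface width

# bandwidth = per-pin rate (Gbps) * bus width (bits) / 8 bits-per-byte
bandwidth_gb_s = effective_gbps_per_pin * bus_width_bits / 8
print(bandwidth_gb_s)  # 80.0 GB/sec, matching NVIDIA's figure
```

The same arithmetic on the 970M's 192-bit bus explains its proportionally higher bandwidth.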

More information on this new mobile GPU can be found via the source link.

Source: NVIDIA



January 7, 2015 | 10:57 AM - Posted by nathanddrews

I just looked into it and it seems that none of NVIDIA's mobile chips support G-Sync. Is this a software issue or hardware? Any word from NVIDIA on that? Seems like the mobile environment would be an incredible space for G-Sync for both the generally lower performance (below 60fps) at native resolutions and also for power usage... assuming that refreshing the screen below 60Hz results in lower power consumption.

January 7, 2015 | 11:28 AM - Posted by Anonymous (not verified)

The G-Sync controller hardware is too big for mobile devices.

But the whole idea is possible with eDP 1.4 connectivity. VESA Adaptive-Sync will work with several new notebooks, and even with some notebooks from the past via a firmware update.

January 7, 2015 | 02:17 PM - Posted by Anonymous (not verified)

Where do you think Nvidia got the idea for G-Sync? From the eDP standards.

In eDP 1.2+, the GPU acts as the scaler and controls power to the display.

January 7, 2015 | 11:27 AM - Posted by fade2blac

Would G-Sync only make sense in a less compact laptop chassis? The extra physical space, interconnects, power draw, and perhaps thermal load are all barriers to including the G-Sync module itself. If there is not space behind the display panel, then maybe a manufacturer could do away with an optical drive and re-purpose the space it would otherwise have occupied?

DP Adaptive Sync would seem to be a much simpler and more pragmatic alternative, but then GPU choices are limited. nVidia has made no official commitment to DP Adaptive Sync, and it is likely not to be supported across nVidia parts. While nVidia is toeing the line on G-Sync or bust for now... maybe Adaptive Sync is possible with the hardware and they are just holding back on driver support until the market decides which will catch on? Hopefully, laptop vendors start including displays with scalers that provide DP Adaptive Sync support so all GPUs (even Intel) can have the option of variable refresh through open standards.

January 7, 2015 | 12:11 PM - Posted by Anonymous (not verified)

I thought that some form of frame rate synchronization was already built into laptop LCDs to save on power. I have read posts on other forums saying that this is true, for power-saving reasons, and that laptops have their own special circuitry and OEM-custom pathways to their internal LCDs. So maybe it's the DP and VGA standards at the time of the laptop's manufacture that determine what it can sync with on an external monitor, but internally, with the built-in LCD, there may already be some form of OEM-custom adaptive sync going on.

January 7, 2015 | 07:26 PM - Posted by Anonymous (not verified)

Imho, variable refresh rate displays have been out for some time, and vertical display synchronization depends on how fast the graphics processing unit can transfer data to the display hardware. If FreeSync cannot handle the high data throughput that G-Sync can, then the subject is moot and not relevant. As someone with a 120Hz non-G-Sync display, a laptop with a 120Hz non-G-Sync display, and a desktop with an ASUS Swift 1440p G-Sync monitor, I can only say NVIDIA is sucking us consumers dry. If I ever get around to getting another GTX 980 for my desktop and rebuilding it, then maybe with two GTX 980s I will see some G-Sync in action, but that's where it stands for now.

January 7, 2015 | 07:38 PM - Posted by Anonymous (not verified)

Intel invented variable refresh and NVIDIA milked it, but congratulations to NVIDIA for having the vision: www.phoronix.com/scan.php?page=news_item&px=MTU0Njg

January 7, 2015 | 10:03 PM - Posted by nathanddrews

Intel did not invent it, but they did innovate upon it. These technologies are all based on the pre-existing vertical blanking interval (VBLANK), which has been around for decades and has been utilized in several different ways.

Intel has been exploring VBLANK for power saving - Panel Self Refresh and Dynamic Refresh Rate Switching, for example. NVIDIA was smart enough to apply VBLANK methods to gaming and AMD followed suit.

It's a great technology that is finally getting its opportunity to shine.
