Mobile G-Sync Confirmed and Tested with Leaked Alpha Driver

Manufacturer: NVIDIA

Introduction

It has been an abnormal week for us here at PC Perspective. Our typical review schedule has pretty much flown out the window, and the past seven days have been filled with learning, researching, retesting, and publishing. That might sound like the norm, but in these cases the process was initiated by tips from our readers. Last Saturday (24 Jan), a few things were brewing:

  • The GTX 970 memory story was the hottest item on our plate.
  • Reports surfaced of the 840 EVO slow down issue reappearing on some drives.
  • A far-fetched tip claimed G-Sync was running on a laptop with no G-Sync module in sight.

We had to do a bit of triage here of course, as we can only research and write so quickly. Ryan worked the GTX 970 piece as it was the hottest item. I began a few days of research and testing on the 840 EVO slow down issue reappearing on some drives, and we kept tabs on that third thing, which at the time seemed really farfetched. With the first two items taken care of, Ryan shifted his efforts to GTX 970 SLI testing while I shifted my focus to finding out if there was any credence to this G-Sync laptop thing.

A few weeks ago, an ASUS Nordic Support rep inadvertently leaked an interim build of the NVIDIA driver. This was a mobile driver build (version 346.87) focused on their G751 line of laptops. One recipient of this driver link posted it to the ROG forum back on the 20th. A fellow by the name of Gamenab, who owned the same laptop cited in that thread, presumably stumbled across this driver, tried it out, and was more than likely greeted by this popup after the installation completed:

Now I know what you’re thinking, and it’s probably the same thing anyone would think: how on earth is this possible? To cut a long story short, while the link to the 346.87 driver was removed shortly after being posted to that forum, we managed to get our hands on a copy of it, installed it on the ASUS G751 we had in for review, and wouldn’t you know it, we were greeted by the same popup!

OK, so it’s a popup, but could it be a bug? We checked NVIDIA Control Panel and the options were consistent with those of a G-Sync connected system. We fired up the pendulum demo and watched the screen carefully, passing the machine around the office to be inspected by all. We then fired up some graphics benchmarks that were well suited to show off the technology (Unigine Heaven, Metro: Last Light, etc.), and everything looked great – smooth, steady pans with no juddering or tearing to be seen. Ken Addison, our Video Editor and jack of all trades, researched the panel type and found that it was likely capable of a 100 Hz refresh. We quickly dug in, created a custom profile, hit apply, and our 75 Hz G-Sync laptop was instantly transformed into a 100 Hz G-Sync laptop!

Ryan's Note: I think it is important to point out here that we didn't just look at demos and benchmarks for this evaluation but actually looked at real-world gameplay situations. Playing through Metro: Last Light showed very smooth pans and rotation, Assassin's Creed played smoothly as well, and flying through Unigine Heaven manually was a great experience, as were Crysis 3, Battlefield 4, etc. This was NOT just a couple of demos that we ran through - the variable refresh portion of this mobile G-Sync enabled panel was working, and working very well.

At this point in our tinkering, we had no idea how or why this was working, but there was no doubt that we were getting a similar experience as we have seen with G-Sync panels. As I digested what was going on, I thought surely this can’t be as good as it seems to be… Let’s find out, shall we?

Continue reading our story on Mobile G-Sync and impressions of our early testing!!


January 30, 2015 | 08:13 PM - Posted by Cyclops

I was a skeptic but I guess NVidia let the cat out of the bag "accidentally". Great investigative work as always guys.

January 30, 2015 | 08:42 PM - Posted by J D (not verified)

All the real investigation was done by people who didn't call Gamenab a troll, while PC Perspective wrote him off as a liar.

A couple of people with the ASUS G751 and MSI GT72 (LG IPS panels) got G-SYNC working while others called them liars.

Some of us had to post proof on YouTube multiple times.

January 30, 2015 | 08:56 PM - Posted by Allyn Malventano

We had no issue with Gamenab, and even named him as the first to discover this. What I *do* take issue with, however, is that he tried to pass it off as his invention, making all sorts of claims that the module uses crypto, etc. It was too easy to see straight through his act.

  • His first video attempt showed a game running *windowed*. Anyone with a shred of adaptive sync experience knows that it can only properly work full screen.
  • Simple analysis of his 'modified' driver revealed a .sys file lifted from another driver build.
  • When people started calling him on his silliness (myself included), he started taking offense and only dug himself deeper with conspiracy theories about how *his* algorithm magically got into an Nvidia driver.
    • ...and that specific driver just happened to be leaked by a tech support rep
    • ...posted about on a forum thread
    • ...relating to the laptop that he just happened to already own.

Seriously, that guy needs to buy a lottery ticket if he was that lucky.

January 30, 2015 | 09:14 PM - Posted by J D (not verified)

Yeah but:
a) those arguments were given in a way that anyone who read them would think there is not a bit of truth in what he said about the possibility of G-SYNC without the module;
b) since when can we trust Nvidia's unofficial information? Especially after the GTX 970's "official" information? Why not leave open the chance, AGAIN, that everything here goes a bit deeper?
I don't make radical judgments unless there is hard evidence. Sometimes it's better to silently analyze things to the ground.

My personal opinion: if not for Gamenab, we would never have gotten a promise from NV about module-less G-SYNC notebooks.

January 30, 2015 | 11:39 PM - Posted by mikelbivins

Agreed Allyn!

June 8, 2015 | 02:43 PM - Posted by J D (not verified)

"His first video attempt showed a game running *windowed. Anyone with a shred of adaptive sync experience knows that it can only properly work full screen.
*Seriously, that guy needs to buy a lottery ticket if he was that lucky."

So... after newest nVIDIA driver... Looks like Gamenab either was correct or he can buy two tickets in a row and still win 2 jackpots.

January 30, 2015 | 09:06 PM - Posted by Cyclops

So taking apart a laptop to its bare components and putting it back together, contacting NVidia, and testing for hours on end to figure out what is actually happening is not investigative work?

The Gamenab dude could have come out and said it was just this driver package instead of trying to shove it in our faces as if it were his own creation.

January 30, 2015 | 09:18 PM - Posted by J D (not verified)

You don't know the deep story and TIMING, do you?

January 30, 2015 | 09:47 PM - Posted by Cyclops

You're right, it's another conspiracy.

January 31, 2015 | 06:10 AM - Posted by loccothan (not verified)

Look at this funny stuff :D

http://youtu.be/spZJrsssPA0

January 30, 2015 | 08:19 PM - Posted by mLocke

based PC Perspective

January 30, 2015 | 08:32 PM - Posted by Fifthdread

This could have some really interesting implications for G-Sync. The fact that G-Sync is largely working without the module is curious indeed. Am I paying a $100+ premium for a module that really just guarantees that the monitor works for G-Sync, and makes it proprietary to Nvidia so it's not FreeSync compatible?

January 30, 2015 | 08:43 PM - Posted by nathanddrews

Sounds like the $100 premium is to guarantee that the display doesn't shut off/fade to black when the frame rate gets too low or is interrupted. Sounds worth it to me.

Such is the penalty of early adoption... ultimately not that bad.

February 3, 2015 | 10:33 PM - Posted by fade2blac

If a frame rate stall causes the screen to display a black/empty frame, then my first question is why can't the GPU driver be tweaked to simply repeat the last image from the frame buffer?

Supposedly, the display and system perform a sort of handshake to exchange info on the minimum/maximum supported frame rates during initialization. The GPU therefore knows the rate at which it must spit out frames, and it can either repeat the last frame (which would cause stutter/judder) or try to do some form of predictive/interpolated guesswork to inject an approximated frame (artificial motion enhancement like 120 Hz TVs do, which is not perfect either, a.k.a. the "soap opera" effect).

There is still some potential for variable latency and race conditions if the GPU is actively updating a frame buffer at the same time a frame is needed. Of course, double and triple buffering are well known concepts and should still be just as useful in this scenario.
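
For anyone curious how that frame-repeat idea would behave, here is a rough sketch in Python. It is not NVIDIA's implementation, just an illustration of the logic being described above; the 30-100 Hz panel limits are assumed values standing in for whatever the display reports during initialization.

```python
# Rough sketch of the frame-repeat ("keep the panel alive") idea described
# above. NOT NVIDIA's implementation; the panel limits are assumed values
# standing in for whatever the display reports during the handshake.

PANEL_MIN_HZ = 30.0
PANEL_MAX_HZ = 100.0
MAX_INTERVAL = 1.0 / PANEL_MIN_HZ   # longest the panel may go without a refresh
MIN_INTERVAL = 1.0 / PANEL_MAX_HZ   # fastest the panel can be refreshed

def refresh_schedule(frame_times):
    """frame_times: seconds the GPU spent rendering each frame.
    Returns a list of (source_frame, interval) refreshes sent to the panel."""
    refreshes = []
    for i, t in enumerate(frame_times):
        remaining = t
        # If the new frame is late, re-scan the previous frame from the
        # frame buffer instead of letting the panel blank out.
        while remaining > MAX_INTERVAL and i > 0:
            refreshes.append((i - 1, MAX_INTERVAL))
            remaining -= MAX_INTERVAL
        refreshes.append((i, max(remaining, MIN_INTERVAL)))
    return refreshes

# A 20 fps stall (50 ms frame) on a 30-100 Hz panel gets one repeat of the
# previous frame inserted, so the panel never waits longer than ~33 ms.
print(refresh_schedule([0.010, 0.050, 0.012]))
```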

February 6, 2015 | 08:56 AM - Posted by Anonymous (not verified)

I believe the SOLUTION to this is part of the reason for the G-Sync module in desktop monitors, which includes fast memory. This memory holds the last frame and can REPEAT it if needed.

Without the G-Sync module we're talking about a larger DELAY, which of course brings its own issues, especially at lower frame rates.

January 30, 2015 | 08:33 PM - Posted by Runamok81 (not verified)

Wow! Interesting indeed! Me thinks the dedicated G-Sync module is not long for this world.

January 30, 2015 | 08:34 PM - Posted by Cyclops

Would a G-Sync module consume that much more power than a typical scaler?

January 30, 2015 | 08:37 PM - Posted by ChangWang

Can't say that I'm surprised.

meanwhile, I'll leave you with this gem I found on techreport
https://www.youtube.com/watch?v=spZJrsssPA0

January 30, 2015 | 08:43 PM - Posted by Allyn Malventano

Oh that is absolutely a gem of a video.

January 30, 2015 | 10:18 PM - Posted by mAxius

i will just leave this here

https://www.youtube.com/watch?v=tNGi06cq_pQ

January 30, 2015 | 08:59 PM - Posted by nathanddrews

OMG that's incredible, I love it.

I still like my 970, but I might try to return it if the 380X comes out soon...

January 30, 2015 | 08:42 PM - Posted by Spacebob (not verified)

Tough week for Nvidia. I would love to listen in on these phone calls.

I'm curious what kind of protocol they are using to talk to the monitor. Perhaps they discovered it the same way AMD did, many notebooks already have VRR monitors to save energy. Looks like Nvidia has found a way to leverage that.

January 30, 2015 | 08:49 PM - Posted by Anonymous (not verified)

I never thought for a moment that Nvidia wasn't going to come up with their own way of doing VRR without a proprietary module, it was just a matter of when.

January 30, 2015 | 08:53 PM - Posted by H1tman_Actua1

Hell yes! now VR just needs to implement this into VR panels and we'll be ready to rock!

January 30, 2015 | 09:05 PM - Posted by Anonymous (not verified)

Nothing new

Didn't AMD demo the first FreeSync on a Toshiba laptop over a year ago?

AMD Demonstrates "FreeSync", Free G-Sync Alternative, at CES 2014
http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-...

January 31, 2015 | 02:00 PM - Posted by Anonymous (not verified)

this might be a clue

"Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA's embedded DisplayPort (eDP) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync."

Adaptive-Sync has been a part of embedded DisplayPort for a while now, so it's no wonder.

So it's safe to bet that Nvidia has done some driver mojo to enable *FreeSync*-like things in their graphics cards.

I wonder how old the GPU in this laptop is? If it's a new core then it's probably been updated to support it.

January 30, 2015 | 09:07 PM - Posted by Anonymous (not verified)

How much was Nvidia expecting to charge for this (free of hardware costs for Nvidia) laptop G-Sync? Maybe for Nvidia it's just the cost of printing the sticker that says G-Sync enabled, passed on at a 9999.9999% markup to the laptops' buyers. First it's memory arithmetic errors (3.5 = 4), and now FreeSync becomes G-free-sync (free as in free money for Nvidia): add an extra $150 to the BOM and let the free cash flow. The green team is more about the green these days than ever before.

January 30, 2015 | 09:08 PM - Posted by Anonymous (not verified)

NGREEDIA LIES AGAIN TO THEIR CUSTOMERS.

Thanks to Freesync and the price difference that will make Gsync monitors NOT competitive with the Freesync ones, Nvidia is getting ready to promote Adaptive Sync under the G-Sync brand.

If you bought a 970 with the wrong specs, unfortunately for you the spec that says DP 1.2 and not DP 1.2a is correct. If you feel something penetrating you AGAIN from behind, don't worry, it's only NGREEDIA.

January 30, 2015 | 09:34 PM - Posted by Anonymous (not verified)

Are you guys talking about this

Truth about the G-sync Marketing Module (NVIDIA using VESA Adaptive Sync Technology – Freesync)
http://gamenab.net/2015/01/26/truth-about-the-g-sync-marketing-module-nv...

This guy got G-Sync to work on selected monitors that don't have a G-Sync Module with a modded driver.

January 31, 2015 | 04:51 PM - Posted by Allyn Malventano

Yes, that is what we are talking about, but that guy just installed a leaked driver on his laptop. That's all he needed to do to 'get it working'. 

January 30, 2015 | 09:49 PM - Posted by db87

G-Sync module out the window! (read: this is SPARTA!!!!)

Finally Nvidia found the light and decided to use FreeSync.

But it's a shame for Nvidia fanboys that like to be robbed by Nvidia :P

January 30, 2015 | 10:10 PM - Posted by ScepticMatt (not verified)

After reading the article:
"mobile G-Sync" = VESA Adaptive Sync

"Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP™) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.
"

Source: VESA

January 31, 2015 | 03:09 AM - Posted by willmore (not verified)

Yeah, that was my takeaway as well.

January 30, 2015 | 10:30 PM - Posted by ZoA (not verified)

TL;DR: G-Sync is just old eDP with some fancy drivers, for which Nvidia charges a $100 to $200 premium.

January 30, 2015 | 10:36 PM - Posted by Anonymous (not verified)

LOL!!!

This is coming full circle.


[PCPerspective] NVIDIA's take on AMD's under documented free sync

http://www.pcper.com/news/General-Tech/NVIDIAs-take-AMDs-under-documente...

"AMD demoed its "free sync" alternative to G-Sync on laptops. Desktop displays are different, Nvidia says, and they may not support the variable refresh rate tech behind AMD's solution—at least not yet."

Which sources this:


[TheTechReport] Nvidia responds to AMD's ''free sync'' demo

http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo

Which quotes Tom Petersen

However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

Which again goes back to what G-Sync is doing.

[AnandTech] NVIDIA G-Sync Review
http://www.anandtech.com/show/7582/nvidia-gsync-review

The first G-Sync module only supports output over DisplayPort 1.2, though there is nothing technically stopping NVIDIA from adding support for HDMI/DVI in future versions. Similarly, the current G-Sync board doesn’t support audio but NVIDIA claims it could be added in future versions (NVIDIA’s thinking here is that most gamers will want something other than speakers integrated into their displays). The final limitation of the first G-Sync implementation is that it can only connect to displays over LVDS. NVIDIA plans on enabling V-by-One support in the next version of the G-Sync module, although there’s nothing stopping it from enabling eDP support as well.

Probably another miscommunication between engineering and marketing for sure.

January 30, 2015 | 10:56 PM - Posted by Anon (not verified)

I was about to dig up that TechReport article myself.

If they have implemented this on laptops in alpha drivers that means VESA Adaptive-Sync support is in development. They probably won't enable it on their GPUs until they sell all of their G-Sync monitor stock but NVidia is clearly developing support for Adaptive-Sync right now.

Tom Petersen suggested otherwise a week ago on the 960 stream.

https://www.youtube.com/watch?v=HHrSe0DtXHk#t=3247

But he was more open to it than NVidia were back in September when they said they had "no plans" to support Adaptive-Sync.

http://www.pcper.com/news/Graphics-Cards/NVIDIA-Confirms-It-Has-No-Plans...

January 30, 2015 | 10:50 PM - Posted by Anonymous (not verified)

I don't get all the hate for this company all of a sudden. Yeah, they screwed up with the 970; yeah, they are profit oriented. So are all companies. I think it's proven that the module HAS a function, even if the core functionality is possible without it. Can't we all be happy about the possibility of reaching a standard soon, one which won't bind our monitor choice to the brand of our graphics chip? Can't we acknowledge the fact that Nvidia brought VRR to market one year before AMD, that technology has moved on and will continue to move in a direction which makes things possible that weren't a year ago, i.e. VRR without a module?
Every day I get sadder about all the hate on the interwebs. :(

January 30, 2015 | 11:38 PM - Posted by mikelbivins

A big part of it is that they sell folks expensive products as the greatest technology ever, and then a month or so later something much better comes out and people feel left behind, at a high cost. That's a big part of it, in my opinion. But yes, if you understand technology and technology companies, this is how it is, and this effect will continue to snowball (the better tech we have, the faster they can create better tech).

February 1, 2015 | 12:08 PM - Posted by Anonymous (not verified)

Well, module-free G-Sync is not new OR better, just a new form of implementation - which BTW does not have EVERY feature, or even as good a UX, as a G-Sync module enabled setup does.

This is just a natural progression, as DP 1.2a allows for some of the G-Sync features to be enabled, but not ALL! I know I sure as HELL am glad my monitor doesn't go black from an FPS dip.

And the simple truth remains: with G-Sync, Nvidia brought adaptive-sync capabilities to all of us well over a year before the FIRST "free sync" parts become available! As they still are completely unavailable, IIRC!

January 31, 2015 | 06:24 AM - Posted by Heavy (not verified)

I'm starting to hate Nvidia too. They make great cards, but damn, it looks like they care more about the bottom line than the gamer community. AMD spent more time finding a way to make VRR free so anyone can use it, while Nvidia charged us a couple hundred dollars. I'm pretty sure Nvidia will still not take advantage of FreeSync, even though they could, just to keep charging us money. I'm switching to the red team next gen just because I see AMD as a company that cares about gamers, spending time and money on FreeSync and Mantle just so we can play better without paying a huge sum of money.

February 1, 2015 | 12:16 PM - Posted by Anonymous (not verified)

AMD didn't spend "time" doing JACK SH!T! They sat around flapping til DP 1.2a became a STANDARD (as in most monitors and video cards use it anymore) - and AMD did NOTHING other than re-brand adaptive v-sync!

Nvidia at least had the ingenuity to go their OWN path and create adaptive V-sync parts for consumers over a YEAR before AMD simply just made use of AN EXISTING STANDARD!

And now it seems Nvidia is going to make possible PART of the G-sync experience free for DP 1.2a (and even potentially HDMI and DVI eventually) - I see NOTHING but crap from AMD, yet the fanboys STILL find ways to justify the tons of carbon created by their inefficient power hungry PC parts (ALL of them!) - while STILL lagging behind a chip Intel made THREE YEARS AGO!

February 3, 2015 | 03:18 AM - Posted by 200380051 (not verified)

"The DisplayPort™ Adaptive-Sync specification was ported from the Embedded DisplayPort™ specification through a proposal to the VESA group by AMD."
http://www.guru3d.com/news-story/vesa-adds-adaptive-sync-to-displayport-...

They put together a proposal (eDP port to DP) and submitted it to VESA. They did their research part. They chose to move the industry forward with it.

GG monitor refresh rates. Stutter, tearing...gone are the days you had to pick your poison. Now you really only need to pick red or green. And a new monitor that suits.

February 1, 2015 | 04:42 AM - Posted by kn00tcn

oh there's hate alright, for YEARS

the way its meant to be played came when? during the geforce FX series failure, they put a spin on crap

proprietary physx? cuda? gameworks originally without source code so devs arent allowed to optimize themselves?

blocking the AA setting from batman if you dont use nvidia, even though it works perfectly fine when you spoof your other brand as nvidia

blocking gpu physx if your primary card is a different brand, even though there was a period where it was allowed, & then there was a period of people modding the drivers to re-enable gpu physx for your secondary nvidia card

the class action lawsuit about the failing/desoldering mobile gpus

this isnt about traditional 'make a good product, price it as premium, profit', nvidia consistently tries to sabotage, spin, or hide things (which doesnt make sense, their hardware & drivers are competitive enough, there's no need to be a douche)

& after all that... just get what you can use, i just bought a 660 because i need a driver for vista & amd stopped releasing them over a year ago

my laptop is a 570m because dont touch amd based laptops

gaming is a luxury anyway... as for the internet (i do not recommend you use the word 'interwebs'), of course it's going to be amplified hysteria so try not to read too much of it, just stick to actual tests, make decisions for yourself, etc

February 1, 2015 | 12:35 PM - Posted by Anonymous (not verified)

LMFAO! Look at a PhysX enabled game, then compare the SAME game with PhysX off! HUGE difference! AMD fanboy jackoffs love to TRY to bash Intel and Nvidia separately - but they are BOTH smart enough to know their own places, Intel makes CPUs Nvidia makes GPUs.

The LAST time a Radeon card had an ACTUAL edge over Nvidia was back when they were still ATI! Because ATI were their own company they could afford to make a brand new architecture in the 9700 - and yes, that card and many subsequent variations thereof had Nvidia scrambling in the FX series to try to gain headroom, which they did - by the FX 8000 series.

You say PhysX is stupid, well where the fuck is Mantle? Oh, 3 games? Its been a banner year for AMD adoption!

And PhysX + DX12 will have Mantle DOA! AMD is like a drowning puppy, sad as hell to watch, waves battering them as with each paddle towards shore they release another POINTLESS SKU or space heater GPU (which has similar performance to an equal priced Nvidia card, just takes twice or more Wattage to power!) I used to love AMD, but ever since they swallowed ATI, they have been a sinking ship!

Over three years since SandyBridge was released and AMD still has NO consumer CPU that (for all intents and purposes, apart from EXTREMELY multi threaded workloads - which is more of an admirable trait in a Workstation or Server CPU) - can even get CLOSE to the performance of an i7-2600k, even at stock speeds it has better per-core speed than the stock 4.7 GHz clocked 9590, and that is without counting that ANY i7-2600K can EASILY hit at least 4.3 GHz with a dinky stock cooler (where the 9590 come with a fucking water block STOCK, and still cannot overclock!) - and can EASILY hit 5 GHz with just a Cooler Master Hyper 212 EVO!

So go and TRY to bash Nvidia (or Intel) truth is, they are BOTH doing FAR better than AMD for good reason! Ooooo boy, Nvidia got their memory specs wrong for the 970 - that means AMD is DEFINITELY the better choice! You flapping AMD fanboys are HILARIOUS!

February 1, 2015 | 10:26 PM - Posted by kn00tcn

dumbass, i bought nvidia, apparently you cant see in your blind rage

i said nvidia has good products, yet why do they keep doing anti competitive tactics when the products are very competitive alone

wtf do cpus have to do with anything? why dont you just continue jerking off in your corner

February 4, 2015 | 07:30 PM - Posted by Anonymous (not verified)

I have to agree with kn00tcn. I've always preferred Nvidia because my first laptop had their GPU and I was able to play Far Cry 1 despite being below the minimum specs. Good first experience.

But I take a big issue with their anti-competitive decisions. Currently using a 750 Ti and it is great! I am not sure my next GPU will be from Nvidia though. Depends on how they act. I want to support competition, not a company who lies and bullies.

January 30, 2015 | 11:33 PM - Posted by mikelbivins

So for the near future, they say they don't know (module, no module, different module), and I'll give them that. But I believe it is fair to say that they know (especially with this adaptive sync spec coming) that further down the road the answer is NO SPECIAL NVIDIA MODULE will be required to enjoy this variable refresh technology. Unfortunately, I feel they would never admit to knowing that, because there is still money to be made from the "exclusivity of Nvidia modules" in whatever form they come in next generation, if that makes any sense. I just wish companies were honest, that's all. That's why we need folks like you, PCPer, to keep them as honest as possible and give us the info we need.

January 31, 2015 | 12:58 AM - Posted by J D (not verified)

G-SYNC is working on the MSI GT72 laptop too, with an LG IPS screen and a 9xxM GPU.
Confirmed two days ago for the first time, and many more times after that.

You need to use the nvlddmkm.sys file from the 346.87 driver, or just use Gamenab's one.
Gamenab has said since the very beginning that it's his algorithm in Nvidia's driver. Believe him or not, that's everybody's choice.

February 4, 2015 | 07:23 PM - Posted by Anonymous (not verified)

I don't believe him. I find it hard to believe some random Joe was able to write an algorithm that is not only compatible with the screen in that laptop, but also able to communicate with Nvidia's drivers without issue. Nvidia makes awesome stuff, but everything is proprietary.

January 31, 2015 | 02:04 AM - Posted by General Lee (not verified)

Just tested the driver on my G71JY. Yup, G-Sync popped up. The panel refresh rate itself can be overclocked up to 100 Hz, which is rather exceptional for an IPS panel. I first heard it could do 100 Hz on the ROG forums, and later on, with newer Nvidia drivers, it actually does default to 75 Hz. I wonder if they were actually trying to get it to 120 Hz and it didn't pan out...

January 31, 2015 | 07:54 AM - Posted by Anonymous (not verified)

shocking, "journalists" shilling for nvidia again while random curious user has to find out that nvidia is lying to customers again about gsync only available with some overpriced proprietary module, exactly as random user has to find out that his 970 doesnt have advertised vram

meanwhile, exactly like it was with gtx970 3,5gb vram, said user is being harassed, belittled, insulted and called names by nvidia astroturfers, fanboys and even "journalists" , while doing unbiased and independent research that will provide benefits to customers

January 31, 2015 | 08:05 AM - Posted by JohnGR

Don't forget the good old 257 Nvidia driver that was leaked without the PhysX/CUDA lock and was working beautifully with an AMD primary and Nvidia secondary card, without the need for a patch.

This company really needs a boycott to come to its senses. Even Intel, which is almost a monopoly, treats its customers with more respect. Probably even Apple.

January 31, 2015 | 08:01 AM - Posted by JohnGR

Nvidia's fanboys having nightmares this week. HAHAHA

January 31, 2015 | 09:00 AM - Posted by Anonymous (not verified)

Geezus...
Some of the commenters here can't have read more than a few words of the article...
Protip... Read the whole thing... Understand the whole thing... Then comment...

January 31, 2015 | 09:47 AM - Posted by Lithium

Some serious stupidity and bad intentions circulating around here.

For eDP-equipped laptops a G-Sync module is not needed.
Gamenab/whoever just discovered hot water.

A year ago there wasn't such tech for DESKTOPS.
The module was made just to make it possible on DESKTOPS at a moment when scalers don't support VESA eDP.

The truth is that you all should thank NVIDIA for finally showing off this very important technology and making VESA and the monitor scaler chip makers DO SOMETHING.

FreeSync is made of air, with zero work from AMD's side.
It's just a few-year-old VESA standard that no one actually used on PCs.

January 31, 2015 | 12:30 PM - Posted by JohnGR

Thank you for the first line in your post. It summarizes perfectly the rest of it.

January 31, 2015 | 04:56 PM - Posted by Allyn Malventano

...and yet everything he said appears completely accurate from simple observations.

January 31, 2015 | 06:35 PM - Posted by SikSlayer

Of course. There's a lot of very clear bias going on with most folks calling foul on all this. The Tech Report and AnandTech quotes (as well as those from PCPer) from the Anon commenter above make it all the more obvious that this was something they were going to do anyway, likely sans module.

February 1, 2015 | 08:25 AM - Posted by JohnGR

So the people who comment at PC Perspective are stupid with bad intentions. I think this comment should go directly on the first page and stay there for everyone to see.

Also don't forget to send a letter to AMD telling them that their FreeSync is "made of air and with zero work from their side". Because you agree with EVERYTHING in his post.

February 3, 2015 | 05:20 AM - Posted by Allyn Malventano

I'm sorry to say this, but without a shipping product, FreeSync is still technically vaporware (made of air).

February 3, 2015 | 08:59 AM - Posted by JohnGR

He didn't mean that, but anyway. Maybe you should also send a mail to Samsung and ask them about the vaporware 4Ks that they announced for March.

Don't let the Nvidia fan in you beat the journalist in you.

February 3, 2015 | 07:11 PM - Posted by Allyn Malventano

That's not a variable refresh panel. Please stop grasping at straws. You're coming across as desperate.

February 3, 2015 | 10:09 PM - Posted by JohnGR

I'm the desperate one. OK, if you say so.

January 31, 2015 | 11:35 AM - Posted by Anonymous (not verified)

You don't get it...
Nvidia said that they will not use, in any form, anything from Adaptive-Sync.
This is why they have the G-Sync module, a premium module designed to steal more money from fanboys.

But that driver reveals something else...
If the monitor is an Adaptive-Sync ready one and Nvidia is using it under the name of G-Sync, this could lead to some serious Thor's hammer coming down on Nvidia's head from VESA.

And lol @ Nvidia helping VESA while bashing them in the process.
DP 1.2a, whether it is in a laptop or not, doesn't change the bare emc of them...
I'm sure those new DP 1.2a desktop monitors will work just fine when someone ports that driver to them, rendering that overly priced piece of nothing G-Sync obsolete (well, it was, since VESA implemented Adaptive-Sync as a standard, but anyway).

January 31, 2015 | 12:08 PM - Posted by Anonymous (not verified)

The G-Sync module provides hardware triple buffering (this is the reason for the good amount of RAM on the module), which reduces latency in the panel and improves frame times in games without support for triple buffering (99%?).

The laptop version of G-Sync is equal to FreeSync; the desktop version with the G-Sync module is slightly better than both.

February 1, 2015 | 11:40 AM - Posted by ScepticMatt (not verified)

"triple buffering ... that reduces the latency"

You couldn't be ... more wrong on that one.

Buffering causes delay. Triple buffering = 3 frames extra delay (50 ms @ 60 fps)
There is a reason why Oculus VR and competitive gamers use as little buffering as possible.

February 1, 2015 | 03:53 PM - Posted by Allyn Malventano

Many people do not have a proper understanding of triple buffering. It is just not the same thing as 'buffering three frames ahead', which many falsely believe. It can be done without added latency. As a matter of fact, triple buffering results in *reduced* latency and higher FPS when operating with VSYNC - having only two frame buffers forces the GPU to wait for a draw to complete before it can start working on the next frame (because it is already one frame ahead).
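
A toy simulation makes the distinction concrete. This is not driver code, just a sketch under assumed numbers (a GPU that renders a frame in 6 ms against a 60 Hz VSYNC'd display): with two buffers the GPU stalls and the frame shown at each vblank is comparatively stale, while page-flip style triple buffering keeps rendering and flips the newest completed frame.

```python
# Toy model (not driver code) of double vs. "real" triple buffering under
# VSYNC. Assumed numbers: 6 ms render time, 60 Hz refresh.

def average_frame_age_ms(triple, render_time=0.006, refresh=1.0 / 60, vblanks=120):
    gpu_free = 0.0   # time at which the GPU can start its next frame
    queue = []       # finish times of completed frames awaiting a flip
    ages = []
    for v in range(1, vblanks + 1):
        vblank = v * refresh
        while gpu_free + render_time <= vblank:
            if not triple and queue:
                gpu_free = vblank        # double buffered: back buffer full, GPU stalls
                break
            gpu_free += render_time
            queue.append(gpu_free)
            if triple and len(queue) > 1:
                queue.pop(0)             # keep only the newest completed frame
        if queue:
            ages.append(vblank - queue.pop(0))   # how old the flipped frame is
    return 1000.0 * sum(ages) / len(ages)

print(f"double buffered + VSYNC: frame is ~{average_frame_age_ms(False):.1f} ms old at flip")
print(f"triple buffered + VSYNC: frame is ~{average_frame_age_ms(True):.1f} ms old at flip")
```

In this toy model the triple-buffered frame reaching the screen is the fresher one; games that instead queue several frames ahead see the opposite effect, which is the render-ahead behavior debated in the replies below.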

February 2, 2015 | 09:59 PM - Posted by Anonymous (not verified)

You appear to confuse "frame rate" and "latency".

Frame rate is the number of frames that GPU can render per given time period, typically one second.
Latency is the time differential between frame being drawn and frame being displayed.

Triple buffering increases latency over double or single buffering, but may increase frame rate stability. Its main goal is to improve frame rate (as you state) by "stabilizing" the frame rate in scenes where consecutive frames have very different rendering costs, i.e. a quickly drawn frame followed by an extremely particle-heavy slow frame. In the case of double buffering, the slow frame may still be rendering by the time the frame buffer is exhausted, and you get a massive drop in frame rate for that moment while the display buffer has to wait for the slow frame to be drawn before it can be pushed to the display. With triple buffering there is one more buffered frame to push to the display before it has to wait.

So yes, triple buffering can increase FPS. However, it does this at the cost of also increasing latency. That is how triple buffering inherently works in DirectX.

The other implementation, specifically page flipping, which lets the GPU pick which frame to display and which does indeed in some rare cases reduce latency, is not available in DirectX and is therefore largely irrelevant for Windows gaming, which is overwhelmingly DirectX based.

February 3, 2015 | 05:24 AM - Posted by Allyn Malventano

What you are referring to is not 'real' triple buffering. Some games do not implement this correctly, resulting in increased latency (usually to a very noticeable degree). That is not what I was talking about, and those games that do it wrong are the exact reason for the misnomer. 

February 3, 2015 | 09:50 AM - Posted by Anonymous (not verified)

You are arguing against the implementation in DirectX itself, which is the overwhelming industry leader right now. It's done like this because it reduces the frame rate instability and microstutter that can be caused by page flipping with triple buffering in scenarios where frames have significantly different rendering costs for the GPU.

Like Don Quixote fighting against wind mills.

January 31, 2015 | 12:08 PM - Posted by RS84 (not verified)

Conclusion:

PhySick : DRM
G-Spot : DRM
FUXR : DRM
TeXAA : DRM
0.5G : DRM

Lol...

January 31, 2015 | 12:11 PM - Posted by Anonymous (not verified)

"Obviously we went straight to NVIDIA..."

Obviously you go straight to Nvidia before you write anything...

Guys, remember all the shills and fanboys stating that Nvidia created a new feature and deserves all the millions in royalties and proprietary hardware? Well, looks like that "new feature" is the 4-year-old VESA embedded DisplayPort spec after all. Of course, with a cooler-sounding name that surely took a lot of time and money to think up, and a customer-robbing proprietary scaler. :)

January 31, 2015 | 01:25 PM - Posted by Anonymous (not verified)

If you want to find out and confirm what you think you're seeing, then yes, you do go straight to Nvidia. That doesn't make them shills, it makes them thorough.

Allyn, Ryan & Co., congrats guys, nice feature.

Also, how dare a company try to make money on a new feature. The "Freesync" reveal was such a blatant "us too! us too!" maneuver that AMD forgot, as is their wont, that they need to make money to survive as a company. They shifted the profits of the adaptive sync technology over to the monitor manufacturers. I don't blame Nvidia for not wanting to get involved in AMD's race to the bottom.

Otherwise, I agree that this tech should have been out years ago.

January 31, 2015 | 04:59 PM - Posted by Allyn Malventano

I had my portion of the article (the first two pages and most of page three) *before* we had our call with Nvidia. That call was the last thing that happened before we published. We needed their statements for the article. It's called journalistic integrity. 

January 31, 2015 | 07:10 PM - Posted by Anonymous (not verified)

I think you're mistaken about the definition of journalistic integrity.

1) Truthfulness - limited to the narrow scope of the article being written

2) Accuracy - again narrowed to fit the article being written

3) Objectivity - limited to Nvidia sources

4) Impartiality - none, favored Nvidia sources

5) Fairness - none, favored Nvidia sources

6) Public Accountability - none, even when the articles you showed directly counter the original article.

I suggest you read these.

Journalism ethics and standards
http://en.wikipedia.org/wiki/Journalism_ethics_and_standards

January 31, 2015 | 07:30 PM - Posted by Anonymous (not verified)

I hope this is a troll. If not, you're mistaken.

Their "source" is not nvidia, but theirs and others own work in finding out how this laptop came to be able to run G-sync. Furthermore, who else are they going to contact?

They were running the story whether or not Nvidia commented on it. It's a journalistic practice to give someone the opportunity to comment before publishing something, especially if it could be damaging to them. That's how you get all sides of a story to present to a reader so they can make their own determination. Nvidia responded with additional information, because they were forced to do so, or look incredibly foolish.

January 31, 2015 | 02:38 PM - Posted by Pohzzer (not verified)

Appears to me Nvidia 'jumped the gun' with G-Sync, knowing the new VESA standard was in the works, to establish that G-Sync monitors were worth a premium and don't work with AMD cards, so that in the future they can implement their version of free sync on 1.2a-compliant 'G-Sync' branded monitors that include a cheap chip blocking AMD cards from accessing the adaptive sync feature, in an attempt to lock those monitor buyers into Nvidia GPUs.

Would be a classic JHH~Nvidia dick move - if they think they can get away with it.

January 31, 2015 | 07:29 PM - Posted by Fatcatsandfacts (not verified)

So it's basically going to be Adaptive-Sync, branded as FreeSync by AMD and G-Sync by Nvidia... hopefully. Having both vendors support adaptive refresh rates on the same monitor is the best case scenario.

January 31, 2015 | 07:34 PM - Posted by Aurhinius (not verified)

Perhaps it is time to get Tom Petersen back for a Q&A about the 970 memory issues and now G-Sync functionality without a G-Sync module.

Let's see if they want to have some screen time when there isn't a product launch. I don't want to be cynical, but I doubt they'll oblige you. Though I would have a lot of respect if they stepped up in difficult times.

I think some serious questions and answers need to come, and they need to be more upfront about what the G-Sync module's purpose is over and above adaptive sync. They need to stop avoiding things and offer a bit more disclosure. They've patented the module, I'm sure, so why hide its full functionality, and why not answer outright whether it is required?

January 31, 2015 | 07:42 PM - Posted by Anonymous (not verified)

JHH to Nvidia's customers: ┌∩┐(◣_◢)┌∩┐ you, I Have All your cash, so You can't have ┌∩┐(◣_◢)┌∩┐ from Nvidia, without paying a whole ┌∩┐(◣_◢)┌∩┐ load of cash, and enjoy your 3.5 GB of ┌∩┐(◣_◢)┌∩┐ up memory, and G-Stink!

February 1, 2015 | 01:11 AM - Posted by Anonymous (not verified)

Dat Gsync module scam ! :D

February 1, 2015 | 01:50 AM - Posted by apprenticegamer

This makes it hard to believe the 970 story of accidental miscommunication between marketing and engineering; maybe Nvidia just likes to talk out of the side of their mouths.

February 1, 2015 | 02:52 PM - Posted by Anonymous (not verified)

All I see is a lot of hitching in the video.

February 1, 2015 | 03:58 PM - Posted by Allyn Malventano

That is because you are watching a 30 FPS video of a display refreshing at 50 FPS. Smooth playback is not possible in the recorded video. Even if we posted this at 60 FPS, there would still be visible judder due to the rates being out of sync.

February 2, 2015 | 02:32 AM - Posted by Anonymous (not verified)

Could you have played it back at 30fps (in slow motion, of course)?

February 2, 2015 | 03:30 AM - Posted by Allyn Malventano

It would have still juddered. The capture itself has to be in sync with the display's rate, or an even multiple higher (even multiples are not so important if you are several times higher). Also, that high FPS capture would need to be played back on a high FPS display (or slowed down on playback). This is why it is so hard to explain the effect of G-Sync / adaptive sync - it's just something you need to see in person.
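
The mismatch is easy to see with toy numbers: sample a display that updates 50 times a second with a 30 fps camera, and motion advances by an uneven number of display frames per video frame, which is exactly the judder visible in the posted footage.

```python
# Toy numbers for the rate mismatch described above: a display updating
# 50 times per second, captured by a 30 fps camera.

display_hz, capture_fps = 50, 30

# Display frame on screen at the instant each capture frame is taken.
sampled = [int(i * display_hz / capture_fps) for i in range(capture_fps)]
steps = [b - a for a, b in zip(sampled, sampled[1:])]

print(sampled[:10])  # [0, 1, 3, 5, 6, 8, 10, 11, 13, 15]
print(steps[:9])     # [1, 2, 2, 1, 2, 2, 1, 2, 2]  <- uneven cadence reads as judder
```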

February 1, 2015 | 05:03 PM - Posted by Ultramar (not verified)

All I want is a decent monitor with variable refresh rate tech that'll work with amd and nvidia gpus and won't break the bank.

I believe...

February 1, 2015 | 05:06 PM - Posted by Staying Calm (not verified)

Wow, a lot of you people have some serious anger issues. I for one am stoked about mobile G-Sync, especially if it can be added with software only!

A question, and I think it's rather important, at least to me, since I am in the market for a gaming laptop: are there certain laptop displays I could look out for that are more likely to work with mobile G-Sync?

February 2, 2015 | 05:33 AM - Posted by Allyn Malventano

The laptops this partial solution works on may be just that - partial support. I believe newer TCON revisions may be required for functional (playable) Mobile G-Sync, which means it would be the next round of laptops and not necessarily the current ones.

February 2, 2015 | 06:52 AM - Posted by Anonymous (not verified)

If laptop makers actually support panel self refresh (eDP 1.3), then it seems the need for a G-Sync module goes away completely. It performs a very similar function to what a G-Sync module does. This development driver seems to use adaptive sync, but it may expect there to be a G-Sync buffer present. This could be the source of the blanking: when the frame time gets too high, a G-Sync display will refresh from the local buffer, but since there isn't one it just blanks the screen. It essentially lost the video signal.

Also, I am still wondering why they wouldn't want to update the panel asynchronously if they have a buffer in the display controller. If you have a 144 Hz panel, then why not scan it from the buffer at 144 Hz all of the time? Does this add too much latency? You can't update the buffer in the middle of a scan or it will cause tearing, so you could get up to one frame of latency, but is this noticeable at 144 Hz? It may require some extra circuitry to handle updates that come in the middle of a scan (double buffering or something more clever). Unless I am missing something, this is what a G-Sync display does when it has to wait too long for a frame (<30 Hz); it just does the scan out of the local buffer.
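
For a sense of scale on that "up to one frame of latency" question: if a finished frame always has to catch the next fixed-rate scan-out from a local buffer, the worst case wait is one refresh period. Back-of-the-envelope numbers below, not a claim about what any shipping module actually does.

```python
# Worst-case extra wait if a finished frame always has to catch the next
# fixed-rate scan-out from a local buffer (back-of-the-envelope only).

for panel_hz in (60, 100, 144):
    worst_wait_ms = 1000.0 / panel_hz
    print(f"{panel_hz:>3} Hz panel: up to {worst_wait_ms:.1f} ms extra wait for the next scan")
```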

February 3, 2015 | 05:27 AM - Posted by Allyn Malventano

Panel self refresh actually gets in the way of what current G-Sync does. If a panel forced a self refresh, that refresh would have to complete before a new frame could be sent, causing that frame to judder. You need *complete* control of the path to avoid situations like this. 

February 1, 2015 | 05:39 PM - Posted by Anonymous (not verified)

Nvidia has been deleting and hiding every single thread about this.

February 1, 2015 | 10:15 PM - Posted by Anonymous (not verified)

Eye opening when customer service is using a site to defend itself and its actions.

https://forums.geforce.com/default/topic/808129/geforce-900-series/sites...

February 2, 2015 | 05:33 PM - Posted by Anonymous (not verified)

The auto industry is strictly regulated with regard to the information its sales associates have to supply potential customers, and it's about time, after 35+ years, that reporting of vital information to the consumer in the PC/laptop and mobile device market had the same requirements. Device and component reporting is in such disarray. Where are the required data sheets? Important information is so obfuscated, or simply not supplied, that the information vital for a potential customer to make a proper purchasing decision just isn't there. A mandated data sheet specific to the PC/laptop/mobile device, as well as to the SOCs/CPUs and GPUs (which should have their own data sheet requirement), should be required, and the makers of OEM/ODM parts, as well as of the final PC, laptop, or mobile device, should have to bundle all the appropriate data sheets and provide them to prospective customers.

This includes brick and mortar stores having display copies of all the relevant data sheets, and online retailers providing links to them. It also covers software such as graphics drivers (are the graphics drivers updatable only by the OEM, or can they be updated by the GPU's maker?), as well as listing all of the bloat/adware bundled with the computer that is not vitally necessary for the operation of the device. It's time for strong reporting regulations, or the consumer will continue to be in the dark, unable to even make an educated guess.

It's time to start writing letters to the FTC and any other federal agencies that oversee the markets, and include your state's Attorney General on your list. The PC market, as well as the mobile market, is terrible at providing system-specific information about its products; OEMs and ODMs alike need strong information reporting requirements.

February 2, 2015 | 05:16 PM - Posted by Searching4Sasquatch (not verified)

Amazing how many AMD trolls are posting. Would love to know how many actually work at AMD. :P

February 2, 2015 | 05:47 PM - Posted by Anonymous (not verified)

And by mentioning the AMD-specific trolls (as claimed by you), you are implying that you may be likewise for the opposing side. Be sure to note the difference between the fanboi "troll" and the pissed off consumers who are just venting their anger against the big monopoly/ies abusing their market position, and the relative lack of any enforcement of fair market practices, in this new era of the poorly regulated big monopolies of the past 35 years.

February 2, 2015 | 07:38 PM - Posted by larsoncc

I'm sure that posting this late will effectively bury this comment, but here goes anyway -

I feel that the existence of GSync Mobile, being essentially an eDP adaptive sync implementation, means that a FreeSync monitor is absolutely the way to go regardless of GPU brand used; bear with me on this.

Now that the driver file is in the wild, how long will it be before either a) the EDID for your FreeSync monitor is modified to mimic a mobile monitor, or b) The driver file is hacked and allows your FreeSync monitor to operate in the same fashion, or a combination of a) and b)?

It's almost a guarantee that by hook or by crook, a single monitor will drive GSync and FreeSync in the near future. NVidia should drop the pretense immediately and offer proper support for GSync on Freesync monitors.

Conversely, once the FreeSync drivers and monitors for AMD are out in the wild, I wonder how much effort the modding community will have to make to get the GSync modules working with AMD cards.

It'd be good for the modding community to start looking into this driver NOW.

February 4, 2015 | 01:36 AM - Posted by Broken_Heart

You mentioned in the video that if the CPU is busy for a second, the panel goes blank. I think this issue would be greatly mitigated if you changed the priority of the game/driver process in the Windows Task Manager to High instead of Normal.

Anyway, this little mishap shows that Nvidia is trying to make G-Sync work without a module. Why is that? Because FreeSync is FREEEEEEEEEEEEEE.

This is Strike out for Nvidia IMHO:

Strike 1: They released a driver that burned GPUs. AMD never did this.

Strike 2: They disabled PhysX when another card was detected in the system. AMD also never did this.

Strike 3: They made people pay a $100 premium for G-Sync when they could've enabled it for free. They didn't bother to do that until AMD showed the plethora of FreeSync monitors. Because no sane person would pay $100 for a feature they can get for free.

Oh, and for everyone ranting about Maxwell's efficiency (performance wise): AMD was the first to try to improve the efficiency of its architecture when it transitioned from VLIW5 to VLIW4 and redesigned its GPUs. And they did it again with GCN. Nvidia does it with Maxwell and for some reason everybody treats it like a new innovation when AMD's been doing it for years. And let's not forget who first made the move to a unified shader architecture.

However, as much as I dislike Nvidia for their brash strategy, I can't help but be impressed with Maxwell's POWER efficiency. They deserve full credit for that.

I just can't help but notice that Nvidia copies everything AMD does. Sometimes AMD does that too, but Nvidia seems to do it much more often.

February 5, 2015 | 03:46 PM - Posted by taltamir

I thought the whole G-Sync chip thing was supposed to be a temporary measure before they go ahead and fully integrate that functionality into future GPUs.

March 7, 2015 | 07:40 AM - Posted by Anonymous (not verified)

Where can I download this driver?

May 2, 2015 | 04:44 PM - Posted by Anonymous (not verified)

Which gaming laptop screens currently support G-Sync?

July 29, 2015 | 01:02 PM - Posted by Ahmed1296 (not verified)

Well, I don't understand a lot of the technical stuff, but can anyone tell me whether it'll work on my laptop (Acer VN7-591G) or not? Help is much appreciated :)
