
NVIDIA G-Sync Tech Preview and First Impressions

Manufacturer: NVIDIA

Quality time with G-Sync

Readers of PC Perspective will already know quite a lot about NVIDIA's G-Sync technology.  When it was first unveiled in October we were at the event and were able to listen to NVIDIA executives, product designers and engineers discuss and elaborate on what it is, how it works and why it benefits gamers.  This revolutionary new take on how displays and graphics cards talk to each other enables a new class of variable refresh rate monitors that offer the smoothness advantages of having V-Sync off, along with the tear-free images normally reserved for gamers who enable V-Sync.


NVIDIA's Prototype G-Sync Monitor

We were lucky enough to be at NVIDIA's Montreal tech day while John Carmack, Tim Sweeney and Johan Andersson were on stage discussing NVIDIA G-Sync among other topics.  All three developers were incredibly excited about G-Sync and what it meant for gaming going forward.

Also on that day, I published a somewhat detailed editorial that dug into the background of V-Sync technology, why the 60 Hz refresh rate exists and why the system in place today is flawed.  That piece led up to an explanation of how G-Sync works, including its integration via extended Vblank signals, and detailed how NVIDIA is enabling the graphics card to retake control of the entire display pipeline.

In reality, if you want the best explanation of G-Sync, how it works and why it is a stand-out technology for PC gaming, you should take the time to watch and listen to our interview with NVIDIA's Tom Petersen, one of the primary inventors of G-Sync.  In the video we go through quite a bit of technical explanation of how displays work today and how G-Sync changes gaming for the better.  It is over an hour long, but I selfishly believe it is the most concise and well-put-together collection of information about G-Sync for our readers.

The story today is more about extensive hands-on testing with the G-Sync prototype monitors.  The displays we received this week are modified versions of the 144 Hz ASUS VG248QE gaming panel, the same model that will, in theory, be upgradeable by end users sometime in the future.  These monitors are 1920x1080 TN panels and, despite their incredibly high refresh rates, aren't usually regarded as the highest image quality displays on the market.  However, the story of what you get with G-Sync is really more about stutter (or the lack thereof), tearing (or the lack thereof), and a better overall gaming experience for the user.


One more thing worth noting right away is that performance testing with G-Sync displays takes on a wildly different spin.  As you know, PC Perspective has adopted its own Frame Rating graphics testing process that uses direct capture from a GPU running a game into a hardware capture system, with an overlay that we then post-process and analyze to get real-world performance data you cannot get with software like FRAPS.  The drawback of that method is that it currently requires DVI connectivity, which hasn't been a problem since all graphics cards support DVI today.  But G-Sync is a DisplayPort-exclusive feature, meaning we cannot currently use our Frame Rating capture systems.  We are working with some partners to enable DP 1.2 capture, and thus performance testing, but there are other hiccups in the way.  We are definitely working on those issues and I think we'll have them solved in early 2014.

That being said, performance in terms of frames per second or frame times with G-Sync is going to be closely analogous to current monitors running with V-Sync disabled.  Today's monitors display at a fixed refresh rate, and when a new frame is completed it simply replaces part of the buffer and the data is sent immediately to the screen, resulting in the horizontal tear I am sure everybody here is familiar with.
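To make the tearing mechanic concrete, here is a minimal sketch (Python, purely illustrative, not how a display controller is actually implemented) of why an immediate buffer swap on a fixed-refresh display produces a tear line at a seemingly random height each frame:

```python
# Illustrative sketch only: where a tear line lands when a new frame is
# swapped in mid-scanout on a fixed 60 Hz, 1080-line display (V-Sync off).
REFRESH_HZ = 60
LINES = 1080
SCANOUT_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms to draw all 1080 lines, top to bottom

def tear_line(frame_ready_ms):
    """Scanline being drawn when the GPU swaps buffers.

    Everything above this line comes from the old frame, everything below
    from the new one -- that boundary is the visible horizontal tear.
    """
    position_in_scan = frame_ready_ms % SCANOUT_MS  # where scanout is right now
    return int(LINES * position_in_scan / SCANOUT_MS)

# A GPU finishing frames every 21 ms (roughly 48 FPS) tears at a different
# line on every refresh:
for n in range(1, 5):
    print(f"frame {n} ready at {n * 21} ms -> tear near line {tear_line(n * 21)}")
```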


With G-Sync, as soon as a frame is done rendering, a poll checks whether the display is in the middle of a scan.  If so, it waits, and that poll takes about 1 ms to complete.  It then tells the monitor to prepare for a new frame by resending the Vblank signal that has been held back by NVIDIA's driver.  The end result is that a G-Sync monitor and an enabled system will very closely mirror, in terms of frames per second, a standard configuration with V-Sync disabled.  The benefit, of course, is that you no longer have any of that distracting, distorting horizontal tearing on the display.

Because of that polling time, NVIDIA did warn us that there is currently a 1-2% performance delta between V-Sync off frame rates and G-Sync enabled frame rates.  G-Sync is a little bit slower because of the poll, which Tom Petersen indicated takes roughly 1 ms.  Interestingly, NVIDIA says it plans to reduce that time to essentially 0 ms with driver updates once monitor partners begin shipping production units.
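Based on that description, the presentation flow looks roughly like the sketch below (Python, purely illustrative; the display object and helpers such as is_scanning(), send_vblank() and scan_out() are hypothetical stand-ins, not NVIDIA driver or G-Sync module APIs):

```python
import time

# Hypothetical sketch of the G-Sync flow described above, under the stated
# assumption of a ~1 ms poll cost that NVIDIA intends to drive toward zero.
POLL_COST_S = 0.001

def present_frame_gsync(display, frame):
    # 1. Frame is finished rendering; check whether a scanout is in flight.
    while display.is_scanning():   # hypothetical query, ~1 ms per the article
        time.sleep(POLL_COST_S)    # wait out the in-progress scan
    # 2. Release the Vblank signal the driver has been holding so the panel
    #    prepares for a fresh frame, then scan the new frame out immediately.
    display.send_vblank()          # hypothetical
    display.scan_out(frame)        # hypothetical
```

The key point the sketch captures is that the refresh is driven by frame completion rather than a fixed clock; the only added cost is the poll while a scan is still in progress.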

Performance results here are therefore going to be minimal, and in fact we are only going to show you a handful of graphs: V-Sync on versus V-Sync off, where the V-Sync off results emulate the performance of G-Sync, though G-Sync delivers that performance without the visual anomalies associated with disabling V-Sync.  In the graphs below we are using our standard GPU test bed with an SNB-E platform and processor, 16GB of DDR3 memory, an SSD, and a single GeForce GTX 760 2GB reference card on the latest NVIDIA 331.93 beta drivers.  The two sets of results you see are Frame Rating captured results, one with V-Sync enabled and one with it disabled.


NVIDIA's G-Sync Demo Application

I decided to use the GeForce GTX 760 as it is a very common, mainstream GPU and also lets us find instances in games where G-Sync is very effective.  In scenarios where you are gaming on a 60 Hz monitor and running enough graphics hardware to keep the frame rate over 60 FPS 100% of the time, it is true that many of the benefits of G-Sync will be lost.  However, I will argue that dropping under that 60 FPS mark for even 5% of your game time results in a sub-par experience for the gamer.  Take into account that even the mighty GeForce GTX 780 Ti can be brought to its knees at 1920x1080 with the highest quality settings in Crysis 3, and I think G-Sync technology will be useful for mainstream and enthusiast gamers alike.

Also note that higher resolution displays are likely to be shown at CES for 2014 release.

The results will show you instantaneous frame rates, average frame rates over time and frame time variance, as a way to demonstrate how stutter and frame time variance affect your actual user experience.  This is very similar to how we have tested SLI and CrossFire over the past two years, helping to showcase visual experience differences in a numeric, quantitative fashion.  It's difficult to do, no doubt, but we believe attempting it is required for a solid overview of G-Sync technology.
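For readers who want to reproduce this style of analysis from their own frame time logs, the metrics behind the graphs reduce to simple arithmetic. A minimal sketch (Python; the sample numbers are made up for illustration, not our captured data):

```python
from statistics import pstdev

# Frame times in milliseconds, e.g. from a FRAPS-style frametimes log.
frame_times_ms = [16.7, 16.7, 33.3, 16.7, 21.0, 22.5, 33.3, 16.7]

instantaneous_fps = [1000.0 / t for t in frame_times_ms]            # per-frame rate
average_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # frames / seconds
frame_time_spread_ms = pstdev(frame_times_ms)                       # frame-to-frame consistency

print(f"average FPS: {average_fps:.1f}")
print(f"frame time std-dev: {frame_time_spread_ms:.1f} ms (lower = smoother animation)")
```

Two runs can share nearly the same average FPS while the spread of their frame times, and therefore the perceived smoothness, differs dramatically; that is exactly what the two graphs below show.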

Crysis 3: Frame Rate Over Time, V-Sync On vs. V-Sync Off

Crysis 3: Frame Times, V-Sync On vs. V-Sync Off

Based on the first graph, you might think the experience of playing Crysis 3 (High, High, 4xMSAA settings) would be the same with V-Sync enabled or disabled, but it clearly is not.  Even though the average frame rates are nearly the same, the second graph, which shows instantaneous frame times, tells a different story.  The black line representing the V-Sync disabled results shows a rather smooth progression from the 0 second mark through the 60 second mark, with a couple of hiccups along the way.

The orange line showing the V-Sync enabled result is actually oscillating very quickly between 16.6 ms and 33.3 ms frame times, essentially hitting either the 60 FPS or the 30 FPS mark at any given moment.  The result is uneven animation; the human eye is quite adept at seeing variances in patterns, and the "hitching" or stutter that appears when the game transitions between those two states is explained very well in our interview with Tom Petersen above.
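That oscillation follows directly from how V-Sync quantizes frame times to whole refresh intervals. A quick illustrative calculation (Python; this assumes a plain double-buffered swap chain, the simplest case, not triple buffering):

```python
import math

# Illustrative: with double-buffered V-Sync on a 60 Hz panel, a frame can only
# be shown on a refresh boundary, so its effective frame time is rounded UP to
# the next multiple of ~16.7 ms.
REFRESH_MS = 1000.0 / 60

def vsync_frame_time(render_ms):
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (15.0, 18.0, 21.0, 25.0, 34.0):
    shown = vsync_frame_time(render_ms)
    print(f"rendered in {render_ms:4.1f} ms -> displayed every {shown:4.1f} ms "
          f"({1000.0 / shown:.0f} FPS)")

# A game hovering around 18-25 ms per frame therefore snaps between the
# 16.7 ms (60 FPS) and 33.3 ms (30 FPS) buckets, which is the back-and-forth
# pattern the orange line shows.
```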

Crysis 3: Frame Times, First Three Seconds (Zoomed)

A zoomed-in graph (just the first 3 seconds) shows the back and forth frame times more clearly.  The orange line shows a few frames at 16.6ms (great!) and then a spike to 33.3ms (not great), repeating over and over.  The black line shows a more regular and consistent frame time of anywhere from 20-23ms.

It would seem obvious then that in this case, where performance is not able to stay above the refresh rate of the panel, the black line shows the better, smoother experience.  On a standard display, however, getting that result meant disabling V-Sync and living with horrible tearing across the screen.  G-Sync offers nearly the same performance as the V-Sync off result, but without the horizontal tearing.

Another interesting angle on this debate is that with V-Sync off, and analogously with G-Sync enabled, you are getting the full performance out of your graphics card 100% of the time.  Your GPU is not waiting on monitor refresh cycles to begin outputting a frame or to begin rendering the next one.  Gamers have gotten this for years by disabling V-Sync, but tearing was again the result.  It was a trade-off between frame rate and responsiveness with tearing, versus stutter and input latency with a tear-free image.

December 12, 2013 | 09:09 AM - Posted by RoboGeoff (not verified)

Is there any possibility that existing monitors could be modified to support G-Sync? I have 3 Dell U2410s, but more importantly a Dell U3014. It would suck to have to change all of these monitors to get G-Sync support.

December 12, 2013 | 09:25 AM - Posted by Anonymous (not verified)

Not likely as all monitors are different. If the market for a specific monitor is large there might be a mod kit otherwise no, you will need to buy a new one that has it built in.

December 12, 2013 | 04:13 PM - Posted by Ryan Shrout

I think really NVIDIA only has plans to allow upgrades to this ONE monitor.  

December 12, 2013 | 09:18 AM - Posted by Anonymous (not verified)

Is there any talk of making this an open standard for AMD/Intel to use? Otherwise I fear this will be another useless but visually impressive piece of technology that is doomed to failure. Case in point: Apple and their FireWire.

December 12, 2013 | 04:13 PM - Posted by Ryan Shrout

Not really...that's not usually a direction NVIDIA takes.  Which is unfortunate, I agree.

December 12, 2013 | 05:37 PM - Posted by arbiter

I don't think it will fail. It seems like at some point in the next few years monitor makers will hopefully start to ditch the fixed refresh rates that are no longer needed.

December 13, 2013 | 03:50 AM - Posted by Anonymous (not verified)

I don't know, but I believe Thunderbolt is on life support...
But congrats to those who can get it.

December 12, 2013 | 09:40 AM - Posted by Joker

We were promised the DIY kit for December. As G-Sync is currently the only thing on my Christmas wish list, I would be rather disappointed if we don't get the kits by year's end.

Already set with my VG248QE and two 780s to push out that blistering 144 Hz refresh rate without having to look at stuttering when it dips. Just need my kit!

December 12, 2013 | 04:14 PM - Posted by Ryan Shrout

Trust me, I am trying to get information from them on this.  :)

December 12, 2013 | 10:02 AM - Posted by Anonymous (not verified)

So, I'm still a little confused about whether G-Sync matters in my situation.

I have that 144 Hz VG248QE monitor, but when I play games I turn all the settings up, so even with my Titan card, I never find any games that push me over 144 frames per second (if I did get tearing, I'd just turn up the AA even higher).

So I run with V-Sync off, and don't get tearing anyway since my monitor has such a high refresh rate to accommodate the frames coming out.

Does G-Sync do anything for me?

December 12, 2013 | 04:54 PM - Posted by Ryan Shrout

Yes, it still will.  You may not notice MUCH of it, but you get visual tearing even at that high a refresh rate because the rates between the GPU and display don't match up perfectly.

December 12, 2013 | 10:08 AM - Posted by LawrenceRM (not verified)

I'm actually really curious to see how this compares to Adaptive V-Sync that came out a year or two ago.

Is there a big enough difference to warrant getting G-Sync compatible hardware versus sticking with current hardware and Adaptive V-Sync?

December 12, 2013 | 04:57 PM - Posted by Ryan Shrout

Adaptive V-Sync simply turns V-Sync off when your frame rate drops below the maximum refresh rate, so you still get the horizontal tearing.

December 12, 2013 | 10:37 AM - Posted by Randal_46

Looks great, but it effectively adds the cost of a new monitor or GPU onto a purchase. I'll wait until the technology is incorporated into a standard. Those who can afford a new monitor/GPU, have fun.

December 12, 2013 | 10:58 AM - Posted by mAxius

Seconded. If this becomes an open standard I'll gladly buy. Till then NVIDIA can take its closed standard and shove it.

December 12, 2013 | 05:31 PM - Posted by arbiter

Depends on how you look at it; AMD's stuff is closed as well. AMD knows NVIDIA will never use things like Mantle in their GPUs, so making it open is really nothing more than a PR move, since even if NVIDIA did add it, it would probably require special hardware in new cards to support it. I am sure NVIDIA would be willing to license the tech, but AMD, like with PhysX, won't want to license it. If NVIDIA made something open and AMD used it, no one would bat an eye; if AMD made something open and NVIDIA used it, it would be massive news because they used AMD tech.

December 13, 2013 | 01:04 PM - Posted by mAxius

Sorry to rain on your parade, but NVIDIA is in the VESA display group; they could propose adding this to DP/HDMI/DVI. The sad truth is they are not, and are instead using it as a lock-in for their product stack. API wars aside, if Mantle pans out NVIDIA can adopt it, as it's an open standard. As for PhysX and CUDA, they could have run on AMD GPUs, but they are not open standards, so why adopt something proprietary? The same goes for G-Sync: it could be a great help to the gaming world as part of DP/HDMI/DVI, but it isn't; it's just another thing NVIDIA uses to push its product stack over its competitors'.

December 12, 2013 | 12:01 PM - Posted by BiG 10 (not verified)

Those of you running the Asus VG248QE (or BenQ XL2420s) must do yourselves a favor and check out LightBoost http://www.blurbusters.com/zero-motion-blur/lightboost/. On LightBoost-approved monitors you can enable buttery smooth motion with a simple, free software solution (no extra hardware needed). The best demonstration of LB vs. no-LB is this moving picture - http://www.testufo.com/#test=photo&photo=toronto-map.png&pps=960&pursuit=0. Don't stare straight at the monitor, but track the moving map with your eyes. Non-LB monitors will show a very blurry image; with LB enabled, the moving image will be crystal clear.

G-Sync sounds like a great option for those whose framerates vary between 30 and 120 FPS, but if you have a proper gaming system and can keep your frames at or above 90 FPS, you really (I can't emphasize this enough) need to experience an FPS on a LightBoosted monitor. Battlefield 4 is a completely different game.

December 12, 2013 | 01:59 PM - Posted by orvtrebor

Love the idea behind the tech, and I don't even mind the price premium, but I will not go back to a TN panel. Until there is an IPS option I'm out.

December 12, 2013 | 04:29 PM - Posted by Anonymous (not verified)

This is exactly where I stand. I'd gladly grab a pair of new UltraSharps with G-Sync.

December 12, 2013 | 04:58 PM - Posted by Ryan Shrout

SOON!  :D

December 12, 2013 | 03:01 PM - Posted by Anonymous (not verified)

Will the G-Sync upgrade module be compatible with Asus VG278H 3D Vision 2 monitors?

December 12, 2013 | 04:58 PM - Posted by Ryan Shrout

I do not know that...

December 12, 2013 | 04:26 PM - Posted by Anonymous (not verified)

Doesn't G-Sync require a Kepler-based GPU? In this case the 770 and 760 would not work for G-Sync. Only the 780 and up.

December 12, 2013 | 04:59 PM - Posted by Ryan Shrout

No, the 600-series was Kepler as well.

December 13, 2013 | 12:06 PM - Posted by renz (not verified)

When they first showed the public what G-Sync is capable of in Montreal, they were using a GTX 760. For the 600 series, some of the low-end cards were based on Fermi, but the GTX 650 and above are Kepler-based. The GT 640 has both Fermi and Kepler versions.

December 12, 2013 | 04:31 PM - Posted by Anonymous (not verified)

Why does the module have on-board memory?

Is it just another buffer "on screen" and if so, what is it buffering?

December 12, 2013 | 05:00 PM - Posted by Ryan Shrout

It uses the memory to store the most recent frame in case the frame rate drops below 30 FPS and it needs to re-scan that frame back onto the display.

December 14, 2013 | 03:55 PM - Posted by Bryan (not verified)

Why does the monitor have to re-scan a frame if the FPS drops below 30? Sorry if this has been answered elsewhere, but I thought the set-and-hold quality of lcd displays would negate the need for a minimum refresh rate.

December 15, 2013 | 11:01 PM - Posted by Ryan Shrout

Because the light of the LCD does eventually deteriorate if it isn't refreshed after a period of time.  NVIDIA found that time 
