PCPer Live! NVIDIA G-Sync Discussion with Tom Petersen, Q&A

Subject: Graphics Cards, Displays | October 20, 2013 - 02:50 PM |
Tagged: video, tom petersen, nvidia, livestream, live, g-sync

UPDATE: If you missed our live stream today that covered NVIDIA G-Sync technology, you can watch the replay embedded below.  NVIDIA's Tom Petersen stops by to talk about G-Sync in both high-level and granular detail while showing off some demonstrations of why G-Sync is so important.  Enjoy!!

Last week NVIDIA hosted press and developers in Montreal to discuss a couple of new technologies, the most impressive of which was NVIDIA G-Sync, a new monitor solution that looks to settle the eternal trade-off between smoothness and latency.  If you haven't read about G-Sync and how impressive it was when first tested on Friday, you should check out my initial write-up, NVIDIA G-Sync: Death of the Refresh Rate, which not only covers that first experience but also dives into why the technology shift was necessary in the first place.

G-Sync essentially functions by altering and controlling the vBlank signal sent to the monitor.  In a normal configuration, vBlank is a combination of the vertical front and back porch and the necessary sync time.  That timing is set at a fixed stepping that determines the effective refresh rate of the monitor: 60 Hz, 120 Hz, etc.  What NVIDIA will now do in the driver and firmware is lengthen or shorten the vBlank signal as desired and send it when one of two criteria is met (see the sketch after this list).

  1. A new frame has completed rendering and has been copied to the front buffer.  Sending vBlank at this time tells the screen to grab data from the card and display it immediately.
  2. A substantial amount of time has passed and the currently displayed image needs to be refreshed to avoid brightness variation.
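
To make those two conditions concrete, here is a minimal Python sketch of the decision loop described above. The function names, the 144 Hz panel cap and the 30 Hz minimum-refresh figure are illustrative assumptions, not NVIDIA's actual driver or firmware logic.

```python
import time

# Illustrative constants -- assumptions for this sketch, not NVIDIA's values.
MIN_FRAME_INTERVAL = 1.0 / 144   # the panel cannot refresh faster than its max rate
MAX_HOLD_TIME      = 1.0 / 30    # assumed longest time a frame may persist before
                                 # the panel must be refreshed to avoid brightness drift

def gsync_loop(frame_ready, send_vblank, resend_last_frame):
    """Toy model of the two conditions above.

    frame_ready()       -> True once a new frame sits in the front buffer
    send_vblank()       -> tells the panel to grab and display the new frame
    resend_last_frame() -> refreshes the panel with the image it already has
    """
    last_refresh = time.monotonic()
    while True:
        elapsed = time.monotonic() - last_refresh
        if frame_ready() and elapsed >= MIN_FRAME_INTERVAL:
            # Condition 1: a finished frame is waiting -- display it immediately.
            send_vblank()
            last_refresh = time.monotonic()
        elif elapsed >= MAX_HOLD_TIME:
            # Condition 2: too long since the last update -- redraw the old image
            # so the panel's brightness does not vary.
            resend_last_frame()
            last_refresh = time.monotonic()
```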

In current display timing setups, the submission of the vBlank signal has been completely independent from the rendering pipeline.  The result was varying frame latency and either horizontal tearing or fixed refresh frame rates.  With NVIDIA G-Sync creating an intelligent connection between rendering and frame updating, the display of PC games is fundamentally changed.

Every person that saw the technology, including other media members and even developers like John Carmack, Johan Andersson and Tim Sweeney, came away knowing that this was the future of PC gaming.  (If you didn't see the panel that featured those three developers on stage, you are missing out.)


But it is definitely a complicated technology and I have already seen a lot of confusion about it in our comment threads on PC Perspective.  To help the community get a better grasp and to offer them an opportunity to ask some questions, NVIDIA's Tom Petersen is stopping by our offices on Monday afternoon where he will run through some demonstrations and take questions from the live streaming audience.

Be sure to stop back at PC Perspective on Monday, October 21st at 2pm ET / 11am PT to discuss G-Sync, how it was developed and the various ramifications the technology will have for PC gaming.  You'll find it all on our PC Perspective Live! page on Monday, but you can sign up for our "live stream mailing list" as well to get notified in advance!

NVIDIA G-Sync Live Stream

11am PT / 2pm ET - October 21st

PC Perspective Live! Page


We also want your questions!!  The easiest way to get them answered is to leave them for us here in the comments of this post.  That will give us time to filter through the questions and get the answers you need from Tom.  We'll take questions via the live chat and via Twitter (follow me @ryanshrout) during the event, but oftentimes there is a lot of noise to deal with.

So be sure to join us on Monday afternoon!

 

October 20, 2013 | 03:06 PM - Posted by wargames12

A couple questions:

Will using the variable refresh rate be similar to Vsync in terms of possibly causing input lag, and other things like degrading performance? Is this something we're going to only see people with high end hardware using, or can low-mid range cards still benefit from it without sacrificing performance?

Will there still be screen tearing when your fps goes above the max refresh rate?

October 20, 2013 | 03:07 PM - Posted by 1STANCESTOR (not verified)

I saw somewhere that a G-Sync add-on may be possible for some monitors. My question is: How does the add-on integrate with monitors, and will it work on HDTVs?

October 20, 2013 | 03:16 PM - Posted by Silviu (not verified)

So far, NVIDIA has a great track record of providing support in their Linux drivers for current GPUs. Will the Linux driver have support for G-Sync?

October 20, 2013 | 04:14 PM - Posted by Nilbog

Has there been any discussion about licensing this technology to other companies (Intel, AMD, etc.), to benefit the industry and the gaming experience as a whole?

If not, can this discussion start? Please?

Clearly this would be beneficial for everyone, and would make NVIDIA more money in the long run as well.

October 20, 2013 | 04:15 PM - Posted by Joker (not verified)

I just recently purchased the Asus VG248QE. My question is how difficult will it be for me to get the module itself from retail and install it? Also how much will the module by itself cost?

October 20, 2013 | 04:55 PM - Posted by ThreeTwoBravo (not verified)

+1

October 20, 2013 | 05:39 PM - Posted by Ophelos

To answer your last question. It'll cost $175 for the DIY kit.
http://www.geforce.com/hardware/technology/g-sync/faq

October 26, 2013 | 06:16 PM - Posted by Panta

So, to play the game the way it's meant to be played,
all we need is
$650 for a 780,
$175 for G-Sync,
$270 for an ASUS VG248QE
= holy Josh, that's getting ridiculously expensive.

October 20, 2013 | 04:17 PM - Posted by Adam T (not verified)

Hi, Tom. Now that G-Sync unshackles us from the 30/60/120 fps v-sync targets and the stuttering effect experienced when dropping from a higher v-sync setting to a lower setting, have you and your team been able to determine a frame-rate "sweet spot" for an optimal G-Sync enhanced gaming experience after which additional fps could be considered diminishing returns?

As a corollary, can you describe how noticeable the fluidity difference is in the subjective gaming experience between a G-Sync'ed ~50fps experience, a G-Sync'ed ~70fps experience and a G-Sync'ed ~100+fps experience?

Thanks Ryan and Tom.

October 20, 2013 | 04:41 PM - Posted by Anonymous (not verified)

Can you please ask about 3D Vision and G-Sync? I love my Asus VG278 LightBoost 120Hz 3D Vision monitor.

Not sure if the module will eventually be available for this model. I think it will be worth the upgrade. I hope the 3D is improved and not hindered. 3D gaming is amazing.

October 20, 2013 | 05:03 PM - Posted by Anonymous (not verified)

Or for the VG236H. A little older but still 120Hz 3D Vision. Would really like to see the DIY kit for this!

October 20, 2013 | 05:57 PM - Posted by CrazyRob (not verified)

I'm also interested in G-sync's interaction with 3D Vision. Although I love my 3d Vision Surround setup, one of my complaints is the induced lag by the requirement of vsync in 3D mode. If g-sync works with 3D Vision, I'll gladly upgrade my monitors.

October 21, 2013 | 11:41 AM - Posted by MABManZ (not verified)

Just adding I would also like to know more details about G-Sync and 3D Vision compatibility. I'm currently using an Acer HN274H but I'd definitely consider upgrading to a G-Sync, Lightboosted setup if it's worth it.

October 22, 2013 | 05:18 AM - Posted by stereoman (not verified)

Very interesting video. What I gathered is that G-Sync-capable displays can also work in 3D Vision mode; however, I am not sure you can run G-Sync and 3D Vision at the same time, as the glasses need to flicker at 60 Hz per eye, hence the 120 Hz requirement for 3D Vision to work. The way G-Sync works is that it slows down the refresh rate of the monitor to match the FPS of the graphics card, and if that's the case the glasses would also need to have a variable refresh. I hope I'm wrong, because this technology looks amazing and I would love to see 3D Vision with G-Sync enabled.

October 20, 2013 | 04:43 PM - Posted by The Colors Guy (not verified)

Tom, what is your favorite color?;)

October 20, 2013 | 04:46 PM - Posted by Anonymous (not verified)

In the NVIDIA G-SYNC FAQ it says:

"Q: Does NVIDIA G-SYNC work for all games?

A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver."

Can a list of compatible games be provided?

I don't want to be left in the dark as far as compatibility with my favorite games or certain genres.

When can we see an actual in-game action sequence with this technology demoed? Something like Borderlands 2, Hawken or Metro: Last Light with PhysX effects and particles flying everywhere, to see how smooth it is.

On the DIY kit, G-SYNC disables all other ports (VGA & HDMI); it also disables audio through the DisplayPort. Why is that?
Will it be DisplayPort only?
Will monitors not have the ability to pass through audio?
Will using G-Sync on monitors that already have G-Sync disable all other ports including audio, or will G-Sync monitors have DisplayPort connectors only?

When will IPS panels have this?

October 20, 2013 | 04:53 PM - Posted by arka (not verified)

Why can't graphics cards less powerful than the GTX 660 Ti use G-Sync?
Will there be any mid-range [less than 1920x1080, 120Hz] G-Sync supported monitors?

October 20, 2013 | 05:55 PM - Posted by Randomoneh

Is the image on the screen still being updated from top to bottom or is every pixel being updated at the same time?

Thank you.

October 20, 2013 | 06:49 PM - Posted by Randomoneh2 (not verified)

Is the image on the screen still being updated from top to bottom or is every pixel being updated at the same time?

Let's say the former is the case, right? (1) The card is rendering a very intensive frame (from top to bottom, I guess). (2) The frame has been rendered completely (from top to bottom). (3) The refresh cycle has begun and the frame is now being drawn onto the screen. We can expect it to be completely drawn in about 16.6 ms.

October 20, 2013 | 06:59 PM - Posted by Randomoneh

Something is wrong either with my connection or with the PCPer commenting system.

Here's my question in one piece:

Is the image on the screen still being updated from top to bottom or is every pixel being updated at the same time?

Let's say the former is the case, right? (1) The card is rendering a very intensive frame (from top to bottom, I guess). (2) The frame has been rendered completely (from top to bottom). (3) The refresh cycle has begun and the frame is now being drawn onto the screen. We can expect it to be completely drawn in about 16.6 ms.
But wait! As soon as (2) happened, a new frame starts being rendered. Only this time the frame depicts a light scene and rendering is done in ~8 ms. I would love to have insight into what new has happened in this virtual world I'm simulating, and this new frame might have some information (enemy position or something similar) that I would be interested in knowing, even if I could see just the bottom half of this most current frame. Not to mention the input-to-output lag would be lower if I were to see this new frame.

So, I want this new frame because I want to see THE most current event in the scene. Am I out of luck and will I have to wait for the previous frame to be completely drawn onto my screen, or what?
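
To put rough numbers on that scenario, here is a small sketch using only the figures assumed above (a 16.6 ms top-to-bottom scan-out and an ~8 ms render for the lighter frame); these are illustrative values, not measurements.

```python
# Back-of-envelope timeline for the scenario above (all numbers assumed).
SCANOUT_MS      = 16.6   # time to draw one frame onto the panel, top to bottom
LIGHT_RENDER_MS = 8.0    # render time of the next, lighter frame

scanout_starts   = 0.0                                  # heavy frame begins scanning out
light_frame_done = scanout_starts + LIGHT_RENDER_MS     # newer frame ready at ~8 ms
scanout_ends     = scanout_starts + SCANOUT_MS          # panel finishes at ~16.6 ms

# The newer frame is ready well before the panel finishes drawing the old one,
# so the question is whether it must wait for the current scan-out to complete.
print(f"new frame would wait up to {scanout_ends - light_frame_done:.1f} ms")
```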

Thank you.

October 20, 2013 | 07:54 PM - Posted by Anonymous (not verified)

So we all know G-Sync will be great for games, but what about movies? The current standard is 24 fps; can I match the refresh rate to the movie? Jen-Hsun Huang mentioned judder on TVs, so your team must have known of this issue.

Also, will G-Sync support TVs in the future, and is that a possible way to kill the NTSC and PAL framerate difference?
When I say support TVs, I also mean the cable companies that show movies, or maybe allowing sports to be shown at a higher fps?

Finally, my last question: as I understand it, you need hardware in the monitor to support G-Sync/VRR and you need GPUs that have the hardware feature as well, which is the reason only some Kepler GPUs support G-Sync. Why not push all of the hardware requirements from the GPU side onto the monitor's G-Sync board? Wouldn't this allow older NVIDIA cards to use G-Sync, and maybe give other hardware manufacturers a chance?

October 20, 2013 | 09:41 PM - Posted by Desensitized Lemons

Q. Is there any chance that NVIDIA could come out with a fully branded 144Hz NVIDIA monitor that is completely bezel-less, with the G-Sync technology onboard? If so, I would buy three monitors, because lately I have been itching to try out NVIDIA Surround but I just can't get over those black bezels that are in the way. To push those puppies I would run three 780s in SLI, or maybe three $$ Titans $$.

On behalf of high-end enthusiasts, I would like to say thank you, NVIDIA, for pushing out new technologies; up until this point monitors were getting kind of stale.

October 20, 2013 | 11:00 PM - Posted by Anonymous (not verified)

Will the Quadro K600 support G-Sync? How about other OSes (e.g. Mac OS X, FreeBSD, Solaris)?

October 21, 2013 | 12:31 AM - Posted by Anonymous (not verified)

According to their website, they have been doing something similar for a couple of years with Quadros, but not at the monitor level.

NVIDIA Quadro Sync and G-Sync
http://www.nvidia.com/object/nvidia-sync-quadro-gsync.html

October 20, 2013 | 11:02 PM - Posted by NavyChief (not verified)

I just purchased the ASUS VG278HE model and would like to know if NVIDIA has plans to expand the upgrade kit offering to include this model.

October 21, 2013 | 01:31 AM - Posted by blitzio

I recently purchased a 27" BenQ gaming monitor, will this have a mod kit as well down the line?

October 21, 2013 | 03:13 AM - Posted by Anonymous (not verified)

Will G-Sync work with NVIDIA Surround?
Will IPS panels have G-Sync? Any type of IPS?
Any 4K panels with G-Sync in the near future?
Will only a DP connection work? If so, how will you connect an NVIDIA Surround setup in order to use G-Sync?

October 21, 2013 | 04:32 AM - Posted by Anonymous (not verified)

I want to know how online gaming and timing/lag are affected.

If I'm casting a spell or shooting at someone running across pillars, will I have to wait for G-Sync to render the scene smoothly,
while users without G-Sync have a competitive advantage over me and I'll probably get owned by the time G-Sync smooths out the animations?

Fractions of a sec between being alive or dead.

How much power will G-Sync add to monitor usage?

October 21, 2013 | 05:36 AM - Posted by Vincent_Lynx (not verified)

Since G-Sync provides more "safety" on lower and mid-range gaming hardware when it comes to graphics quality / tearing etc.:

Are there already plans / confirmations from TV manufacturers to implement G-Sync on their screens?

(My guess: especially for "living room gaming appliances/consoles/PCs/HTPCs..." that are usually not that high-end, it would be great to have a "reserve" of gaming power. In other words: won't these gaming solutions benefit most from G-Sync?)

October 21, 2013 | 06:29 AM - Posted by Anonymous (not verified)

I saw on the NeoGAF forum that a rep from NVIDIA stated that G-Sync will give 3D Vision superior performance, so no worries boys, NVIDIA will not give up on 3D :)

But please ask more about how G-Sync will work, and tell him that the 3D Vision community feels very cheated, as NVIDIA doesn't answer even one reply on their own forum, so we feel left out in the cold.

So what is the future for NVIDIA 3D and 3D Vision + G-Sync? Ask him that for us 3D gamers.

October 21, 2013 | 06:35 AM - Posted by Oskar (not verified)

Dear Tom,

G-Sync seems great.
Any updates on dual-DVI tiled monitor support on GeForce?

Thank you

Oskar

October 21, 2013 | 08:08 AM - Posted by Amplitude (not verified)

Can G-Sync be used when LightBoost is enabled in 2D mode?
If not, do you have any plans for this?

October 21, 2013 | 08:09 AM - Posted by Xander R (not verified)

I've been using LightBoost lately to play games in 2D (not stereoscopic 3D). It makes such a huge difference in the perception of blur in the image that I simply cannot play without it. But it does make anomalies like stutter and tearing even more noticeable (at least to my eyes), making it necessary to use V-sync to lock in the fps to get the best effect. So my question is this: how will G-Sync work in conjunction with LB? Will they be separate features, or will they work in harmony with one another (i.e. the strobing rate of LB will dynamically change with the framerate, similar to the behavior of G-Sync with the refresh rate)? I personally am hoping for the latter, of course. Oh, and just when will LB for 2D gameplay become a native feature in the control panel software?

October 21, 2013 | 09:17 AM - Posted by Peter Nyman (not verified)

Hi! I'm a flight simmer and use Microsoft's Flight Simulator (FS2004 and FSX). I have the option to lock the fps in-game to anything between 0-90 fps. How will this play along with G-Sync? FSX especially is a demanding piece of software and usually gives fps between 30-60, so using 1/2 refresh rate vsync (30Hz) and limiting to 30 fps in-game has been the best solution so far.

October 21, 2013 | 10:05 AM - Posted by Joker

How labor intensive will it be to install the G-Sync module in current Asus VG248QE monitors?

October 21, 2013 | 10:17 AM - Posted by Janisku7

Can a 4K monitor have 3D Vision support and a G-Sync module too?

October 21, 2013 | 10:35 AM - Posted by Randal_46

Is the DIY G-SYNC kit usable on other monitor types? If not, are there any plans to release upgrade kits for the most common higher-end brands (Dell UltraSharp, etc.)?

Monitors are infrequently replaced so I would think it would be in NVIDIA's best interest to release upgrade kits for the most popular monitor brands to encourage adoption, unless this is cost-prohibitive or the display companies are planning to gouge customers with new G-SYNC monitors.

October 21, 2013 | 10:45 AM - Posted by Anonymous (not verified)

I second this:

"But please ask more about how G-Sync will work and tell him that the 3d vision comunity feels very cheated as nvidia dont answer even one reply over their own forum so we feel left out in hte cold.

So what the future for nvidia 3d and 3d vision + G-Sync ask him that for us 3d gamers."

Please tell the boys at Nvidia that 3Dvision gaming is alive and kicking. Having good 3D support is a major factor in my decision to buy a certain game or not.

Why do they not even mention 3Dvision in their presentations at all?

So please ask them about the future of 3D vision.

thanks

October 21, 2013 | 11:17 AM - Posted by Pirateguybrush (not verified)

I'd like to know more about nVidia's plans for ongoing 3d vision support, as the number of 3d-ready titles has been in sharp decline. Why is it that the community are able to fix a lot of games themselves, but the experts at nVidia can't improve support?

October 21, 2013 | 11:33 AM - Posted by Desensitized Lemons

NVIDIA, please stop making and putting more resources into NVIDIA Shield. Nobody wants that thing; it was a complete waste of time and money. What the high-end enthusiast wants is a completely bezel-less monitor. A monitor of this type would push enthusiasts to purchase more than one monitor, and most likely more than one video card $$$

October 21, 2013 | 12:15 PM - Posted by Zuu111

I'm curious to know if Nvidia have their sights on just revising TN panels that are favoured by gamers, or if, down the line, this would be implemented in newer generations of IPS panels, 1440 and 1600 sized monitors, emerging 4k monitors, etc.? How "ready" is G-Sync to be implemented in, say, larger IPS panels? Is this a "Qx 2014" eventuality, or something more vaguely down the line? Makes me wonder if, eventually, this innovation might expand beyond Nvidia proprietary technology.

October 21, 2013 | 12:37 PM - Posted by Anonymous (not verified)

official launch dates?

October 21, 2013 | 12:43 PM - Posted by Anonymous (not verified)

I'd like to know where 3D Vision is going as well. Does G-Sync's mentioned improvement for 3D Vision mean they'll keep supporting/improving it?

October 21, 2013 | 02:00 PM - Posted by me (not verified)

1) Will we be able to see the difference through the stream?
Since my monitor and the recording technology still sample at 30 or 60 fps, will we really be able to see the benefits? Even the videos posted online were not entirely representative.

2) If I use this in Surround, can I have only the main central monitor using G-Sync and the other two as regular monitors, since they are really only in my peripheral vision anyway?

October 22, 2013 | 03:26 AM - Posted by snook

this is the second coming of physx.

don't ever expect this to work with AMD cards.

I now swear to allah or someone like him.
I will never buy Nvidia...ever.

October 22, 2013 | 04:43 AM - Posted by Bob (not verified)

Although I am excited for this tech, I don't like that they use double buffering as an example, because a superior solution - triple buffering - has existed for some time.

October 22, 2013 | 07:21 AM - Posted by Bob (not verified)

Sorry, wrote this before I saw this is actually addressed. Thanks, Tom.

October 29, 2013 | 05:47 AM - Posted by Anonymous (not verified)

Great to see them trying to implement change... but two points:

1. The need for a module would be removed if scan-out were done instantaneously... no tearing, so no need for vsync or its associated lag or stutter... but I guess this is something out of NVIDIA's control, so they are trying to effect change the only way they can.

2. Vsync solves tearing but introduces stutter, and the size of the stutter is only a portion of the frame refresh time, i.e. part of 16 ms for a 60Hz panel.
So using a high refresh rate panel would reduce this stutter, to a maximum of 7 ms on a 144Hz panel... this is surely getting into the realm of being unnoticeable anyway. Just bring out panels with even higher refresh rates and use vsync??
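
A quick back-of-envelope check of those figures, as a sketch: under vsync the worst-case added wait for a missed frame is roughly one refresh period.

```python
# Worst-case extra delay under vsync is about one refresh period.
for hz in (60, 120, 144):
    print(f"{hz} Hz panel: up to {1000 / hz:.1f} ms of added stutter")

# 60 Hz panel: up to 16.7 ms of added stutter
# 120 Hz panel: up to 8.3 ms of added stutter
# 144 Hz panel: up to 6.9 ms of added stutter
```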

December 7, 2013 | 04:31 AM - Posted by Anonymous (not verified)

Please answer this question: when did the NVIDIA engineers first have the idea to invent G-SYNC? It is important to me.
