NVIDIA Announces G-Sync, Variable Refresh Rate Monitor Technology

Subject: Graphics Cards | October 18, 2013 - 07:52 AM |
Tagged: variable refresh rate, refresh rate, nvidia, gsync, geforce, g-sync

UPDATE: I have posted a more in-depth analysis of the new NVIDIA G-Sync technology: NVIDIA G-Sync: Death of the Refresh Rate.  Thanks for reading!

UPDATE 2: ASUS has announced the G-Sync enabled version of the VG248QE will be priced at $399.

During a gaming event held in Montreal, NVIDIA unveiled a new technology for GeForce gamers that the company hopes will revolutionize the PC and its displays.  Called NVIDIA G-Sync, this new feature combines changes to the graphics driver with changes to the monitor to alter the way refresh rates and Vsync have worked for decades.

With standard LCD monitors, gamers are forced to choose between a tear-free experience by enabling Vsync or tolerating substantial visual anomalies (tearing) in order to get the best and most efficient frame rates.  G-Sync changes that by allowing a monitor to display refresh rates other than 60 Hz, 120 Hz, or 144 Hz, etc. without the horizontal tearing normally associated with turning off Vsync.  Essentially, G-Sync allows a properly equipped monitor to run at a variable refresh rate, which will improve the experience of gaming in interesting ways.

This technology will be available soon on Kepler-based GeForce graphics cards but will require a monitor with support for G-Sync; not just any display will work.  The first launch monitor is a variation on the very popular 144 Hz ASUS VG248QE 1920x1080 display, and as we saw with 3D Vision, supporting G-Sync will require licensing and hardware changes.  In fact, NVIDIA claims that the new logic inside the panel's controller is NVIDIA's own design, so you can expect this to function only with NVIDIA GPUs. 

DisplayPort is the only input option currently supported. 

It turns out NVIDIA will actually be offering retrofit kits for current owners of the VG248QE at a yet-to-be-disclosed cost.  The first retail G-Sync units will ship as a monitor plus retrofit kit, as production is running slightly behind.

Using a monitor with a variable refresh rate allows the game to display 55 FPS on the panel at 55 Hz without any horizontal tearing.  It can also display 133 FPS at 133 Hz without tearing.  Anything below the 144 Hz maximum refresh rate of this monitor will run at full speed without the tearing associated with the lack of vertical sync.
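To put rough numbers on that trade-off, here is a small illustrative sketch (my own, not NVIDIA's implementation) of how long a rendered frame waits before being displayed under fixed-refresh Vsync versus a variable-refresh panel:

```python
# Illustrative timing math: compare how long a frame effectively occupies the
# screen on a fixed-refresh Vsync display vs. a variable-refresh display.

def vsync_display_interval(render_ms, refresh_hz=60):
    """With Vsync on, a finished frame waits for the next fixed refresh tick,
    so the effective display interval rounds up to a multiple of the
    refresh period."""
    period = 1000.0 / refresh_hz
    ticks = -(-render_ms // period)  # ceiling division
    return ticks * period

def gsync_display_interval(render_ms, max_hz=144):
    """With a variable refresh rate, the panel refreshes as soon as the frame
    is ready, bounded only by the panel's maximum refresh rate."""
    min_period = 1000.0 / max_hz
    return max(render_ms, min_period)

# A frame rendered in 18.2 ms (~55 FPS):
print(vsync_display_interval(18.2))  # held two 16.7 ms ticks -> ~33.3 ms (30 FPS)
print(gsync_display_interval(18.2))  # shown after 18.2 ms -> a true 55 FPS
```

This is why a 55 FPS game feels like 30 FPS with Vsync on a 60 Hz panel, but like exactly 55 FPS on a variable-refresh panel.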

The technology NVIDIA is showing here is impressive when seen in person, and that is really the only way to understand the difference.  High-speed cameras and captures will help, but much like 3D Vision, this is a feature that needs to be seen to be appreciated.  How users will react to that roadblock remains to be seen. 

Features like G-Sync show the gaming world that, without the restrictions of consoles, there are quite a few revolutionary steps that can be taken to maintain the PC gaming advantage well into the future.  4K displays were a recent example, and now NVIDIA G-Sync adds to the list. 

Be sure to stop back at PC Perspective on Monday, October 21st at 2pm ET / 11am PT as we will be joined in-studio by NVIDIA's Tom Petersen to discuss G-Sync, how it was developed, and the various ramifications the technology will have for PC gaming.  You'll find it all on our PC Perspective Live! page on Monday, but you can sign up for our "live stream mailing list" as well to get notified in advance!

NVIDIA G-Sync Live Stream

11am PT / 2pm ET - October 21st

PC Perspective Live! Page

 

Source: NVIDIA
October 18, 2013 | 08:12 AM - Posted by Anonymous (not verified)

I could really care less about all the other bells and whistles that both AMD and Nvidia have come out with for their GPU's, all I want is for my card to give me max performance with great results on the screen. That being said, unless AMD has something planned like this, then Gsync will be a very compelling reason to get an Nvidia card and compatible monitor the next time I'm ready to upgrade in those areas.

October 18, 2013 | 08:26 AM - Posted by Mark "Dusty" D (not verified)

I'm assuming you meant you could not care less about the bells and whistles and take your statement to mean you only care about max performance, and maybe image quality by saying great results on the screen. If this is the case, your two statements are contradictory. Why would Gsync matter if all you care about is max performance?

October 18, 2013 | 09:01 AM - Posted by Anonymous (not verified)

It's called "an idiomatic expression". Some people use those and most people understand what is meant, my apologies for confusing you.
Next, you are not reading carefully. I will quote what I originally wrote to point out where you are wrong: "all I want is for my card to give me max performance with great results on the screen." I did not use the word "MAYBE" and I did not say "I only care about max performance". What I did say is that I want max performance with great results on the screen which ARE NOT contradictory when you consider what Gsync is bringing to the table.

October 18, 2013 | 09:08 AM - Posted by Anonymous (not verified)

Dear Mr. Snarky,

The idiomatic expression is "I COULDN'T care less".

You're welcome.

October 18, 2013 | 09:28 AM - Posted by Anonymous (not verified)

In your case it's an "idiotic expression".

October 18, 2013 | 08:13 AM - Posted by Anonymous (not verified)

I wonder how close they came to naming this technology as "Nsync" as opposed to "Gsync", lol.

October 18, 2013 | 08:24 AM - Posted by Anonymous (not verified)

So is Nvidia, by announcing G-Sync, also calling into question how they deliver frames to the monitor and how F-CAT is being used?

October 18, 2013 | 08:34 AM - Posted by Mark "Dusty" D (not verified)

It would make current F-CAT technology ineffective, but frame time would still be an important metric. This would be similar to how Android phones update their screen: the panel does not constantly update; it only updates when the display buffer has a full frame. This means if your GPU pipeline can only fill the frame buffer with full frames at 14 FPS, you will refresh the screen at 14 FPS, in contrast to the panel refreshing at 60 Hz regardless of whether the frame buffer has a complete image or not (runt frames).
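The refresh-on-complete-frame behavior described above can be sketched with a toy model (function names are my own and purely illustrative):

```python
# Toy model of the two refresh policies: a fixed-rate panel scans out every
# tick whether or not a new frame is ready, while a variable-rate panel
# refreshes exactly once per completed frame, so its rate tracks the GPU's.

def refresh_times_fixed(refresh_hz=60, duration_ms=500):
    """Fixed panel: one refresh per tick, repeating stale buffers (or showing
    runt frames) when the GPU hasn't finished a new frame in time."""
    count = int(duration_ms * refresh_hz / 1000)
    period = 1000.0 / refresh_hz
    return [round(n * period, 2) for n in range(1, count + 1)]

def refresh_times_variable(frame_done_ms):
    """Variable panel: exactly one refresh per completed frame, when it lands."""
    return list(frame_done_ms)

# GPU pipeline completing frames at 14 FPS (one every ~71.4 ms) for 500 ms:
gpu = [round(i * 1000.0 / 14, 2) for i in range(1, 8)]
print(len(refresh_times_fixed()))        # 30 refreshes -> fixed 60 Hz scanout
print(len(refresh_times_variable(gpu)))  # 7 refreshes -> panel runs at 14 Hz
```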

October 18, 2013 | 09:30 AM - Posted by Anonymous (not verified)

The only reason you would need this is if your GPU setup isn't in sync. Since it's proprietary to Nvidia only, it's telling me they currently have an issue and want to remedy it.

So it calls into question what's going on with F-CAT that we don't know. Maybe this is why it wasn't widely adopted by reviewers who got offered the F-CAT kits.

After all, if you're better than your competition in frame pacing, you don't need this. Just let your competition hang itself.

This just stinks.

Do they really want to get into the business of selling Nvidia cables next ?

October 18, 2013 | 09:45 AM - Posted by Allyn Malventano

I think you need to understand the significance of this before you outright dismiss the tech. This is groundbreaking once you get your head around it.

October 18, 2013 | 10:00 AM - Posted by Anonymous (not verified)

I'm try'n, but you guys have been telling me Nvidia's are the best at this for a few months, and now the CEO says I'll have to buy a new monitor with this new tech to get rid of stuttering and tearing on my Kepler card.

So who's lying to me?

October 18, 2013 | 11:07 AM - Posted by LukeM89

No one is lying to you, sir. The issue Nvidia has is the same issue AMD is faced with: limited LCD refresh rates inherited from the CRT interface. Please read this great article by Ryan Shrout that explains nicely the implications of such technology, regardless of whether it is proprietary or not.

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-Death-Refresh-...

October 18, 2013 | 11:14 AM - Posted by nobody special (not verified)

This is giving you the best of both worlds. Nobody is lying just adding better stuff. They were already great at not stuttering. This adds not TEARING also at the same time. AMD seems about to catch the stuttering problem so time to up their game again...LOL.

No complaints here, as AMD will have to match it at some point, so we'll get better from both sides over time. But with AMD's funding problems we won't likely see an answer until next Xmas from AMD. Oh well, in the end it all improves. Now AMD knows what to do...ROFL. You should be thanking god at least ONE side of this duo has the money to R&D fixing PC problems so the other side can figure it out too eventually ;) If NV starts making what AMD does as fools keep asking for lower pricing and more free games, we'll end up with problems that last forever.

Watch for the 200mil payment to Global Foundry next month to eat AMD's meager profit this Q and next Q's profits too. If they'd stop giving out free games and quit lowering pricing maybe they'd make some real money.

October 18, 2013 | 11:23 AM - Posted by LukeM89

^ well said ^

October 18, 2013 | 10:09 AM - Posted by bystander (not verified)

Actually, this tech would make FCAT just as important, if not more so. This tech will make your monitor's refresh sync with your GPU's frame delivery times. Having consistent delivery times will make everything look smoother, and prevent refresh times from being too short for the display to keep up, or so long that they feel like a delay.

FCAT will still matter. Frame latency will not become less important.

October 18, 2013 | 08:27 AM - Posted by Anonymous (not verified)

Oh wow I actually have that monitor...can't wait for that retrofit kit thing.

October 18, 2013 | 08:57 AM - Posted by Matt (not verified)

Even as a lifelong ATI/AMD user, this tech really makes me give nvidia a second thought. I hate tearing!

October 18, 2013 | 09:11 AM - Posted by Paul Keeble (not verified)

This is great stuff. It's not just gaming that will benefit; it's films as well. We won't need to do the horrid things we currently do to get a ~24 fps source to display well on a 60/120 Hz screen; instead we can just run it at 24 Hz.

Eliminating tearing, stutter, and lag in one technological sweep, this is bigger news and a more useful change than pretty much anything else. I really, really hope everyone else can use this standard so we can finally do away with Vsync and the problems it brings in the digital monitor age.
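For reference, the "horrid thing" usually done to fit ~24 fps film onto a 60 Hz panel is 3:2 pulldown, where film frames are alternately held for three and two refreshes. A quick sketch of that cadence (my own illustration):

```python
# 3:2 pulldown cadence: with 60 refresh ticks per second and 24 film frames,
# each frame gets 2.5 ticks on average, so frames alternate between 3-tick
# and 2-tick holds -- the uneven hold times are what cause judder.

def pulldown_pattern(film_fps=24, refresh_hz=60, frames=6):
    """How many refresh ticks each film frame occupies under simple pulldown."""
    pattern, ticks_used = [], 0
    for i in range(1, frames + 1):
        # nearest refresh tick to this frame's ideal end time (round half up)
        end_tick = int(i * refresh_hz / film_fps + 0.5)
        pattern.append(end_tick - ticks_used)
        ticks_used = end_tick
    return pattern

print(pulldown_pattern())  # [3, 2, 3, 2, 3, 2]: uneven 50 ms / 33 ms holds
```

A variable-refresh panel sidesteps this entirely by holding every frame for the same 41.7 ms (24 Hz) interval.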

October 18, 2013 | 09:18 AM - Posted by Anonymous (not verified)

Let's see:

1. Proprietary technology.
2. Requires custom monitor design... again proprietary.
3. Only works over a single connection tech.

As amazing as it may be, it's yet another technology that won't have an impact except in a small group of extreme gamers. Are you going to ditch your monitors and buy a new, not yet released card just for this? And yet, even if it will be used only by a niche market, I expect that from now on, in every future NVidia card review, we will hear the marketing blurb and how awesome it all really is.

Our only hope is that there won't be a blocking patent on this and that other players (Intel, AMD, ...) come up with an open standard instead.

October 18, 2013 | 10:49 AM - Posted by Homeles (not verified)

At least Nvidia isn't being hypocritical by releasing a proprietary technology. For them to release another technology along the lines of PhysX and CUDA shouldn't surprise anyone.

AMD, on the other hand, has done something very hypocritical in regards to open vs. closed standards. Typically, they've been the banner bearer for open standards by pushing for things like DisplayPort and HSA. Now they're pushing Mantle, which is a complete about-face from everything they've stood for in the past.

October 18, 2013 | 02:09 PM - Posted by arbiter

Mantle requires support to be built into the game; this tech is in the monitor, so it should work with all games.

October 18, 2013 | 09:19 AM - Posted by Anonymous (not verified)

I really hope this will become an open standard sooner rather than later. As long as it's tied to new Nvidia cards and a few supported monitors it's still nothing more than what they have going with 3D Vision monitors. In fact I'm pretty sure the only monitors that will have G-sync are those 3D Vision supported monitors. I'd much rather see this in 2560x1440 IPS rather than 1080p TN that already has 120/144 support which alleviates some of the tearing and stutter issues.

October 18, 2013 | 09:44 AM - Posted by Bobsrt (not verified)

I have the Overlord Tempest 1440 IPS 120hz panel and just read on their forums about all this. Seems Scribby may be getting something going with GSync and IPS?

I would LOVE this on my display. TN at 1080 is old news for me. Only 1440 at 120hz IPS now.

October 18, 2013 | 07:15 PM - Posted by Anonymous (not verified)

Oh God, here comes the fucking Korean monitor overclocking brigade, you guys are worse than Jehovas witnesses.

Look man, I'm happy you like your new toy or whatever, but don't come shitting on TN from your mighty 300-400 dollar throne, where your average pixel transition time exceeds your frame persistence time. At 120hz, you're refreshing a new frame about once every 8.3 ms, that IPS panel, like just about every IPS panel in the world, refreshes in about 10ms average, and potentially much worse depending on the color transition. You are LITERALLY spending the entire time watching pixel transitions, you never get to see them in their final state because the typical transition time on that screen is longer than the time the frame has before the next one gets refreshed.

LCDs smear the living fucking SHIT out of anything in motion. Just look at a fast 144Hz TN monitor and see how much better it gets when you enable that LightBoost stuff in 2D mode. Your IPS is an order of magnitude slower vs that overdriven TN panel (10ms vs 1ms), so if the 144Hz screen smears, what's a word to describe what yours does?

TN will be "old news" when something legitimately better comes along for gaming. IPS panels are great on phones, ipads, and monitors used for coding, web surfing, and static pictures. It's not the Jesus technology you think it is. Not for gaming or anything involving motion, anyway.
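The persistence arithmetic in the comment above is easy to check. The response-time figures (10 ms IPS, 1 ms overdriven TN) are the commenter's claims, used here only for illustration:

```python
# Check: at 120 Hz each frame persists ~8.3 ms, so a panel whose average
# pixel transition takes ~10 ms never fully settles before the next frame
# arrives, while a ~1 ms panel settles with time to spare even at 144 Hz.

def frame_persistence_ms(refresh_hz):
    """How long each frame stays on screen before the next refresh."""
    return 1000.0 / refresh_hz

def settles_before_next_frame(transition_ms, refresh_hz):
    """True if the pixel transition completes within one frame interval."""
    return transition_ms <= frame_persistence_ms(refresh_hz)

print(round(frame_persistence_ms(120), 1))   # 8.3 ms per frame at 120 Hz
print(settles_before_next_frame(10.0, 120))  # ~10 ms IPS -> False
print(settles_before_next_frame(1.0, 144))   # ~1 ms overdriven TN -> True
```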

October 25, 2013 | 08:55 AM - Posted by mdrejhon

Easy now. Although I love my LightBoost, let's take it easy on those people who've never seen strobe backlights. Just refer them to various internet sources about strobe backlights (e.g. TFTCentral's article about how strobe backlights can reduce motion blur to CRT quality -- www.tftcentral.co.uk/articles/content/motion_blur.htm).

More good news -- G-SYNC has an NVIDIA-sanctioned strobe backlight (a sequel to LightBoost). Mainstream awareness of strobe backlights will increase during 2014, and there will be multiple brand names of similarly-performing strobe backlights other than "lightboost".

October 18, 2013 | 09:53 AM - Posted by MAXLD (not verified)

This will heavily depend on the monitor brands. If they put this stuff on the majority of the monitor lineup, this will definitely pick up a rapid pace and AMD will be in trouble. If they choose to milk it and put it on a few specific expensive gaming models, like the ones we have now: questionable quality TN panels, low ms,... then people will have to eat what Asus and other brands want them to eat, not what they choose to... and that will end up being just another restricted "niche" market like the 3D, PhysX, etc...

PC gaming is going forward with latest AMD and nVidia new stuff, but it's becoming even more segmented, and that's bad news for the consumer... when the time comes to choose new gpus, he will have to make several sacrifices depending on the brand. And when not only he has to decide between gpus, but also specific/limited monitors... that's not good at all, specially if the "niche" strategy is confirmed and he has to buy something he doesn't feel that is the overall best choice.

I don't see AMD doing the same, much less brands having monitors with double tech integrated, or same models with different chips inside. Maybe one day, just like SLi/CFX boards, but that's a very long way to go and nVidia wouldn't allow it anyway while it has the upperhand. Besides, AMD is still struggling with the now "basic" frame pacing issues, they probably aren't very near to come up with their own version of this.

In general, a good day for tech, a bad day for the unbiased consumer. Let's at least just hope monitor companies do the right thing and shower that thing into their lineups (doubt it, but anyway...).

October 18, 2013 | 03:09 PM - Posted by Anonymous (not verified)

I would rather have seen further development of CRTs. It's sad that a 13 year old FW900 still outperforms any modern LCD. At least this technology should be compatible with FED/SED displays in the small chance that they ever make it to market.

October 25, 2013 | 08:50 AM - Posted by mdrejhon

> I would rather have seen further development of CRTs. It's
> sad that a 13 year old FW900 still outperforms any modern
> LCD. At least this technology should be compatible with
> FED/SED displays in the small chance that they ever make it
> to market.

While colors/blacks are better on CRT, strobe backlights have finally allowed LCD to have less motion blur than CRT. Google "lightboost".

G-SYNC is also reported to include a fixed-rate strobe backlight (a LightBoost sequel) that is superior to LightBoost and is officially sanctioned by nVidia:
http://www.blurbusters.com/confirmed-nvidia-g-sync-includes-a-strobe-bac...

October 18, 2013 | 11:34 PM - Posted by Anonymous (not verified)

So I buy a GPU
Then I buy a N-SYNC monitor

http://www.geforce.com/hardware/technology/g-sync/faq

Q: Does NVIDIA G-SYNC work for all games?

A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.

After spending all that money, I still might find that my favorite games or future games aren't compatible.
