
NVIDIA G-Sync: Death of the Refresh Rate

Author: Ryan Shrout
Manufacturer: NVIDIA

Introduction of LCD Monitors

CRT displays have been used with computers since the very earliest days, when they replaced the arrays of light bulbs once used for information output.  In the 1980s, though, the CRT went through a revolution as its capabilities improved from monochrome to 16 colors and resolutions of 640x350 (EGA) were reached.

LCD (liquid crystal display) monitors were first introduced to consumers in the mid-1990s at high prices and with relatively low image quality compared to the CRT displays of the time.  LCD manufacturers were forced to integrate a completely new display technology into the existing display controller infrastructure that was dominated by the CRT.  Part of that integration was the adoption of a refresh rate, even though from a technical point of view LCDs function very differently.

LCDs produce an image on the screen through the combination of a persistent backlight (CCFL or LED) and voltage applied directly to each pixel.  These pixels can be updated without interference from or with other segments of the display, in any order and at any rate.  There is also no need to refresh the pixels of an LCD to work around the CRT limitation called phosphor persistence, because the pixels do not go dark between refreshes.  This means there is effectively no minimum refresh rate for LCD panels.  (Though in extreme cases, below roughly 15 FPS, a refresh might be necessary to avoid visible variance in brightness.)


ASUS VG248QE - a 144 Hz Refresh LCD Monitor

Because LCD monitors don’t have a collection of electron beams that “paint” the screen and don’t need to reset back to a starting position, they have no technical reliance on a fixed update rate.  Instead, LCD monitors are limited by how quickly the liquid crystal and the silicon driving it can change state; today we call that period the response time.  There is plenty of debate about exactly how to measure it (grey-to-grey, etc.), but this rate is completely independent of the refresh rate as it stands today.

At the time, a display, LCD or CRT, required an analog-to-digital converter responsible for taking in the signal from the computer's graphics system and translating it into the digital data that would be painted on the display.  The controllers inside these displays received data from the computer at a fixed rate tied to the refresh rate of the CRT.  Early LCDs used very similar electronics for the analog conversion and adopted the refresh rate along with them.  It wasn't until 2003 that LCD displays first outsold their CRT competitors, and by that time refresh rates on LCDs were fully entrenched.

 

Integration of Refresh Rates Today

Those of us old enough to remember CRTs as the dominant display type will remember having different refresh rate options for a monitor.  High-end monitors could run at 60 Hz, 75 Hz, 100 Hz, and beyond to improve visual quality, all while running at different resolutions (CRTs had no “native” resolution).  A standard called EDID (Extended Display Identification Data) was created in 1994 to allow a monitor to pass information to the computer system indicating the resolutions and refresh rates the monitor supports.
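As a rough illustration of that handshake, here is a minimal Python sketch (not from the article) that validates a raw 128-byte EDID base block before any of its timing information is trusted; the fixed header and checksum rules are part of the VESA EDID specification:

def validate_edid(edid: bytes) -> bool:
    """Sanity-check a 128-byte EDID base block before parsing its timings."""
    if len(edid) < 128:
        return False
    # Every EDID base block begins with the same fixed 8-byte header.
    if edid[0:8] != bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00]):
        return False
    # All 128 bytes of the block must sum to zero modulo 256.
    return sum(edid[0:128]) % 256 == 0

The detailed timing descriptors that actually list the supported modes occupy bytes 54 through 125 of that same block.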

With that information, it is up to the graphics card to set the timing and signal it back to the display.  (This explains how you can create custom resolutions and timings in a GPU driver that may not be accepted by your monitor.)  In fact, the scan rate is controlled by a signal called “vertical blanking” that is sent to the monitor; with CRTs it gave the electron guns time to move from their finishing position (bottom right) back to the starting position (top left), ready to draw another frame on the screen.

LCDs today do not need this signal, as there is no electron gun to reposition.  So why have LCD manufacturers continued to implement static refresh rates of 60 Hz, 120 Hz, or even 144 Hz?

The answer lies in the connectivity options and display controllers available today.  As the analog signals of VGA were replaced with the likes of DVI and HDMI, these higher-speed digital connections had to conform to existing interface patterns to maintain compatibility with older output types.  How many DVI-I to VGA passive adapters do you have lying around with your graphics cards?

DVI, HDMI and early versions of DisplayPort transmit display data in multiples of a refresh rate.  A pixel clock is simply a timing circuit provided to divide an incoming scan line of video into individual pixels.  The simplicity of the math might surprise you:

2560x1600 Resolution
2720 pixels * 1646 pixels * 60 Hz ≈ 268.63 MHz pixel clock


Breakdown of current Display Timing Variables

The extra pixels in each dimension (160 horizontal, 46 vertical) are actually part of the legacy CRT standard: they include room for the sync width and the front/back porch, which space out the signal and synchronization communications and give the electron guns time to reposition.
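Putting that arithmetic into code makes the relationship explicit.  This is a small sketch (the function name is just for illustration); the totals include the blanking pixels described above:

def pixel_clock_mhz(h_active, h_blank, v_active, v_blank, refresh_hz):
    """Pixel clock = total pixels per frame (active + blanking) * refresh rate."""
    h_total = h_active + h_blank   # 2560 + 160 = 2720
    v_total = v_active + v_blank   # 1600 + 46  = 1646
    return h_total * v_total * refresh_hz / 1e6

# The 2560x1600 @ 60 Hz example above:
print(pixel_clock_mhz(2560, 160, 1600, 46, 60))   # ~268.63 MHz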

If you look at DVI and HDMI specifications you will often see the performance rated by maximum pixel clock.  DVI is limited to 165 MHz per link, or resolution equivalents of 1920x1200 @ 60 Hz and 1280x1024 @ 85 Hz.  The recent HDMI 2.0 update raises the maximum pixel clock to 600 MHz, enough for 3840x2160 @ 60 Hz.
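Checking whether a given mode fits on a given link then comes down to comparing its pixel clock against those limits.  A quick sketch using the figures cited above (the constant table is just for illustration):

# Maximum pixel clocks cited above, in MHz.
LINK_LIMITS_MHZ = {
    "DVI single-link": 165,
    "HDMI 2.0": 600,
}

def fits_on_link(clock_mhz, link):
    """True if a mode's pixel clock fits within the given link's limit."""
    return clock_mhz <= LINK_LIMITS_MHZ[link]

print(fits_on_link(268.63, "DVI single-link"))   # False: 2560x1600 @ 60 Hz needs dual-link DVI
print(fits_on_link(268.63, "HDMI 2.0"))          # True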

DisplayPort introduced a self-clocking architecture while also supporting HDMI and DVI pixel clocks for backwards compatibility, but readers of PC Perspective will also be aware that active adapters are sometimes required for conversion to DVI/HDMI connectivity.  These adapters communicate with the graphics controller in the computer to negotiate speeds and convert the protocol and signal.

So even though DisplayPort and HDMI can technically support some impressive bandwidth (17.28 Gbps for DP 1.2), they are still working with legacy logic that makes them slow and “lazy”.  The connectivity is there to push much higher refresh rates than we are accustomed to, without some of the complications that occur between modern graphics cards and monitors.
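The 17.28 Gbps figure follows from DisplayPort 1.2's four HBR2 lanes at 5.4 Gbps each, minus the 20% lost to 8b/10b encoding.  A back-of-the-envelope sketch (ignoring blanking and protocol overhead, so the refresh figure is optimistic):

# DisplayPort 1.2: four HBR2 lanes at 5.4 Gbps, 8b/10b encoding (80% efficient).
lanes = 4
lane_rate_gbps = 5.4
encoding_efficiency = 8 / 10

effective_gbps = lanes * lane_rate_gbps * encoding_efficiency
print(effective_gbps)                              # 17.28 Gbps of usable bandwidth

# Optimistic ceiling on refresh rate for 2560x1600 at 24 bits per pixel:
bits_per_frame = 2560 * 1600 * 24
print(effective_gbps * 1e9 / bits_per_frame)       # ~175 Hz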

The question you should be asking yourself now is: what happens if you completely re-write the logic of communications between a GPU and a monitor?

October 18, 2013 | 10:54 AM - Posted by LukeM89

Great article. However, will Tom Peterson be joining you in November or on Monday October 21st? As there is no Monday the 21st in November. At any rate (pun intended) I look forward to the live event!

October 18, 2013 | 11:19 AM - Posted by Ryan Shrout

October 21st, sorry!! 

Thanks for pointing that out!

October 18, 2013 | 11:01 AM - Posted by Shawn (not verified)

Awesome article, nicely done! Is this technology limited in any way to screen resolution, i.e. use in 4k & surround?

October 18, 2013 | 11:20 AM - Posted by LukeM89

It would appear as though a possible limitation with 4k resolutions would be that of the cable's bandwidth and "the legacy logic that makes them slow and 'lazy'."

October 18, 2013 | 11:40 AM - Posted by YTech

Great Article!

I think PCper should have a Calendar for sale with the best of the best of their articles and such for all their Pcper Fans!

I do believe that a re-write of the GPU-to-display logic is required. The downside to this is that backwards compatibility will be eliminated.

The major challenge that I can see, from the consumer point of view, is profit, the main reason for technology advancement. Consumers will probably not buy into it as it won't be compatible with their current setup. This is where companies release small devices to help with the merge-over.

Such as the G-Sync device. I believe this is the transition piece. The GPU will probably get re-written, and the only displays able to make use of it will require G-Sync, until all companies follow through with their own approach and create a new standard. I believe Mantle from AMD has the same concept in mind; however, they are not as aggressive as nVidia is with G-Sync.

All-in-all, change is scary and it will happen for the good! :)

Great broadcast, by the way!

October 18, 2013 | 12:37 PM - Posted by Anonymous (not verified)

Sounds great. Looking forward to seeing some video and hopefully checking it out soon.

one quibble -- the birth of christ is about as valid a metric as the birth of santa claus.

October 18, 2013 | 01:21 PM - Posted by Mark "Dusty" D (not verified)

It's historically accepted that Jesus was born. The only quibble you should have is whether he was a deity or just a prophet of a religion.

http://en.wikipedia.org/wiki/Anno_Domini#Historical_birth_date_of_Jesus

October 21, 2013 | 04:06 PM - Posted by bystander (not verified)

Don't forget, his birth date is also argued. Which may be what he was talking about. His birth date, from what I have heard, is likely in early summer, and the year is also unknown.

October 18, 2013 | 12:58 PM - Posted by Stealthgyro (not verified)

This seems like something FrameRating could help with, by giving them exact data points rather than people trying to explain it in abstract terms.

This does have my curiosity running. Since G-Sync is more communication between the monitor and the GPU what happens with things like your FrameRating setup? Will the communication be able to pass through the capture card? Will the capture card have to be G-Sync enabled? I think that will be very interesting.

October 18, 2013 | 01:15 PM - Posted by YTech

What about HTS (Home Theater Systems)?

How would the new GPU work when it passes through the Receiver (HDMI 1.4) and then outputs to an HDTV Display (60" SMART VIERA Full HD 3D Plasma)?

Will a firmware be available for the HDTV or a G-sync display required?

Will the Receiver affect the signal where the G-Sync won't be able to utilize the data correctly?

Many more questions that should be presented on Monday! :)

October 18, 2013 | 01:21 PM - Posted by MachineDojo (not verified)

I see problems in adoption from the monitor side of things if this is NVidia only. Will ATi have their own version in the future, requiring monitors to adopt multiple standards just so we can have what is effectively the same feature? Or is this something that other video card manufacturers can adopt reasonably?

October 18, 2013 | 01:24 PM - Posted by Mark "Dusty" D (not verified)

I agree, the proprietary hardware makes me leery. I hope that in the future, the firmware of LCD display ICs will just be written to accept this new protocol. Then the transport (HDMI, DVI, DisplayPort) will not matter and it will all be controlled by software.

October 18, 2013 | 01:52 PM - Posted by Anonymous (not verified)

Game changer. Literally.

Can't wait.

October 18, 2013 | 02:48 PM - Posted by DFWallace (not verified)

How does this 'G Sync' compare to using LightBoost-2d with a 120hz monitor and nVidia vid card to get great improvements in 'FPS' style games?

October 19, 2013 | 09:52 PM - Posted by Anonymous (not verified)

They're not really comparable, IMO because these two technologies work on different sides of the smoothness problem.

Lightboost is a way of refreshing the screen that eliminates ghosting. G-Sync changes the timing of the refresh so that it's controlled by the GPU.

There's absolutely no reason there can't be a lightboosted G-sync display.

October 18, 2013 | 02:54 PM - Posted by Anonymous (not verified)

What is the limit refresh rate of this monitor? If the Video card is free to dish out any framerate it wants, then what happens when it reaches 1000 frames per second? Will the monitor be capable of having an unlimited refresh rate?

October 23, 2013 | 06:37 PM - Posted by Anonymous (not verified)

OLED monitors will be. But at that point the interconnect will be the real limit...

October 18, 2013 | 05:02 PM - Posted by Anonymous (not verified)

So $399 for a 24-inch.

That's a 50% price increase just to have G-Sync in a 24-inch monitor.

OUCH!!!!

October 18, 2013 | 05:30 PM - Posted by Fergdog (not verified)

Is uneven frame pacing a serious concern for gsync? If finished frames are sent out as soon as they are ready, won't the times between displayed frames constantly change? I think this tech sounds great, but it seems like the game or gsync needs to intelligently find a refresh rate it can maintain in order to avoid uneven frame pacing.

October 25, 2013 | 08:26 PM - Posted by Luciano (not verified)

They still have a buffer of 768 MB inside the monitor.

You can predict whether a frame will take too long to be rendered if you are the GPU. And then you can start to add artificial "exhibition" latency to your "quick" frames.
This decrease in speed would give you time to shift the pace in a seamless way.

Lucid already does that.
It's like F1 racing: the GPU can predict what the corners will look like and start to brake.
And on the exit, apply "traction control" as well.

October 18, 2013 | 06:00 PM - Posted by blitzio

Hi Ryan, did Nvidia give any indication as to a potential upgrade kit for existing owners of any of the 4 monitor OEMs mentioned?

I ask because I recently purchased a BenQ monitor only a few months ago, and am hoping there would be a kit I could buy from them and install on my existing monitor so I could take advantage of the G-Sync.

Thanks.

October 18, 2013 | 08:29 PM - Posted by nabokovfan87

sounds like more proprietary crap is required.

It's hard to tell from the story what this ACTUALLY is without seeing it in action or having the nvidia religious types oozing over how they saved the world for buying nvidia.

Plain and simple the ONLY way to do this is to change the actual display logic inside the monitor. That means you need a new monitor or one that has the chip they use.

October 18, 2013 | 06:16 PM - Posted by ap84 (not verified)

Hi Sorry I am a little confused...I currently have a vg248qe and its amazing but what is the point of this gsync if my monitor already has 144hz? when I game I don't enable vsync because I don't have to because of the 144hz and I don't see screen tearing because of the 144hz....so what exactly is a g sync vg248qe going to do for me?

October 18, 2013 | 07:23 PM - Posted by Fergdog (not verified)

Screen tearing is still visible on 144hz monitors. It's less noticeable but still noticeable.

October 18, 2013 | 07:36 PM - Posted by ap84 (not verified)

so basically this just eliminates all possible screen tearing and that's it?

October 18, 2013 | 07:42 PM - Posted by Fergdog (not verified)

Ya and also eliminates the input lag that vsync causes. It may be really good for syncing up with video too not sure though.

October 18, 2013 | 07:45 PM - Posted by Anonymous (not verified)

It reduces tearing without introducing any input latency at all, it allows native playback of fixed lower-framerate content without pulldown or other judder-inducing tricks, it improves frame perception dramatically, and it allows variable framerate to be vsynced (also with no lag).

Kind of a big deal.

Vsync without any additional lag has been an unattainable holy grail. But variable-framerate vsync without lag? I know plenty of people willing to get a second job to pay for that if it actually works. Myself included.

October 18, 2013 | 07:49 PM - Posted by Fergdog (not verified)

I think the tech sounds great, I'm still worried that uneven frame pacing might be a problem though since frame rate can fluctuate a lot.

October 18, 2013 | 07:57 PM - Posted by Anonymous (not verified)

To be fair, uneven frame pacing would be a problem even if the video card had shared memory with your frontal cortex. Even if the frame was sent directly to your optic nerve in real time with a zero-femtosecond delay, the uneven frame pacing would still be a problem.

This just makes all the other problems around it become small enough to be unnoticeable. Then we can focus on the REAL frame pacing. See, when we say frame pacing now we mean frame pacing as seen on screen. What matters is also the frame pacing as seen by the game engine. Not just how far apart the frames are, but how far apart is the content in the frames as it exists in the virtual game world, and also how variable is that temporal distance. We've got a long way to go, that's for sure.

But hey, this is the first concrete step in a long time, and I can't wait to experience it.

October 18, 2013 | 08:37 PM - Posted by nabokovfan87

http://www.pcper.com/files/imagecache/article_max_width/review/2013-10-1...

"G-Sync essentially functions by altering and controlling the vBlank signal sent to the monitor. In a normal configuration, vBlank is a combination of the combination of the vertical front and back porch and the necessary sync time. That timing is set a fixed stepping that determines the effective refresh rate of the monitor; 60 Hz, 120 Hz, etc."

Alright. In terms of ACTUALS, this chart is about the best that you can view. It refers to the rise/fall time (Tr and Tf) that it takes for the signal to output. The capacitors, resistors, etc. all have to work and it isn't instantaneous. That is the limiting factor of refresh rates. Yes you can overvolt, overclock, or do other things to speed that up, but nothing makes up for actual hardware that takes less time to rise and fall.

So, you need something from the micro scale to the nano scale. Nanotechnology is ever present in the news and there are a variety of options to build circuits on, but there is a matter of cost that goes along with it.

So, nvidia is selling an nvidia monitor with an nvidia chip to use nvidia commands with their nvidia processor that works with the nvidia cable on the nvidia power pc because "that's the way it's meant to be played".

I'll believe it when I see it, and I'm not paying 100000 dollars for an extra 30 hertz when we all know we are content with 60 and haven't even transitioned to 120-240 hertz yet.
