NVIDIA G-Sync: Death of the Refresh Rate

Manufacturer: NVIDIA

Our Legacy's Influence

We are often creatures of habit. Change is hard. And oftentimes legacy systems that have been in place for a very long time shape the angle at which we attack new problems. This happens in the world of computer technology but also outside the walls of silicon, and the result can be dangerous inefficiencies that threaten to limit our advancement. Often, our need to adapt new technologies to existing infrastructure is to blame for stagnant development.

Take the development of the phone as an example. The pulse-based phone system and the rotary dial slowed the adoption of touch-tone phones and forced manufacturers to include switches for selecting between pulse and tone dialing for decades.

Perhaps a more substantial example is the railroad system, which based its track gauge (the width between the rails) on transportation methods that existed before the birth of Christ. Horse-drawn carriages pulled by two horses had long been built with a wheel spacing of roughly 4 feet 8.5 inches, and the first railroads in the US in the 1800s were laid to match. Today the standard rail track gauge remains 4 feet 8.5 inches, despite the fact that a wider gauge would allow more stability for larger cargo loads and permit higher speed vehicles. But updating the existing infrastructure around the world would be so cost prohibitive that we will likely remain with that outdated standard.


What does this have to do with PC hardware, and why am I giving you an abbreviated history lesson? There are clearly some examples of legacy infrastructure limiting our advancement in hardware development. Solid state drives are held back by the current SATA-based storage interface, though we are seeing a move to faster interconnects like PCI Express to alleviate this. Some compute tasks are limited by the "infrastructure" of standard x86 processor cores, and the move to GPU compute has changed the direction of those workloads dramatically.

There is another area of technology that could be improved if we could just move past an existing way of doing things.  Displays.

Continue reading our story on NVIDIA G-Sync Variable Refresh Rate Technology!!

 

Development of the CRT and the Refresh Rate

The very first CRT (cathode ray tube) was built in Germany in 1897, but it took Russian scientist Boris Rosing in 1907 to first draw simple shapes on a screen, work that would lead to the creation of the television and the monitor. 1934 saw the first commercially available electronic televisions with cathode ray tubes, built by Telefunken in Germany.


Image source: CircuitsToday

A CRT TV or monitor produces an image on a fluorescent screen using a set of electron guns whose beams accelerate toward and excite phosphors of different colors (most often red, green, and blue). The beams sweep in a fixed, systematic pattern called a raster – left to right and top to bottom as you face the display – to draw the image provided by the graphics controller.
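The raster order itself is nothing more than a nested sweep: every position on a scanline, then the next scanline down. A minimal Python sketch (the tiny resolution is only for illustration):

    # Raster order: visit every pixel left to right within a scanline,
    # then move down to the next scanline, top to bottom.
    WIDTH, HEIGHT = 4, 3                 # tiny illustrative resolution
    for y in range(HEIGHT):              # top to bottom
        for x in range(WIDTH):           # left to right
            print(f"scan pixel ({x}, {y})")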

CRT displays were built around a refresh rate, otherwise known as the scan rate, which represents how quickly the electron guns can draw a complete image from top to bottom on the screen and then return to the starting position (top left). All kinds of devices have a refresh rate – CRTs, LCDs, movie projectors, etc. On CRT monitors, a faster refresh rate resulted in reduced screen flicker and reduced eye strain for users. Screen flicker is the result of the eye being able to see the phosphors going dark before the next "scan" reactivates them. The faster the monitor could update the image and keep the phosphors illuminated, the less likely you were to see flickering.
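To put that in concrete terms, the time the display has for each complete scan is simply the reciprocal of the refresh rate. A minimal Python sketch (the rates listed are only illustrative):

    # Time available per complete screen refresh at a few common rates.
    for refresh_rate_hz in (50, 60, 85, 120, 144):
        interval_ms = 1000.0 / refresh_rate_hz
        print(f"{refresh_rate_hz:>3} Hz -> one full scan every {interval_ms:.2f} ms")

At 60 Hz that works out to roughly 16.7 ms per scan.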


Image source: Wikipedia

What may not be common knowledge is why refresh rates were set to the speeds they most commonly run at. In the 1920s, as TVs were first being produced, it became obvious that running vacuum tube hardware at anything other than a multiple of the AC line frequency coming into a home was problematic. AC power lines run at 60 Hz in the US and 50 Hz in Europe – starting to sound familiar? By running a TV at 60 Hz in the US, television manufacturers could prevent moving horizontal bands of noise caused by power supply ripple.

An entire industry was built around the 60 Hz (and 50 Hz in Europe) refresh rate, which has caused numerous other problems in video. Film is most often recorded at 24 frames per second, which does not divide evenly into 60 Hz, introducing the need for technologies like telecine and 3:2 pulldown to match the rates. Many readers will remember the uneven, jittering motion of movies played back on TV – these rate-matching problems are the reason why.
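To see why the rates clash, note that 60 is not an integer multiple of 24, so classic 3:2 pulldown alternately holds each film frame for two and then three video fields. A rough Python sketch of that cadence (a simplification that ignores interlaced field ordering):

    # Rough sketch of 3:2 pulldown: four 24 fps film frames are spread
    # across ten 60 Hz video fields by holding frames for 2, 3, 2, 3, ... fields.
    from itertools import cycle

    def pulldown_32(film_frames):
        fields = []
        for frame, holds in zip(film_frames, cycle((2, 3))):
            fields.extend([frame] * holds)   # repeat the frame for 2 or 3 fields
        return fields

    print(pulldown_32(["A", "B", "C", "D"]))
    # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']

Because some frames are held on screen longer than others, motion that was perfectly even on film becomes slightly uneven on a 60 Hz display – the judder described above.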

October 18, 2013 | 09:19 PM - Posted by Fergdog (not verified)

We are content with 60? Who's we? Anybody who has gamed on a 120 Hz monitor won't want to go back – it's a huge improvement. There's nothing wrong with being content with technology, but that won't stop it from advancing. When people experience the advances, they probably won't be so content.

October 21, 2013 | 09:06 AM - Posted by Lord Binky (not verified)

My monitor is 2560*1440 @ 120hz. It's fantastic. The 'you can't see a difference' bit is a fallacy and I have yet to show it to someone who can't see the difference. They may write off the difference as trivial or insignificant, but they will admittedly see the difference.

That isn't to say you NOTICE the difference all the time, just like you don't notice the colors aren't calibrated when playing a game. That's the point, though: if you noticed it all the time it would detract from what you are doing, but it is definitely an enhancement. Just like well done 3D, especially something like the Oculus Rift – you shouldn't be thinking 'This is 3D, This is 3D, This is 3D' the entire time you play a game. You should be immersed in the game, with the improvements making it an overall better experience rather than constantly reminding you, pulling your attention away so that you are actively aware they exist, and detracting from the experience.

October 18, 2013 | 10:33 PM - Posted by Mouse123 (not verified)

There was already a display controller architecture update coming - to implement Panel Self Refresh (to conserve power). Implementing G-Sync (or similar) technology at the same time will be splendid.

October 18, 2013 | 10:58 PM - Posted by Robert892375298 (not verified)

I will buy this :)

October 18, 2013 | 11:22 PM - Posted by Anonymous (not verified)

http://www.geforce.com/hardware/technology/g-sync/faq

NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.

October 19, 2013 | 01:08 AM - Posted by wujj123456

What I am worried about is SLI and triple display. I am not sure if I am the only one, but I wasn't able to overclock the pixel clock on my SLI and triple monitor setup. A single display works. Any hints?

October 19, 2013 | 02:06 PM - Posted by Anonymous (not verified)

Ya Ruski ! Great article. We shout ... for Shrout.

October 19, 2013 | 02:13 PM - Posted by Anonymous (not verified)

NVIDIA FTW !

October 19, 2013 | 03:17 PM - Posted by razor512

Why do LCD monitors even need that? Why not have the GPU send a "done" signal instead, where each completed frame carries a 1 or 2 bit done packet telling the monitor that the frame is fully rendered, and, in a dumber fashion, the display simply displays each complete frame?

All the monitor tells the video card is the maximum frame rate it can handle, and the GPU caps its frame rate to that limit.

Think of it like this: imagine one person showing drawings to a group of people, and next to them someone drawing pictures quickly. Once the artist finishes a drawing, they ring a bell and the person showing the images simply grabs the next one and shows it, not caring about the rate at which the pictures are being drawn.
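The hand-off razor512 describes – the GPU signalling when a frame is complete and the display scanning it out on arrival, never faster than the panel's maximum rate – can be sketched roughly in Python (all names and timings here are hypothetical, purely to illustrate the idea):

    # Illustrative only: a "GPU" thread signals completed frames through a queue
    # (standing in for the "done" packet) and the "display" shows each one on
    # arrival, capped at the panel's reported maximum refresh rate.
    import queue
    import threading
    import time

    MAX_PANEL_RATE_HZ = 144                    # hypothetical value reported by the monitor
    MIN_FRAME_INTERVAL = 1.0 / MAX_PANEL_RATE_HZ
    completed_frames = queue.Queue()

    def gpu_render_loop(frame_count):
        for i in range(frame_count):
            time.sleep(0.005 + 0.010 * (i % 3))    # fake, variable render time
            completed_frames.put(f"frame {i}")     # the 'done' signal plus the frame

    def display_loop(frame_count):
        last_shown = 0.0
        for _ in range(frame_count):
            frame = completed_frames.get()         # wait for the GPU's done signal
            wait = MIN_FRAME_INTERVAL - (time.monotonic() - last_shown)
            if wait > 0:
                time.sleep(wait)                   # never exceed the panel's max rate
            last_shown = time.monotonic()
            print("showing", frame)

    threading.Thread(target=gpu_render_loop, args=(10,), daemon=True).start()
    display_loop(10)

In this sketch the display no longer forces the GPU to hit a fixed deadline; it simply refuses to update faster than the panel can physically handle, which is essentially the behavior G-Sync implements in hardware.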

October 22, 2013 | 06:08 AM - Posted by YTech

Razor512 > From my understanding, that is what G-Sync is attempting to resolve: to ignore the monitor's fixed timing and display the images from the GPU when they are ready.

I believe they are focusing on LED/LCD monitors as these are widely available and the prevailing display format, and being the most restricted in clock timing, it gives NVIDIA a great foothold for the future.

So once it is well adopted, they will go down the chain to the next widespread display type.

We will know more in Fall 2014.

All I can say is, it's about time someone this large released something like this. There will be many more advantages from the GPU with these new features.

October 20, 2013 | 01:38 AM - Posted by Panta

For me, this may be the last nail in AMD's coffin.
I've had mostly ATI/AMD cards, but it's clear now
who is the leader and who is the follower in this technology race.

I wish NVIDIA made cooler running cards (80C, ugh :S)
and that we didn't need to sell a kidney for a very good card.

October 20, 2013 | 02:27 PM - Posted by Anonymous (not verified)

AMD/BBY drinking at a bar together very soon.

October 20, 2013 | 03:20 AM - Posted by lensig24 (not verified)

Well, if this signals the end of AMD GPUs, won't it mean that NVIDIA will monopolize PC graphics cards and we will pay an arm and a leg just to buy a great graphics card?

October 21, 2013 | 04:42 AM - Posted by Anonymous (not verified)

I have to say the article left me with the feeling that several crucial issues are not even mentioned. I am not an expert and I am not absolutely sure I remember everything correctly, but here are the issues I would like to see explained:

1. As far as I can remember, LCD monitors do not tolerate very low refresh rates well – although these very low frequencies are probably not important for gaming.

2. Flexible refresh probably has more important applications than improved gaming. How about power savings? How about discarding frame rate conversions on some archival videos? These seem more important for the general public.

3. OLED in monitors and TVs is probably in the very near future. Other technologies like electrowetting are also a possibility. These should also be considered, not just LCD.

4. It is understandable that Nvidia wants to make money on proprietary technology. But looking from a wider perspective, only a standardized, non-proprietary technology seems sensible here.

October 27, 2013 | 04:52 AM - Posted by JayO (not verified)

In regards to #2, "Power Savings":
The DisplayPort standard can directly control a display's refresh rate, and one of the reasons for that capability is energy savings.

It's not just a video out plug. It's a whole new world.

I wouldn't be surprised if G-Sync works because of the DisplayPort standard and the vast number of possibilities that DisplayPort makes possible.

October 21, 2013 | 02:36 PM - Posted by Lord Binky (not verified)

Yeah, I'm bummed it's another proprietary push from Nvidia, but at least it is so basic a concept that they can lock it away in patents. I do have hope this could drive an open standard to be picked up that can be implemented.

October 21, 2013 | 02:37 PM - Posted by Lord Binky (not verified)

Ugh, I meant can't lock it away.

October 22, 2013 | 05:50 AM - Posted by YTech

Been hearing a lot about train derailments in Canada.
"tasks are limited by the “infrastructure”"

Perhaps the Government should visit PCper and nVidia and discuss the concept of G-Sync and apply it to the real world! :)

Besides, computer hardware requires a lot of natural resources for production, and oil is part of that process. *hint hint*

Cheers!

October 24, 2013 | 05:47 PM - Posted by Luciano (not verified)

It would be awesome if somebody compared G-Sync with Lucid Virtual Vsync.

October 25, 2013 | 01:20 PM - Posted by Alex Antonio (not verified)

It is unfortunate that Nvidia has not catered this technology to the many gaming enthusiasts who have just upgraded to 2560, 120 Hz IPS displays.

I doubt many people will jump ship to an inferior display simply to get no tearing (even though it's a significant improvement).

I would have liked to see Nvidia produce adapters for monitors that use DVI.

For me, this probably means I'll buy a 290X because I wouldn't be able to take advantage of G-Sync.

October 29, 2013 | 12:39 AM - Posted by Anonymous (not verified)

Great article and even greater technology.

Does this also work with Nvidia's 3D glasses for 3D gaming?
