Dissecting G-Sync and FreeSync - How the Technologies Differ
It's more than just a branding issue
As part of my look at the first wave of AMD FreeSync monitors hitting the market, I wrote an analysis of how the competing FreeSync and G-Sync technologies differ from one another. It was a complex topic that I tried to cover as succinctly as possible, given time constraints and the fact that the article's subject was FreeSync specifically. I'm going to include a portion of that discussion here as a recap:
First, we need to look inside the VRR window, the zone in which the monitor and AMD claim that variable refresh should work without tearing and without stutter. On the LG 34UM67, for example, that range is 48-75 Hz, so frame rates between 48 FPS and 75 FPS should be smooth. Next we want to look above the window, at frame rates above the 75 Hz maximum refresh rate. Finally, and maybe most importantly, we need to look below the window, at frame rates under the minimum rated variable refresh target, which in this example is 48 FPS.
AMD FreeSync offers the gamer more flexibility than G-Sync around this VRR window. Both above and below the variable refresh range, AMD allows gamers to choose whether VSync is enabled or disabled, and that setting is honored just as it is today whenever your game's frame rate falls outside the VRR window. So, for our 34UM67 example, if your game renders at 85 FPS you will either see tearing on screen (with VSync disabled) or get a static frame rate of 75 FPS, matching the top refresh rate of the panel itself. If your game renders at 40 FPS, below the minimum of the VRR window, you will again see either tearing (with VSync off) or the potential for stutter and hitching (with VSync on).
But what happens below the window on this FreeSync monitor and on a theoretical G-Sync monitor? AMD's implementation means you get the option of enabling or disabling VSync. On the 34UM67, as soon as your game's frame rate drops under 48 FPS you will either see tearing on your screen or begin to see hints of stutter and judder as the typical (and previously mentioned) VSync concerns rear their heads again. At these low frame rates (below the window), those artifacts impact your gaming experience much more dramatically than they do at high frame rates (above the window).
G-Sync treats this "below the window" scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display auto-refreshes the screen when the frame rate dips below the minimum refresh of the panel, which would otherwise be affected by flicker. So on a 30-144 Hz G-Sync monitor, we have measured that when the frame rate drops to 29 FPS, the display is actually refreshing at 58 Hz: each frame is "drawn" one extra time to avoid pixel flicker while still maintaining tear-free and stutter-free animation. If the frame rate dips to 25 FPS, the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module drawing each frame four times, taking the refresh rate back up to 56 Hz. It's a clever trick that preserves the VRR goals and prevents a degradation of the gaming experience. But this method requires a local frame buffer and logic on the display controller to work, hence the current implementation in a G-Sync module.
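The measured behavior suggests the module picks the smallest frame-repeat multiplier that lifts the refresh rate back to a flicker-free level. Here is a minimal sketch of that heuristic; the roughly 50 Hz flicker-free threshold is an assumption inferred from our three oscilloscope data points, not a published NVIDIA specification:

```python
def effective_refresh(fps, flicker_free_min=50, panel_max=144):
    """Return (multiplier, refresh_hz) for a given game frame rate.

    Heuristic: repeat each frame the fewest times needed to push the
    refresh rate up to a flicker-free level, without exceeding the
    panel's maximum refresh. The 50 Hz threshold is an assumption
    inferred from our measurements, not NVIDIA's documented algorithm.
    """
    k = 1
    while fps * k < flicker_free_min and fps * (k + 1) <= panel_max:
        k += 1
    return k, fps * k

# Reproduces the measurements from our testing:
print(effective_refresh(29))  # (2, 58) - each frame drawn twice
print(effective_refresh(25))  # (2, 50)
print(effective_refresh(14))  # (4, 56) - each frame drawn four times
```

Inside the VRR window the multiplier stays at 1 (e.g. 60 FPS yields a 60 Hz refresh), so the same rule covers normal operation too.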
As you can see, the topic is complicated. So Allyn and I (and an aging analog oscilloscope) decided to take it upon ourselves to understand and explain the implementation differences with the help of some science. The video below is where the heart of this story is focused, though I have some visual aids embedded after it.
Still not clear on what this means for frame rates and refresh rates on current FreeSync and G-Sync monitors? Maybe this will help.
This graph shows typical (and most popular) 40-144 Hz panel implementations of each technology and the relationship between frame rate and refresh rate. The bottom axis shows the game's frame output rate, what would be reported by a program like Fraps. You can see that from ~40 FPS to 144 FPS, both technologies offer pure variable frame rate implementations where the refresh rate of the screen matches the game's frame rate. The quality and experience between the two technologies here are basically identical (and awesome). Above 144 FPS, both will go into a V-Sync state (as long as V-Sync is enabled on the FreeSync panel).
Below that 40 FPS mark, though, things shift. The red line shows how AMD's FreeSync and Adaptive Sync work: the refresh rate stays static at 40 Hz even as the frame rate dips below 40 FPS, to 35 FPS, 30 FPS, and so on. G-Sync works differently, doubling the refresh rate and inserting duplicate frames starting at around the 37 FPS mark. This continues until the game's frame rate hits 19/20 FPS, where a third frame is inserted and the refresh rate is increased again. The result is the dashed line representing the effective experience of G-Sync.
Zoomed in on the area of interest, you get a better view of how G-Sync and FreeSync differ. Effectively, G-Sync has no bottom to its variable refresh window and produces the same result as if the display technology itself were capable of going to lower refresh rates without artifacting or flickering. It is possible that as display technologies improve, this kind of frame-doubling algorithm will become unnecessary, but until we find a way to reduce screen flicker at low refresh rates, NVIDIA's G-Sync VRR implementation will have the edge in this scenario.
As we discuss in the video, it is possible that AMD could implement a similar algorithm for FreeSync at the driver level, without the need for an external module. A Radeon GPU knows what frame rate it is rendering at, and it could send out a duplicate frame at a higher rate to trick the display into the same effect. It would require a great deal of cooperation between the panel vendors and AMD, however, as each new monitor release would need a corresponding driver or profile update. That hasn't been AMD's strong suit in the past several years, so it would require a strong commitment from them.
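In rough terms, a driver-side version could watch how long the previous frame has been on screen and flip a duplicate before the panel's maximum hold time expires. This is a purely hypothetical sketch of that idea, not AMD's actual driver logic; the 40 Hz lower bound matches the panel discussed above and the function name is mine:

```python
PANEL_MIN_HZ = 40                     # hypothetical panel's lower VRR bound
MAX_HOLD_MS = 1000.0 / PANEL_MIN_HZ   # longest the panel can hold a frame (25 ms)

def driver_action(ms_since_last_flip):
    """Decide whether to resend a duplicate of the previous frame.

    If no new frame has arrived by the time the previous one has been on
    screen for the panel's maximum hold time, flipping a duplicate keeps
    the effective refresh interval inside the VRR window, mimicking what
    the G-Sync module does in hardware.
    """
    if ms_since_last_flip >= MAX_HOLD_MS:
        return "resend duplicate frame"
    return "wait for new frame"

# At 30 FPS a new frame takes ~33 ms, so the driver would duplicate:
print(driver_action(33.3))  # resend duplicate frame
# At 60 FPS (~17 ms per frame) the panel handles it natively:
print(driver_action(16.7))  # wait for new frame
```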
It's also important to note that the experience below the VRR window on a FreeSync panel today is actually worse in practice than in theory. Because the refresh rate stays at 40 Hz when your frame rates are low, you get a combination of stutter and frame tearing (if V-Sync is off) that is worse than if the refresh rate were higher, say 60 Hz or even 144 Hz. No doubt complications would arise from an instantaneous refresh rate jump from ~40 Hz to 144 Hz, but some middle ground likely exists that FreeSync could implement to improve the low-FPS experience.
I hope you found this story (and the video!) informative and interesting; we spent a lot of time gathering the data and figuring out how best to present it. Please leave us feedback in the comments and we will try to answer as many questions as we can.
Thanks for reading!