Dissecting G-Sync and FreeSync - How the Technologies Differ

Manufacturer: Various

It's more than just a branding issue

As part of my look at the first wave of AMD FreeSync monitors hitting the market, I wrote an analysis of how the competing technologies of FreeSync and G-Sync differ from one another. It was a complex topic that I tried to summarize as succinctly as possible, given the time constraints and the fact that the article was focused on FreeSync specifically. I'm going to include a portion of that discussion here, to recap:

First, we need to look inside the VRR window, the zone in which the monitor vendor and AMD claim that variable refresh should work without tears and without stutter. On the LG 34UM67, for example, that range is 48-75 Hz, so frame rates between 48 FPS and 75 FPS should be smooth. Next we want to look above the window, at frame rates above the 75 Hz maximum refresh rate of the window. Finally, and maybe most importantly, we need to look below the window, at frame rates under the minimum rated variable refresh target, which in this example is 48 FPS.

AMD FreeSync offers more flexibility for the gamer than G-Sync around this VRR window. Both above and below the variable refresh area, AMD allows gamers to select a VSync enabled or disabled setting, and that setting behaves just as it does today whenever your game's frame rate extends outside the VRR window. So, for our 34UM67 example, if your game is capable of rendering at 85 FPS then you will either see tearing on your screen (if you have VSync disabled) or you will get a static frame rate of 75 FPS, matching the top refresh rate of the panel itself. If your game is rendering at 40 FPS, below the minimum of the VRR window, then you will again see either tearing (with VSync off) or the potential for stutter and hitching (with VSync on).

But what happens with this FreeSync monitor and a theoretical G-Sync monitor below the window? AMD's implementation means that you get the option of disabling or enabling VSync. For the 34UM67, as soon as your game's frame rate drops under 48 FPS you will either see tearing on your screen or you will begin to see hints of stutter and judder as the typical (and previously mentioned) VSync concerns crop up again. At lower frame rates (below the window) these artifacts will actually impact your gaming experience much more dramatically than at higher frame rates (above the window).

G-Sync treats this "below the window" scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen if the frame rate dips below the minimum refresh of the panel, which would otherwise be affected by flicker. So, on a 30-144 Hz G-Sync monitor, we have measured that when the frame rate drops to 29 FPS, the display actually refreshes at 58 Hz, with each frame being drawn one extra time to avoid pixel flicker while still maintaining tear-free and stutter-free animation. If the frame rate dips to 25 FPS, the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module drawing the frame four times, taking the refresh rate back to 56 Hz. It's a clever trick that preserves the VRR goals and prevents a degradation of the gaming experience. But this method requires a local frame buffer and logic on the display controller to work; hence the current implementation in a G-Sync module.
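The module's behavior can be modeled as picking the smallest frame multiplier that lifts the effective refresh rate back above the panel's minimum. NVIDIA has not published the actual algorithm, so the sketch below is a rough model inferred from our measurements (the function name and threshold logic are assumptions); the real module also appears to apply some safety margin, since 14 FPS was observed at 4x (56 Hz) where this naive model would pick 3x.

```python
def gsync_refresh(fps, panel_min=30, panel_max=144):
    """Approximate the G-Sync module's low-FPS frame multiplication:
    repeat each frame just enough times that the effective refresh
    rate lands back inside the panel's rated window."""
    if fps >= panel_min:
        # inside the VRR window: the screen simply refreshes once per frame
        return 1, fps
    multiplier = 2
    while fps * multiplier < panel_min:
        multiplier += 1
    return multiplier, fps * multiplier

# Measured examples from the 30-144 Hz monitor above:
print(gsync_refresh(29))   # (2, 58): each frame drawn twice
print(gsync_refresh(25))   # (2, 50)
```

Run against the 40-144 Hz panel discussed later, the same model reproduces the tripling point near 19-20 FPS (19 x 3 = 57 Hz).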

As you can see, the topic is complicated. So Allyn and I (and an aging analog oscilloscope) decided to take it upon ourselves to understand and explain the implementation differences with the help of some science. The video below is the heart of this story, though I have some visual aids embedded after it.

Still not clear on what this means for frame rates and refresh rates on current FreeSync and G-Sync monitors? Maybe this will help.

Continue reading our story dissecting NVIDIA G-Sync and AMD FreeSync!


This graph shows typical (and most popular) 40-144 Hz panel implementations of each technology and the relationship between frame rate and refresh rate. The bottom axis shows the game's frame output rate, what would be reported by a program like Fraps. You can see that from ~40 FPS to 144 FPS, both technologies offer pure variable frame rate implementations where the refresh rate of the screen matches the game's frame rate. The quality and experience between the two technologies here are basically identical (and awesome). Above 144 FPS, both will go into a V-Sync state (as long as V-Sync is enabled on the FreeSync panel).

Below that 40 FPS mark, though, things shift. The red line shows how AMD's FreeSync and Adaptive Sync work: the refresh rate stays static at 40 Hz even as the frame rate dips below 40 FPS, to 35 FPS, 30 FPS, etc. G-Sync works differently, doubling the refresh rate and inserting duplicate frames starting at around the 37 FPS mark. This continues until the game frame rate hits 19/20 FPS, where a third frame is inserted and the refresh rate is increased again. The result is the dashed line representing the effective experience of G-Sync.


Zoomed in on the area of interest, you get a better view of how G-Sync and FreeSync differ. Effectively, G-Sync has no bottom to its variable refresh window and produces the same result as if the display technology itself were capable of going to lower refresh rates without artifacting or flickering. It is possible that in the future, as display technologies improve, this kind of frame doubling algorithm will become unnecessary, but until we find a way to reduce screen flicker at low refresh rates, NVIDIA's G-Sync VRR implementation will have the edge in this scenario.

As we discuss in the video, it is possible that AMD could implement a similar algorithm for FreeSync at the driver level, without the need for an external module. A Radeon GPU knows what frame rate it is rendering at, and it could send out a duplicate frame, at a higher refresh rate, to trick the display and achieve the same effect. It would require a great deal of cooperation between the panel vendors and AMD, however, as with each new monitor release AMD would need a corresponding driver or profile update to go along with it. That hasn't been AMD's strong suit in the past several years, so it would require a strong commitment from them.
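A driver-side version of that trick might look something like the sketch below: a hypothetical scheduler that, given the timestamps at which the GPU finishes frames, inserts duplicate scanouts whenever the gap to the next frame would exceed the panel's minimum-refresh interval. The function name and interface are invented for illustration; this is not AMD's actual driver code.

```python
def schedule_refreshes(frame_times_ms, panel_min_hz=40.0):
    """Hypothetical driver-side frame doubler: emit every rendered frame
    as soon as it's ready, and resend the previous frame whenever the
    next real frame is more than one minimum-refresh interval away."""
    max_gap_ms = 1000.0 / panel_min_hz
    refreshes = []
    for i, t in enumerate(frame_times_ms):
        refreshes.append((t, "new"))
        if i + 1 == len(frame_times_ms):
            break
        last = t
        # pad the gap with duplicates so the panel never waits too long
        while frame_times_ms[i + 1] - last > max_gap_ms:
            last += max_gap_ms
            refreshes.append((last, "repeat"))
    return refreshes

# A game running at 20 FPS (50 ms frames) on a panel with a 40 Hz minimum:
print(schedule_refreshes([0, 50, 100]))
# one duplicate lands between each pair of frames, giving an
# effective 40 Hz scanout rate
```

The hard part, as noted above, is that a real driver does not know the future frame times and must predict them, which is where the stutter risk discussed in the comments comes in.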


It's also important to note that the experience below the VRR window on a FreeSync panel today is actually worse in practice than in theory. Because the refresh rate stays at 40 Hz when your frame rates are low, you get a combination of stutter and frame tearing (if V-Sync is off) that is worse than if the refresh rate were higher, at say 60 Hz or even 144 Hz. No doubt complications would arise from an instantaneous refresh rate shift from ~40 Hz to 144 Hz, but some middle ground likely exists that FreeSync could implement to improve low-FPS experiences.

I hope you found this story (and the video!) informative and interesting; we spent a lot of time gathering the data and figuring out how best to present it. Please leave us feedback in the comments and we will try to answer as many questions as we can.

Thanks for reading!


March 27, 2015 | 01:55 PM - Posted by Allyn Malventano

Aging analog oscilloscopes are the best kind.

March 27, 2015 | 01:56 PM - Posted by Ryan Shrout

Just like people, right? :)

March 28, 2015 | 07:40 PM - Posted by Anonymous (not verified)

To complement G-Sync we need the (see below) feature that reduces clock speeds when frame rates go above a user-selectable limit. This would greatly help in decreasing power consumption.

NVIDIA has such a feature on laptops. They really need to bring it over to desktops.

March 28, 2015 | 08:02 PM - Posted by Anonymous1 (not verified)

I think AMD should give users an option so that when frame rates drop below the FreeSync window, the refresh rate of the monitor jumps to max. It's not as good as what G-Sync does, but it's a lot better than what they are doing now.

March 30, 2015 | 09:31 AM - Posted by Chizow (not verified)

More fine work Ryan and Allyn, it seems as if PCPer is once again leading the tech industry's coverage on these cutting edge hot button topics. You guys are now my go-to for asking the hard questions and getting the answers that matter from vendors.

Looking forward to more testing on input lag/ghosting if you all find the time, thanks!

March 30, 2015 | 02:00 PM - Posted by Chizow (not verified)

Also some ideas for you all, since you seem to be open to looking at this exciting new tech (VRR) at different angles. I suggested on AT but Jarred seems unwilling or incapable of performing more detailed testing.

You all hinted at this as well in your video, and a lot of people seem to be dismissing the importance of a working VRR window in that 25-40 FPS range. People seem to forget that the range from 30-60 FPS was kind of that VSync no man's land where you had to choose between double and triple buffering or hit that steep 60-to-30 FPS cliff on 60 Hz panels. So naturally, VRR is immensely important here, especially since minimums are going to regularly drop into this range even if someone AVERAGES 60+ FPS.

It would be interesting for you to demonstrate 2560x1440 and 4K during a staged, moderately demanding test run and map out the % of time various graphics cards spend at various refresh rates, % in each bucket (below VRR, in VRR, above VRR). The test suite can be wide strokes here, like (GTX 770/280), (290/780/970), (290X/780Ti/980), (Titan X), 1 or 2 in each bucket to get a wide sample. I think this would be a real eye opener for many to see just how often they would experience the jarring drops out of VRR windows.

Also, I am not so sure AMD can just cure this problem via driver, let's not forget they invented a spec that relies on VBlank, creative driver solutions would undoubtedly throw a wrench in all of this and may very well mean violating their own spec. The various VRR scalers may also pose a limitation here, so I guess we will have to wait and see what AMD's response is, but I don't think the fix will be quick, or easy, and certainly not Free. :)

April 2, 2015 | 11:07 AM - Posted by Anonymous (not verified)

I think you guys make a fundamental mistake in saying it is FreeSync's fault that tearing happens below X FPS.

It's not FreeSync's fault but that of the maker of the video decoder chip in the monitor, which does not do the same job as the G-Sync module does.

So it's probably something a new model of video decoder chip could fix in new monitors, as it's a monitor hardware problem and not a FreeSync signal problem.

March 27, 2015 | 03:57 PM - Posted by oscopeitup (not verified)

GJ on creating that refresh monitoring setup. What sensor do you have attached to the monitor?

April 1, 2015 | 06:03 PM - Posted by Anonymous (not verified)

I would like it if you tested input lag at these low frame rates. Doesn't adding these extra frames add more input lag?

Just like triple buffering, for example.

June 2, 2015 | 02:06 PM - Posted by byo (not verified)

You're already down in the region where the next frame isn't ready so there is no frame with newer input available. The refresh rate is adaptively timed so that when it thinks a new frame should be available, it is - just after two refresh cycles rather than 1.

March 27, 2015 | 02:02 PM - Posted by Pixy Misa (not verified)

Good analysis. If AMD specifies that the maximum:minimum refresh rate ratio for FreeSync monitors must be at least 2:1, a driver solution will work just as well as G-Sync at lower cost.

Though even with the LG monitor and its 48-75 Hz range, only frame rates between 38 and 47 FPS are going to be a problem. Below that the driver can frame-double; above that the monitor can accept the rate natively.

So even with existing monitors a driver update can fix the worst cases.
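The 2:1 observation above can be sanity-checked with a little arithmetic. Assuming only simple frame doubling, a panel leaves a dead zone whenever half its maximum refresh sits below its minimum; the helper below (a hypothetical illustration, not part of any driver) computes that gap.

```python
def doubling_gap(panel_min_hz, panel_max_hz):
    """Return the frame-rate band a driver-side frame doubler cannot
    rescue: too slow for the native window, but too fast for 2x to fit
    under the panel's maximum. None means doubling always has room."""
    lo = panel_max_hz / 2.0   # above this, a doubled frame overshoots the max
    hi = float(panel_min_hz)  # below this, the native window no longer applies
    return (lo, hi) if lo < hi else None

print(doubling_gap(48, 75))   # (37.5, 48.0): the LG 34UM67's problem band
print(doubling_gap(40, 144))  # None: 144 >= 2 * 40, so doubling always fits
```

This matches the 38-47 range quoted in the comment: a 2:1 or wider panel has no such band, which is exactly the requirement proposed above.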

March 30, 2015 | 09:28 AM - Posted by Anonymous (not verified)

Except the driver CAN'T just double frame rates; that's what the local buffer on the G-Sync module does, y'know, the same hardware AMD wondered why Nvidia needed.

Bottomline is Nvidia already did a lot of work and answered a lot of questions and came up with solutions with their G-Sync module.

It does not appear AMD asked and answered these same questions when haphazardly throwing together their FreeSync solution, which is why they end up with the inferior solution.

March 30, 2015 | 05:07 PM - Posted by ppi (not verified)

They can. All the GPU needs to do is send another refresh, as if it had new frame. And with information available at driver level, they can do the prediction even better than G-Sync.

I am pretty sure nVidia will internally have the solution sooner than AMD :D.

As a downside, unlike G-Sync, it will have a small performance penalty, as gfx card memory needs to be read. Given the memory bandwidth of current GPUs, the impact would likely be pretty minor, though.

March 27, 2015 | 02:13 PM - Posted by Anonymous (not verified)

This is why I like this site over others: you guys go the extra mile.

March 27, 2015 | 02:24 PM - Posted by Rustknuckle (not verified)


March 27, 2015 | 02:43 PM - Posted by benbass

As usual, amazing stuff. Making the whole PC hardware industry go forward.

March 27, 2015 | 02:44 PM - Posted by Ophelos

I can't wait to see a lot more updates throughout the year on this topic: things like driver updates, AMD 300 series GPUs, and a wider selection of monitors. Then we can see where the true problems lie.

March 27, 2015 | 03:03 PM - Posted by bhappy (not verified)

As they always say it's much easier said than done, knowing AMD's track record with their rather infrequent driver releases over the last decade relative to their competitor's. I wouldn't get my hopes up too high for them to be suddenly releasing frequent driver updates. G-sync monitors may always cost slightly more than Freesync monitors but you get what you pay for. I think this video helps to prove that G-sync is still the superior technology if you take cost out of the equation. Great work guys on this article/video, clearly a cut above most other tech sites such as guru3d, techspot, etc.

March 28, 2015 | 09:45 PM - Posted by kn00tcn

Decade!? AMD released at least one driver a month for a decade... until April 2012, while Nvidia has had stretches of a few months without a driver (like AMD now, but only in the last 3 years).

March 29, 2015 | 02:08 AM - Posted by arbiter

skip the beta, they haven't released a driver since December.

March 29, 2015 | 07:34 PM - Posted by Evernessince (not verified)

Dude, AMD releases Drivers every month, and thank god too. I fricking hate the Nvidia control center popping up and telling me I have an update to play some crap game with the most FPS.

"G-sync monitors may always cost slightly more than Freesync monitors"

Slightly? How about a minimum of $200. You can go and buy an AMD card with the money you save from not buying into G-Sync. It's an added bonus that the Free-Sync monitor isn't locked to only an AMD card as well.

March 30, 2015 | 10:20 AM - Posted by Chizow (not verified)

AMD ditched their monthly driver release ball and chain some time ago, probably for the best since it didn't really indicate you were getting anything other than whatever was ready on Friday at 4pm PST on the last work day of the month.

There is a premium to G-Sync but as we can see from Pcpers work on it, the difference is justified because G-Sync actually does what it sets out to do, while FreeSync falls short in a number of areas. You get what you pay for.

Also, for any Nvidia user who has a Kepler 600 series or newer, they don't have to spend anything other than the G-Sync monitor itself to experience it.

Meanwhile, most existing AMD users will have to go out and buy a new GCN1.1 capable card (R9 285, R9 290/X, bonaire) or a handful of underpowered APUs, so yeah, cost savings there for Nvidia users and an additional expense tacked on for any AMD fan who doesn't already have one of those cards.

March 27, 2015 | 03:12 PM - Posted by razor512

Can you test this on the laptop that could have G-Sync forced on without the module? Does it get the motion blur?
Doing this will give more complete info, as you will have a comparison of what the panel is doing with and without the module, thus better showing what that module contributes.

March 27, 2015 | 03:43 PM - Posted by Ryan Shrout

We don't have that laptop anymore...

March 27, 2015 | 03:18 PM - Posted by IRO-bot (not verified)

Ryan, can you run a test on the frame time for both G-Sync and FreeSync below the minimum frame threshold? While NVIDIA is doubling or quadrupling the frame, wouldn't it also be increasing the frame time? Like it wouldn't be a smooth experience?

March 27, 2015 | 03:43 PM - Posted by Ryan Shrout

The extra redraws are invisible to the game and graphics card setup, so no.

March 27, 2015 | 04:45 PM - Posted by killurconsole

If the redraws are done on the GPU side, we would get less latency, right?

In other words, would my mouse input get more responsive?


March 27, 2015 | 05:04 PM - Posted by arbiter

When FPS is that low, no it wouldn't; the G-Sync module doubles the refresh rate of the monitor, so input latency at 20 FPS on G-Sync would be the same as on FreeSync.

As they stated the reason the panel does that is to prevent things like flickering and even damage to the panel.

April 2, 2015 | 04:08 AM - Posted by Joakim L (not verified)

Visual artifacts due to not updating the panel often enough are one thing, but damaging the panel? That's just ridiculous.

March 28, 2015 | 09:49 PM - Posted by kn00tcn

at for example 20fps, do you really care about latency at that point? the framerate is too low & the high ms per frame IS latency

all it's doing is raising the refresh rate within the sync enabled min-max scale, which is the most logical thing to do if you're trying not to tear

March 29, 2015 | 04:45 AM - Posted by Anonymous (not verified)

Well console gamers have enjoyed 20-30 fps games for last few years so it must be somewhat tolerable :D

March 27, 2015 | 03:54 PM - Posted by Hz

As long as the display has a maximum refresh rate of at least double its minimum—which is true for the existing TN panels, but not the IPS ones—I see no reason that AMD shouldn't be able to solve this minimum framerate problem in software.

I don't think that they would need monitor-specific driver updates either, since the driver should already be aware of the maximum and minimum refresh rates as soon as you connect the display.

At the same time, I am not particularly concerned about the minimum refresh rates on the existing panels. Gaming below 50 FPS is still a bad experience even with G-Sync, so this seems like more of a theoretical problem to me.

And it is worth pointing out that doubling frames can in fact result in stutter if your display has anything less than 100% persistence.

It could prove to be a problem on displays which use PWM-controlled backlights for example.

A good demonstration of this would be comparing 60 FPS at 120Hz on a flicker-free monitor, and 60 FPS at 120Hz with ULMB enabled.

On the flicker-free display, 60 FPS @ 120 Hz should be perfectly smooth—though there will be a lot of motion blur due to the high persistence.
ULMB will greatly reduce the motion blur, but those repeated frames will cause awful judder and double-images to appear.

This is not the fault of ULMB though. If you display 60 FPS at 60 Hz on a strobed display, or even an old CRT, you will also eliminate the motion blur, but it won't judder. The downside to this is that low persistence at only 60Hz will flicker a lot.

As for the overdriving issues, I suspect that is up to the panel manufacturer more than anything else. Existing electronics are probably only tuned for a single refresh rate. Updated electronics would probably do a better job with this.

But really, the main problem there is that we're still using crappy LCD panels with 100% persistence, which all require some degree of overdriving to achieve decent response times. Even on the best displays, you get a lot of ugly artifacting when overdriving is enabled. What I'm hoping to see is the next generation of OLED displays from LG adding Adaptive-Sync support. I will be buying one of those the instant they become available if that happens.

March 27, 2015 | 04:04 PM - Posted by Anonymous (not verified)

I think that every G-Sync display is flicker-free automatically. At least every single one tested so far has been.

March 28, 2015 | 07:05 PM - Posted by Edmond (not verified)

LCDs are pretty shit really.

OLED brings the promise of a TRUE fps=Hz display that can go as low as 0 Hz, if the FPS really drops below 1 FPS (in load screens or something).

OLED is good enough to HOLD an image at 0 Hz, while on LCD it would degrade instantly. As far as I know.

Then you could get a G-Sync display that refreshes only once for each frame: a 100% fps=Hz, 100% flicker-free panel. With a max cap still, but we're fine with that :)

March 30, 2015 | 10:07 AM - Posted by Chizow (not verified)

No offense, but most of this post is rubbish. Have you used a G-Sync monitor long enough to back the claim "Gaming below 50 FPS is still a bad experience even with G-Sync, so this seems like more of a theoretical problem to me."? Because every single review of the Acer 4K G-Sync panel would disagree with you since it is limited to 60Hz and most graphics cards and solutions of that time and even now were having a hard time pushing anywhere close to 60Hz capped FPS, and it was still a splendid experience from all accounts.

The MAIN benefit of G-Sync or any VRR solution *SHOULD* be its ability to reduce input lag, eliminate tearing and stutter at low FPS, and as an ROG Swift owner I can attest to the fact the G-Sync solutions do this wonderfully even at low FPS in the 20s. There is no sense of stutter, tearing or input lag, just low FPS.

In any case, Nvidia solves this low FPS problem by having low latency local memory on the G-Sync module that acts as a frame cache and lookaside buffer, which simply repeats frames as needed until the next live frame arrives. AMD may be able to do similar by allocating some local memory on their graphics card, but this will undoubtedly be higher latency than having that frame buffer local on the monitor itself.

June 22, 2018 | 09:27 PM - Posted by Anonymous123451 (not verified)

I don't care if this post is years old. I keep seeing it all over the internet. Stop this meme. VRR doesn't make low FPS look any better. All VRR does is allow you to run vsync but without input lag and at variable instantaneous frame rates. 50 FPS will look exactly as good as it does at 50 Hz vsync. The only truth behind this meme is that 50 FPS at 60 Hz vsync will cause the frame rate to halve, and with VRR it will stay at 50. But that's only because you're misusing vsync in a situation where you can't reach FPS=Hz.

April 2, 2015 | 04:21 AM - Posted by Joakim L (not verified)

LG OLED TVs suffer from motion blur as well. It's a by-product of the "sample and hold" technique that's naturally used in both LCDs and OLEDs. There is, as far as I know, no way to get around this problem today without inserting a black frame between the actual pictures being displayed. And at this time, there is no solution for combining that with either G-Sync or FreeSync. It's just too complicated to make that work when the frame rate varies all the time; the screen would constantly shift in brightness if you did.

March 27, 2015 | 04:36 PM - Posted by gloomfrost

The only site that made AMD fix its CrossFire problem by demonstrating the problem. PCPer has done it again, showing the problem with high-FPS G-Sync and low-FPS FreeSync.

Keep on keeping them honest bros!

March 27, 2015 | 05:02 PM - Posted by arbiter

Yup, and the only kind of thanks AMD fans will give them is verbal attacks over it.

March 27, 2015 | 05:54 PM - Posted by Anonymous (not verified)

Yup.... sad state of affairs.

March 30, 2015 | 10:09 AM - Posted by Chizow (not verified)

Haha yes it makes you wonder, it is as if AMD fans DON'T want AMD to fix these problems and produce better solutions for them.

If it takes "hostile" press like Pcper and some Nvidia fans to force AMD to fix their product so be it, but at least it will keep AMD fans who claim their solutions just as good honest in the meantime.

March 27, 2015 | 05:17 PM - Posted by muddymind (not verified)

In the video you state that AMD could try to replicate a similar behaviour in the drivers on the lower gap. The problem with that is if you send the repeated frame at the same time you finished the next frame you will introduce stutter because it will have to wait for the frame to go completely to the monitor to free that framebuffer and start the next frame. The only way to fix that is by using triple buffering but that will increase the memory usage which is not good.

March 27, 2015 | 05:22 PM - Posted by Ryan Shrout

If that precise thing happens, you're right: you do have to wait for that redraw on the screen to finish first. But if you are smart about the implementation, you should be able to predict to a reasonable degree when the next frame will finish. A good algorithm will never see more than a 30-40% delay compared to the current frame time. That's a lot, but it should be very rare. GPUs are very good at knowing how long it will be until the next frame is ready.
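One simple way to make that kind of prediction is an exponential moving average of recent frame times. The class below is a hypothetical sketch of the idea (names and interface invented for illustration), not anything AMD or NVIDIA has documented.

```python
class FrameTimePredictor:
    """Smooth recent frame times and decide whether a duplicate scanout
    can finish before the next real frame is predicted to arrive."""

    def __init__(self, alpha=0.25):
        self.alpha = alpha     # smoothing factor: higher reacts faster
        self.avg_ms = None     # running estimate of the frame time

    def observe(self, frame_ms):
        """Fold a completed frame's duration into the running estimate."""
        if self.avg_ms is None:
            self.avg_ms = float(frame_ms)
        else:
            self.avg_ms += self.alpha * (frame_ms - self.avg_ms)
        return self.avg_ms

    def safe_to_repeat(self, elapsed_ms, scanout_ms):
        """True if a repeat scanout started now should finish before the
        in-flight frame is predicted to complete."""
        return self.avg_ms is not None and elapsed_ms + scanout_ms < self.avg_ms
```

At a steady 25 FPS (40 ms frames) with a ~7 ms scanout, such a predictor would approve a repeat started early in the frame interval and reject one started too close to the predicted finish, which is exactly the collision case muddymind raised above.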

March 27, 2015 | 05:33 PM - Posted by muddymind (not verified)

hum... It didn't occur to me that AMD could do such prediction. Still it would be tricky. i.e. you predicted that you could finish a frame about the same time the next image goes up, if you miss that time window you'll have stutter. In G-Sync that will never be a problem since the previous frame is stored in the module buffer so I tend to believe that free-sync will never be as good as G-Sync even with major driver tweaks imho.

March 28, 2015 | 09:14 PM - Posted by yasamoka (not verified)

So would the previous frame be stored in the graphics card's VRAM in the case of AMD doing it over software.

In Nvidia's case, two components along the display chain have memory that is able to fit and retain the previous frame: VRAM and the G-Sync module RAM. The G-Sync module does more than just retain the last frame; however, why would Nvidia pay to include RAM on the module instead of storing that frame in VRAM?

Plus, the minimum refresh rate below which the graphics driver would take care of multiple refreshes per frame could be a certain margin above the minimum refresh rate reported by the display, as long as the first minimum is still less than double the maximum refresh rate. I don't see why AMD would have to implement a different solution for every single monitor that gets released with Adaptive Sync support (and FreeSync certification) if most or all monitors would exhibit the same issue i.e. when going below the minimum reported refresh rate + a safety margin.

March 28, 2015 | 10:25 PM - Posted by Anonymous (not verified)

What exactly do you think Nvidia's G-Sync module does? It has to take a guess and decide whether to start a redraw from the on-board buffer or wait for the next frame to be delivered. It could suffer from the same issues if the next frame is earlier than expected; it could be in the middle of a refresh which cannot be interrupted without tearing. This isn't a big issue because the refresh doesn't take very long on something like a 144 Hz panel.

March 30, 2015 | 10:32 AM - Posted by Chizow (not verified)

No, the G-Sync module doesn't need to guess because it is taking commands directly from the GPU, which lets it know when the next refresh is available, while it repeats frames when a new refresh is not available. This was all covered years ago when G-Sync first launched: the bi-directional communication AMD claimed they did not need, and the reason they did not understand why Nvidia needed an expensive FPGA module.

I guess now we have a better understanding why Nvidia chose the route they did! :)

March 28, 2015 | 09:19 PM - Posted by yasamoka (not verified)

I'm guessing two other solutions would be:

1) The graphics driver refrains from rendering the next frames after such framerate dips in faster than 1 / x frametime where x is slightly higher than the current framerate. That is, render the frame, then hold if it has finished early. Following frames will be rendered with a shifted timeframe, and object positioning would certainly be more accurate than with a sharp frametime reduction (think Sleeping Dogs rolling average). It's certainly better than allowing stutter to occur, and replacing that with a framerate ramp. Given variable refresh rate technology allows highly variable FPS to be kind of smoothed out, the ramp doesn't have to be that slow.

2) Same as 1, but the game takes care of it. The game engine has to be aware that it's running on a variable refresh rate monitor via driver reporting such a feature as enabled or disabled.

March 29, 2015 | 09:43 PM - Posted by Anonymous (not verified)

The G-Sync module implements its buffer in hardware, so that's another reason to understand Nvidia's hardware solution: not only better low FPS, but better responsiveness.

March 30, 2015 | 06:03 AM - Posted by Master Chen (not verified)

If I got them right, though, it IS essentially triple buffering. The G-Sync module has built-in memory that's being used for this particular type of caching only, it looks like. How much, I don't know, but it looks like it's still enough to at least perform a full-blown triple buffering job.
As for FreeSync... there's no physical module, so it really boils down to three roads they could take: use cache from system memory, use cache from the video card's memory, and... hope some monitor manufacturer puts more cache in their panels by default. :\

March 27, 2015 | 05:32 PM - Posted by siriq111 (not verified)


So far the conclusion we have is that AMD may be able to fix this issue from the driver. Well, I say maybe because so many other issues come up at the same time: brightness and so on. The monitor manufacturers have to invest more to "step out" from this shoe, not just AMD's driver. It sounds simple to fix the whole issue, but it is more complicated at the other end.

March 27, 2015 | 05:38 PM - Posted by JeffroGymnast (not verified)

For AMD to implement a similar technology, the panel would need to have a relatively wide range of usable refresh rates. The panel would need to be able to (at least) double its minimum refresh rate.

There are panels released already that have too narrow of a range for this to work.

For example, the LG that Linus just reviewed (34UM67) has a refresh window of 48-75 Hz. If it dips below 48 Hz, it would have to be able to double 47 Hz (94 Hz), which it can't.

I wonder why the LG has such a high minimum refresh rate...

March 27, 2015 | 05:41 PM - Posted by siriq

Also, don't forget guys: what else could it lead to that might fix some of the issues?

March 27, 2015 | 05:59 PM - Posted by qsert (not verified)

AMD mentioned that FreeSync will also support video playback and power saving features that were originally built for laptops. Would playing videos at, let's say, 24 FPS cause judder and tearing if the refresh window is 40 to 144 Hz?

March 27, 2015 | 06:28 PM - Posted by Allyn Malventano

That is correct. If anyone was to write a full screen video player app that behaved like a game engine and drew at the native video FPS (24), all FreeSync panels would tear or judder while G-Sync panels would simply frame double and remain at an effective 24 FPS, without tearing or judder.

March 27, 2015 | 06:51 PM - Posted by Anonymous (not verified)

What about working at different resolutions?

Can either one work outside full-screen mode?

Do you always have to be at the monitor's native resolution for it to work?

March 27, 2015 | 11:47 PM - Posted by Allyn Malventano

Outside of full screen gaming modes, the display reverts to the Windows setting. 

March 28, 2015 | 12:27 AM - Posted by Anonymous (not verified)

Which display? The G-Sync, the FreeSync, or both?

What about outside native resolution?

March 29, 2015 | 02:12 AM - Posted by arbiter

Both displays would revert to the native Windows refresh rate. As for resolution, that shouldn't matter as long as the game is full screen without a border.

March 28, 2015 | 03:43 PM - Posted by Anonymous (not verified)

Do either Gsync or FreeSync displays work in full screen mode in resolutions other than the native panel resolution?

March 27, 2015 | 09:44 PM - Posted by me (not verified)

Fixed-refresh source material shouldn't be a big issue if the coder is aware of the limitations; i.e. the player could just double the frames and present a 48 FPS video, effectively doing in software what G-Sync does. A 120 Hz panel should be able to play 24 FPS material as intended by replaying the same image 5 times. However, variable source material like games is a different story, since you need to predict the time to the next frame.
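As a rough sketch of that software approach (hypothetical helper; assumes a panel locked at a fixed refresh):

```python
def repeats_for(video_fps, panel_hz):
    # How many times a software player would repeat each source frame
    # to fill a fixed panel refresh exactly; None if it doesn't divide.
    if panel_hz % video_fps:
        return None
    return panel_hz // video_fps

print(repeats_for(24, 120))  # 5 -> each film frame shown five times
print(repeats_for(24, 48))   # 2 -> the doubling described above
print(repeats_for(24, 60))   # None -> would need 3:2 pulldown instead
```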

March 27, 2015 | 10:32 PM - Posted by Anonymous (not verified)

If a player were doubling frames, the LG 34UM67's VRR minimum of 48 Hz would enable smooth playback, I would think.

March 27, 2015 | 06:20 PM - Posted by Anonymous (not verified)

Did you guys contact Benq for the specs?

In the video you said they are the same, but as people mentioned in the comments on the last FreeSync article, they aren't. Googling the specs for the models shows they aren't.

Curious as to why Ryan and Allyn continue to say they are the same while monitor sites like TFTCentral have them listed as different models.

March 27, 2015 | 06:26 PM - Posted by Allyn Malventano

The resolution, max refresh, panel type (TN), and grey-to-grey response are the same for both panels. They may not be the exact same part number, but for the purposes of comparison, they are close enough for us to consider them 'the same panel'.

March 27, 2015 | 06:41 PM - Posted by Anonymous (not verified)

That's an issue because, if you frequent monitor sites, you know that an advertised 1 ms GtG never translates to an actual 1 ms response time. The Asus Swift has a 6.9 ms GtG response time with overdrive off and 2.4 ms GtG with overdrive on Extreme, at 10.5% RTC overshoot.

How can you say they are close enough without even testing such differences between the panels to begin with?

March 27, 2015 | 06:58 PM - Posted by Jacozz (not verified)

Good work guys!
I actually think the ghosting is a bigger issue than the below-VRR behavior.
Even though the animations will be a lot smoother at 35 FPS on those G-Sync monitors, I still wouldn't want to be that low when playing a game. I'd just lower a setting or two and keep the FPS above the minimum refresh rate. Hey, the extra dollars I save could go toward a faster graphics card :)

It would be interesting to see if that new Asus 144 Hz IPS FreeSync monitor has the same ghosting problem as the BenQ you tested.

March 27, 2015 | 10:50 PM - Posted by Anonymous (not verified)

If you pause the video in the previous FreeSync article at random spots, you will see the ROG Swift ghosting also. Or save the video and advance through it frame by frame, and you will see the ROG Swift ghosting where the LG and BenQ aren't.

The exact opposite of what you see in the frame grab PCPer used.

This so called ghosting issue is a little blown out of proportion IMO.

They all do it.

March 27, 2015 | 11:52 PM - Posted by Allyn Malventano

That's not ghosting. That's the high speed video catching (within a single captured video frame exposure) the actual scan of the Swift. Once the scan is done, there is only one blade, therefore no ghosting. 

March 27, 2015 | 08:32 PM - Posted by fade2blac

What if desktop displays start to support panel self refresh? It makes sense for saving power in mobile devices, but perhaps it can provide another approach to smoothing out sub-optimal frame rates.

The idea of just repeating the last frame when a new one isn't ready seems obvious and simple. Panel self refresh could perhaps allow for the same sort of frame multiplication when frame rates drop below the true VRR window. The panel control logic should already know how to manage overdrive or ULMB parameters to keep brightness/flicker and ghosting in check. At some point, though it seems like eventually this will be effectively integrating all the functionality of a G-Sync module into the display control logic.

*EDIT* After thinking a bit more, I wonder why a panel with refresh rates up to 144 Hz, which also has some sort of frame-rate multiplication capability, would ever bother running at refresh rates significantly below half the max. For example, 24 FPS could be doubled and shown at 48 Hz to stay in the supported range, but it could just as easily be shown at 72 Hz, 96 Hz, 120 Hz, or even 144 Hz. Shouldn't a higher effective refresh rate help reduce things like ghosting and flicker? I understand that it is important to avoid relatively large deltas in refresh rate, but there seems to be plenty of room in a 40-144 Hz range for more aggressive frame multiplication.

March 27, 2015 | 09:42 PM - Posted by Anonymous (not verified)


That is what G-Sync is doing below 40 FPS. Just like TVs do with motion interpolation.

I suspect that's why people in forums and reviews notice a stutter and jitter, what TV viewers call the soap opera effect.

March 27, 2015 | 09:46 PM - Posted by Anonymous (not verified)

Tom Petersen calls it "Look-a-side" instead of motion or frame interpolation

March 27, 2015 | 10:27 PM - Posted by fade2blac

I sincerely hope there is no frame interpolation or motion estimation going on. I despise frame interpolation and turn it off on my TVs because of the soap opera effect. I have noticed it almost immediately on any TV that has this feature.

Frame multiplication is NOT the same as frame interpolation. Interpolation is injecting artificially created frames where none existed in the original content. To many people this is an unnatural artifact that creates an undesirable experience. Frame multiplication on the other hand only displays the actual frames of the content but repeats those frames to avoid judder from things like 3:2 pull down when frame rate and refresh rates are not in sync. For example, a 120Hz TV can repeat each frame of a 24FPS movie 5 times to match the native refresh rate of the TV without "creating" any new frames. Similarly, 30FPS and 60FPS content maps evenly by quadrupling and doubling frames respectively.

March 27, 2015 | 11:55 PM - Posted by Allyn Malventano

There is no interpolation. It redraws the exact same frame. 

March 27, 2015 | 09:13 PM - Posted by nevzim (not verified)

Which is the better upgrade for a GeForce 660 owner: a G-Sync monitor or a GeForce 980?

March 28, 2015 | 01:16 PM - Posted by Anonymous (not verified)


March 27, 2015 | 10:22 PM - Posted by trenter (not verified)

Everyone talks about AMD not having a driver update since Omega, but fails to mention they release several beta drivers between WHQL releases. WHQL means nothing, and I've had just as many problems with signed drivers as with beta drivers from both AMD and NVIDIA, which is very few. Also, readers are constantly reminded of AMD's failure to provide a working CrossFire driver for Far Cry 4, yet it's Ubisoft that was responsible for the broken CrossFire profiles. AMD stated that they had disabled CrossFire in Far Cry 4 due to issues on Ubisoft's end and would enable the profile when the issue was fixed; Ubisoft never once argued against that claim. Go figure: a GameWorks title with 6 patches containing fixes for NVIDIA SLI and not a single fix for CrossFire. That is a story PCPer needs to be digging into; I would read that article. To be fair, I don't know whether PCPer has ever brought this specific issue up, but it has been thrown around by every other tech site I know of. PCPer gets their fair share of cheap shots in about drivers, though.

March 28, 2015 | 04:07 AM - Posted by zMeul (not verified)

let's talk 2014
for the whole year, AMD only had 3 WHQL driver updates: 14.4, 14.9 and 14.12
yes, between those there were BETA drivers and RCs, but not even one per month to cover the whole year - and the problem with those is that they broke some stuff or weren't generally stable

now, imagine manufacturers release just 10 monitors in a month; on top of optimizing for specific games, AMD also needs to add those monitors in

March 28, 2015 | 01:21 PM - Posted by renz (not verified)

AFAIK AMD had no driver release between Omega (14.12) and the current 15.3 beta. Before 15.3 came out, AMD had no CrossFire profiles for games released in 2015, not just FC4.

March 29, 2015 | 04:50 AM - Posted by Anonymous (not verified)

In 2015, this is the first driver since Omega at the end of 2014.

March 28, 2015 | 12:45 AM - Posted by Dr_Orgo

Nice job covering this topic. I was wondering when you would share the "raw data" from your VRR investigation. This level of detail is why I started following your site.

I think that, given the higher prices of G-Sync displays due to supply and demand and the low-refresh-rate problems of FreeSync, the smart choice is to wait on VRR at the moment. In 6-12 months, the 3440x1440 or 2560x1440 IPS 144 Hz VRR monitors will be out, and prices should drop by then. Until FreeSync's low-FPS problem is fixed, you're better off waiting. If you're spending more on VRR, why settle for a sub-optimal implementation when it will likely be fixed later?

March 29, 2015 | 06:23 PM - Posted by Allyn Malventano

The raw data is not very good for anything other than a chart, because the thresholds actually shift based on the rate of change of FPS. The G-Sync module does its best to get ahead of any frame rate changes in order to minimize any possible judder caused by a frame arriving during a redraw. My raw data is static in nature, while the actual progression is very dynamic.

March 28, 2015 | 03:31 AM - Posted by capawesome9870

At the end of the video you had a theoretical discussion on what AMD could do in a driver update to fix what happens below the low side of the monitor's window.

You (both) mentioned that AMD would have to do a driver update for each monitor that comes out in order to support the frame multiplication feature, but (theoretically) it would not have to be per monitor, in the same way the driver doesn't need updating for each monitor currently; it just needs to support VRR/FreeSync/Adaptive-Sync.

The point being that the monitor reports its VRR window at startup, and all the driver would have to do is set an FPS at which to start multiplying frames to alleviate the issue.

Also, if AMD does do this, FreeSync-labeled monitors should have a VRR cutoff that is less than half the max refresh of the monitor (or half minus 1 or 2 to make sure it doesn't tear). With the LG ultrawide and its 48-75 Hz window, the frame multiplier would put the frame at 88 Hz (from 44 FPS), above the max refresh of the monitor, causing tearing.

March 28, 2015 | 08:13 AM - Posted by Mac (not verified)

Mr Malventano, I do believe you're incorrect in the video when you state that G-Sync is pacing the low FPS in the manner you describe. Here is why: the whole point of doubling or tripling the refresh here is to minimize not just flickering, but also the amount of time the next frame in our animated sequence has to wait before it's displayed. For example, at 30 FPS, instead of scanning out every 33.3 ms, double up and scan out every 16.7 ms. This reduces the maximum amount of time the next frame has to wait before being shown from 33.3 ms to 16.7 ms. There's no clever algorithm pacing something as unpredictable as when the next frame is coming. This fits with what Mr Petersen said about G-Sync when he likened it to double buffering: rendering into A and B buffers and scanning alternately out of said buffers.

March 29, 2015 | 06:25 PM - Posted by Allyn Malventano

Incorrect. It paces the frame insertions centered within the incoming frames, as is plainly visible on the scope. If they were as you describe, they would not be evenly spaced. 

March 30, 2015 | 04:39 AM - Posted by Mac (not verified)

It's a kind of adaptive variable VSync, doubling at 36 FPS and below and tripling at 18 FPS and below, which explains the even spacing. If it is as you say, then it has to know ahead of time when the incoming frames are going to arrive, and that points to additional buffering. Have you any latency testing in the offing?
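This even-spacing argument can be sketched without any look-ahead at all (a hypothetical Python helper assuming a ~36 Hz effective floor; this is not NVIDIA's actual algorithm):

```python
import math

def refresh_multiplier(fps, panel_min_hz):
    # Smallest whole multiple that lifts the effective scan-out rate
    # back to or above the panel's minimum refresh.
    return max(1, math.ceil(panel_min_hz / fps))

# Doubling kicks in just below 36 FPS, tripling just below 18 FPS,
# which would reproduce evenly spaced frame insertions on the scope.
for fps in (40, 35, 30, 18, 17):
    print(fps, refresh_multiplier(fps, 36))  # 1, 2, 2, 2, 3
```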

March 28, 2015 | 08:50 AM - Posted by Anonymous (not verified)

For G-Sync, are you using the ROG Swift or the Acer panel? The video says you are using the Swift, but the graph shows the Acer. If it is the Acer, is it the new IPS G-Sync monitor? I'd love to see a review of that.

March 28, 2015 | 09:32 AM - Posted by Anonymous (not verified)

It's the new Acer XB270HU IPS 144 Hz monitor. Don't bother trying to get one right now; they are back-ordered till the end of NEXT month.

I got mine preordered. :/

March 29, 2015 | 02:29 PM - Posted by Ryan Shrout

We actually tested both and both behaved the same, moving into the frame doubling windows at the same points.

March 28, 2015 | 10:04 AM - Posted by Ryun (not verified)

I don't understand the thought process behind AMD requiring a profile for every monitor if they want to fix this problem.

The driver should already be given the minimum refresh rate of the monitor. After that, it's as simple as redrawing the last frame at a multiple of the FPS that falls between the minimum and maximum until the new frame is ready.

March 28, 2015 | 10:43 AM - Posted by Mac (not verified)

I agree with that. Quick question: wasn't G-Sync VSync-ing at 30 FPS and below, before this doubling and tripling of refreshes was introduced a couple of months later?

March 29, 2015 | 02:32 PM - Posted by Ryan Shrout

No, but G-Sync did have an issue where it didn't handle quick oscillations around those frame doubling transition points very well, causing a kind of stutter when you were rendering at 36-39 FPS for a while.

March 29, 2015 | 06:27 PM - Posted by Allyn Malventano

That actually was more my old oscilloscope having a hard time triggering. It's an old scope. Cut it some slack :)

March 28, 2015 | 10:36 AM - Posted by Mac (not verified)

As we discuss in the video, it is possible that AMD could implement a similar algorithm for FreeSync at the driver level, without the need for an external module. A Radeon GPU knows what frame rate it is rendering at and it could send out a duplicate frame, at a higher frame rate, to trick the display and have the same effect. It will require a great deal of cooperation between the panel vendors and AMD however,
I'm not sure about it requiring a great deal of cooperation with panel vendors. If FreeSync truly is the mono-directional protocol they say it is, if the monitor is truly slave to the GPU and does as it's told, then the monitor should simply scan out whatever frame is sent, regardless of whether it's the same frame or not. The ghosting is what's going to require a lot of cooperation to get sorted. Interestingly, the higher your FPS on FreeSync, the less your ghosting - so doubling or even tripling the refresh rate at low FPS is definitely something AMD needs to look at implementing.

March 28, 2015 | 01:11 PM - Posted by Anonymous (not verified)

While INCREDIBLY tedious to test, it would be VERY interesting to see the results of running a colorimeter (or spectrophotometer if available) on a G-Sync and DP Adaptive sync display at different refresh rates, as well as hooking up the oscilloscope and measuring ghosting at different refresh rates. If the G-sync module is doing some dynamic variation in panel driving voltage/time which commodity panel controllers are not, then there may be variations in brightness and contrast (or even a slight colour shift) when driving a DP Adaptive Sync display at different refresh rates.

March 28, 2015 | 10:16 PM - Posted by Anonymous (not verified)

I was thinking a similar thought. It seems like a 144 Hz panel would have a minimum, a maximum (144 Hz), and possibly a preferred refresh rate. Given how G-Sync seems to work, you could set a preferred refresh rate (maybe 120 Hz) that gives good color accuracy and fewer overdrive artifacts, and just have the GPU send extra frames when necessary. The G-Sync system already behaves this way below the panel's minimum refresh rate, so why not set the target refresh rate higher?
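The "preferred rate" idea can be sketched as a small search (entirely hypothetical; nothing here reflects how the G-Sync module actually chooses):

```python
def multiplier_toward_preferred(fps, vrr_min, vrr_max, preferred):
    # Repeat counts whose effective rate stays inside the VRR window...
    candidates = [k for k in range(1, 10)
                  if vrr_min <= fps * k <= vrr_max]
    if not candidates:
        return None
    # ...picking the one that lands closest to the preferred refresh.
    return min(candidates, key=lambda k: abs(fps * k - preferred))

# 30 FPS on a 40-144 Hz panel, steering toward 120 Hz: quadruple it
print(multiplier_toward_preferred(30, 40, 144, 120))  # 4
```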

March 29, 2015 | 02:33 PM - Posted by Ryan Shrout

Interesting. We'll try to discuss the testing process for something like this...

March 29, 2015 | 06:30 PM - Posted by Allyn Malventano

Color variations are not really suspect, because even though both panels are in VRR mode, the draw speed is still at the max rate (each frame is drawn at 1/144 top to bottom). Color consistency issues normally present when changing to higher scan rates, not differing refresh rates - not in this context at least. 

March 28, 2015 | 03:18 PM - Posted by Martin Trautvetter

Awesome work, guys!

March 29, 2015 | 10:33 AM - Posted by BrightCandle (not verified)

One of the things I noticed initially with G-Sync was that if I turned VSync on in some games (F1 2014 was one example), the game felt like it had more latency compared to VSync off.

Based on how FreeSync is implemented, is it possible to have VSync off in the game settings but forced on in the driver, so the game doesn't try to apply its own frame limiting or other problematic VSync-like elements to a VRR target?

The second point I wanted to make is that I feel the best of both worlds would be VSync on at the maximum of the VRR window and VSync off at the minimum, but at the maximum refresh rate, so you get updates as quickly as possible. Neither solution actually goes there, but I can't say I have been unhappy with G-Sync's approach; it's actually been really amazing (the ROG Swift not so much - that monitor is a disaster; I'm on my 4th RMA).

March 29, 2015 | 08:04 PM - Posted by wolsty7

articles like this are why pcper is the best

March 30, 2015 | 05:52 AM - Posted by Master Chen (not verified)

Alright, let me get this straight. What you guys are basically saying here is that the Gay-Sync module has a built-in cache for double/triple buffering, which it uses only at low FPS. It doesn't produce stutter at lower FPS when it doubles/triples the refresh frequency, because it stocks up several cloned frames ahead of time, which it then uses to smooth things out. To put it simply: when you think it renders one frame at low FPS, it actually renders one WHILE storing up to three identical ones in its built-in cache at the same time. And it ups the refresh frequency simply to move out those stocked-up frame clones as fast as possible, which is why there's no stuttering. It's basically double/triple buffering, but unlike typical frame-buffering methods this one is instant, since the stocked-up clone frames are ready to be moved out immediately thanks to the doubled/tripled refresh frequency. I hope I got this right.

Anyway, either way, I think I still prefer FreeSync so far, simply because it's: 1) open, 2) cheaper to implement with no price premium, 3) technically supports much lower refresh rates than the Gay-Sync module would ever allow without all this "triple buffering + higher refresh frequency" voodoo mumbo-jumbo magic. I feel like nGreedia is simply trying to make us buy their proprietary crap again, and no matter how much better it may look at lower FPS, I still think correctly applied FreeSync would be the better route simply because of how much cheaper and more versatile it is. There are just way too many restrictions with nGreedia's stuff, almost always. I want to be FREE.

March 30, 2015 | 11:26 AM - Posted by bhappy (not verified)

Master Chen you really come across like an ignorant homophobic hater, so unless someone gives you the hardware for free there's nothing free about freesync despite its name.

March 30, 2015 | 12:38 PM - Posted by Master Chen (not verified)

Cry more, brainwashed marketing victim sheep noVideot. You make me laugh.

March 30, 2015 | 01:52 PM - Posted by Jeremy Hellstrom

Tone it down children.  Attack the site, tech and the companies all you like but keep the personal stuff to a minimum. 

Unless it is a really good zinger, those always get to stay.

March 31, 2015 | 04:13 AM - Posted by Master Chen (not verified)

You truly amuse me, Jeremy. You're one to talk here about "companies", lol...NOT.

April 1, 2015 | 06:33 PM - Posted by Gregster

Great video; it explains a lot of why gaming is so smooth on the Swift. Can you confirm, to settle an argument, that 30 FPS on the Swift is actually running at 60 Hz with an extra frame being sent, so in effect is 60 FPS?

April 2, 2015 | 06:12 AM - Posted by shankly1985

You're not serious, are you? April Fools?

April 7, 2015 | 10:19 AM - Posted by Lord Binky (not verified)

What I'm taking away from this is that the ball is really in the panel makers' court. They need to get the lower end of the VRR window down to 30 Hz; after that, if your FPS is dropping lower than that, lower your settings. That makes any screen tearing and artifacts you get with FreeSync effectively an alarm. ;)

April 7, 2015 | 06:56 PM - Posted by Anonymous (not verified)

A few questions.

1) If the Asus ROG Swift is shown doubling frames at 36 Hz, why is G-Sync rated at 30-144 Hz? Has anyone bothered to ask about the discrepancy?

2) Power consumption differences: the BenQ is rated at 65 watts, the Asus ROG Swift at 90 watts. What are the additional 25 watts doing? Asus ROG Swift owners have said in forums they can see a green LED that's always on inside the enclosure. What's that about?

3) If G-Sync is doing some sort of PSR, why can't it do it at, say, 60 Hz, 120 Hz, or 144 Hz? Why only when it drops out of the VRR window?

April 14, 2015 | 09:41 PM - Posted by OldFartApparently (not verified)

I'm confused. What changed? I've been gaming for decades, and we never needed some overpriced monitor to get 60 fps. That's really all I care to experience, as I honestly don't think I can tell the difference above 60 fps (yes, I've taken the tests...). So what changed? Is this the result of lazy engineering? Pushing the performance burden off to the monitors? Or are the GPU manufacturers hyping up this new tech and playing a branding game, or what? Or are resolutions becoming so high that they cannot keep up with fps demand, even up to (my) golden 60 frames? I mean, gosh, doesn't anyone but me play in 1280x1024 anymore?? Now we have 4k out there... I'm going to need to go get my glasses prescription updated just to see the difference. Such inflation of standards. *sigh* Sorry this turned into a "back in my day" post. Yes, I'm getting old. :)

June 10, 2015 | 01:07 PM - Posted by Anonymous (not verified)

I think you need to read up on what G-Sync and FreeSync do in order to understand that this is not just about getting a monitor to 60 or 144 Hz.

June 4, 2015 | 06:13 AM - Posted by db87

You don't talk about the disadvantages of G-Sync: no multiple inputs, DisplayPort only, restricted menu settings due to G-Sync, additional cost, and so on.

July 24, 2015 | 04:47 AM - Posted by Stefem (not verified)

Manufacturers could add more ports and OSD options if they wanted, but you would still need to use DisplayPort in order to enable G-Sync (and this applies to AMD's FreeSync too).
