AMD FreeSync First Impressions and Technical Discussion

Author: Ryan Shrout
Subject: Displays
Manufacturer: AMD

Gaming Experience with FreeSync - LG 34UM67 and BenQ XL2730Z

Before having to leave for a work-related trip this week, I got to spend two full days with the 34UM67 monitor from LG, the first functional FreeSync monitor to make it to our office. This display is unique, with a 21:9 aspect ratio and a 2560x1080 resolution built into an ultra-wide 34-in IPS screen. Available soon with an MSRP of $649, the 34UM67 has a dynamic refresh rate range of 48-75 Hz.

Let's talk about that first: the 48-75 Hz range, with 75 Hz being the limit of this panel, is rather narrow. That only gives us a 27 Hz window in which the variable refresh capability of FreeSync can actually function. When your game runs at 75 FPS or above, or at 48 FPS or below, you are no longer seeing the benefits of tear-free and stutter-free gaming and are instead forced back to the VSync on/off behavior that we discussed above. Why is this range so narrow? It is partly due to the implementation of FreeSync on an IPS panel, something we haven't actually had hands-on experience with from G-Sync products yet. Also, the 2560x1080 panel is produced in fairly low volume, and a 75 Hz maximum refresh is actually a step above what we have seen from 21:9 monitors in the past year.
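To make those three operating zones concrete, here is a minimal sketch of the decision logic. The thresholds are the 34UM67's, but the function and its names are purely illustrative and not anything from AMD's driver.

# Minimal sketch of the three operating zones on a 48-75 Hz FreeSync
# panel. Thresholds are the 34UM67's; the logic is illustrative only.

VRR_MIN_HZ = 48.0
VRR_MAX_HZ = 75.0

def describe_behavior(fps: float, vsync_on: bool) -> str:
    """Describe how a given render rate is presented on this panel."""
    if VRR_MIN_HZ <= fps <= VRR_MAX_HZ:
        return "VRR active: refresh follows the render rate (no tearing or stutter)"
    if fps > VRR_MAX_HZ:
        return ("VSync on: frame rate capped at 75 FPS" if vsync_on
                else "VSync off: tearing against the fixed 75 Hz refresh")
    return ("VSync on: repeated frames cause stutter and judder" if vsync_on
            else "VSync off: tearing returns at low frame rates")

for fps in (40, 60, 90):
    print(f"{fps} FPS -> {describe_behavior(fps, vsync_on=True)}")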

AMD is working with partners on other panels as well, including two 2560x1440 options from BenQ and Acer that operate in a 40-144 Hz VRR window, giving gamers much more room to see the benefits of variable refresh.


The LG 34UM67 34-in 2560x1080 FreeSync monitor

But let's focus back on the 34UM67 specifically, as that is where I spent the majority of my time last week. The 34-in screen is impressive to see in person, and the IPS panel means you get great viewing angles – especially important considering the ultra-wide design that creates unique angles even in normal usage. Using the monitor in a desktop and productivity environment is superb – 21:9 monitors are definitely unique and provide some interesting benefits and caveats for users. More on that later.

But you're all here to talk about the gaming implications of this monitor and its FreeSync implementation. First, the good news: the gaming experience inside the 48-75 Hz variable refresh window is pretty much perfect and matches the experiences we have had with G-Sync. When playing games like Crysis 3 or Metro: Last Light at 55-70 FPS, a common frame rate given the settings we were using at the 2560x1080 resolution, the games ran smoothly and we didn't experience the stutter or horizontal tearing that you would normally see with a typical vertical sync configuration. It's obviously a bit of a disappointment that the range on this panel is so constricted, but when operating inside that 27 Hz window, the experience was flawless.

When running games over the 75 Hz maximum refresh rate you have the option to enable or disable VSync through the normal methods. When playing a game that is running at consistently higher than 75 FPS the operation is basically unchanged from what you have seen and used before. You will either get a moderate amount of screen tearing depending on the rate matching between the 75 Hz maximum refresh rate of the panel and the actual render rate of the game, or you will see a maximum render rate of 75 FPS if VSync is turned on.


Our next guest, the BenQ XL2730Z 27-in 2560x1440 144 Hz FreeSync Monitor

The result at the lower end of the experience zone is similar, but with some added complication. As we played Crysis 3 with the settings jacked up and moved to locations where frame rates dropped below 48 FPS, which is pretty often honestly, we saw the same types of problems that have plagued monitors for decades. With VSync off we saw frequent screen tearing, and with VSync on we often experienced the stutter and judder of a mismatched frame rate and refresh rate. And because the minimum variable refresh rate is 48 Hz, the worst judder probably comes in the 35-45 FPS range on a monitor with this configuration.

AMD sees some of this as an issue of flexibility and options for the gamer. With G-Sync today you do not get the option to enable or disable VSync when outside the variable refresh range. At the high end that means NVIDIA today forces users into a VSync-on state, something that many competitive gamers don't approve of. AMD's implementation allows those gamers to disable VSync, giving them the lowest possible input latency and the highest possible frame rates, though at the cost of re-introducing screen tearing. That just happens to be a trade-off many gamers would make.

As it turns out, NVIDIA is starting to recognize this as a concern for some users and will hopefully enable the option to set VSync off when gaming above the maximum panel refresh rate in a future driver.

But on the low side there is really no reason to WANT either tearing or stutter, unless the implication is that performance is lost or latency is increased with G-Sync. NVIDIA's implementation today is superior when you are working under the minimum panel refresh, using the embedded controller and module in the panel to force frame multiplication at certain frame rates. And as it turns out, the transition from a smooth, consistent variable refresh rate to a tearing or stuttering frame rate (depending on the VSync setting) can be quite jarring if it happens frequently. We found an area in Assassin's Creed Unity, for example, where simply running across the rooftops to a mission marker would push frame rates between 35 FPS and 60 FPS, often crossing the 48 Hz minimum of the VRR window. When it makes the transition the effect is immediate, and I think some gamers will like that even less than normal stutter or tearing. Obviously the ideal scenario would be to game ONLY inside the VRR window on FreeSync displays, but that seems unlikely in this case.
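As a rough sketch of how that frame multiplication could work (our illustration based on NVIDIA's public description, with made-up panel limits, not NVIDIA's actual firmware logic): when the render interval grows longer than the panel can legally hold a frame, the module redraws the same frame enough times to keep every refresh interval within the panel's range.

import math

# Illustrative sketch of G-Sync-style frame multiplication below the
# VRR window. Panel limits are made up (a 30-144 Hz panel); this is
# not NVIDIA's actual algorithm.

PANEL_MIN_HZ = 30.0                    # slowest legal refresh
MAX_HOLD_MS = 1000.0 / PANEL_MIN_HZ    # longest a frame may be held

def refresh_plan(frame_interval_ms: float):
    """Return (redraws, refresh_interval_ms) for one rendered frame."""
    if frame_interval_ms <= MAX_HOLD_MS:
        return 1, frame_interval_ms    # inside the VRR window: draw once
    # Repeat the frame just enough times to stay above PANEL_MIN_HZ,
    # spacing the redraws evenly across the render interval.
    redraws = math.ceil(frame_interval_ms / MAX_HOLD_MS)
    return redraws, frame_interval_ms / redraws

print(refresh_plan(25.0))  # 40 FPS -> (1, 25.0): panel follows the game
print(refresh_plan(40.0))  # 25 FPS -> (2, 20.0): same frame drawn twice, at 50 Hz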

It's fair to assume that the LG 34UM67 is probably the worst monitor to start a FreeSync evaluation with, as its narrow window of variable refresh more easily brings the complications of FreeSync to light. That is not a slight on the LG monitor itself; it is doing what it can based on the IPS panel technology it has access to. With the Acer and BenQ monitors that each use a TN 40-144 Hz panel, you get a much larger window of variable refresh rates, and thus you are going to see the issues presented by the outside experience zones much less frequently. We have the BenQ monitor in-house as well, and I'll be spending more time with it in the next handful of days for an updated FreeSync story.

Speaking of other models, AMD has eight monitor models that are FreeSync approved that are either shipping now or will be shipping within the next couple of weeks.


Acer and BenQ will have 40-144 Hz TN panel options, LG has 34-in and 29-in 2560x1080 options, Nixeus and ViewSonic have 144 Hz 1920x1080 options, and Samsung will be shipping a pair of 4K options with a maximum refresh rate of 60 Hz and screen sizes ranging from 24-in to 32-in. I don't have all the information for all of these monitors yet, but that is a great range of products for an initial release. Most of the emphasis from AMD is on the 2560x1440 and 2560x1080 models, based simply on demand from users. The Acer XG270HU will ship with a starting MSRP of just $499, $100 less than the matching G-Sync monitor from Acer. The BenQ option will sell for a $599 MSRP, though I already see it showing up with a price tag of $629. Considering that the most popular G-Sync monitor, the ASUS ROG Swift PG278Q, currently sells for $760, it appears that FreeSync monitors will sell for lower prices than G-Sync ones. How long that lasts will be the most interesting question: I have often hoped that competition in the VRR display market will drive all prices down.

This highlights another point in favor of AMD's FreeSync implementation: a wide and varied array of display options. Because the standard is open and can be adopted by vendors using different scalers and controllers, at different resolutions and with different panel technologies, AMD already has a more varied collection of displays supporting its variable refresh technology than NVIDIA has with G-Sync. I expect that trend will continue as well, with FreeSync showing up in all kinds of display shapes and sizes thanks to the lower barrier to entry of implementation.

The driver released today to support shipping FreeSync monitors adds VRR support for most Radeon GPUs on the market. The Radeon R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260 all support FreeSync, though the R9 280X/280 and R9 270X/270, as well as the entirety of the HD 7000 series, do not. That is not an arbitrary designation on AMD's part but rather a silicon limitation: the display controller in Southern Islands does not support dynamic switching. Obviously all future GPUs will support VRR technology.

CrossFire support isn't included for any GPUs today, though AMD is promising another driver update next month that will add it; apparently, making sure that all the frame pacing work that went into the driver over the last two years is properly implemented and functional with FreeSync took a bit longer than expected.

Ghosting Concerns

With the two working FreeSync monitors we have been testing over the past week or so, we did notice that ghosting of animated images was prevalent. Our first monitor was the LG 34UM67, an IPS display, and thus we actually expected a bit more ghosting than on our comparable 144 Hz TN panels. But we saw a similar effect with the BenQ 144 Hz TN FreeSync monitor as well.

You can see an animated GIF version of this side-by-side comparison as well.

Also, because these were filmed in the high speed mode of an iPhone 6, don't mistake the slow playback rate for stutter in the animations. The point is to focus on the ghosting.

While I was away at GTC, Allyn did some more work looking into and exploring the ghosting we saw on the FreeSync monitors. In the animation above you can see three different displays at work, all using the AMD FreeSync demo of a windmill rotating and also panning on the screen. On the left is the G-Sync enabled ASUS PG278Q ROG Swift at 144 Hz refresh rate; in the middle is the BenQ XL2730Z 144 Hz monitor and on the right is the LG 34UM67 at 75 Hz. The animation frame rate was set to 45 FPS for both the ASUS and BenQ displays but the LG had to be set at 55 FPS due to limitations of the demo software.


The ROG Swift animates at 45 FPS without any noticeable ghosting at all. The BenQ actually has a very prominent frame ghost, though the image still remains sharp and in focus. The LG 34UM67 shows multiple ghost frames and causes the blade to appear a bit smudgy and muddled.

The question now is: why is this happening and does it have anything to do with G-Sync or FreeSync? NVIDIA has stated on a few occasions that there is more that goes into a VRR monitor than simply integrating VBLANK extensions, and has pointed to instances like this as examples of why. Modern monitors are often tuned to a specific refresh rate – 144 Hz, 120 Hz, 60 Hz, etc. – and the power delivery to the pixels is built to reduce ghosting and image defects at that rate. But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be operating at non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by changing the amount of voltage going to the pixels at different refresh rates, allowing pixels to untwist and retwist at the appropriate speed.
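As a toy illustration of the general idea NVIDIA describes, here is what refresh-rate-dependent overdrive scaling might look like. The gain table and the blending rule below are entirely invented for illustration; NVIDIA has not published how the G-Sync module actually does this.

# Toy model of refresh-rate-dependent overdrive. All values invented.

# Overdrive gains tuned for a fixed 144 Hz refresh, per transition type.
GAIN_AT_144HZ = {
    ("dark", "light"): 1.25,   # overshoot rising transitions the hardest
    ("light", "dark"): 1.15,
    ("dark", "dark"): 1.00,
    ("light", "light"): 1.00,
}

def drive_level(target: float, prev: str, curr: str, refresh_hz: float) -> float:
    """Scale a 144 Hz-tuned overdrive gain to the current refresh rate.

    At lower refresh rates the pixel has more time to settle, so less
    overshoot is needed; applying the fixed 144 Hz table at, say, 48 Hz
    would over-drive the pixel and create artifacts.
    """
    gain = GAIN_AT_144HZ[(prev, curr)]
    fraction = min(1.0, refresh_hz / 144.0)   # relax the gain as the rate drops
    return target * (1.0 + (gain - 1.0) * fraction)

print(drive_level(0.8, "dark", "light", 144.0))  # full overdrive: 1.0
print(drive_level(0.8, "dark", "light", 48.0))   # gentler drive: ~0.87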

It's impossible to know right now if that is the cause of the difference seen above. But with the ROG Swift and BenQ XL2730Z sharing the same 144 Hz TN panel specifications, there is obviously something different about the integration. It could be panel technology, it could be VRR technology, or it could be settings in the monitor itself. We will be diving deeper into the issue as we spend more time with different FreeSync models.

For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (TCONs, scalers, etc.) and by driving a “fast evolution” in this area.

Initial Thoughts

Our time with FreeSync has been shorter than expected, and I don't want to draw any concrete conclusions about the technology just yet. We now have a good range of monitors in our office or on the way to help us make the best possible recommendations for gamers. It took longer than we wanted to arrive, but the technology is finally available to AMD Radeon PC gamers.

When using the FreeSync display inside the VRR window, which is unique to each monitor model, the gaming experience is drastically improved over traditional VSync enabled or disabled configurations. You get a clean, tear-free and stutter-free animation and PC games look better than they ever have on an AMD GPU. The NVIDIA G-Sync experience and the AMD FreeSync experience in this case are basically identical.


Above the maximum refresh rate, AMD’s current solution is actually better than what NVIDIA offers today, giving users the option to select a VSync enabled or disabled state. G-Sync forces a VSync enabled state, something that hardcore PC gamers and competitive gamers take issue with.

Below the minimum refresh rate things get dicey. I believe that NVIDIA’s implementation of offering a variable frame rate without tearing is superior to simply enabling or disabling VSync again, as the issues with tearing and stutter not only return but are more prominent at lower frame rates. When I pressed AMD on this issue during my briefing they admitted that there were things they believed could work to make that experience better and that I should “stay tuned.” While that doesn’t help users today, AMD thinks that these problems can be solved on the driver side without a change to the AdaptiveSync specification and without the dedicated hardware that NVIDIA uses in desktop G-Sync monitors.

My time with today's version of FreeSync definitely shows it as a step in the right direction, but I think it is far from perfect. It's fair to assume that, after telling me as far back as September of last year that FreeSync would be sent to reviewers, AMD found that getting display technologies just right is a much more difficult undertaking than originally expected. They have gotten a lot right: no upfront module costs for monitor vendors, better monitor feature support, wide vendor support and lower prices than currently selling G-Sync options. But there is room for improvement: ghosting concerns, improving the transition between the VRR window and non-VRR frame rates, and figuring out a way to enable tear-free and stutter-free gaming under the minimum variable refresh rate.

Even though our LG 34UM67 has a fairly narrow VRR window at just 27 Hz, the BenQ and Acer 2560x1440 monitors will offer a much larger 104 Hz range, diminishing the impact of the negative characteristics above. I'll be spending more time with those over the weekend and into early next week and will definitely have an updated story for our readers to follow up on.

FreeSync is doing the right things and is headed in the right direction, but it can’t claim to offer the same experience as G-Sync. Yet.

March 19, 2015 | 12:26 PM - Posted by Shambles (not verified)

"So let’s talk about that. Though it doesn’t exist, imagine if you would a matching 2560x1080 75 Hz monitor with G-Sync support."

I'm guessing that should be 'imagine you are watching'.

March 19, 2015 | 12:32 PM - Posted by Allyn Malventano

No, he's talking about a hypothetical 2560x1080 75 Hz G-Sync panel.

March 19, 2015 | 03:05 PM - Posted by Shambles (not verified)

Ahhh right. My English-to-brain translator is working again.

March 23, 2015 | 08:17 AM - Posted by Ninjawithagun

Uh, no it's not a hypothetical monitor:

"...Available soon with an MSRP of $649, the 34UM67 has a dynamic refresh rate range of 48-75 Hz."

Next time, read the ENTIRE article...oh wait, you were one of the authors of the article, so what gives? lol

May 5, 2015 | 03:56 PM - Posted by Allyn Malventano

Next time, you should read the word G-Sync.

March 19, 2015 | 12:32 PM - Posted by Anonymous (not verified)

I don't like that ghosting at all. I do like seeing the panels with very little markup; if anything, I like this because maybe G-Sync panels will drop some. Each monitor using G-Sync should drop a minimum of $200 to really entice people into making the plunge.

March 19, 2015 | 12:33 PM - Posted by Allyn Malventano

Now that there's actual competition, I wouldn't be surprised to see prices come down some.

March 19, 2015 | 01:35 PM - Posted by Anonymous (not verified)

So, uhm...Allyn. Could you explain to us how Adaptive Sync might introduce Ghosting? I don't really see the connection. I would guess it has to do with the preconfigured settings of the monitor, set by the manufacturer. Or am I missing something here?

March 19, 2015 | 02:11 PM - Posted by Anonymous (not verified)

A-Sync doesn't introduce ghosting; the panel does that. This review is painting with a rather broad brush, intentionally or not. FreeSync is capable of ranges from 9-240 Hz; it's up to the monitor manufacturer/panel to establish what range within THAT range the monitor will operate in, as well as all the other components within the monitor (IPS, TN, 4K, 1440p, 720p, backlight, scaler, inputs, speakers, bezel material).

Sh*t panel = issues. That's not FreeSync's fault.

March 19, 2015 | 02:12 PM - Posted by Anonymous (not verified)


Sh*t implementation = issues. That's not FreeSync's fault.

March 19, 2015 | 02:44 PM - Posted by Allyn Malventano

The implementation *is* FreeSync, at least as far as AMD has been claiming it. The panels we are testing have been 'certified' by AMD and have a FreeSync logo on the box.

March 23, 2015 | 08:10 AM - Posted by Anonymous (not verified)

Can you please make detailed tests with
1- FreeSync OFF & Blur Reduction OFF.
2- FreeSync OFF & Blur Reduction ON.
3- FreeSync ON & Blur Reduction OFF.
4- FreeSync ON & Blur Reduction ON.

While testing Blur Reduction ON try to cover all the possible Blur Reduction possibilities and options.

And make it all on video, because I don't trust PCPER.

March 19, 2015 | 02:43 PM - Posted by Allyn Malventano

The panel in the BENQ and the ROG Swift is the same. The difference is how it is being driven.

Further, that FPS range claim is irrelevant and a bit misleading. All G-Sync panels can actually stay in VRR mode all the way down to 1 FPS, while FreeSync panels drop out of VRR once they hit the minimum allowable (hardware) panel refresh rate.

March 23, 2015 | 11:57 AM - Posted by db87

That doesn't matter really unless you're looking for slideshows.
Everything below 30fps should be avoided regardless of using G-Sync or FreeSync...

March 30, 2015 | 03:49 PM - Posted by Anonymous (not verified)

while you obviously aren't aiming for high framerates, sudden dips could lead to noticeable stutter, and some panels will disengage FreeSync mode as high as 48 Hz.

March 30, 2015 | 03:54 PM - Posted by Anonymous (not verified)

*low meant to say LOW not HIGH

March 22, 2015 | 04:54 PM - Posted by Anonymous (not verified)

You know that it is intentional. Pcper is green mostly

March 19, 2015 | 02:40 PM - Posted by Allyn Malventano

As we understand it, G-Sync uses its on-board memory to buffer the previous frame, using this data combined on-the-fly with the next incoming frame to calculate appropriate drive voltages and compensate for ghosting (similar to overdrive).

FreeSync relies on the regular adaptive sync TCONs that at present don't seem capable of correcting for this when operating in VRR modes. The BENQ panel does have a 'Blur Reduction' mode, but that drops it out of FreeSync mode and shifts to constant refresh when enabled. One note - switching this while running the FreeSync demo hung the display and I could only get the screen back up after rebooting both the system and hard restarting the panel by unplugging it.

March 19, 2015 | 03:13 PM - Posted by Anonymous (not verified)

Though Freesync may not be able to correct it, it is not introducing it either.

So it's actually an issue of the panel technology and not really tied to Adaptive Sync. G-Sync only compensates for it by having a dedicated scaler with an onboard buffer, and the manufacturer has to rely on Nvidia for the OSD and other stuff.

March 19, 2015 | 03:39 PM - Posted by Allyn Malventano

...but FreeSync / adaptive sync *are* introducing it. This is an issue that was corrected years ago in constant refresh rate LCDs, and continued to be corrected with G-Sync. The issue is now re-introduced with current non-G-Sync VRR panels. Whoever's fault it is is not relevant to the fact that ghosting *is* present in the first three FreeSync panels we have looked at so far.

March 19, 2015 | 04:11 PM - Posted by Anonymous (not verified)

So it *is* present with Freesync ON and OFF. Or is it only present with Freesync ON?

I think it is very relevant whether Adaptive Sync is to blame or not. If it's fixable on the manufacturer's end, it has nothing to do with FreeSync/Adaptive Sync. The manufacturer could also introduce a frame buffer at their own leisure and mitigate that effect.

The question would be (if your theory is proven correct) how big of a buffer you need. Panel Self Refresh necessitates a buffer as well; I wonder if one could use that.

I also wonder about Panel Self Refresh as a technology that is supposed to be part of FreeSync (and part of the VESA standard). Were you able to test it at all?

March 19, 2015 | 04:46 PM - Posted by Allyn Malventano

Having a buffer for self refresh is different than what is needed to properly correct ghosting at variable refresh rates. It's a matter of how you are using the buffer. Simple self refresh would just write to the buffer and only read it when a self refresh is needed. Overdrive calcs performed on-the-fly as the frame comes in, *and* saving that frame back to the same buffer (also on-the-fly), require better RAM than simple TCONs are using.

Overdrive on constant refresh rate panels can be done more simply without necessarily needing a buffer (not as good without one, as you may get negative ghosting, etc, but still possible).

I'd have to recheck, but I was fairly certain that the BENQ panel is either FreeSync *or* 'Blur Reduction' at a constant refresh rate (no VRR).

March 19, 2015 | 04:19 PM - Posted by Anonymous (not verified)

Ghosting is an RTC issue, not a G-Sync/FreeSync one.

Did you measure the RT, RTC & RTCO of each panel?

The ASUS could be doing less RTC than the other panels, which would be the reason it's not ghosting. Faster RT and more RTCO translate into less color accuracy.

JJ from Asus talked about it when Ryan was asking him about the variable refresh monitor that is neither G-Sync nor FreeSync.

March 19, 2015 | 05:03 PM - Posted by Allyn Malventano

The draw time of the BENQ and the Swift both appeared to be at the 'Screen Refresh Rate' setting in Windows - for all tests set to 144 - 7 msec scan from top to bottom. With the BENQ and the Swift draw rates both being the same, and on the same model LCD panel glass, the color accuracy would be similar as well.

(These panels still draw each frame at the same speed, there is just more idle time between draws when operating at lower VRR frame rates. The panels do not appear to be drawing slower at lower rates. We will be testing this with further detail in the coming weeks.)
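As rough arithmetic to illustrate (a sketch, not a measurement):

# The scan-out speed is fixed by the 144 Hz setting; lower VRR rates
# just add idle time between scans. Illustrative arithmetic only.

MAX_REFRESH_HZ = 144.0
SCAN_MS = 1000.0 / MAX_REFRESH_HZ      # ~6.9 ms top-to-bottom draw

def idle_between_scans(vrr_fps: float) -> float:
    """Idle time between full-speed scans at a given VRR frame rate."""
    return 1000.0 / vrr_fps - SCAN_MS

print(f"{SCAN_MS:.1f} ms scan")                    # ~6.9 ms at any VRR rate
print(f"{idle_between_scans(45.0):.1f} ms idle")   # ~15.3 ms between scans at 45 FPS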

March 19, 2015 | 06:00 PM - Posted by Anonymous (not verified)


Just because the panel is the same and both say 1ms RT doesn't mean it has the same characteristics; the firmware and its settings will decide that. Making an assumption like that is naïve. You need to measure them to know what is really going on.

It's more troubling that you're blaming G-Sync/FreeSync for the effects.

March 19, 2015 | 07:10 PM - Posted by Allyn Malventano

For the same panel glass, those values must scale in proportion as the refresh rate setting is changed - it is simple math. You only have 7 msec to draw out the entire panel at a 144 Hz rate. You only have a fixed amount of time to drive each pixel as the scan takes place.

Also, we are not assuming; we are measuring. For now we are discussing measurement results. Later we will present them. Turning the data into an easily presentable form takes time. Further, arguing about the cause is naive, as the end result is what matters: the 'firmware and its settings' of the FreeSync panels we have tested are visually ghosting, and the G-Sync panels are not. It is causing FreeSync to fall behind G-Sync in the visual quality of moving scenes. It's something they (panel manufacturers in this case - not AMD) should fix if they want to be competitive.

March 19, 2015 | 10:26 PM - Posted by Anonymous (not verified)

Allyn might want to read up on this review done by people who know monitors a bit more than just looking at the screen with an iPhone camera.

ASUS Rog Swift is ghosting at its optimal settings.

March 20, 2015 | 11:15 AM - Posted by Allyn Malventano

1. Our site is not named after panel technology, so we don't dive that deep.

2. Yes, light ghosting, not 2-3 full previous frames' worth, where the complete ghost (and even detail) is clearly visible.

March 20, 2015 | 02:00 PM - Posted by Anonymous (not verified)

As the site details, out of the box it's worse, and not until they calibrated and adjusted settings did it improve.

None of which you detailed in this preview. You end up asking.

The question now is: why is this happening and does it have anything to do with G-Sync or FreeSync?

You then go on to implicate FreeSync as the culprit and GSync as the solution in the comments.

The implementation *is* FreeSync, at least as far as AMD has been claiming it. The panels we are testing have been 'certified' by AMD and have a FreeSync logo on the box.

One minute earlier you had said this.

The panel in the BENQ and the ROG Swift is the same. The difference is how it is being driven.

You recognize it's hardware driven, then the next minute you're blaming the GPU software side.

As TFT Central shows, the Gsync monitors ghost to varying degrees depending on settings. Did you even take the time to properly test and calibrate the monitor settings before you jumped to this conclusion?

You can clearly see, depending on settings in that review, you can get ghosting on a Gsync monitor. I do expect the adaptive sync monitors to have slightly more RTC and RTCO since they have to handle more inputs (DVI, HDMI, along with AUDIO INPUT) as opposed to just dealing with DP as the Gsync monitors do.

That would be common sense, but I wouldn't jump to the conclusion or speculate that it's the software, having said one minute earlier it could be the hardware in the monitor. Which I would agree with.

It's just a shame.

March 21, 2015 | 01:38 PM - Posted by Anonymous (not verified)

It is a shame. Confirmed not the same panel by a trusted source.

This is 10 days old. Wonder why it wasn't brought in.

March 24, 2015 | 07:22 AM - Posted by Ninjawithagun

You are naive to think the opposite WITHOUT PROOF. You are great at quoting specifications and expectations, but crappy at proving anything yourself. If you feel so passionate about FreeSync then WRITE YOUR OWN ARTICLE AND POST YOUR FINDINGS WITH FACTS AND EVIDENCE!! You are great at providing article links, so what? You know how to use Google - woo hoo. Now be the man and provide your OWN analysis and review. NUFF SAID!

March 23, 2015 | 12:03 PM - Posted by db87

Did you guys even talk with Benq about this?
To me it seems you guys are very fast with your conclusions without doing a proper apples-to-apples, in-depth investigation. It's so easy to blame FreeSync...

March 24, 2015 | 08:33 PM - Posted by trenter (not verified)

I think either Maximum PC or PCWorld has a freesync monitor that shows no sign of ghosting; they were debating this topic themselves. Amd has commented that different displays house different electronics and that is what causes ghosting, not freesync.

March 19, 2015 | 01:03 PM - Posted by mAxius

unfortunately gsync is nothing more than drm that nvidia is using to lock users into its ecosystem; nv needs to maintain its lock-in and profit margins from the gsync modules

March 19, 2015 | 07:35 PM - Posted by svnowviwvn

How often have you hit your head with a brick?

March 19, 2015 | 08:25 PM - Posted by Anonymous (not verified)

Well duh. The company does exist to make a profit and provide returns for its shareholders. They were first to market with this so they will enjoy the spoils while they can.

March 19, 2015 | 10:52 PM - Posted by Anonymous (not verified)

g-sync isn't drm. it works as advertised, and besides, amd's gpu offerings of late have been pitiful. this is why their market share is where it is. nvidia came to market with this technology first, therefore they can charge a premium for adopters who want it. it's the way the world works. yes, sure they are locking users into an ecosystem, but it's a damn sight better than the alternative.

March 23, 2015 | 07:48 AM - Posted by Ninjawithagun

You can buy a used Asus ROG Swift on Amazon right now for below $600. Go check it out. As long as they are a Prime reseller, your purchase will be covered by Amazon directly. This makes it a "no risk" purchase as you have plenty of time to return the monitor if you don't like it AND Amazon will pay for the return shipping! Win-win

June 29, 2015 | 01:18 AM - Posted by Astrix_au (not verified)

There was a firmware fix for the BenQ XL2730Z that fixed the overdrive issue with FreeSync; the old firmware disabled overdrive when FreeSync was activated. I have the V002 firmware, and only after updating did overdrive work on DisplayPort with FreeSync on or off.

They should retest it now with an updated monitor. Monitors will be shipped with the fix in July, or you could take yours to BenQ anytime for the update.

June 29, 2015 | 01:26 AM - Posted by Astrix_au (not verified)

Actually, overdrive didn't work with or without FreeSync on; it wasn't working over DisplayPort at all. I mentioned above that it worked with FreeSync on; I was wrong. Only after updating my firmware at BenQ did overdrive (AMA) start working on DisplayPort.

March 19, 2015 | 12:41 PM - Posted by Anonymous (not verified)

It's a great showing for an initial release of a new tech. What I'm more excited about is when this tech has been out for a year or two and I have the money to upgrade to a VRR monitor and compatible GPU, maybe end of next year or the year after; by then I'm sure Gen 2 or even Gen 3 of both sync techs will be greatly improved and prices will be a good bit lower.

For now, I think it is safe to say that if you are going to get a FreeSync monitor and compatible GPU anytime soon, better make sure you are getting a high end GPU and not the R9 260(X), to avoid the low end. Otherwise, wait.

March 19, 2015 | 12:51 PM - Posted by gloomfrost

aw too bad that ghosting is a real deal breaker for me :( hope they fix this problem real soon.

I really want to thank you guys for keeping these manufacturers honest.

March 19, 2015 | 04:20 PM - Posted by Ryan Shrout

Thanks for reading.

I think the ghosting fix can be done in a couple of ways so AMD has some options here.

March 23, 2015 | 12:10 PM - Posted by db87

It's so easy to blame AMD, but what makes you think that?
Did you talk with monitor manufacturers?

March 19, 2015 | 12:53 PM - Posted by Anonymous (not verified)

What amateurs there are doing reviews at this site. Ghosting is a byproduct of the panel, whether it is TN or IPS, not the freesync/gsync/any other sync technology.

Not all panels exhibit the same level of ghosting. Go ahead and test the ROG swift vs another gsync panel and see there are differences in ghosting.

Total amateurs here. Although why would anyone be surprised to see pcper, a marketing wing of nvidia, spin up some nonsense to claim gsync is better.

Refuse review.

March 19, 2015 | 01:06 PM - Posted by Anonymous (not verified)

Did you actually read the review? The panels in the Rog Swift and the BenQ are the same, so the difference resides in the implementation of the technology.

March 19, 2015 | 05:02 PM - Posted by trenter (not verified)

Actually, just because the panels are the same does not mean they will perform identically, or even close at all really. Go read up on it; I think Tom's Hardware did a review of a gsync monitor recently and noticed huge differences from the same panel in two different monitors. Even panels in the same model of monitor can perform differently from one unit to another.

March 19, 2015 | 05:05 PM - Posted by Allyn Malventano

Very true, but in this case, we have no ghosting vs. multiple frames of ghosting. This is a matter of which panel driving technology is properly correcting for ghosting. One technology is, while the other one (at present) is not.

March 21, 2015 | 07:41 PM - Posted by remon (not verified)

The panels aren't the same, as someone posted above.

March 19, 2015 | 01:20 PM - Posted by lordbinky (not verified)

Really? It is SOLELY a byproduct of the panel? Well, I guess all those overdrive settings monitors have are all made up, pixel overshoot causing dark trailing is just a delusion, and there is no difference in the hardware that actually controls the panel, since according to your professional assessment ghosting is only a byproduct of the panel selection.

March 20, 2015 | 05:17 PM - Posted by Jeff Armstrong (not verified)

You clearly didn't check the "review". First, this is not a review; it is a first impressions and technical discussion. Between this and other HW sites, this has done by far the best job discussing the technical differences, both advantages and disadvantages.

Next, more in detail: NVidia was quoted as saying that their scaler does more than just trigger a new refresh, including but not limited to adjusting voltages to the panel depending upon the VRR at any given time. This is what makes the Asus panel not ghost, as opposed to the BenQ (the identical panel).

Refuse the fanboy comment.

March 19, 2015 | 01:16 PM - Posted by Anonymous (not verified)

The second page has some absurd things for a comparison.
First, you are not sure if the technology for G-Sync works that way; without confirmation from Nvidia that is only speculation.
With VSync what you get is a repeated frame: if your monitor is 40Hz and your fps are below that, you'll see the same frame repeated at the 40Hz interval until a new one is drawn and the monitor can refresh the screen.
Your speculation about G-Sync is that if you have 25fps the monitor is refreshing at 50Hz. Let's say it could be possible, but that only means you are seeing the same frame refreshed more times until a new one is drawn. Depending on when the next frame is ready it can be better or worse for fluidity.

What is the advantage of refreshing the same frame more times? In any case your eyes will get more tired and power consumption will be higher.

March 19, 2015 | 01:27 PM - Posted by Spacebob

"What is the advantage of refreshing more times the same frame?"

Like they said in the article, this is to help reduce screen flicker.

March 19, 2015 | 01:50 PM - Posted by Anonymous (not verified)

For Free-sync the VRR window is chosen to avoid flickering.
Perhaps Nvidia needs to double the refresh rate at low fps with G-Sync to avoid flickering, but that is not an advantage compared to Free-Sync.

March 19, 2015 | 02:58 PM - Posted by Allyn Malventano

Yes, it is a definite advantage, as the display remains in VRR mode at far lower rates than FreeSync is capable of. At low rates, the G-Sync panel remains smooth while the FreeSync panel either tears or judders.

March 19, 2015 | 03:53 PM - Posted by Anonymous (not verified)

Is there any reason why frames are doubled into the 60 Hz area in this case?
I'm curious why it works like this when 25 fps could theoretically be multiplied to 125 Hz.

March 19, 2015 | 04:50 PM - Posted by Allyn Malventano

To put it simply, it doesn't need to go that high. Also, at 125 Hz, the chances are more likely that a next frame coming in would conflict with a refresh already in progress, as there would be less idle time between redraws at that higher rate. Basically, if you're at the higher rate, you want it perfectly in sync with the game, and if you are having to insert extra frames on the low end, you want as much time between those redraw events as possible, so that if the rate is still changing over time, you are still prepared to draw the next frame the instant the GPU has completed rendering it.
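A quick sketch of that collision argument with illustrative numbers (a 25 FPS render rate and the ~7 ms scan-out discussed elsewhere in these comments):

import math

# Compare repeating a frame at 50 Hz vs 125 Hz while the game renders
# at 25 FPS (40 ms between frames). Illustrative numbers only.

SCAN_MS = 7.0             # one full top-to-bottom panel scan
FRAME_INTERVAL_MS = 40.0  # 25 FPS render rate

def idle_before_next_frame(refresh_hz: float) -> float:
    """Idle gap between the end of the last redraw and the expected
    arrival of the next rendered frame."""
    interval = 1000.0 / refresh_hz
    redraws = math.ceil(FRAME_INTERVAL_MS / interval)   # repeats of this frame
    last_start = (redraws - 1) * interval               # when the final redraw begins
    return FRAME_INTERVAL_MS - (last_start + SCAN_MS)

print(idle_before_next_frame(50.0))   # ~13 ms of slack before the next frame
print(idle_before_next_frame(125.0))  # ~1 ms of slack: a new frame likely collides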

March 24, 2015 | 08:45 PM - Posted by trenter (not verified)

Does that not introduce more input lag?

March 19, 2015 | 05:27 PM - Posted by trenter (not verified)

So what happened to gsync being unable to avoid stuttering at 30 fps or so; when did they fix that? I was under the impression that low framerates were undesirable and created a bad experience either way.

March 19, 2015 | 05:41 PM - Posted by Allyn Malventano

That was with the original upgrade kit *only*, which juddered between 24-30 FPS. The retail panels do not have this issue, as additional redraws are proportionally centered between the incoming frames (as best as the module can approximate based on the frame rate it has seen come across). Any steady drop in frame rate (over a few successive frames) remains smooth all the way down to 1 FPS. There is still an issue when instantaneously going from a high frame rate to zero, but that is more of a corner case. I've observed this effect with our lab equipment, but it's a complicated thing to try to explain and demonstrate, as evidenced by the mass confusion even after Ryan and I both offered our own explanations/graphs in this article :).

March 19, 2015 | 10:34 PM - Posted by Anonymous (not verified)

I don't see the behavior below 30 Hz or so to be that important. Once you drop this low, it is going to be noticeable just about no matter what you do. One frame per second is a slide show. You will lose the illusion of smooth motion, although the cut-off varies from person to person. They should try to extend the VRR window down to 24 Hz for these adaptive sync displays.

March 20, 2015 | 06:15 AM - Posted by Anonymous (not verified)

"your lab equipment", like that iPhone camera you used to capture a 144hz panel? Yeah, I totally trust your professional grade lab equipment.

March 20, 2015 | 11:17 AM - Posted by Allyn Malventano

An iPhone camera didn't capture this.

March 19, 2015 | 01:36 PM - Posted by obababoy

Interesting, I didn't think of it that way. The double framing could be smoother than AMD's implementation. I think they were speaking from experience looking at the two monitors. NVIDIA's appeared to be the smoother experience at those framerates.
BTW, I am an AMD fan and dislike NVIDIA's business models and ways of doing things.

March 19, 2015 | 02:13 PM - Posted by Anonymous (not verified)

No, the difference would come when the next frame is ready to be displayed. For example, if you have a 120-60Hz VRR monitor:

If the monitor refreshes the same image after 8.33ms (120Hz) because of low fps and the next frame is ready at 9ms, it has to wait 7.66ms more; without doubling the refresh rate the monitor could refresh the image anytime between 8.33-16.66ms, and in this example it could refresh the image at 9ms and you'll have more smoothness.

March 19, 2015 | 02:51 PM - Posted by Allyn Malventano

First, we have observed the refresh rate effects ourselves, with lab equipment. It's rather easy to see the actual refresh rate when measuring with proper equipment. We will likely revisit this in more detail in the future, but as this was a FreeSync article it was not proper to include here.

For the rest of your comment, you don't appear to understand the effects of V-Sync judder / tearing, especially at lower refresh rates. Lower refresh rates translate to a worse effect when not in VRR mode. The G-Sync panel in my / your 25/50 FPS example is still in VRR mode and is drawing full frames in sync with the game output, while the FreeSync panel would *not* be in VRR mode at that rate.

The frame must be refreshed at a minimum rate or else the pixels will bleed and drift, potentially causing damage if voltages drift too far and pixels are overdriven too high or too low. You can't just shut off the input stream indefinitely and have the image remain static. It is a limit of LCD technology.

March 19, 2015 | 06:53 PM - Posted by Anonymous (not verified)

The monitor is always in the VRR window or it is malfunctioning. This is logical.

Supposedly with FreeSync and "vsync on" the drivers handle the timing between frames and decide if a new frame is sent, or an old one if the new frame is not ready and the monitor needs to refresh. Unless the driver works exactly like vsync below the VRR window, always refreshing at the maximum refresh rate; but according to the FAQ FreeSync can work between 21-144Hz.

With Nvidia there is a buffer on the monitor and it will do the same thing; all reviews say that if the next frame is not ready it will push the same frame. I can't understand the need to double the refresh rate, or how it would do it.

Let's follow your 25fps example with a 120/30 Hz monitor; that's between 8.33 and 33.33ms. Consider 3 frames: the first came at 25fps (40ms), 41.66ms later comes the second frame (24fps), and finally 66.66ms later comes the last one (15fps):
- Desired:
Next frame (41.66ms) - Next frame (108.33ms)
- Doubling refresh rate:
Same frame (20ms) - Same (40ms) - Next (48.33ms) - Same (69.16ms) - Same (90ms) - Next (108.33ms)
- Without doubling refresh rate:
Same (33.33ms) - Next (41.66ms) - Same (75ms) - Next (108.33ms)
In this example doubling the refresh rate gives a worse experience and there is no sync with the second frame.

March 20, 2015 | 11:19 AM - Posted by Allyn Malventano

That's how it worked on the G-Sync upgrade kit. Retail panels insert the frames into the center (temporally speaking) of the rendered frames. They don't wait until the last second.

March 21, 2015 | 07:35 AM - Posted by Anonymous (not verified)

What you say only makes sense if the fps are constant, because then the application knows when the next frame comes. If not, you can double or have 100x refresh rates and you are still going to be out of sync if your fps are below the VRR window. The only way to avoid it would be to have one frame of lag.

What are you using for your tests, a variable frame rate scenario or a constant one like for example a video?

March 21, 2015 | 08:04 AM - Posted by Anonymous (not verified)

Let me guess, windmill with constant or near constant 25 fps.

March 19, 2015 | 01:37 PM - Posted by Anonymous (not verified)

There is nothing preventing AMD from modifying their driver to output the same frame twice or thrice and get the same behaviour as G-Sync below 48Hz. My guess is that they delayed that feature to get the driver out now for use with the FreeSync monitors. There is no technological hurdle to caching the frame on the card instead of the monitor.

March 19, 2015 | 02:53 PM - Posted by Allyn Malventano

Very good point. This could likely be implemented at the driver level. I for one hope to see them do just that actually!

March 19, 2015 | 04:21 PM - Posted by Ryan Shrout

If you read the last page of the story, I have heard AMD hint that might be coming in the future.

March 19, 2015 | 04:25 PM - Posted by Anonymous (not verified)

If that is true and AMD can make that work, then there really will be no point in having a gsync module. If I am understanding this correctly, both gsync and freesync do the same thing in the VRR range of a monitor and also above the VRR range; it is just below the range where they differ, right?

March 19, 2015 | 04:55 PM - Posted by Allyn Malventano

Mostly correct.

If AMD can correctly implement some form of frame redraw multiplication on the low end of the range, and if the manufacturers can get the ghosting issue under control in their TCONs, then yes, they would be equal. I believe those issues contribute to NVIDIA's decision to use a module, at least for the time being.

Above the VRR range, AMD actually has an advantage here as G-Sync is forced to V-Sync ON while FreeSync is selectable (though that same selection applies to operating below the VRR range - you can't choose one for high and another for low).

March 19, 2015 | 08:06 PM - Posted by Annoyingmoose (not verified)

I see no benefit in disabling V-Sync above the VRR range. All you get is tearing at that point.

the additional input-lag measured at max refresh (aka native V-Sync) is easily eliminated by limiting your fps globally to a couple of frames under max refresh, be it G-Sync or FreeSync.

people claiming AMD's advantage in VRR tech is "V-Sync OFF" should slap themselves for being silly :D

March 21, 2015 | 09:38 PM - Posted by Anonymous (not verified)

Agreed. We're also talking about a limit at 144Hz, which, aside from the competitive gamers that have a problem with it, is fine for the remaining 95% of us.

When you buy a G-Sync branded monitor, you know what to expect, and FreeSync isn't there yet. My next monitor will be a 1080p G-Sync monitor. Most likely the BenQ 24".

March 20, 2015 | 12:09 AM - Posted by Anonymous (not verified)

Then this just confirms that Nvidia knew all along that a Gsync module was never really needed in the long run, but in order to be first to market they had to have a module.

As with any new tech that is released, the first wave of the new toys are going to have issues of some type. It is Gen 2 or Gen 3 of these products I'm more interested in. By that time, bugs will be mostly worked out and prices would have dropped a good bit, that is when I'll upgrade to VRR.

March 24, 2015 | 07:17 AM - Posted by Ninjawithagun

GSYNC is superior to FreeSync for a number of reasons. You need to re-read this article again to understand the technical differences. It's not about needing a module or not in order for an adaptive refresh rate technology to work. And it's not about price or cost. It's about the fact that Nvidia's proprietary implementation has distinct advantages over the open Adaptive Sync implementation. Stop thinking like a fanboy and think like a scientist - with an OPEN MIND!! High school is over now, learn how to think for yourself.

March 20, 2015 | 03:02 PM - Posted by Alexious (not verified)

Do you believe that the ghosting issue could be resolved with a firmware/driver update?

It seems like you think this is specifically caused by FreeSync more than the panel itself. I'm very interested in the LG 29UM67 as it provides 21:9 experience, IPS panel and FreeSync with a very low price, however this much ghosting is worrying.

Can you tell us if this was very noticeable while playing or not? I'm used to the Asus PA246Q (IPS 16:10), which has 8.6ms according to Prad.

Also, I don't quite get why the colors look really bad in that image - I thought IPS was supposed to be more accurate than TN screens in this regard?

March 20, 2015 | 06:21 PM - Posted by Allyn Malventano

The LG had the worst ghosting by far. It's not *caused* by FreeSync, though it did get by AMD's validation process. The ghost correction would come from overdriving, which is a function of the TCON. At present it does not seem that this part of any display is end user firmware updateable, so I don't believe a fix could be pushed out. It would have to be a newer revision of the display. Dell used to do this a lot back in the day (same model, newer revision, fixing issues with ghosting, dithering, etc).

The ghosting was very noticeable on the LG. Moving objects, especially light vs. dark, appear 'smeary'. IPS panels will all do this to some extent, but the LG IPS seems worse than other regular IPS panels.

Don't trust the coloring in those images. The camera was on auto white balance and trying to lock onto moving targets in a room with the lights out :). The pics were only meant to show the ghosting, and were all taken at the same aperture / exposure / ISO.

March 20, 2015 | 06:30 PM - Posted by Alexious (not verified)

Thanks for the answer. I take it there are no viable options to tweak the panel's overdrive via OSD, then?

Anyway, really disappointed. I hope in the short term future a new 21:9 29" 2560x1080 GSYNC/FreeSync display will be available, without this level of ghosting.

March 22, 2015 | 05:17 PM - Posted by Anonymous (not verified)

Freesync is working on the monitor as intended yes?
What are they going to do? Force LG to switch to a different panel?

March 24, 2015 | 08:57 PM - Posted by trenter (not verified)

What do you mean it got by AMD? If ghosting is caused by the panel's electronics and not FreeSync, then what does AMD have to do with that? I could see how you could blame Nvidia for their flickering issues, because they built the module. AMD certifies that FreeSync works correctly on the monitors they test; they can't tell the manufacturer not to release their product because the image ghosts. It's like AMD saying no, you can't release that monitor because the color is off, the response time is slow, the viewing angle sucks, etc.

March 19, 2015 | 06:40 PM - Posted by Anonymous (not verified)

I fully agree. Why does it not just refresh sooner, right? But it took them unearthly long to release what seems to be the easiest part of the process: the driver.

Also, independent setting of what happens above and below the zone would not hurt.

March 19, 2015 | 01:46 PM - Posted by Ophelos

"NVIDIA’s has often stated that G-Sync introduces no additional frame latency and should cause no performance deltas when compared to a standard monitor with VSync off as long as the frame rates don’t cross over the maximum refresh rate of the panels. Though the performance deltas that AMD claims exist on the NVIDIA solutions are basically within the margin of error for manual testing of gameplay, the fact that it exists at all is interesting and should be investigated."

This info about the latency from G-Sync was known for a long time now. Even Richard Huddy from AMD said during CES that G-Sync had this issue.

You can check out the video on it here:
It starts at 5:48 in the video.

March 19, 2015 | 02:56 PM - Posted by Allyn Malventano

We don't disagree that it has been said, but we have not been able to confirm it in our own testing. Further, we have been told that the G-Sync module does not buffer the stream in any way that would delay the frame coming in to the actual drawing to the panel. For both technologies in VRR mode, the panel is being drawn *as* the frame data is coming in from the GPU, so where this delay could be coming from is difficult to figure and even harder to measure.

March 19, 2015 | 03:50 PM - Posted by Anonymous (not verified)

Hmm maybe frametime analysis would be more telling - is it possible that some super fast frames are slipping into averages with sync technologies disabled?

March 20, 2015 | 01:58 AM - Posted by Anonymous (not verified)

The fact that the GPU has to poll the display when G-sync is enabled is what adds latency. Free sync does not require any polling of the display.

March 20, 2015 | 11:21 AM - Posted by Allyn Malventano

Nobody has confirmed any such polling exists.

March 20, 2015 | 01:41 PM - Posted by Anonymous (not verified)

Yet you referred to it.

March 19, 2015 | 02:40 PM - Posted by Allyn Malventano

As we understand it, G-Sync uses its on-board memory to buffer the previous frame, using this data combined on-the-fly with the next incoming frame to calculate appropriate drive voltages and compensate for ghosting (similar to overdrive).

March 20, 2015 | 06:23 PM - Posted by Allyn Malventano

None of this needs any sort of two-way communication between the G-Sync module and the GPU. The two-way communication in that case is between the module and its connected RAM.

March 20, 2015 | 08:08 PM - Posted by Anonymous (not verified)

Who said anything about two-way communication? Are you OK? You don't make sense.

March 21, 2015 | 01:13 AM - Posted by Anonymous (not verified)

He's got to move the goalposts or distract.

Half of his comments don't make sense at all, and this is supposed to be a monitor review.

He's getting laughed at in other forums, and people are pointing to this as yet another example of why people think PC Perspective is a shill for Nvidia.

March 21, 2015 | 02:20 AM - Posted by Anonymous (not verified)

"polling" implies two way communication.

March 20, 2015 | 04:10 PM - Posted by Anonymous (not verified)

This is what the guys over at Blur Busters said.

Why is there less lag in CS:GO at 120fps than 143fps for G-SYNC?

We currently suspect that fps_max 143 is frequently colliding near the G-SYNC frame rate cap, possibly having something to do with NVIDIA’s technique in polling the monitor whether the monitor is ready for the next refresh. I did hear they are working on eliminating polling behavior, so that eventually G-SYNC frames can begin delivering immediately upon monitor readiness, even if it means simply waiting a fraction of a millisecond in situations where the monitor is nearly finished with its previous refresh.

March 20, 2015 | 06:31 PM - Posted by Allyn Malventano

They state 'fps_max' - if they are setting it to 143 (or whatever), the game engine is likely colliding with the graphics driver, or specifically telling the GPU driver to poll the display. This has nothing to do with what would normally happen (i.e. not using fps_max). G-Sync does not need this two-way communication to operate.

March 19, 2015 | 03:21 PM - Posted by Anonymous (not verified)

I can tell you, going from vsync to gsync you take a little hit in performance. I picked up a monitor this weekend and all my GPU scores dropped slightly, even with gsync globally disabled.

Once I hooked up my old 120hz monitor everything was fine.

March 19, 2015 | 03:34 PM - Posted by Allyn Malventano

Which two panels are you talking about here? If you see a difference, we'd like to replicate it.

March 20, 2015 | 05:23 PM - Posted by NonFanboy Anonymous (not verified)

Somehow I don't believe you.... I picked up a Gsync monitor recently and my frame rates turned out to be the same. Just like what PCper quoted.

March 20, 2015 | 05:21 PM - Posted by NonFanboy Anonymous (not verified)

"This info about the latency from G-Sync was known for along time now. Even Richard Huddy from AMD said during CES that G-Sync had this issue."

Wait, wait... AMD says that Gsync suffers from added latency, so you quote another AMD person saying the same thing, and that somehow refutes the hard data PCper collected that contradicts AMD's criticism of Gsync (a technology they scrambled to copy at the last minute)?

Fanboys, sheeeeshhhhh.

March 22, 2015 | 05:22 PM - Posted by Anonymous (not verified)

"a technology they scrambled to copy at the last minute"
Nice jab. They said pretty early on that this was possible without a module and they've taken some time to develop an alternative.
That's not copying and that's not scrambling at the last minute.

But hey.. Fanboys, sheeeeshhhhh.

March 19, 2015 | 02:41 PM - Posted by Anonymous (not verified)

First order of business for VESA as a standards organization is to force AMD to stop using the FreeSync branding on your DisplayPort 1.2a AdaptiveSync standard.

FreeSync is a proprietary naming/branding from AMD; the proper name for the standard is NOW VESA DisplayPort 1.2a AdaptiveSync. There should be no use of any proprietary company's naming/branding on a now industry-standard video synchronization standard. This naming monkey business cannot be allowed, or VESA will suffer damage to its impartial reputation as an all-inclusive standards organization. The "FreeSync" name and associated proprietary branding should have been retired as a precondition of the "FreeSync" IP's adoption by the VESA standards organization. This makes VESA look bad and will continue to result in confusion. VESA, your reputation is at stake as an impartial, all-inclusive standards organization.

The marketing departments of both AMD and Nvidia, or any other proprietary company, needs to be kept away from any industry standards naming, impartiality must be maintained!

March 19, 2015 | 04:15 PM - Posted by Dave B (not verified)

Adaptive-Sync specifies the display-side standard that makes a number of things feasible, one of which is dynamically adaptive refresh. FreeSync is the combination of that standards-body hardware functionality with the software and other hardware controls that let adaptively controlled panel refresh rates work with applications.

March 19, 2015 | 05:17 PM - Posted by arbiter

The problem is that too many people talk and act like FreeSync and adaptive sync are one and the same, when they are not. It's like calling G-Sync and adaptive sync one and the same since they do the same thing; that completely ignores that both are proprietary solutions from each company.

AMD has to grant the logo to certify a monitor as FreeSync. AMD would never give that to a G-Sync monitor, and the software is locked to their GPUs. Didn't AMD just lock FreeSync to AMD GPUs without anyone batting an eye, while they kept dissing NVIDIA for the same thing?

March 19, 2015 | 05:30 PM - Posted by trenter (not verified)

The difference is that AMD GPUs that are FreeSync capable are also adaptive sync capable. Asus is releasing a monitor with adaptive sync that will not be certified by AMD but will still function with AMD cards just like FreeSync would. Unfortunately, NVIDIA GPUs will not function with adaptive sync.

March 19, 2015 | 05:31 PM - Posted by Allyn Malventano

FreeSync *is* adaptive sync. No more. No less.

G-Sync is its own thing, but we believe it uses the adaptive sync vblank method while having a module on the other end that does other things to minimize the negative effects at the low end of the VRR range (as well as effectively extending that range lower than the panel hardware limit).

Good point on the AMD locking, but I don't think that is what is really going on here. A G-Sync panel won't work with an AMD card + FreeSync driver not because of AMD whitelisting (they reportedly don't do that), but because the G-Sync panel does not report itself as being a true adaptive sync-capable device. That's more on NVIDIA than AMD.

March 19, 2015 | 05:49 PM - Posted by Dave B (not verified)

"FreeSync *is* adaptive sync. No more. No less."

From the panel side, yes; the panel vendors are using the name for marketing the feature. From the GPU side, no. Just supporting the Adaptive-Sync spec does not mean that dynamic adaptive refresh will operate at an application level. FreeSync is the full mechanism that takes the adaptive sync specification and turns it into a feature that operates on applications and delivers the end-user benefit.

March 19, 2015 | 07:14 PM - Posted by Allyn Malventano

Yes, and on the GPU side, FreeSync and G-Sync are doing the same thing. Again, the difference lies in G-Sync panels having a different module.

March 19, 2015 | 07:38 PM - Posted by trenter (not verified)

Oops that last statement was redundant, sorry.

March 19, 2015 | 07:29 PM - Posted by trenter (not verified)

Yeah, that's what I meant. AMD has a certification program for FreeSync, but you can ignore that program and build a monitor with adaptive sync that will be supported by AMD GPUs. NVIDIA only supports G-Sync at the moment; adaptive sync monitors will not work with NVIDIA GPUs. They also may not be certified by AMD for FreeSync, yet they will still work with AMD products that support FreeSync.

March 19, 2015 | 08:17 PM - Posted by trenter (not verified)

FreeSync is only branded on a monitor to communicate to customers that said monitor has been tested and certified by AMD. Not all monitors with DP 1.2a will have FreeSync branding, although they may be adaptive sync capable.

March 19, 2015 | 03:57 PM - Posted by Anonymous (not verified)

Aren't standard monitor features (audio, scaling, color processing, multiple inputs) accessible with FreeSync, making it better than the G-Sync equivalent from NVIDIA?

March 19, 2015 | 04:22 PM - Posted by Ryan Shrout

That is an advantage of FreeSync for sure. It's noted in the story several times.

March 19, 2015 | 05:00 PM - Posted by nevzim (not verified)

The main advantage of FreeSync is that it is a standard, while G-Sync is just a hack.
Now that we have FreeSync monitors on the market, please ask NVIDIA which of their GPUs will support FreeSync, so people who prefer NVIDIA cards can go and buy one.

March 23, 2015 | 09:11 AM - Posted by fadeout32 (not verified)

This, a thousand times this.

March 19, 2015 | 04:28 PM - Posted by Anonymous (not verified)

Thank you, excellent article! Unfortunately, I am very disappointed in FreeSync/G-Sync monitors because of the very high minimum refresh rate. It looks like I'll have to wait for OLED screens in order to play with a 24-60 Hz dynamic refresh rate.

March 19, 2015 | 05:12 PM - Posted by Allyn Malventano

OLED would definitely fix this, but for the time being, G-Sync is working around that issue on the low end. We hope AMD gets on board with a driver-based solution at some point in the future. Gaming through that 40 FPS low end is like intermittently hitting a wall of judder or very obvious tearing (depending on what you have set V-Sync to in the AMD driver).

March 19, 2015 | 05:33 PM - Posted by trenter (not verified)

I never see tearing on my non-adaptive-sync IPS 1440p monitor unless the framerate is above the refresh rate. Where are you guys seeing all this tearing below the native refresh rate?

March 19, 2015 | 05:48 PM - Posted by Allyn Malventano

The only time a fixed refresh panel with V-Sync off does not tear is when both of the following hold (see the sketch below):

A. The game FPS is *exactly* at the panel refresh rate (or a multiple thereof, be it higher or lower).

B. The frames are ready at *exactly* the same time that the panel is starting its frame draw.
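
To make that concrete, here's a toy model (my own illustration, with made-up numbers: a 60 Hz panel and 1080 visible scanlines). With V-Sync off, a frame that completes mid-scanout is latched immediately, and the tear appears at whatever scanline the panel had reached:

```python
# Toy model: where a tear lands on a fixed-refresh panel with V-Sync off.
# Assumptions (for illustration only): 60 Hz panel, 1080 visible lines,
# instantaneous buffer flips whenever a frame finishes rendering.

REFRESH_MS = 1000.0 / 60.0   # ~16.67 ms per top-to-bottom scanout
LINES = 1080                 # visible scanlines per refresh

def tear_line(frame_ready_ms: float) -> int:
    """Scanline at which a buffer flip lands for a frame ready at t ms."""
    t_into_scan = frame_ready_ms % REFRESH_MS
    return int(t_into_scan / REFRESH_MS * LINES)

print(tear_line(20.0))   # ~216: frame lands early in a scanout, tear near the top
print(tear_line(25.0))   # ~540: frame lands mid-scanout, tear mid-screen
```

Only when every frame satisfies both A and B does `tear_line` stay pinned at 0 (the flip coincides with the start of a scanout), which is why any mismatch in rate or phase produces a visible tear.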

March 20, 2015 | 10:51 AM - Posted by Anonymous (not verified)

As another has pointed out, some people just don't notice the tearing below the lower limit. I'd say the only time I ever notice tearing is when the FPS goes above the refresh rate of the monitor; I've never noticed it when going below. I'm sure AMD will come up with something in driver support to adjust the pacing below the VRR range, and I'm also sure there will be other monitors with a VRR range that extends lower than 40. What are the official launch numbers for FreeSync monitors, something like 8, right? So yeah, I think it is safe to say that more monitors are on the way with better ranges.

Until that actually changes, I'm looking at my 3-year-old monitor playing games that are maxed out and occasionally dip into the 40s-50s FPS range, and I don't notice any tearing or visual artifacts; in fact, it looks really good. What this tells me is that if, when using FreeSync, the fps drops below the VRR lower limit and it looks like what I've got now, then this is not as bad as so many make it out to be. It sounds to me like FreeSync makes what I have now better, as opposed to only the VRR range being good and everything below it looking like crap.

March 20, 2015 | 06:35 PM - Posted by Allyn Malventano

Dropping below the VRR limit on FreeSync panels (from what we have observed so far) results in the panel stopping at the lower end of the range and continuing to refresh at that rate, with V-Sync on or off depending on the setting. The effect below the VRR window would be worse than what you are experiencing now, since your panel is likely refreshing higher than 40 Hz, which is where the BenQ would remain for FPS < 40. A 60 Hz LCD would net a ~50% improvement in that scenario.
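
Putting quick numbers on that last claim (my arithmetic, not in the original comment): a panel pinned at a 40 Hz floor redraws every 25 ms, while a plain 60 Hz panel redraws every ~16.7 ms, so the 60 Hz panel refreshes 50% more often:

$$
\frac{60\ \text{Hz}}{40\ \text{Hz}} = 1.5, \qquad 25\ \text{ms} \;\to\; 16.7\ \text{ms per redraw}
$$

That shorter hold time is why a tear or judder artifact on the 60 Hz panel stays on screen for less time than on a FreeSync panel parked at 40 Hz.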

March 24, 2015 | 09:10 PM - Posted by trenter (not verified)

This! I never notice tearing when my fps is below the native refresh rate, only when it's above.

March 20, 2015 | 09:18 AM - Posted by collie

Some people just don't notice the tearing. It's there: little offnesses that just make it look "like a video game". If you spend some time with an adaptive sync monitor, the first thing you will notice is the "realish" smoothness. Then when you go back, you will notice the tearing.

It's something a lot of us just live with... for now..

March 21, 2015 | 09:59 PM - Posted by Anonymous (not verified)

Tearing below the minimum refresh is only half of the problem when performance dips below 60 fps. The other half is choppy, laggy framerates as your GPU struggles to push frames out. These are two problems G-Sync doesn't have; FreeSync has yet to find a solution for either.

March 19, 2015 | 04:33 PM - Posted by Keven Harvey (not verified)

The "above the range" zone is a bit pointless if you use a fps limiter no? Set your settings so that it doesn't dip below minimum refresh rate and a fps limiter so it doesn't go above. I'm curious how OLED monitors would react, do they need to be refreshed like LCD?

March 19, 2015 | 04:34 PM - Posted by engineer123 (not verified)

Thanks a lot for the in-depth analysis.

I hoped that FreeSync would offer almost exactly the same excellent experience as G-Sync, but the practical non-solution for below 40 Hz is honestly pretty bad.

What I find interesting is that a lot of the far less detailed first impressions or "reviews" (as they call them) just state at the end "almost as good as G-Sync", thus playing down the big disadvantage below 40 Hz :)

March 19, 2015 | 04:42 PM - Posted by arbiter

AMD loves to make the loftiest claims and then always ends up falling just a bit short. Their marketing and tech departments tend not to talk to each other much.
Marketing writes a check that the technical department's ass ends up not being able to cash.

It shows why it's nice when Tom comes on for NVIDIA, as he is a tech guy, not marketing. When talking on a hardware/tech podcast you need to send a tech guy, not marketing.

March 19, 2015 | 05:08 PM - Posted by Anonymous (not verified)

Tom Petersen is Technical Marketing for Nvidia.

March 19, 2015 | 05:13 PM - Posted by Allyn Malventano

Sure, but his name is on all of the patents (as an inventor). He is about as far from a stereotypical 'marketing guy' as you can get, technically.

March 19, 2015 | 06:02 PM - Posted by Anonymous (not verified)

That explains why he never disclosed the 970 issues, then.

March 19, 2015 | 04:50 PM - Posted by nevzim (not verified)

1. NVIDIA wants to push scaler manufacturers out of business by taking over the high end of the market.
2. Scaler manufacturers stop supporting NVIDIA GPUs?

March 19, 2015 | 06:38 PM - Posted by Anonymous (not verified)

Hey, Malventano, does your BenQ's FreeSync really go down to 40 Hz, or is it limited to the 56 Hz cap as described on its product page?

March 19, 2015 | 07:18 PM - Posted by Allyn Malventano

The panel we tested is very clearly operating with VRR all the way down to 40 Hz. At 39 FPS I get very clear tearing / judder.

March 19, 2015 | 06:41 PM - Posted by Anonymous (not verified)

Can you enable VSync for the maximum refresh rate and at the same time have it disabled for the minimum refresh rate, or is it just on or off?

March 19, 2015 | 07:17 PM - Posted by Allyn Malventano

There is only one setting, and that setting applies to both ends of the VRR range.

March 20, 2015 | 12:13 AM - Posted by Anonymous (not verified)

I see, I expected as much. Thanks for the reply.
But say you don't have problems with fps dipping below 40 (on a 40-144 Hz display), but you might go above 144 fps. Wouldn't it then eliminate your problems altogether if you set an fps limit of 143? That way you are always inside the VRR window, yet you never hit the max that would enable V-Sync and cause stutter. Or am I wrong here?

March 20, 2015 | 11:23 AM - Posted by Allyn Malventano

Yeah, within those limits you describe, it would be just fine and would be VRR full time.
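
For anyone wanting to try this, the mechanics of a frame limiter are simple; here's a generic sketch (not any particular tool's implementation, and the 40-144 Hz range and 143 fps cap are just the example numbers from this thread):

```python
import time

# Generic sketch of a frame limiter that keeps presentation inside an
# assumed 40-144 Hz VRR window by capping just below the ceiling.
CAP_FPS = 143
MIN_FRAME_TIME = 1.0 / CAP_FPS   # ~6.99 ms minimum between presents

def render_frame():
    pass  # stand-in for the game's actual rendering work

last = time.perf_counter()
for _ in range(100):                 # a few frames, for illustration
    render_frame()
    elapsed = time.perf_counter() - last
    if elapsed < MIN_FRAME_TIME:
        # Hold the frame back so it never presents early. (Real limiters
        # use higher-precision waits than time.sleep.)
        time.sleep(MIN_FRAME_TIME - elapsed)
    last = time.perf_counter()
```

In-game caps (like the `fps_max` console variable discussed earlier in these comments) do the same thing with better frame-time precision, since they pace the engine itself rather than sleeping after the fact.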

March 19, 2015 | 07:37 PM - Posted by Fasic (not verified)

There are 3 problems:
1. High FPS
2. Low FPS
3. Ghosting
For 1: game makers and engines should make an FPS lock possible, and/or AMD should add a "highest FPS lock" option.
For 2: a "double" / "triple" buffering thing :D
For 3: put buffer memory on the TCON so it can buffer one frame and make the ghosting disappear...
That is all that NVIDIA does with G-Sync and the module. "You get what you pay for" :D... yeah, the green team is a little too expensive, but that's how it is when you don't have competition. So fingers crossed for a better AMD and a cheaper NVIDIA :D

March 19, 2015 | 07:41 PM - Posted by ppi (not verified)

I might have an explanation for why the judder was worst and actually visible between 35-48 fps.

The 48-75 Hz panel refreshes the screen at intervals between 13.333 ms and 20.833 ms.

So it goes like this (VSync on):

If a frame is faster than 13.333 ms, it gets locked to that timeframe.
If it is slower, the monitor waits for it.
But it cannot wait longer than 20.833 ms.
If it has to wait that long, it has to refresh.
Now, when it refreshes, the next refresh can take place only 13.333 ms later. I.e., the next frame would be drawn after 20.833 + 13.333 = 34.17 ms, i.e., at ~29.3 fps at the soonest.

If the driver is written correctly, it can then wait a bit longer, so you could have a second adaptive sync window between 24-29 fps.

Why is it not visible on the ROG Swift? Because its range is 30 fps (33.333 ms) to 144 fps (6.944 ms). Therefore, if there is a forced refresh at 30 fps, the next frame can be redrawn after 33.333 + 6.944 = 40.28 ms, i.e., 24.8 fps. This is a much smaller difference, and thus likely less noticeable.

Therefore, a high-refresh monitor can "hide" the lower frame rates much better.
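
The arithmetic checks out; here it is as a quick script (panel ranges as given in this thread):

```python
def forced_refresh_floor_fps(min_hz: float, max_hz: float) -> float:
    """Worst-case effective FPS when a frame misses the panel's maximum
    hold time: the panel force-refreshes (one interval at min_hz) and the
    frame must then wait one more minimum scan interval (at max_hz)."""
    wait_ms = 1000.0 / min_hz + 1000.0 / max_hz
    return 1000.0 / wait_ms

print(forced_refresh_floor_fps(48, 75))    # LG 34UM67:  ~29.3 fps
print(forced_refresh_floor_fps(30, 144))   # ROG Swift:  ~24.8 fps
```

So the LG panel's forced refresh drops a late frame from ~48 fps effective pacing down to ~29 fps, while the Swift only drops from 30 to ~25, which supports the point about the gap being far less noticeable on the high-refresh panel.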

March 19, 2015 | 08:41 PM - Posted by Allyn Malventano

First, the driver 'waiting' at all is an issue, and introduces judder / intermittent added latency.

Second, your description at the end describes what the first generation of G-Sync panels (the upgrade kit) did, and it caused judder between 24-30 FPS, just as your math predicts. That was corrected in current G-Sync: the module does not wait until the last possible moment (33 ms) before forcing a refresh. Instead it 'paces' the refreshes so that they fall in between incoming frames. That way it can perform the extra refresh while (most likely) being ready for the next incoming frame. This prediction only goes so far, but it works the vast majority of the time.
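
For what it's worth, here is that pacing idea sketched as speculative pseudocode (NVIDIA has not published the module's logic, so the structure and numbers below are purely illustrative, using the Swift's 30-144 Hz range):

```python
# Speculative sketch of G-Sync low-FPS refresh pacing (not NVIDIA code).
# Idea: rather than waiting until the maximum hold time expires and then
# being mid-scan when the next frame arrives, predict the frame's arrival
# and slot a repeat refresh so the panel is idle again before it lands.

MAX_HOLD_MS = 33.3   # longest allowed hold (30 Hz panel floor)
SCAN_MS = 6.9        # duration of one scanout at 144 Hz

def schedule_repeat_refresh(predicted_frame_ms: float):
    """When to re-scan the previous frame, or None if VRR can just wait."""
    if predicted_frame_ms <= MAX_HOLD_MS:
        return None                          # inside the VRR window
    midpoint = predicted_frame_ms / 2.0      # pace between incoming frames
    return min(midpoint, MAX_HOLD_MS - SCAN_MS)

print(schedule_repeat_refresh(25.0))   # None: panel simply waits
print(schedule_repeat_refresh(45.0))   # 22.5: re-scan done by ~29.4 ms,
                                       # panel ready for the frame at 45 ms
```

When the prediction misses and a frame lands during the repeat scanout, you get the single-frame judder discussed a few comments down.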

March 20, 2015 | 07:53 AM - Posted by ppi (not verified)

From what you describe, the 30-144 fps window gives the driver & panel much more to work with.

Basically, if, say, halfway through (after 16.7 ms) you estimate the next frame will land around the 30 fps mark or later, you can afford to run a quick refresh and then comfortably sit in the adaptive sync window around the 30 fps mark.

But the 48-75 Hz panel has an issue. Basically, if the frame lands between 37.5 and 48 fps, you're screwed.
First, you must make the prediction right when you draw the old frame. And then:
If you predict the frame will be quicker than 48 fps, you do not refresh; you wait for the frame and get the perfect adaptive sync experience.
If you predict the frame will be slower than 48 fps, you can refresh straight away, but that additional refresh will block you until 37.5 fps.

That would improve the situation for frames between 37.5 fps and 29 fps, but with this panel nothing can be done about 37.5 to 48 fps.

That got me thinking... unless there is some serious fallacy in my line of thinking, fast adaptive-refresh monitors are helpful even at low fps, and it may make sense to invest in them.
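
To restate that dead zone numerically (the same figures as above, just expanded): refreshing straight away frees the panel again at the earliest after two minimum scan intervals,

$$
2 \times \frac{1000}{75}\ \text{ms} = 26.7\ \text{ms} \;\approx\; 37.5\ \text{fps},
$$

while waiting forces a refresh at 20.8 ms and blocks until 34.2 ms (~29.3 fps). Either way, frame intervals between 20.8 ms and 26.7 ms, i.e., everything from 37.5 to 48 fps, cannot be displayed on time on the 48-75 Hz panel.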

March 20, 2015 | 11:25 AM - Posted by Allyn Malventano

Yes, you are absolutely correct, but I would argue that the frame rate rarely fluctuates at such a high rate (it would normally take at least a few frames to change by 20-40 FPS). In those cases where it does, a G-Sync panel would judder that single frame that unexpectedly came in during a redraw.

March 21, 2015 | 01:29 PM - Posted by SoulWager (not verified)

From what I understand, the g-sync display still judders between 24 and 30, but the GPU doesn't stall from it. Is this accurate?

March 19, 2015 | 09:28 PM - Posted by Anonymous (not verified)

Interesting article, but the analysis of the low-end judder doesn't seem properly quantified to me, and the spin on the ghosting really seems to conflate implementation issues (likely revolving around TCON choice) with the Adaptive-Sync protocol itself.

VSync on a traditional 60 Hz panel yields frame intervals of 16.7/33.3/50.0/... ms, with the worst judder being the 16.7/33.3 ms frame alternation in the 30-60 fps range. On an Adaptive-Sync panel, judder is determined by the frame interval at the maximum refresh rate, about 6.9 ms at 144 Hz, which yields a worst-case alternation of 25/32 ms frames around a 40 Hz floor. That is substantially less perceptible.
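
Worked out explicitly (the commenter's own figures, just expanded):

$$
\text{60 Hz V-Sync: } 16.7\ \text{ms} \leftrightarrow 33.3\ \text{ms} \ (2{:}1); \qquad
\text{144 Hz, 40 Hz floor: } 25\ \text{ms} \leftrightarrow 25 + 6.9 = 31.9\ \text{ms} \ (\approx 1.28{:}1).
$$

A 2:1 swing in frame-to-frame display time is far more visible than a 1.28:1 swing, which is the core of the quantitative objection here.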

As for the ghosting, it sounds like all the early Adaptive-Sync/FreeSync vendors cheaped out on TCON overdrive buffers, but the reasoning that Adaptive-Sync can't easily support overdrive seems like specious insinuation. Overdrive spikes tend to be in the first millisecond or so of a pixel transition, so artificially lengthened frames have zero effect on the pixel driver logic. If PCPer knows a concrete reason why complexity actually needs to go up substantially, they should say so; NVIDIA marketing swearing up and down that it's so isn't really sufficient.

March 20, 2015 | 11:29 AM - Posted by Allyn Malventano

I don't see where we said it would be so much harder to implement. In my mind it should be the same, and I was frankly surprised to see the FreeSync panels (or any modern LCD) ghosting. The fact remains, though: there is ghosting.

March 19, 2015 | 11:33 PM - Posted by Mandrake

Interesting read, thanks Ryan. Thanks also to Allyn for the comments. Very informative.

Sitting pretty with my ROG Swift. :)

March 23, 2015 | 08:30 AM - Posted by Ninjawithagun

Me too. Man, I LOVE my Asus ROG Swift. Not one single complaint... except maybe the... Also, I own the Acer XB280HK 4K G-SYNC and it is friggin' awesome too, and it's still the only dynamic refresh rate 4K monitor on the market :D

March 20, 2015 | 12:04 AM - Posted by dangerous (not verified)

How do you not have a table comparing the refresh rate ranges on these FreeSync monitors???
