AMD Demonstrates Prototype FreeSync Monitor with DisplayPort Adaptive Sync Feature

Subject: Graphics Cards, Displays | June 4, 2014 - 12:40 AM
Tagged: gsync, g-sync, freesync, DisplayPort, computex 2014, computex, adaptive sync

AMD FreeSync is a technology, brand, and term that is likely to be used a lot between now and the end of 2014. When NVIDIA introduced variable refresh rate monitor technology to the world in October of last year, one of the immediate topics of conversation was the response AMD would have. NVIDIA's G-Sync technology is limited to NVIDIA graphics cards, and only a few monitors (actually just one as I write this) have the specialized hardware to support it. In practice though, variable refresh rate monitors fundamentally change the gaming experience for the better.


At CES, AMD went on the offensive and started showing press a hacked-up demo of what it called "FreeSync," a similar version of the variable refresh technology, working on a laptop. At the time, a notebook was a requirement of the demo because of the way AMD's implementation worked. Mobile displays have previously included variable refresh technologies in order to save power and extend battery life. AMD found that it could repurpose that technology to emulate the effects that NVIDIA G-Sync creates - a significantly smoother gaming experience without the side effects of Vsync.

Our video preview of NVIDIA G-Sync Technology

Since that January preview, things have progressed for the "FreeSync" technology. AMD took the idea to the VESA board responsible for the DisplayPort standard, and in April we found out that VESA had officially adopted the technology and named it Adaptive Sync.

So now what? AMD is at Computex and of course is taking the opportunity to demonstrate a "FreeSync" monitor with the DisplayPort 1.2a Adaptive Sync feature at work. Though they aren't saying what monitor it is or who the manufacturer is, the demo is up and running with frame rates wavering between 40 FPS and 60 FPS - the most crucial range of frame rates, where drops can adversely affect the gaming experience. AMD has a windmill demo running on the system, perfectly suited to showing the issues of Vsync enabled (stuttering) and Vsync disabled (tearing) with a constantly rotating object. It is very similar to the NVIDIA clock demo used to show off G-Sync.
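To make the demo's behavior concrete, here is a minimal sketch of the scheduling decision a variable refresh source makes. This is hypothetical illustration, not AMD's code: the function name is invented, and the 40-60 Hz window simply mirrors the range shown in this demo. The idea is that the GPU holds the vertical blanking interval until the next frame is ready, so the refresh rate tracks the frame rate as long as it stays inside the panel's supported window.

```python
def effective_refresh_hz(frame_time_ms, panel_min_hz=40, panel_max_hz=60):
    """Return the refresh rate the panel is driven at for a given frame time.

    Inside the supported window, the refresh rate simply tracks the frame
    rate. Outside it, this sketch clamps to the nearest supported rate;
    a real driver would instead repeat frames or fall back to tearing.
    """
    frame_hz = 1000.0 / frame_time_ms       # frame time in ms -> frames per second
    return max(panel_min_hz, min(panel_max_hz, frame_hz))

# A 50 FPS frame (20 ms render time) is shown at exactly 50 Hz:
# no tearing, no Vsync stutter.
print(effective_refresh_hz(20.0))   # 50.0

# A 30 FPS frame (33.3 ms) falls below the 40 Hz floor and gets clamped.
print(effective_refresh_hz(33.3))
```

Inside the 40-60 FPS window, every frame is displayed the moment it finishes rendering, which is why neither tearing nor stutter appears in the windmill demo.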


The demo system is powered by an AMD FX-8350 processor and Radeon R9 290X graphics card. The monitor is running at 2560x1440 and is the very first working prototype of the new standard. Even more interesting, this is a pre-existing display that has had its firmware updated to support Adaptive Sync. That's potentially exciting news! Monitors COULD BE UPGRADED to support this feature, but AMD warns us: "...this does not guarantee that firmware alone can enable the feature, it does reveal that some scalar/LCD combinations are already sufficiently advanced that they can support some degree of DRR (dynamic refresh rate) and the full DPAS (DisplayPort Adaptive Sync) specification through software changes."


The time frame for retail availability of monitors using DP 1.2a is up in the air, but AMD has told us that the end of 2014 is entirely reasonable. Based on the painfully slow release of G-Sync monitors into the market, AMD has less of a hole to dig out of than we originally thought, which is good. What is not good news is that this feature isn't going to be supported on the full range of AMD Radeon graphics cards. Only the Radeon R9 290/290X and R7 260/260X (and the R9 295X2, of course) will actually be able to support the "FreeSync" technology. Compare that to NVIDIA's G-Sync, which is supported by NVIDIA's entire GTX 700 and GTX 600 series of cards.


All that aside, seeing the first official prototype of "FreeSync" is awesome and is getting me pretty damn excited about variable refresh rate technologies once again! Hopefully we'll get some more hands-on time (eyes-on, whatever) with a panel in the near future to really see how it compares to the experience that NVIDIA G-Sync provides. There is still the chance that the technologies are not directly comparable, and some in-depth testing will be required to find out.


June 4, 2014 | 12:48 AM - Posted by SikSlayer

"Monitors COULD BE UPGRADED to support this feature, but AMD warns us: "...this does not guarantee that firmware alone can enable the feature, it does reveal that some scalar/LCD combinations are already sufficiently advanced that they can support some degree of DRR (dynamic refresh rate) and the full DPAS (DisplayPort Adaptive Sync) specification through software changes."

So just like G-Sync, except for the lucky few who just happen to have the right hardware, everyone has to go out and buy new displays that can take advantage of this. Good to see AMD support this, but in the end all this means is I have to buy a new display with either (or preferably both) technologies implemented.

I bring this up because when G-Sync was announced, a fanboy war instantly started for no good reason, trying to vilify one company or the other for something that requires a new purchase for 99% of the population anyway.

June 4, 2014 | 01:29 AM - Posted by arbiter

"Only the Radeon R9 290/290X and R7 260/260X (and the R9 295X2 of course) will actually be able to support the "FreeSync" technology."

You forgot about that little tidbit too. Only five AMD cards support it, which means you might not only have to buy a new monitor but also a certain GPU model. Yeah, fanboys and AMD were in on vilifying NVIDIA over the need for a new monitor, but AMD's idea is in the same boat, or really a worse one, since it lacks the GPU support NVIDIA already has on their side.

June 4, 2014 | 07:30 AM - Posted by Spunjji

Your reactionary approach here is a little wrong-headed. How are AMD supposed to add DP 1.2a support to existing products? :)

June 4, 2014 | 11:06 PM - Posted by arbiter

Well, NVIDIA's G-Sync wasn't released until well after the 700 series, yet 600 series cards support it, so?

June 8, 2014 | 11:30 AM - Posted by Anonymous (not verified)

Yes, because they add proprietary logic to the monitor. AMD is limited with freesync because they cannot add logic to the monitor.

June 4, 2014 | 07:26 AM - Posted by Spunjji

You've gleefully missed several points there:

1) Some people may have monitors already capable of it. Not true with G-SYNC.
2) It doesn't require any additional hardware above and beyond what would already be in the monitor. G-SYNC requires some fairly hefty additional hardware to be added to the display.
3) Any DisplayPort 1.2a device going forward will support this. AMD, Intel, nVidia - it's there for all.

The fanboy war started because it's a great idea and, like PhysX, nVidia decided to try to turn it into a proprietary competitive advantage rather than moving the whole industry forwards. Whether or not you think that makes nVidia evil or whatever shit people come out with is irrelevant, that sort of thing does tend to aggravate people.

June 4, 2014 | 07:37 AM - Posted by Anonymous (not verified)

You obviously ignore the fact that AMD's solution is based on the DisplayPort standard. It's easier to implement for monitor vendors because the Adaptive Sync specification has existed for a long time. It's also less cost-intensive and license-free. Nvidia just does its typical proprietary crap to milk its fangirls.

And btw, FreeSync works more flawlessly. G-Sync has some drawbacks. For instance it costs some performance when running games in 3D.

June 4, 2014 | 12:41 PM - Posted by Anonymous (not verified)

He's not ignoring that. He's correctly quoting AMD that firmware isn't enough; they need specific hardware in the display. He could be overestimating that 99% of monitor owners need to buy a new monitor, but neither you nor I know any better what percentage of people already have the right hardware.

June 4, 2014 | 12:58 PM - Posted by renz (not verified)

Did you mean stereoscopic 3D? Anyway any links proving freesync implementation is better than g-sync?

June 5, 2014 | 09:48 PM - Posted by arbiter

Yea, don't think anyone uses that 3D stuff except a few people.

June 4, 2014 | 02:17 AM - Posted by Anonymous (not verified)

The bottom line will be price of the hardware..From both AMD and Nvidia.

June 4, 2014 | 04:44 AM - Posted by Anonymous (not verified)

"Mobile displays have previously included variable refresh technologies in order to save power and battery life. AMD found that it could repurpose that technology to emulate the effects that NVIDIA G-Sync creates"

That's backward.
nVidia found that they could use existing VESA stuff to make fanboy milking modules with a new name.

June 4, 2014 | 06:16 AM - Posted by Mac (not verified)

So in ~6 months AMD has shown something similar to what nVidia took 2 years to get at? Awesome! Any video of this running?

June 4, 2014 | 07:13 AM - Posted by Relayer (not verified)

Here's a link to the AMD Free-Sync demo. Maybe you should have included it instead of an nVidia vid in an AMD article?

June 4, 2014 | 10:35 AM - Posted by Ryan Shrout

When I posted this, no video was available. But I'll add it now.

June 4, 2014 | 11:30 AM - Posted by Mac (not verified)

Can you verify this 40-60Hz range? Computerbase says otherwise; they're talking 47-48Hz.

June 5, 2014 | 03:55 AM - Posted by Relayer (not verified)

Thanks. Sorry for the tone in my post. It wasn't necessary.

June 4, 2014 | 07:48 AM - Posted by Anonymous (not verified)

40-60fps is a weak range. Why can't DPAS or G-Sync dynamically cover the entire range of playable frame rates? 30fps is generally considered playable for most FPS titles even though a minimum of 60 is best, but 20fps is often considered the minimum for RTS/MMO games. Given that even beast rigs with Quadfire/QuadSLI still dip into the 20s when gaming at 4K, it seems like a missed window.

Why not support syncing for ALL frame rates? Why not a complete solution? If a monitor supports 144Hz natively, then do 0-144Hz. There is noticeable screen tearing and stuttering at all frame rates.

Glad to see DPAS can be a firmware update to existing DP1.2 monitors. But it's a shame to see AMD support it in only 5 GPUs. Variable VBLANK has been supported by every GPU since the 90s. LAME!

June 4, 2014 | 08:27 AM - Posted by Anonymous (not verified)

Relax, it is just an early hacked together demo. It is not a finished product.

The supported refresh rates have been named already: 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.

In addition to the listed GPUs, FreeSync is supported in Kabini/Kaveri APUs too.

June 4, 2014 | 01:37 PM - Posted by Anonymous (not verified)

I'm completely calm, just disappointed. Like you said, they claimed to support ranges of 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz, but that's not what they are showing in their demos.

I wonder if those ranges are resolution dependent? For example:

36-240Hz (720p/768p)
21-144Hz (1080p)
17-120Hz (1440p/1600p)
9-60Hz (4K)

June 5, 2014 | 08:36 PM - Posted by Relayer (not verified)

The GPU reads the monitor to find out what range it can support. You can only slow the refresh rate down so much, and it's dependent on the monitor's capabilities. This monitor was hacked to support Adaptive-Sync; there will be monitors that are better and/or worse. That's on the monitor, though. The refresh rates AMD shows are what is/will be supported by their graphics cards.

June 4, 2014 | 01:06 PM - Posted by renz (not verified)

The very point of G-Sync (or FreeSync) was supposed to be improving smoothness at low frame rates, for example to make 40 fps feel the same as 60 fps or even 120 fps (though NVIDIA G-Sync only works at 30 fps and above). When NVIDIA first showed G-Sync, game devs were talking about no longer having to worry about targeting 60 or 30 fps. With low fps feeling as smooth as high fps, devs can take the extra performance that was always reserved for hitting high fps and put it into much better graphics quality.

June 4, 2014 | 11:09 PM - Posted by arbiter

Maybe on a console 30 fps is playable, but on a desktop computer you start to notice stuttering easily at only 50 fps; even at 60 fps, when you turn you see it.

June 4, 2014 | 11:19 AM - Posted by ZoA (not verified)

From what I've heard about how FreeSync and G-Sync work, I suspect NVIDIA's solution will perform better under highly variable frame rates. Most games tend to change their frame rates drastically over time, and given the way VESA Adaptive Sync adjusts the screen refresh rate, I think FreeSync might have trouble keeping up with those changes.

If I'm right about FreeSync's problems with rapidly variable frame rates, I suspect G-Sync might survive on the market. It will become a sort of overpriced premium solution for enthusiasts, while FreeSync will be the value-oriented product for the average consumer. Overall I expect FreeSync will be better than current V-sync, but not as good as G-Sync.

Frankly, neither of those technologies makes me want to buy a new monitor just for them. From what I have seen of the G-Sync and FreeSync demos, while there are visible improvements in smoothness, to me the effect is not immediately obvious or overwhelming enough to justify the cost of buying a new screen.

June 5, 2014 | 04:11 AM - Posted by Relayer (not verified)

FWIU the time it takes to read the timing signal is measured in nanoseconds. Maybe someone with more technical understanding can verify that?

June 4, 2014 | 02:18 PM - Posted by Anoneimus (not verified)

1. Competition is good! It makes everything better and cheaper!

2. I don't champion "brands", I'm not an idiot.

3. The first 21:9 monitor supporting either Freesync or Gsync is MINE! (I cannot possibly describe how much i want this.)

4. There are APUs that support it with their integrated graphics?!
Allah Fcuking Christ!! Budget gaming will be as smooth and artifact-free as all the most expensive gaming, just with lower eye candy? OMG

June 4, 2014 | 02:39 PM - Posted by OctaveanActually (not verified)

That monitor looks awfully familiar,......

Nixeus VUE27D Monitor maybe,.....

~$430 MSRP,.....

June 4, 2014 | 03:07 PM - Posted by Anonymous (not verified)

2560x1440 IPS monitor

Take my money NOW!!!

June 4, 2014 | 03:38 PM - Posted by OctaveanActually (not verified)

I'm guessing that the Nixeus Vue 27" is about the same, although the case is a little different and the port placement as well. Probably very similar to the Auria, Overlord and other such clones. Chances are a lot of people already have monitors with similar internal hardware,.....

Unless AMD did something different to their demo monitor,.....

June 4, 2014 | 04:42 PM - Posted by A Gsync user (not verified)

As a G-Sync user, I have to say: once you play with G-Sync, you never want to go back.
I've been using the modified ASUS VG248QE with NVIDIA G-SYNC for about 5 months now. Purchased the monitor with G-Sync installed by Digital Storm.

I cannot play games without it now. Totally spoiled. Even down in the low 40 FPS range it is shockingly smooth.

The best part of G-Sync though is the ZERO tearing, and minimal input lag.

VR is about gaming. If you're not a gamer and don't care about gaming, then don't bother posting about VR or G-Sync. It's not your business.

June 4, 2014 | 05:19 PM - Posted by Anonymous (not verified)

Yeah, all the people who buy $150-$3000 GPUs don't care about paying that much money to have washed-out colors on a TN panel.

Gamers don't care if their expensive GPUs aren't pushing proper colors through a low-end panel while gaming.

Silly non-gamers who think good color on a panel matters when you spend so much on a graphics setup.

June 4, 2014 | 10:01 PM - Posted by A Gsync user (not verified)

And thus posts the first douchebag non-gamer =)

June 5, 2014 | 12:33 AM - Posted by Anonymous (not verified)

Yeah, what a fool. Doesn't he know us gamers love spending money on 8-bit video cards only to play them on 6-bit panels. DUH!!!

June 6, 2014 | 07:42 AM - Posted by Anonymous (not verified)

Enjoy playing your games on a low-Hz IPS panel that's specifically made for graphics artists, with HIGH latency etc. Enjoy your HIGH input lag and blur when moving around in games LOL. Non-gamer douchebag!

June 6, 2014 | 07:36 PM - Posted by G Sync user (not verified)

Yeah, doesn't he know that gaming panels advertising a 1 ms response time are actually 5-9 ms, with an average of 7 ms at 144 Hz. At 120 Hz the average goes up to 9 ms, which is similar to an IPS direct-drive panel at 60 Hz with an 8 ms response time.

We gamerz like paying extra for 75% color gamut along with that Hz and ms marketing. Marketing costs extra, and we gamerz are willing to pay for the placebo effect it gives us.

You non-gamers don't appreciate the monitor features that distort color accuracy and do little to improve response time like we true gamerz do.

June 10, 2014 | 10:02 PM - Posted by xander dyson (not verified)

In all fairness, half the time a 'gamer' is complaining about tearing or blur during fast action sequences, it's psychosomatic.

There have been numerous instances where I spent hours (more like days, actually, especially after building a new system) listening to various 'gurus' only to find that their sativa-induced rantings about this or that amounted to absolutely no real-world outcome.

June 6, 2014 | 08:13 PM - Posted by Anonymous (not verified)

VR is also for precise frame rate playback for the multitude of media formats.

June 4, 2014 | 06:25 PM - Posted by Gregster

That looks like a Laptop screen. Is it the same screen and demo they did in Montreal?

June 4, 2014 | 07:55 PM - Posted by Anonymous (not verified)

Nixeus Vue 27″ IPS LED 2560×1440 DisplayPort Monitor
2560×1440 IPS screen with 100% sRGB

No more washed-out colors of the TN panels.

June 5, 2014 | 03:30 AM - Posted by Annoneimus (not verified)


Now give me a quality 3440x1440 34" 10bit IPS with freesync and you can have all my moneys.

Hopefully Dell delays their 34" to the end of the year to implement this. But I won't hold my breath.

June 5, 2014 | 12:10 PM - Posted by Anonymous (not verified)

I'm more interested in actual products and their release dates than prototypes.

June 12, 2014 | 05:18 PM - Posted by Anoneimus (not verified)

well... the asus 1440p 144hz monitor has been "RELEASED" for 6 months now :)

June 5, 2014 | 04:38 PM - Posted by Anonymous (not verified)

Free the SYNC! Free Willy !

June 10, 2014 | 09:56 PM - Posted by xander dyson (not verified)

What is wrong with you?


You are not that important. You are not that busy. If you are, then perhaps you should conduct your interview at another time. I really do not understand this behavior. Isn't it self-evident that such behavior is not only rude and annoying, but completely unprofessional?

June 10, 2014 | 11:04 PM - Posted by xander dyson (not verified)

...and NOBODY gives a **** about 3D-vision!

"3D" is dead, Finally! Put a cork in it until next decade when they dig this trash back up for the next round of suckers.
