AMD Demonstrates Prototype FreeSync Monitor with DisplayPort Adaptive Sync Feature

Subject: Graphics Cards, Displays | June 3, 2014 - 09:40 PM
Tagged: gsync, g-sync, freesync, DisplayPort, computex 2014, computex, adaptive sync

AMD FreeSync is a term you are likely to hear a lot between now and the end of 2014. When NVIDIA introduced variable refresh rate monitor technology to the world in October of last year, one of the immediate topics of conversation was the response AMD would have. NVIDIA's G-Sync technology is limited to NVIDIA graphics cards, and only a few monitors (actually just one as I write this) have the specialized hardware to support it. In practice though, variable refresh rate monitors fundamentally change the gaming experience for the better.


At CES, AMD went on the offensive and started showing press a hacked-up demo of what it called "FreeSync", a similar take on variable refresh technology running on a laptop. At the time, a notebook was a requirement of the demo because of the way AMD's implementation worked. Mobile displays have previously included variable refresh technologies in order to save power and battery life. AMD found that it could repurpose that technology to emulate the effects that NVIDIA G-Sync creates: a significantly smoother gaming experience without the side effects of Vsync.

Our video preview of NVIDIA G-Sync Technology

Since that January preview, things have progressed for the "FreeSync" technology. AMD took the idea to the VESA group responsible for the DisplayPort standard, and in April we found out that VESA had adopted the technology and officially named it Adaptive Sync.

So now what? AMD is at Computex and, of course, is taking the opportunity to demonstrate a "FreeSync" monitor with the DisplayPort 1.2a Adaptive Sync feature at work. Though they aren't saying what monitor it is or who the manufacturer is, the demo is up and running with frame rates wavering between 40 FPS and 60 FPS, the most crucial range where frame rate swings can adversely affect the gaming experience. AMD has a windmill demo running on the system, perfectly suited to showing the issues of Vsync enabled (stuttering) and Vsync disabled (tearing) with a constantly rotating object. It is very similar to the NVIDIA clock demo used to show off G-Sync.
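To picture why that 40-60 FPS window matters, here is a minimal Python sketch (purely illustrative, not code from AMD or NVIDIA) that models a game rendering at a steady 45 FPS on a fixed 60 Hz panel with Vsync versus on an adaptive-refresh panel. The uneven presentation intervals in the Vsync case are the judder the windmill demo makes visible.

import math

REFRESH_HZ = 60.0   # fixed refresh rate of a conventional panel
RENDER_FPS = 45.0   # hypothetical steady render rate inside the 40-60 FPS window
FRAMES = 8

refresh_ms = 1000.0 / REFRESH_HZ   # ~16.7 ms between scanouts
render_ms = 1000.0 / RENDER_FPS    # ~22.2 ms to render each frame

def vsync_times(frames, render_ms, refresh_ms):
    """Each finished frame is held until the next refresh boundary (Vsync on)."""
    return [math.ceil(i * render_ms / refresh_ms) * refresh_ms for i in range(1, frames + 1)]

def adaptive_times(frames, render_ms):
    """With Adaptive Sync / G-Sync the panel refreshes as soon as a frame is ready."""
    return [i * render_ms for i in range(1, frames + 1)]

def intervals(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print("Vsync on, 60 Hz panel:", intervals(vsync_times(FRAMES, render_ms, refresh_ms)))
print("Adaptive refresh:     ", intervals(adaptive_times(FRAMES, render_ms)))
# The Vsync case prints a mix of ~16.7 ms and ~33.3 ms presentation intervals (stutter);
# the adaptive case prints a uniform ~22.2 ms per frame.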


The demo system is powered by an AMD FX-8350 processor and Radeon R9 290X graphics card. The monitor is running at 2560x1440 and is the very first working prototype of the new standard. Even more interesting, this is a pre-existing display that has had its firmware updated to support Adaptive Sync. That's potentially exciting news! Monitors COULD BE UPGRADED to support this feature, but AMD warns us: "...this does not guarantee that firmware alone can enable the feature, it does reveal that some scalar/LCD combinations are already sufficiently advanced that they can support some degree of DRR (dynamic refresh rate) and the full DPAS (DisplayPort Adaptive Sync) specification through software changes."


The time frame for retail availability of monitors using DP 1.2a is up in the air, but AMD has told us that the end of 2014 is entirely reasonable. Based on the painfully slow release of G-Sync monitors into the market, AMD has less of a time hole to dig out of than we originally thought, which is good. What is not good news, though, is that this feature isn't going to be supported on the full range of AMD Radeon graphics cards. Only the Radeon R9 290/290X and R7 260/260X (and the R9 295X2 of course) will actually be able to support the "FreeSync" technology. Compare that to NVIDIA's G-Sync, which is supported by NVIDIA's entire GTX 700 and GTX 600 series of cards.


All that aside, seeing the first official prototype of "FreeSync" is awesome and is getting me pretty damn excited about variable refresh rate technologies once again! Hopefully we'll get some more hands-on time (eyes-on, whatever) with a panel in the near future to really see how it compares to the experience that NVIDIA G-Sync provides. There is still the chance that the technologies are not directly comparable, and some in-depth testing will be required to find out.

June 3, 2014 | 09:48 PM - Posted by SikSlayer

"Monitors COULD BE UPGRADED to support this feature, but AMD warns us: "...this does not guarantee that firmware alone can enable the feature, it does reveal that some scalar/LCD combinations are already sufficiently advanced that they can support some degree of DRR (dynamic refresh rate) and the full DPAS (DisplayPort Adaptive Sync) specification through software changes."

So just like G-Sync, except for the lucky few who just happen to have the right hardware, everyone has to go out and buy new displays that can take advantage of this. Good to see AMD support this, but in the end all this means is I have to buy a new display with either (or preferably both) technologies implemented.

I bring this up because when G-Sync was announced, a fanboy war instantly started for no good reason, trying to vilify one company or the other for something that requires a new purchase for 99% of the population anyway.

June 3, 2014 | 10:29 PM - Posted by arbiter

"Only the Radeon R9 290/290X and R7 260/260X (and the R9 295X2 of course) will actually be able to support the "FreeSync" technology."

You forgot about that little tidbit too. Only 5 AMD cards support it, which means you might not only have to buy a new monitor but a certain GPU model as well. Yeah, the fanboys (and AMD) were in on vilifying Nvidia over the need for a new monitor, but AMD's idea is in the same boat, really worse actually, since it lacks the GPU support Nvidia already has on their side.

June 4, 2014 | 04:30 AM - Posted by Spunjji

Your reactionary approach here is a little wrong-headed. How are AMD supposed to add DP 1.2a support to existing products? :)

June 4, 2014 | 08:06 PM - Posted by arbiter

Well, Nvidia's G-Sync wasn't released until well after the 700 series, yet 600 series cards support it, so?

June 8, 2014 | 08:30 AM - Posted by Anonymous (not verified)

Yes, because they add proprietary logic to the monitor. AMD is limited with freesync because they cannot add logic to the monitor.

June 4, 2014 | 04:26 AM - Posted by Spunjji

You've gleefully missed several points there:

1) Some people may have monitors already capable of it. Not true with G-SYNC.
2) It doesn't require any additional hardware above and beyond what would already be in the monitor. G-SYNC requires some fairly hefty additional hardware to be added to the display.
3) Any DisplayPort 1.2a device going forward will support this. AMD, Intel, nVidia - it's there for all.

The fanboy war started because it's a great idea and, like PhysX, nVidia decided to try to turn it into a proprietary competitive advantage rather than moving the whole industry forwards. Whether or not you think that makes nVidia evil or whatever shit people come out with is irrelevant; that sort of thing does tend to aggravate people.

June 4, 2014 | 04:37 AM - Posted by Anonymous (not verified)

You obviously ignore the fact that AMD's solution is based on the DisplayPort standard. It's easier to implement for monitor vendors because the Adaptive Sync specification has existed for a long time. It's also less cost-intensive and license-free. Nvidia just does their typical proprietary crap to milk their fangirls.

And btw, FreeSync works more flawlessly. G-Sync has some drawbacks. For instance it costs some performance when running games in 3D.

June 4, 2014 | 09:41 AM - Posted by Anonymous (not verified)

He's not ignoring that. He's correctly quoting AMD that firmware isn't enough. They need specific hardware in that display. He could be overestimating that 99% of monitor owners need to buy a new monitor, but neither you nor I know any better what percentage of people already have the right hardware.

June 4, 2014 | 09:58 AM - Posted by renz (not verified)

Did you mean stereoscopic 3D? Anyway, any links proving the freesync implementation is better than g-sync?

June 5, 2014 | 06:48 PM - Posted by arbiter

Yeah, I don't think anyone uses that 3D stuff except a few people.

June 3, 2014 | 11:17 PM - Posted by Anonymous (not verified)

The bottom line will be the price of the hardware, from both AMD and Nvidia.

June 4, 2014 | 01:44 AM - Posted by Anonymous (not verified)

"Mobile displays have previously included variable refresh technologies in order to save power and battery life. AMD found that it could repurpose that technology to emulate the effects that NVIDIA G-Sync creates"

That's backward.
nVidia found that they could use existing VESA stuff to make fanboy milking modules with a new name.

June 4, 2014 | 03:16 AM - Posted by Mac (not verified)

So in ~6 months AMD has shown something similar to what nVidia took 2 years to get at? Awesome! Any video of this running?

June 4, 2014 | 04:13 AM - Posted by Relayer (not verified)

Here's a link to the AMD Free-Sync demo. Maybe you should have included it instead of an nVidia vid in an AMD article?
https://www.youtube.com/watch?v=cK-aV4ryKdE

June 4, 2014 | 07:35 AM - Posted by Ryan Shrout

When I posted this, no video was available. But I'll add it now.

June 4, 2014 | 08:30 AM - Posted by Mac (not verified)

Can you verify this 40-60Hz range? Because Computerbase says otherwise; they're talking 47-48Hz.

June 5, 2014 | 12:55 AM - Posted by Relayer (not verified)

Thanks. Sorry for the tone in my post. It wasn't necessary.

June 4, 2014 | 04:48 AM - Posted by Anonymous (not verified)

40-60fps is a weak range. Why can't DPAS or G-Sync dynamically cover the entire range of playable frame rates? 30fps is generally considered playable for most FPS titles even though a minimum of 60 is best, but 20fps is often considered the minimum for RTS/MMO games. Given that even beast rigs with Quadfire/QuadSLI still dip into the 20s when gaming at 4K, it seems like a missed window.

Why not support syncing for ALL frame rates? Why not a complete solution? If a monitor supports 144Hz natively, then do 0-144Hz. There is noticeable screen tearing and stuttering at all frame rates.

Glad to see DPAS can be a firmware update to existing DP1.2 monitors. But it's a shame to see AMD support it in only 5 GPUs. Variable VBLANK has been supported by every GPU since the 90s. LAME!

June 4, 2014 | 05:27 AM - Posted by Anonymous (not verified)

Relax, it is just an early hacked together demo. It is not a finished product.

The supported refresh rates have been named already: 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.

In addition to the listed GPUs, FreeSync is supported in Kabini/Kaveri APUs too.

June 4, 2014 | 10:37 AM - Posted by Anonymous (not verified)

I'm completely calm, just disappointed. Like you said, they claimed to support ranges of 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz, but that's not what they are showing in their demos.

I wonder if those ranges are resolution dependent? For example:

36-240Hz (720p/768p)
21-144Hz (1080p)
17-120Hz (1440p/1600p)
9-60Hz (4K)

June 5, 2014 | 05:36 PM - Posted by Relayer (not verified)

The GPU reads the monitor to find out what range it can support. You can only slow the refresh rate down so much, and it's dependent on the monitor's capabilities. This monitor was hacked to support Adaptive-Sync. There will be monitors that are better and/or worse, but that's on the monitor. The refresh rates AMD shows are what is/will be supported by their graphics cards.
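To put that range negotiation in concrete terms, here is a small Python sketch with hypothetical names (not AMD driver code): the display reports the slowest and fastest refresh it can hold, and the graphics card clamps each frame's refresh interval to that window.

from dataclasses import dataclass

@dataclass
class DisplayCaps:
    min_refresh_hz: float   # slowest refresh the scaler/panel can hold a frame for
    max_refresh_hz: float   # fastest refresh the panel supports

def refresh_interval_ms(display: DisplayCaps, frame_time_ms: float) -> float:
    """Clamp the frame's duration to the window the display reported.

    Faster than the max refresh: the frame waits for the shortest allowed interval.
    Slower than the min refresh: the previous frame has to be repeated instead.
    """
    shortest = 1000.0 / display.max_refresh_hz
    longest = 1000.0 / display.min_refresh_hz
    return min(max(frame_time_ms, shortest), longest)

# Example with a hypothetical panel advertising a 40-144 Hz dynamic range.
panel = DisplayCaps(min_refresh_hz=40.0, max_refresh_hz=144.0)
for fps in (200, 90, 45, 25):
    print(f"render at {fps:>3} FPS -> panel refreshes every "
          f"{refresh_interval_ms(panel, 1000.0 / fps):.1f} ms")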

June 4, 2014 | 10:06 AM - Posted by renz (not verified)

The very point of g-sync (or freesync) is supposed to be improving smoothness at low frame rates. For example, to make 40 fps feel the same as 60 fps or even 120 fps. (Though nvidia g-sync only works at 30 fps and above.) When nvidia first showed g-sync, game devs were talking about no longer having to worry about targeting 60 or 30 fps. With low fps feeling as smooth as high fps, devs can take the extra performance that was always reserved for hitting high fps and spend it on much better graphics quality.
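As a rough illustration of that frame-budget argument (my own numbers, not from any developer), the Python snippet below compares the per-frame render budget at a fixed-refresh 60 FPS target against a 40 FPS target that a variable refresh display can still pace evenly.

fixed_target_fps = 60    # the usual target on a fixed-refresh panel to dodge Vsync judder
vrr_target_fps = 40      # a lower rate that still paces evenly with variable refresh

fixed_budget_ms = 1000 / fixed_target_fps   # 16.7 ms of render time per frame
vrr_budget_ms = 1000 / vrr_target_fps       # 25.0 ms of render time per frame

extra = (vrr_budget_ms - fixed_budget_ms) / fixed_budget_ms
print(f"{fixed_budget_ms:.1f} ms vs {vrr_budget_ms:.1f} ms per frame "
      f"({extra:.0%} more time to spend on graphics quality)")
# -> 16.7 ms vs 25.0 ms per frame (50% more time to spend on graphics quality)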

June 4, 2014 | 08:09 PM - Posted by arbiter

Maybe on a console 30fps is playable, but on a desktop computer you start to notice stuttering easily at only 50fps; even at 60fps, when you turn you can see it.

June 4, 2014 | 08:19 AM - Posted by ZoA (not verified)

From what I have heard about how Freesync and G-sync work, I suspect nvidia's solution will perform better under highly variable frame rates. Most games have a tendency to change their frame rates drastically over time, and given the way VESA adaptive sync adjusts the screen refresh rate, I think freesync might have trouble keeping up with those changes.

If I'm right about freesync's problems with rapidly variable frame rates, I suspect G-sync might survive on the market. It will become a sort of overpriced premium solution for enthusiasts, while freesync will be the value-oriented product for the average consumer. Overall I expect freesync will be better than current v-sync, but not as good as G-sync.

Frankly, neither of those technologies makes me want to buy a new monitor just for them. From what I have seen of g-sync and freesync demos, while there are visible improvements in smoothness, to me the effect is not immediately obvious or overwhelming enough to justify the cost of buying a new screen.

June 5, 2014 | 01:11 AM - Posted by Relayer (not verified)

FWIU the time it takes to read the timing signal is measured in nanoseconds. Maybe someone with more technical understanding can verify that?

June 4, 2014 | 11:18 AM - Posted by Anoneimus (not verified)

1. Competition is good! It makes everything better and cheaper!

2. I don't champion "brands"; I'm not retarded.

3. The first 21:9 monitor supporting either Freesync or Gsync is MINE! (I cannot possibly describe how much I want this.)

4. There are APUs that support it with their integrated graphics?!
Allah Fcuking Christ!! Budget gaming will be as smooth and artifact-free as all the most expensive gaming, just with lower eye candy? OMG

June 4, 2014 | 11:39 AM - Posted by OctaveanActually (not verified)

That monitor looks awfully familiar,......

Nixeus VUE27D Monitor maybe,.....

http://www.anandtech.com/show/7585/nixeus-vue27d-monitor-review

~$430 MSRP,.....

June 4, 2014 | 12:07 PM - Posted by Anonymous (not verified)

2560x1440 IPS monitor

Take my money NOW!!!

June 4, 2014 | 12:38 PM - Posted by OctaveanActually (not verified)

I'm guessing that the Nixeus Vue 27" is about the same, although the case is a little different and the port placement as well. Probably very similar to the Auria, Overlord and other such clones. Chances are a lot of people already have monitors with similar internal hardware,.....

Unless AMD did something different to their demo monitor,.....

June 4, 2014 | 01:42 PM - Posted by A Gsync user (not verified)

As a Gsync user, I have to say: once you play with Gsync you never want to go back.
I've been using the modified ASUS VG248QE with NVIDIA G-SYNC for about 5 months now. I purchased the monitor with G-Sync installed by Digital Storm.

I cannot play games without it now. Totally spoiled. Even down in the low 40 FPS range it is shockingly smooth.

The best part of Gsync though is the ZERO tearing, and minimal input lag.

VR is about gaming. If you're not a gamer and don't care about gaming, then don't bother posting about VR or Gsync. It's not your business.
