AMD Releases FreeSync Information as a FAQ

Subject: General Tech, Graphics Cards, Displays | July 29, 2014 - 09:02 PM |
Tagged: vesa, nvidia, g-sync, freesync, DisplayPort, amd

Dynamic refresh rates have two main purposes: saving power by refreshing the monitor only when a new frame is available, and increasing animation smoothness by synchronizing refreshes to the rate frames are drawn (rather than "catching the next bus" every 16.67 ms on a 60 Hz monitor). Mobile devices prefer the former, while PC gamers are interested in the latter.
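
To put that "next bus" schedule in concrete terms, here is a minimal sketch (Python, with hypothetical frame-completion times) of when a finished frame actually reaches the screen on a fixed 60 Hz display versus a dynamic-refresh one:

```python
# Minimal sketch: when a finished frame reaches the screen on a fixed
# 60 Hz display (the frame waits for the next vsync "bus") versus a
# dynamic-refresh display (the screen refreshes as soon as the frame is ready).
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ  # the 16.67 ms bus schedule

def fixed_refresh(render_done_ms):
    """Round presentation up to the next vsync boundary."""
    return math.ceil(render_done_ms / INTERVAL_MS) * INTERVAL_MS

for done_ms in (5.0, 17.0, 30.0):  # hypothetical completion times
    shown = fixed_refresh(done_ms)
    print(f"ready {done_ms:5.1f} ms -> fixed: {shown:5.2f} ms "
          f"(waited {shown - done_ms:5.2f} ms), dynamic: {done_ms:5.1f} ms")
```

A frame that misses a 16.67 ms boundary by a fraction of a millisecond waits almost a full interval on the fixed schedule; the dynamic display shows it immediately.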

[Embedded demo video. Obviously, the video camera nullifies the effect.]

NVIDIA was first to make this public with G-Sync. AMD responded with FreeSync, starting with a proposal that was later ratified by VESA as DisplayPort Adaptive-Sync. AMD then took up "Project FreeSync" as an AMD "hardware/software solution" to make use of DisplayPort Adaptive-Sync in a way that benefits PC gamers.

Today's news is that AMD has just released a FAQ which explains the standard much more thoroughly than they have in the past. For instance, it clarifies the distinction between DisplayPort Adaptive-Sync and Project FreeSync. Prior to the FAQ, I thought that FreeSync became DisplayPort Adaptive-Sync, and that was that. Now it is sounding a bit more proprietary, just built upon an open VESA standard.

If interested, check out the FAQ at AMD's website.

Source: AMD

July 29, 2014 | 11:57 PM - Posted by H1tman_Actua1

AMD fails again. Now you know why Nvidia went all out and developed their own logic board, aka the G-SYNC board: monitor manufacturers and logic-board manufacturers are not going to do the work for them. AMD is relying completely on faith that they will... lol

July 30, 2014 | 03:21 AM - Posted by JohnGR

Oh wow... the fanboy is on a roll. Posting like there is no tomorrow. And he understood nothing from the FAQ.

July 30, 2014 | 10:52 AM - Posted by H1tman_Actua1

Hey, this time next year I'll ask you if you've seen a monitor that's FreeSync-enabled. While you pout, I'll have been enjoying my ROG Swift for over a year... poor little guy was misled.

BTW, I've been using G-Sync since last January. Purchased the ASUS VG248QE (G-Sync logic installed) from Digital Storm. WORTH EVERY PENNY!

Enjoy being entitled. BTW, Obama called; he said the cell phone he gave you is about to go out of service.

July 30, 2014 | 11:50 AM - Posted by JohnGR

You better start investing in communications hardware. Your brain has lost contact with reality.

July 30, 2014 | 01:07 PM - Posted by Anonymous (not verified)

Looks more like he's lost contact with his brain.

August 2, 2014 | 02:34 AM - Posted by BDK (not verified)

Nvidia fails again, actually. Once again they make everything overly expensive due to completely pointless proprietary tech for something you can get for free. Typical Nvidia, really, and typical Nvidia fanboys, who are absolutely clueless as well.

July 29, 2014 | 11:58 PM - Posted by H1tman_Actua1

Finally the light has been shown. I've been saying this since day one.

July 30, 2014 | 12:07 AM - Posted by H1tman_Actua1

"How is Project FreeSync different from NVIDIA G-Sync?"

"There are three key advantages Project FreeSync holds over G-Sync: no licensing fees for adoption, no expensive or proprietary hardware modules, and no communication overhead. The last benefit is essential to gamers, as Project FreeSync does not need to poll or wait on the display in order to determine when it's safe to send the next frame to the monitor. Project FreeSync uses industry-standard DisplayPort Adaptive-Sync protocols to pre-negotiate supported min/max refresh rates during plug'n'play, which means frame presentation to the user will never be delayed or impaired by time-consuming two-way handshakes."

LMFAO!!!! Lies!!! Really? How does this magically come to life? Oh yeah... you have to build a variable-refresh-capable logic board, and oh yeah, a monitor company needs to support it and build a monitor around that logic board... oh yeah, AMD, you forgot to mention you need one of your most current GPUs... fucking liars.

July 30, 2014 | 01:29 AM - Posted by Iconix (not verified)

The only reason DisplayPort Adaptive-Sync monitors won't make it to market is if there is no market, and if you believe there is no market you don't know anything about gaming or film enthusiasts.

Have you ever considered that competition is the thing to champion, not brands? Because it sounds like you're still wiping NVidia's semen off your chin.

July 30, 2014 | 03:07 AM - Posted by Anonymous (not verified)

F***ing fanboys... go and try to buy yourself a G-Sync monitor ^^

July 30, 2014 | 10:53 AM - Posted by H1tman_Actua1

Already did, last January. Worth every single penny.

August 2, 2014 | 08:22 PM - Posted by Anonymous (not verified)

You bought a kit, not a monitor.

This guy is as dumb as they come.

August 7, 2014 | 10:54 AM - Posted by H1tman_Actua1

Ummm, it's called a company by the name of Digital Storm... and no, I bought the monitor complete with the G-Sync kit installed by them... for $499.

Hey, Obama called; he said the cell phone he gave you is about to expire.

July 30, 2014 | 09:16 AM - Posted by Anonymous (not verified)

Gotta go with the fanboy on this one. The only thing that matters here is how much it will cost to get one of these monitors on my desk. G-Sync adds cost. That's fine, because I get a benefit. AMD should stop selling this thing as "Free". If you think that monitor manufacturers are going to start giving away new features on their products for "Free", then you're batcrap insane.

July 30, 2014 | 10:24 AM - Posted by Anonymous (not verified)

Yeah, the NVIDIA "fanboy" is correct about this. AMD misrepresented/twisted/lied/exaggerated about the requirements and features of A-Sync/Free-Sync.

I should have just bought a G-Sync monitor in January when they were available (they were and are still available to purchase for anyone that knows how to use Google and shop online). I would have been a lot happier over the last 6 months than I have been waiting for more details on AMD Weak-Sync.

July 30, 2014 | 12:36 PM - Posted by ROdNEY

Liars? Richard Huddy stated clearly that the higher modes will need a logic board inside the monitor (the lower ones will not). But since it is based on a standard, anyone can build their own solution on top of it, which AMD could never have done with G-SYNC.

Not sure what all the trolling is about, but what do I know. I thought that if you have G-SYNC you should be happy with it (you bought what you wanted, and it works as you expected), but judging by your responses, you do not sound happy at all.

Furthermore, if you do not have a Radeon, then you do not need to care about FreeSync at all; you could not use it anyway. Just saying, in case you missed that.

August 1, 2014 | 05:03 AM - Posted by nashathedog

There's only a very limited number of AMD cards that can use FreeSync; the 260 and 290 ranges are the only ones. I'm on a 290 model so I'm okay, but people on 270s, 280s, and 7000-series cards are out of luck. I'm sure the next range will all support it, but they're bringing an inferior system to the table very late in the day, and it is inferior. If you go far enough into the subject you can't avoid that truth, unless you're blinded by irrelevant points such as what brand does what.

July 30, 2014 | 12:36 AM - Posted by Anonymous (not verified)

"oh yah..AMD you forgot to mention you need one of you most current GPU's...fucken liars."

Desperate nvidia fanboy?

Since when is a product from 2011 the most current.

July 30, 2014 | 01:43 AM - Posted by mutantmagnet (not verified)

You misread the FAQ.

"All AMD Radeon™ graphics cards in the AMD Radeon™ HD 7000, HD 8000, R7 or R9 Series will support Project FreeSync for video playback and power-saving purposes."

2011 cards for video playback only.

" The AMD Radeon™ R9 295X2, 290X, R9 290, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming."

Cards from the last year are required for the gaming improvements.
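
The FAQ's two support tiers are easy to misread, as the replies below show. Restated as a small lookup table (a paraphrase of the quoted FAQ text, not an official AMD API):

```python
# Support matrix paraphrased from the quoted FAQ: all listed Radeon series
# get FreeSync for video playback / power saving; only the GPUs with the
# updated display controllers get dynamic refresh during gaming.
FREESYNC_SUPPORT = {
    "video_playback_and_power_saving": [
        "HD 7000 Series", "HD 8000 Series", "R7 Series", "R9 Series",
    ],
    "dynamic_refresh_in_games": [
        "R9 295X2", "R9 290X", "R9 290", "R7 260X", "R7 260",
    ],
}

def supports_gaming_drr(gpu):
    """True if the FAQ lists this GPU for dynamic refresh during gaming."""
    return gpu in FREESYNC_SUPPORT["dynamic_refresh_in_games"]

print(supports_gaming_drr("R9 290"))   # True
print(supports_gaming_drr("HD 7970"))  # False: playback/power-saving only
```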

July 30, 2014 | 04:56 AM - Posted by arbiter

What is really sad is that only AMD GPUs from the newest range support it for games, whereas Nvidia supports any GPU using a Kepler chip, so 650 Ti and up. So really, "free-sync" is not so free, knowing that a new monitor is going to be needed either way.

As for him misreading that, I bet a ton of AMD users will misread it and think their card supports it in games, then find out it doesn't when they try. It's one of those things with people who don't read.

July 30, 2014 | 10:44 AM - Posted by H1tman_Actua1

Exactly.

July 30, 2014 | 12:43 PM - Posted by ROdNEY

All APUs and GCN 2.0 cards support it, which for new tech just months old is not bad. VBLANK couldn't be used on its own anyway, so they cannot support more models with 100% compatibility. Would you expect them to replace everyone's graphics card for free, just because NVidia offers maybe 2/3 more chips that can handle theirs? I would love to hear what shareholders would say about that xP

July 31, 2014 | 07:09 AM - Posted by renz (not verified)

But isn't it that they called it FreeSync because it was supposed to be a free solution (no need for a new monitor and no need for a new GPU)?

July 30, 2014 | 03:28 AM - Posted by Anonymous (not verified)

Nvidia + G-Sync = TN panel

AMD + Adaptive Sync = IPS panel

July 30, 2014 | 05:10 AM - Posted by arbiter

Um, I don't think G-Sync will be limited to TN panels. The ones announced are, but IPS ones will likely show up as well.

Looking around, the reason IPS is lacking is because they are, well, too slow, and G-Sync's focus is more on gaming. Some stuff I read says IPS is mostly limited to under 100 Hz, more around 75 Hz, except a few that can go higher.

July 30, 2014 | 06:33 AM - Posted by Anonymous (not verified)

Don't care about 100000000000 Hz.

ANY-sync means NOTHING to me if it's on some sh1t TN... it might as well not exist.

The first 21:9 IPS with G-Sync OR FreeSync will be mine... don't care about cost or whatever.

July 30, 2014 | 08:17 AM - Posted by Anonymous (not verified)

Everything is focused on TN because the rest of the LCD types are shit for gaming, PERIOD.

July 30, 2014 | 10:46 AM - Posted by H1tman_Actua1

IPS... still not fast enough for gaming.

8-bit high-grade TN panel, 144 Hz, G-Sync, and 1440p.

ROG SWIFT PG278Q FTW!!!

Oh yeah, ROG/ASUS makes a lot of AMD products... why haven't they supported FreeSync? Oh, that's right: AMD never made a logic board to support variable refresh rate. Nvidia did.

July 30, 2014 | 12:05 PM - Posted by mutantmagnet (not verified)

Actually, IPS can be made fast enough for gaming. I found this out while reading up on LightBoost on blurbusters.com.

The only problem is that these displays cost five figures. I think most gamers would rather spend that much money on their car than on an IPS display.

July 30, 2014 | 03:06 PM - Posted by Anonymous (not verified)

LightBoost requires a high framerate to work; IPS would have to run over 60 Hz and have support for it.

LightBoost/ULMB work at 100-120 FPS.
BenQ's LightBoost can strobe down to 70 FPS.

Also, LightBoost adds latency and you get color degradation from the strobe.

LCDs in general blow for gaming; that's why I have a Sony FW900 CRT. We seriously need OLED.

July 30, 2014 | 12:23 PM - Posted by ROdNEY

AMD will have to make a logic board (the monitor will need it, though it will work a bit differently); it will also be necessary for the higher modes. But FreeSync is not out yet, so we will know which modes when it gets out.
IF you have a Radeon, then you just wait; if you cannot wait, you will buy NVidia. Not sure why all the hateful responses. NVidia presented a fix for a problem; AMD made their own solution pretty fast. Pretty much like what AMD did with Mantle, which NVidia answered with AZDO.
I'm definitely not going to hang myself over AMD having no Adaptive-Sync monitor atm.

August 2, 2014 | 04:31 PM - Posted by Anonymous (not verified)

Nah, your face is shit... period.

July 30, 2014 | 07:25 AM - Posted by Anonymous (not verified)

All gaming TN panels are overdriven, even the new ASUS ROG SWIFT. Nothing is stopping them from overdriving IPS panels except price.

Monitor sellers didn't pay attention to TN panels being overdriven until they realized there was money to be made from people who bought Asian brands, which overdrove them.

July 30, 2014 | 12:45 PM - Posted by ROdNEY

It will not; G-SYNC has its limits already (same as FreeSync), no need to add more.
Another question is why use a zero-latency system with a monitor that has some unnecessary latency of its own, but that is something else.

July 30, 2014 | 10:45 AM - Posted by H1tman_Actua1

Pfff, in what imaginary world does this AMD FreeSync monitor exist? Oh wait, that's right: there isn't a single logic board/monitor manufacturer working on this...

July 30, 2014 | 11:53 AM - Posted by Iconix (not verified)

Actually some monitors already in existence support DisplayPort Adaptive-Sync with a simple firmware upgrade.

http://www.anandtech.com/show/8129/computex-2014-amd-demonstrates-first-...

I wouldn't expect you to know that, though; it sounds like you've had blinkers on since you bought your G-Sync monitor. Your posts read like a textbook on choice-supportive bias.

July 30, 2014 | 12:24 PM - Posted by ROdNEY

And it works fine, but that only proves the solution is working; it is just not available on the market yet.

July 30, 2014 | 08:33 PM - Posted by arbiter

That is an AMD claim, and we all know the history of AMD and their claims. AMD has made many claims over the years and ended up having to backpedal on them. We don't know what they did to those monitors to get them to support it. Likely only one to a handful will support it. I wouldn't be shocked if there is no firmware release for any of them and you've got to buy new; I doubt monitor makers want to be on the hook for end users bricking their monitors and then whining about it.

September 4, 2014 | 03:26 AM - Posted by Anonymous (not verified)

LOL, for over a year. Your lies are beautiful ^^ FreeSync is a VESA standard, and FreeSync will support much more than 144 Hz.
The first FreeSync monitors will sample this month, with finished products expected to ship early next year.

The Video Electronics Standards Association (VESA) already announced the addition of 'Adaptive-Sync' to its popular DisplayPort 1.2a video interface standard.
The FreeSync / Adaptive-Sync protocol will be embedded into DisplayPort 1.2a.

July 30, 2014 | 11:54 AM - Posted by Anonymous (not verified)

The use of GSync or ASync has nothing to do with TN or IPS. Gaming monitors in general will mostly be TN because IPS pixel response is too slow for much over 60Hz. TN can be used for almost any frequency due to its blazing response time. Combined with true 8-bit panels (ASUS ROG PG278Q), TN's only flaw is viewing angles... and why would you ever play a game at an off-angle?

July 30, 2014 | 05:02 PM - Posted by Anonymous (not verified)

Do you dumb kiddies even know what you are talking about?

That "response time" you see advertised is for color shifts. Lower = less ghosting.

It has precisely NOTHING to do with the responsiveness of the game.
You're thinking of "input lag".
Yeah, wonder why they don't advertise that one... because some non-gaming panels, like the QNIX IPS, have it lower than some "GAMING" TN trash.

That ROG Swift is SH1T. It's just as SH1T as any other TN, just a lot smoother now... dunno, I'd rather blowtorch my eyes myself.

July 31, 2014 | 03:33 PM - Posted by Anonymous (not verified)

Go home, you're drunk.

Not only did I not bring up input lag or game responsiveness, I wasn't even referring to it. I only brought up the pixel response time - the time it takes to change colors, gray to gray, light to dark. Many IPS monitors suffer from ghosting exactly for this reason (too slow).

July 30, 2014 | 04:12 AM - Posted by Henrik (not verified)

This will probably be an easy implementation for monitor manufacturers.
They will do it even if no one uses it, as they can add an extra label on the monitor and in marketing, etc.

G-Sync will not survive. Do you even know anyone who owns a G-Sync monitor, or who even seriously plans to buy one? I don't. It is too dear, and you must have an Nvidia card, which cuts the market in half right there.

July 30, 2014 | 05:14 AM - Posted by arbiter

Cut the market in half, huh? Just like AMD did with Mantle?

July 30, 2014 | 07:52 AM - Posted by Henrik (not verified)

AMD has openly promised Intel access to Mantle and has invited Nvidia to join the party. That Nvidia does not want to use this technology is not AMD's fault.

July 30, 2014 | 01:15 PM - Posted by dlpatague (not verified)

AMD will invite Nvidia to "join the party" so Nvidia will spend their development money on it, so AMD can steal Nvidia's work on improving it. You don't think they would? AMD doesn't have the money to develop anything on their own. Why do you think AMD is relying on monitor manufacturers and VESA to help get this whole "FREE-SYNC" up and going? Nvidia spent their own money to get G-SYNC implemented with manufacturers. Creating their own logic board is not cheap. You think AMD could afford to do that?

July 30, 2014 | 12:28 PM - Posted by ROdNEY

Yes, but you mistook AMD for NVidia. AMD uses standards when possible (Mantle is beta and cannot be released without documentation, but it will be open to other vendors); NVidia uses proprietary tech when possible.

July 30, 2014 | 09:29 AM - Posted by Anonymous (not verified)

It doesn't matter how easy it is for them to implement. They are not going to give away new functionality for free.

July 30, 2014 | 11:44 AM - Posted by Anonymous (not verified)

I don't think AMD just wants to make money here. They indirectly want G-Sync to fail, just like Nvidia wants (even more so) for Mantle to fail.

July 30, 2014 | 06:15 AM - Posted by Silver Sparrow

Interesting to find out that older-generation cards will get some benefit from this technology.

What excites me after reading the FAQ is a Mantle-optimised game teamed up with a FreeSync (or 'Adaptive-Sync-based tech') monitor. *Smooth operator* Silky smooth gameplay for sure!

AMD should get Sade to do their adverts for this lol :)

July 30, 2014 | 04:28 PM - Posted by arbiter

I have watched a ton of video on my 60 Hz monitor and never seen any tearing issues. As for power saving, my monitor uses 25 watts, so there's nothing really to save there.

July 30, 2014 | 08:19 AM - Posted by Anonymous (not verified)

These threads always turn into a pissing contest of ignorance.

July 30, 2014 | 11:43 AM - Posted by Anonymous (not verified)

Both G-Sync and FreeSync are irrelevant until we get decent LCDs. As it stands, CRTs are still the only real choice, especially for gaming.

July 30, 2014 | 11:47 AM - Posted by Anonymous (not verified)

Maybe for CS professionals, but the rest of us do not miss CRTs.

July 30, 2014 | 02:39 PM - Posted by snook

Adaptive-Sync is a VESA standard. It will be added to monitors; that much is simple. It certainly won't run monitor cost up $100+ like G-Sync. Mantle may die due to DX12, not Nvidia; G-Sync will die due to the VESA standard, not AMD. Get used to it.
Listening to and arguing with boors is a poor use of time.
Have a nice day.

July 30, 2014 | 04:32 PM - Posted by arbiter

Well, unless you have a proper card for AMD's not-so-"free"-sync, that $100 difference disappears when you face spending $100-150 on their cheapest card that supports it for gaming. As for movie watching, I never had tearing issues on my current or even my last monitor.

July 30, 2014 | 05:26 PM - Posted by Anonymous (not verified)

What about multi-monitor use?

G-Sync doesn't work with more than one monitor, and apparently Nvidia forgot to tell ASUS about it, according to the ASUS ROG forums.

July 30, 2014 | 08:27 PM - Posted by arbiter

I doubt not-so-"free"-sync will either.

July 30, 2014 | 09:06 PM - Posted by Anonymous (not verified)

We know for certain G-Sync doesn't.

LOL

July 31, 2014 | 04:09 PM - Posted by Jabbadap (not verified)

[OT] It does, but you need as many cards in SLI as you have monitors. G-Sync needs a DP port on the card; DP MST hubs do not cut it (which is kind of retarded).

That's a limitation of current GeForce card outputs; maybe next-generation cards will have more DP outputs to make surround with one card possible (or even über düper surround G-Sync edition cards with DP outputs only). [/OT]

Looking forward to seeing tests with A-Sync monitors when they arrive at the end of the year. Time will tell if it's superior to G-Sync in games (it already is with video/power usage, meaning normal desktop usage).

After all, AMD's FreeSync is just driver-side support for using A-Sync, so it's very possible that Nvidia will write its own driver support, i.e. an extension to G-Sync, to cover usage of A-Sync.

July 30, 2014 | 09:14 PM - Posted by snook

I don't use an AMD card, and G-Sync doesn't move me at all. It will fail to be adopted industry-wide. Adaptive-Sync will not fail to be adopted; it's an industry standard. I wouldn't buy a G-Sync monitor for any reason. Nvidia will drop it in favor of the VESA standard soon enough; if not, they are dumber than I think.
My point isn't just cost, it's adoption: Adaptive-Sync will be around far longer and be more widely available than G-Sync ever will be.

July 31, 2014 | 10:55 AM - Posted by godrilla (not verified)

DirectX 12 requiring an OS upgrade will keep Mantle alive. Mantle will eventually come to Linux, iOS, and Android, while DX12 will be limited to Windows 9! There are more AAA titles supporting Mantle today than ones that do not (COD, GTA V, Dragon Age, Mass Effect 4, etc.). Mantle is here to stay!

August 2, 2014 | 03:25 AM - Posted by arbiter

How long after DX12 comes out will devs support Mantle (without a large check from AMD)?

August 2, 2014 | 08:37 PM - Posted by Anonymous (not verified)

A good amount.

DX12 won't support the CPU optimizations due to backwards-compatibility issues, which was covered in the Tech Report article. If they want to sell more games, they will not target the DX12 market alone. Mantle will be the CPU optimization for those still on DX11 or older, running Windows 8/7.

July 31, 2014 | 07:03 AM - Posted by Anonymous (not verified)

It's difficult to tell without an actual technical specification, but from the way AMD very carefully describes it:

"Project FreeSync's ability to synchronize the refresh rate of a display to the framerate of a graphics card"
"Project FreeSync uses industry-standard DisplayPort Adaptive-Sync protocols to pre-negotiate supported min/max refresh rates during plug'n'play"
"The basic benefit of Project FreeSync is the dynamic refresh rate ("DRR"), which allows the graphics card to synchronize the refresh rate of a monitor 1:1 with the framerate of an AMD Radeon™ GPU"
"Using this approach, no communication must occur to negotiate the time a current frame remains on-screen, or to determine that it is safe to send a new frame to the monitor. By eliminating the need for ongoing communication with pre-negotiated screen update rates, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz."

It sounds very much like FreeSync negotiates a 'lowest common multiple' refresh rate that both the GPU and monitor are locked to, with a lower and upper bound. This allows varying of frame delivery without handshaking, but means that frame delivery times are quantised, unlike G-sync where frame delivery is fully asynchronous right up until the monitor cannot handle another frame due to reading out the current one.

If FreeSync can negotiate a low enough common multiple refresh rate, then the two are probably indistinguishable except for edge cases where latency is key (e.g. head-mounted displays such as the Rift), where G-sync probably has the edge. The fixed minimum handshake latency is likely to be more tolerable than a variable quantisation latency.
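
If the quantisation theory above is right, the worst case it adds is one tick of the negotiated base rate. Here is a toy model of that idea; the 240 Hz base rate is an assumption pulled from the FAQ's quoted ranges, not a confirmed FreeSync parameter:

```python
# Toy model of the quantisation theory: frames may only be presented on
# ticks of a negotiated base clock. 240 Hz is a guess taken from the
# "36-240Hz" range in the FAQ quote, not a confirmed spec value.
import math

BASE_HZ = 240
TICK_MS = 1000 / BASE_HZ  # ~4.17 ms quantum

def quantised_present(render_done_ms):
    """Hold the frame until the next base-clock tick."""
    return math.ceil(render_done_ms / TICK_MS) * TICK_MS

for done_ms in (12.3, 19.8, 33.1):  # arbitrary completion times
    shown = quantised_present(done_ms)
    print(f"done {done_ms:5.1f} ms -> shown {shown:5.2f} ms "
          f"(quantisation delay {shown - done_ms:4.2f} ms)")
```

At a 240 Hz base, the added delay never exceeds one ~4.17 ms tick, which fits the comment's conclusion that the two schemes would be hard to tell apart outside latency-critical edge cases.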

July 31, 2014 | 07:28 AM - Posted by Anonymous (not verified)

"lowest common multiple"
I meant highest common factor. Brainfart.

July 31, 2014 | 07:29 AM - Posted by Anonymous (not verified)

Aaand, 'page not found' apparently means 'post submitted'

July 31, 2014 | 10:00 AM - Posted by Mac (not verified)

"By eliminating the need for ongoing communication with pre-negotiated screen update rates, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz."

"It sounds very much like FreeSync negotiates a 'lowest common multiple' refresh rate that both the GPU and monitor are locked to, with a lower and upper bound. This allows varying of frame delivery without handshaking, but means that frame delivery times are quantised, unlike G-Sync where frame delivery is fully asynchronous right up until the monitor cannot handle another frame due to reading out the current one."

Yeah, I'm not digging your quantisation theory. Are you sure those ranges aren't just chunks that a monitor manufacturer can choose a range from? E.g., if I make a monitor that goes up to 144 Hz, I can set my minimum refresh rate anywhere down to 21 Hz (ideally, I would want the longest time a frame can be held before it starts to decay). They're sending vblank info with/ahead of every frame.

August 2, 2014 | 03:33 AM - Posted by arbiter

Even doing all the negotiating in real time like G-Sync, the latency would be what, 1-2 ms? I can ping a router that is like 150+ miles away and get 15 ms, and that is going through tons of routers and extra hardware. This is one cable from video card to monitor.
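
Taking the comment's 1-2 ms handshake figure at face value (it is a guess, not a measurement), a quick calculation relates it to common frame budgets:

```python
# How large a 1-2 ms per-frame handshake (the commenter's guess, not a
# measured number) would be relative to common refresh intervals.
for hz in (60, 120, 144):
    frame_ms = 1000 / hz
    for handshake_ms in (1.0, 2.0):
        print(f"{hz:3d} Hz ({frame_ms:5.2f} ms/frame): "
              f"{handshake_ms:.0f} ms handshake = "
              f"{100 * handshake_ms / frame_ms:4.1f}% of the frame budget")
```

At 144 Hz, a 2 ms per-frame handshake would eat nearly a third of the 6.94 ms frame budget, which is presumably why AMD's FAQ emphasizes avoiding two-way handshakes.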

August 2, 2014 | 06:25 AM - Posted by arbiter

As said during the podcast, "DisplayPort Adaptive-Sync" is part of the DP standard. Project not-so-"free"-sync is a proprietary AMD thing that modifies the Adaptive-Sync spec, so it's something completely different from the VESA spec. So all the stuff AMD claimed about it being the VESA DP spec was false.

August 2, 2014 | 01:13 PM - Posted by Mac (not verified)

"As said during the podcast, 'DisplayPort Adaptive-Sync' is part of the DP standard. Project not-so-'free'-sync is a proprietary AMD thing that modifies the Adaptive-Sync spec, so it's something completely different from the VESA spec. So all the stuff AMD claimed about it being the VESA DP spec was false."

FreeSync is Radeon hardware and Catalyst software; what good is that to anyone but AMD? There is zero need for it to be open. Adaptive-Sync is what's open and free to any VESA members and other parties that wish to license it, and they're free to call their syncing whatever they want.

August 5, 2014 | 01:29 AM - Posted by snook

thank you

October 15, 2014 | 08:39 PM - Posted by Anonymous (not verified)

I have a Catleap 2560x1440 IPS monitor with the 2B PCB, overclocked to run at 120 Hz. I use a GTX 690. Until G-Sync, FreeSync, or WhateverSync is available on a low-response-time, high-refresh-rate monitor, I can deal with the minimal tearing that I get.

October 15, 2014 | 08:40 PM - Posted by Anonymous (not verified)

I left out the part about also being on an IPS monitor. :p
