AMD Releases FreeSync Information as a FAQ

Subject: General Tech, Graphics Cards, Displays | July 29, 2014 - 09:02 PM |
Tagged: vesa, nvidia, g-sync, freesync, DisplayPort, amd

Dynamic refresh rates have two main purposes: saving power by refreshing the monitor only when a new frame is available, and increasing animation smoothness by synchronizing refreshes to the draw rate (rather than "catching the next bus" every 16.67 ms, as on a fixed 60 Hz monitor). Mobile devices prefer the former, while PC gamers are interested in the latter.
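As a rough illustration of the "next bus" problem, here is a toy model of presentation delay on a fixed-interval panel versus an adaptive one. This is a sketch, not how display drivers actually work (real timing involves scanout, vblank windows, and driver queues), and the frame times and the 30-60 Hz refresh window are assumed values for the example.

```python
import math

REFRESH_MS = 1000.0 / 60.0  # ~16.67 ms tick on a fixed 60 Hz panel

def fixed_refresh_wait(render_done_ms):
    """A finished frame waits for the next fixed refresh tick ("the next bus")."""
    next_tick = math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS
    return next_tick - render_done_ms

def adaptive_refresh_wait(render_done_ms, last_present_ms, max_hz=60):
    """A finished frame is shown as soon as the panel's refresh window allows."""
    min_interval = 1000.0 / max_hz  # the panel can't refresh faster than max_hz
    earliest = last_present_ms + min_interval
    return max(0.0, earliest - render_done_ms)

# A frame that finishes ~1 ms after a refresh tick waits almost a full
# interval on the fixed panel, but can be presented nearly immediately
# with adaptive refresh (the previous present was long enough ago).
print(fixed_refresh_wait(17.67))          # ~15.66 ms of added wait
print(adaptive_refresh_wait(17.67, 0.0))  # 0.0 ms of added wait
```

The smoothness benefit for gamers falls out of the second function: when frame times vary, the wait no longer quantizes to multiples of the refresh interval.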

Obviously, the video camera nullifies the effect.

NVIDIA was first to make this public with G-Sync. AMD responded with FreeSync, starting with a proposal that was later ratified by VESA as DisplayPort Adaptive-Sync. AMD, then, took up "Project FreeSync" as an AMD "hardware/software solution" to make use of DisplayPort Adaptive-Sync in a way that benefits PC gamers.

Today's news is that AMD has just released a FAQ which explains the standard much more thoroughly than they have in the past. For instance, it clarifies the distinction between DisplayPort Adaptive-Sync and Project FreeSync. Prior to the FAQ, I thought that FreeSync became DisplayPort Adaptive-Sync, and that was that. Now, it is sounding a bit more proprietary, just built upon an open VESA standard.

If interested, check out the FAQ at AMD's website.

Source: AMD
July 29, 2014 | 11:57 PM - Posted by H1tman_Actua1

AMD fail again. Now you know why Nvidia went all out and developed their own logic aka G-SYNC board, because monitor manufacturers and logic board manufacturers are not going to do the work for them. AMD is relying completely on faith of

July 30, 2014 | 03:21 AM - Posted by JohnGR

Oh wow... the fanboy is on a roll. Posting like there is no tomorrow. Also understood nothing from the faq.

July 30, 2014 | 10:52 AM - Posted by H1tman_Actua1

hey next year this time. I'll ask you if you've seen a monitor that's freesync enabled. While you pout I'll have been enjoying my ROG swift for over a year...poor little guy was misled.

BTW I've been using Gsync since last January. Purchased the ASUS VG248QE (Gsync logic installed) from Digital Storm. WORTH EVERY PENNY!

enjoy being entitled. BTW Obama called said the cell phone he gave you is about to go out of service.

July 30, 2014 | 11:50 AM - Posted by JohnGR

You better start investing in communications hardware. Your brain has lost contact with reality.

July 30, 2014 | 01:07 PM - Posted by Anonymous (not verified)

Looks more like he's lost contact with his brain.

August 2, 2014 | 02:34 AM - Posted by BDK (not verified)

Nvidia fails again actually. Once again they make everything overly expensive due to completely pointless propriety tech for something you can get for free. Typical nvidia really, and typical nvidia fanboys who are absolutely clueless as well.

July 29, 2014 | 11:58 PM - Posted by H1tman_Actua1

finally the light has been shown. been saying this since day one.

July 30, 2014 | 12:07 AM - Posted by H1tman_Actua1

"How is Project FreeSync different from NVIDIA G-Sync?
There are three key advantages Project FreeSync holds over G-Sync: no licensing fees for adoption, no expensive or proprietary hardware modules, and no communication overhead.
The last benefit is essential to gamers, as Project FreeSync does not need to poll or wait on the display in order to determine when it's safe to send the next frame to the monitor.
Project FreeSync uses industry-standard DisplayPort Adaptive-Sync protocols to pre-negotiate supported min/max refresh rates during plug'n'play, which means frame presentation to the user will never be delayed or impaired by time-consuming two-way handshakes."

LMFAO!!!! lies!!! really? how does this magically come to life? oh have to build a logic board (VR) capable and oh yah a monitor company needs to support and build a monitor around that logic board.....oh yah..AMD you forgot to mention you need one of you most current GPU's...fucken liars.

July 30, 2014 | 01:29 AM - Posted by Iconix (not verified)

The only reason DisplayPort Adaptive-Sync monitors won't make it to market is if there is no market, and if you believe there is no market you don't know anything about gaming or film enthusiasts.

Have you ever considered that competition is the thing to champion, not brands? Because it sounds like you're still wiping NVidia's semen off your chin.

July 30, 2014 | 03:07 AM - Posted by Anonymous (not verified)

F***ing fanboys... go and try to buy yourself a G-Sync-Monitor ^^

July 30, 2014 | 10:53 AM - Posted by H1tman_Actua1

already did last January. worth every single penny.

August 2, 2014 | 08:22 PM - Posted by Anonymous (not verified)

You bought a kit not a monitor.

This guy is as dumb as they come.

August 7, 2014 | 10:54 AM - Posted by H1tman_Actua1

ummm it's called a company by the name of digital storm...and no I bought the monitor complete with Gsync (kit) installed by them... for $499

Hey Obama called said the cell phone he gave you is about to expire.

July 30, 2014 | 09:16 AM - Posted by Anonymous (not verified)

Gotta go with the fanboy on this one. The only thing that matters here is how much it will cost to get one of these monitors on my desk. Gsync adds cost. That's fine, because I get a benefit. AMD should stop selling this thing as "Free". If you think that monitor manufacturers are going to start giving away new features on their products for "Free" then you're batcrap insane.

July 30, 2014 | 10:24 AM - Posted by Anonymous (not verified)

Yeah, the NVIDIA "fanboy" is correct about this. AMD misrepresented/twisted/lied/exaggerated about the requirements and features of A-Sync/Free-Sync.

I should have just bought a G-Sync monitor in January when they were available (they were and are still available to purchase for anyone that knows how to use Google and shop online). I would have been a lot happier over the last 6 months than I have been waiting for more details on AMD Weak-Sync.

July 30, 2014 | 12:36 PM - Posted by ROdNEY

Liars? R Huddy stated clearly that higher modes will need a logic board inside the monitor (lower ones will not).
But since it is based on a standard, anyone can make their own solution on top of it, which AMD could never have done with G-SYNC.

Not sure what all the trolling is about, but what do I know. I thought if you have G-SYNC you should be happy with it (you bought what you wanted, it works as you expected), but according to your responses, you do not sound happy at all.

Furthermore, if you do not have a Radeon, then you do not need to care about FreeSync at all; you could not use it anyway. Just saying, in case you missed that.

August 1, 2014 | 05:03 AM - Posted by nashathedog

There's only a very limited set of AMD cards that can use FreeSync for gaming: the 260 and 290 ranges are the only ones. I'm on a 290 model so I'm okay, but people on 270s, 280s and 7000 series cards are out of luck. I'm sure the next range will all support it, but they're bringing an inferior system to the table very late in the day, and it is inferior. If you go far enough into the subject you can't avoid that truth, unless you're blinded by irrelevant points such as what brand does what.

July 30, 2014 | 12:36 AM - Posted by Anonymous (not verified)

"oh yah..AMD you forgot to mention you need one of you most current GPU's...fucken liars."

Desperate nvidia fanboy?

Since when is a product from 2011 the most current.

July 30, 2014 | 01:43 AM - Posted by mutantmagnet (not verified)

You misread the FAQ.

"All AMD Radeon™ graphics cards in the AMD Radeon™ HD 7000, HD 8000, R7 or R9 Series will support Project FreeSync for video playback and power-saving purposes."

2011 cards for video playback only.

" The AMD Radeon™ R9 295X2, 290X, R9 290, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming."

Cards in the last year required for gaming improvements.

July 30, 2014 | 04:56 AM - Posted by arbiter

What is really sad is that only AMD's newest range of GPUs supports it for games, whereas Nvidia supports any Kepler GPU, so the 650 Ti and up. So really "FreeSync" is not so free, knowing that a new monitor is going to be needed either way.

As for him misreading that, I bet a ton of AMD users will misread it, think their card supports it in games, and then find out it doesn't when they try. It's one of those things with people who don't read.

July 30, 2014 | 10:44 AM - Posted by H1tman_Actua1


July 30, 2014 | 12:43 PM - Posted by ROdNEY

All APUs and GCN 2.0 cards support it, which for new tech just months old is not bad. VBLANK couldn't be used on its own anyway, so they cannot support more models with 100% compatibility. Would you expect them to give everyone new graphics hardware for free? Just because NVidia offers maybe 2/3 more chips that can handle theirs. I would love to hear what the shareholders would say about that xP

July 31, 2014 | 07:09 AM - Posted by renz (not verified)

But isn't it that they called it FreeSync because it was supposed to be a free solution (no need for a new monitor and no need for a new GPU)?

July 30, 2014 | 03:28 AM - Posted by Anonymous (not verified)

Nvidia + G-Sync = TN panel

AMD + Adaptive Sync = IPS panel

July 30, 2014 | 05:10 AM - Posted by arbiter

Um, I don't think G-Sync will be limited to TN panels. The ones announced are, but IPS ones will likely show up as well.

Looking around, the reason IPS is lacking is that they are simply too slow; G-Sync's focus is more on gaming. From what I've read, IPS is mostly limited to under 100 Hz, more like 75 Hz, except a few that can go higher.

July 30, 2014 | 06:33 AM - Posted by Anonymous (not verified)

Don't care about 100000000000 Hz.

ANYsync means NOTHING to me if it's on some sh1t panel; might as well not exist.

The first 21:9 IPS with G-Sync OR FreeSync will be mine... don't care about cost or whatever.

July 30, 2014 | 08:17 AM - Posted by Anonymous (not verified)

Everything is focused on TN because the rest of the LCD types are shit for gaming, PERIOD.

July 30, 2014 | 10:46 AM - Posted by H1tman_Actua1

IPS... still not fast enough for gaming.

8-bit high-grade TN panel, 144 Hz, Gsync and 1440p.

Oh yeah, ROG/ASUS makes a lot of AMD product... why haven't they supported FreeSync? Oh, that's right: AMD never made a logic board to support variable refresh rate. Nvidia did.

July 30, 2014 | 12:05 PM - Posted by mutantmagnet (not verified)

Actually IPS can be made fast enough for gaming. I found this out while reading up on lightboost on

The only problem is that these displays cost five figures. I think most gamers would like to spend that much money on their car and not an IPS display.

July 30, 2014 | 03:06 PM - Posted by Anonymous (not verified)

Lightboost requires a high framerate to work; IPS would have to run over 60 Hz and have support for it.

Lightboost/ULMB work at 100-120 FPS.
BenQ lightboost can strobe down to 70 FPS.

Also, lightboost adds latency and you get color degradation from the strobe.

LCDs in general blow for gaming; that's why I have a Sony FW900 CRT. We seriously need OLED.
