Dynamic refresh rates have two main purposes: saving power, by forcing the monitor to refresh only when a new frame is available, and increasing animation smoothness, by synchronizing refreshes to draw rates (rather than "catching the next bus" that leaves every 16.67 ms, on the 16.67 ms, for a 60 Hz monitor). Mobile devices prefer the former, while PC gamers are interested in the latter.
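The "bus schedule" framing can be made concrete with a quick sketch (illustrative only; it assumes an idealized display and ignores scanout time):

```python
import math

# Illustrative comparison: how long a finished frame waits before it is
# shown on a fixed 60 Hz display vs. an idealized variable-refresh one.
REFRESH_INTERVAL_MS = 1000.0 / 60.0  # the 16.67 ms "bus schedule"

def fixed_vsync_wait(frame_done_ms: float) -> float:
    """The frame must wait for the next scheduled refresh tick."""
    next_tick = math.ceil(frame_done_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
    return next_tick - frame_done_ms

def adaptive_wait(frame_done_ms: float) -> float:
    """Idealized adaptive sync: the display refreshes on demand."""
    return 0.0

# A frame finishing 1 ms after a refresh tick waits ~15.67 ms on fixed
# vsync, but ~0 ms with a dynamic refresh rate.
print(fixed_vsync_wait(REFRESH_INTERVAL_MS + 1.0))
print(adaptive_wait(REFRESH_INTERVAL_MS + 1.0))
```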
Obviously, the video camera nullifies the effect.
NVIDIA was first to make this public with G-Sync. AMD responded with FreeSync, starting with a proposal that was later ratified by VESA as DisplayPort Adaptive-Sync. AMD then took up "Project FreeSync" as an AMD "hardware/software solution" to make use of DisplayPort Adaptive-Sync in a way that benefits PC gamers.
Today's news is that AMD has just released an FAQ which explains the standard much more thoroughly than they have in the past. For instance, it clarifies the distinction between DisplayPort Adaptive-Sync and Project FreeSync. Prior to the FAQ, I thought that FreeSync became DisplayPort Adaptive-Sync, and that was that. Now it sounds a bit more proprietary, just built upon an open VESA standard.
If interested, check out the FAQ at AMD's website.
AMD fails again. Now you know why Nvidia went all out and developed their own logic board, aka the G-SYNC board: because monitor manufacturers and logic board manufacturers are not going to do the work for them. AMD is relying completely on faith that they will….lol
Oh wow… the fanboy is on a roll. Posting like there is no tomorrow. Also understood nothing from the FAQ.
hey, next year this time I’ll ask you if you’ve seen a monitor that’s FreeSync enabled. While you pout I’ll have been enjoying my ROG Swift for over a year…poor little guy was misled.
BTW I’ve been using G-Sync since last January. Purchased the ASUS VG248QE (G-Sync logic installed) from Digital Storm. WORTH EVERY PENNY!
enjoy being entitled. BTW Obama called, said the cell phone he gave you is about to go out of service.
You better start investing in communications hardware. Your brain has lost contact with reality.
Looks more like he’s lost contact with his brain.
Nvidia fails again, actually. Once again they make everything overly expensive due to completely pointless proprietary tech for something you can get for free. Typical Nvidia really, and typical Nvidia fanboys who are absolutely clueless as well.
finally the light has been shown. been saying this since day one.
“How is Project FreeSync different from NVIDIA G-Sync?
There are three key advantages Project FreeSync holds over G-Sync: no licensing fees for adoption, no expensive or proprietary hardware modules, and no communication overhead.
The last benefit is essential to gamers, as Project FreeSync does not need to poll or wait on the display in order to determine when it’s safe to send the next frame to the monitor.
Project FreeSync uses industry-standard DisplayPort Adaptive-Sync protocols to pre-negotiate supported min/max refresh rates during plug’n’play, which means frame presentation to the user will never be delayed or impaired by time-consuming two-way handshakes.”
LMFAO!!!! lies!!! really? how does this magically come to life? oh yah…you have to build a logic board (VR) capable and oh yah a monitor company needs to support and build a monitor around that logic board…..oh yah..AMD you forgot to mention you need one of you most current GPU’s…fucken liars.
The only reason DisplayPort Adaptive-Sync monitors won’t make it to market is if there is no market, and if you believe there is no market you don’t know anything about gaming or film enthusiasts.
Have you ever considered that competition is the thing to champion, not brands? Because it sounds like you’re still wiping NVidia’s semen off your chin.
F***ing fanboys… go and try to buy yourself a G-Sync monitor ^^
already did last January. worth every single penny.
You bought a kit, not a monitor.
This guy is as dumb as they come.
ummm it’s called a company by the name of Digital Storm…and no, I bought the monitor complete with G-Sync (kit) installed by them… for $499
Hey, Obama called, said the cell phone he gave you is about to expire.
Gotta go with the fanboy on this one. The only thing that matters here is how much it will cost to get one of these monitors on my desk. G-Sync adds cost. That’s fine, because I get a benefit. AMD should stop selling this thing as “Free”. If you think that monitor manufacturers are going to start giving away new features on their products for “Free” then you’re batcrap insane.
Yeah, the NVIDIA “fanboy” is correct about this. AMD misrepresented/twisted/lied/exaggerated about the requirements and features of A-Sync/Free-Sync.
I should have just bought a G-Sync monitor in January when they were available (they were and are still available to purchase for anyone that knows how to use Google and shop online). I would have been a lot happier over the last 6 months than I have been waiting for more details on AMD Weak-Sync.
Liars? R. Huddy stated clearly that higher modes will need a logic board inside the monitor (lower ones will not).
But since it is based on a standard, anyone can make their own solution based on that, which AMD could never have done with G-SYNC.
Not sure what all the trolling is about, but what do I know. I thought if you have G-SYNC you should be happy with it (you bought what you wanted, it works as you expected), but according to your responses, you do not sound happy at all.
Furthermore, if you do not have a Radeon, then you do not need to care about FreeSync at all; you could not use it anyway. Just saying, if you missed that.
There’s only a very limited number of AMD cards that can use FreeSync; the 260 and 290 ranges are the only ones. I’m on a 290 model so I’m okay, but people on 270s, 280s and 7000 series cards are out of luck. I’m sure the next range will all support it, but they’re bringing an inferior system to the table very late in the day, and it is inferior. If you go far enough into the subject you can’t avoid that truth, unless you’re blinded by irrelevant points such as what brand does what.
“oh yah..AMD you forgot to mention you need one of you most current GPU’s…fucken liars.”
Desperate nvidia fanboy?
Since when is a product from 2011 the most current?
You misread the FAQ.
“All AMD Radeon™ graphics cards in the AMD Radeon™ HD 7000, HD 8000, R7 or R9 Series will support Project FreeSync for video playback and power-saving purposes.”
2011 cards for video playback only.
” The AMD Radeon™ R9 295X2, 290X, R9 290, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming.”
Cards from the last year are required for the gaming improvements.
what is really sad is that only AMD GPUs of the newest range support it for games, whereas Nvidia has support for any Kepler GPU, so 650 Ti and up. So really “free-sync” is not so free, knowing that a new monitor is going to be needed either way.
As for him misreading that, I bet a ton of AMD users will misread it and think their card supports it in games, then find out it doesn’t when they try. It’s one of those things with people who don’t read.
exactly
All APUs and GCN 2.0 support it, which for new tech just months old is not bad. VBLANK couldn’t be used on its own anyway, so for 100% compatibility they cannot support more models. Would you expect them to change graphics cards for everyone for free? Just because NVidia offers maybe 2/3 more chips that can handle theirs. I would love to hear what shareholders would say about that xP
but isn’t it that they called it FreeSync because it was supposed to be a free solution (no need for a new monitor and no need for a new GPU)?
Nvidia + G-Sync = TN panel
AMD + Adaptive Sync = IPS panel
Um, I don’t think G-Sync will be limited to TN panels. The ones announced are, but likely IPS ones will show up as well.
Looking around, the reason IPS is lacking is because they are, well, too slow; G-Sync’s focus is more on gaming. Some stuff I read says IPS is mostly limited to under 100 Hz, more around 75 Hz, except a few that can go higher.
Don’t care about 100000000000hz.
ANYsync means NOTHING to me if it’s on some sh1t TN…it might as well not exist.
The first 21:9 IPS with G-Sync OR FreeSync will be mine… don’t care about cost or whatever
Everything is focused on TN because the rest of LCDs are shit for gaming, PERIOD.
IPS… still not fast enough for gaming.
8-bit high-grade TN panel, 144 Hz, G-Sync and 1440p.
ROG SWIFT PG278Q FTW!!!
oh yah, ASUS makes a lot of AMD products…why haven’t they supported FreeSync? oh that’s right, AMD never made a logic board to support variable refresh rate. Nvidia did.
Actually IPS can be made fast enough for gaming. I found this out while reading up on lightboost on blurbusters.com.
The only problem is that these displays cost five figures. I think most gamers would like to spend that much money on their car and not an IPS display.
LightBoost requires a high framerate to work; IPS would have to run over 60 Hz and have support for it.
LightBoost/ULMB work with 100-120 FPS.
BenQ LightBoost can strobe down to 70 FPS.
Also LightBoost adds latency and you get color degradation from the strobe.
LCDs in general blow for gaming; that’s why I have a Sony FW900 CRT. We seriously need OLED.
AMD will have to make a logic board (the monitor will need it, though it will work a bit differently) – for higher modes it will also be necessary. But FreeSync is not out yet, so we will know (which modes) when it gets out.
IF you have a Radeon, then you just wait; if you cannot wait, you will buy NVidia. Not sure why all those hateful responses. NVidia presented a fix for some problem; AMD made their own solution pretty fast. Pretty much what AMD did with Mantle, and NVidia answered with AZDO.
I am definitely not going to hang myself over AMD having no Adaptive-Sync monitor atm.
Nah, your face is shit… period.
All gaming TN panels are overdriven, even the new ASUS ROG Swift. Nothing is stopping them from overdriving IPS panels except price.
Monitor sellers didn’t pay attention to TN panels being overdriven until they realized there was money to be made from people who bought Asian brands which overdrove them.
It will not; G-SYNC has its limits already (same as FreeSync), no need to add more.
Another question is why use a no-latency system on a monitor that has some unnecessary latency of its own, but that is something else.
pfff, in what imaginary world does this AMD FreeSync monitor exist. Oh wait, that’s right: there isn’t a single logic board/monitor manufacturer working on this….
Actually some monitors already in existence support DisplayPort Adaptive-Sync with a simple firmware upgrade.
http://www.anandtech.com/show/8129/computex-2014-amd-demonstrates-first-freesync-monitor-prototype
I wouldn’t expect you to know that though, sounds like you’ve had blinkers on since you bought your GSync monitor. Your posts read like a textbook on choice-supportive bias.
And it works fine, but that only proves the solution works; it is just not available on the market yet.
That is an AMD claim, and we all know the history of AMD and their claims. AMD has made many claims over the years and ended up having to back-pedal on them. We don’t know what they did to those monitors to get them to support it. Likely only one to a handful will support it. I wouldn’t be shocked if there is no firmware release for any and you’ve gotta buy new; I doubt monitor makers want to be on the hook for end users bricking their monitors and then whining about it.
lol, for over a year. your lies are beautiful^^ FreeSync is a VESA standard and FreeSync will support much more than 144 Hz.
the first FreeSync monitors will sample this month, with finished products expected to ship early next year.
The Video Electronics Standards Association (VESA) already announced the addition of ‘Adaptive-Sync’ to its popular DisplayPort 1.2a video interface standard.
The protocol FreeSync / Adaptive Sync will be embedded into DisplayPort 1.2a
The use of GSync or ASync has nothing to do with TN or IPS. Gaming monitors in general will mostly be TN because IPS pixel response is too slow for much over 60Hz. TN can be used for almost any frequency due to its blazing response time. Combined with true 8-bit panels (ASUS ROG PG278Q), TN’s only flaw is viewing angles… and why would you ever play a game at an off-angle?
Do you dumb kiddies even know what you are talking about?
That “response time” you see advertised is for color shifts. Lower = less ghosting.
It has precisely NOTHING to do with responsiveness of the game.
You’re thinking of “input lag”.
Yeah, wonder why they don’t advertise that one… because some non-gaming panels like the Qnix IPS have it lower than some “GAMING” TN trash.
That ROG Swift is SH1T. It’s just as SH1T as any other TN, just a lot smoother now… dunno, I’d rather blowtorch my eyes myself.
Go home, you’re drunk.
Not only did I not bring up input lag or game responsiveness, I wasn’t even referring to it. I only brought up the pixel response time – the time it takes to change colors, gray to gray, light to dark. Many IPS monitors suffer from ghosting exactly for this reason (too slow).
This will probably be an easy implementation for monitor manufacturers.
They will do it even if no one uses it, as they can add an extra label on the monitor and in marketing, etc.
G-Sync will not survive. Do you even know anyone who owns a G-Sync monitor or even seriously plans to buy one? I don’t. It is too dear, and you must have an Nvidia card, which cuts the market in half right there.
cut the market in half, huh? just like AMD did with Mantle?
AMD has openly promised Intel access to Mantle and has invited Nvidia to join the party. That Nvidia does not want to use this technology is not AMD’s fault.
AMD will invite Nvidia to “join the party” so Nvidia will spend their development money on it, so AMD can steal Nvidia’s work on improving it. You don’t think they would? AMD doesn’t have the money to develop anything on their own. Why do you think AMD is relying on monitor manufacturers and VESA to help get this whole “FREE-SYNC” up and going? Nvidia spent their own money to get G-SYNC implemented with manufacturers. Creating their own logic board is not cheap. You think AMD could afford to do that?
Yes, but you mistake AMD for NVidia. AMD uses standards when possible (Mantle is beta and cannot be released without documentation, but it will be open to other vendors). NVidia uses proprietary tech when possible.
It doesn’t matter how easy it is for them to implement. They are not going to give away new functionality for free.
I don’t think AMD just wants to make money here. They indirectly want G-Sync to fail. Just like Nvidia wants (even more so) for Mantle to fail.
Interesting to find out that older generation cards will get some benefit from this technology.
What excites me after reading the FAQ is a mantle optimised game teamed up with a freesync or ‘ADAPTIVE SYNC based tech’ monitor. *Smooth operator* Silky smooth gameplay for sure!
AMD should get Sade to do their adverts for this lol 🙂
I have watched a ton of video on my 60 Hz monitor and never seen any tearing issues. As for power saving, my monitor uses 25 watts, so nothing really to save there.
These threads always turn into a pissing contest of ignorance.
Both G-Sync and FreeSync are irrelevant until we get decent LCDs. As it stands, CRTs are still the only real choice, especially for gaming.
Maybe for CS professionals, but the rest of us do not miss CRTs.
adaptive sync is a VESA standard. it will be added to monitors, that much is simple. It certainly won’t run monitor cost up $100+ like G-Sync. Mantle may die due to DX12, not Nvidia. G-Sync will die due to the VESA standard, not AMD. get used to it.
listening to and arguing with boors is a poor use of time.
have a nice day.
Well, unless you have a proper card for AMD’s not-so-“free-sync”, that $100 difference disappears with the fact you’re spending $100-150 for their cheapest card that supports it for gaming. As for movie watching, I never had tearing issues on my current or even my last monitor.
What about multi-monitor use?
G-Sync doesn’t work with more than one monitor, and apparently Nvidia forgot to tell ASUS about it, according to the ASUS ROG forums.
i doubt not-so-“free-sync” will either
We know for certain G-Sync doesn’t.
LOL
[OT]It does, but you need as many cards in SLI as you have monitors. G-Sync needs a DP port on the card; DP MST hubs do not cut it (which is kind of retarded).
That’s a limitation of current GeForce card outputs; maybe next-generation cards will have more DP outputs to make surround with one card possible (or even über düper surround G-Sync edition cards with DP outputs only).[/OT]
Looking forward to seeing tests with Adaptive-Sync monitors when they arrive at the end of the year. Time will tell if it’s superior to G-Sync in games (it already is for video/power usage, meaning normal desktop usage).
After all, AMD’s FreeSync is just driver-side support for using Adaptive-Sync. So it’s very possible that Nvidia will write its own driver support, i.e. an extension to G-Sync, to cover usage of Adaptive-Sync.
I don’t use an AMD card, and G-Sync doesn’t move me at all. it will fail to be adopted industry-wide. Adaptive-Sync will not fail to be adopted; it’s an industry standard. I wouldn’t buy a G-Sync monitor for any reason. Nvidia will drop it in favor of the VESA standard soon enough. if not, they are dumber than I think.
my point isn’t just cost, it’s adoption. Adaptive-Sync will be around far longer and be more widely available than G-Sync ever will be.
DirectX 12 requiring an OS upgrade will keep Mantle alive; Mantle will eventually come to Linux, iOS, and Android, while DX12 will be limited to Windows 9! There are more AAA titles today supporting Mantle than ones that do not (COD, GTAV, Dragon Age, Mass Effect 4, etc.). Mantle is here to stay!
How long after DX12 comes out will devs support Mantle (without a large check from AMD)?
A good amount.
DX12 won’t support CPU optimization due to backwards-compatibility issues, which was covered in the TechReport article. If they want to sell more games they will not target the DX12 market alone. Mantle will be the CPU optimization for those still on DX11 or older, running Windows 8/7.
It’s difficult to tell without an actual technical specification, but from the way AMD very carefully describe it:
“Project FreeSync’s ability to synchronize the refresh rate of a display to the framerate of a graphics card ”
“Project FreeSync uses industry-standard DisplayPort Adaptive-Sync protocols to pre-negotiate supported min/max refresh rates during plug’n’play”
“The basic benefit of Project FreeSync is the dynamic refresh rate (“DRR”), which allows the graphics card to synchronize the refresh rate of a monitor 1:1 with the framerate of an AMD Radeon™ GPU”
“Using this approach, no communication must occur to negotiate the time a current frame remains on-screen, or to determine that is safe to send a new frame to the monitor.
By eliminating the need for ongoing communication with pre-negotiated screen update rates,”
“the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.”
It sounds very much like FreeSync negotiates a ‘lowest common multiple’ refresh rate that both the GPU and monitor are locked to, with a lower and upper bound. This allows varying of frame delivery without handshaking, but means that frame delivery times are quantised, unlike G-sync where frame delivery is fully asynchronous right up until the monitor cannot handle another frame due to reading out the current one.
If FreeSync can negotiate a low enough common multiple refresh rate, then the two are probably indistinguishable except for edge cases where latency is key (e.g. head-mounted displays such as the Rift), where G-sync probably has the edge. The fixed minimum handshake latency is likely to be more tolerable than a variable quantisation latency.
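For what it's worth, the quantisation idea can be sketched numerically. This is purely a toy model of that reading of the FAQ, not anything from the actual Adaptive-Sync spec; the 240 Hz base clock is an assumption lifted from the widest range AMD lists (36-240 Hz):

```python
import math

# Toy model: if frame flips may only land on ticks of a negotiated base
# clock (assumed 240 Hz here), a frame finishing between ticks waits for
# the next tick. A fully asynchronous scheme would flip immediately.
BASE_HZ = 240.0
TICK_MS = 1000.0 / BASE_HZ  # ~4.17 ms quantum

def quantised_flip_delay(frame_done_ms: float) -> float:
    """Extra latency added if flips snap to base-clock ticks."""
    next_tick = math.ceil(frame_done_ms / TICK_MS) * TICK_MS
    return next_tick - frame_done_ms

# Worst case approaches one full tick (~4.17 ms at a 240 Hz base).
worst = max(quantised_flip_delay(t / 100.0) for t in range(1, 1000))
print(f"worst-case quantisation delay: {worst:.2f} ms")
```

Under this reading, the higher the negotiated base rate, the smaller the quantisation penalty, which would fit with the quoted ranges topping out at high maximum rates.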
lowest common multiple
I meant highest common factor. Brainfart.
I meant Highest Common Factor. Brainfart.
Aaand, ‘page not found’ apparently means ‘post submitted’
“By eliminating the need for ongoing communication with pre-negotiated screen update rates, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.”
“It sounds very much like FreeSync negotiates a ‘lowest common multiple’ refresh rate that both the GPU and monitor are locked to, with a lower and upper bound. This allows varying of frame delivery without handshaking, but means that frame delivery times are quantised, unlike G-sync where frame delivery is fully asynchronous right up until the monitor cannot handle another frame due to reading out the current one.”
Yeah, I’m not digging your quantisation theory. Are you sure those ranges aren’t just chunks that a monitor manufacturer can choose a range from? E.g., if I make a monitor that goes up to 144 Hz, I can set my minimum refresh rate anywhere from 21 Hz (ideally, I would want the longest time a frame can be held before it starts to decay). They’re sending vblank info with/ahead of every frame.
Even doing all the negotiating in real time like G-Sync, latency would be what, 1-2 ms? I can ping a router that is like 150+ miles away and get 15 ms, and that is going through tons of routers and extra hardware. This is one cable from video card to monitor.
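Some rough numbers back this up. A sketch, under ballpark assumptions (DisplayPort's native AUX channel runs at roughly 1 Mbit/s, copper propagation at about two-thirds the speed of light, and protocol sync/turnaround overhead ignored):

```python
# Ballpark figures for the 'handshake latency' argument: signal
# propagation over a short cable is negligible, and even a small
# request/reply exchange at AUX-channel speeds stays well under 1 ms.
AUX_BITS_PER_SEC = 1_000_000   # assumed native DP AUX rate, ~1 Mbit/s
SIGNAL_SPEED_M_S = 2e8         # ~2/3 the speed of light in copper
CABLE_M = 2.0

def propagation_ms(length_m: float) -> float:
    """One-way signal travel time down the cable."""
    return length_m / SIGNAL_SPEED_M_S * 1000.0

def transaction_ms(payload_bytes: int) -> float:
    """Naive request + reply of equal size, serialized at the AUX rate."""
    return (payload_bytes * 8 * 2) / AUX_BITS_PER_SEC * 1000.0

print(f"cable propagation: {propagation_ms(CABLE_M):.5f} ms")
print(f"16-byte round trip: {transaction_ms(16):.3f} ms")
```

Even with generous overhead factors, a per-frame exchange at these rates would cost fractions of a millisecond; any real-world cost would come from how quickly the display's firmware responds, not the wire.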
As said during the podcast, “DisplayPort Adaptive-Sync” is part of the DP standard. Project not-so-“free-sync” is a proprietary thing by AMD that mods the Adaptive-Sync spec, so it’s something completely different from the VESA spec. So all the stuff AMD claimed about it being the VESA DP spec was false.
FreeSync is Radeon hardware and Catalyst software; what good is that to anyone but AMD? There is zero need for it to be open. Adaptive-Sync is what’s open and free to any VESA members and other parties that wish to license it, and they’re free to call their syncing whatever they want.
thank you
I have a Catleap 2560×1440 IPS Monitor with the 2B PCB overclocked to run at 120Hz. I use a GTX 690. Until GSync, FreeSync, or WhateverSync is available on a low response time high refresh rate monitor, I can deal with the minimal tearing that I get.
I left out the part about also being on an IPS monitor. :p