Richard Huddy Discusses FreeSync Availability Timeframes

Subject: General Tech, Displays | August 14, 2014 - 01:59 PM
Tagged: amd, freesync, g-sync, Siggraph, siggraph 2014

At SIGGRAPH, AMD's Richard Huddy gave The Tech Report release windows for FreeSync, the company's adaptive refresh rate technology. Compatible monitors will begin sampling "as early as" September, and actual products are expected to ship to consumers in early 2015. Apparently, more than one display vendor is working on support, although names and vendor-specific release windows are unannounced.


As for the cost of implementation, Richard Huddy believes the added cost to the manufacturer should be no more than $10-20 USD. Of course, the final price to end users cannot be derived from this; it depends on how quickly the display vendor expects to sell the product, profit margins, their willingness to push new technology, competition, and so forth.

If you want to take full advantage of FreeSync, you will need a compatible GPU (look for "gaming" support in AMD's official FreeSync compatibility list). All future AMD GPUs are expected to support the technology.
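
To get a rough sense of why variable refresh matters, here is a hypothetical timing sketch in Python (purely illustrative; the frame rates and function names are our own assumptions, not anything from AMD's implementation). With v-sync on a fixed 60 Hz panel, a finished frame has to wait for the next scheduled scan-out, so a steady 45 FPS render rate turns into uneven on-screen pacing; an adaptive-refresh panel simply scans out as soon as each frame is done.

import math

# Illustrative assumptions: the GPU renders at a steady 45 FPS and the fixed
# panel refreshes at 60 Hz. Neither figure comes from the article.
FIXED_REFRESH_MS = 1000 / 60   # fixed 60 Hz scan-out interval
FRAME_TIME_MS = 1000 / 45      # assumed GPU render time per frame

def fixed_vsync_display_times(frame_ready_ms):
    """With v-sync on a fixed 60 Hz panel, a finished frame waits for the next scan-out."""
    return [math.ceil(t / FIXED_REFRESH_MS) * FIXED_REFRESH_MS for t in frame_ready_ms]

def adaptive_refresh_display_times(frame_ready_ms):
    """With adaptive refresh, the panel scans out as soon as the frame is ready."""
    return list(frame_ready_ms)

ready = [FRAME_TIME_MS * i for i in range(1, 6)]
print("frame ready (ms):", [round(t, 1) for t in ready])
print("fixed 60 Hz (ms):", [round(t, 1) for t in fixed_vsync_display_times(ready)])
print("adaptive (ms):   ", [round(t, 1) for t in adaptive_refresh_display_times(ready)])

The fixed-refresh column comes out with uneven gaps between displayed frames (the stutter v-sync users notice below 60 FPS), while the adaptive column keeps a constant 22.2 ms cadence.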

Source: Tech Report
August 14, 2014 | 02:39 PM - Posted by Bret (not verified)

I still think "FreeSync" is in the sink, so to speak. Huddy says it is merely a BIOS change; now he is backing off to say it is monitors that 'can' support it. Wouldn't that hint at the need for dedicated hardware, like G-Sync, and is that charge hardware or a license? GJ anyway, Scott. It seems the duel between Green and Red will continue; the last to speak will be the winner with the biggest lie, LOL!!!

August 14, 2014 | 03:12 PM - Posted by nathanddrews

If manufacturers are smart, they'll just make monitors that do both. For the enthusiast crowd this is targeting, monitors usually outlive the GPUs they connect to by several generations.

August 14, 2014 | 03:25 PM - Posted by Mac (not verified)

If I were a monitor manufacturer, I would have an aversion to implementing technology in my monitors that was restricted to one particular IHV's products. It might yet prove a masterstroke by AMD to go through a standards body with their tech.

August 14, 2014 | 10:24 PM - Posted by arbiter

Yeah, if AMD's implementation were actually a standard, but it's still proprietary code.

August 14, 2014 | 03:17 PM - Posted by Qrash

I feel dumb for asking, but if it's called "Free"Sync, does this mean that Nvidia GPUs can use it too?

August 14, 2014 | 03:33 PM - Posted by Scott Michaud

Nope. NVIDIA can use adaptive refresh, found in DisplayPort 1.2a, but not "FreeSync". It surprised us a bit that they are not the same technology (and maybe AMD originally intended them to be). Unfortunately, we don't know for sure.

August 15, 2014 | 01:42 AM - Posted by ZoA (not verified)

The FreeSync part is the AMD proprietary driver needed to use adaptive sync in a way that reduces tearing and stuttering, so talking about FreeSync being used on an Nvidia card is as stupid as talking about running an Nvidia card with Catalyst drivers.

What Nvidia can do is write their own drivers for adaptive sync that would do the same thing as G-Sync, only without all the additional hardware that comes with G-Sync. Of course, they are unlikely to do that, as G-Sync is more profitable because monitor manufacturers have to pay them a license for every monitor they make with G-Sync. This is why G-Sync adds some $200 to the cost of a monitor: hardware plus the Nvidia license.

AMD's solution is to have monitor manufacturers integrate the license-free Adaptive-Sync feature of the DisplayPort 1.2a standard, and AMD will provide FreeSync drivers in their Catalyst driver suite at no additional cost to either the end user or the monitor manufacturers.

I expect that eventually Intel will follow AMD and produce drivers for their integrated graphics to use the DisplayPort 1.2a Adaptive-Sync standard in the same way as FreeSync, but Intel will name it something else.

So in the long run, I expect adaptive-sync-based reduction of stuttering and tearing will work with both Intel and AMD cards, but not on Nvidia cards, as Nvidia will be pushing their more expensive, licensed solution.

August 15, 2014 | 05:20 AM - Posted by Anonymous (not verified)

"This is why g-sync adds some $200 to cost of monitor, hardware + nvida license."

No, that's because the current implementation uses an FPGA and a big bank of RAM (for bandwidth, rather than capacity) rather than an ASIC with an optimised interface. Fast FPGAs are expensive, dedicated ASICs are cheap.

August 15, 2014 | 05:31 AM - Posted by Mac (not verified)

No surprises here: in adopting AMD's proposed additions to the DisplayPort spec, they renamed FreeSync to Adaptive-Sync. Naturally, anyone who wants to use it will need drivers/software and hardware compatible with Adaptive-Sync to offer FreeSync-like variable refresh.

August 14, 2014 | 04:07 PM - Posted by H1tman_Actua1

https://www.youtube.com/watch?v=ANPX0DQxalA

FAIL AMD

As with most things AMD, the hype around their version of variable refresh will fall short of G-Sync's superior gaming experience.

August 14, 2014 | 04:50 PM - Posted by Anonymous (not verified)

I guess your brain just FAILED!
Bring up some real arguments or shut up.
Freakin Fanboys...

August 15, 2014 | 01:31 AM - Posted by Lithium (not verified)

PC Perspective:

``Oh, and of course, you have this as the first true G-Sync capable monitor on the market, implementing NVIDIA’s custom variable refresh rate technology to offer up the best gaming experience you are going to find anywhere; it’s really not even close. Once you have seen and used G-Sync for some gaming, it’s going to be basically impossible to go back.``

Overclockers Club:

``By spending several hours in front of the ASUS ROG Swift PG278Q, what I did discover was that I found myself playing through the games much longer than I had before just because of how smooth the gameplay was. In addition, I found that, while playable with v-sync off, G-Sync fixed the screen tearing and lag issues you get when v-sync is enabled. After moving back to my standard gear, which is a 30-inch 60Hz IPS monitor, I was instantly yearning to go back to the G-Sync enabled ASUS ROG Swift PG278Q. The difference is truly mind bending and made the gaming experience all that more enjoyable.``

August 15, 2014 | 03:27 AM - Posted by Anonymous (not verified)

ASUS

"It can't undo the fundamental effect of very low frame rates and it doesn't do anything below 30FPS, but it does make it far more tolerable and the transition from high to low smooth. IMO it's not so suitable for very fast action games where you're better off using the ULMB option with normal or extreme pixel response setting instead."

August 15, 2014 | 04:35 AM - Posted by Anonymous (not verified)

Why would you want to game under 30fps? Change your settings.

August 15, 2014 | 12:34 PM - Posted by Scott Michaud

I actually played through Halo PC at about 14-19 FPS with everything on low before I updated my GPU (GeForce 2 MX -> GeForce FX 5700 Ultra). It was surprisingly... okay.

August 15, 2014 | 01:31 PM - Posted by Lithium (not verified)

HotHardware:

``Disabling V-Sync may eliminate lag, but tearing is evident. And enabling V-Sync may eliminate the tearing, but the lag can be annoying. With G-Sync, though, the on-screen images don't suffer from visual artifacts and the tearing is gone too.

We wish there was an easy way to visually convey how G-Sync affects on-screen animation, but there isn’t. We don’t have a means to capture DisplayPort feeds and shooting video of the screen and hosting it on-line doesn’t capture the full effect either. In lieu of an easy visual method to show how effective G-SYNC is, you'll just have to take our word for it. G-Sync is great.``

August 15, 2014 | 05:06 AM - Posted by Anonymous (not verified)

Good. Good. WONDERFUL.

Competition is king!

Work out all the bullsh1t and make this a standard, and work on improvements while we still sit on LCDs.

So when OLEDs come, we can have 0-250 Hz smooth transitions, with the monitor holding an image if the FPS drops below 1.

August 15, 2014 | 06:56 AM - Posted by Anonymous (not verified)

I don't know how this shakes out, but I do know who's going to be making profits from new monitor technology, and it's not AMD. Nvidia's making the profits from G-Sync, and monitor manufacturers are making the profits from FreeSync. My guess is that both flavors of monitor will be similarly priced. If you think that monitor manufacturers are going to give away new functionality for free, you're crazy.

Good work AMD, introducing new tech and not making any money from it. Hell of a way to run a business.

August 15, 2014 | 09:32 AM - Posted by siriq111 (not verified)

It is quite unfortunate that AMD will not see a penny from this. They have to make money. On the other hand, what worries me: all AMD Radeon™ graphics cards in the AMD Radeon™ HD 7000, HD 8000, R7 or R9 series will support Project FreeSync for video playback and power-saving purposes. The AMD Radeon™ R9 295X2, R9 290X, R9 290, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming.
Basically, what really counts: you need a 260(X) or 290(X) if you would like to use dynamic refresh rates during gaming. The rest of the cards are out of the game. APUs are fine with both features. That sounds like a bit of a lack of support.

August 15, 2014 | 04:19 PM - Posted by arbiter

Yeah, people that have, say, a 280(X) or a 7970 kinda got screwed in that deal. As they talked about on the podcast, Tonga, which would be a 285(X), kinda puts people in a spot where they have to buy that card, which should support it in gaming; buying a new GPU that performs about the same as what they had before. It does sound like AMD did this as a reaction to Nvidia's idea. Since most of Nvidia's cards support it, it looks like they have been working on it for a while; AMD just kinda ported it over from the laptop power-saving idea to gaming.

August 15, 2014 | 03:24 PM - Posted by Anonymous (not verified)

Gimme a 1440p 21:9 FreeSync IPS screen.

And I'll power it with JUST a Kaveri APU, which I'll VESA-mount to the monitor's back...

and I'll SMOOTHLY play AND ACTUALLY ENJOY all the Source games and pretty much everything except some of the top 10 most demanding games.

That's some pretty cheap and HIGHLY enjoyable gaming on the horizon... if you don't NEED eye candy in the latest AAA single-player games.

August 15, 2014 | 04:24 PM - Posted by arbiter

The point of most AAA games is how good they look; playing on a GPU that could probably only push 30-40 FPS at low settings at 1440p is only acceptable to a point. I know a lot of people agree with me on that.

August 15, 2014 | 09:06 PM - Posted by Anonymous (not verified)

If the point of most AAA games is how they look, why play them on a TN panel in the first place?

August 16, 2014 | 01:41 AM - Posted by arbiter

'Cause IPS is piss slow?

August 16, 2014 | 03:24 PM - Posted by Anonymous (not verified)

I have a TN panel with a 2 ms response time. A good IPS panel is 5 ms, which is average for a TN; not slow at all. Maybe you've reviewed the wrong IPS panels?

August 18, 2014 | 03:57 AM - Posted by Anonymous (not verified)

A whatever-Sync 21:9 IPS is going to beat the piss out of your face and all your eye-burning TN trash.
