Thursday, March 19th 2015

AMD Announces FreeSync, Promises Fluid Displays More Affordable than G-SYNC

AMD today officially announced FreeSync, an open-standard technology that makes video and games look more fluid on PC monitors with fluctuating frame-rates. A logical next step from V-Sync, and analogous in function to NVIDIA's proprietary G-SYNC technology, FreeSync is a dynamic display refresh-rate technology that lets monitors sync their refresh-rate to the frame-rate the GPU is able to put out, resulting in fluid display output.

FreeSync is an evolution of V-Sync, a feature that syncs the frame-rate of the GPU to the display's refresh-rate to prevent "frame tearing" when the frame-rate is higher than the refresh-rate; V-Sync, however, is known to cause input-lag and stutter when the GPU is not able to keep up with the refresh-rate. FreeSync works on both ends of the cable, keeping refresh-rate and frame-rate in sync, to fight both frame-tearing and input-lag.
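The difference between a fixed-refresh panel with V-Sync and an adaptive-refresh panel can be sketched with a toy timing model. This is a minimal illustration with hypothetical numbers (a 60 Hz fixed panel, a 40-144 Hz adaptive panel), not AMD's or VESA's actual logic:

```python
import math

REFRESH_HZ = 60
TICK_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms between fixed refresh ticks

def vsync_display_interval(frame_ms):
    """With V-Sync on a fixed-refresh panel, a finished frame waits for
    the next refresh tick, so display intervals quantize to multiples of
    the tick -- the visible stutter when the GPU misses a tick."""
    ticks = math.ceil(frame_ms / TICK_MS)
    return ticks * TICK_MS

def adaptive_display_interval(frame_ms, lo_ms=1000.0 / 144, hi_ms=1000.0 / 40):
    """With adaptive sync, the panel refreshes when the frame is ready,
    clamped to the panel's supported range (here 40-144 Hz)."""
    return min(max(frame_ms, lo_ms), hi_ms)

# A 20 ms frame (50 fps) gets held to ~33.3 ms on the fixed panel,
# but is shown after exactly 20 ms on the adaptive one.
for frame_ms in (10.0, 16.0, 20.0, 30.0):
    print(frame_ms,
          round(vsync_display_interval(frame_ms), 1),
          round(adaptive_display_interval(frame_ms), 1))
```

The quantization in `vsync_display_interval` is what makes a frame-rate just below the refresh-rate feel like a sudden halving of smoothness, and it is the gap that adaptive refresh closes.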
What makes FreeSync different from NVIDIA G-SYNC is that it is AMD's implementation of a VESA-standard feature, part of the DisplayPort feature-set and advanced by the DisplayPort 1.2a standard, currently supported only on AMD Radeon GPUs and Intel's upcoming "Broadwell" integrated graphics. Unlike G-SYNC, FreeSync does not require any proprietary hardware, and comes with no licensing fees. When monitor manufacturers support DP 1.2a, they don't have to pay a dime to AMD. There's no special hardware involved in supporting FreeSync, either; just support for the open-standard, royalty-free DP 1.2a.
AMD announced that no fewer than 12 monitors from major display manufacturers have been announced, or will be shortly, with support for FreeSync. A typical 27-inch display with a TN-film panel, a 40-144 Hz refresh-rate range, and WQHD (2560 x 1440 pixels) resolution, such as the Acer XG270HU, should cost US $499. Ultra-wide 2K (2560 x 1080 pixels) 34-inch and 29-inch monitors, such as the LG xUM67 series, start at $599; these displays offer refresh-rates of up to 75 Hz. Samsung is leading the 4K Ultra HD pack for FreeSync, with the UE590 series 24-inch and 28-inch, and UE850 series 24-inch, 28-inch, and 32-inch Ultra HD (3840 x 2160 pixels) monitors, offering refresh-rates of up to 60 Hz. ViewSonic is offering a full-HD (1920 x 1080 pixels) 27-incher, the VX2701mh, with refresh-rates of up to 144 Hz. On the GPU end, FreeSync is currently supported on the Radeon R9 290 series (R9 290, R9 290X, R9 295X2), R9 285, R7 260X, R7 260, and AMD "Kaveri" APUs. Intel's Core M processors should, in theory, support FreeSync, as their integrated graphics supports DisplayPort 1.2a.
On the performance side of things, AMD claims that FreeSync carries a smaller performance penalty than NVIDIA G-SYNC, and offers more consistent performance. The company put out a few of its own benchmarks to back that claim.
For AMD GPUs, the company will add support for FreeSync with the upcoming Catalyst 15.3 drivers.

96 Comments on AMD Announces FreeSync, Promises Fluid Displays More Affordable than G-SYNC

#1
Cybrnook2002
!!!Thanks btarunr!!!

Why does this have to be labeled as "More affordable than G-Sync"? Even if it's true, all it is going to do is be click bait for fanboy arguments on why AMD blows and why FreeSync should come in the form of a firmware update..... We know FreeSync is coming out (a BenQ monitor is already out), why poke that horse again? I can already hear it: "FreeSync is not free, long live green"

Anyways, super excited to try it out. Once some of the Asus monitors drop, I am going after IPS + 120hz + adaptive sync.
Posted on Reply
#2
Ferrum Master
Cybrnook2002
!!!Thanks btarunr!!!

Why does this have to be labeled as "More affordable than G-Sync"? Even if it's true, all it is going to do is be click bait for fanboy arguments on why AMD blows and why FreeSync should come in the form of a firmware update..... We know FreeSync is coming out (a BenQ monitor is already out), why poke that horse again? I can already hear it: "FreeSync is not free, long live green"

Anyways, super excited to try it out. Once some of the Asus monitors drop, I am going after IPS + 120hz + adaptive sync.
Because it is... manufacturers don't need to invest in an additional module, design in additional board space, or provide additional current to that board... It just works as a native VESA mode.
Posted on Reply
#3
MakeDeluxe
"No proprietary hardware"
Gosh I so wish that means it will work on nVidia (and Intel) GPUs but I know I'll be wrong.

Also, dang those LG ultrawides look enticing
Posted on Reply
#4
dj-electric
The minimum required 40 FPS on some monitors does not make me happy at all.
Currently, to always stay above 40 FPS at 1440p, you've got to have enough GPU horsepower to use decent settings in many new games. Otherwise it won't be worth much.

Hoped to see it in the low 30s. Anyway, with a new top-end gen of AMD cards, this hopefully won't be the case if performance is as leaked.
Posted on Reply
#5
esrever
Just gotta wait till intel supports their own version of adaptive sync. I doubt it would be hard for them to develop their own solution based on DP1.3.
Posted on Reply
#6
the54thvoid
Cybrnook2002
!!!Thanks btarunr!!!

Why does this have to be labeled as "More affordable than G-Sync"? Even if it's true, all it is going to do is be click bait for fanboy arguments on why AMD blows and why FreeSync should come in the form of a firmware update..... We know FreeSync is coming out (a BenQ monitor is already out), why poke that horse again? I can already hear it: "FreeSync is not free, long live green"

Anyways, super excited to try it out. Once some of the Asus monitors drop, I am going after IPS + 120hz + adaptive sync.
Wait....

You have crossfired 295X2s and (if your specs are up to date) a 1080p 144 Hz capable display? Are you insane? FreeSync is for when fps drops below your refresh rate. What at 1080p, with the power of four 290Xs, drops below 144 fps? I cap BF4 at 90 fps on SLI 780 Tis at 1440p; you should be storming past 144 fps at 1080p.

If I was you, I'd wait for a good 4K Free-sync monitor at 60Hz. You'd get ultra smooth motion with your graphics cards and freesync then.
Posted on Reply
#7
Ferrum Master
esrever
Just gotta wait till intel supports their own version of adaptive sync. I doubt it would be hard for them to develop their own solution based on DP1.3.
Yeah, it would be nice to see pulling 60FPS max in Alien:Isolation at 1440p from an Intel GPU :laugh:
Posted on Reply
#8
Cybrnook2002
Ferrum Master
Because it is... manufacturer doesn't need to invest in additional module, designing additional space and providing additional current to that board... It just works as native VESA mode.
I got that :-) I was more commenting on the bashing between nvidia and AMD that always happens. Was just saying that with the title, we are sure to draw an argumentative crowd.

But yes, it's free from the DP 1.2a spec, as long as the hardware displaying to it is supported.
Posted on Reply
#9
btarunr
Editor & Senior Moderator
MakeDeluxe
"No proprietary hardware"
Gosh I so wish that means it will work on nVidia (and Intel) GPUs but I know I'll be wrong.

Also, dang those LG ultrawides look enticing
As I mentioned in the article, all you need is DP 1.2a (and above). NVIDIA chose HDMI 2.0 on its Maxwell GPU, but held back on advancing DP. They didn't want to end up killing G-Sync.

Intel Broadwell IGP supports DP 1.2a.
Posted on Reply
#10
Cybrnook2002
the54thvoid
Wait....

You have crossfired 295X2s and (if your specs are up to date) a 1080p 144 Hz capable display? Are you insane? FreeSync is for when fps drops below your refresh rate. What at 1080p, with the power of four 290Xs, drops below 144 fps? I cap BF4 at 90 fps on SLI 780 Tis at 1440p; you should be storming past 144 fps at 1080p.

If I was you, I'd wait for a good 4K Free-sync monitor at 60Hz. You'd get ultra smooth motion with your graphics cards and freesync then.
Oh I am. I'm actually running three displays at the moment (1080p). I have my eyes on the Asus MG279Q when it comes out; want to run three of those. 1440p IPS 120 Hz+ with adaptive sync (DP 1.2a) - or, like you said, break down and get a super-wide 4K. The only thing about those is I want a higher-than-60 Hz refresh. That's the only killer for me.... and since I am a compulsive early adopter, I will likely be on the 1440p train.
And yes, the cards are on cruise control with 1080p, doesn't stop me anyways :)
Posted on Reply
#11
HossHuge
Still out of my price range.
Posted on Reply
#12
Cybrnook2002
@btarunr,

Am I reading that right, AMD is supporting 9 - 240 hz with freesync? Not the speculated bottom limit of 30/40?
Posted on Reply
#13
Patriot
Cybrnook2002
@btarunr,

Am I reading that right, AMD is supporting 9 - 240 hz with freesync? Not the speculated bottom limit of 30/40?
The bottom limit is up to the individual panel, it seems.

Most seem to bottom out at 40, though. Getting the backlight not to flicker at 9 Hz might be tricky.
MakeDeluxe
"No proprietary hardware"
Gosh I so wish that means it will work on nVidia (and Intel) GPUs but I know I'll be wrong.

Also, dang those LG ultrawides look enticing
I actually wouldn't be surprised to see Intel adopting this with a driver update.
It is an open standard they can use it if they want to.
Nvidia won't because G-sync would die.
Posted on Reply
#14
Hitman_Actual
Sub-par performance compared to G-Sync. Smaller effective VRR margin than G-Sync, and showing ghosting on 3 of the reviewed FreeSync monitors.
Posted on Reply
#15
Patriot
Hitman_Actual
Sub-par performance compared to G-Sync. Smaller effective VRR margin than G-Sync, and showing ghosting on 3 of the reviewed FreeSync monitors.
Larger range, panels used are the limiting factor.
Posted on Reply
#16
arbiter
Cybrnook2002
@btarunr,

Am I reading that right, AMD is supporting 9 - 240 hz with freesync? Not the speculated bottom limit of 30/40?
What the real limit of the G-Sync module is, I don't think is known. The BIOS on the module is 30-144 Hz, but for all anyone knows NVIDIA could release a BIOS update for monitors that support a larger range, or at least update the BIOS on monitors that support the larger area. ATM there is no point, though, when the displays can't do it and the DP connection can't either. It just ends up being paper specs: the standard can do it, but there is nothing that can actually support it. It's more future-proofing; a standard is hard to change, whereas NVIDIA can change their module whenever the displays allow it.
btarunr
On the performance side of things, AMD claims that FreeSync has lesser performance penalty compared to NVIDIA G-SYNC, and has more performance consistency. The company put out a few of its own benchmarks to make that claim.


Problem with that statement is "AMD claims." The only realistic way that penalty could happen is if the GPU had to wait for the display's update window, which anyone can see would be the same for FreeSync as well. I watched most of that video this morning; just wow, that guy is supposed to be an expert but he said some of the dumbest crap I have ever heard. In that image he says you can turn V-Sync off with FreeSync but you can't turn it off with G-Sync, due to its effect always being on. If you look at that slide, turning off V-Sync fundamentally disables what FreeSync and G-Sync are there to get rid of: SCREEN TEARING. He said G-Sync has input latency because of that V-Sync, but independent testers have already shown that G-Sync doesn't have any unless you go above the max refresh rate of the monitor.
Cybrnook2002
Why does this have to be labeled as "More affordable than G-Sync" , even if it's true all it is going to do is be click bait for fan boy arguments on why AMD blows and Freesync should come in the form of a firmware update.....
If you don't own one of the supported AMD GPUs, calling it more affordable is a complete lie; it ends up costing you just as much or more.
Posted on Reply
#17
Petey Plane
Cool, now if only AMD could get their power levels under control and their drivers more consistent.
Posted on Reply
#18
Xzibit
arbiter
Problem with that statement is "AMD claims." The only realistic way that penalty could happen is if the GPU had to wait for the display's update window, which anyone can see would be the same for FreeSync as well. I watched most of that video this morning; just wow, that guy is supposed to be an expert but he said some of the dumbest crap I have ever heard. In that image he says you can turn V-Sync off with FreeSync but you can't turn it off with G-Sync, due to its effect always being on. If you look at that slide, turning off V-Sync fundamentally disables what FreeSync and G-Sync are there to get rid of: SCREEN TEARING. He said G-Sync has input latency because of that V-Sync, but independent testers have already shown that G-Sync doesn't have any unless you go above the max refresh rate of the monitor.
You mean like this?



Posted on Reply
#19
qubit
Overclocked quantum bit
I absolutely love open standards. Looks like proprietary G-Sync is gonna go away and if NVIDIA prevent their cards from supporting FreeSync it will really hurt their sales.

G-Sync could potentially live as a premium option if it has a compelling performance improvement over FreeSync, but does it?
Posted on Reply
#20
Xzibit
qubit
I absolutely love open standards. Looks like proprietary G-Sync is gonna go away and if NVIDIA prevent their cards from supporting FreeSync it will really hurt their sales.

G-Sync could potentially live as a premium option if it has a compelling performance improvement over FreeSync, but does it?
So far it looks like the difference shows when it dips below 30 fps:

G-Sync = flickers
FreeSync = tears/lags (V-Sync off / V-Sync on), depending on which option you choose

Above the monitor's refresh, it looks like FreeSync's option to have V-Sync on or off is a better trade-off than G-Sync's always-on behavior, which adds lag.
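The flicker/tear/lag trade-offs being debated here can be sketched as a small decision function. This is only an illustration of the behaviors reported in this thread (early-2015 hardware); the panel limits and labels are hypothetical, not a spec:

```python
# Illustrative VRR panel range (many early FreeSync panels reportedly
# bottomed out around 40 Hz); not taken from any specification.
PANEL_MIN_HZ, PANEL_MAX_HZ = 40, 144

def display_behavior(fps, tech, vsync_on=True):
    """Return roughly what the viewer sees at a given frame rate,
    per the behaviors described in this comment thread."""
    if PANEL_MIN_HZ <= fps <= PANEL_MAX_HZ:
        return "variable refresh: smooth"
    if fps < PANEL_MIN_HZ:
        if tech == "gsync":
            # The module stores the last frame and redraws it; the
            # repeated redraws are what the flicker reports blame.
            return "frame redrawn from module memory (flicker reports)"
        return "v-sync stutter/lag" if vsync_on else "tearing"
    # fps above the panel maximum
    if tech == "gsync":
        return "capped at max refresh (adds lag)"
    return "capped at max refresh" if vsync_on else "tearing, minimal lag"
```

For example, `display_behavior(20, "freesync", vsync_on=False)` lands in the tearing branch, while the same frame rate on G-Sync falls back to redrawing the stored frame.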
Posted on Reply
#21
arbiter
qubit
I absolutely love open standards. Looks like proprietary G-Sync is gonna go away and if NVIDIA prevent their cards from supporting FreeSync it will really hurt their sales.

G-Sync could potentially live as a premium option if it has a compelling performance improvement over FreeSync, but does it?
Um, might want to go look at market-share numbers. When the numbers are like 75-25 in favor of NVIDIA, switching camps will cost a bit, so FreeSync is more likely to be the one that goes away first.
Xzibit
So far it looks like the difference shows when it dips below 30 fps:

G-Sync = flickers
FreeSync = tears/lags (V-Sync off / V-Sync on), depending on which option you choose
The only time I heard that NVIDIA flickered was when it was on a laptop using an unreleased driver on a non-G-Sync display. G-Sync has memory in it, so when fps drops that low it still has image data to update the screen with.
Xzibit
You mean like this ?




That is AMD-provided data, so how reliable are those numbers? I doubt the AMD card would gain performance in every game; that is what really makes those numbers questionable.

All those percentages are what you could call margin of error, as not all playthroughs are the same and small things can affect performance numbers. AMD could have nitpicked the ones that looked best for them.
Posted on Reply
#22
Xzibit
arbiter
The only time I heard that NVIDIA flickered was when it was on a laptop using an unreleased driver on a non-G-Sync display. G-Sync has memory in it, so when fps drops that low it still has image data to update the screen with.
Which in turn adds lag at the higher end. I think the Blur Busters article proved that. NVIDIA's Tom Petersen pointed out it adds a backlog if it's consistently redrawing.

Flickering has been in G-Sync since its inception.

PC Perspective - A Look into Reported G-Sync Display Flickering

You can find a lot more by googling or looking through forums.
Posted on Reply
#23
metalslaw
Atm, FreeSync is an 'ATI and Intel' GPU technology, with G-Sync being an NVIDIA GPU technology.

Without either camp budging, this may end up turning into a monitor battle as well, with monitors coming with either FreeSync _or_ G-Sync only (thus locking in the consumer's choice of graphics manufacturer).

The only real hope for consumers, is if monitor makers start shipping monitors with both freesync and g-sync enabled in the same monitor.
Posted on Reply
#24
qubit
Overclocked quantum bit
arbiter
Um, might want to go look at market-share numbers. When the numbers are like 75-25 in favor of NVIDIA, switching camps will cost a bit, so FreeSync is more likely to be the one that goes away first.
No, a marketshare lead like that won't make FreeSync go away any time soon. If the share stayed the same, then there are still plenty of AMD and Intel customers who will buy it and keep it alive.

Now, since this is a killer feature and cheaper to get than G-Sync, then it's likely to take sales away from NVIDIA. Of course, NVIDIA will counter in some way and it will be interesting to see how they do this.

Oh and btw that shiny, new and seriously overpriced Titan X is looking even more lacklustre now for not supporting it. :laugh:
metalslaw
Atm, FreeSync is an 'ATI and Intel' GPU technology, with G-Sync being an NVIDIA GPU technology.

Without either camp budging, this may end up turning into a monitor battle as well, with monitors coming with either FreeSync _or_ G-Sync only (thus locking in the consumer's choice of graphics manufacturer).

The only real hope for consumers, is if monitor makers start shipping monitors with both freesync and g-sync enabled in the same monitor.
Unfortunately, NVIDIA is sure to have a clause in their contracts which prevents both standards being implemented. Hopefully it will be seen as anti-competitive by the competition commission, or whatever they're called today, and be unenforceable by NVIDIA.
Posted on Reply
#25
arbiter
qubit
No, a marketshare lead like that won't make FreeSync go away any time soon. If the share stayed the same, then there are still plenty of AMD and Intel customers who will buy it and keep it alive.

Now, since this is a killer feature and cheaper to get than G-Sync, then it's likely to take sales away from NVIDIA. Of course, NVIDIA will counter in some way and it will be interesting to see how they do this.

Oh and btw that shiny, new and seriously overpriced Titan X is looking even more lacklustre now for not supporting it. :laugh:
On the one side, the AMD guy said their monitor was $499 and the G-Sync version was $599: $100 over the FreeSync one. When the G-Sync module was released it was $200, so either the G-Sync module got cheaper or FreeSync carries a premium of its own. Either way, it will only be a matter of time before it gets even cheaper.


Had a bit of a thought: since AMD has to certify all monitors to be FreeSync, they have pretty much locked FreeSync to AMD-only GPUs. The G-Sync module could possibly be updated via firmware to support it, but the FreeSync software, as it stands, is AMD proprietary. AMD has in a sense done the same thing everyone rips on NVIDIA for; they just did it under everyone's noses.
Posted on Reply