
AMD Announces FreeSync, Promises Fluid Displays More Affordable than G-SYNC

So after all of that, seeing that last image... we still are not out of the woods...

Either you get tearing or you get input lag. I'm sure it's all mitigated a little, but damn it, why can't we just have neither already?
 
So after all of that, seeing that last image... we still are not out of the woods...

Either you get tearing or you get input lag. I'm sure it's all mitigated a little, but damn it, why can't we just have neither already?
Only when you're outside of the boundaries of adaptive sync. This is a good thing: you get to choose whether you want v-sync on or off when you dip below. And honestly, hopefully you're not gaming at less than 30 fps.
 
A minimum of 40 FPS required on some monitors does not make me happy at all.
Currently, to stay above 40 FPS at 1440p you have to have enough GPU horsepower to use decent settings in many new games. Otherwise it won't be worth much.

I hoped to see it in the low 30s. Anyway, with a new top-end generation of AMD cards this shouldn't (hopefully) be the case if performance is as leaked.
A 770/280X or better is all you need.
 
Yeah, the marketing wording makes it a touch confusing... A FreeSync-certified panel is just a panel that has Adaptive-Sync. FreeSync is the driver being aware of Adaptive-Sync and handling the delivery of frames to take advantage of it. So FreeSync and Adaptive-Sync are two sides of the same coin. Anyone can make an Adaptive-Sync-aware driver and use a FreeSync-certified monitor... There are no whitelists or blacklists, only a free and open standard that AMD pushed into VESA.

But if you want to get technical and call FreeSync the driver piece... and call it proprietary, then you are probably correct in that AMD will not do your work for you and make you a driver... but if you are on the green team you probably think that is a good thing... so you can stop your bitching. :P

The troll unsubbed ... Darn. :)
 
A 40-48 Hz minimum refresh range makes it seem like that expensive NVIDIA module actually does something, seeing as they can do a 30 Hz minimum. It's hard to compare cost apples to apples when performance isn't apples to apples. Still waiting for numbers not produced by AMD, but a 40 Hz minimum is pretty disappointing; I'm still waiting for a monitor that will do a 20 Hz minimum.
 
A 40-48 Hz minimum refresh range makes it seem like that expensive NVIDIA module actually does something, seeing as they can do a 30 Hz minimum. It's hard to compare cost apples to apples when performance isn't apples to apples. Still waiting for numbers not produced by AMD, but a 40 Hz minimum is pretty disappointing; I'm still waiting for a monitor that will do a 20 Hz minimum.

FreeSync can go down to 9 Hz... It is the panels that are lacking, not Adaptive-Sync or FreeSync...

As it is an open standard... it is up to the individual manufacturer how they implement it in the panel.
 
FreeSync can go down to 9 Hz... It is the panels that are lacking, not Adaptive-Sync or FreeSync...

As it is an open standard... it is up to the individual manufacturer how they implement it in the panel.
The comment was just a snide remark about people clamoring for things being cheaper yet equal. There is always a price to pay; until things actually hit consumers it's all just hypotheticals and pointless to me. I'm not investing in either technology until I see monitors hitting a 20 Hz minimum or better. With AMD that could be whenever, I suppose, given it is up to the manufacturers to invest. With NVIDIA I suppose it's when they release an update to their module, probably with a cheaper ASIC instead of repurposed chips. So whoever does that first gets my money.

http://www.guru3d.com/articles-pages/amd-freesync-review-with-the-acer-xb270hu-monitor,3.html
Q: What is the supported range of refresh rates with FreeSync and DisplayPort Adaptive-Sync?
A: AMD Radeon graphics cards will support a wide variety of dynamic refresh ranges with Project FreeSync. Using DisplayPort Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.
Seems like I'll be waiting for 21-144 Hz.
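
Just to put those Hz numbers into frame-time terms, here's a quick back-of-the-napkin sketch (the ranges are straight from the FAQ above; the arithmetic and the little script are mine):

Code:
# A refresh range in Hz is just a frame-time window in milliseconds.
ranges_hz = {"36-240": (36, 240), "21-144": (21, 144), "17-120": (17, 120), "9-60": (9, 60)}

for name, (lo, hi) in ranges_hz.items():
    # the max refresh sets the shortest frame time, the min refresh sets the longest
    print(f"{name} Hz  ->  {1000 / hi:.1f} ms to {1000 / lo:.1f} ms per frame")

# e.g. the 40 Hz floor people are complaining about means any frame slower than
# 1000 / 40 = 25 ms falls outside the window and gets the v-sync-on/off fallback.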
 
The comment was just a snide remark about people clamoring for things being cheaper yet equal. There is always a price to pay; until things actually hit consumers it's all just hypotheticals and pointless to me. I'm not investing in either technology until I see monitors hitting a 20 Hz minimum or better. With AMD that could be whenever, I suppose, given it is up to the manufacturers to invest. With NVIDIA I suppose it's when they release an update to their module, probably with a cheaper ASIC instead of repurposed chips. So whoever does that first gets my money.

There are pros and cons of a walled garden... AMD's open solution is more flexible; NVIDIA's is more consistent. FreeSync also allows you to choose how it acts when you're outside the panel's adaptive sync range: you can have v-sync kick in or not. G-Sync doesn't have that option. Frankly, if you are much below 40 fps you are not going to have enjoyable gameplay... and around 30 you start getting panel flicker.

While the spec may allow for as low as 9 Hz... it's going to take some magic on the panel side to make it work.
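
For anyone wondering what that choice actually means in practice, this is roughly the decision as I understand it, in Python pseudo-code (the 40-144 window and the function are mine, not anything out of AMD's driver):

Code:
# Loose sketch of the out-of-range fallback; "fallback_vsync" is the user's setting.
def present(fps, range_min=40, range_max=144, fallback_vsync=True):
    if range_min <= fps <= range_max:
        return "VRR: panel refreshes as each frame arrives"
    if fps > range_max:
        return "v-sync at max refresh" if fallback_vsync else "uncapped, tearing possible"
    # below the panel's floor
    return "v-sync on: judder/input lag" if fallback_vsync else "v-sync off: tearing"

for f in (25, 60, 200):
    print(f, "->", present(f))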
 
There are pros and cons of a walled garden... AMD's open solution is more flexible; NVIDIA's is more consistent. FreeSync also allows you to choose how it acts when you're outside the panel's adaptive sync range: you can have v-sync kick in or not. G-Sync doesn't have that option.
Not really a feature I care about; I'm buying such a monitor to eliminate tearing. Neither solution can deal with frame rates above the monitor's refresh rate, and I'm pretty sure they never will. So I'll just cap FPS; it seems like a waste of my money if I accept screen tearing anyway.
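
The cap I mean is just the usual frame limiter, something like this (142 on a 144 Hz panel is my own pick; render_frame stands in for whatever the game actually does per frame):

Code:
import time

# Minimal frame-cap loop: never present faster than cap_fps so the panel
# stays inside its VRR window and the v-sync/tearing fallback never kicks in.
def run_capped(render_frame, cap_fps=142, frames=1000):
    target = 1.0 / cap_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        left = target - (time.perf_counter() - start)
        if left > 0:
            time.sleep(left)  # burn the leftover time so we never exceed the cap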
 
Of course it's not free. You have to buy a monitor that has FreeSync...
 
Of course it's not free. You have to buy a monitor that has FreeSync...

Yeah. I wonder, when you buy an MSI GTX 980, do you get the Military Class caps and Samsung memory for free? :laugh:
 
Only 11 compatible displays, with Samsung being the only brand with 4k offering(s).

Good, AMD, just expand the support, please, to more than this.
 
Funny, the monitors with FreeSync enabled in the UK are dreadfully overpriced considering this is supposed to be a free VESA standard; they're almost as expensive as similar G-Sync-enabled offerings. Perhaps we should call it almost-as-expensive-sync.
 
Funny, the monitors with FreeSync enabled in the UK are dreadfully overpriced considering this is supposed to be a free VESA standard; they're almost as expensive as similar G-Sync-enabled offerings. Perhaps we should call it almost-as-expensive-sync.

Just because of the UK??
 
"No proprietary hardware"
Gosh I so wish that means it will work on nVidia (and Intel) GPUs but I know I'll be wrong.

Also, dang those LG ultrawides look enticing
It works.
A leaked driver for ASUS, IIRC, enabled G-Sync on an nVidia GPU connected via eDP (a laptop).
What NVIDIA doesn't tell you is that their G-Sync is DOA and already supported on old hardware, e.g. all laptops with eDP.
 
What happens when using FreeSync and the FPS drops below the minimum supported adaptive refresh rate? Does the screen go black like with the leaked NV G-Sync laptop driver?
 
What NVIDIA doesn't tell you is that their G-Sync is DOA and already supported on old hardware, e.g. all laptops with eDP.

Coherent proof requested, otherwise the post is invalid. I've seen enough posts saying things are possible with 'x' hardware when they're not.
Also, proof of it happening plus an actual critical dissection by a neutral source is required.
If the above requirements can't be fulfilled, then it's little more than trolling.

What we can say is that the adaptive v-sync pathway looks to be better for all involved (except perhaps NVIDIA).
 
Coherent proof requested, otherwise the post is invalid. I've seen enough posts saying things are possible with 'x' hardware when they're not.
Also, proof of it happening plus an actual critical dissection by a neutral source is required.
If the above requirements can't be fulfilled, then it's little more than trolling.

What we can say is that the adaptive v-sync pathway looks to be better for all involved (except perhaps NVIDIA).

Sure, you are violating your own requirements because you are so neutral and not a troll. Not a small troll, but a big one, actually. :D

Just say that it is a lie, the same way someone else said earlier today that NVIDIA's price of $999 is a lie. We could have accused them of big trolling...
 
Not really a feature I care about; I'm buying such a monitor to eliminate tearing. Neither solution can deal with frame rates above the monitor's refresh rate, and I'm pretty sure they never will. So I'll just cap FPS; it seems like a waste of my money if I accept screen tearing anyway.
But most of the problems arise when the FPS is changing, which is why most of these monitors are ones with refresh rates above 60 Hz. It is harder to maintain those higher refresh rates than it is to hold around 60, so to me the feature's purpose is at the high end rather than the low end. Even with G-Sync, when you start dipping that low you're not having a great experience as it is, and the same will go for FreeSync; but when you are in the 75-144 range you're going to get smooth play even while the FPS is constantly changing.

Only when you're outside of the boundaries of adaptive sync. This is a good thing: you get to choose whether you want v-sync on or off when you dip below. And honestly, hopefully you're not gaming at less than 30 fps.
Yeah, gaming below a certain point is still going to produce a bad experience. Where the line gets crossed is the main question; for me, I would probably not want to go much below 50, and I have heard that down to 30 with these features is not too bad, but I would not shoot for that. I have seen some reviews already which claim it works, so I am happy, honestly, but I am still waiting to see it for myself. I would love to see a decently priced one available in the U.S. already, and I may try one, but I still also need to wait for the CFX support next month.

Either way, this tech sounds cool and seems to work, so I am interested.
 
"No proprietary hardware"
Gosh I so wish that means it will work on nVidia (and Intel) GPUs but I know I'll be wrong.

Also, dang those LG ultrawides look enticing

I guarantee Intel will support it by Skylake. Then Nvidia will be forced to support it by the end of 2016. Mark my words.
 
I guarantee Intel will support it by Skylake. Then Nvidia will be forced to support it by the end of 2016. Mark my words.

Where did you get your crystal ball?
 
It works.
A leaked driver for ASUS, IIRC, enabled G-Sync on an nVidia GPU connected via eDP (a laptop).
What NVIDIA doesn't tell you is that their G-Sync is DOA and already supported on old hardware, e.g. all laptops with eDP.

I disagree. Think about it: only GCN 1.1 cards support FreeSync, but all of NVIDIA's cards support G-Sync. Obviously there's some kind of component that does the hardware communication, which AMD decided to integrate into their newer cards while NVIDIA decided to keep it external. Both options have pros and cons.
Maybe the 980 has the hardware built in and doesn't need the external solution.
 
I disagree. Think about it: only GCN 1.1 cards support FreeSync, but all of NVIDIA's cards support G-Sync. Obviously there's some kind of component that does the hardware communication, which AMD decided to integrate into their newer cards while NVIDIA decided to keep it external. Both options have pros and cons.
Maybe the 980 has the hardware built in and doesn't need the external solution.

Indeed, there is apparently some ghosting going on with FreeSync too:


PCPer said:
The ROG Swift animates at 45 FPS without any noticeable ghosting at all. The BenQ actually has a very prominent frame ghost though the image still remains sharp and in focus. The LG 34UM67 shows multiple ghost frames and causes the blade to appear smudgy and muddled a bit.

The question now is: why is this happening and does it have anything to do with G-Sync or FreeSync? NVIDIA has stated on a few occasions that there is more that goes into a VRR monitor than simply integrated vBlank extensions and have pointed to instances like this as an example as to why. Modern monitors are often tuned to a specific refresh rate – 144 Hz, 120 Hz, 60 Hz, etc. – and the power delivery to pixels is built to reduce ghosting and image defects. But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by changing the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates.

It’s impossible now to know if that is the cause for the difference seen above. But with the ROG Swift and BenQ XL2730Z sharing the same 144 Hz TN panel specifications, there is obviously something different about the integration.

..............

FreeSync is doing the right things and is headed in the right direction, but it can’t claim to offer the same experience as G-Sync. Yet.

Source: http://www.pcper.com/reviews/Displa...hnical-Discussion/Gaming-Experience-FreeSync-
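
If I'm reading that right, the per-refresh-rate tuning NVIDIA describes boils down to an overdrive lookup keyed on the current refresh rate. A toy sketch of the idea (the numbers are entirely made up, and the real module presumably does this in the scaler/TCON, not in software):

Code:
# Toy illustration of per-refresh-rate overdrive tuning (invented values).
overdrive_by_refresh = {144: 1.00, 120: 0.95, 90: 0.85, 60: 0.70, 40: 0.55}

def overdrive_gain(current_hz):
    # pick the tuning point closest to the instantaneous refresh rate
    nearest = min(overdrive_by_refresh, key=lambda hz: abs(hz - current_hz))
    return overdrive_by_refresh[nearest]

print(overdrive_gain(47))  # -> 0.55, i.e. gentler overdrive at low VRR refresh rates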

 