
FreeSync or G-Sync or both?

  • Thread starter: Deleted member 6693
i just swapped from a new-ish 144 Hz 1 ms TN gaming panel to an IPS 4 ms gaming panel.. i did it for photo editing.. colour reproduction and viewing angles are far better on the IPS-style panel and the blacks are deeper..

but it does come down to what folks are used to.. i expect to keep the monitor i now have for quite some time.. i have not the slightest desire to move to 4K even though i do have the gpu power to do it..

trog
 
But response times on IPS and PLS are slow. And a lot of cheaper IPS/PLS monitors are not superior to the better TN panels in terms of viewing angles and colours. So unless you stay out of the cheaper segment, IPS is not going to be (much) better than TN while still having the response-time trade-off. Although I suspect you buy only $300+ monitors.
My TN right now is 5ms. Most IPS panels can match that. IPS always has better viewing angles than TN. Most TVs and smartphones have an IPS display.
 
But response times on IPS and PLS are slow. And a lot of cheaper IPS/PLS monitors are not superior to the better TN panels in terms of viewing angles and colours. So unless you stay out of the cheaper segment, IPS is not going to be (much) better than TN while still having the response-time trade-off. Although I suspect you buy only $300+ monitors.

TNs are 6-bit whereas IPS panels are 8-bit. Most TNs found on the higher-end gaming monitors are still 6-bit+FRC, with colour gamut coverage between 72-75%. You can easily find an IPS panel for $150 with 75% colour coverage. Even an eco or value IPS panel that only covers 65% will look better than a TN panel, and those monitors you can get for $120 and under.
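
For anyone who wants to see where the colour counts come from, here is a minimal sketch of the bit-depth arithmetic, assuming a plain RGB subpixel layout; the gamut-coverage percentages quoted above are a separate property and are not modelled here:

# Rough bit-depth arithmetic: a native 6-bit panel resolves 2^6 shades per
# channel, a native 8-bit panel 2^8. 6-bit+FRC panels temporally dither
# between adjacent shades to approximate the 8-bit result.
def native_colors(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(f"6-bit panel: {native_colors(6):,} colours")   # 262,144
print(f"8-bit panel: {native_colors(8):,} colours")   # 16,777,216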
 
Negative. Adaptive sync is agnostic. If you have an adaptive sync graphics processor and an adaptive sync monitor, adaptive sync will be enabled by default. The purpose of the technology is to act without user input. Again, the goal is to reduce bandwidth requirements as well as reduce idle power consumption. The branding matters not.

That's describing the panel, which inadvertently describes the minimum refresh rate the eDP will refresh at. The frame rate can be lower from the GPU--eDP will fill in the gaps to keep it at or above minimum.

Looks like you realized your mistake on the last paragraph but we'll address that soon. :)

AdaptiveSync is agnostic but again... FreeSync is not. And the point of this topic is FreeSync vs G-Sync.

From AMD: FreeSync is a unique AMD hardware/software solution that utilizes DisplayPort™ Adaptive-Sync protocols to enable user-facing benefits

More from AMD...
Q: What are the requirements to use FreeSync?
A: To take advantage of the benefits of Project FreeSync, users will require: a monitor compatible with DisplayPort™ Adaptive-Sync, a compatible AMD Radeon™ GPU with a DisplayPort™ connection, and a compatible AMD Catalyst™ graphics driver. AMD plans to release a compatible graphics driver to coincide with the introduction of the first DisplayPort™ Adaptive-Sync monitors.​

Yup, agnostic. :rolleyes:

I never said 1.3 is when it was first supported. It will be the first supported by NVIDIA, Intel, and the rest of the industry.
Actually you did say it. You also said that DisplayPort 1.3 will allow it to work natively without any other hardware. Want me to quote that too? You made a big stink about how DisplayPort 1.3 will be the end of G-Sync, but nothing in DisplayPort 1.3's spec supports that claim of yours.
All hardware with DisplayPort 1.3 ports will support adaptive sync and that includes NVIDIA.



FYI, AMD Crimson drivers added "Low Framerate Compensation" for FreeSync:

G-sync lost its technical edge with a driver update. eDP/adaptive sync is just that awesome. :laugh:


Edit: Interesting caveat there: the maximum refresh rate must be "greater than or equal to 2.5 times the minimum refresh rate." So for a given minimum, the required maximum works out to:
30 Hz -> 75 Hz
35 Hz -> 87.5 Hz
40 Hz -> 100 Hz
42 Hz -> 105 Hz
47 Hz -> 117.5 Hz
48 Hz -> 120 Hz
56 Hz -> 140 Hz

That's definitely something buyers should be aware of.

Edit: Looks like LFC should work on all 144 Hz displays.
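
For anyone checking a specific monitor against that caveat, here is a minimal sketch of the rule as quoted above; the 2.5x factor is the figure from AMD's announcement, and the example ranges are only illustrative, not a verified monitor list:

# LFC eligibility rule as quoted above: the maximum refresh rate must be
# greater than or equal to 2.5 times the minimum refresh rate.
def lfc_supported(min_hz: float, max_hz: float, factor: float = 2.5) -> bool:
    return max_hz >= factor * min_hz

# Illustrative FreeSync ranges only (not a verified list of real monitors):
for rng in [(30, 144), (40, 144), (35, 90), (48, 75)]:
    print(rng, "->", "LFC" if lfc_supported(*rng) else "no LFC")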

Did you just find out about the driver update? :) There are downsides to it being a driver update, such as needing driver updates for new monitors, some monitors being left out (and some are, btw), and it definitely not being the most elegant solution, which leaves room for bugs (some users experience overshoot, flickering and ghosting). Then you have the caveat that you mentioned. The upper-end resolutions are where this will have the most effect.

Verdict: A good start, but still room for improvement

So no, Nvidia didn't lose the technical edge ...yet.

Also here is the video in which they recommended those changes to AMD.


This debate is tired. I said what I wanted to say.
You know the jargon but your information is off.
Do your pre-emptive victory cheer. Get the last word in. I'm done. We are not getting anywhere. You never see anyone else's point of view but your own.
 
Looks like you realized your mistake on the last paragraph but we'll address that soon. :)

AdaptiveSync is agnostic but again... FreeSync is not. And the point of this topic is FreeSync vs G-Sync.

From AMD: FreeSync is a unique AMD hardware/software solution that utilizes DisplayPort™ Adaptive-Sync protocols to enable user-facing benefits

More from AMD...
Q: What are the requirements to use FreeSync?
A: To take advantage of the benefits of Project FreeSync, users will require: a monitor compatible with DisplayPort™ Adaptive-Sync, a compatible AMD Radeon™ GPU with a DisplayPort™ connection, and a compatible AMD Catalyst™ graphics driver. AMD plans to release a compatible graphics driver to coincide with the introduction of the first DisplayPort™ Adaptive-Sync monitors.​

Yup, agnostic. :rolleyes:
Intel will undeniably call it something else. My point is that the name does not matter. If you buy a monitor that is branded as "FreeSync compatible," you'll be able to plug it into an AMD graphics card and use "FreeSync" or plug it into an Intel IGP and it will use "[insert name here]." The branding doesn't matter. If your GPU supports external adaptive sync (be it DisplayPort 1.2a or newer, or whatever HDMI backbone AMD is using for that) and your monitor supports adaptive sync, it will work. The name used to market it doesn't matter. You cited AMD, so AMD naturally uses FreeSync branding. It is still agnostic--part of the DisplayPort standard.

Actually you did say it. You also said that DisplayPort 1.3 will allow it to work natively without any other hardware. Want me to quote that too? You made a big stink about how DisplayPort 1.3 will be the end of G-Sync, but nothing in DisplayPort 1.3's spec supports that claim of yours.
facepalm.jpg

Instead of arguing, I'm just going to be very blunt. External adaptive sync was/is started at...
AMD: DisplayPort 1.2a
Everyone else: DisplayPort 1.3

DisplayPort 1.3 FAQs
VESA said:
Q: Does the release of DisplayPort 1.3 mean that DisplayPort 1.2 products are obsolete?

A: Not at all. VESA develops and publishes standards like DisplayPort prior to their actual deployment in the field. DisplayPort 1.2a represents the latest interconnect technology now available to consumers from manufacturers. The new DisplayPort capabilities included in DisplayPort 1.3 have begun the cycle of hardware development that will result in such technology becoming available to consumers in a range of products over the next few years. And like other new versions of DisplayPort, DisplayPort 1.3 is backward compatible with earlier DisplayPort standards.
The reason 1.2a exists is that AMD wanted support in 2014 and didn't want to wait until 2016 for 1.3, because G-sync was already out the door. AMD...and VESA...couldn't wait.

Did you just find out about the driver update? :) There are downsides to it being a driver update, such as needing driver updates for new monitors, some monitors being left out (and some are, btw), and it definitely not being the most elegant solution, which leaves room for bugs (some users experience overshoot, flickering and ghosting). Then you have the caveat that you mentioned. The upper-end resolutions are where this will have the most effect.
The "driver update" is AMD only. It requires no changes in the monitor. Note the last bullet on the slide: "No user configuration or proprietary monitor hardware required." If you're wondering why that is, it is because of the eDP chip inside the monitor. The LFC bug was purely in software because AMD didn't code in a solution to the "FPS < Min Refresh" problem. That's been fixed.

Despite those couple of complaints, my first impression of LFC with the Crimson driver is one of satisfied acceptance. It's a great start, there is room to improve, but AMD is listening and fixing technologies to better compete with NVIDIA as we head into 2016.
AMD will fix the bugs in time and do so through software updates. There's nothing wrong with the hardware in the card nor the monitor. Their algorithms are still just a little rough around the edges. Such is the nature of working with a complex industry standard.

Intel and NVIDIA are likely to go through the same iterations to apply and improve external adaptive sync support.
 
FreeSync monitors come "for free" (and that's a problem for AMD, in some way) since most scaler chips out there support the tech out of the box.
AMD's way is FREE FOR EVERYONE to use, being an industry standard. (please spare me from the "adaptive sync is not free sync" argument, in this context it's the same thing).

Tests show FreeSync to be slightly superior to G-Sync (former increases FPS a bit, latter decreases).

nVidia's stuff comes at a price (more chips to integrate into the monitor), AND IT LIMITS THE MONITOR TO HAVING ONLY ONE PORT: a G-Sync enabled DISPLAYPORT. (it will still work as a normal DisplayPort though).
FreeSync has no such restriction. (AMD even promised FreeSync over HDMI!)

Since nVidia officially announced it will NOT license the G stuff to anyone (and frankly there being no good reason to endorse G over F anyhow), Intel or AMD jumping on the G-Sync wagon is extremely unlikely. (nVi exclusiveness is its only point)

All that is needed for nVidia to jump on the FreeSync wagon is for AMD to regain GPU market share (which is shamefully low given how many great products they have on the market at the moment, but that's a different discussion).
 
Let's all keep our mouths shut until the arrival of Pascal and Polaris :toast:
 
Let's all keep our mouths shut until the arrival of Pascal and Polaris :toast:

What? Without speculation there wouldn't be much posting at all in tech forums. I'm speculating that of course.
 
Tests show FreeSync to be slightly superior to G-Sync (former increases FPS a bit, latter decreases).
Which tests? Everything I've read gives gsync a slight advantage at the moment.
 
90% sure Polaris will support DisplayPort 1.3 and 60% on Pascal. The future of GPUs is higher resolutions and refresh rates. The best path to those is DP 1.3.

I think the reason Fiji doesn't have HDMI 2.0 is that AMD wanted HDMI 2.0a for external adaptive sync. AMD had no intent of ever using HDMI 2.0. That inevitably means HDMI 2.1 is likely to make external adaptive sync a standard too, but it will come some time after Polaris (or be announced at about the same time).
 
What? Without speculation there wouldn't be much posting at all in tech forums. I'm speculating that of course.
Speculation, huh? I thought my rather provocative* and puny attempt at being funny would lead to even more posts. Trolls attract attention, right? :fear:
*self-irony
90% sure Polaris will support DisplayPort 1.3 and 60% on Pascal.
Found this:
This release time frame will coincide with AMD’s and NVIDIA’s launch of next generation of GPUs which will be equipped with DisplayPort 1.3.
Source: http://120hzmonitors.com/samsung-144hz-3440x1440-ultrawide-monitors/
 
Haha, can one be DP 1.3 compliant without the FreeSync (Adaptive Sync) thing? :)

Everything I've read gives gsync a slight advantage at the moment
Everything you read... where? And what "slight advantage"?
Tests show GS reducing FPS a bit (about -2%), while FS gives an (even smaller) bump.
http://www.anandtech.com/show/9097/the-amd-freesync-review/4

But, hell yeah, then there is FUDzilla, fueled quite a bit by payrolled nTrolls, going on about OMG nidividiaaaas infinite greatness.
The effectiveness of troll tactics is depressing. The 380 is about 10% faster than the 960 and consumes about 20% more power, yet is regarded as "you can fry... on it". (applies to most AMD vs nVidia products).
 
Hmm......this thread is getting more and more interesting. I just wonder - am I the only one who doesn't switch monitors every second year?? I maybe switch GPU every second or third year,
but my monitor ..... well the one I primarily use now is 5+ years old - and it works great for the games I play (the newest is Fallout 4) - but then again I only run at 1920 x 1080.
All I get out of this thread is still: buy a monitor that has a decent refresh rate and do not look at the fancy gimmicks (FreeSync/G-Sync) but go for Adaptive Sync.
Monitors are something I don't upgrade very often, with the only exception being that I switched from a 2160p monitor to a 1440p 144 Hz monitor because it was better for my gaming and I wanted to try FreeSync on it. I normally change my cards every other generation, with some exceptions in the past, but I try and stick to this method now. Also, if a monitor says "FreeSync" it means it is adaptive sync; they are only using that name because it will be more recognized by gamers, but it's the exact same thing (FreeSync is just AMD's design on their cards to take advantage of it).

The problem is whether you want to pay for it or not at the end of the day. Adaptive Sync (FreeSync) monitors are generally more like standard monitors: they are not proprietary and contain extra hookups, whereas a G-Sync monitor is going to have 1 DP port and be locked exclusively to Nvidia hardware. It's a hard sell right now either way, because DP 1.3 monitors could come out this year and be supported by all (heck, maybe we could get lucky and 1.2a monitors will work on them with Adaptive Sync) and then you will have a much more obvious choice. Otherwise (or if Nvidia locks it out) we're still going to be stuck with the same debate. It's not really a matter of which is better, as within their ranges they perform equally well (though again, G-Sync goes down to 30 and the others are generally down to 40, with one I believe at 35 currently), so it's a matter of whether you want to pay for it, what GPU you have, and how long you want to keep both.
 
Everything you read... where? And what "slight advantage"?
Tests show GS reducing FPS a bit (about -2%), while FS gives an (even smaller) bump.
http://www.anandtech.com/show/9097/the-amd-freesync-review/4

But, hell yeah, then there is FUDzilla, fueled quite a bit by payrolled nTrolls, going on about OMG nidividiaaaas infinite greatness.
The effectiveness of troll tactics is depressing. The 380 is about 10% faster than the 960 and consumes about 20% more power, yet is regarded as "you can fry... on it". (applies to most AMD vs nVidia products).
from the summary of the link you posted-

"Except for a glitch with testing Alien Isolation using a custom resolution, our results basically don’t show much of a difference between enabling/disabling G-SYNC/FreeSync – and that’s what we want to see. While NVIDIA showed a performance drop with Alien Isolation using G-SYNC, we weren’t able to reproduce that in our testing; in fact, we even showed a measurable 2.5% performance increase with G-SYNC and Tomb Raider. But again let’s be clear: 2.5% is not something you’ll notice in practice. FreeSync meanwhile shows results that are well within the margin of error."

So they found a glitch with one game, Alien Isolation.


Here's some gsync vs freesync articles

http://www.tomshardware.co.uk/amd-freesync-versus-nvidia-g-sync-reader-event,review-33278.html

http://www.pcworld.com/article/2974...rate-displays-make-pc-games-super-smooth.html

http://www.techradar.com/news/compu...cards/nvidia-g-sync-vs-amd-freesync-1289637/2

http://www.digitaltrends.com/computing/nvidia-g-sync-or-amd-freesync-pick-a-side-and-stick-with-it/


I'm not saying gsync is a better option; there are benefits to both. Freesync being cheaper is a big plus in my view.
But mostly it depends on what GPU is used.
 
Haha, can one be DP 1.3 compliant without the FreeSync (Adaptive Sync) thing? :)
No with regard to the hardware (by installing a DisplayPort 1.3 connector, the hardware is there to do it) and yes with regard to the software (the drivers determine under what conditions to send a signal to the monitor). Most of the magic occurs in the eDP chip in the display, but the drivers still have to instruct it what to do by sending it a signal.

There is a real danger that NVIDIA cards could look for a G-sync module and if it doesn't find it, disable adaptive sync. I'm thinking this might actually be likely in Pascal for two reasons:
1) NVIDIA wants to milk the G-sync cash cow a while longer.
2) NVIDIA needs time to not only implement adaptive sync in their drivers but also fine tune it so it matches or exceeds AMD's implementation.

Why would NVIDIA deliberately put itself behind AMD when they have a competitive product that is smoother despite being more expensive? I can only think of one reason why they wouldn't: cost. Obviously I don't have insider knowledge to know if that is a factor or not.


What isn't known is how Intel plays into this. They don't have a horse in the race yet like AMD and NVIDIA do. Are they going to push their implementation out the door and tweak it like AMD, or are they going to bide their time perfecting it before they push it out the door? I'm 50/50 on that one. I don't know.
 
So they found a glitch with one game, Alien Isolation.
Check other bits, it's just easier to see there:
http://www.anandtech.com/Gallery/Album/4315#6

Here's some gsync vs freesync articles
Sorry, I missed which one of them is saying GS is faster than FS.

I'm not saying gsync is a better option-there are benefits to both. Freesync being cheaper is a big plus in my view.
But mostly it depends on what GPU is used.

G-Sync: there is no visible technological advantage ("oh, the monitor's manufacturer needs to do bla" isn't technical), yet it costs about $100 to support and restricts the number of ports to 1.
It's an attempt to leverage dominant market position to bar competitors, akin to PhysX.
There is no (customer) benefit in it. nVidia could have benefited from it if AMD hadn't struck back with FS. It still can if AMD goes bust.

There is a real danger that NVIDIA cards could look for a G-sync module and if it doesn't find it, disable adaptive sync.
So, "Dear Asus, put this chip of mine into your monitor, or else I disable adaptive sync"?
But then, which tech will be used for adaptive sync? And if it is GS (in which case the chip really is needed), what about the problem that caused all monitors with GS support to have only a single port? Did they somehow overcome it?
 
So, "Dear Asus, put this chip of mine into your monitor, or else I disable adaptive sync"?
But then, which tech will be used for adaptive sync? And if it is GS (in which case the chip really is needed), what about the problem that caused all monitors with GS support to have only a single port? Did they somehow overcome it?
All displays with the VESA-compliant eDP chip. It's really eDP versus G-sync module.

I doubt NVIDIA has any interest in adding more than one port to G-sync displays. I mean, the module makes it expensive, so why would you buy the display and not use it? :laugh:

I guess there is a third option I didn't mention: NVIDIA could sell their own eDP chip that is G-sync compliant as well. The G-sync module could check for non-NVIDIA cards and block adaptive sync functionality if found.

I don't know what scares me more: the fact they could do this shady stuff or that I feel it is likely they will. If they do, I hope VESA and AMD pile on lawsuits. AMD would be in a better position because this would be anti-competitive behavior if NVIDIA did it.
 
i own a g-sync monitor so far i cant notice much of a difference with it (g-sync not the monitor) on or off.. mind you i dont play games at 30 or 40 fps.. :)

to be honest i read that much copy and paste or entirely speculative bollocks on this place i am beginning not to trust my own judgement.. :)

trog
 
i just swapped from a new-ish 144 Hz 1 ms TN gaming panel to an IPS 4 ms gaming panel.. i did it for photo editing.. colour reproduction and viewing angles are far better on the IPS-style panel and the blacks are deeper..

but it does come down to what folks are used to.. i expect to keep the monitor i now have for quite some time.. i have not the slightest desire to move to 4K even though i do have the gpu power to do it..

trog

My TN right now is 5ms. Most IPS panels can match that. IPS always has better viewing angles than TN. Most TVs and smartphones have an IPS display.

TNs are 6-bit whereas IPS panels are 8-bit. Most TNs found on the higher-end gaming monitors are still 6-bit+FRC, with colour gamut coverage between 72-75%. You can easily find an IPS panel for $150 with 75% colour coverage. Even an eco or value IPS panel that only covers 65% will look better than a TN panel, and those monitors you can get for $120 and under.

Note: I was talking about actual "independent" test results and not manufacturer specs. Specs can be outright lies, especially with monitors and fans. I have been reading reviews on hardware.info recently, and they warned quite clearly about that (they are of the kind who grab colorimeters and oscilloscopes to test the screens they review).
And Windows reserves 2 bits of the eight for stuff other than actual colour info (all monitors are 16-17 million colours max).
 
Uh....no... There are only a handful of color choices in Windows: 8-bit (256 color palette), 16-bit (5 bits per color, green may have 6 bits), 24-bit (8 bits per color), or 32-bit (8 bits per color + alpha).

6-bit displays still get 24 (if not 32) bits of color information. The display simply just can't reproduce that many colors.
For 10-bit displays, the graphics driver has to override Windows' color settings: NVIDIA AMD

AMD is pushing 10-bit support to become the norm through HDR.
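
To put numbers on that, here is a minimal sketch of 32-bit ARGB packing as used by common Windows pixel formats; the alpha byte sits alongside the colour channels rather than being carved out of them, which is why 24-bit and 32-bit modes both give 2^24 (about 16.7 million) colours:

# 32-bit ARGB packing sketch: 8 bits each for alpha, red, green, blue.
# Only the 24 colour bits determine how many colours are representable.
def pack_argb(a: int, r: int, g: int, b: int) -> int:
    for ch in (a, r, g, b):
        assert 0 <= ch <= 255, "each channel is 8 bits"
    return (a << 24) | (r << 16) | (g << 8) | b

print(hex(pack_argb(255, 18, 52, 86)))        # 0xff123456
print(f"{2 ** 24:,} representable colours")   # 16,777,216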
 
And Windows reserves 2 bits of the eight for stuff other than actual colour info (all monitors are 16-17 million colours max).
Please verify these sort of things next time to avoid embarrassment in the future.
 
Although "embarrassment" is not the thing I feel right now, I see your point.

Although I have to say that FordGT90Concept gave a better answer than you by showing I was wrong through explanation. That usually works better than personal attacks, especially since his reply was enough to let me realise I'd had a derp moment.

BTW, if you still wish to "spank" me for being stupid, please do so in PM (directed at Dethroy , not Ford).
 
A personal attack? It was meant as advice, nothing else. But I can see how it may sound different (the written word is not very good at conveying intentions) when I look at my statement from your point of view.
 