Monday, January 7th 2019

NVIDIA G-SYNC now Supports FreeSync/VESA Adaptive-Sync Technology

NVIDIA has finally acknowledged that monitors with VESA Adaptive-Sync overwhelmingly outnumber those supporting NVIDIA G-Sync, and is going ahead with adding support for Adaptive-Sync monitors. This, however, comes with a big rider: NVIDIA is not immediately unlocking Adaptive-Sync on all monitors, just the ones it has tested and found to work "perfectly" with its hardware. NVIDIA announced that it has found a handful of the 550+ Adaptive-Sync monitor models on the market to qualify, and has enabled support for them. Over time, as it tests more monitors, support will be added through GeForce driver updates, with each passing model listed as a "certified" monitor.

At its CES event, the company provided a list of monitors that it has already tested and that fulfill all requirements. G-SYNC support for these models from Acer, ASUS, AOC, Agon and BenQ will be automatically enabled with a driver update on January 15th.

Update: We received word from NVIDIA that you can manually enable G-SYNC on all Adaptive-Sync monitors, even non-certified ones: "For gamers who have monitors that we have not yet tested, or that have failed validation, we'll give you an option to manually enable VRR, too."

Update 2: NVIDIA has released these new Adaptive-Sync-capable drivers; we tested G-SYNC on a FreeSync monitor.

231 Comments on NVIDIA G-SYNC now Supports FreeSync/VESA Adaptive-Sync Technology

#76
Aquinus
Resident Wat-man
I find it amusing that this coincided with the open-source amdgpu driver gaining VRR support as of kernel 5.0-rc1, which was released yesterday.
#77
Legacy-ZA
Finally.

I was about to go and enable this in my NVIDIA driver when I realized I am still using my old 120Hz Samsung monitor from before FreeSync and G-Sync were a thing. Ah, the days when people fought back feebly with arguments like "you can't see more than 30fps" ^_^

I was in the market for a new monitor quite a while ago, then I saw the price tags of the monitors that came with G-Sync; suffice to say, I didn't buy one. I will have to save up again for a monitor; until then, fast sync will have to do.

nVidia, take this to heart: some people will never support such greed, me being one of them. That being said, you dodged a bullet, nVidia. I was going to buy an AMD graphics card with a FreeSync monitor this year; now you get to keep me as a customer, for now... as I will just buy a monitor. If your greed continues, however, I will buy an AMD GPU in the future, just like how I am going to give Intel the finger for good when AMD releases Zen 2 this year. Just like I already gave Razer the finger.
#78
Nxodus
TheGuruStud (post 3971372) said:
They want to sell more Turdings and the jig is up.
Or, maybe, if we don't view this from an "Nvidia hating" angle, monitor manufacturers might have thrown a hissy fit about the eternal struggle of picking between two standards. I'm just theorycrafting. I know, Nvidia is a soulless megacorp, but still, occasionally, they might do something for free to boost the industry and innovation and whatnot. I mean, they have the power to keep this G-Sync misery going for eternity, so why retreat?
#79
FordGT90Concept
"I go fast!1!11!1!"
Aquinus (post 3971379) said:
I find it amusing that this coincided with the open-source amdgpu driver gaining VRR support as of kernel 5.0-rc1, which was released yesterday.
NVIDIA couldn't be that quick to do their testing and make this announcement.


The driver is releasing on the 15th.
#80
Aquinus
Resident Wat-man
FordGT90Concept (post 3971384) said:
NVIDIA couldn't be that quick to do their testing and make this announcement.
This has been staged for release in the Linux kernel for over a month; it only just now made it into mainline. They've had time since it was announced in late November.
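Kernel support is only half of it on Linux, for what it's worth: with the xf86-video-amdgpu DDX, VRR still has to be opted into explicitly. A minimal sketch of the Xorg snippet, assuming the stock amdgpu driver (the option name comes from its man page; the file path is just a common convention):

```
# /etc/X11/xorg.conf.d/20-amdgpu.conf -- path is a convention, not mandated
Section "OutputClass"
    Identifier "AMD"
    MatchDriver "amdgpu"
    Driver "amdgpu"
    # Needs kernel 5.0+, DRI3, and a display that reports vrr_capable
    Option "VariableRefresh" "true"
EndSection
```

Whether the attached monitor advertises VRR at all shows up as the vrr_capable property in `xrandr --props`.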
#81
FordGT90Concept
"I go fast!1!11!1!"
That makes more sense, then.

I still think NVDA stock tumbling had something to do with it too. It's not clear that NVIDIA is even making money on G-Sync modules because of their low volumes. It may make financial sense to discontinue production.

Put the two together, along with CES as the perfect launch venue, and you get this announcement.
#82
Metroid
There are hundreds of monitor models available capable of variable refresh rates (VRR) using the VESA DisplayPort Adaptive-Sync protocol. However, the VRR gaming experience can vary widely.
To improve the experience for gamers, NVIDIA will test monitors. Those that pass our validation tests will be G-SYNC Compatible and enabled by default in the GeForce driver.
G-SYNC Compatible tests will identify monitors that deliver a baseline VRR experience on GeForce RTX 20-series and GeForce GTX 10-series graphics cards, and activate their VRR features automatically.
Support for G-SYNC Compatible monitors will begin Jan. 15 with the launch of our first 2019 Game Ready driver. Already, 12 monitors have been validated as G-SYNC Compatible (from the 400 we have tested so far). We’ll continue to test monitors and update our support list. For gamers who have monitors that we have not yet tested, or that have failed validation, we’ll give you an option to manually enable VRR, too.
For VRR monitors yet to be validated as G-SYNC Compatible, a new NVIDIA Control Panel option will enable owners to try and switch the tech on – it may work, it may work partly, or it may not work at all.
https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/

So there we have it: it may not work at all.
#83
FordGT90Concept
"I go fast!1!11!1!"
I know for a fact that AMD has to tweak their drivers to support some FreeSync monitors. I do not know what that entails or why it is necessary when the GPU should be sending exactly what the monitor expects.

That reason is why NVIDIA gave that caveat: some monitors need extra TLC to work with VESA adaptive sync and if they don't have that TLC, your mileage may vary. They aren't going to make a blanket statement that it works because they can't guarantee it will unless the monitor was certified by NVIDIA to work (same as AMD).

You'd think it would be a one-size-fits-all thing but it isn't.
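The "extra TLC" starts with what the monitor itself advertises: for FreeSync over DisplayPort, the usable VRR range is read from the EDID's Display Range Limits descriptor, and drivers have to cope with monitors that fill it in badly. A minimal sketch of reading that range (simplified per EDID 1.4 - it ignores the byte-4 rate-offset flags, and the sample block below is synthetic, not from a real monitor):

```python
def vrr_range_from_edid(edid: bytes):
    """Scan an EDID base block's four 18-byte descriptors for the
    Display Range Limits descriptor (tag 0xFD) and return the
    (min_hz, max_hz) vertical refresh range it advertises."""
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        # Range-limits descriptors start with 00 00 00 FD
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]  # min/max vertical rate in Hz
    return None

# Synthetic 128-byte base block advertising a 48-144 Hz range
edid = bytearray(128)
edid[54:58] = b"\x00\x00\x00\xFD"
edid[59] = 48    # min vertical rate
edid[60] = 144   # max vertical rate
print(vrr_range_from_edid(bytes(edid)))  # (48, 144)
```

A monitor that reports a narrow range here (say 48-75 Hz) is exactly the kind that needs driver-side workarounds, like AMD's LFC, once the frame rate dips below the minimum.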
#84
oxidized
Legacy-ZA (post 3971380) said:
Finally.

I was about to go and enable this in my NVIDIA driver when I realized I am still using my old 120Hz Samsung monitor from before FreeSync and G-Sync were a thing. Ah, the days when people fought back feebly with arguments like "you can't see more than 30fps" ^_^

I was in the market for a new monitor quite a while ago, then I saw the price tags of the monitors that came with G-Sync; suffice to say, I didn't buy one. I will have to save up again for a monitor; until then, fast sync will have to do.

nVidia, take this to heart: some people will never support such greed, me being one of them. That being said, you dodged a bullet, nVidia. I was going to buy an AMD graphics card with a FreeSync monitor this year; now you get to keep me as a customer, for now... as I will just buy a monitor. If your greed continues, however, I will buy an AMD GPU in the future, just like how I am going to give Intel the finger for good when AMD releases Zen 2 this year. Just like I already gave Razer the finger.
They'd be so sad to see you go :(
/s
#85
londiste
FordGT90Concept (post 3971391) said:
I know for a fact that AMD has to tweak their drivers to support some FreeSync monitors. I do not know what that entails or why it is necessary when the GPU should be sending exactly what the monitor expects.
Because monitor manufacturers are morons and configure their monitors wrong. At least that has been the case in some widely publicized instances - I believe including the Samsung monitors that were part of the AMD combo deal :)
#86
Robcostyle
Damn u, nvidia - even the inferior MG278 got support, but the MG279 didn't... just as always
#87
vahn3565
epiqpnwage (post 3971228) said:
You do know that he is talking about the PhysX cards nvidia tried to sell sheep like you who don't know better, right?
You do know that was Ageia and not Nvidia, right? Nvidia bought the technology and integrated it into their GPUs; they didn't sell proprietary hardware for PhysX. FFS
#88
oxidized
Robcostyle (post 3971397) said:
Damn u, nvidia - even the inferior MG278 got support, but the MG279 didn't... just as always
They'll probably keep adding models over time; there's no point in supporting something inferior and not something superior.
#89
robal
Very good news! Knowing it's coming from Nvidia, I expect some kind of catch :)
#90
wolf
Performance Enthusiast
stimpy88 (post 3971365) said:
nGreedia may well be aware of just how low the average customer regards their shady shit.
I don't think those sentiments are an accurate reflection of the average consumer at all. But hey, I'm basing that on my own small sample size too.

Basically, of the gamers and tech heads I know and talk to, about 1 in 10 or so is one of these 'ngreedia' bandwagoners; the rest happily use either their products or AMD's without being so tilted about it.
#91
Vayra86
wolf (post 3971427) said:
I don't think those sentiments are an accurate reflection of the average consumer at all. But hey, I'm basing that on my own small sample size too.

Basically, of the gamers and tech heads I know and talk to, about 1 in 10 or so is one of these 'ngreedia' bandwagoners; the rest happily use either their products or AMD's without being so tilted about it.
The salt in this topic is absolutely stunning, isn't it? All consumers have now officially lost their vendor lock-in for both FreeSync and G-Sync, and people complain.

o_O
#92
jabbadap
Do they support HDMI too, or is this only for DP?
#93
Old Ladies
This comment section is toxic.

This is good news for everyone. I currently have a 1080 Ti and the original ROG Swift at 2560x1440 144Hz with G-Sync. It has served me well and is easily the best monitor I have ever owned.

The main reason for going with G-Sync is that G-Sync monitors ALL support VRR from 1Hz up to their maximum refresh rate. This is why most FreeSync monitors won't be G-Sync Compatible: most were shitty monitors with a small VRR range, so if your frame rate dropped below it, VRR turned off.

No matter what, if you are playing most of the latest games you will have frame dips, and this is why I loved G-Sync: it was always on, and I could tell.

Though I was thinking about switching to AMD in the future, if they ever compete in the high end again, as the number of FreeSync monitors and their lower prices were appealing to me. I also don't like being locked in with only one company. This is great news for Nvidia users and will probably hurt AMD if Nvidia ever comes back to reality with its GPU prices.
#94
Sasqui
Didn't see this coming. And no, they are not doing it out of the kindness of their black hearts; they must have been losing sales by blocking the use of FreeSync with their GPUs.

That said, to buy a G-Sync monitor, you're still going to have to pay the premium for the proprietary hardware and technology. It'll be just a little more attractive now.
#95
GoldenX
Finally some good news from Nvidia.
This is great.
#96
Ruyki
This move makes sense for nVidia. G-Sync monitors are around 100-200 USD more expensive than an otherwise identical Adaptive-Sync/FreeSync model, making them a hard sell. Many nVidia users probably purchased an Adaptive-Sync/FreeSync monitor anyway, and such users may consider an AMD GPU when it's time to upgrade, which is something nVidia wants to avoid.
#97
medi01
SIGSEGV (post 3971262) said:
so, NVidia can use both Gaysync and FreeSync while AMD just locked to FreeSync?
Nope.

It's "so, a FreeSync monitor can be used with AMD, nVidia and later Intel, while G-Sync monitors are overpriced and only work with nGreedia?"

Nxodus (post 3971381) said:
Or, maybe, if we don't view this from an "Nvidia hating" angle, monitor manufacturers might have thrown a hissy fit about the eternal struggle of picking between two standards
BS.
G-Sync not only "was not free to use", unlike the FreeSync-based standard, it was also "not for anyone to license" because, you know, the "investments" of nVidia.
As VRR-ish tech had already been in notebooks for years, and as a VESA standard no less, the main point of the mentioned "investment" was to grab the market using proprietary tech AND charge a hefty premium for Huang-branded chips.

AMD's version, on the other hand, was so cool that most upscaler chips on the market included it out of the box.

And even today, with greedia's dominance of the GPU market, they have failed so miserably.

The two-standards talk would apply if G-Sync were available for others to use.
#98
OneMoar
There is Always Moar
good lord, the AMD fanboyism is strong here

once again, if you wanna blame somebody for the state of the GPU market, blame AMD
it's their fault for once again not being competitive

nvidia gets free rein because they have no competition; you want that to change, bitch at AMD
#99
kings
Every thread with Nvidia in the title has turned into Nvidia bashing nowadays, even when it's good news for consumers!

It's pretty sad really; it seems no one cares about games, hardware and tech anymore, the only important thing is bashing Nvidia in every topic.

It feels like some people need to bash Nvidia every day to feel some kind of joy in their lives.
#100
INSTG8R
My Custom Title
OneMoar (post 3971535) said:
good lord, the AMD fanboyism is strong here

once again, if you wanna blame somebody for the state of the GPU market, blame AMD
it's their fault for once again not being competitive

nvidia gets free rein because they have no competition; you want that to change, bitch at AMD
Sorry bud, this is about monitors, not GPUs. A 460 can use FreeSync just as well as a Vega, and a 1050 can use G-Sync just as well as a 2080 Ti. Swing and a miss.
Add your own comment