Monday, January 7th 2019

NVIDIA G-SYNC now Supports FreeSync/VESA Adaptive-Sync Technology

NVIDIA has finally acknowledged that monitors with VESA Adaptive-Sync overwhelmingly outnumber those supporting NVIDIA G-Sync, and is going ahead with adding support for Adaptive-Sync monitors. This, however, comes with a big rider. NVIDIA is not immediately going to unlock Adaptive-Sync for all monitors, just the ones it has tested and found to work "perfectly" with its hardware. NVIDIA announced that it has found a handful of the 550+ monitor models on the market that support Adaptive-Sync, and has enabled support for them. Over time, as it tests more monitors, support will be added through GeForce driver updates as each becomes a "certified" monitor.

At its CES event, the company provided a list of monitors that it has already tested and that fulfill all requirements. G-Sync support for these models from Acer, ASUS, AOC, Agon and BenQ will be automatically enabled with a driver update on January 15th.

Update: We received word from NVIDIA that you can manually enable G-SYNC on all Adaptive-Sync monitors, even non-certified ones: "For gamers who have monitors that we have not yet tested, or that have failed validation, we'll give you an option to manually enable VRR, too."

Update 2: NVIDIA released these new Adaptive-Sync-capable drivers; we tested G-SYNC on a FreeSync monitor.

231 Comments on NVIDIA G-SYNC now Supports FreeSync/VESA Adaptive-Sync Technology

#201
Mussels
Freshwater Moderator
I'm le tired and want naptime, so they need to hurry up :P
Posted on Reply
#202
danado
MusselsI'm le tired and want naptime, so they need to hurry up :p
Here they are!! They are available right now.
wolfAlmost 2pm on the 15th here in Perth Australia, I want this driver dammit!

Well since owning a Freesync monitor it's already been 18 months, another 18 hours won't hurt.
Are you there? The driver is ready now!!!
Posted on Reply
#203
Mussels
Freshwater Moderator
yeeeee boooooi
Posted on Reply
#204
Candor
GTX 1080 here running on an AG271QX monitor (freesync range 30-144hz).

G-Sync was enabled by default. No bad stuff happening, no blanking of the screen or pulsing.

Honestly I can't tell if it's working.

The old G-SYNC Pendulum Demo does not work. (Obviously it's outdated - released in 2015).
Posted on Reply
#205
danado
Does anybody have any news?
CandorGTX 1080 here running on an AG271QX monitor (freesync range 30-144hz).

G-Sync was enabled by default. No bad stuff happening, no blanking of the screen or pulsing.

Honestly I can't tell if it's working.

The old G-SYNC Pendulum Demo does not work. (Obviously it's outdated - released in 2015).
Thank you. I have the same configuration at home as yours, but at the moment I am at the office.
This is good news and, in a way, expected.
Posted on Reply
#206
Mussels
Freshwater Moderator
Enabled and working fine - I can go into the settings menu of my monitor and see res/refresh info in realtime, and it shows the refresh rate flickering all over the place, so it's a green light for me

In the one game I've tested (Supreme Commander: Forged Alliance, which has a 100 FPS cap) I'm seeing some very faint flickering of brightness - but it's only noticeable if I look for it really hard in dim background areas
Posted on Reply
#207
danado
MusselsEnabled and working fine - I can go into the settings menu of my monitor and see res/refresh info in realtime, and it shows the refresh rate flickering all over the place, so it's a green light for me

In the one game I've tested (Supreme Commander: Forged Alliance, which has a 100 FPS cap) I'm seeing some very faint flickering of brightness - but it's only noticeable if I look for it really hard in dim background areas
Is the HP 14402 yours?
Posted on Reply
#208
danado
CandorBlur Buster's "Variable Refresh Rate Simulation" seems to confirm mine is working.

Haven't tried windowed mode yet.
Try For Honor in windowed mode if you can.
Posted on Reply
#209
Candor
Hmmm. Still no real confirmation it's working yet.

At least I have no negative issues.

Everything is smooth, but it looked that way before :)

Posted on Reply
#212
danado
CandorHmmm. Still no real confirmation it's working yet.

At least I have no negative issues.

Everything is smooth, but it looked that way before :)

I have checked the same monitor and it switches into FreeSync mode. The G-Sync Pendulum Demo also works, in G-Sync mode.
Posted on Reply
#213
John Naylor
Well, it will increase sales of nVidia cards since AMD has no horse in the race in the top 4 tiers. However, with no alternative for MBR tech (ULMB), anyone looking to stay above 60 fps won't care.
Posted on Reply
#214
c2DDragon
CandorHmmm. Still no real confirmation it's working yet.

At least I have no negative issues.

Everything is smooth, but it looked that way before :)

It seems there is an indicator:

Not tested yet with my XG270HU but for sure I will.

Edit: It's working like a charm!
Posted on Reply
#215
Animalpak
G-Sync is a dedicated hardware module (with some characteristics similar to a GPU) installed behind the LCD screen, so explain to me why monitors that don't have it can now do the same thing?
Posted on Reply
#216
c2DDragon
AnimalpakG-Sync is a dedicated hardware module (with some characteristics similar to a GPU) installed behind the LCD screen, so explain to me why monitors that don't have it can now do the same thing?
G-Sync just adds:
_less input lag
_an overdrive (well hmm ok lol)
_a refresh rate range that starts lower than FreeSync or Adaptive-Sync
_profits? (for nVidia of course)
It's marketed as high end, but it's mostly marketing that adds around $200 for something you may not be able to notice compared to FreeSync screens.
The thing is, G-Sync screens may have less (to no) ghosting compared to some FreeSync 1 monitors; I don't know about Adaptive-Sync screens. Outside of a benchmark it may not even be noticeable to some people, by the way.

Edit: I just want to say I'm not a fanboy or anything.
The only ATI (before AMD, for the young people) card I owned was an X800GTO, and I bought my screen just for 1440p @ 144Hz; I never planned to use FreeSync on it. Now that nVidia allows using G-Sync on it, I'm happy, because I would never have bought a G-Sync screen for this kind of small feature.
Small feature because I think it's only useful when you drop below 60 fps, to keep the experience smooth. Right now I have tried it in The Division & AC Origins, but I only drop below 60 in The Division with nVidia's custom shadows in some areas. I used those shadows just for the test and the sync works well; that said, anything under 70 fps is a no-go for me (I went back to ultra shadows instead of the special ones), so I don't really feel the need for the FreeSync/G-Sync feature. I enabled it because why not? It will be good for unoptimized console ports, I guess... before I switch the 1080 Ti for something more powerful.
Posted on Reply
#217
Animalpak
c2DDragonG-Sync just adds:
_less input lag
_an overdrive (well hmm ok lol)
_a refresh rate range that starts lower than FreeSync or Adaptive-Sync
_profits? (for nVidia of course)
It's marketed as high end, but it's mostly marketing that adds around $200 for something you may not be able to notice compared to FreeSync screens.
The thing is, G-Sync screens may have less (to no) ghosting compared to some FreeSync 1 monitors; I don't know about Adaptive-Sync screens. Outside of a benchmark it may not even be noticeable to some people, by the way.

Edit: I just want to say I'm not a fanboy or anything.
The only ATI (before AMD, for the young people) card I owned was an X800GTO, and I bought my screen just for 1440p @ 144Hz; I never planned to use FreeSync on it. Now that nVidia allows using G-Sync on it, I'm happy, because I would never have bought a G-Sync screen for this kind of small feature.
Small feature because I think it's only useful when you drop below 60 fps, to keep the experience smooth. Right now I have tried it in The Division & AC Origins, but I only drop below 60 in The Division with nVidia's custom shadows in some areas. I used those shadows just for the test and the sync works well; that said, anything under 70 fps is a no-go for me (I went back to ultra shadows instead of the special ones), so I don't really feel the need for the FreeSync/G-Sync feature. I enabled it because why not? It will be good for unoptimized console ports, I guess... before I switch the 1080 Ti for something more powerful.
If FreeSync monitors now do the same job as G-Sync, Nvidia has to give me a reason why I should pay for that extra G-Sync module in the future.
Posted on Reply
#218
FordGT90Concept
"I go fast!1!11!1!"
c2DDragon_less input lag
Enhanced Sync + FreeSync < V-Sync + G-Sync

Think about it: the signal is literally being processed by two GPUs (the graphics card and the GPU known as the G-Sync Module in the monitor itself). There's really no circumstance where G-Sync should be faster than FreeSync unless the graphics card driving it is faster. That's no fault of the technology though.
Posted on Reply
#219
GoldenX
Doesn't G-Sync save a bit of VRAM compared to Freesync? You use the RAM in the G-Sync module for frame buffering instead of your GPU's.
Posted on Reply
#220
FordGT90Concept
"I go fast!1!11!1!"
AMD cards are not short on VRAM, so it's not an issue. Also, you're talking about at most one frame. At 10 bits per color, that's at most about 45 MiB at 4K--a pittance.
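For a rough back-of-the-envelope check of that figure (a minimal sketch in Python; the "padded" case assumes the scaler stores each 10-bit channel in 16 bits of memory, which is an assumption rather than a documented detail of the G-Sync module):

# Back-of-the-envelope single-frame buffer size at 4K, 10 bits per color channel
width, height = 3840, 2160
pixels = width * height
packed_mib = pixels * (3 * 10) / 8 / 2**20   # tightly packed 30-bit RGB: ~29.7 MiB
padded_mib = pixels * (3 * 16) / 8 / 2**20   # channels padded to 16 bits each (assumption): ~47.5 MiB
print(f"packed: {packed_mib:.1f} MiB, padded: {padded_mib:.1f} MiB")

Either way it lands in the tens of MiB, so a single buffered frame is indeed negligible next to a modern card's VRAM.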
Posted on Reply
#221
c2DDragon
FordGT90Concept
Enhanced Sync + FreeSync < V-Sync + G-Sync

Think about it: the signal is literally being processed by two GPUs (the graphics card and the GPU known as the G-Sync Module in the monitor itself). There's really no circumstance where G-Sync should be faster than FreeSync unless the graphics card driving it is faster. That's no fault of the technology though.
Well, that was the marketing I had read :) A scam it was, then.
Those tests make people think. I never trusted this G-Sync tech anyway.
(Real) competitive players will not play at 4K@60fps@60Hz with G-Sync & V-Sync ON for sure, so the small input lag will never be noticeable with a recent monitor in games where a graphics card might struggle to keep the fps high. I say this because input lag can be an issue in competitive games only. I don't know anybody who can notice it even on a TV screen, as it is more a matter of ghosting, but cyborgs might exist :D
No way can somebody play a competitive game on ultra graphics if it kills the min/max fps, but lies have certainly been told about the G-Sync module.

By the way, I have read it's better to leave nVidia V-Sync at its default in the nVidia control panel BUT turn it OFF in games, so G-Sync will do the job as intended.
AnimalpakIf FreeSync monitors now do the same job as G-Sync, Nvidia has to give me a reason why I should pay for that extra G-Sync module in the future.
It appears there is a sort of certification; they said the screens pass something like 300 quality tests, so you would pay roughly $200 extra to be sure the "thing" works.

Now that they have tested the FreeSync panels, I see no reason to pay that extra if you can see that the monitor you want passes the nVidia tests (maybe not all 300, I don't know; I didn't read much about those). They report the Acer XG270HU as a 2560x1080 panel... it's 2560x1440, so you can see a lack of professionalism.
I do understand why they locked out G-Sync on monitors without the module: 1. Money / 2. Fewer bugs to fix
Now let's remember that ATI, hmm, AMD said you could pair different cards in your computer for computation (games or anything), like a 980 Ti + an RX 560X for example. Like SLI without any synchronization. It's all about drivers. Imagine you could pair your 2080 Ti with a 1080 Ti; I don't see why you couldn't. Drivers just let you use one of the cards for PhysX, but think about it, it's a driver lock.
Posted on Reply
#222
RichF
Vayra86So, please, dear god... talk about substance then. What solutions do you propose?
Thanks for merely recycling the argument I rebutted, ignoring the substantive rebuttal.

My entire previous post, since you didn't notice, was a rebuttal to the "What solutions do you propose?" bit. Recycling a distraction instead of rebutting a post is another tired rhetorical strategy.

Also, trying to condemn me because of who decides to like my posts is really cheesy, along with language like "dear god".
Posted on Reply
#223
Vayra86
RichFThanks for merely recycling the argument I rebutted, ignoring the substantive rebuttal.

My entire previous post, since you didn't notice, was a rebuttal to the "What solutions do you propose?" bit. Recycling a distraction instead of rebutting a post is another tired rhetorical strategy.

Also, trying to condemn me because of who decides to like my posts is really cheesy, along with language like "dear god".
That is a rather complicated way of saying 'I was actually adding the same nonsense to this topic as the guy I was correcting'... and this post is another one. Your walls of text are pointless to read if there's nothing in them.
Posted on Reply
#224
RichF
Vayra86That is a rather complicated way of saying 'I was actually adding the same nonsense to this topic as the guy I was correcting'... and this post is another one..
Still no rebuttal or substantive response to my posts, then.

Maybe a few more posturing bits like "dear god" and bolded text will increase the relevance.

Oh, I see you've added one: "walls of text are pointless". Since it's clear you have nothing on-topic to discuss I'm out.
Posted on Reply
#225
Vayra86
RichFStill no rebuttal or substantive response to my posts, then.

Maybe a few more posturing bits like "dear god" and bolded text will increase the relevance.
This is what I and others were wondering about:

www.techpowerup.com/forums/threads/nvidia-g-sync-now-supports-freesync-vesa-adaptive-sync-technology.251237/page-6#post-3971901

There is no substance here. Just a lot of words to convey the fact you don't like the tone of this discussion. OK. We got the memo - except everyone responding to you was already asking for that substance, including myself. We were listening - waiting for you to make your point or drive it home. My 'dear god' got in there because, after a few responses from others, we were still not clear on the point of the post I linked above.

You can leave whenever you want to... but all this was just a bit of miscommunication. Literally nobody responded to your post's content; it's up to you to figure out why. Or you can turn around and leave. All the same to me...
Posted on Reply