
NVIDIA G-SYNC now Supports FreeSync/VESA Adaptive-Sync Technology

It's 4:30 PM January 14 in Sunnyvale, CA. Wait at least 17 more hours. :P
 
Almost 2pm on the 15th here in Perth Australia, I want this driver dammit!

Well, I've already owned a FreeSync monitor for 18 months; another 18 hours won't hurt.
 
I'm le tired and want naptime, so they need to hurry up :P
 
Here they are! They're available right now.

Are you there? The driver is ready now!!!
 
yeeeee boooooi
 
GTX 1080 here running on an AG271QX monitor (FreeSync range 30-144 Hz).

G-Sync was enabled by default. No bad stuff happening, no blanking of the screen or pulsing.

Honestly I can't tell if it's working.

The old G-SYNC Pendulum Demo does not work. (Obviously it's outdated - released in 2015).
 
Does anybody have news?

Thank you. I have the same configuration at home as yours, but at the moment I'm at the office.
This is good news and, in a way, expected.
 
Enabled and working fine - I can go into my monitor's settings menu and see the resolution/refresh info in real time, and it shows the refresh rate fluctuating all over the place, so it's a green light for me.

In the one game I've tested (Supreme Commander: Forged Alliance, which has a 100 FPS cap) I'm seeing some very faint brightness flickering - but it's only noticeable if I look for it really hard in dim background areas.
 

The HP 14402 is yours?
 
Hmmm. Still no real confirmation it's working yet.

At least I have no negative issues.

Everything is smooth, but it looked that way before :)

[attachment: G-sync.jpg]
 
Well, it will increase sales of NVIDIA cards, since AMD has no horse in the race in the top four tiers. However, with no alternative for MBR tech (ULMB), anyone looking to stay above 60 fps won't care.
 
It seems there is an indicator:

I haven't tested it yet with my XG270HU, but I definitely will.

Edit: It's working like a charm!
 
G-Sync is a dedicated hardware module (with some characteristics similar to a GPU) installed behind the LCD panel, so explain to me: why can monitors that don't have it now do the same thing?
 
G-Sync just adds:
_less input lag
_an overdrive (well, hmm, ok lol)
_a refresh range that starts lower than FreeSync or Adaptive-Sync
_profits? (for NVIDIA, of course)
It's marketed as high end, but it's mostly marketing that adds about $200 for something you may not be able to notice compared to FreeSync screens.
The thing is, G-Sync screens may have less (to no) ghosting compared to some first-generation FreeSync panels; I don't know about plain Adaptive-Sync screens. Outside of a benchmark, some people may not notice it at all anyway.

Edit: I just want to say I'm not a fanboy or anything.
The only ATI card (that's AMD, for the young people) I ever owned was an X800 GTO, and I bought my screen purely for 1440p @ 144 Hz; I never planned to use FreeSync on it. Now that NVIDIA lets me use G-Sync on it, I'm happy, because there's no way I would have bought a G-Sync screen for such a small feature.
I call it a small feature because I think it's only useful when you drop below 60 fps, to keep the experience smooth. So far I've tried it in The Division and AC Origins, and I only drop below 60 in The Division with NVIDIA's custom shadows in some areas. I used those shadows just for the test and the sync worked well, but anything under 70 fps is a no-go for me (I went back to ultra shadows instead of the special ones), so I don't really feel the need for the FreeSync/G-Sync feature; I enabled it because, why not? It will be good for unoptimized console ports, I guess... at least until I swap the 1080 Ti for something more powerful.
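To illustrate what I mean about the refresh range, here's a rough sketch (my own illustration in Python, not anything official from NVIDIA or AMD; the 48-144 Hz window and the frame-repeating logic are assumptions). Inside the monitor's VRR window the panel simply follows the frame rate; below the window the driver can repeat frames (low-framerate compensation) to stay in range; above it you're pinned at the panel's maximum.

Code:
def effective_refresh_hz(fps, vrr_min=48.0, vrr_max=144.0):
    """Rough illustration of how a frame rate could map onto a VRR window."""
    if fps >= vrr_max:
        return vrr_max                # pinned at the panel maximum
    if fps >= vrr_min:
        return fps                    # panel refreshes in step with the game
    n = 2                             # below the window: repeat each frame
    while n * fps < vrr_min:          # until a multiple lands inside the range (LFC)
        n += 1
    return min(n * fps, vrr_max)

for fps in (160, 100, 60, 40, 25):
    print(f"{fps} fps -> panel refreshes at about {effective_refresh_hz(fps):.0f} Hz")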
 

If FreeSync monitors do the same job as G-Sync now, NVIDIA has to give me a reason why I should pay extra for that G-Sync module in the future.
 
_less input lag
Enhanced Sync + FreeSync < V-Sync + G-Sync

Think about it: the signal is literally being processed by two GPUs (the graphics card and the GPU known as the G-Sync Module in the monitor itself). There's really no circumstance where G-Sync should be faster than FreeSync unless the graphics card driving it is faster. That's no fault of the technology though.
 
Doesn't G-Sync save a bit of VRAM compared to Freesync? You use the RAM in the G-Sync module for frame buffering instead of your GPU's.
 
AMD cards are not short on VRAM, so it's not an issue. Also, you're talking about at most one frame. At 10 bits per color, that's at most around 45 MiB at 4K -- a pittance.
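Here's the back-of-the-envelope math (a minimal sketch in Python; the UHD resolution and the padded 16-bit-per-channel layout are my own assumptions, just to show where a figure in that ballpark can come from):

Code:
# Size of a single frame buffer at UHD "4K" under two storage assumptions.
width, height = 3840, 2160
channels = 3                        # R, G, B

for name, bits_per_channel in (("packed 10-bit", 10), ("padded to 16-bit", 16)):
    size_mib = width * height * channels * bits_per_channel / 8 / 2**20
    print(f"{name}: {size_mib:.1f} MiB per frame")

# packed 10-bit:    ~29.7 MiB
# padded to 16-bit: ~47.5 MiB (roughly the ballpark quoted above)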
 
Enhanced Sync + FreeSync < V-Sync + G-Sync

Think about it: the signal is literally being processed by two GPUs (the graphics card and the GPU known as the G-Sync Module in the monitor itself). There's really no circumstance where G-Sync should be faster than FreeSync unless the graphics card driving it is faster. That's no fault of the technology though.
Well, so what I'd read was just marketing :) A scam, then.
Those tests make people think. I never trusted this G-Sync tech anyway.
(Real) competitive players will certainly not play at 4K @ 60 fps @ 60 Hz with G-Sync and V-Sync on, so the small input lag will never be noticeable on a recent monitor in games where a graphics card might struggle to keep the fps high. I say this because input lag only really matters in competitive games. I don't know anybody who can notice it even on a TV screen, same as with ghosting, but cyborgs might exist :D
Nobody plays a competitive game on ultra graphics if it kills the min/max fps, but lies have definitely been told about the G-Sync module.

By the way, I've read it's better to leave V-Sync at its default in the NVIDIA Control Panel BUT turn it OFF in games, so G-Sync can do its job as intended.

If FreeSync monitors do the same job as G-Sync now, NVIDIA has to give me a reason why I should pay extra for that G-Sync module in the future.
It appears there is a sort of certification; they said the screens pass something like 300 quality tests, so you'd pay the extra ~$200 to be sure the "thing" works.

[image: nvidia-g-sync-monitor-stack-comparison.png]


Now that they've tested the FreeSync panels, I see no reason not to go for it if the monitor you want passes NVIDIA's tests (maybe not all 300; I don't know, I didn't read much about them). They list the Acer XG270HU as a 2560x1080 panel... it's 2560x1440, so you can see a certain lack of professionalism.
I do understand why they locked G-Sync out on monitors without the module: 1. money / 2. fewer bugs to fix.
Now, remember that ATI, er, AMD said you could pair different cards in your computer for compute work (games or anything else), like a 980 Ti + an RX 560X for example. Like SLI without any synchronization. It's all about drivers. Imagine pairing your 2080 Ti with a 1080 Ti; I don't see why you couldn't. The drivers only let you use one of the cards for PhysX, but think about it: it's a driver lock.
 
So, please, dear god... talk about substance then. What solutions do you propose?
Thanks for merely recycling the argument I rebutted, ignoring the substantive rebuttal.

My entire previous post, since you didn't notice, was a rebuttal to the "What solutions do you propose?" bit. Recycling a distraction instead of rebutting a post is another tired rhetorical strategy.

Also, trying to condemn me because of who decides to like my posts is really cheesy, along with language like "dear god".
 