Monday, November 25th 2019

NVIDIA to Open G-Sync Monitors to VRR Support, Enabling AMD Graphics Cards Support

In the variable refresh rate wars, much ink has already been spilled over AMD's open FreeSync approach and NVIDIA's proprietary G-Sync modules. The war gave its first signs of abatement once NVIDIA seemed to throw in the towel by officially supporting VESA's VRR (Variable Refresh Rate) technology on its graphics cards, opening the way for NVIDIA graphics cards to operate correctly with previously FreeSync-branded monitors. Now, one more step is being taken down that road - one that could be the final whiff for G-Sync's proprietary approach. According to a report from TFT Central, since confirmed by NVIDIA, the company will enable VRR support on upcoming monitors equipped with its G-Sync module. This will essentially enable AMD graphics cards to work with NVIDIA-branded G-Sync monitors.
This move will only apply to future monitor releases, mind you - a firmware update distributed among monitor makers will enable upcoming G-Sync releases to support VESA's VRR standard. This apparently will not happen for already-released G-Sync modules, whether they carry NVIDIA's first take on the technology or the v2 G-Sync modules. It's not a perfect solution, and current adopters of G-Sync are still locked in to NVIDIA graphics cards for VRR support on their monitors. It is, however, a definite step forward. Or a step backwards from a proprietary, apparently unneeded technology - you can really look at it either way.
Whether or not this makes sense from a product standpoint will only become clear once pricing on future NVIDIA G-Sync monitors surfaces - but we find it a hard sell for monitor makers to keep investing in the G-Sync module going forward, since there are no practical, user-observable differences aside from final product cost.
Source: TFT Central
Add your own comment

66 Comments on NVIDIA to Open G-Sync Monitors to VRR Support, Enabling AMD Graphics Cards Support

#26
lynx29
R-T-B
Different monitor panels will make more difference than the tech really.

Gsync modules do offer some advantages, but they are so small the panel should matter exponentially more.
It really depends on the game... not so much on the monitor. For example, when I get a frame drop of 30 FPS when a large battle occurs in Total War games, and it jumps back up when I zoom in, it's flawlessly smooth with FreeSync... no new monitor tech is going to change that... so no, he is simply wrong. FreeSync still matters and helps.
Posted on Reply
#27
Fluffmeister
Apocalypsee
NV was forced to do this in the first place because of competition; otherwise people won't buy G-Sync because it's expensive. I bet you can't even list that 'countless FreeSync trash' you mentioned.
Easy, if it ain't G-Sync compatible... it's trash.
Posted on Reply
#28
BArms
Vya Domus
Capping the framerate does not eliminate tearing nor does it make up for a variable refresh rate.
Games that let you limit the frame rate below what your monitor can do will prevent tearing, and VRR isn't even necessary. I could be glossing over a few edge cases, but most games I play today support in-game limits, and I replaced my Asus G-Sync with a Samsung 32" non-G-Sync and I 1) play without V-Sync 2) use in-game limits 3) never get tearing or input lag.

Could be there are other benefits, but I absolutely don't miss G-Sync. I can actually enable G-Sync now with the latest drivers, but I see no benefits; in fact, enabling FreeSync on my Samsung messes with/lowers the brightness for some damn reason, so there's just no reason for me to enable it in G-Sync Compatible mode.
Posted on Reply
#29
SIGSEGV
xkm1948
Nice. I do wonder what the motivation behind this is.
the motivation is ROFLMAO
Posted on Reply
#30
Steevo
So they are finally realizing open standards are better than what they offered? Congrats.

Also, shouldn't the news read: "Nvidia admits defeat in VRR arena, paving the way for open and easy-to-implement standards, the way it was meant to be played"?
Posted on Reply
#31
Vya Domus
BArms
Games that let you limit the frame rate below what your monitor can do will prevent tearing, and VRR isn't even necessary.
Nope, it doesn't work like that. The idea that tearing only happens when a game's frame rate goes above the monitor's refresh rate is a myth that for some inexplicable reason is still perpetuated to this day.

Tearing happens whenever the video output is not in sync with the refresh rate. No matter how you limit the frame rate, unless some form of synchronization exists, you'll always get tearing.
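A quick back-of-the-envelope sketch of that point (the 120 Hz panel and 100 fps cap are my own illustrative assumptions, not numbers from the thread): even with the frame rate capped below the refresh rate, unsynchronized buffer flips still land mid-scanout, and each of those is a potential tear line.

```python
from fractions import Fraction

# Illustrative numbers: a 120 Hz panel with the game capped to 100 fps.
# Without any synchronization, the game flips the front buffer on its own
# schedule, regardless of where the panel's scanout currently is.
REFRESH_HZ = 120
CAP_FPS = 100

scan_period = Fraction(1, REFRESH_HZ)   # time per refresh cycle
frame_period = Fraction(1, CAP_FPS)     # time per capped frame

# Flip times over one second of gameplay (exact rational arithmetic,
# so coincidences with refresh boundaries are detected precisely)
flips = [i * frame_period for i in range(CAP_FPS)]

# A flip that doesn't coincide with a refresh boundary happens mid-scanout,
# which is exactly when a tear line appears on screen.
mid_scan = sum(1 for t in flips if t % scan_period != 0)
print(f"{mid_scan} of {len(flips)} flips land mid-scanout")  # 80 of 100
```

Capping the frame rate only changes how the flips drift across the scanout; only some form of synchronization (V-Sync or VRR) actually aligns them.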
Posted on Reply
#32
bubbleawsome
BArms
Games that let you limit the frame rate below what your monitor can do will prevent tearing, and VRR isn't even necessary.
Without VRR, a monitor running at 120 Hz cannot evenly display 77 FPS. You will have tearing. That's just how this works.
Posted on Reply
#33
Midland Dog
lmao when you're so far ahead of the competition that you support their hardware better than they do
Posted on Reply
#34
john_
If only Nvidia had been thinking like that 10 years ago, we would have had second mid/low-range graphics cards in our PCs just for PhysX and the rest of the GameWorks effects.
Posted on Reply
#35
TheLostSwede
HisDivineOrder
I was hoping this was Nvidia finally announcing that they'll allow pseudo-FreeSync via G-Sync over HDMI instead of just over DisplayPort. Well, besides on LG OLED TVs. I guess they'll wait till NEXT YEAR for that announcement.
No need to wait, according to someone here on the forums, the new Alienware monitor already has this module and as per the source link, so does at least one Acer model.
Posted on Reply
#36
zlobby
Vya Domus
How do I put this ... AMD's FreeSync was open for everyone from day one, including for Nvidia. It's even in the name : FreeSync
Hence my statement. AMD brought it to us for free. If it were up to Nvidia, we'd still be paying a $200 premium for a VRR monitor.
Posted on Reply
#37
gamefoo21
Oh NV, you said G-Sync gunked up the hardware and broke the built-in VRR tech in DisplayPort.

Guess it wasn't true after all... good thing I never bought or recommended any display crippled with a Gunkified display controller.
Posted on Reply
#38
ratirt
The new G-Sync monitors will support AMD cards? That is OK, but will we still have to pay a premium for the G-Sync module in these monitors? That is the question.
Anyway, it is weird that NV is going free on this stuff. I wonder what makes NV do that. It's surely not the fear that the G-Sync modules, or monitors equipped with them, aren't selling well enough. There must be some other reasons here.
gamefoo21
Oh NV, you said G-Sync gunked up the hardware and broke the built-in VRR tech in DisplayPort.

Guess it wasn't true after all... good thing I never bought or recommended any display crippled with a Gunkified display controller.
Oh, of course it wasn't. Grow up :D
Posted on Reply
#39
bug
zlobby
Add this one to the countless technological advances brought forth by AMD.
Neither VRR nor Adaptive Sync are created by AMD. AMD just put the FreeSync sticker on them (it's a little more complicated than that, but that's the gist of it).
But as an implementer and promoter of said techs, yes, AMD did most of the heavy lifting.
Posted on Reply
#40
ratirt
bug
Neither VRR nor Adaptive Sync are created by AMD. AMD just put the FreeSync sticker on them (it's a little more complicated than that, but that's the gist of it).
But as an implementer and promoter of said techs, yes, AMD did most of the heavy lifting.
Not entirely correct if you're talking about the Adaptive-Sync protocol, which was created by VESA for DisplayPort. FreeSync, on the other hand, is slightly different, although it uses the Adaptive-Sync protocol (the standard protocol). On top of that, Adaptive-Sync is dedicated to DisplayPort only, while FreeSync runs over HDMI as well. AMD uses the protocol but offers more than what Adaptive-Sync alone provides. So "AMD just put the FreeSync sticker on it" is not entirely true, since AMD has contributed to the further development of the Adaptive-Sync protocol.
Posted on Reply
#41
medi01
xkm1948
Nice. I do wonder what the motivation behind this is.
Mixing in shit and expecting bigger revenues.

ratirt
That is OK, but will we still have to pay a premium for the G-Sync module in these monitors? That is the question.
Lol that you even ask.

And imagine all the wonderful possibilities of crippling competitor's GPU that one has in that scenario.
Posted on Reply
#42
B-Real
R-T-B
Uh... I love AMD too and all, but NVIDIA opened their modules to AMD, not vice versa.
Maybe because people weren't buying overpriced G-Sync monitors, as the Freesync models were as good as those and cost much less? Plus NV prohibited Freesync via a software limitation. Very nice move to be honest. :O
Posted on Reply
#43
ratirt
B-Real
Maybe because people weren't buying overpriced G-Sync monitors, as the Freesync models were as good as those and cost much less? Plus NV prohibited Freesync via a software limitation. Very nice move to be honest. :O
Exactly. And to open the modules, they had to close them first. So no shock there, and no credit for opening up what had been open the entire time. This NV move smells fishy to me anyway.
medi01
lol that you even ask.

And imagine all the wonderful possibilities of crippling competitor's GPU that one has in that scenario.
Good point. Why would NV do anything like that? I guess it's pretty simple.
Posted on Reply
#44
kings
Can we simply be glad that this split between VRRs is finally ending?

Amazing how some people always have to find something to complain about.
Posted on Reply
#45
bug
kings
Can we simply be glad that this split between VRRs is finally ending?

Amazing how some people always have to find something to complain about.
There's still a split: VRR is for HDMI and AdaptiveSync is for DP. I'm not sure if we can get rid of that, but like you said, it's going the right way.
Posted on Reply
#46
Chrispy_
Opening up G-Sync is a step in the right direction but I still don't understand why G-Sync monitors are being made today.

Rather than this (now completely unnecessary) G-Sync/Freesync branding - monitor manufacturers should just list the refresh rate like this:

60Hz
48-60Hz
40-75Hz
120Hz
30-144Hz

etc....

It's not rocket science to work out which ones have variable refresh rates and which ones don't, and there's zero ambiguity. At the moment you have "144Hz monitors" that may or may not support VRR, the existence of which is often buried deep in the detailed specifications as an afterthought, and you're lucky if the refresh rate range of the VRR is even mentioned.
Posted on Reply
#47
zoq
Why is no one mentioning variable overdrive? FreeSync does not have this, and it is one of the reasons it is inferior to G-Sync for the time being. FreeSync has a fixed, static overdrive.
Posted on Reply
#48
bug
Chrispy_
Opening up G-Sync is a step in the right direction but I still don't understand why G-Sync monitors are being made today.

Rather than this (now completely unnecessary) G-Sync/Freesync branding - monitor manufacturers should just list the refresh rate like this:

60Hz
48-60Hz
40-75Hz
120Hz
30-144Hz

etc....

It's not rocket science to work out which ones have variable refresh rates and which ones don't, and there's zero ambiguity. At the moment you have "144Hz monitors" that may or may not support VRR, the existence of which is often buried deep in the detailed specifications as an afterthought, and you're lucky if the refresh rate range of the VRR is even mentioned.
While it's not rocket science to spot monitors with variable refresh rates, it takes some education to spot the ones that have usable ranges (hint: 48-60 and 40-75 do not). G-Sync enforces that when a monitor advertises a range of refresh rates, that range is actually usable in practice. It also comes with ULMB, which is better than variable refresh in some cases.
Whether that's worth the cost of G-Sync is debatable, but I have listed a couple of reasons why G-Sync still exists.
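A rough illustration of the hint about usable ranges, using AMD's published rule of thumb that low framerate compensation (LFC) needs the maximum refresh to be at least about 2x the minimum so frames can be doubled below the range; treat the exact threshold as an assumption:

```python
# Check which advertised VRR ranges can support low framerate compensation
# (LFC), which needs max refresh >= ~2x min refresh so that frames falling
# below the range can be doubled back into it.
ranges = {"48-60Hz": (48, 60), "40-75Hz": (40, 75), "30-144Hz": (30, 144)}

for name, (lo, hi) in ranges.items():
    lfc = hi >= 2 * lo
    print(f"{name}: ratio {hi / lo:.2f} -> LFC {'possible' if lfc else 'not possible'}")
```

The 48-60 and 40-75 ranges fall short of the 2x ratio, which is why they only help in a narrow sliver of frame rates, while a 30-144 range covers effectively everything below 144 fps.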
Posted on Reply
#49
Chrispy_
bug
While it's not rocket science to spot monitors with variable refresh rates, it takes some education to spot the ones that have usable ranges (hint: 48-60 and 40-75 do not). G-Sync enforces that when a monitor advertises a range of refresh rates, that range is actually usable in practice. It also comes with ULMB, which is better than variable refresh in some cases.
Whether that's worth the cost of G-Sync is debatable, but I have listed a couple of reasons why G-Sync still exists.
I won't argue with that, but I'll STILL take a 48-60Hz over fixed 60Hz every single time. That's all my 4K television offers but it makes a huge difference even if it doesn't support LFC.

I'm looking at a slim/portable 14" laptop and I'm noticing one with a 3700U and 48-60Hz VRR panel. Being able to run a game at 48Hz instead of 60Hz is equivalent to a 25% performance boost when seen in the context of avoiding stuttering, which is the number one concern for me when low-end gaming. Hell, it might even be the sort of jump that allows running at native 1080p instead of non-native 720p. That sort of difference is HUGE.
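The 25% figure checks out as simple frame-budget arithmetic (the numbers below just restate the 60 fps and 48 fps targets from the post):

```python
# Per-frame render budget at each target frame rate, in milliseconds
budget_60 = 1000 / 60   # ~16.7 ms per frame at 60 fps
budget_48 = 1000 / 48   # ~20.8 ms per frame at 48 fps

# Dropping the target from 60 fps to 48 fps gives the GPU 25% more time
# per frame - the same headroom as a 25% faster GPU holding 60 fps.
gain = budget_48 / budget_60 - 1
print(f"extra frame-time budget: {gain:.0%}")  # 25%
```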
Posted on Reply
#50
bug
Chrispy_
I won't argue with that, but I'll STILL take a 48-60Hz over fixed 60Hz every single time. That's all my 4K television offers but it makes a huge difference even if it doesn't support LFC.

I'm looking at a slim/portable 14" laptop and I'm noticing one with a 3700U and 48-60Hz VRR panel. Being able to run a game at 48Hz instead of 60Hz is equivalent to a 25% performance boost when seen in the context of avoiding stuttering, which is the number one concern for me when low-end gaming. Hell, it might even be the sort of jump that allows running at native 1080p instead of non-native 720p. That sort of difference is HUGE.
To sum it up, G-Sync shouldn't exist because you'll take any amount of variable refresh? :P
Posted on Reply