Monday, November 25th 2019

NVIDIA to Open G-Sync Monitors to VRR Support, Enabling AMD Graphics Cards Support

In the variable refresh rate wars, much ink has already been spilled over the open AMD FreeSync approach versus NVIDIA's proprietary G-Sync modules. The war showed its first signs of abatement when NVIDIA seemed to throw in the towel by officially supporting VESA's Adaptive-Sync (VRR, Variable Refresh Rate) technology on its graphics cards, opening the way for NVIDIA graphics cards to operate correctly with previously AMD FreeSync-branded monitors. Now, one more step is being taken down that road, one that could be the final whiff of G-Sync's proprietary approach: according to a report from TFT Central, confirmed by NVIDIA, the company will enable VRR support on future releases of monitors equipped with the company's G-Sync module. This will essentially enable AMD graphics cards to work with NVIDIA-branded G-Sync monitors.
This move will only apply to future monitor releases, mind you - a firmware update distributed among monitor makers will enable upcoming G-Sync monitors to support VESA's VRR standard. This will apparently not happen for already-released G-Sync modules, whether they carry NVIDIA's first take on the technology or the v2 G-Sync module. It's not a perfect solution, and current adopters of G-Sync remain locked in to NVIDIA graphics cards for VRR support on their monitors. It is, however, a definite step forward. Or a step back from a proprietary, apparently unneeded technology - you can really look at it either way.
Whether this makes sense from a product standpoint will only become clear once pricing on future NVIDIA G-Sync monitors surfaces - but we find it a hard sell for monitor makers to keep investing in the G-Sync module going forward, since there are no practical, user-observable differences aside from final product cost.
Source: TFT Central

66 Comments on NVIDIA to Open G-Sync Monitors to VRR Support, Enabling AMD Graphics Cards Support

#51
Sybaris_Caesar
1) FreeSync is VESA Adaptive-Sync when run on an AMD card, and G-Sync Compatible on an NVIDIA card.
2) FreeSync over HDMI is an AMD-specific port of VESA Adaptive-Sync. Depending on HDMI version and bandwidth, there are HDMI FreeSync monitors available at up to 120 Hz (see the rough bandwidth sketch after this list).
3) G-Sync over HDMI is NVIDIA implementing the HDMI Forum VRR spec on HDMI 2.1 and later monitors. Since AMD is also a member of the HDMI Forum, it's only a matter of time before AMD implements it as well.
4) While no current graphics cards are HDMI 2.1, according to a Reddit post some features of 2.1 can be back-ported to HDMI 2.0.
5) This news piece is about NVIDIA opening up G-Sync modules to take an AMD FreeSync signal in the future.
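
On point 2, a rough sense of why HDMI bandwidth is the limiter: a back-of-the-envelope sketch in Python. The 24 bpp, ~20% blanking overhead, and the effective data rates (~8.16 Gbps for HDMI 1.4, ~14.4 Gbps for HDMI 2.0, after 8b/10b encoding) are approximations I'm assuming, not figures from this thread:

    def required_gbps(width, height, refresh_hz, bpp=24, blanking=1.20):
        # Approximate video data rate for a mode, including blanking overhead.
        return width * height * refresh_hz * bpp * blanking / 1e9

    LINKS = {"HDMI 1.4": 8.16, "HDMI 2.0": 14.4}  # effective Gbps, approximate

    need = required_gbps(1920, 1080, 120)
    for name, capacity in LINKS.items():
        verdict = "fits" if need <= capacity else "exceeds"
        print(f"1080p120 needs ~{need:.1f} Gbps -> {verdict} {name} ({capacity} Gbps)")

By this estimate, 1080p at 120 Hz (~7.2 Gbps) squeezes into even HDMI 1.4-class bandwidth, which is why 120 Hz FreeSync-over-HDMI monitors exist at the lower resolutions.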

1) FreeSync 1st gen is trash because there's no certification via AMD. Or if any monitors are certified (AMD's official list on their website), they aren't tested.
2) In this case, officially G-Sync Compatible monitors (from the list) are a better bet, because they were tested by NVIDIA.
3) But being certified also means NVIDIA doesn't want the monitor OSD to have any mention of "FreeSync". It should just show up in the NVIDIA driver when you plug it in, not require fiddling with the OSD to turn on "FreeSync" like my current monitor does.
4) So FreeSync monitors that aren't certified are also "G-Sync Compatible"-compatible.
Posted on Reply
#52
Chrispy_
bug: To sum it up, G-Sync shouldn't exist because you'll take any amount of variable refresh? :p
Correction: I'll take any amount of variable refresh if the alternative is none :)

My main gaming display supports LFC with a reasonably wide VRR window starting at 48 Hz, and it's a significant improvement over 48-60 Hz, just as 48-60 Hz was a significant improvement over no VRR at all. NVIDIA would like you to believe that VRR is a binary feature and that it's pointless having less than full G-Sync VRR support. I'm trying to educate people that NVIDIA is wrong: any VRR is better than no VRR.
Posted on Reply
#53
bug
Chrispy_: Correction: I'll take any amount of variable refresh if the alternative is none :)

My main gaming display supports LFC with a reasonably wide VRR window starting at 48 Hz, and it's a significant improvement over 48-60 Hz, just as 48-60 Hz was a significant improvement over no VRR at all. NVIDIA would like you to believe that VRR is a binary feature and that it's pointless having less than full G-Sync VRR support. I'm trying to educate people that NVIDIA is wrong: any VRR is better than no VRR.
I don't think NVIDIA ever implied it's a binary feature.
If you're not paying for it, you'll probably take any amount. But if you're paying the G-Sync premium, I suspect you're not that easily pleased. NVIDIA is simply guaranteeing you're getting something for your $$$.
Posted on Reply
#54
Prince Valiant
Bummer that older monitors can't get a FW update for their G-Sync modules.
Posted on Reply
#55
Kaotik
xkm1948: Nice. I do wonder what the motivation behind this is.
Consoles (at least for HDMI VRR; not sure about Adaptive-Sync).
Posted on Reply
#56
Mouth of Sauron
In an unprecedented act of kindness... [quickly checks a local store; 56 FreeSync product types vs 10 G-Sync available]
Posted on Reply
#57
Vayra86
R-T-B: Sure, but still irrelevant and OT.

People seem to think I have some side in this. They seem to forget I had an exclusively Ryzen/AMD system but a year ago, and love AMD as much as is healthy. That's no excuse to bring them up where irrelevant, or to worship them as anything more than what they are: a profit-driven company.
It's a strange world where you have to support sensible arguments with the addendum that you have an AMD rig... man.

People need to grow up already. FreeSync / G-Sync and who did it better... it's a different approach; be happy you had a choice to make instead of a green and a red sticker that are identical in every way. G-Sync was first and AMD followed. G-Sync commanded a premium through quality and support that FreeSync could never achieve, because its first incarnation was weak. We all know this; fanboying has no place. Both companies are in it for $$$, yes, even AMD, because it is a mindshare battle too.

And guess what: now we have more options than ever, and high refresh + VRR has even entered TV territory. What's not to like?
BArms: Meaning the game's own thread(s) caps the frame rate.
Got a source? As far as I know, the game's thread is fundamentally capped by performance, which will vary all the time unless there is always ample headroom in the entire pipeline. I also fail to see the relation to manipulating the monitor refresh rate.

EDIT: just read the page full of similar responses; I think we can put that to bed.
Posted on Reply
#58
R-T-B
lynx29: It really depends on the game... not much with the monitor. For example, when I get a frame drop of 30 fps when a large battle occurs in Total War games, and it jumps up high when I zoom in, it's flawlessly smooth with FreeSync... no new monitor tech is going to change that... so no, he is simply wrong. FreeSync still matters and helps.
Technically speaking, FreeSync should handle that worse than G-Sync, because at such low frame rates it will fall back to unsynced video.

I.e., it may appear smooth, but screen tearing will return... I'm not sure I'd prefer that over NVIDIA's frame-duplication approach on the G-Sync module.
Chrispy_: Opening up G-Sync is a step in the right direction, but I still don't understand why G-Sync monitors are being made today.

Rather than this (now completely unnecessary) G-Sync/FreeSync branding, monitor manufacturers should just list the refresh rate like this:

60Hz
48-60Hz
40-75Hz
120Hz
30-144Hz

etc....

It's not rocket science to work out which ones have variable refresh rates and which don't, and there's zero ambiguity. At the moment you have "144Hz monitors" that may or may not support VRR, where the feature's existence is often buried deep in the detailed specifications as an afterthought, and you're lucky if the VRR refresh rate range is even mentioned.
Because the behavior of module-based G-Sync and FreeSync is actually different at the low-end exit point of the range.
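
To illustrate that low-end difference, here is a minimal sketch in Python of the frame-multiplication idea attributed above to the G-Sync module (and to LFC on the FreeSync side). The function and numbers are illustrative assumptions, not NVIDIA's or AMD's actual algorithm: when the frame rate drops below the panel's minimum VRR refresh, each frame is scanned out multiple times so the effective refresh stays inside the supported window; if no multiple fits, the fallback is unsynced output.

    import math

    def effective_refresh(fps, vrr_min, vrr_max):
        # Return (refresh_hz, repeats), or None when no synced option exists
        # (the fall-back-to-unsynced case discussed above).
        if fps > vrr_max:
            return vrr_max, 1               # capped at the top of the window
        if fps >= vrr_min:
            return fps, 1                   # normal VRR: refresh tracks the game
        repeats = math.ceil(vrr_min / fps)  # repeat frames to re-enter the window
        if fps * repeats <= vrr_max:
            return fps * repeats, repeats
        return None                         # window too narrow for duplication

    print(effective_refresh(30, 48, 144))  # (60, 2): each frame scanned out twice
    print(effective_refresh(40, 48, 60))   # None: 40 x 2 = 80 Hz overshoots a 48-60 panel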
Posted on Reply
#59
Space Lynx
Astronaut
I am using a FreeSync monitor right now with G-Sync Compatible, and I have no tearing at all. Feels the same as when I had an official G-Sync screen. My range is 30 to 144, though, so maybe I just got a good monitor.
Posted on Reply
#60
bug
lynx29: I am using a FreeSync monitor right now with G-Sync Compatible, and I have no tearing at all. Feels the same as when I had an official G-Sync screen. My range is 30 to 144, though, so maybe I just got a good monitor.
See the comment above yours. The problem is with monitors having a narrow refresh range. FreeSync doesn't impose a range, whereas G-Sync does.
Posted on Reply
#61
R-T-B
lynx29: I am using a FreeSync monitor right now with G-Sync Compatible, and I have no tearing at all. Feels the same as when I had an official G-Sync screen. My range is 30 to 144, though, so maybe I just got a good monitor.
Yeah, your range is pretty darn good too. That may have a lot to do with it.
Posted on Reply
#62
Cybrshrk
xkm1948: Nice. I do wonder what the motivation behind this is.
The LG OLED lineup and their implementation of HDMI 2.1 VRR. Pretty soon (like within a year or so) all GPUs and gaming consoles will have a built-in, perfectly usable implementation (HDMI Forum VRR) via the HDMI 2.1 chips they will all include, and with displays also coming to market en masse with the feature just "built in", they know it's time to open up and support the future. If not, it'd be like them sticking to DVI after HDMI/DP came about.

I'm already taking advantage of HDMI 2.1 VRR with my 65" C9 OLED and I'll never go back, and I don't think any other gamer will want to either once it's standard on all next-gen consoles and 2020 TVs.
BArms: Games that let you limit the frame rate lower than what your monitor can do prevent tearing, and VRR isn't even necessary. I could be glossing over a few edge cases, but most games I play today support in-game limits, and I replaced my ASUS G-Sync monitor with a Samsung 32" non-G-Sync one, and I 1) play without V-Sync, 2) use in-game limits, 3) never get tearing or input lag.

Could be there are other benefits, but I absolutely don't miss G-Sync. I can actually enable G-Sync now with the latest drivers, but I see no benefits; in fact, enabling FreeSync on my Samsung messes with/lowers the brightness for some damn reason, so there's just no reason for me to enable it in G-Sync Compatible mode.
You must be playing some pretty low-spec games at sub-2K resolutions and at a max of 60 fps to always be at the cap you set.

Me, I have a 2080 Ti and a 4K OLED with G-Sync compatibility, and I play many of today's best games; I cannot keep the frame rate maxed out at the in-game limits I set, and it flies anywhere between 75 and 120 fps at 1440p and 40 to 60 at 4K.

Without G-Sync, even with in-game limits set to my display's refresh rate minus a couple, I'm still going to get tearing.
Posted on Reply
#63
Space Lynx
Astronaut
Cybrshrk: The LG OLED lineup and their implementation of HDMI 2.1 VRR. Pretty soon (like within a year or so) all GPUs and gaming consoles will have a built-in, perfectly usable implementation (HDMI Forum VRR) via the HDMI 2.1 chips they will all include, and with displays also coming to market en masse with the feature just "built in", they know it's time to open up and support the future. If not, it'd be like them sticking to DVI after HDMI/DP came about.

I'm already taking advantage of HDMI 2.1 VRR with my 65" C9 OLED and I'll never go back, and I don't think any other gamer will want to either once it's standard on all next-gen consoles and 2020 TVs.


You must be playing some pretty low-spec games at sub-2K resolutions and at a max of 60 fps to always be at the cap you set.

Me, I have a 2080 Ti and a 4K OLED with G-Sync compatibility, and I play many of today's best games; I cannot keep the frame rate maxed out at the in-game limits I set, and it flies anywhere between 75 and 120 fps at 1440p and 40 to 60 at 4K.

Without G-Sync, even with in-game limits set to my display's refresh rate minus a couple, I'm still going to get tearing.
There are LG OLED monitors out already that have G-Sync VRR over HDMI 2.1. Linus just did a video on it last week, and put one in his own house.
Posted on Reply
#64
Cybrshrk
lynx29: There are LG OLED monitors out already that have G-Sync VRR over HDMI 2.1. Linus just did a video on it last week, and put one in his own house.
I know this; I have the LG C9 65" OLED myself. He got the lower-end model from LG, the B9.

I was pissed when he put that Alienware in his living room, because he was talking crap about the LG OLEDs at the time, even though it had already been announced that we would have VRR in just a couple of weeks.

I had just picked mine up a month earlier and had a good feeling I would get something akin to G-Sync, but figured it wouldn't happen until sometime in the new year, before the new consoles. Imagine my surprise when Christmas came early lol.

Now Linus is talking about going back to the LG OLED in his living room, and it's one of the reasons I hate these videos where these guys take a sponsored product and put it in their home. It's not like they really chose the product because it's the best; they chose it because someone decided to give it to them, possibly with a bag of money.

I was in the comments talking about how I got all the features of that Alienware for $3,500 less (I paid about $1,700 for my 65"; those are like $6,000), but everyone in the comments was just like, Linus got rid of an OLED for this, so it's obviously better.

Yeah, so much better that he's replacing it with another OLED less than six weeks later lol.
Posted on Reply
#65
evolucion8
BArms: I don't trust your eyes.
Wow, so clueless lol
R-T-B: Technically speaking, FreeSync should handle that worse than G-Sync, because at such low frame rates it will fall back to unsynced video.

I.e., it may appear smooth, but screen tearing will return... I'm not sure I'd prefer that over NVIDIA's frame-duplication approach on the G-Sync module.



Because the behavior of module-based G-Sync and FreeSync is actually different at the low-end exit point of the range.
That feature is called LFC, aka Low Framerate Compensation, and several FreeSync monitors support it; it's meant to extend VRR at the lower end of the spectrum. The Samsung C27FG70 27" QLED monitor supports it, allowing VRR to run as low as 40 FPS while feeling exactly like 60; once the 60 FPS threshold is reached, it stops and regular VRR runs all the way to 144 Hz. FreeSync 2 imposes several standards that must be met for certification, compared to the more liberal FreeSync 1.
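
As a quick check on the rule of thumb behind LFC, a self-contained Python snippet. The window values are illustrative assumptions; the C27FG70's actual FreeSync range may differ from what's used here:

    def lfc_capable(vrr_min, vrr_max):
        # LFC needs the window top to be at least twice the bottom, so some
        # integer multiple of any low fps always lands inside the window.
        return vrr_max >= 2 * vrr_min

    # Illustrative 48-144 Hz window on a 144 Hz panel:
    print(lfc_capable(48, 144))   # True: 40 fps can be scanned out at 80 or 120 Hz
    # A narrow 48-75 Hz window cannot compensate below 48 fps:
    print(lfc_capable(48, 75))    # False: 2 x 48 = 96 > 75

This is consistent with the point above: a 40 FPS game on such a panel can feel as smooth as a higher, fixed refresh because every frame still gets its own scanout.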
Posted on Reply
#66
John Naylor
The thread starts with something that looks like it came out of Washington DC's Office of Misinformation. Comparing G-Sync and FreeSync is like comparing 4WD and AWD... 99% of folks think they are the same thing. Nothing could be further from the truth.

My wife has AWD... at least 3 times every winter, I tow her AWD out of the snow with my 4WD. Go off-road out in Moab w/ AWD... only if ya wanna risk ya life. G-Sync and FreeSync do what they do about equally from 40-75 fps. Hard to notice any difference between the two, except NVIDIA seems to have an edge below 40 fps. But like 4WD, where I can turn a switch on the dashboard and lock all 4 wheels, G-Sync can do something FreeSync can't do... Motion Blur Reduction (MBR)... and that's one of the reasons why NVIDIA's market share is 5 times AMD's.

If I am at 75-80 fps or more, I have G-Sync turned off and it's ULMB only for me. Waited 2 years for the 4K 144 Hz panels, and when they came to market w/o ULMB, I passed. Yes, you can buy FreeSync monitors w/ MBR technology, but it's not from AMD... it's a hodgepodge of different systems.

The move by NVIDIA here is consistent with the GFX card strategy... take the top spot, win mindshare, and work your way down, gobbling more and more market share. With the 7xx series, they had the top 2 tiers... with 9xx they took another w/ the 970... with 10xx they took another with the 1060... with 2xxx they have edged AMD in every market segment down to $200. AMD almost held on with the 5700 XT, but when both cards are OC'd... NVIDIA has the edge... in performance, power, heat and noise. They are doing the same thing w/ monitors. AMD had a niche that they owned in the budget market... now the discussion in the boardroom isn't, as suggested, "let's give up"... that discussion is "here's a segment we haven't taken yet, let's jump in here too".
Posted on Reply