
NVIDIA to Open G-Sync Monitors to VRR Support, Enabling AMD Graphics Cards Support

Meaning the game's own thread(s) caps the frame rate.

Capping the frame rate does not eliminate tearing, nor does it substitute for a variable refresh rate.
 
Different monitor panels will make more of a difference than the tech itself, really.

G-Sync modules do offer some advantages, but they are so small that the panel should matter exponentially more.

It really depends on the game... not so much the monitor. For example, when I get a 30 fps frame drop when a large battle occurs in Total War games, and it jumps back up when I zoom in, it's flawlessly smooth with FreeSync... no new monitor tech is going to change that... so no, he is simply wrong. FreeSync still matters and helps.
 
NV was forced to do this in the first place because of competition; otherwise people won't buy G-Sync because it's expensive. I bet you can't even list that 'countless FreeSync trash' you mentioned.

Easy, if it ain't G-Sync compatible... it's trash.
 
Capping the frame rate does not eliminate tearing, nor does it substitute for a variable refresh rate.

Games that let you limit the frame rate below what your monitor can do prevent tearing, and VRR isn't even necessary. I could be glossing over a few edge cases, but most games I play today support in-game limits. I replaced my Asus G-Sync monitor with a 32" non-G-Sync Samsung, and I 1) play without Vsync, 2) use in-game limits, and 3) never get tearing or input lag.

Could be there are other benefits, but I absolutely don't miss G-Sync. I can actually enable G-Sync now with the latest drivers, but I see no benefit; in fact, enabling FreeSync on my Samsung messes with/lowers the brightness for some damn reason, so there's just no reason for me to enable it in G-Sync Compatible mode.
 
So they are finally realizing open standards are better than what they offered? Congrats.

Also, shouldn't the news read: "Nvidia admits defeat in the VRR arena, paving the way for open and easy-to-implement standards, the way it was meant to be played"?
 
Games that let you limit the frame rate below what your monitor can do prevent tearing, and VRR isn't even necessary.

Nope, it doesn't work like that. Tearing doesn't happen only when a game's frame rate goes above the monitor's refresh rate; that's a myth that for some inexplicable reason is still perpetuated to this day.

Tearing happens whenever the video output is not in sync with the refresh rate. No matter how you limit the frame rate, unless some form of synchronization exists, you'll always get tearing.
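
To put some numbers on it, here's a minimal sketch (made-up figures, an idealized monitor, and frames presented the instant they're done, so no vsync and no VRR) showing that a cap below the refresh rate still tears:

```python
# Minimal sketch, assuming an idealized 60 Hz monitor and a game capped
# below the refresh rate, presenting frames with no synchronization.
REFRESH_HZ = 60            # fixed monitor refresh rate
FPS_CAP = 58               # in-game frame cap, below the refresh rate

refresh_period = 1.0 / REFRESH_HZ
frame_period = 1.0 / FPS_CAP

mid_scanout_flips = 0
for n in range(1, 301):                          # simulate 300 frames
    flip_time = n * frame_period                 # when the buffer flips
    phase = (flip_time % refresh_period) / refresh_period
    # Any flip that doesn't coincide with the vblank (phase near 0 or 1)
    # replaces the image mid-scanout -> a visible tear line.
    if 0.02 < phase < 0.98:
        mid_scanout_flips += 1

print(f"{mid_scanout_flips} of 300 flips landed mid-scanout")
```

Nearly every flip lands somewhere inside a scanout, because the frame period and refresh period never line up.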
 
Games that let you limit the frame rate below what your monitor can do prevent tearing, and VRR isn't even necessary.
Without VRR, a monitor running at 120 Hz cannot evenly display 77 fps. You will have tearing. That's just how this works.
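
A quick back-of-the-envelope check of that 77 fps on 120 Hz example (same idealized assumptions as the sketch above):

```python
# 120 Hz scanouts happen every ~8.33 ms; 77 fps frames arrive every
# ~12.99 ms. Since 120/77 isn't an integer, frame boundaries keep
# drifting relative to the refresh cycle.
refresh_ms = 1000 / 120
frame_ms = 1000 / 77

for n in range(1, 6):
    refreshes = n * frame_ms / refresh_ms
    print(f"frame {n} ready at {n * frame_ms:6.2f} ms = {refreshes:.2f} refreshes in")
# The fractional part never settles (1.56, 3.12, 4.68, ...), so without
# VRR (or vsync) each flip lands somewhere inside a scanout.
```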
 
lmao when you're so far ahead of the competition that you support their hardware better than they do
 
If only Nvidia was thinking like that 10 years ago, we would have second mid/low range graphics cards in our PCs just for PhysX and the rest of GameWorks effects.
 
I was hoping this was Nvidia finally announcing that they'll allow pseudo-FreeSync via G-Sync over HDMI instead of just over DisplayPort. Well, besides on LG OLED TVs. I guess they'll wait till NEXT YEAR for that announcement.

No need to wait: according to someone here on the forums, the new Alienware monitor already has this module, and as per the source link, so does at least one Acer model.
 
How do I put this... AMD's FreeSync was open for everyone from day one, including for Nvidia. It's even in the name: FreeSync

Hence my statement. AMD brought it to us for free. If it were up to Nvidia, we'd still be paying a $200 premium for a VRR monitor.
 
Oh NV, you said G-Sync gunked up the hardware and broke the built-in VRR tech in DisplayPort.

Guess it wasn't true after all... good thing I never bought or recommended any display crippled with a Gunkified display controller.
 
The new G-Sync monitors will support AMD cards? That's OK, but will we still have to pay a premium for the G-Sync module in these monitors? That is the question.
Anyway, it is weird that NV is going free on this stuff. I wonder what makes NV do that. It's surely not the fear of the G-Sync modules, or the monitors equipped with them, not selling well enough. There must be some other reason here.
Oh NV, you said G-Sync gunked up the hardware and broke the built-in VRR tech in DisplayPort.

Guess it wasn't true after all... good thing I never bought or recommended any display crippled with a Gunkified display controller.
Oh, of course it wasn't. Grow up :D
 
Add this one to the countless technological advances brought forth by AMD.
Neither VRR nor Adaptive-Sync was created by AMD. AMD just put the FreeSync sticker on them (it's a little more complicated than that, but that's the gist of it).
But as an implementer and promoter of said techs, yes, AMD did most of the heavy lifting.
 
Neither VRR nor Adaptive-Sync was created by AMD. AMD just put the FreeSync sticker on them (it's a little more complicated than that, but that's the gist of it).
But as an implementer and promoter of said techs, yes, AMD did most of the heavy lifting.
Not entirely correct if you're talking about the Adaptive-Sync protocol, which was created by VESA for DisplayPort. FreeSync, on the other hand, is slightly different, although it uses the Adaptive-Sync protocol (a standard protocol). On top of that, Adaptive-Sync is dedicated to DisplayPort only, while FreeSync runs over HDMI as well. AMD uses the protocol but offers more than what Adaptive-Sync alone provides. So "AMD just put the FreeSync sticker on it" is not entirely true, since AMD has contributed to the further development of the Adaptive-Sync protocol.
 
Nice. I do wonder what the motivation behind this is.
Mixing in shit and expecting bigger revenues.

That is OK but will we still have to pay a premium for the Gsync module in these monitors? That is the question.
Lol that you even ask.

And imagine all the wonderful possibilities of crippling competitor's GPU that one has in that scenario.
 
Uh... I love AMD too and all, but NVIDIA opened their modules to AMD, not vice versa.

Maybe because people weren't buying overpriced G-Sync monitors, as the Freesync models were as good as those and cost much less? Plus NV prohibited Freesync via a software limitation. Very nice move to be honest. :O
 
Maybe because people weren't buying overpriced G-Sync monitors, as the Freesync models were as good as those and cost much less? Plus NV prohibited Freesync via a software limitation. Very nice move to be honest. :O
Exactly. And to open the modules, they had to close them first. So no shock there, and no credit for opening what had been open the entire time. This NV move smells fishy to me anyway.
lol that you even ask.

And imagine all the wonderful possibilities of crippling competitor's GPU that one has in that scenario.
Good point. Why would NV do anything like that? I guess it's pretty simple.
 
Can we simply be glad that this split between VRRs is finally ending?

Amazing how some people always have to find something to complain about.
 
Can we simply be glad that this split between VRRs is finally ending?

Amazing how some people always have to find something to complain about.
There's still a split: VRR is for HDMI and Adaptive-Sync is for DP. I'm not sure if we can get rid of that, but like you said, it's going the right way.
 
Opening up G-Sync is a step in the right direction but I still don't understand why G-Sync monitors are being made today.

Rather than this (now completely unnecessary) G-Sync/FreeSync branding, monitor manufacturers should just list the refresh rate like this:

60Hz
48-60Hz
40-75Hz
120Hz
30-144Hz

etc....

It's not rocket science to work out which ones have variable refresh rates and which ones don't, and there's zero ambiguity. At the moment you have "144Hz monitors" that may or may not support VRR; the VRR support is often buried deep in the detailed specifications as an afterthought, and you're lucky if the refresh rate range is even mentioned.
 
Why is no one mentioning variable overdrive? FreeSync does not have this, and it is one of the reasons FreeSync is inferior to G-Sync for the time being. FreeSync has fixed, static overdrive.
 
Opening up G-Sync is a step in the right direction but I still don't understand why G-Sync monitors are being made today.

Rather than this (now completely unnecessary) G-Sync/FreeSync branding, monitor manufacturers should just list the refresh rate like this:

60Hz
48-60Hz
40-75Hz
120Hz
30-144Hz

etc....

It's not rocket science to work out which ones have variable refresh rates and which ones don't, and there's zero ambiguity. At the moment you have "144Hz monitors" that may or may not support VRR; the VRR support is often buried deep in the detailed specifications as an afterthought, and you're lucky if the refresh rate range is even mentioned.
While it's not rocket science to spot monitors with variable refresh rates, it takes some education to be able to spot the ones that have usable ranges (hint: 48-60 and 40-75 do not). G-Sync enforces that when a monitor has a range of refresh rates, that range is actually usable in practice. It also comes with ULMB, which is better than variable refresh in some cases.
Whether that's worth the cost of G-Sync is debatable, but I have listed a couple of reasons why G-Sync still exists.
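
For the "usable range" hint above: the commonly cited rule of thumb for low framerate compensation (LFC) is that the maximum refresh must be at least double the minimum, so a too-slow frame can be shown twice (or more) at a rate that's still inside the VRR window. A minimal check, assuming that rule:

```python
# Rule-of-thumb LFC check, assuming the commonly cited requirement
# that max_hz >= 2 * min_hz so slow frames can be doubled back
# into the VRR range.
def has_lfc(min_hz: int, max_hz: int) -> bool:
    return max_hz >= 2 * min_hz

for lo, hi in [(48, 60), (40, 75), (30, 144)]:
    verdict = "usable (LFC possible)" if has_lfc(lo, hi) else "no LFC"
    print(f"{lo}-{hi} Hz: {verdict}")
# 48-60 Hz: no LFC
# 40-75 Hz: no LFC
# 30-144 Hz: usable (LFC possible)
```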
 
While it's not rocket science to spot monitors with variable refresh rates, it takes some education to be able to spot the ones that have usable ranges (hint: 48-60 and 40-75 do not). G-Sync enforces that when a monitor has a range of refresh rates, that range is actually usable in practice. It also comes with ULMB, which is better than variable refresh in some cases.
Whether that's worth the cost of G-Sync is debatable, but I have listed a couple of reasons why G-Sync still exists.
I won't argue with that, but I'll STILL take 48-60Hz over fixed 60Hz every single time. That's all my 4K television offers, but it makes a huge difference even if it doesn't support LFC.

I'm looking at a slim/portable 14" laptop and I'm noticing one with a 3700U and a 48-60Hz VRR panel. Being able to run a game at 48Hz instead of 60Hz is equivalent to a 25% performance boost when seen in the context of avoiding stuttering, which is the number one concern for me when low-end gaming. Hell, it might even be the sort of jump that allows running at native 1080p instead of non-native 720p. That sort of difference is HUGE.
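
That 25% figure is just the frame-time budget ratio:

```python
# Frame-time budget at each target: a 48 Hz floor gives each frame
# 1/48 s instead of 1/60 s before a missed refresh shows as a stutter.
budget_60_ms = 1000 / 60   # ~16.67 ms per frame at 60 Hz
budget_48_ms = 1000 / 48   # ~20.83 ms per frame at 48 Hz
print(f"extra frame-time budget: {budget_48_ms / budget_60_ms - 1:.0%}")  # 25%
```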
 