• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

NVIDIA G-SYNC now Supports FreeSync/VESA Adaptive-Sync Technology

Nvidia says the G-Sync requirement is the 600 series (650 Ti and up) or newer - so, basically, starting with Kepler. There is no reason to assume they will reduce support in some way. Older GPUs like the 500 series (Fermi) did not have DisplayPort. Also, the 600 series is as far back as their current driver support goes.
 
That's for GSYNC as in monitors equipped with the GSYNC module. We're talking about "GSYNC Compatible", which is what NVIDIA is calling their adoption of the VESA Adaptive-Sync standard. As Rahnak pointed out, "GSYNC Compatible" is only on Pascal and newer cards.
 
That's for GSYNC as in monitors equipped with the GSYNC module. We're talking about "GSYNC Compatible", which is what NVIDIA is calling their adoption of the VESA Adaptive-Sync standard. As Rahnak pointed out, "GSYNC Compatible" is only on Pascal and newer cards.
We'll have to wait and see. Nvidia does not say anywhere that it is limited to Pascal/Turing, and there is no technical reason for anything like that. Just the opposite: there are good reasons for Nvidia to keep the list of GPUs supporting G-Sync stable.
 
I suspect that last monitor shown doesn't even have FreeSync certification.
 
I suspect that last monitor shown doesn't even have FreeSync certification.

It does. The last one is an LG G-series UW-C; the second to last is a Samsung C or K series, if I remember the letters correctly.

Funny how, on the last monitor (which looks like an LG G-series UW-C), the blinking goes away when he moves the mouse at the end.

We would have heard by now if LG had defective monitors. Those models have been out for three years. The newer ones have a different base.
 
It is entirely possible that NVIDIA is trying to mislead the media too. Perhaps he should have pointed at the fourth monitor and said "this is your FreeSync monitor without AMD driver optimizations." If 400+ monitors are certified and only 12 work with NVIDIA cards without driver optimizations, NVIDIA has a lot of work ahead of them that I suspect they're going to go about lackadaisically.
 
It is entirely possible that NVIDIA is trying to mislead the media too. Perhaps he should have pointed at the fourth monitor and said "this is your FreeSync monitor without AMD driver optimizations." If 400+ monitors are certified and only 12 work with NVIDIA cards without driver optimizations, NVIDIA has a lot of work ahead of them that I suspect they're going to go about lackadaisically.

Well, it's Gordon. He doesn't hide his bias in favor of Nvidia. On his podcast it's like an ongoing joke: the rest of the guys and gal roll their eyes while Gordon tries to convince them of what a great deal you're getting. Like Tom's Hardware's "just buy it", but in podcast form. It's entertaining.

In his defense, he did say "Nvidia told me" and "This is what they told me". He never said he tested it to verify.
 
He's just reading the label: "Non-validated" (by NVIDIA). I hope we'll see testing comparing AMD vs NVIDIA cards across several FreeSync certified monitors. It'll also be interesting to see where things stand a year down the road: did NVIDIA and the monitor manufacturers put in an honest effort to make them work, or did NVIDIA phone it in, only slapping their label on FreeSync monitors that work out of the box?
 
He's just reading the label: "Non-validated" (by NVIDIA). I hope we'll see testing comparing AMD vs NVIDIA cards across several FreeSync certified monitors. It'll also be interesting to see where things stand a year down the road: did NVIDIA and the monitor manufacturers put in an honest effort to make them work, or did NVIDIA phone it in, only slapping their label on FreeSync monitors that work out of the box?

Probably that. Remember, Nvidia's Tom Petersen said the G-Sync module has what he called a "look-aside buffer" for synchronizing.
 
Yeah, whereas in FreeSync the GPU's VRAM is the buffer. The GPU/driver does all of the tricks the GSYNC module does... and NVIDIA's drivers seem to be lacking a lot of that.
 
It is entirely possible that NVIDIA is trying to mislead the media too. Perhaps he should have pointed at the fourth monitor and said "this is your FreeSync monitor without AMD driver optimizations."
What is being demonstrated is LFC running on a monitor that does not support it.
The LG 34UM69G-B has a 40-75 Hz refresh range.

Nvidia has said they want a refresh-rate range ratio of at least 2.4 (max:min) to even consider a monitor for G-Sync Compatible. This is an example of a monitor that does not meet that requirement.
He probably should have pointed at that monitor and said, "This is a crappy VRR monitor".

Just FYI - the LG 34UM69G-B is flat; the one in the video has a curve to it.
Oh, you are right. It was mentioned in the YouTube comments and I did not check very well. The point remains, though: it is more than likely a monitor with too small a VRR range.

There is a very good technical reason for requiring a wide enough frequency range. The ratio needs to be at least 2 to be able to double frames when FPS falls below the range. Exactly 2 is too small because the frequency needs to be more dynamic, and that causes pretty much exactly what is demonstrated in the video. So manufacturers use a higher requirement: AMD uses 2.5 for LFC in FreeSync, and Nvidia now says 2.4 for G-Sync Compatible. The unofficial solution to these problems for FreeSync monitors has generally been to manually widen the monitor's range definition and hope the monitor works fine with it. This is effectively monitor overclocking and is not guaranteed.
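As a rough sketch of that eligibility rule (the function name is hypothetical; the 2.5 and 2.4 thresholds are the AMD and Nvidia figures mentioned above), the check boils down to comparing the range ratio against a vendor threshold:

```python
def lfc_capable(min_hz: float, max_hz: float, required_ratio: float = 2.5) -> bool:
    """True if the monitor's VRR range is wide enough for frame doubling.

    required_ratio is vendor policy: AMD uses 2.5 for LFC in FreeSync,
    and Nvidia reportedly wants 2.4 for "G-Sync Compatible".
    """
    return max_hz / min_hz >= required_ratio

print(lfc_capable(40, 144))  # True  - wide range (ratio 3.6), LFC possible
print(lfc_capable(48, 75))   # False - narrow range (ratio ~1.56), no LFC
```

Manual "range widening" mentioned above amounts to lying about min_hz until this check passes, which is why it is effectively overclocking.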

This frame doubling is the crux of AMD's LFC (Low Framerate Compensation) and has been part of Nvidia's basic G-Sync spec from the start. A monitor can (or rather is tested, specced and guaranteed to) work within a certain frequency range. The minimum refresh rate is usually 30-40 Hz, while the maximum varies a lot - 75, 100, 120, 144, 165 and 240 Hz are the most common ones.

Variable Refresh Rate (VRR) uses this entire range, as opposed to a fixed refresh rate, but it still cannot go beyond it. When FPS drops below the minimum supported refresh rate, the simple VRR method of fixing the refresh rate to the current FPS (yes, technically the GPU triggers each refresh, but for a high-level explanation this is close enough) no longer works, as the monitor cannot refresh at too low a rate. The solution was to start doubling frames: for every frame coming from the GPU, the monitor gets refreshed twice. For example, when a game runs at 20 FPS, the monitor refreshes at 40 Hz and each frame from the GPU is shown twice. This doubling may be repeated if necessary; for example, 10 FPS on a monitor with a 40 Hz minimum refresh rate will get each frame shown 4 times.
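The doubling logic just described can be sketched as a small search for the lowest whole-number multiplier that puts the effective refresh rate back inside the panel's range (hypothetical function name, deliberately simplified model):

```python
def lfc_multiplier(fps: float, min_hz: float, max_hz: float) -> int:
    """Smallest whole-number multiplier m such that fps * m lands inside
    the monitor's [min_hz, max_hz] range; 1 means no doubling is needed."""
    m = 1
    while fps * m < min_hz:
        m += 1
    if fps * m > max_hz:
        raise ValueError("FPS too low to compensate within this range")
    return m

print(lfc_multiplier(20, 40, 144))  # 2 -> each frame shown twice, 40 Hz refresh
print(lfc_multiplier(10, 40, 144))  # 4 -> each frame shown 4 times, 40 Hz refresh
print(lfc_multiplier(50, 40, 144))  # 1 -> within range, no doubling
```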

This is a simple and elegant solution that is not really a problem on a gaming monitor with a genuinely wide frequency range - for example, the initial G-Sync requirement was 30-144 Hz, with a properly low minimum refresh rate and a wide range (the maximum is 4.8 times the minimum). It does become a problem on monitors with a high minimum refresh rate and/or a narrow frequency range. There have been a lot of FreeSync monitors with ranges like 48-75 Hz, which AMD never bothered to tackle in any way.

In practice, such a monitor with a 48-75 Hz range will work well and do VRR in the 48-75 FPS range, but not outside of it. Given that these are less expensive monitors and are likely to be paired with less expensive GPUs, drops below this range will be noticeable and will not benefit from VRR.
 
There have been a lot of Freesync monitors with ranges like 48-75 Hz which AMD never bothered to tackle in any way.
What is Enhanced Sync (mimics what G-Sync does < monitor refresh rate) and Frame Rate Target Control (minimal input lag cap not only saving power but also preventing tearing)? :P

I have to assume the panels they demo'd were running above the monitor's refresh range. If all it needed was FRTC set, then NVIDIA went full retard.
 
What is Enhanced Sync (mimics what G-Sync does < monitor refresh rate) and Frame Rate Target Control (minimal input lag cap not only saving power but also preventing tearing)? :p
Enhanced Sync (as well as its counterpart, Fast Sync) is absolutely not what G-Sync/FreeSync does. Enhanced/Fast Sync is useful when the FPS you are getting is larger than the monitor frequency (much larger - for real use it has to be 2+ times). With these, monitor refreshes still happen at a fixed rate, but the frame shown at the point of refresh is the latest frame the GPU generated. The main intent is reducing input lag compared to Vsync. For example, when the monitor refreshes at 60 Hz and the game runs at 120 FPS, the GPU generates two frames per refresh period, with the last one being shown. Compare this to the Vsynced situation, where the first frame is shown and that frame is 8 ms older. This is a little simplified, but that is the idea. The other part is Vsync basically cutting the framerate in half when FPS drops below the monitor frequency, but that is a different discussion.
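That 60 Hz / 120 FPS example can be checked with a little arithmetic. The sketch below is a deliberately simplified model (frames assumed to complete on a perfectly even cadence, scan-out time ignored; the function name is made up for illustration):

```python
REFRESH_HZ, FPS = 60, 120
T_REFRESH = 1000 / REFRESH_HZ   # ~16.67 ms between monitor refreshes
T_FRAME = 1000 / FPS            # ~8.33 ms to render each frame

def frame_age_ms(policy: str, refresh_index: int = 2) -> float:
    """Age of the displayed frame at a given refresh.

    'fast'  = Enhanced/Fast Sync: show the newest completed frame.
    'vsync' = classic Vsync at this FPS: show the first of the two
              frames rendered during the refresh period.
    """
    t = refresh_index * T_REFRESH
    completed = round(t / T_FRAME)          # frames finished by time t
    if policy == "fast":
        shown_done_at = completed * T_FRAME
    else:
        shown_done_at = (completed - 1) * T_FRAME
    return t - shown_done_at

# The Vsynced frame comes out one frame-time (~8.33 ms) staler:
print(frame_age_ms("vsync") - frame_age_ms("fast"))
```

The ~8 ms figure in the post falls straight out of the model: at 120 FPS one frame-time is 1000/120 ms.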

Frame Rate Target Control limits the maximum frame rate. This is useful depending on circumstances but is not really directly related to Variable Refresh Rate. And it has no effect on what I described above, because those issues occur at the refresh-rate minimum, not the maximum.

I have to assume the panels they demo'd were running above the monitor's refresh range. If all it needed was FRTC set, then NVIDIA went full retard.
I have not seen exact details anywhere, but the problem is probably primarily with running below the refresh range. You are kind of right, though, in that frame doubling as the solution to this problem would lead to trying to run above the range. It has nothing to do with going full retard. This demonstrates - and very much correctly - this specific problem.
 
Enhanced Sync (as well as its counterpart, Fast Sync) is absolutely not what G-Sync/FreeSync does. Enhanced/Fast Sync is useful when the FPS you are getting is larger than the monitor frequency (much larger - for real use it has to be 2+ times). With these, monitor refreshes still happen at a fixed rate, but the frame shown at the point of refresh is the latest frame the GPU generated. The main intent is reducing input lag compared to Vsync. For example, when the monitor refreshes at 60 Hz and the game runs at 120 FPS, the GPU generates two frames per refresh period, with the last one being shown. Compare this to the Vsynced situation, where the first frame is shown and that frame is 8 ms older. This is a little simplified, but that is the idea. The other part is Vsync basically cutting the framerate in half when FPS drops below the monitor frequency, but that is a different discussion.
5:30
TL;DW: Enhanced Sync allows tearing in situations where it is preferable (like at low frame rates) and attempts to eliminate tearing where it is not (at high frame rates).

Enhanced Sync fills in the edge cases in FreeSync--they're meant to complement each other.

If Fast Sync truly works like Enhanced Sync does, then Fast Sync should be enabled when driving any FreeSync monitor.

Frame Rate Target Control limits the maximum frame rate. This is useful depending on circumstances but is not really directly related to Variable Refresh Rate. And it has no effect on what I described above, because those issues occur at the refresh-rate minimum, not the maximum.
Not "limits" (that is v-sync); it paces the card so that it produces approximately as many frames as are needed. 60 fps = targets a new frame every 16.67 ms; 144 fps = targets a new frame every 6.94 ms. Because of targeting, it has less stutter than v-sync, because the graphics card isn't sitting on a frame for potentially 16+ ms.
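That pacing idea can be sketched as a deadline-driven loop (a toy illustration only; `paced_loop` and `render_frame` are made-up names, not AMD's driver API):

```python
import time

def paced_loop(target_fps: float, render_frame, n_frames: int) -> None:
    """FRTC-style pacing sketch: aim to start a new frame every
    1/target_fps seconds instead of letting the GPU run flat out."""
    period = 1.0 / target_fps            # 60 fps -> 16.67 ms, 144 fps -> 6.94 ms
    deadline = time.perf_counter()
    for _ in range(n_frames):
        render_frame()                   # stand-in for the real render work
        deadline += period
        slack = deadline - time.perf_counter()
        if slack > 0:
            time.sleep(slack)            # idle instead of rendering unneeded frames
```

Pacing the start of each frame, rather than holding a finished frame back the way Vsync does, is what keeps the added stutter low in this model.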

I have not seen exact details anywhere, but the problem is probably primarily with running below the refresh range. You are kind of right, though, in that frame doubling as the solution to this problem would lead to trying to run above the range. It has nothing to do with going full retard. This demonstrates - and very much correctly - this specific problem.
AMD fixed it in time. The question is will NVIDIA?
 
Enhanced Sync allows tearing in situations where it is preferable (like at low frame rates) and attempts to eliminate tearing where it is not (at high frame rates).
VRR does not have tearing. If it does, it is a crappy VRR solution.

Not "limits" (that is v-sync), it paces the card so the card is producing approximately as many frames as is needed. 60 fps = targets a new frame every 16.67 ms. 144 fps = targets a new frame every 6.94 ms. Because of targeting, it has less stutter than v-sync.
Frame Rate Target Control is a frame limiter, pure and simple. That is 100% of what it does. It has less stutter than Vsync but also has tearing (which is what VSync is intended to prevent).
There are more appropriate solutions that run with Vsync while the framerate is high and disable Vsync when the framerate drops below the monitor refresh rate - Dynamic Vsync (AMD) or Adaptive Vsync (Nvidia).

AMD fixed it in time. The question is will NVIDIA?
Fixed what? This particular monitor? It is a crappy VRR monitor and should not be used as one.
Nvidia decided from day one that a VRR solution has to work from 0 to the monitor's max refresh rate and made this a requirement. So the VRR FPS range for G-Sync monitors has always started from 0.
AMD did eventually provide a method for this with LFC, but does not require it (well, it does for FreeSync 2, which is a different story).
 
VRR does not have tearing. If it does, it is a crappy VRR solution.
There are more appropriate solutions that run with Vsync while the framerate is high and disable Vsync when the framerate drops below the monitor refresh rate - Dynamic Vsync (AMD) or Adaptive Vsync (Nvidia).
Obviously didn't absorb the wisdom of Scott Wasson. :P

Frame Rate Target Control is a frame limiter, pure and simple. That is 100% of what it does. It has less stutter than Vsync but also has tearing (which is what VSync is intended to prevent).
Unrelated technologies. FRTC paces the production of frames. You need v-sync or enhanced sync to address tearing.

Fixed what? This particular monitor? It is a crappy VRR monitor and should not be used as one.
Reviewers disagree (assuming the alleged model is correct):
https://www.amazon.com/LG-34UM69G-B-34-Inch-UltraWide-Reduction/dp/B06XFXX5JH
https://www.newegg.com/Product/Product.aspx?Item=N82E16824025514

Nvidia decided from day one that a VRR solution has to work from 0 to the monitor's max refresh rate and made this a requirement. So the VRR range for G-Sync monitors has always started from 0.
AMD did eventually provide a method for this with LFC, but does not require it (well, it does for FreeSync 2, which is a different story).
With Enhanced Sync enabled, there should be at most a single tear. Yet what does the guy in the video complain about? "Blinking." Go look at the reviews for the monitor again. How many complaints of blinking are there? None?
[attachment: blinking.png]


It's clear these monitors are fine by FreeSync spec. NVIDIA's just not ready to drive them.


Edit: Found another wrench to throw into the mix: the LG monitor is DisplayPort 1.2, whereas most of those marked "GSYNC Compatible" are DisplayPort 1.4. That might have something to do with NVIDIA struggling to drive it.
 
Obviously didn't absorb the wisdom of Scott Wasson. :p
I have no idea who Scott Wasson is or the relevance here.
https://www.amd.com/en/technologies/frtc said:
Frame Rate Target Control (FRTC) is a new feature we’re introducing with the AMD Radeon™ Fury X graphics card, enabling users to set a target maximum frame rate when playing an application in full screen mode; the benefit being that FRTC can reduce GPU power consumption (great for games running at frame rates much higher than the display refresh rate) and therefore reduce heat generation and fan speeds/noise on the graphics card.
Even AMD is not claiming this to be anything other than a frame limiter. You are right about needing something else for tearing, and that is what I said.
I said crappy VRR monitor and I will stand by my statement.
It's clear these monitors are fine by FreeSync spec.
Oh, I agree. It's just that this stupidly small range is OK by the FreeSync spec.

Edit:
Nvidia is not struggling to drive this monitor. They are unwilling to put their mark - "G-Sync Compatible" in this case - on what they do not think is a good VRR monitor.
The monitor in the video is a showcase of what happens if they drive it as they would drive one with a wide enough refresh-rate range. Probably.
 
I have no idea who Scott Wasson is or the relevance here.
The video you didn't watch. He's a product manager at AMD and founded The Tech Report.
 
What is Enhanced Sync (mimics what G-Sync does < monitor refresh rate)
TL;DW: Enhanced Sync allows tearing in situations where it is preferable (like at low frame rates) and attempts to eliminate tearing where it is not (at high frame rates).
Do you see the contradiction there?

FreeSync/G-Sync address situations up to the monitor refresh rate. Enhanced/Fast Sync address situations above the monitor refresh rate, in the case of both technologies with the specific goal of minimizing input lag. Both Enhanced and Fast Sync are not without downsides, microstutter being the main problem. This is why a common recommendation for both Enhanced and Fast Sync is to have FPS at least 2 times as high as the monitor refresh rate.
 
If you're looking for a high end Freesync monitor then you probably should buy sooner rather than later. I'd expect the price on high end Freesync monitors to increase substantially over the course of the next 1+ year as manufacturers jockey, and pay, for new g-sync compatible branding. Branding is not free.

Hopefully it won't affect mid-range Freesync monitor pricing too much... :cry:
 
Do you see the contradiction there?
If you don't want tearing below minimum refresh rate, enable v-sync.

FreeSync/G-Sync address situations up to the monitor refresh rate. Enhanced/Fast Sync address situations above the monitor refresh rate, in the case of both technologies with the specific goal of minimizing input lag.
Enhanced Sync:
FPS > Hz: sends the most recent completed frame to the monitor
FPS < Hz: sends whatever it has (at most one tear between old frame and new frame)
It does both. Watch the damn video.

Both Enhanced and Fast Sync are not without downsides, microstutter being the main problem.
Not true of Enhanced Sync.

This is why a common recommendation for both Enhanced and Fast Sync is to have FPS at least 2 times as high as monitor refresh rate.
By whom? AMD doesn't give any recommendations for Enhanced Sync because it is designed to deal with all frame rates, fixed sync, and FreeSync.

If you're looking for a high end Freesync monitor then you probably should buy sooner rather than later. I'd expect the price on high end Freesync monitors to increase substantially over the course of the next 1+ year as manufacturers jockey, and pay, for new g-sync compatible branding. Branding is not free.
If NVIDIA demands money, they'll be sold as separate models with separate price structures (not unlike GSYNC now). It only takes one FreeSync monitor maintaining their low cost to make all of the rest fall in line. The monitor market is extremely competitive.
 
If you don't want tearing below minimum refresh rate, enable v-sync.
I would rather use some VRR thing.
But on topic: you said Enhanced Sync mimics what G-Sync does, which is simply wrong. And your later comment, while correct, directly contradicts the first one.

Enhanced Sync:
FPS > Hz: sends the most recent completed frame to the monitor
FPS < Hz: sends whatever it has (at most one tear between old frame and new frame)
It does both. Watch the damn video.
Enhanced Sync does effectively nothing when FPS < Hz.
It has little effect when only one frame is completed per refresh period, which can introduce some microstutter depending on exact timing.
It does awesome when FPS >> Hz, as then the latest frame really is the latest.
 
Great move, Nvidia. Now the next move: drop the prices - RTX Titan to $1,199, RTX 2080 Ti to $649, RTX 2080 to $499, RTX 2070 to $349, RTX 2060 to $249.
 
15th of Jan here already, where's my driiiveeeeeeers
 