
NVIDIA: G-SYNC Validation Runs into 94% failure rates

The thing with this comparison is that this is very clearly not the same panel. In that comparison the AG241QG is objectively the worse monitor, and it has nothing at all to do with the VRR method. They barely test G-Sync/Freesync beyond stating that it works. Note that the AG241QX is also an excellent Freesync monitor, with a 30-144 Hz VRR range, making it one of the handful (maybe 20) out there.

This is actually on topic - the AG241QX is also G-Sync Compatible, one of those 28 out of 503.

When talking about panel quality here, Nvidia does not look too much into actual image quality. A lot of the VRR monitors are gaming TN panels, so that would not make much sense. They are looking at the parameters related to VRR - sufficient range, adaptive overdrive, etc.
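For anyone wanting to sanity-check their own monitor against that kind of criterion, here's a minimal sketch. The 2.4:1 ratio is my assumption (it's the figure commonly quoted for reliable frame doubling / LFC), not something pulled from Nvidia's published checklist:

```python
# Quick sanity check of a monitor's VRR window. The 2.4:1 minimum ratio
# is an assumption on my part (the figure commonly quoted for reliable
# frame doubling / LFC), not a confirmed item from Nvidia's checklist.

def vrr_range_ok(vrr_min_hz, vrr_max_hz, min_ratio=2.4):
    """Return True if the VRR window is wide enough for frame doubling."""
    return vrr_max_hz / vrr_min_hz >= min_ratio

print(vrr_range_ok(30, 144))  # True  - e.g. the AG241QX's 30-144 Hz window
print(vrr_range_ok(48, 60))   # False - an illustrative narrow budget window
```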
 

They both use the same AUO M238DTN01.0 panel.
 
LOL, NVIDIA picked out the best 6 percent that are guaranteed to work as well as those sold at a premium, and people moan.
 
They both use the same AUO M238DTN01.0 panel.
What is it then: was that specific monitor crappy, or is the variability in panel quality really this huge? One monitor has 2/3 the contrast and twice the black level of the other. Color accuracy also differs by a factor of two, both at default and calibrated. Still, none of this has anything to do with the VRR method used.
 
If my monitor can display fine with Freesync/Adaptive Sync, why can't it display Nvidia's implementation?

Blah. Just blah. (I settled for 144Hz, 120FPS cap via RTSS)
 
This whole validation feels like a shitty marketing affair. It's an appeal that's supposed to say "AMD FREESYNC MONITORS DON'T MATCH OUR STANDARDS AND ARE INFERIOR!".
 
If my monitor can display fine with Freesync/Adaptive Sync, why can't it display Nvidia's implementation?
It can. Anything with DP Adaptive Sync will work with Nvidia cards now.
The failure rate (what a stupid clickbait headline) is for G-Sync Compatible certification. Nvidia certifies that these monitors work according to the standard it set. Yes, this is far stricter than Freesync, where the technical threshold for being Freesync is low and AMD does not even bother pursuing manufacturers who advertise Freesync on monitors without checking with them.
This whole validation feels like a shitty marketing affair. It's an appeal that's supposed to say "AMD FREESYNC MONITORS DON'T MATCH OUR STANDARDS AND ARE INFERIOR!".
It is a shitty marketing affair. However, your sentence with caps is quite accurate.
 
One thing though, TH didn't test the power consumption of the two monitors. I'd like to know how much less power the G-Sync one draws with the module, as that is the most important thing when buying a monitor. You may be paying less now, but in the future your electric bill may surpass the price difference between a G-Sync and a Freesync monitor. :rolleyes:
Is there a difference? I have to say, I don't think a lot of people look at power consumption for monitors... I don't know if there is any difference. If anything, I'd guess G-Sync uses more due to the hardware onboard to run it?
 
I'm pretty sure my monitor fails on VRR... I don't think it goes low enough.
I've noticed that it goes line-ish and blocky under 30 fps. The good part is that the only time it happens is with benchmarks... games never drop below 45 fps.
 
If my monitor can display fine with Freesync/Adaptive Sync, why can't it display Nvidia's implementation?

Blah. Just blah. (I settled for 144Hz, 120FPS cap via RTSS)

Again, more than half failed just because they didn't have a good enough VRR range for nVidia. It isn't that they won't work, it's just that nVidia requires more than a 15 Hz range.

This whole validation feels like a shitty marketing affair. It's an appeal that's supposed to say "AMD FREESYNC MONITORS DON'T MATCH OUR STANDARDS AND ARE INFERIOR!".

Yeah, that's exactly what it is supposed to say, because that's the truth.
 
This whole validation feels like a shitty marketing affair. It's an appeal that's supposed to say "AMD FREESYNC MONITORS DON'T MATCH OUR STANDARDS AND ARE INFERIOR!".
You can view the list here: https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/
My monitor is Freesync and is listed (XG270HU; they still haven't corrected it, but it's 2560x1440). I just picked this one for the specs: 27", 144 Hz, colors, white & black levels, everything that suits my taste, but I had no intention of using Freesync at all. I guess they just throw out the garbage ones we can see on the market... like the dozen monitors (curved, IPS, VA, etc., from many manufacturers) I returned before giving Acer a try and getting this model years ago.
 
This whole validation feels like a shitty marketing affair. It's an appeal that's supposed to say "AMD FREESYNC MONITORS DON'T MATCH OUR STANDARDS AND ARE INFERIOR!".
And on the flip side, if NVIDIA didn't do this and offered G-Sync on FS monitors, you would have 94% of people who tried it complaining about it... is that really what you people want?
 
Is there a difference? I have to say, I don't think a lot of people look at power consumption for monitors... I don't know if there is any difference. If anything, I'd guess G-Sync uses more due to the hardware onboard to run it?

I think the emoji he used was a giveaway. G-Sync uses what Nvidia's Tom Petersen called a look-aside buffer, which is what the memory on the module is used for; 4K doubles the memory on the module. Some models (the HDR ones) use an active fan solution. One of the YouTubers asked at Computex if they (I think it was Acer or Asus) were going to improve on the noise; a possible future revision.
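Rough numbers on the "4K doubles the mem" point, assuming a plain 24-bit frame held in the look-aside buffer (the module's actual memory layout is not something I can confirm here):

```python
# Back-of-the-envelope for the "4K doubles the mem" remark, assuming a
# plain 24-bit (3 bytes/pixel) frame held in the look-aside buffer.

def frame_bytes(width, height, bytes_per_pixel=3):
    return width * height * bytes_per_pixel

qhd = frame_bytes(2560, 1440)   # ~11.1 MB per buffered frame
uhd = frame_bytes(3840, 2160)   # ~24.9 MB per buffered frame
print(f"QHD {qhd / 1e6:.1f} MB, UHD {uhd / 1e6:.1f} MB, ratio {uhd / qhd:.2f}x")
```

A 4K frame is about 2.25x the pixels of a 1440p frame, so roughly doubling the buffer memory lines up.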
 
Shouldn't Nvidia's GPUs need to be Adaptive Sync (Freesync) compatible, as those monitors already passed the Adaptive Sync (Freesync) compatibility test??
I think Nvidia is trying to give Adaptive Sync (Freesync) a bad name when their superior GPU architecture(!!) doesn't support adaptive sync properly. Maybe that's why Nvidia needs an external module (that consumes more power)!!! :rolleyes:

Like many things, the view depends very much on where you are sitting. If you are spending $125 - $200 on your GFX card, you are facing an entirely different situation than when you are spending $500 - $700. The problem is, you can't take a $200 mindset into a $500+ card monitor evaluation. To rephrase the words of the indomitable Inigo Montoya ... when you say "G-Sync" and "hardware module" ... I have to respond ... "those words don't mean what you think they mean."

After all these years ... the illusion that G-Sync and Freesync are two names for "the same thing" remains. I still shake my head every time I hear this ... my wife still thinks she drives a 4WD SUV when she actually has an AWD, and no amount of explanation will deter her from this viewpoint despite the fact that I have explained it dozens of times, shown her the written articles and ... towed her 4 times already with my 4WD SUV when she gets stuck in the snow. It's not a perfect analogy, but it's the best I can think of at the moment.

-G-Sync = 4WD - Toyota Land Cruiser / Jeep Wrangler (sends power to all four wheels equally, with locking differentials and without vectoring ... each wheel will spin at the same constant rate as all the others). The vehicles are normally in 2WD or AWD mode ... to get to 4WD, you stop and switch to 4WD locked mode. Pick one or the other.

-Freesync = AWD - Honda CRV / Hyundai Tucson ... no lock, no switching, the system does the best it can with what it's got. Both can run in AWD mode; only true 4WD can switch between the modes, which are mutually exclusive.

A. Adaptive Sync - When initially tested against one another, the two methods of Adaptive Sync were described in reviews as follows:

a) Freesync works extremely well from 40 - 60 fps ... above 60 fps it still performed well, but the impact on the user experience starts to tail off.

b) G-Sync works extremely well from 30 - 60 fps ... above 60 fps it still performed well, but the impact on the user experience starts to tail off.
Nvidia's "external module" for one, is not "external" .... and t has nothing to do with adaptive sync. When you use "G-Sync", the hardware module is disabled .... non-functional ... doing nothing .... might as well not be there.

B. Motion Blur Reduction (MBR) - G-sync monitors also include a hardware module that provides MBR technology called Ultra Low Motion Blur (ULMB). What this does is strobe the backlight to eliminate motion blur. You can see the effect here:


In the first part of the video you see a flash-card-like display similar to an old alarm clock; as it changes from 07 to 08 to 09 in alternating black and white, each new number appears while the old number's image is still on screen. What ULMB does is turn off the backlight between frames to get rid of the ghosted images. When nVidia released G-Sync, AMD took the "almost as good but cheaper" marketing route. And it is a great approach in the lower priced segments. With that approach, including the strobing hardware module was out of the question, or the cost argument disappears. So Freesync provided the adaptive sync part but skipped the hardware part ... so Freesync users have no MBR option ... at least as part of the "Freesync package".
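To put the strobing idea into numbers - a minimal sketch, where the 1.5 ms pulse width is purely illustrative rather than a measured ULMB figure:

```python
# Why strobing reduces blur: on a sample-and-hold LCD each frame stays lit
# for the whole refresh interval, so the eye smears it while tracking motion.
# Strobing only lights the image for the pulse width. The 1.5 ms pulse is
# purely illustrative, not a measured ULMB value.

def persistence_ms(refresh_hz, strobe_pulse_ms=None):
    frame_time = 1000.0 / refresh_hz
    return frame_time if strobe_pulse_ms is None else strobe_pulse_ms

hold = persistence_ms(120)          # ~8.3 ms of visible persistence
strobed = persistence_ms(120, 1.5)  # ~1.5 ms
print(f"sample-and-hold {hold:.1f} ms vs strobed {strobed:.1f} ms "
      f"(~{hold / strobed:.1f}x less perceived blur)")
```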

To counter that deficiency, many vendors included their own hardware module / strobing technology (ASUS ELMB, BenQ DyAc, BenQ Blur Reduction, EIZO Turbo 240, EIZO Blur Reduction, LG Motion 240, LG Blur Reduction and Samsung's equivalent) on some of those models; these were usually close in cost to the corresponding nVidia G-Sync models from the same manufacturers. However, naive consumers went along believing Freesync = G-Sync and never looked beyond the cost savings or why MBR matters, eschewing the more expensive Freesync models that included separate MBR technology. As a result, over time, the number of Freesync models so equipped has diminished. And again, this is just fine in the lower priced segments ... if you are producing 30 - 75 fps in the games you play, you have little to gain from a G-Sync monitor. So where does it matter?

Again, G-Sync adaptive sync and ULMB are mutually exclusive ... you cannot use both at the same time. If you have G-Sync on and want to use MBR technology, you have to TURN G-SYNC OFF in order to use ULMB. To use G-Sync you have to turn off ULMB. In practice ... the point at which you might like to use ULMB instead of G-Sync is between 60 and 80 fps.
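Boiled down, that decision rule looks something like this minimal sketch; the 60 and 80 fps cut-offs are just the rough crossover mentioned above, not anything Nvidia specifies:

```python
# Minimal sketch of the rule of thumb above: the 60 / 80 fps cut-offs are
# just the rough crossover mentioned in the text, not an Nvidia spec.

def pick_mode(avg_fps, low=60, high=80):
    if avg_fps < low:
        return "G-Sync / Freesync"      # VRR does the most good here
    if avg_fps <= high:
        return "toss-up - try both"     # they're mutually exclusive, pick one
    return "ULMB (G-Sync off)"          # fast enough that blur matters more

for fps in (45, 70, 110):
    print(fps, "->", pick_mode(fps))
```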

Say, for example, I have an MSI RTX 2080 OC'd to match TPU's test results.

-At 1080p, all but 1 game will see over 100 fps ... and the last is 88.7 ... wouldn't use adaptive sync here at all ... this is territory where most users will prefer ULMB over G-Sync / Freesync.

-At 1440p, all but 6 games are over 100 fps ... and 6 between 80 and 100, with Ghost Recon Wildlands @ 70.8 ... this is territory where most users will prefer ULMB over G-Sync / Freesync for all games except maybe GRW ... would recommend trying both and seeing which you like better.

-At 2160p, only 4 games are over 100 fps ... and 7 more between 80 and 100 ... out of the 12 left, I would play 7 of them using adaptive sync. So, out of 23 games...
... 11 games => ULMB
... 5 Games => Tossup between ULMB and Adaptive sync
... 7 Games => G-sync / Freesync
In short ... you're in a better position sitting with a G-Sync monitor for 16 of the 23 games ... for the other 7, G-Sync / Freesync wouldn't matter ... however, in those games AMD's best card is running 4 - 12 fps behind among those 38 - 60 fps games.

Again, obviously this example is in a price / performance niche in which AMD doesn't compete well ... hopefully that will change in the fall. But as we get down to the 1650 / 570 level ... this is a "whole 'nother ball game" ... most games are going to be in that 30 - 70 fps range, in which case turning off adaptive sync and using MBR technology is not a realistic option.

In summary ... it's hard to make an argument that the G-Sync / Freesync decision matters in the lower budget ranges. Unless you can consistently stay above 60 fps at your resolution, it doesn't. However, if you are selecting a monitor capable of 100 Hz or more ... and a card that can drive your system up around 75 fps half the time or more ... then it's hard for me to see current AMD Freesync options as an attractive solution.
 
And on the flip side, if NVIDIA didn't do this and offered G-Sync on FS monitors, you would have 94% of people who tried it complaining about it... is that really what you people want?
They would be stuck with a ton of support requests for monitors that don't work as intended, which people would instantly blame them for.


1 monitor after, what, 4 years now, and even then I doubt 99% of people would even notice the difference. Looking at the specs, they both list the same specs, so is it really that FS is better, or is it just the silicon lottery in play and the FS monitor got a slightly better panel off that production line?
 
Yeah right, factory calibrated, yet counted as the silicon lottery in play. I'm sure 99% of the population wouldn't even notice the color quality, whereas frame rates outside the test window should immediately disqualify the weak... /S
 
1 monitor after, what, 4 years now, and even then I doubt 99% of people would even notice the difference. Looking at the specs, they both list the same specs, so is it really that FS is better, or is it just the silicon lottery in play and the FS monitor got a slightly better panel off that production line?

Seems like the default is to blame everything other than...

It's silly to think only one side would have to deal with panel deviation issues, especially the one side that has fewer resources and less collaboration. Doesn't speak too well for the G-Sync cert process they've been marketing and sponsoring videos on.

When it comes to power consumption, the ASUS ROG Swift PG278Q does use slightly more power than your typical 27-inch TN panel due to the G-Sync module being used.

We saw the standby power being roughly 10.2 Watts with the system in sleep mode or totally turned off. We expected this measurement to be less than 0.5 Watts like the ASUS specifications said, but that wasn't the case on our unit.
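To put that 10.2 W standby figure into electric-bill terms - a rough back-of-the-envelope, with the $0.12/kWh rate being an illustrative assumption (plug in your own tariff):

```python
# Putting the 10.2 W standby reading into electric-bill terms.
# The $0.12/kWh rate is an illustrative assumption - use your own tariff.

standby_w = 10.2
kwh_per_year = standby_w * 24 * 365 / 1000      # ~89 kWh
cost_per_year = kwh_per_year * 0.12             # ~$10.7
print(f"{kwh_per_year:.0f} kWh/year, about ${cost_per_year:.2f}/year at $0.12/kWh")
```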
 
Doesn't speak too well for the G-Sync cert process they've been marketing and sponsoring videos on, if it's dependent on the panel lottery to be better than the competition. It's silly to think only one side would have to deal with panel deviation issues, especially the one side that has fewer resources and collaboration.
Don't discount the fact that the G-Sync module overrides panel OSD settings - whoever wrote that nonsense didn't just pick image specifications at random.
 
I wonder what makes them fail the test, as that's a lot of screens not making the grade...

EDIT - (Taken from the Guru3D link, text at the bottom)
To sum it up:

  • 28 passed
  • 202 failed for image quality or “other” issues
  • 273 failed for having insufficient Variable Refresh Rate ranges

I guess they have some rather strict 'rules' for what they require... Shame
They're not being strict at all. It was AMD that has been lax so far (trying to gain market share) and let manufacturers slap FreeSync stickers on everything, even when FreeSync was supported for such a narrow range of frequencies it never activated in practice.
The image quality issues were flickering or blank screens, but so far we have no clue whether that's the fault of the monitor or of the video card.
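For what it's worth, the headline number falls straight out of that breakdown - a quick check using only the figures quoted above:

```python
# The headline "94%" falls straight out of the Guru3D breakdown quoted above.

tested = 503
passed = 28
failed_image_or_other = 202
failed_vrr_range = 273

assert passed + failed_image_or_other + failed_vrr_range == tested
print(f"pass rate:    {passed / tested:.1%}")                         # ~5.6%
print(f"failure rate: {(tested - passed) / tested:.1%}")              # ~94.4%
print(f"failed on VRR range alone: {failed_vrr_range / tested:.1%}")  # ~54.3%
```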

Yes, but only DisplayPort gives me the option even though VRR can work over the HDMI port for my monitor.

Ah, that confusion again. VRR only works over HDMI 2.1, which no card currently supports. It's FreeSync that works some magic to do VRR over older HDMI revisions.
(This is just about HDMI. We all know DP standardized VRR a long time ago.)
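As a cheat sheet for which VRR flavour runs over which connection (reflecting the situation described above, i.e. no HDMI 2.1 cards yet - treat it as a summary, not a spec):

```python
# Cheat sheet for which VRR flavour runs over which connection, as described
# above - a summary of the situation at the time (no HDMI 2.1 GPUs yet).

vrr_over_connection = {
    "DisplayPort Adaptive-Sync": "DP 1.2a and later (VESA standard)",
    "FreeSync over HDMI":        "pre-2.1 HDMI via AMD's vendor extension",
    "HDMI Forum VRR":            "HDMI 2.1 only",
}

for tech, where in vrr_over_connection.items():
    print(f"{tech:>27}: {where}")
```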
 