
NVIDIA: G-SYNC Validation Runs into 94% failure rates

Ever since NVIDIA opened up Adaptive-Sync in their drivers as "G-Sync Compatible," display vendors can get a G-Sync label through validation.
In order to get that G-Sync label, monitor manufacturers need to submit their display to NVIDIA for testing.

As it turns out, quite a number of the monitors submitted to NVIDIA do not pass the certification test. NVIDIA obviously sets the criteria for these tests, but roughly 94 percent of the 503 models tested so far have failed.




 
I wonder what makes them fail the test, as that's a lot of screens not making the grade...

EDIT - (Taken from the Guru3D link, text at the bottom)
To sum it up:

  • 28 passed
  • 202 failed for image quality or “other” issues
  • 273 failed for having insufficient Variable Refresh Rate ranges
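
The quoted breakdown is enough to sanity-check the headline number; a quick back-of-the-envelope using only the figures from the bullets above:

```python
# Pass/fail breakdown from the Guru3D summary quoted above
passed = 28
failed_quality = 202   # image quality or "other" issues
failed_vrr = 273       # insufficient Variable Refresh Rate range

total = passed + failed_quality + failed_vrr
fail_rate = (failed_quality + failed_vrr) / total * 100
print(total)                # 503 monitors tested
print(round(fail_rate, 1))  # 94.4 percent failed
```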

I guess they have some rather strict 'rules' for what they require... Shame
 
Cash deposit to nVidia not found, validation failed! :laugh:
 
Ever since NVIDIA opened up Adaptive-Sync in their drivers as "G-Sync Compatible," display vendors can get a G-Sync label through validation.
In order to get that G-Sync label, monitor manufacturers need to submit their display to NVIDIA for testing.

As it turns out, quite a number of the monitors submitted to NVIDIA do not pass the certification test. NVIDIA obviously sets the criteria for these tests, but roughly 94 percent of the 503 models tested so far have failed.




Shouldn't NVIDIA's GPUs need to be Adaptive-Sync (FreeSync) compatible, since those monitors already passed the Adaptive-Sync (FreeSync) compatibility test??
I think NVIDIA is trying to give Adaptive-Sync (FreeSync) a bad name because their superior GPU architecture(!!) doesn't support adaptive sync properly. Maybe that's why NVIDIA needs an external module (that consumes more power)!!! :rolleyes:
 
I guess they have some rather strict 'rules' for what they require... Shame
Is it though? Or would you (consumers) like the experience to be as they intended?

I mean, we would all like a lot of these monitors to pass, I get that; however, I understand why the 'rules' are in place and think it's a GOOD thing to have them, as opposed to it being 'a shame' they are in place. :)
 
I guess they have some rather strict 'rules' for what they require... Shame

I agree, f*ck quality control, the consumer should just accept crap quality monitors and be happy they even get that.
 
Sounds like Nvidia is paving the way for their higher prices. If it works with card A but not card B, they blame the monitor, when there is already an industry standard that card A meets but card B doesn't.
 
I like the spin Nvidia applied here.

First, make it seem like they're 'almost compatible'
Then, come out concluding they're not

We've been taken for a ride here and the message is: buy Gsync.

How about this: stick it up your arse and comply with standards, because now we do know that you can.

Is it though? Or would you (consumers) like the experience to be as they intended?

I mean, we would all like a lot of these monitors to pass, I get that; however, I understand why the 'rules' are in place and think it's a GOOD thing to have them, as opposed to it being 'a shame' they are in place. :)

Rules are good if we would see Nvidia push for an improvement to FreeSync, since that is the de facto standard now. Now, rules are just being used as an extension of marketing.
 
Don't care, they have the option in the Nvidia control panel, just turn it on and everything is good.
 
I like the spin Nvidia applied here.

First, make it seem like they're 'almost compatible'
Then, come out concluding they're not

We've been taken for a ride here and the message is: buy Gsync.

How about this: stick it up your arse and comply with standards, because now we do know that you can.



Rules are good if we would see Nvidia push for an improvement to FreeSync, since that is the de facto standard now. Now, rules are just being used as an extension of marketing.

I think you are completely missing the point here: you can still use G-Sync with the failed monitors if you want. The G-Sync qualification is just a quality control scheme.

More than half the monitors failed because they didn't have a good enough VRR range, and that's a quality issue with the monitor, not nVidia.

One would hope the monitor manufacturers would take the quality rules that nVidia puts in place and attempt to make their monitors meet those standards. But with AMD slapping a FreeSync label on any garbage monitor it tests that can do VRR, with no other quality standards, I don't see that happening. FreeSync is only the de facto standard because it takes pretty much no effort to achieve FreeSync certification. All a monitor maker has to do is support the free Adaptive-Sync standard already built into DisplayPort, and AMD gives them the sticker to put on their monitor.
 
I think you are completely missing the point here: you can still use G-Sync with the failed monitors if you want. The G-Sync qualification is just a quality control scheme.

More than half the monitors failed because they didn't have a good enough VRR range, and that's a quality issue with the monitor, not nVidia.

One would hope the monitor manufacturers would take the quality rules that nVidia puts in place and attempt to make their monitors meet those standards. But with AMD slapping a FreeSync label on any garbage monitor it tests that can do VRR, with no other quality standards, I don't see that happening. FreeSync is only the de facto standard because it takes pretty much no effort to achieve FreeSync certification. All a monitor maker has to do is support the free Adaptive-Sync standard already built into DisplayPort, and AMD gives them the sticker to put on their monitor.

No, I'm not missing that. We already knew that FreeSync monitors had varying VRR ranges from the moment they launched. Nvidia is stating the obvious, and the move should have been a different one if they had any goal of 'enabling G-Sync'.

This is the exact same type of spin as giving Pascal cards RTX support. Act like the big enabler for us gamers, but all it really is, is marketing to push an expensive product. It's not entirely without advantages; I mean, it's cool they enable it in the first place, sure.

The obvious question here is who benefits. If Nvidia wanted US to benefit from a supposedly generous move on their part, they would have partnered with AMD and sought to improve FreeSync, since it's part of a standard now, to get monitors to a quality standard. You know, a bit like taking part in Vulkan development.

Then again, like you say, reading this, it's 'phase 1'. If the next intended phase is a round table with manufacturers about those results and how to improve them structurally, I am all ears and convinced Nvidia is doing good here. As it is, they have a pretty strong argument now.
 
Is it though? Or would you (consumers) like the experience to be as they intended?

I mean, we would all like a lot of these monitors to pass, I get that; however, I understand why the 'rules' are in place and think it's a GOOD thing to have them, as opposed to it being 'a shame' they are in place. :)

It's a shame that the specs for G-Sync are (as it may seem) too high for most monitors, but in complete honesty, who would go for a G-Sync panel over a FreeSync panel because of the requirements it needs? I wish I knew the difference between the two, as I've never had a high-refresh-rate monitor yet and never owned a FreeSync or G-Sync panel... Would you be able to tell the difference between the two?

I agree, f*ck quality control, the consumer should just accept crap quality monitors and be happy they even get that.

Well, we do put up with it from Intel (well, kinda)... But I digress...

Another question would be: are the G-Sync requirements actually pushing monitor manufacturers to make better panels, or are they just like, nah, we'll do what we want instead? ... Just a thought...
 
Can't you still enable any monitor manually?

What's the issue, then?
 
Can't you still enable any monitor manually?

What's the issue, then?
Yes, but only DisplayPort gives me the option even though VRR can work over the HDMI port for my monitor.



I've never had a high-refresh-rate monitor yet
A minor bump from 60 Hz to 75 Hz is noticeable to me. I can pick out when any monitor is refreshing at 60 Hz instead of a higher refresh rate.
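
The difference is easier to appreciate in frame times; going from 60 Hz to 75 Hz shaves over 3 ms off every frame (simple arithmetic, nothing monitor-specific assumed):

```python
# Frame time in milliseconds for common refresh rates
for hz in (60, 75, 144, 165):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
# 60 Hz -> 16.67 ms, 75 Hz -> 13.33 ms, 144 Hz -> 6.94 ms, 165 Hz -> 6.06 ms
```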
 
Yes, but only DisplayPort gives me the option even though VRR can work over the HDMI port for my monitor.
What did we think about proprietary solutions, again?
 
If Nvidia wanted US to benefit from a supposedly generous move on their part, they would have partnered with AMD and sought to improve FreeSync, since it's part of a standard now, to get monitors to a quality standard.

The problem is AMD obviously had no intention, and still doesn't have any intention of making FreeSync better. They just needed a marketing point to counter G-Sync and decided to rebrand a free technology that already existed and then pretend it was just as good as what their competitor was offering when it clearly isn't.
 
The problem is AMD obviously had no intention, and still doesn't have any intention of making FreeSync better. They just needed a marketing point to counter G-Sync and decided to rebrand a free technology that already existed and then pretend it was just as good as what their competitor was offering when it clearly isn't.

Mobile G-sync says, "Hi"

People forget the eDP standard already had these abilities before either technology existed. It was the desktop side that was slow to adapt. Nvidia used the DP AUX channel (per the Nvidia Tom Petersen interview) to achieve the communication on desktop (it disabled any audio pass-through). AMD showed off the eDP feature a few months later on a laptop and said they wanted to bring it to desktop through industry standards.
 
So the golden rule is: Buy a Freesync monitor that is G-Sync compatible.

All credit to Nv for weeding out the good from the trash.
 
Shouldn't NVIDIA's GPUs need to be Adaptive-Sync (FreeSync) compatible, since those monitors already passed the Adaptive-Sync (FreeSync) compatibility test??
I think NVIDIA is trying to give Adaptive-Sync (FreeSync) a bad name because their superior GPU architecture(!!) doesn't support adaptive sync properly. Maybe that's why NVIDIA needs an external module (that consumes more power)!!! :rolleyes:
Consider that an "adaptive sync" monitor can have a 45-60 Hz range and still be validated as Adaptive-Sync. The spec is pretty vague and wide open as to what can pass; NVIDIA has always had strict standards for what it wants a panel to do. Most people forget how bad FreeSync monitors were in the first 6-12 months they were out. Even after that, there were still ones that were bad.
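
One concrete reason a narrow window like 45-60 Hz is a problem: Low Framerate Compensation (doubling frames when FPS drops below the VRR floor) only works when the ceiling is at least about twice the floor. A small sketch; the 2x ratio here is the commonly cited LFC rule of thumb, not NVIDIA's published pass criterion:

```python
def supports_lfc(vrr_min, vrr_max, ratio=2.0):
    """Frame doubling below vrr_min only helps if the doubled
    refresh rate still lands inside the VRR window."""
    return vrr_max >= ratio * vrr_min

print(supports_lfc(45, 60))    # False: the 45-60 Hz example above
print(supports_lfc(48, 144))   # True: a typical 48-144 Hz gaming range
```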

Sounds like Nvidia is paving the way for their higher prices. If it works with card A but not card B, they blame the monitor, when there is already an industry standard that card A meets but card B doesn't.
It will still work on Nvidia cards, but if there are problems, Nvidia isn't going to deal with supporting it, since they already tested it and saw it didn't meet the standards it needs to.

Rules are good if we would see Nvidia push for an improvement to FreeSync, since that is the de facto standard now. Now, rules are just being used as an extension of marketing.
Nvidia will push the makers to use better quality parts and better software, instead of grabbing whatever cheap panel they can find, throwing it all together in a box, and selling it. Yes, it will cost more because parts cost more, but you get a higher quality monitor in the end, not the cheapest trash parts they could find.

One would hope the monitor manufacturers would take the quality rules that nVidia puts in place and attempt to make their monitors meet those standards. But with AMD slapping a FreeSync label on any garbage monitor it tests that can do VRR, with no other quality standards, I don't see that happening. FreeSync is only the de facto standard because it takes pretty much no effort to achieve FreeSync certification. All a monitor maker has to do is support the free Adaptive-Sync standard already built into DisplayPort, and AMD gives them the sticker to put on their monitor.
If a monitor had issues, AMD would jump through hoops to fix it in their drivers instead of asking the monitor maker to fix it on their end. That leads to a lack of care on the monitor companies' side to fix what could be legitimate issues.

The obvious question here is who benefits. If Nvidia wanted US to benefit from a supposedly generous move on their part, they would have partnered with AMD and sought to improve FreeSync, since it's part of a standard now, to get monitors to a quality standard. You know, a bit like taking part in Vulkan development.
You mean the software AMD kept locked to their cards for the longest time, used to be called Mantle, whose source they never released even when they claimed they would? A year after they claimed they would open-source it, still no source; then they ended the project and handed it all over to the Khronos Group. Intel tried to partner with them on it and got nothing, so yeah.

The problem is AMD obviously had no intention, and still doesn't have any intention of making FreeSync better. They just needed a marketing point to counter G-Sync and decided to rebrand a free technology that already existed and then pretend it was just as good as what their competitor was offering when it clearly isn't.
AMD put the spec out there and left it up to monitor makers to figure everything out, hence why the first few years of FreeSync monitors were mostly trash: makers only had a few months to figure any of it out before putting a monitor on sale. We're not even sure how long Nvidia worked on G-Sync before launching it, but surely they had a few years into it.
 
Yes, but only DisplayPort gives me the option even though VRR can work over the HDMI port for my monitor.
That's because AMD supports HDMI VRR via an unofficial hack on the HDMI standard. Nvidia doesn't want to or can't implement it.
If you want true HDMI VRR, you need true HDMI 2.1 support.
 
That's because AMD supports HDMI VRR via an unofficial hack on the HDMI standard. Nvidia doesn't want to or can't implement it.
If you want true HDMI VRR, you need true HDMI 2.1 support.
Which is why Nvidia opened up to Adaptive-Sync. People tend to forget AMD has put one foot in front of the other by launching FreeSync 2; they now have LFC and HDR10 at their disposal. How far into the future are we before that becomes obsolete next to Nvidia's Dolby Vision support? AMD has played it safe and struck nice deals.
 
Nvidia will push the makers to use better quality parts and better software, instead of grabbing whatever cheap panel they can find, throwing it all together in a box, and selling it. Yes, it will cost more because parts cost more, but you get a higher quality monitor in the end, not the cheapest trash parts they could find.

Here is a recent comparison on the same panel

TH: Nvidia G-Sync vs. AMD FreeSync: Which Monitors Perform Better?

When we first conceived this experiment, we expected the result to swing in favor of the G-Sync monitor. G-Sync monitors usually have more features and perform a little better in our color and contrast tests. We hypothesized that while you’re paying a premium for G-Sync, those monitors perform slightly better in other areas related to image quality.

Our testing of the AOC Agon AG241QG and AG241QX proved that theory wrong.
Obviously, the FreeSync-based QX is superior in every way except for its maximum refresh rate. But why would one give up contrast and spend an extra $300 just to gain 21Hz that you can’t actually see when playing? Clearly, the AG241QX is the better choice.
 
Mobile G-sync says, "Hi"

People forget the eDP standard already had these abilities before either technology existed. It was the desktop side that was slow to adapt. Nvidia used the DP AUX channel (per the Nvidia Tom Petersen interview) to achieve the communication on desktop (it disabled any audio pass-through). AMD showed off the eDP feature a few months later on a laptop and said they wanted to bring it to desktop through industry standards.

What does any of that have to do with what I said? Mobile G-Sync still required a panel that met the quality standards of G-Sync.

AMD just took the Adaptive-Sync standard already in DP and rebranded it for themselves, with no real quality standards.

If a monitor had issues, AMD would jump through hoops to fix it in their drivers instead of asking the monitor maker to fix it on their end. That leads to a lack of care on the monitor companies' side to fix what could be legitimate issues.

HAHAHAHA, you have to be joking, right? Have you been paying attention? AMD doesn't care about the quality of FreeSync. Half the monitors they certified don't even have a good VRR range, which is why they failed nVidia's qualification. There isn't anything AMD can do about that in drivers. And they care even less about the IQ issues that show up when VRR is activated.

AMD put the spec out there and left it up to monitor makers to figure everything out, hence why the first few years of FreeSync monitors were mostly trash: makers only had a few months to figure any of it out before putting a monitor on sale. We're not even sure how long Nvidia worked on G-Sync before launching it, but surely they had a few years into it.

AMD didn't put any spec out there; they just used the DP Adaptive-Sync standard that was already in place and rebranded it. Yes, adaptive sync was originally only in eDP, but the fact is AMD took a tech that already existed and rebranded it, and it really relied on VESA adding the standard to desktop DP. If that hadn't happened, AMD would have been f'd. AMD didn't develop anything, and they put minimal effort into it. They also didn't give a shit about quality; they slapped FreeSync labels on any trash monitor, which you pointed out.
 
Hehe, that's the beauty of this: more choice for customers. They get to choose between the AOC Agon AG241QG and AG241QX, no need to pay a premium.

AMD left the door wiiiiiiiiiiiiiiide open for Nvidia.
 