Tuesday, January 19th 2021

NVIDIA Quietly Relaxes Certification Requirements for NVIDIA G-SYNC Ultimate Badge

UPDATED January 19th 2021: NVIDIA, in a statement to Overclock3D, had this to say on the issue:
Late last year we updated G-SYNC ULTIMATE to include new display technologies such as OLED and edge-lit LCDs.

All G-SYNC Ultimate displays are powered by advanced NVIDIA G-SYNC processors to deliver a fantastic gaming experience including lifelike HDR, stunning contrast, cinematic colour and ultra-low latency gameplay. While the original G-SYNC Ultimate displays were 1000 nits with FALD, the newest displays, like OLED, deliver infinite contrast with only 600-700 nits, and advanced multi-zone edge-lit displays offer remarkable contrast with 600-700 nits. G-SYNC Ultimate was never defined by nits alone nor did it require a VESA DisplayHDR1000 certification. Regular G-SYNC displays are also powered by NVIDIA G-SYNC processors as well.

The ACER X34 S monitor was erroneously listed as G-SYNC ULTIMATE on the NVIDIA web site. It should be listed as "G-SYNC" and the web page is being corrected.
NVIDIA has silently updated its NVIDIA G-SYNC Ultimate requirements compared to its initial assertion. Born as a spin-off from NVIDIA's G-SYNC program, whose requirements have likewise been relaxed from the original mandate of a custom, expensive G-SYNC module incorporated into monitor designs, the G-SYNC Ultimate badge is supposed to denote the best of the best in the realm of PC monitors: those that feature NVIDIA's proprietary G-SYNC module alongside a VESA DisplayHDR 1000-certified panel. This stands opposed to NVIDIA's current G-SYNC Compatible program (for monitors without the G-SYNC module that support VESA's VRR standard for variable refresh rates) and the G-SYNC program (for monitors that feature the G-SYNC module but may be laxer in their HDR support).

The new, silently edited requirements drop the DisplayHDR 1000 certification requirement; instead, NVIDIA now only requires "lifelike HDR" capabilities from monitors that receive the G-SYNC Ultimate badge - whatever that means. The fact of the matter is that at this year's CES, MSI's MEG 381CQR and LG's 34GP950G were announced with an NVIDIA G-SYNC Ultimate badge - despite "only" featuring DisplayHDR 600 certifications from VESA. This certainly complicates matters for users, who previously only had to check for the Ultimate badge to know they were getting the best of the best when it comes to gaming monitors (as per NVIDIA's guidelines). Now, those users are back to perusing spec lists to find out whether a particular monitor has the characteristics they want (or perhaps require). It remains to be seen whether previously released monitors that shipped without the G-SYNC Ultimate certification will now be retroactively certified - and if I were a monitor manufacturer, I would certainly demand that for my products.
Sources: PC Monitor, via Videocardz

36 Comments on NVIDIA Quietly Relaxes Certification Requirements for NVIDIA G-SYNC Ultimate Badge

#27
Randomoneh
bug: You can't meet the contrast required for HDR without local dimming on LCD.
I see no contrast ratio requirements here.

edit: OK, I see something here but I'd have to check if those can be achieved without local dimming.
Posted on Reply
#28
junglist724
bug: You can't meet the contrast required for HDR without local dimming on LCD. That said, rtings did not actually find lowered brightness in HDR mode; they found it's about the same: www.rtings.com/monitor/reviews/samsung/odyssey-g9
But that monitor is an abomination even before factoring in the lame local dimming implementation.
It's not the brightness that was lowered in HDR mode; it's the contrast ratio that was lowered, because increasing a zone's brightness also raises the blacks on that 1/10th of the screen. Here's a quote straight from the contrast section of the rtings review.
The local dimming doesn't seem to do much in SDR, as the contrast remains almost unchanged. That said, it's much lower with local dimming enabled in HDR, as the large vertical dimming zones also light up the black squares in our test pattern, causing the contrast to drop to 446:1. In regular content, the contrast is expected to drop with local dimming enabled.
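To put rough numbers on that effect, here is a minimal Python sketch; the luminance values are assumptions chosen to land near rtings' figures, not measurements taken from the review:

```python
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Contrast is simply average white luminance over average black luminance."""
    return white_nits / black_nits

# Local dimming off: blacks sit near the panel's native level (assumed values).
print(f"native: {contrast_ratio(300, 0.12):,.0f}:1")            # ~2,500:1

# Local dimming on in HDR: the edge-lit zone pushes whites to ~600 nits, but
# the same zone spans the test pattern's black squares, lifting them too.
lifted_black = 600 / 446  # assumed lift consistent with rtings' 446:1 result
print(f"zoned HDR: {contrast_ratio(600, lifted_black):,.0f}:1")  # ~446:1
```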
Randomoneh: I see no contrast ratio requirements here.

edit: OK, I see something here but I'd have to check if those can be achieved without local dimming.
That's exactly the problem with DisplayHDR certification. You need good color volume (color gamut + luminance range) to get a good HDR experience.
Posted on Reply
#29
bug
Randomoneh: I see no contrast ratio requirements here.
Sure you do. You see a min peak brightness and a max black level. That's the contrast.
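In other words, the spec implies a contrast floor: divide a tier's minimum peak luminance by its maximum allowed black level. Here's a minimal sketch of that arithmetic in Python - the inputs below are placeholders for illustration, not values quoted from VESA's DisplayHDR table:

```python
def implied_contrast(min_peak_nits: float, max_black_nits: float) -> float:
    """Contrast floor a tier guarantees: min peak luminance / max black level."""
    return min_peak_nits / max_black_nits

# Illustrative inputs only -- substitute the actual values from the spec table.
print(f"{implied_contrast(600, 0.10):,.0f}:1")  # 6,000:1 with these inputs
```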
junglist724: It's not the brightness that was lowered in HDR mode; it's the contrast ratio that was lowered, because increasing a zone's brightness also raises the blacks on that 1/10th of the screen. Here's a quote straight from the contrast section of the rtings review.
True, but the contrast ratio is not measured strictly within a dimming zone. The point of a dimming zone is to offer better contrast compared to other parts of the screen.
The monitor still sucks, but local dimming is the least of its worries: it's bulky and curved, its static contrast isn't even among the better ones for VA, and it has some of the worst uniformity I have seen - all of which will hit you before local dimming does.
Posted on Reply
#30
Randomoneh
bug: Sure you do. You see a min peak brightness and a max black level. That's the contrast.
No, there's no max black level in the FIRST webpage I linked.

edit: okay, it's at the bottom of the page.
Posted on Reply
#31
bug
Randomoneh: No, there's no max black level in the FIRST webpage I linked.
Maximum Black Level Luminance
Posted on Reply
#32
Vayra86
Randomoneh: No, there's no max black level in the FIRST webpage I linked.
Ctrl+F is your friend. The page is long.
Posted on Reply
#33
bug
Vayra86: Ctrl+F is your friend. The page is long.
Tbh, it wasn't showing up for me at first either, because of NoScript. They updated their website after releasing v1.1; it used to be much more straightforward.
Posted on Reply
#34
Vayra86
Well you can't really blame anyone for not finding something on those DisplayHDR bullshit pages. It's horrible to read a spec like that. Kinda reminds me of that fiasco we had a few weeks back about new PSU certification labels. Even just the introduction of it contained as many as 10 badges or something.

I understand a picture or badge can make it easier...but not when there are as many badges as you can find products...
Posted on Reply
#35
bug
Vayra86: Well you can't really blame anyone for not finding something on those DisplayHDR bullshit pages. It's horrible to read a spec like that.
Well, when reading a spec, you should be prepared to do a little digging. This one is actually easier to read than a full-blown RFC, but yeah, it could be laid out better.
I also like how, since v1.1, they lumped all certified monitors together, in an attempt to mask how almost 90% of them are actually crappy DisplayHDR 400.
Vayra86: Kinda reminds me of that fiasco we had a few weeks back about new PSU certification labels. Even just the introduction of it contained as many as 10 badges or something.

I understand a picture or badge can make it easier...but not when there are as many badges as you can find products...
Agree, but at the same time I don't know how you can make it more useful when you want to take into account so many aspects. Well, not exactly "many", but certainly more than just efficiency (load).
Posted on Reply
#36
bobbyrooney
So what's the best midrange card to get for my Samsung Q90T TV? Will FreeSync be better than G-SYNC Compatible? Not sure if it matters anymore.
Posted on Reply