Wednesday, June 7th 2017

NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR

In its eagerness to showcase just how important HDR (High Dynamic Range) support is for the image quality of the future, NVIDIA set up a display booth at Computex, where it showcased the difference between SDR (Standard Dynamic Range) and HDR images. However, it looks as if the green company was a mite too eager to demonstrate just how incredible HDR image quality is, considering it needed to fiddle with the SDR screen's settings to widen the divide.

The revelation comes courtesy of Hardware Canucks, who say they were granted access to the monitor settings NVIDIA used on its displays while running the demo. As it turns out, NVIDIA had changed the default factory values for brightness, contrast, and even gamma on the SDR monitor, which compromised the image quality it was actually able to convey. Resetting the monitor to its factory values produced a far less muted image on the SDR panel than before, which points to a deliberate attempt to reduce image quality on the SDR side of the presentation. Granted, image quality comparisons between SDR and HDR fall partly on the personal, subjective spectrum of each viewer; however, brightness, contrast, and gamma being moved away from even their factory defaults (which can usually be improved upon with calibration) does make it look like someone was trying too hard to showcase HDR's prowess.
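As a rough illustration of why the gamma setting in particular matters (this sketch is not from the article, and the 2.2 "factory" and 2.8 "skewed" gamma values are assumptions picked purely for illustration): pushing a display's gamma away from its default darkens midtones and crushes shadow detail, which produces exactly the kind of muted look the SDR monitor reportedly showed during the demo.

```python
# Minimal sketch (assumed values): how a display gamma setting remaps SDR pixel levels.
# Gamma 2.2 stands in for a typical factory default, 2.8 for a deliberately skewed setting;
# these are illustrative assumptions, not the settings NVIDIA actually used.

def apply_gamma(level_8bit: int, gamma: float) -> int:
    """Map an 8-bit input level through a simple power-law display gamma."""
    normalized = level_8bit / 255.0
    return round((normalized ** gamma) * 255)

midtone = 128  # a mid-grey SDR pixel
print(apply_gamma(midtone, 2.2))  # -> 56 at the assumed factory gamma
print(apply_gamma(midtone, 2.8))  # -> 37 at the steeper curve: visibly darker, more muted midtones
```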
Source: Hardware Canucks' YouTube Channel

78 Comments on NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR

#76
Vayra86
medi01: What!?!?!
Hell no!!!
HDR covers a wider color gamut. You are able to display colors that a normal monitor isn't capable of.

Think of it like TN vs IPS screens.

Even that image and its widest triangle show that the difference in color intensity becomes extremely hard to discern the further toward the edge you get.

A difference may be measurable, but that still doesn't mean we can observe it ourselves in any regular content-viewing setting.

It is quite similar with high-FPS gaming. At some point the difference becomes very small, but it is still measurable, and because it is measurable, people are convinced that going ever higher will keep giving them more advantage and perceived smoothness. At some point higher numbers just enter the realm of the placebo effect; for high-FPS gaming that point sits somewhere above 144 Hz. For color reproduction, similarly, there are other panel qualities that are FAR more important for good color and contrast, but those are too expensive to implement, so we're made to believe we need silly stuff like HDR even though the current color space already covers almost everything we can see with the naked eye.
#77
bug
Vayra86: Even that image and its widest triangle show that the difference in color intensity becomes extremely hard to discern the further toward the edge you get.
There's a perceptual version of the CIE diagram as well. But the reason you can't discern colours is that your monitor doesn't actually cover the gamut in that diagram. No monitor does.

The main issue is that when you're covering more of the visible spectrum, having to choose between only 256 shades of a primary colour means spacing those shades farther apart. Thus, more chance for banding.
For processing, having more shades than the eye can see is even more important.
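A quick numeric sketch of the quantization point above (not from the thread; the 100-nit vs. 1,000-nit ranges and the simplified linear encoding are assumptions for illustration): spreading the same 256 code values over a wider range makes each step between adjacent shades coarser, which is what shows up as banding, while 10-bit encoding brings the step size back down.

```python
# Minimal sketch (assumed ranges, simplified linear encoding): the same number of
# code values spread over a wider range yields coarser steps between adjacent
# shades, i.e. a higher chance of visible banding. Real HDR signals use a
# non-linear transfer function, so this only illustrates the counting argument.

def step_size(range_width_nits: float, bits: int) -> float:
    """Width of one quantization step for a linear encoding over the given range."""
    levels = 2 ** bits
    return range_width_nits / (levels - 1)

print(round(step_size(100.0, 8), 2))    # 0.39 nits/step: 8-bit over an SDR-like range
print(round(step_size(1000.0, 8), 2))   # 3.92 nits/step: same 256 levels stretched over 10x the range
print(round(step_size(1000.0, 10), 2))  # 0.98 nits/step: 10-bit restores finer steps
```

This is why HDR formats pair the wider range with 10-bit (or higher) depth and a perceptual transfer function rather than simply stretching an 8-bit signal.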
#78
medi01
Vayra86: Even that image and its widest triangle show that the difference in color intensity becomes extremely hard to discern the further toward the edge you get.
This is like judging a singer by hearing a friend of yours sing a song he heard at that singer's concert.