Wednesday, June 7th 2017

NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR

In its eagerness to showcase just how important HDR (High Dynamic Range) support is for the image quality of the future, NVIDIA set up a display booth at Computex, where it showcased the difference between SDR (Standard Dynamic Range) and HDR images. However, it looks as if the green company was a mite too eager to demonstrate just how incredible HDR image quality is, considering it needed to fiddle with the SDR screen's settings to widen the divide.

The revelation comes courtesy of Hardware Canucks, who say they were granted access to the monitor settings NVIDIA used on its displays while running the demo. As it turns out, NVIDIA had changed the default factory values for brightness, contrast, and even gamma on the SDR monitor, which compromised the image quality it was actually able to convey. Resetting the monitor to its factory values resulted in a noticeably less muted image on the SDR display than before, which points to a deliberate attempt to reduce image quality on the SDR side of the presentation. Now granted, image quality comparisons between SDR and HDR fall partly on each viewer's personal, subjective perception; however, brightness, contrast, and gamma being set below even their factory levels (which can usually be improved upon with calibration) does make it look like someone was trying too hard to showcase HDR's prowess.
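To see why skewed settings mute an image so effectively, here is a minimal, hypothetical sketch. The function, its parameter values, and the simplified transfer model are illustrative assumptions, not NVIDIA's actual settings or any monitor's real firmware; it only shows how off-spec brightness, contrast, and gamma values crush an SDR image's midtones:

```python
def apply_monitor_settings(value, brightness=0.0, contrast=1.0, gamma=2.2):
    """Map an 8-bit input level (0-255) through a simplified display
    transfer function: normalize, apply gamma, then contrast and brightness."""
    v = (value / 255.0) ** gamma      # gamma response darkens midtones as gamma rises
    v = v * contrast + brightness     # contrast scales the range, brightness offsets it
    v = min(max(v, 0.0), 1.0)         # clamp to the displayable range
    return round(v * 255)

mid_gray = 128
factory = apply_monitor_settings(mid_gray)  # factory-default response
skewed = apply_monitor_settings(mid_gray, brightness=-0.05,
                                contrast=0.8, gamma=2.6)  # deliberately off-spec
print(factory, skewed)
```

Under these assumed values, a mid-gray input of 128 drops from roughly output level 56 at the defaults to about 21 with the skewed settings: the same washed-out, muted look that reportedly disappeared once the monitor was reset.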

Source: Hardware Canucks' YouTube Channel

78 Comments on NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR

#1
birdie
Just give us affordable true 10bit matrices. HDR is more about capturing the image rather than displaying it.
#2
DeathtoGnomes
No surprise here. Green team is always doing something underhanded. They got caught, again.

In the famous words of Led Zeppelin, "Lyin'... Cheatin'... Hurtin'... that's all you seem to do"
#3
Vayra86
HDR - the next standard you won't find a single match for across your entire hardware pipeline.

Already we have several HDR versions, already it's being marketed on panels that have no capability to do anything useful with HDR, and already it's a huge inflated marketing mess - and it has barely even landed yet.

Display tech > any bullshit that comes after that.

And the standard display tech is still inferior or way too costly (TN/IPS versus OLED).

I'll come back to HDR when OLED is mainstream. Until then, unless you have a full array local dimming panel, this is a total waste of time.

Well played Nvidia, well played - you really are rats, thanks for confirming
#4
bug
DeathtoGnomes said:
No surprise here. Green team is always doing something underhanded. They got caught, again.

In the famous words of Led Zeppelin, "Lyin'... Cheatin'... Hurtin'... that's all you seem to do"
Yeah, well, if you need Nvidia to tell you what HDR is about, you're kind of asking for it.

And before you judge, think about this: every single commercial about HDR or wide gamut does the exact same thing, because they can't actually show HDR or wide gamut on your standard TV.
#5
Vayra86
bug said:

every single commercial about HDR or wide gamut does the exact same thing, because they can't actually show HDR or wide gamut on your standard TV.
^ this, 100%
#6
NdMk2o1o
bug said:
Yeah, well, if you need Nvidia to tell you what HDR is about, you're kind of asking for it.

And before you judge, think about this: every single commercial about HDR or wide gamut does the exact same thing, because they can't actually show HDR or wide gamut on your standard TV.
Vayra86 said:
^ this, 100%
This was at a convention so it has nothing whatsoever to do with not being able to display HDR on a non HDR TV....
#7
Totally
NdMk2o1o said:
This was at a convention so it has nothing whatsoever to do with not being able to display HDR on a non HDR TV....
^ this, 100%
#8
Steevo
Nvidia being called out for shit, bring on the fanbois with shit comments proving they didn't read.
#9
NdMk2o1o
Steevo said:
bring on the fanbois
People in glass houses shouldn't throw stones, or pot, kettle, black.... here's another, if the shoe fits wear it :p
#10
newtekie1
Semi-Retired Folder
Pretty lame, but at the same time, expected when the tech being shown off makes so little actual difference.

birdie said:
Just give us affordable true 10bit matrices. HDR is more about capturing the image rather than displaying it.
Ironically, most of the 8-bit vs 10-bit comparisons are much worse doctored bullshat than just changing a few monitor settings.
#11
Duality92
So NVidia is saying we can all get HDR with a bit of monitor fine tuning? Great! :D
#13
phanbuey
image doesn't look any better... just looks like it has a purple tint with a slightly warmer color palette
#14
Gasaraki
No shit. Every single vendor demoing some newer visual standard against the old one tries to make the old visuals look as bad as possible to showcase the differences.
#15
semantics
NdMk2o1o said:
This was at a convention so it has nothing whatsoever to do with not being able to display HDR on a non HDR TV....
There is; we have pictures of it, do we not? Coverage happens on the web, not in person. It's a dumb move by monitor makers in general, but hell, they've probably learned from how hard it's been to get people to buy variable refresh rate monitors, another product that's hard to show off except in person.
#16
Sihastru
Or maybe the monitors weren't properly reset after the guys that came before Dmitry and Eber had their way with them.

The only way to show the capabilities of HDR is to use the same monitor and content, preserving all settings, with only the feature being tested toggled between runs. Record the same sequence with the same camera settings, in the same ambient (lighting) conditions, then show a split screen using post-processing magic.

The test was doomed from the start.
#17
NdMk2o1o
semantics said:
There is; we have pictures of it, do we not? Coverage happens on the web, not in person. It's a dumb move by monitor makers in general, but hell, they've probably learned from how hard it's been to get people to buy variable refresh rate monitors, another product that's hard to show off except in person.
But they weren't altered because this was a TV ad; it was a live event that people attended, and they were obviously changed so those people would see a bigger difference, not for us lowly peasants who only have pictures to look at.
#18
bug
NdMk2o1o said:
This was at a convention so it has nothing whatsoever to do with not being able to display HDR on a non HDR TV....
Of course, I was just saying it's the same trick. Nobody seems to fret about it, unless Nvidia is doing it.
Haters will quickly jump to the conclusion they did it because they probably suck at HDR*. But everybody with an interest knows HDR content is subtle; it's not as easy as blowing out all the color channels to make your colors look more vivid. For all we know, the content displayed might have looked pretty similar under floor lighting conditions. I've actually read about someone who was used to oversaturating color channels; when they tried to play HDR content using the same settings, they came to the conclusion HDR is the same as SDR.

*Nvidia actually introduced HDR (albeit in a different form) in their 7000 series (iirc) over a decade ago.

Edit: Just watched a South Park episode. Something about first world problems.
#19
ZoneDymo
bug said:
Yeah, well, if you need Nvidia to tell you what HDR is about, you're kind of asking for it.

And before you judge, think about this: every single commercial about HDR or wide gamut does the exact same thing, because they can't actually show HDR or wide gamut on your standard TV.
Yeah, so that makes sense... these are monitors shown IN THE FLESH at a show; that's a very, very different story.
Deliberately making the old look worse to sell the new... is just a con.

Kinda like making older cards worse through drivers to promote newer cards.
#20
NdMk2o1o
bug said:
Of course, I was just saying it's the same trick. Nobody seems to fret about it, unless Nvidia is doing it.
Oh come on man, you're really not going with that are you? AMD get bashed just as much and sometimes with justification, this isn't a case of someone hating on NVIDIA for the sake of it.

bug said:
*Nvidia actually introduced HDR (albeit in a different form) in their 7000 series (iirc) over a decade ago.
Yea software HDR iirc, I remember one of the Tom Clancy games (and quite a few others at the time which don't spring to mind) having the option in the settings at the time

ZoneDymo said:
Kinda like making older cards worse through drivers to promote newer cards.
Surely such practices have never been applied? :p
#21
Halo3Addict
Almost every vendor does this.
You think that expensive TV at Best Buy in the middle of the showroom floor is running the exact same settings as all the other TVs? No, they tweak the settings and create custom images to showcase their flagship product for more revenue. Unfortunately, it's up to the consumer to research and be informed.

Besides, I remember reading an article (Ars Technica?) saying that true HDR isn't possible on these displays. We'll have to wait for OLED to become mainstream. I'll come back if I find the link.
#22
Hood
nVidia sucks, AMD sucks, Intel sucks - a visitor to this site might get the idea that this place is full of geniuses, since everyone here obviously knows much more than the 3 biggest PC chip makers in the world. Never mind that all 3 corporations have accomplished things that none of us could do if we had a trillion dollars and 20 years.

Being critical is normal, but it's turned into a shouting match and degenerated into personal attacks, hatred, and bigotry, ultimately against everything and everyone, at one time or another. Does anyone here actually like any of their PC hardware brands, without needing to talk crap about other brands, and the people who buy them? These 3 companies have risen to the top against many other makers of CPUs and GPUs; shouldn't we give them their due respect for the great things they've done?

We all have bad days, and feel negativity, myself included, but does it really make us feel better when we put others down? I'm going to try to post only positive things from now on, and leave the keyboard alone when I'm feeling scrappy. In the 5 years I've been a member, the quality and tone of this site has gone downhill rapidly, just as the political and social climate has become toxic in most of the world. I hope we can be better than that, and focus on the real mission of this site (well, besides making money): informing the uninformed about how to build and repair the best PC they can afford. Hate will never accomplish that; patience, understanding, and tolerance are what's needed.
#23
ironwolf
I do seriously hope that no one was actually shocked by this. If this were [insert any company name], they likely would have pulled the same shenanigans. Nature of the beast, sadly. Their problem is they got caught and had their nuts busted.
#24
Solidstate89
Vayra86 said:
HDR - the next standard you won't find a single match for across your entire hardware pipeline.

Already we have several HDR versions, already it's being marketed on panels that have no capability to do anything useful with HDR, and already it's a huge inflated marketing mess - and it has barely even landed yet.

Display tech > any bullshit that comes after that.

And the standard display tech is still inferior or way too costly (TN/IPS versus OLED).

I'll come back to HDR when OLED is mainstream. Until then, unless you have a full array local dimming panel, this is a total waste of time.

Well played Nvidia, well played - you really are rats, thanks for confirming
There are a few HDR standards, but everything on the PC, from the GPUs to the games/software and the OSes (namely Windows), is standardizing around HDR-10, so you can quiet that rant.

phanbuey said:
image doesn't look any better... just looks like it has a purple tint with a slightly warmer color palette
Even if it did look better, you could never tell by watching the video, because you aren't watching on an HDR monitor, and I don't know if YT even supports an HDR standard yet. Even if it did, the camera recording the monitors would have to record with wide-gamut settings as well, and I don't know about that either.