
NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR

Raevenlord

News Editor
In its eagerness to showcase just how important HDR (High Dynamic Range) support is for the image quality of the future, NVIDIA set up a display booth at Computex, where it demonstrated the difference between SDR (Standard Dynamic Range) and HDR images. However, it looks as if the green company was a mite too eager to show just how incredible HDR image quality is, considering it needed to fiddle with the SDR screen's settings to widen the divide.

The revelation comes courtesy of Hardware Canucks, who say they were granted access to the monitor settings NVIDIA used on its displays while running the demo. As it turns out, NVIDIA had changed the default factory values for brightness, contrast, and even gamma on the SDR monitor, compromising the image quality it was actually able to convey. Resetting the monitor to its factory values resulted in a far less muted image on the SDR side than before, which points to a deliberate attempt to degrade the SDR presentation. Granted, perceptions of image quality when comparing SDR to HDR are partly subjective and vary from viewer to viewer; however, brightness, contrast, and gamma being pushed outside their factory defaults (which can usually be improved upon with calibration) does make it look like someone was trying a little too hard to showcase HDR's prowess.



 
Just give us affordable true 10-bit panels. HDR is more about capturing the image than displaying it.
 
No surprise here. Green team is always doing something underhanded. They got caught, again.

In the famous words of Led Zeppelin: "Lyin'... cheatin'... hurtin'... that's all you seem to do."
 
HDR - the next standard for which you won't find matching support across your entire hardware pipeline in any product.

Already we have several HDR versions, already it's being marketed on panels that have no capability to do anything useful with HDR, and already it's a huge, inflated marketing mess - and it has barely even landed yet.

Display tech > any bullshit that comes after that.

And the standard display tech is still inferior or way too costly (TN/IPS versus OLED).

I'll come back to HDR when OLED is mainstream. Until then, unless you have a full-array local dimming panel, this is a total waste of time.

Well played, Nvidia, well played - you really are rats, thanks for confirming.
 
No surprise here. Green team is always doing something underhanded. They got caught, again.

In the famous words of Led Zeppelin: "Lyin'... cheatin'... hurtin'... that's all you seem to do."

Yeah, well, if you need Nvidia to tell you what HDR is about, you're kind of asking for it.

And before you judge, think about this: every single commercial about HDR or wide gamut does the exact same thing, because they can't actually show HDR or wide gamut on your standard TV.
 
every single commercial about HDR or wide gamut does the exact same thing, because they can't actually show HDR or wide gamut on your standard TV.

^ this, 100%
 
Yeah, well, if you need Nvidia to tell you what HDR is about, you're kind of asking for it.

And before you judge, think about this: every single commercial about HDR or wide gamut does the exact same thing, because they can't actually show HDR or wide gamut on your standard TV.

^ this, 100%

This was at a convention, so it has nothing whatsoever to do with not being able to display HDR on a non-HDR TV....
 
This was at a convention, so it has nothing whatsoever to do with not being able to display HDR on a non-HDR TV....

^ this, 100%
 
Nvidia being called out for shit, bring on the fanbois with shit comments proving they didn't read.
 
bring on the fanbois
People in glass houses shouldn't throw stones, or pot, kettle, black.... here's another, if the shoe fits wear it :p
 
Pretty lame, but at the same time expected, when the tech being shown off makes so little actual difference.

Just give us affordable true 10-bit panels. HDR is more about capturing the image than displaying it.

Ironically, most of the 8-bit vs. 10-bit comparisons out there are far more doctored bullshit than just changing a few monitor settings.
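
For context on the kind of doctoring being described: the whole 8-bit vs. 10-bit difference comes down to the number of quantization steps per channel (256 versus 1,024). The minimal Python sketch below is purely illustrative and not taken from any of the demos discussed; it just counts how many distinct steps a smooth gradient collapses to at different bit depths.

Code:
import numpy as np

def quantize(signal, bits):
    # Snap a 0..1 signal to the 2**bits discrete levels a panel of that depth can show.
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

# A smooth horizontal luminance ramp, like the sky gradients used in banding demos.
ramp = np.linspace(0.0, 1.0, 3840)

for bits in (10, 8, 6):
    steps = len(np.unique(quantize(ramp, bits)))
    print(f"{bits}-bit: {steps} distinct steps across the gradient")

Even across a 4K-wide ramp, real 8-bit output still gives 256 steps - visible in a torture-test gradient, but arguably nothing like the chunky posterization some marketing comparison images show, which looks closer to 6-bit territory.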
 
The image doesn't look any better... it just looks like it has a purple tint with a slightly warmer color palette.
 
No shit. Every single vendor that demos some better visuals against the old ones tries to make the old visuals look as bad as possible to showcase the differences.
 
This was at a convention, so it has nothing whatsoever to do with not being able to display HDR on a non-HDR TV....
There is, though - we have pictures of it, do we not? Coverage happens on the web, not in person. It's a dumb move by monitor makers in general, but hell, they've probably learned from how hard it already is to get people to buy variable refresh rate monitors - another product that's hard to show off except in person.
 
Or maybe the monitors weren't properly reset after the guys who came before Dmitry and Eber had their way with them.

The only way to show the capabilities of HDR is to use the same monitor and content, preserving every setting and toggling only the feature being tested between runs. Record the same sequence with the same camera settings, under the same ambient (lighting) conditions, then show a split screen using post-processing magic.

The test was doomed from the start.
 
There is, though - we have pictures of it, do we not? Coverage happens on the web, not in person. It's a dumb move by monitor makers in general, but hell, they've probably learned from how hard it already is to get people to buy variable refresh rate monitors - another product that's hard to show off except in person.
But they weren't altered for a TV ad; this was a live event that people attended, and the settings were obviously changed so those people would see a bigger difference - not for us lowly peasants who only have pictures to look at.
 
This was at a convention, so it has nothing whatsoever to do with not being able to display HDR on a non-HDR TV....
Of course, I was just saying it's the same trick. Nobody seems to fret about it, unless Nvidia is doing it.
Haters will quickly jump to the conclusion they did it because they probably suck at HDR*. Meanwhile, everybody with an interest knows HDR content is subtle; it's not as easy as blowing out all the color channels to make it look like your colors are more vivid. For all we know, the content displayed might have looked pretty similar under show floor lighting conditions. I've actually read about someone who was used to oversaturating color channels; when they tried to play HDR content using the same settings, they came to the conclusion that HDR is the same as SDR.

*Nvidia actually introduced HDR (albeit in a different form) in their 7000 series (iirc) over a decade ago.

Edit: Just watched a South Park episode. Something about first world problems.
 
Yeah, well, if you need Nvidia to tell you what HDR is about, you're kind of asking for it.

And before you judge, think about this: every single commercial about HDR or wide gamut does the exact same thing, because they can't actually show HDR or wide gamut on your standard TV.

Yeah, so that makes sense... except these are monitors shown IN THE FLESH at a show, and that's a very, very different story.
To deliberately make the old look worse to sell the new... is just a con.

Kinda like making older cards worse through drivers to promote newer cards.
 
Of course, I was just saying it's the same trick. Nobody seems to fret about it, unless Nvidia is doing it.

Oh come on man, you're really not going with that, are you? AMD gets bashed just as much, and sometimes with justification; this isn't a case of someone hating on NVIDIA for the sake of it.

*Nvidia actually introduced HDR (albeit in a different form) in their 7000 series (iirc) over a decade ago.

Yeah, software HDR IIRC - I remember one of the Tom Clancy games (and quite a few others that don't spring to mind right now) having the option in its settings at the time.

Kinda like making older cards worse through drivers to promote newer cards.

Surely such practices have never been applied? :p
 
Almost every vendor does this.
You think that expensive TV at Best Buy in the middle of the showroom floor is running on the same exact settings as all the other TVs? No, they tweak the settings and create custom images to showcase their flagship product for more revenue. Unfortunately, it's up to the consumer to research and be informed.

Besides, I remember reading an article (Ars Technica?) saying that true HDR isn't possible on these displays. We'll have to wait for OLED to become mainstream. I'll come back if I find the link.
 
nVidia sucks, AMD sucks, Intel sucks - a visitor to this site might get the idea that this place is full of geniuses, since everyone here obviously knows much more than the 3 biggest PC chip makers in the world. Never mind that all 3 corporations have accomplished things that none of us could do if we had a trillion dollars and 20 years. Being critical is normal, but it's turned into a shouting match and degenerated into personal attacks, hatred, and bigotry, ultimately against everything and everyone, at one time or another.

Does anyone here actually like any of their PC hardware brands without needing to talk crap about other brands and the people who buy them? These 3 companies have risen to the top against many other makers of CPUs and GPUs; shouldn't we give them their due respect for the great things they've done? We all have bad days and feel negativity, myself included, but does it really make us feel better when we put others down?

I'm going to try to post only positive things from now on, and leave the keyboard alone when I'm feeling scrappy. In the 5 years I've been a member, the quality and tone of this site has gone downhill rapidly, just as the political and social climate has become toxic in most of the world. I hope we can be better than that, and focus on the real mission of this site (well, besides making money) - informing the uninformed about how to build and repair the best PC they can afford. Hate will never accomplish that - patience, understanding, and tolerance are what's needed.
 
I do seriously hope that no one was actually shocked by this. If this were [insert any company name], they likely would have pulled some shenanigans too. Nature of the beast, sadly. Their problem is that they got caught and had their nuts busted.
 
HDR - the next standard for which you won't find matching support across your entire hardware pipeline in any product.

Already we have several HDR versions, already it's being marketed on panels that have no capability to do anything useful with HDR, and already it's a huge, inflated marketing mess - and it has barely even landed yet.

Display tech > any bullshit that comes after that.

And the standard display tech is still inferior or way too costly (TN/IPS versus OLED).

I'll come back to HDR when OLED is mainstream. Until then, unless you have a full-array local dimming panel, this is a total waste of time.

Well played, Nvidia, well played - you really are rats, thanks for confirming.
There are a few HDR standards, but everything on the PC, from the GPUs to the games/software to the OSes (namely Windows), is standardizing around HDR-10, so you can quiet that rant.
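
For reference, the core of what HDR-10 standardizes is a 10-bit signal encoded with the SMPTE ST 2084 "PQ" transfer function (alongside BT.2020 primaries and static metadata). The short Python sketch below is purely illustrative - not tied to any particular GPU or OS implementation - and shows how the PQ curve maps absolute luminance up to 10,000 nits onto the 10-bit code range.

Code:
# SMPTE ST 2084 ("PQ") constants; HDR-10 encodes luminance with this curve
# instead of a simple gamma, covering 0 to 10,000 nits in a 10-bit signal.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    # Map absolute luminance (in nits) to a normalized 0..1 PQ signal value.
    y = max(nits, 0.0) / 10000.0
    p = y ** M1
    return ((C1 + C2 * p) / (1.0 + C3 * p)) ** M2

for nits in (0.1, 100, 1000, 10000):
    code = round(pq_encode(nits) * 1023)
    print(f"{nits:>7} nits -> PQ {pq_encode(nits):.3f} (10-bit code {code})")

Roughly half the code range ends up at or below 100 nits (the traditional SDR peak), which is part of why 10 bits are needed to cover the extended range without obvious banding.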

The image doesn't look any better... it just looks like it has a purple tint with a slightly warmer color palette.
Even if it did look better, you could never tell by watching the video, because you aren't watching on an HDR monitor, and I don't know if YT even supports an HDR standard yet. Even if it did, the camera recording the monitors would have to record with the wide-gamut settings as well, and I don't know about that either.
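
As a rough illustration of that point: anything recorded off an HDR screen and played back on an SDR display has been tone-mapped back into the SDR range somewhere along the chain. The toy Reinhard-style curve below is not the actual pipeline the camera, YouTube, or NVIDIA used; it only shows how much the extra highlight range gets compressed before it reaches an SDR viewer.

Code:
def tonemap_to_sdr(nits, reference_white=1000.0):
    # Toy Reinhard-style operator: compress HDR luminance into the 0..1
    # range an SDR display can reproduce. Real pipelines use their own
    # curves and metadata; this is only meant to show the compression.
    l = nits / reference_white
    return l / (1.0 + l)

for nits in (100, 500, 1000, 4000):
    print(f"{nits:>4} nits -> {tonemap_to_sdr(nits):.2f} of the SDR output range")

Under this curve, a 1,000-nit highlight lands at only about half of the SDR output range, so judging an HDR demo through an SDR video mostly tells you about the tone mapping, not the panel.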
 