Wednesday, June 7th 2017

NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR

In its eagerness to showcase just how important HDR (High Dynamic Range) support is for the image quality of the future, NVIDIA set up a display booth at Computex, where it showcased the difference between SDR (Standard Dynamic Range) and HDR images. However, it looks as if the green company was a mite too eager to demonstrate just how incredible HDR image quality is, considering it fiddled with the SDR screen's settings to widen the divide.

The revelation comes courtesy of Hardware Canucks, who say they were granted access to the monitor settings NVIDIA used on its displays while running the demo. As it turns out, NVIDIA had changed the factory default values for brightness, contrast, and even gamma on the SDR monitor, which compromised the image quality it was actually able to convey. Resetting the monitor to its factory values resulted in a far less muted image on the SDR screen than before, which points to a deliberate attempt to degrade the SDR presentation. Granted, perceptions of image quality when comparing SDR to HDR fall on each viewer's personal, subjective spectrum; however, brightness, contrast and gamma being set below even their factory levels (which calibration can usually improve upon) does make it look like someone was trying too hard to showcase HDR's prowess.
Source: Hardware Canucks' YouTube Channel
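For readers wondering how much damage a few OSD tweaks can do, the sketch below models a generic display pipeline: brightness as an additive offset, contrast as gain around mid-grey, and a power-law gamma. This is a simplified assumption for illustration only, not the actual processing inside the monitors NVIDIA used, but it shows how nudging all three settings in the wrong direction crushes shadows and dulls the whole SDR picture.

```python
import numpy as np

def apply_osd_settings(pixels, brightness=0.0, contrast=1.0, gamma=2.2):
    """Crude model of a monitor's OSD controls on normalized [0, 1] pixel values.

    brightness: additive offset (negative values crush shadows)
    contrast:   gain applied around mid-grey
    gamma:      power-law response (higher values darken midtones)
    """
    out = (pixels - 0.5) * contrast + 0.5 + brightness  # contrast pivots on mid-grey
    out = np.clip(out, 0.0, 1.0)
    return out ** gamma

ramp = np.linspace(0.0, 1.0, 11)  # a simple grey ramp
print(apply_osd_settings(ramp))                                            # near-default settings
print(apply_osd_settings(ramp, brightness=-0.2, contrast=0.8, gamma=2.6))  # hypothetical "demo" settings
```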

78 Comments on NVIDIA Deliberately Worsens SDR Monitor Image Settings to Showcase HDR

#51
HTC
Relayer: I am far far far from an nVidia apologist. I wouldn't walk across the street to piss on Jen Hsun if he was on fire, nor spend a cent on an nVidia product. With that said, I've been in sales my whole life. I'd be willing to bet it was someone at the show who, on their own, adjusted the settings to make the difference more dramatic. Salespeople are known for often stretching the truth and outright lying to make a sale. A lot more likely than some upper-end marketing decision.
Quite a while back, I got a visit from a dude that worked for an ISP and was trying to get me as a client. I bothered to listen and asked several questions. One of them was whether there was fidelization (in Portuguese, it's the time you're required to stay with an ISP after signing on, to avoid paying a penalty for switching / quitting: dunno if there's an equivalent term in English).

The dude said "1 month" fidelization, claiming the law had changed recently.

A few days later, I went to that company's store on an unrelated matter and, while I was there, I asked about the fidelization time, to which they said "it's 2 years" ...

I asked: hasn't the law changed recently? One of your sellers came to my house and said it had changed recently and that the fidelization period was now 1 month, to which they promptly said "the law on the fidelization period hasn't changed".

As for the topic at hand, I don't care who it is: if they lie to sell products and get caught, they should be exposed on the lie(s) in question, in the most public fashion possible, so that they (and others) think twice (or more times) before considering doing this sort of crap again.
#53
erixx
Do we have the name of the guy who set up the monitors? Maybe he followed (or didn't follow) a superior's instructions (to fine-tune both or only one monitor)? Or maybe a pesky visitor boy who walked along changed them? No, in the name of god (amd), it was Nvidia's CEO who ordered it all!

And of course most monitor, tv, phone...car commercials do not represent the product in a realistic fashion.

:eek:Free the Beasts :D
#54
renz496
lol, people are quick to bash nvidia. but as Dmitri mentioned in his response to the people discussing the video on YouTube, only nvidia actually allowed tinkering to be done on the initial setups. he specifically mentions Asus, Acer and AMD did not allow such a thing. nvidia might have their own setup, but they are more transparent than others about it.
#55
bug
theoneandonlymrk: Look it up. I've used it, not imagined it. My monitor is a pro model.
Which one? Your TN-Film monitor or your FHD 40"?
#56
Relayer
Th3pwn3r: Lol, I'm willing to bet the people here knocking HDR don't even have a monitor or TV that can display it. I can see a clear difference on my Samsung 55'' KS8000 when HDR is off. Not only does it look far better with HDR enabled, but I also like it (for the person saying it's uncomfortable to the eye (wtf??))...
I've seen HDR with proper source material. It's a lot better. Problem is, people don't want just higher fidelity. If it's not an overblown special effect, then they think, "Meh, is that it?"
#57
Relayer
HTC: Quite a while back, I got a visit from a dude that worked for an ISP and was trying to get me as a client. I bothered to listen and asked several questions. One of them was whether there was fidelization (in Portuguese, it's the time you're required to stay with an ISP after signing on, to avoid paying a penalty for switching / quitting: dunno if there's an equivalent term in English).

The dude said "1 month" fidelization, claiming the law had changed recently.

A few days later, I went to that company's store on an unrelated matter and, while I was there, I asked about the fidelization time, to which they said "it's 2 years" ...

I asked: hasn't the law changed recently? One of your sellers came to my house and said it had changed recently and that the fidelization period was now 1 month, to which they promptly said "the law on the fidelization period hasn't changed".

As for the topic at hand, I don't care who it is: if they lie to sell products and get caught, they should be exposed on the lie(s) in question, in the most public fashion possible, so that they (and others) think twice (or more times) before considering doing this sort of crap again.
True. Lies should be exposed. But we need to point the finger at the right party. Not jump to conclusions.

And you wanted to sign up, get the free wireless router, or whatever the ISP was offering, and then bail on your contract without penalty? And you believed the guy when he said you could? Really?
renz496: lol, people are quick to bash nvidia. but as Dmitri mentioned in his response to the people discussing the video on YouTube, only nvidia actually allowed tinkering to be done on the initial setups. he specifically mentions Asus, Acer and AMD did not allow such a thing. nvidia might have their own setup, but they are more transparent than others about it.
Ever look at the photos of the menu at McDonald's and then look at your meal? This is why we have reviewers in the 1st place. You can't trust the vendor to tell you straight.
#58
londiste
Th3pwn3r: Lol, I'm willing to bet the people here knocking HDR don't even have a monitor or TV that can display it. I can see a clear difference on my Samsung 55'' KS8000 when HDR is off. Not only does it look far better with HDR enabled, but I also like it (for the person saying it's uncomfortable to the eye (wtf??))...
i would love a good way to have a real fair comparison of sdr and hdr. unfortunately, that is easier said than done as that would basically require content being mastered for each (if not for specific displays).

test tool in nvidia's hdr sdk was as close as i could find. and the differences are rather minimal. bright lights are somewhat brighter and there is definitely more detail in both bright and dark areas thanks to 10-bit colors. but that's it. it really is not that noticeable unless you know what to look for, or unless the content is fudged.

practically all of hdr marketing is pure bullshit.
#59
Th3pwn3r
londiste: i would love a good way to have a real fair comparison of sdr and hdr. unfortunately, that is easier said than done as that would basically require content being mastered for each (if not for specific displays).

test tool in nvidia's hdr sdk was as close as i could find. and the differences are rather minimal. bright lights are somewhat brighter and there is definitely more detail in both bright and dark areas thanks to 10-bit colors. but that's it. it really is not that noticeable unless you know what to look for, or unless the content is fudged.

practically all of hdr marketing is pure bullshit.
Your opinion isn't fact. It's subjective. Similar to audio and taste, it differs.
#60
Vayra86
Th3pwn3r: Your opinion isn't fact. It's subjective. Similar to audio and taste, it differs.
Of course it does, but the marketplace eventually shows us all which technology is a success and which is not. And as long as something is not offered as a mainstream option, it generally becomes a niche market, which is the current, and very likely also the future, status of HDR.

It all starts and ends with content, just like 4K, and whether or not regular Joe can see an advantage of buying into it. So in a way, his opinion is fact.

The current way HDR is marketed, for one, does not help the adoption of HDR at all. The way Nvidia did it right here is, in fact, counterproductive. Putting HDR capability on TN and IPS panels, or even VA panels that do not have full-array local dimming, is also counterproductive. It will make people think HDR is pointless, or produces unnatural images, because the panel tech simply can't display the gamut.

The whole situation is rather similar to the way High Definition was handled: everything was HD... but we had HD Ready too, and Full HD. People were misled and bought 720p screens, the end result being that to this day we still don't have 1080p broadcasts (ofc this is just part of the reason, but it definitely helped) - a large majority of people got stuck on 720p, so why scale up content further?
#61
Prince Valiant
Hood: nVidia sucks, AMD sucks, Intel sucks - a visitor to this site might get the idea that this place is full of geniuses, since everyone here obviously knows much more than the 3 biggest PC chip makers in the world. Never mind that all 3 corporations have accomplished things that none of us could do if we had a trillion dollars and 20 years. Being critical is normal, but it's turned into a shouting match and degenerated into personal attacks, hatred, and bigotry, ultimately against everything and everyone, at one time or another. Does anyone here actually like any of their PC hardware brands, without needing to talk crap about other brands, and the people who buy them? These 3 companies have risen to the top against many other makers of CPUs and GPUs, shouldn't we give them their due respect for the great things they've done? We all have bad days, and feel negativity, myself included, but does it really make us feel better when we put others down? I'm going to try to post only positive things from now on, and leave the keyboard alone when I'm feeling scrappy. In the 5 years I've been a member, the quality and tone of this site has gone downhill rapidly, just as the political and social climate has become toxic in most of the world. I hope we can be better than that, and focus on the real mission of this site (well, besides making money) - informing the uninformed about how to build and repair the best PC they can afford. Hate will never accomplish that - patience, understanding, and tolerance are what's needed.
None of these companies are white as snow. Being useful doesn't make questionable actions right.

On HDR: I'd rather see new panels with a wider gamut, better blacks, better response times, etc., and a good, calibrated sRGB mode, but here we are.
#62
Hood
Prince Valiant: None of these companies are white as snow. Being useful doesn't make questionable actions right.
My post was not about the evil deeds perpetrated by companies, or the normal sort of criticism they deserve, but about the toxic reactions of forum members, those folks who take each minor revelation and blow it up into a huge biased rant which degenerates into personal attacks against any who disagree.
#63
Frick
Fishfaced Nincompoop
My Galaxy S7 has an HDR camera, but my 50 Instagram followers don't like my pictures any more now than they did before. HDR is clearly a hoax.
#64
medi01
"Much better than 480" (tm)
Hood: nVidia sucks, AMD sucks, Intel sucks
nVidia just lied.
Let's go full throttle defense mode.
#66
medi01
HDR is amazing.
The issue is, "HDR support" means jack shit. It merely means the device accepts it as input, not that it is actually capable of displaying it.
So instead you actually need to look for "Ultra HD Premium" certification to get a proper HDR screen.
#67
newtekie1
Semi-Retired Folder
Prima.Vera: HDR can be easily emulated via software. 10- or 12-bit, however, cannot. Even if some of the panels are doing fake 10/12-bit by using dithering. Still good, since it completely removes banding.
Well, we're told it completely removes banding. But since the difference can't actually be seen, we are just left assuming it is true. And we'll assume that up until 16-bit panels come out and they tell us those now completely eliminate banding. And show the same BS 256 color vs 8-bit picture to prove how the banding goes away...

And here is the thing with every single example of 10/12-bit vs 8-bit I've ever seen: they are all 8-bit pictures, because they are designed to be viewed on 8-bit panels. Obviously, this is to try to sell people who currently have 8-bit panels on the miracles of 10/12-bit.

I'll use BenQ's product page as a perfect example: www.benq.us/product/monitor/PG2401PT/features/

On that page they have this picture to show the difference between 8-bit and 10-bit:


Look how smooth the 10-bit side looks! Obviously 10-bit is so much better, how did we ever live with 8-bit? But wait, people will most likely be viewing that on an 8-bit monitor. So the "10-bit" image on the left will be viewed as 8-bit. Dig a little deeper, download the picture and open it in an image editor. OMG, it's only an 8-bit image! So what you are seeing on the left is really 8-bit, so what the hell is the right?

You mean the monitor manufacturers, and pretty much everyone trying to say 10/12-bit is so much better, are really just purposely lowering the quality of images to make 8-bit look worse than it really is? OMG, get the pitchforks!!!
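If anyone wants to reproduce that trick, the sketch below (Python with NumPy and Pillow, purely illustrative and not BenQ's actual workflow) builds a smooth grey ramp as an ordinary 8-bit image, then requantizes a copy to far fewer levels, which is presumably how the "banded" half of such side-by-side comparisons gets made:

```python
import numpy as np
from PIL import Image

# A horizontal grey ramp stored as an ordinary 8-bit image (256 levels).
width, height = 1024, 128
ramp = np.tile(np.linspace(0, 255, width, dtype=np.uint8), (height, 1))

# Fake the "low bit depth" half of a marketing comparison: requantize the
# same 8-bit data to only 32 levels so banding becomes impossible to miss.
levels = 32
step = 256 // levels
banded = (ramp // step) * step

Image.fromarray(ramp).save("ramp_8bit.png")      # looks smooth on any 8-bit panel
Image.fromarray(banded).save("ramp_banded.png")  # the exaggerated "before" image
```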
#68
BorgOvermind
Someone said HDR can be easily emulated via software.
By the same token, most FX today can be done in DX 8.1 to a very large extent.
That does not mean it happens.

And what's all the new fuss about HDR anyway?
Do you not remember the good old HL2 demo of that?
#69
bug
BorgOvermind: Someone said HDR can be easily emulated via software.
By the same token, most FX today can be done in DX 8.1 to a very large extent.
That does not mean it happens.

And what's all the new fuss about HDR anyway?
Do you not remember the good old HL2 demo of that?
HDR is great for photos and other professional usage. But as usually happens, its adoption will be spearheaded/funded by clueless gamers first. Games can make use of HDR, but at the same time they are pretty well served by the software-emulated version as well (you're not going to notice HDR during fast-paced action, the same way you can't tell 4xAA from 8xAA or 16xAF from 8xAF during the same scenes).

In the grand scheme of things, HDR (more specifically HDR10) is about being able to cover more colour space than traditional displays. It's using a wider colour space (which we could already do), but at the same time being able to use more colours from that colour space. Up until now, even if you used Rec2020, you were still limited to 256 shades of R/G/B, thus limiting the maximum luminance delta. Of course, much like other things before it, having HDR support in place is one thing, wielding HDR properly is another, and that will come a little later.
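The raw numbers behind that "256 shades" point can be worked out in a couple of lines. Note this only counts integer code values per channel; HDR10 additionally uses the ST 2084 (PQ) transfer curve and Rec.2020 primaries, which is where the extra luminance range actually comes from:

```python
# Code values per channel and total representable colours for common bit depths.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} shades per channel, {levels ** 3:,} colours")

# 8-bit:   256 shades per channel,     16,777,216 colours
# 10-bit: 1024 shades per channel,  1,073,741,824 colours
# 12-bit: 4096 shades per channel, 68,719,476,736 colours
```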
#70
Prima.Vera
newtekie1: Well, we're told it completely removes banding. But since the difference can't actually be seen, we are just left assuming it is true. And we'll assume that up until 16-bit panels come out and they tell us those now completely eliminate banding. And show the same BS 256 color vs 8-bit picture to prove how the banding goes away...

And here is the thing with every single example of 10/12-bit vs 8-bit I've ever seen: they are all 8-bit pictures, because they are designed to be viewed on 8-bit panels. Obviously, this is to try to sell people who currently have 8-bit panels on the miracles of 10/12-bit.

I'll use BenQ's product page as a perfect example: www.benq.us/product/monitor/PG2401PT/features/

On that page they have this picture to show the difference between 8-bit and 10-bit:


Look how smooth the 10-bit side looks! Obviously 10-bit is so much better, how did we ever live with 8-bit? But wait, people will most likely be viewing that on an 8-bit monitor. So the "10-bit" image on the left will be viewed as 8-bit. Dig a little deeper, download the picture and open it in an image editor. OMG, it's only an 8-bit image! So what you are seeing on the left is really 8-bit, so what the hell is the right?

You mean the monitor manufacturers, and pretty much everyone trying to say 10/12-bit is so much better, are really just purposely lowering the quality of images to make 8-bit look worse than it really is? OMG, get the pitchforks!!!
It's true, exaggerations are often used to sharpen the differences. Back to the panels that do fake 10-bit by dithering the image, the best example I can give is the first StarCraft game. Even though it's only 256 colors, due to software dithering it still looks unbelievably good.
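For anyone curious what that dithering trick actually does, here is a rough sketch using a 2x2 ordered (Bayer) pattern; real 8-bit+FRC panels dither temporally and StarCraft used palette-based dithering, so treat this purely as an illustration of the principle: trading a little spatial noise for apparent extra tonal resolution.

```python
import numpy as np

# Normalized 2x2 Bayer threshold pattern.
BAYER = np.array([[0, 2],
                  [3, 1]]) / 4.0

def ordered_dither(values, out_levels):
    """Quantize [0, 1] values to `out_levels` grey levels using ordered dithering,
    so the local average still approximates the original smooth gradient."""
    h, w = values.shape
    reps = (h // BAYER.shape[0] + 1, w // BAYER.shape[1] + 1)
    thresholds = np.tile(BAYER, reps)[:h, :w]
    scaled = values * (out_levels - 1)
    return np.floor(scaled + thresholds).clip(0, out_levels - 1) / (out_levels - 1)

gradient = np.tile(np.linspace(0.0, 1.0, 256), (4, 1))
coarse = ordered_dither(gradient, out_levels=16)  # only 16 levels, yet no hard bands
print(coarse[:, :8])
```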
#71
Prince Valiant
Hood: My post was not about the evil deeds perpetrated by companies, or the normal sort of criticism they deserve, but about the toxic reactions of forum members, those folks who take each minor revelation and blow it up into a huge biased rant which degenerates into personal attacks against any who disagree.
I don't disagree about over-the-top reactions, or that people should control themselves.
bug: HDR is great for photos and other professional usage. But as usually happens, its adoption will be spearheaded/funded by clueless gamers first. Games can make use of HDR, but at the same time they are pretty well served by the software-emulated version as well (you're not going to notice HDR during fast-paced action, the same way you can't tell 4xAA from 8xAA or 16xAF from 8xAF during the same scenes).

In the grand scheme of things, HDR (more specifically HDR10) is about being able to cover more colour space than traditional displays. It's using a wider colour space (which we could already do), but at the same time being able to use more colours from that colour space. Up until now, even if you used Rec2020, you were still limited to 256 shades of R/G/B, thus limiting the maximum luminance delta. Of course, much like other things before it, having HDR support in place is one thing, wielding HDR properly is another, and that will come a little later.
If only they'd spearhead profile loading in games and/or LUT access on non-pro monitors while they're at it :p.

My worry is that it's going to be dragged out for years, much like OLED, before we start seeing it in the mainstream without price gouging.
#72
Dave65
nvidia and intel are cut from the same cloth...
#73
brutlern
Or you can just install a ReShade mod, get a similar effect, and save a few hundred bucks.

P.S.: I have not seen an HDR monitor in person, but based on the way the picture looks, you can easily achieve a similar look.
#74
medi01
brutlern: Or you can just install a ReShade mod, get a similar effect, and save a few hundred bucks.

P.S.: I have not seen an HDR monitor in person, but based on the way the picture looks, you can easily achieve a similar look.
What!?!?!
Hell no!!!
HDR covers a wider color gamut. You are able to display colors that a normal monitor isn't capable of.

Think of it like a TN vs IPS screen.

#75
londiste
Th3pwn3r: Your opinion isn't fact. It's subjective. Similar to audio and taste, it differs.
care to elaborate?
comparison with different content is apples to oranges.
with similar/same content, the difference is quite difficult to discern.