
VESA Introduces DisplayHDR True Black High Dynamic Range Standard for OLED Displays

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,356 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
The Video Electronics Standards Association (VESA) today introduced its new DisplayHDR True Black high dynamic range (HDR) standard, a variant of VESA's widely adopted High-Performance Monitor and Display Compliance Test Specification (DisplayHDR). The new standard has been optimized for emissive display technologies, including organic light emitting diode (OLED) and future microLED displays.

DisplayHDR True Black allows for up to 100X deeper black levels in addition to a greater dynamic range and a 4X improvement in rise time compared to VESA's DisplayHDR 1000 performance tier. This enables a visually stunning experience for home theater and gaming enthusiasts in subdued lighting environments. DisplayHDR and DisplayHDR True Black are the display industry's first fully open standards specifying HDR quality for LCD and emissive displays, respectively.



VESA also announced today that it has added a new 500 performance level to both the DisplayHDR and the DisplayHDR True Black standards to address the need for thin, ultra-lightweight HDR laptops. The new 500 level includes local dimming as well as the same color gamut, black level and bit-depth requirements associated with the 600 and 1000 levels with a small decrease in luminance compared to the 600 level, to bring about better thermal control in displays for super-thin notebooks. While the new 500 level is optimized for very small, ultra-slim displays, it actually applies to all resolutions and screen sizes, including those used in monitors.

Accelerated DisplayHDR Adoption Sets Stage for True Black
Since its introduction a year ago, VESA's DisplayHDR standard has seen widespread and growing adoption among LCD display OEMs. To date, nearly three dozen displays across nine display OEMs have been released to market with DisplayHDR certification. Many more are expected to be introduced in the coming months. With the introduction of the new DisplayHDR True Black standard, VESA anticipates a similarly strong adoption curve among OLED display OEMs as has occurred with the DisplayHDR standard.

"Embracing the new DisplayHDR True Black standard, OLED is the ideal display technology for mixing bright highlights with deep, true blacks, so consumers can create extraordinary content or simply appreciate incredibly breathtaking imagery on their PCs," said Jeremy Yun, vice president, OLED Marketing Team, Samsung Display Company. "The new standard, when coupled with VESA's DisplayHDR logo program, will show consumers that True Black represents a highly important step in enhancing gaming, TV or movie watching, as well as viewing and editing of photos and videos. Users can see and feel a dynamic range that yields a superior high-end HDR experience."

Deeper Black Levels with DisplayHDR True Black Standard
On LCD displays, what is considered "black" is actually a dark grey tone, the result of the minor light leakage common to these displays. VESA defined the new DisplayHDR True Black specification with emissive displays in mind to bring the permissible black level down to 0.0005 cd/m² - the lowest level that can be effectively measured with industry-standard colorimeters. For gamers and movie watchers in subdued lighting environments, displays adhering to the DisplayHDR True Black specification can provide incredibly accurate shadow detail and dramatic increases in dynamic range (up to 50X depending on lighting conditions) for a truly remarkable visual experience.
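
A quick sketch of the arithmetic behind these claims (the 0.05 cd/m² LCD black floor below is inferred from the article's own "100X deeper" comparison against DisplayHDR 1000, not quoted from the spec):

```python
# Contrast ratio implied by a panel's peak luminance and black level,
# both in cd/m² (nits). Illustrative only; values are not from the spec text.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

# DisplayHDR 1000 LCD tier: ~1000-nit peak; ~0.05 cd/m² black floor
# (0.05 inferred from the article's "100X deeper blacks" figure)
lcd = contrast_ratio(1000, 0.05)
# DisplayHDR True Black: 0.0005 cd/m² permissible black level
oled = contrast_ratio(1000, 0.0005)
print(f"LCD ~{lcd:,.0f}:1, emissive ~{oled:,.0f}:1 ({oled / lcd:.0f}x)")
```

At the same peak brightness, dropping the black floor by 100X multiplies the achievable contrast ratio by the same factor, which is where the "greater dynamic range" claim comes from.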

"When VESA unveiled the original DisplayHDR standard, we recognized that display technologies were quickly evolving, and we immediately set to work on developing a new open HDR standard for OLED and other emissive display technologies," stated Roland Wooster, chairman of the VESA task group responsible for DisplayHDR, and the association's representative from Intel Corporation for HDR display technology. "On behalf of all of the VESA member companies that contributed to the DisplayHDR True Black specification, I'm pleased to say that we are fulfilling our promise with today's announcement. We're extremely proud of this incredible, high contrast and high dynamic range standard. Consumers benefit from the transparency of the DisplayHDR True Black specification and logo, which makes it clear that they're getting a display that yields huge performance improvements in subdued lighting environments."

Added Wooster, "We're also very excited to include the new 500 performance tier for the DisplayHDR and DisplayHDR True Black standards, which provides true local dimming, high color quality and a high contrast ratio at the lowest price point and thermal impact for display OEMs. This combination makes the 500 level ideal for ultra-thin notebook designs, but it's equally applicable to monitors as well."

 
Joined
Aug 3, 2011
Messages
110 (0.02/day)
DisplayHDR is such a scam compared to HDR in TVs. 600 and up REQUIRE local dimming, which is useless for anything but getting the certification, since fewer than 10 zones are generally used due to cost, and that's useless for any sort of content viewing. The only other option is a 4000:1 contrast ratio (check the tunnel test in the DisplayHDR spec), which only the best VA panels can achieve. The only good thing that's come out of it is that you can guarantee the monitor is at least 8-bit, which has been long overdue for computer monitors.

This True Black standard is even more pointless since it's just another label + price hike to slap onto OLED monitors.
 
Joined
Sep 27, 2014
Messages
550 (0.16/day)
Third party standards are needed, otherwise we are left at manufacturer whims.
My "10 bit" monitor is actually 8 bit FRC that can't be used for HDR. But it has a big 10 bit slapped on.
Price hike? Sure, if the quality is there, why not? You pay extra for that panel anyway; at least now we have a common yardstick.
 
Joined
Aug 3, 2011
Messages
110 (0.02/day)
Because the quality isn't there. There's a DisplayHDR1000 monitor out now (the Philips Momentum) that only has 32 EDGE LIT dimming zones, which ends up degrading your image like crud if actually enabled. Even 32 FALD zones will look like crud in most cases. Forcing OEMs to use edge lit dimming just to achieve your shitty standard is a plain scam since nobody gets any benefit when using edge lit dimming.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
It's obvious that the Philips Momentum is a TV with DisplayPorts and a FreeSync range:
https://www.techspot.com/review/1738-philips-momentum-43/
It's not really suitable to be used as a monitor at all because "it uses a non-standard subpixel structure, which causes a bit of text blurriness in Windows."

As for lacking FALD:
While I could slam Philips more for using edge lit local dimming, the reality is this monitor’s HDR mode does provide an improvement over SDR because it comfortably provides two of the three key HDR pillars in brightness and color space. And in a lot of situations, the edge lit backlight does help improve the dynamic range and contrast of scenes. So it’s not like the HDR experience is awful because it lacks an FALD backlight, it’s definitely better than SDR. But the experience isn’t as good as with a proper HDR display that ticks every box, so I’d class the Momentum 43 as an HDR-lite display or something along those lines.

The good news is that the areas of HDR that the Momentum 43 does support, like brightness and color space, it supports really well. The panel can comfortably sustain over 900 nits of brightness regardless of the window size, and while peak brightness doesn’t quite hit 1000 nits, the 935 nits my unit can produce is absolutely blinding at a desk viewing distance. When you’re not experiencing the glow issues from the edge lit backlight, we’re also getting a contrast ratio over 40,000:1 in a best case scenario, which is great. And as for color space, 97% DCI-P3 coverage means the display can show a significantly higher number of colors than basic sRGB, which leads to more vibrant imagery in the HDR mode.
 
Joined
Aug 3, 2011
Messages
110 (0.02/day)
Yeah, a DisplayHDR1000 monitor that only classes as "an HDR-lite display or something along those lines" is a great way to show how much of a scam DisplayHDR specs are.
 

FordGT90Concept

"I go fast!1!11!1!"
Much better than SDR and 900+ nit average, I'd say that's worth certifying.
 
Joined
Jan 3, 2019
Messages
51 (0.03/day)
Wtf, this is for OLED monitors, so it should have an absolute black level.

And btw, why is my TechPowerUp account missing? I have been registered here since 2009.
 
Joined
Aug 3, 2011
Messages
110 (0.02/day)
900 nits? It's only usable around 500. Edge lit is as much of a scam as DisplayHDR.

https://pcmonitors.info/reviews/philips-436m6vbpab/

This was primarily due to the brightness being boosted to an impressive 1126 cd/m² (‘VESA HDR 1000’) or 1065 cd/m² (‘UHDA’). The black point was less impressive at 0.15 cd/m² – that’s similar to when the monitor runs at >80% brightness under SDR. The white square is so bright that it creates a significant halo of brightness around it, flooding some of the surrounding darkness. Even though the black point reading was taken around 18 – 19cm (7 – 8 inches) from the closest edge of the white square, the glow from the white square penetrated through in that region. Although not documented in the table, a similar thing happened with the smallest test patch size (‘1% of all pixels’). The ‘Normal’ HDR setting was more impressive in many respects. The brightness was more constrained but still very bright at 533 cd/m². This created a much more limited halo effect which didn’t affect the black point reading anywhere near as much, something that was also clear to the eye. We measured a much more pleasing black point of 0.03 cd/m² (similar to the monitor running SDR at ~10% brightness), yielding an impressive contrast ratio of 17767:1.
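
For what it's worth, the review's contrast figure checks out: it is simply the measured white luminance divided by the measured black point. A quick sketch using the quoted numbers:

```python
# Values from the quoted pcmonitors.info measurements ('Normal' HDR setting)
white_nits = 533.0   # sustained brightness, cd/m²
black_nits = 0.03    # measured black point, cd/m²

contrast = white_nits / black_nits
print(f"{round(contrast)}:1")  # 17767:1, matching the quoted figure
```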
 
Joined
Nov 29, 2016
Messages
667 (0.25/day)
System Name Unimatrix
Processor Intel i9-9900K @ 5.0GHz
Motherboard ASRock x390 Taichi Ultimate
Cooling Custom Loop
Memory 32GB GSkill TridentZ RGB DDR4 @ 3400MHz 14-14-14-32
Video Card(s) EVGA 2080 with Heatkiller Water Block
Storage 2x Samsung 960 Pro 512GB M.2 SSD in RAID 0, 1x WD Blue 1TB M.2 SSD
Display(s) Alienware 34" Ultrawide 3440x1440
Case CoolerMaster P500M Mesh
Power Supply Seasonic Prime Titanium 850W
Keyboard Corsair K75
Benchmark Scores Really Really High
This is so stupid. Why would you need to certify OLED and Micro-LED? How can black be blacker than off? EVERY OLED and Micro-LED should be automatically "certified" without paying some stupid fee to get it "certified".
 

FordGT90Concept

"I go fast!1!11!1!"
I suspect that VESA DisplayHDR certification costs the same as FreeSync certification: the cost of shipping the display (it has to get to VESA somehow) plus the value of the display shipped (the implication being that it will not be returned). VESA member dues cover the labor of certification, whereas AMD eats the FreeSync certification cost as part of its quality assurance program.
 
Joined
Jun 16, 2016
Messages
409 (0.14/day)
System Name Baxter
Processor Intel i7-5775C @ 4.2 GHz 1.35 V
Motherboard ASRock Z97-E ITX/AC
Cooling Scythe Big Shuriken 3 with Noctua NF-A12 fan
Memory 16 GB 2400 MHz CL11 HyperX Savage DDR3
Video Card(s) EVGA RTX 2070 Super Black @ 1950 MHz
Storage 1 TB Sabrent Rocket 2242 NVMe SSD (boot), 500 GB Samsung 850 EVO, and 4TB Toshiba X300 7200 RPM HDD
Display(s) Vizio P65-F1 4KTV (4k60 with HDR or 1080p120)
Case Raijintek Ophion
Audio Device(s) HDMI PCM 5.1, Vizio 5.1 surround sound
Power Supply Corsair SF600 Platinum 600 W SFX PSU
Mouse Logitech MX Master 2S
Keyboard Logitech G613 and Microsoft Media Keyboard
Third party standards are needed, otherwise we are left at manufacturer whims.
My "10 bit" monitor is actually 8 bit FRC that can't be used for HDR. But it has a big 10 bit slapped on.
Price hike? Sure, if the quality is there, why not? You pay extra for that panel anyway; at least now we have a common yardstick.

Your "10 bit" monitor probably can display 10-bit color with FRC, at least displaying more colors than plain 8-bit, but that doesn't necessarily mean it can accept HDR data. HDR contains 10-bit, but it is not 10-bit. So I think that's more on you for buying a monitor for HDR without checking that it actually supports HDR10 or better. You can still do 10-bit, but you're not going to find much content that is 10-bit and not HDR; probably just photo editing.
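
For readers unfamiliar with FRC (frame rate control): an 8-bit+FRC panel fakes in-between 10-bit levels by alternating two adjacent 8-bit levels across frames, so the time average approximates the 10-bit value. A toy sketch; `frc_frames` is a hypothetical illustration, not any real panel-driver API:

```python
def frc_frames(level_10bit: int, n_frames: int = 4) -> list[int]:
    """Return the 8-bit level shown on each frame so that the time average
    approximates level_10bit / 4 (10-bit spans 4x the range of 8-bit)."""
    base, frac = divmod(level_10bit, 4)
    # Show the next-higher 8-bit level on `frac` out of every 4 frames
    return [min(base + 1, 255) if i < frac else base for i in range(n_frames)]

frames = frc_frames(513)                   # 10-bit level between 8-bit 128 and 129
print(frames, sum(frames) / len(frames))   # [129, 128, 128, 128] 128.25
```

A true 10-bit panel would display that level directly; FRC only approximates it over time, which is why a panel can look "10 bit" on a spec sheet while its input path does not necessarily accept an HDR signal.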
 
Joined
Sep 27, 2014
Messages
550 (0.16/day)
HDR contains 10-bit, but it is not 10-bit.
My statement above was just an example of words that mean nothing without standards.
But hey, I'll play... Can you explain what exactly is that difference that HDR is "10 bit but not 10 bit"?
Because all those guys in the industry don't seem to agree yet to what HDR really means.
 
Joined
Apr 19, 2018
Messages
978 (0.45/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
Well their previous standards have been a complete joke, enabling completely junk monitors that outright lie about HDR support, for all to "enjoy", for a premium, of course. At least there's no standard here, as OLED does what OLED does, and VESA can't really fuck that up too much, unlike the shitshow that their LCD based "standards" have done to the market.

Mind you, it's kind of the same as THX, and what a joke that turned into when they lowered the specs so that Chinese junk could be included... $79 desktop speakers with 3" drivers, THX certified! Yeah baby, I'll take 3!
 
Joined
Sep 17, 2014
Messages
20,906 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
DisplayHDR is such a scam compared to HDR in TVs. 600 and up REQUIRE local dimming, which is useless for anything but getting the certification, since fewer than 10 zones are generally used due to cost, and that's useless for any sort of content viewing. The only other option is a 4000:1 contrast ratio (check the tunnel test in the DisplayHDR spec), which only the best VA panels can achieve. The only good thing that's come out of it is that you can guarantee the monitor is at least 8-bit, which has been long overdue for computer monitors.

This True Black standard is even more pointless since it's just another label + price hike to slap onto OLED monitors.
900 nits? It's only usable around 500. Edge lit is as much of a scam as DisplayHDR.

https://pcmonitors.info/reviews/philips-436m6vbpab/

Thank you, THANK YOU.

At least some people get it. The fact they need all those standards is a bad attempt at 'HD ready' scam practices once again.
 
Joined
Nov 19, 2012
Messages
753 (0.18/day)
System Name Chaos
Processor Intel Core i5 4590K @ 4.0 GHz
Motherboard MSI Z97 MPower MAX AC
Cooling Arctic Cooling Freezer i30 + MX4
Memory 4x4 GB Kingston HyperX Beast 2400 GT/s CL11
Video Card(s) Palit GTX 1070 Dual @ stock
Storage 256GB Samsung 840 Pro SSD + 1 TB WD Green (Idle timer off) + 320 GB WD Blue
Display(s) Dell U2515H
Case Fractal Design Define R3
Audio Device(s) Onboard
Power Supply Corsair HX750 Platinum
Mouse CM Storm Recon
Keyboard CM Storm Quickfire Pro (MX Red)
This is so stupid. Why would you need to certify OLED and Micro-LED? How can black be blacker than off? EVERY OLED and Micro-LED should be automatically "certified" without paying some stupid fee to get it "certified".

Actually, the trouble is that the transition from full off (let's call this [0,0,0] in terms of RGB) to any other level is orders of magnitude slower than going from, say, [1,1,1] to any other illumination level. Do note that I am still very much against this type of certification, which is only loosely based on agreed-upon industry standards.

So far, DisplayHDR seems an even less useful measure of quality than 80 Plus is for PSUs. Now if it were formalized along the lines of Cybenetics' certification (created out of necessity due to 80 Plus' lack of detail and moral integrity), which has a sound, in-depth test methodology with all of the testing data and details freely available, explained and easily readable, then it would make sense. I would absolutely love for someone to actually test rise/fall times, backlight flickering and RTC overshoot using an oscilloscope under proper (constant and repeatable) lab conditions, and then compile that along with color gamut, color accuracy, contrast stability, power consumption, etc. into one or more performance-category certifications. There are already websites that host reviewers/testers who do many or all of these things, but they are relatively few, and there are not enough commonalities between them for their end results to be directly comparable.

Should we as (primarily) consumers rally together, propose a unification of the best approaches from existing test platforms and have reviewers sit down together and propose a standardized battery of tests? Because as I see it, the industry as a whole - meaning developers, manufacturers, distributors and resellers - shows no intent nor interest in doing anything of the sort.
 
Joined
Nov 3, 2013
Messages
2,141 (0.56/day)
Location
Serbia
Processor Ryzen 3600
Motherboard X570 I Aorus Pro
Cooling Deepcool AG400
Memory HyperX Fury 2 x 8GB 3200 CL16
Video Card(s) RX 470 Nitro+ 4GB
Storage SX8200 Pro 512 / NV2 512
Display(s) 24G2U
Case NR200P
Power Supply Ion SFX 650
Mouse G703
Keyboard Keychron V1 (Akko Matcha Green) / Apex m500 (gateron milky yellow)
Software W10
Every time someone unironically mentions "HDR" and "Standard" in the same sentence, God kills 20 kittens. At least.
This thing passed the level of a bad joke a while ago... calling it a dumpster fire would be an understatement.
 

bug

Joined
May 22, 2015
Messages
13,213 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
DisplayHDR is such a scam compared to HDR in TVs. 600 and up REQUIRE local dimming, which is useless for anything but getting the certification, since fewer than 10 zones are generally used due to cost, and that's useless for any sort of content viewing. The only other option is a 4000:1 contrast ratio (check the tunnel test in the DisplayHDR spec), which only the best VA panels can achieve. The only good thing that's come out of it is that you can guarantee the monitor is at least 8-bit, which has been long overdue for computer monitors.

This True Black standard is even more pointless since it's just another label + price hike to slap onto OLED monitors.
You've posted this on Anandtech too, and it doesn't make more sense here.

There's nothing stopping VESA from requiring 10,000:1 static contrast and true 10-bit panels. But then nobody would be able to build a certified device. Because, you know, having a standard does not improve part quality.
LCDs today can't do much better than 3,000:1, and that's not enough for HDR. How would you have gone about this?
Also, you can't have a 10-bit panel for less than $2,000 or $3,000. They are simply a no-go in the consumer space.

Now what I'd like to point out is that the OLED specs are geared towards OLED's deeper blacks. So you'd have to be watching in a pretty dark room, otherwise ambient light will kill dark details.
 
Joined
Aug 20, 2007
Messages
20,763 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
Can you explain what exactly is that difference that HDR is "10 bit but not 10 bit"?

Basically, not all 10 bits are used for color information, even if present. Some are used for brightness encoding or some shenanigans I assume.
 
Joined
Sep 27, 2014
Messages
550 (0.16/day)
Basically, not all 10 bits are used for color information, even if present. Some are used for brightness encoding or some shenanigans I assume.
Nope. They are all there to be used.

From what I see, the only difference is in the maximum luminance of the panel (nits). My 10-bit monitor is only 300 nits, below the 400 (or 600) required for HDR.
A big difference is in who is charging the "HDR" certification money, and that's what nobody seems to agree on.
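
One concrete way to see where the "10 bit but not 10 bit" confusion comes from: HDR10 really does carry 10 bits per channel, but it maps those code values to light through the SMPTE ST 2084 (PQ) curve, which encodes absolute luminance up to 10,000 cd/m², rather than through a conventional gamma. A sketch of the PQ EOTF (constants are from ST 2084):

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384            # ~0.1593
M2 = 2523 / 4096 * 128       # ~78.84
C1 = 3424 / 4096             # ~0.8359
C2 = 2413 / 4096 * 32        # ~18.85
C3 = 2392 / 4096 * 32        # ~18.69

def pq_eotf(code: int, bits: int = 10) -> float:
    """Map a PQ-encoded code value to absolute luminance in cd/m²."""
    e = code / (2 ** bits - 1)               # normalized signal, 0..1
    p = e ** (1 / M2)
    return 10_000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(1023))           # 10000.0: full code = 10,000 nits
print(round(pq_eotf(512), 1))  # the mid code sits far below mid luminance
```

The same 10 bits thus cover a vastly wider luminance range than a 10-bit SDR gamma signal would, which is why "10 bit" alone says nothing about whether a monitor accepts HDR.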

Every time someone unironically mentions "HDR" and "Standard" in the same sentence, God kills 20 kittens. At least.
This thing passed the level of a bad joke a while ago... calling it a dumpster fire would be an understatement.
That's more poetic, but it's what I was also trying to convey.

AMD got into this HDR game too:
https://www.pcgamesn.com/amd-freesync-2-hdr-standards
https://www.techpowerup.com/245533/amd-comments-on-freesync-2-hdr-controversy
 

FordGT90Concept

"I go fast!1!11!1!"
FreeSync 2 HDR is free except the cost of shipping a monitor and the value of the monitor to AMD for testing.

HDR is mostly about obscene contrast ratios and you get those obscene contrast ratios by having bright backlights and LCDs that can block most of it from passing through.

HDR certification verifies claims manufacturers make in regards to the monitor's capabilities: no more, no less.
 

bug

FreeSync 2 HDR is free except the cost of shipping a monitor and the value of the monitor to AMD for testing.

HDR is mostly about obscene contrast ratios and you get those obscene contrast ratios by having bright backlights and LCDs that can block most of it from passing through.

HDR certification verifies claims manufacturers make in regards to the monitor's capabilities: no more, no less.
Actually, HDR is not about obscene contrast. It's about contrast closer to what the human eye can see. Display devices have been restricted in this department since they were born. HDR is a step towards closing that gap.

Similar to how ray tracing can make rendering a scene more straightforward and yield more realistic results, HDR can work towards eliminating all sorts of perceptual approximations and make it easier to determine what info to retain in a scene. Sure, the approximations we use (in both cases) are quite good and getting the hang of the new workflow will not be instantaneous, but generally, the fewer approximations/reductions employed, the better the result.
 

FordGT90Concept

"I go fast!1!11!1!"
True enough. Nothing is going to beat the blackness of a new-moon night nor the brightness of the midday sun. DisplayHDR is trying to push the boundaries of what LCDs can do for the everyman and clarify options by categorizing them.
 

bug

True enough. Nothing is going to beat the blackness of a new-moon night nor the brightness of the midday sun. DisplayHDR is trying to push the boundaries of what LCDs can do for the everyman and clarify options by categorizing them.
And HDR10 does the same for mastering and conveying the source material.
 
Joined
Aug 3, 2011
Messages
110 (0.02/day)
You've posted this on Anandtech too, and it doesn't make more sense here.
What doesn't make sense? I'm not going to explain how HDR works but you can at least read the official DisplayHDR specs from VESA's site. I'll even attach it for you to make things easy.

There's nothing stopping VESA from requiring 10,000:1 static contrast and true 10-bit panels. But then nobody would be able to build a certified device. Because, you know, having a standard does not improve part quality.
Nobody mentioned 10,000:1. True 10-bit, however, should be required for the highest level of HDR, just as most high-end TVs use true 10-bit panels.

LCDs today can't do much better than 3,000:1, and that's not enough for HDR. How would you have gone about this?
Correction: IPS panels can't do much better than 2,000:1, so they should not be certified for HDR. However, this isn't even the case for the DisplayHDR spec. Go check section 5.2 in the PDF. The luminance levels from the corner box test state that HDR400 would need a 1000:1 contrast ratio, HDR600 needs 6000:1, and HDR1000 needs 12000:1. These contrast ratios are only achievable with some sort of local dimming. Guess what's an easy way to cheat this? Edge-lit local dimming. They even flat out say the Tunnel test "Ensures that either the native panel contrast ratio is at least 4000:1, –or that local or global dimming has been implemented." [pg 22]
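
Taking the contrast figures above at face value (and assuming each tier's peak luminance equals its name, e.g. 1000 nits for HDR1000), the implied corner-box black levels work out as follows; this is back-of-the-envelope arithmetic, not numbers quoted from the spec PDF:

```python
# (peak nits assumed from the tier name; contrast ratios as quoted above)
tiers = {
    "DisplayHDR 400": (400, 1_000),
    "DisplayHDR 600": (600, 6_000),
    "DisplayHDR 1000": (1000, 12_000),
}
for name, (peak_nits, ratio) in tiers.items():
    # black level that just meets the required contrast at full peak white
    print(f"{name}: corner black <= {peak_nits / ratio:.4f} cd/m² for {ratio}:1")
```

Roughly 0.4, 0.1, and 0.08 cd/m² respectively, which shows why the upper tiers are out of reach for a static IPS or even VA panel without some form of dimming.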

TVs easily get between 5000:1 and 6000:1 static contrast from VA. Why do most mid- to high-end TVs (LG excluded, since they only have OLED or IPS) use VA panels? Because that's the only way you get proper HDR outside of OLED, which only LG makes. Why is this different for monitors? Because IPS is predominant here: OEMs, especially the ones that put a ton of effort into IPS development (AU Optronics, Sharp, and LG most notably), ignored everything else. Samsung is probably the only major player in monitor panels that's still improving VA tech.

Also, you can't have a 10bit panel for less than $2,000 or $3,000. They are simply a no go in the consumer space.
But why is it possible for TVs? Why are you able to buy a true 10-bit TV many times the size of a computer monitor for under $2K? Because OEMs are neglecting the computer monitor space, so the tech is years behind as well as super expensive. VESA is only aiding the OEMs with their rushed, cruddy DisplayHDR specs as a me-too move.

There are plenty more issues with the whole HDR thing, most specifically with content, but that's another discussion. VESA's DisplayHDR spec for OLED is pointless due to the nature of OLED, and the LCD one is just garbage since it requires local dimming for 600 and up.
 

Attachments

  • DisplayHDR_CTS_v1.0.pdf
    564.6 KB · Views: 360