
Samsung, Amazon Unveil Yet Another HDR Standard

Discussion in 'News' started by Raevenlord, Apr 20, 2017.

  1. Raevenlord

    Raevenlord News Editor Staff Member

    Joined:
    Aug 12, 2016
    Messages:
    864 (2.49/day)
    Thanks Received:
    847
And here I was thinking the whole point of having standards was to homogenize offerings for a given feature, ensuring the same minimal requirements were met by anyone (or any product) looking to carry a sticker emblazoned with its capabilities. Yet here it is, another HDR standard, which Samsung and Amazon are calling HDR10+.

The HDR10+ standard looks to narrow the gap between the HDR10 standard as certified by the UHD Alliance and Dolby Vision, which boasts better HDR reproduction while carrying stricter specifications to be adhered to. The greatest change in HDR10+ is the adoption of Dynamic Tone Mapping, which relies on variable dynamic metadata to adjust brightness and contrast in real time, optimized on a frame-by-frame basis. The feature is present in Dolby Vision but lacking in the UHD Alliance's HDR10, whose static metadata resulted in some scenes being rendered overly dark.
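To make the static-vs-dynamic metadata distinction concrete, here is a toy sketch. The `tone_map` function and the nit values are hypothetical simplifications; the real HDR10+/Dolby Vision tone curves (SMPTE ST 2094) are far more sophisticated than a linear scale.

```python
def tone_map(pixel_nits, scene_max_nits, display_max_nits):
    """Toy tone-mapping operator: compress luminance so that the scene's
    stated maximum lands at the display's peak brightness."""
    scale = min(1.0, display_max_nits / scene_max_nits)
    return pixel_nits * scale

DISPLAY_PEAK = 500.0  # a mid-range HDR TV

# Static metadata (HDR10): one max value for the entire film, so a dim
# scene gets compressed as hard as the brightest scene in the movie.
film_max = 4000.0
static_result = tone_map(100.0, film_max, DISPLAY_PEAK)    # 12.5 nits: crushed

# Dynamic metadata (HDR10+): a per-scene/per-frame max, so the same dim
# scene keeps its intended brightness on the same display.
scene_max = 100.0
dynamic_result = tone_map(100.0, scene_max, DISPLAY_PEAK)  # 100.0 nits
```

With one film-wide maximum, everything is scaled against the brightest highlight in the whole movie, which is exactly the "overly darkened scenes" complaint against static HDR10.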


The presence of both Samsung and Amazon in this announcement isn't fortuitous: a hardware manufacturer and a content provider are, in this case, joining hands to deliver an improved experience and an ecosystem adapted to this new standard. For one, Samsung's 2016 UHD TVs will get HDR10+ support through a firmware update during the second half of 2017 (all of Samsung's 2017 UHD TVs, including its premium QLED lineup, support HDR10+). This ensures hardware support for new HDR10+ content, whose dynamic metadata must be included within the video file before it can be decoded, hence relying on content creators to adopt it - and Amazon is one of the prime content providers for the latest technologies. What a happy coincidence.

    Source: News @ Samsung.com
     
  2. deemon

    Joined:
    Apr 16, 2015
    Messages:
    280 (0.34/day)
    Thanks Received:
    65
    Dolby Vision isn't standard... it's proprietary shit like G-Sync and the like, that needs to die!
     
  3. R-T-B

    R-T-B

    Joined:
    Aug 20, 2007
    Messages:
    6,594 (1.82/day)
    Thanks Received:
    5,896
It is a standard, just a proprietary standard. Standards don't have to be open. Heck, Betamax was a standard.
     
  4. Solidstate89

    Solidstate89

    Joined:
    May 29, 2012
    Messages:
    400 (0.21/day)
    Thanks Received:
    135
    Being proprietary has no effect on whether it is still a standard or not. Especially when it's as influential as Dolby.

    I don't know why you would even think otherwise.
     
  5. Xzibit

    Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    2,230 (1.17/day)
    Thanks Received:
    1,209
The UHD Alliance Premium cert is an entry-level sticker. It only requires you to meet 90% of the color gamut at 1k nits (540 nits on OLEDs). A "good enough" stamp of approval. They still have issues with certifying local dimming tech, and of course it doesn't cover the audio side of things.
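For reference, the published Ultra HD Premium display criteria are (to the best of my knowledge) at least 90% DCI-P3 coverage, plus one of two brightness/black-level combos: ≥1,000 nits peak with ≤0.05 nits black, or ≥540 nits peak with ≤0.0005 nits black (the path that lets OLEDs qualify). A toy eligibility check, assuming those figures:

```python
def uhd_premium_eligible(p3_coverage, peak_nits, black_nits):
    """Rough check against the UHD Alliance 'Ultra HD Premium' display
    criteria: >=90% DCI-P3 coverage, plus one of two peak/black combos
    (the second combo is what lets lower-brightness OLEDs qualify)."""
    if p3_coverage < 0.90:
        return False
    lcd_path = peak_nits >= 1000 and black_nits <= 0.05
    oled_path = peak_nits >= 540 and black_nits <= 0.0005
    return lcd_path or oled_path

# A 1,000-nit LCD with 0.04-nit blacks qualifies; a 700-nit LCD with the
# same blacks does not; a 600-nit OLED with near-zero blacks does.
```

Note how low the bar is relative to Dolby Vision's mastering targets, which is the "entry level sticker" point above.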
     
  6. ZoneDymo

    ZoneDymo

    Joined:
    Feb 11, 2009
    Messages:
    1,574 (0.51/day)
    Thanks Received:
    554
Erm, which standard is not proprietary? Pretty sure HDMI etc. is also owned.
     
  7. notb

    Joined:
    Jun 28, 2016
    Messages:
    667 (1.70/day)
    Thanks Received:
    174
    Location:
    Warsaw, Poland
    Of course. And it's not free, obviously:
    "HDMI manufacturers pay an annual fee of US$10,000 plus a royalty rate of $0.15 per unit, reduced to $0.05 if the HDMI logo is used, and further reduced to $0.04 if HDCP is also implemented."
    en.wikipedia.org/wiki/HDMI

It's very unlikely for a community-originated solution to become a large industry standard in electronics. Almost every one of these "standards" was started by large manufacturers (often as a joint venture).

    Even the ubiquitous USB is protected by a (non-free) license.

And here's an interesting difference: if you're making an HDMI cable, you get a discount for adding the HDMI logo. It's exactly the opposite with USB, where you have to pay extra to put a USB logo on your product. :D
This is a beautiful example of the difference between a dominant, well-known product and one that still fights for market share and recognition. :D
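Taking the quoted Wikipedia figures at face value, the economics are simple arithmetic (hypothetical `hdmi_royalty` helper; the numbers come straight from the quote above):

```python
def hdmi_royalty(units, annual_fee=10_000, per_unit=0.15):
    """Yearly cost at the quoted HDMI adopter rates:
    a flat annual fee plus a per-unit royalty."""
    return annual_fee + units * per_unit

# At 1M units/year: $160,000 at the base $0.15 rate, but only $50,000
# at the $0.04 logo+HDCP rate -- the logo "discount" is worth $110,000
# a year at that volume, which is why everyone prints the logo.
base = hdmi_royalty(1_000_000)                       # 160000.0
discounted = hdmi_royalty(1_000_000, per_unit=0.04)  # 50000.0
```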
     
    Aenra says thanks.
  8. RejZoR

    RejZoR

    Joined:
    Oct 2, 2004
    Messages:
    11,369 (2.43/day)
    Thanks Received:
    5,185
    Location:
    Europe/Slovenia
So buying HDR-capable devices is pointless at this point, with so many standards and no one knowing which one will end up being used...
     
    10 Year Member at TPU
  9. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    20,256 (6.32/day)
    Thanks Received:
    9,420
    Location:
    IA, USA
Pretty sure MPEG-H Part 14 (HDR10) is the de facto standard for HDR. It is what ATSC 3.0 will use, and no doubt PAL as well.
     
    Last edited: Apr 21, 2017
    Crunching for Team TPU
  10. medi01

    Joined:
    Jul 9, 2015
    Messages:
    1,086 (1.45/day)
    Thanks Received:
    321
I don't see any gap between the (free) HDR10+ and Dolby Vision.
The main difference was that metadata in HDR10 was static; now it isn't anymore.

Oh, and did I mention you don't need to pay Dolby (or anyone) for every device or piece of software that supports HDR10+?

To my knowledge LG is the only TV manufacturer that has bothered supporting Dolby Vision (on top of HDR10, of course). But given the price of its OLED TVs, the license fee is peanuts for them anyhow.


    They don't have to be open, but given that we have HDR10 that is free, no thanks (and f*ck you, Dolby)

The "we invest a couple of million into a standard to shave off billions for nothing later on" crap needs to stop. (And I'm glad Samsung and Sony bothered.)

No, that's a different thing (encoding-based).
HDR10 vs. Dolby Vision is about how you communicate that to a device.

Since device max brightness (and other capabilities) is expected to vary a lot, "additional work" is needed to map from source to device in an optimal way.


There are only two standards. OK, this one makes a third, but next to nobody supports Dolby Vision, and good luck taking on a standard established by the actual TV manufacturers (and available royalty-free).

You should care more about the actual capabilities of the thing carrying the "HDR" label, as they can vary wildly.
     
    Last edited: Apr 21, 2017
  11. Brusfantomet

    Joined:
    Mar 23, 2012
    Messages:
    654 (0.34/day)
    Thanks Received:
    220
    Location:
    Norway
Not a big fan of these dynamic backlighting solutions; all the TVs I have seen with local dimming look bad.

As for the thing with standards:
[Image: xkcd #927, "Standards"]
(from https://xkcd.com/927/)
     
    FordGT90Concept, Xzibit and Aquinus say thanks.
  12. medi01

    Joined:
    Jul 9, 2015
    Messages:
    1,086 (1.45/day)
    Thanks Received:
    321
  13. Xzibit

    Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    2,230 (1.17/day)
    Thanks Received:
    1,209
That strip is kind of right.

Dolby Vision already had these standards covered, but HDR10 didn't. The UHD Alliance seems not to take them into consideration until the majority of its body agrees on them. There is a gap between innovating new ways to meet a higher standard and those ways being accepted.

If you recall, Samsung had been pushing "HDR 1000" for nits capability, and with the QLEDs they are pushing "HDR 1500" and "HDR 2000", which puts them closer to DV territory.


LEDs were already capable of 1,000+ nits, but they looked horrid without local dimming grids. OLED just can't get bright enough to reach 1k, let alone DV's 4k. And who knows what comes next: HDR12 at who-knows-how-many nits and DV at 10k nits? And that's just one aspect of the standards.
     
    Last edited: Apr 21, 2017
  14. IvanP91v

    Joined:
    Oct 2, 2011
    Messages:
    31 (0.01/day)
    Thanks Received:
    7
Do we really need more than one standard to display 1.07 billion colors?
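For context, the 1.07 billion figure is plain 10-bit-per-channel arithmetic (a quick worked example; HDR's dynamic-range claims are about luminance, not just color count):

```python
def color_count(bits_per_channel, channels=3):
    """Number of addressable colors for a given bit depth: each channel
    has 2**bits levels, and the channels combine multiplicatively."""
    return (2 ** bits_per_channel) ** channels

# 8-bit:  256 levels per channel -> 256**3  = 16,777,216 ("16.7 million colors")
# 10-bit: 1024 levels per channel -> 1024**3 = 1,073,741,824 (~1.07 billion)
sdr_colors = color_count(8)
hdr_colors = color_count(10)
```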
     
