Friday, November 18th 2016

AMD Radeon GPUs Limit HDR Color Depth to 8bpc Over HDMI 2.0

High dynamic range, or HDR, is all the rage these days as the next big thing in display output, now that hardware has had time to catch up with ever-increasing display resolutions such as 4K Ultra HD, 5K, and the various ultra-wide formats. Hardware-accelerated HDR is getting a push from both AMD and NVIDIA in this round of GPUs. While games with HDR date back to Half-Life 2, hardware-accelerated formats that minimize work for game developers, in which the hardware makes sense of an image and adjusts its output range, are new and require substantial compute power. They also require additional interface bandwidth between the GPU and the display, since GPUs rely on wider color palettes such as 10 bpc (1.07 billion colors) to generate HDR images. AMD Radeon GPUs are facing difficulties in this area.
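For a sense of scale, the gap between those two palettes is plain arithmetic; the short sketch below (an illustration of my own, not something from the article) simply counts distinct RGB values per bit depth:

```python
# Illustrative only: distinct RGB values at a given bit depth per channel,
# assuming three color channels and no alpha.
def rgb_colors(bits_per_channel: int) -> int:
    return 2 ** (3 * bits_per_channel)

print(f"8 bpc:  {rgb_colors(8):,}")    # 16,777,216    (~16.7 million colors)
print(f"10 bpc: {rgb_colors(10):,}")   # 1,073,741,824 (~1.07 billion colors)
```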

German tech publication Heise.de discovered that AMD Radeon GPUs render HDR games (games that take advantage of new-generation hardware HDR, such as "Shadow Warrior 2") at a reduced color depth of 8 bits per color channel (16.7 million colors, or 32-bit with alpha) if your display (e.g., a 4K HDR-ready TV) is connected over HDMI 2.0 rather than DisplayPort 1.2 (or later). The desired 10 bits per channel (1.07 billion colors) palette is available only when your HDR display runs over DisplayPort. This could be a problem, since most HDR-ready displays these days are TVs. Heise.de observes that AMD GPUs reduce output from the desired full YCbCr 4:4:4 sampling to 4:2:2 or 4:2:0 (chroma subsampling) when the display is connected over HDMI 2.0. The publication also suspects that the limitation is prevalent on all AMD "Polaris" GPUs, including the ones that drive game consoles such as the PS4 Pro.
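A rough back-of-the-envelope calculation shows why something has to give at 4K 60 Hz over HDMI 2.0. The sketch below is my own and deliberately simplified: it counts only active pixels, ignores blanking intervals, and takes HDMI 2.0's 18 Gb/s raw rate down to roughly 14.4 Gb/s of usable bandwidth after 8b/10b encoding.

```python
# Rough sketch: active-pixel data rates for 4K60 at various bit depths and
# chroma formats, compared against HDMI 2.0's effective bandwidth.
HDMI_2_0_EFFECTIVE_GBPS = 18.0 * 8 / 10   # 18 Gb/s raw, 8b/10b encoding -> 14.4 Gb/s

def video_gbps(width, height, fps, bits_per_channel, samples_per_pixel):
    # samples_per_pixel: 3.0 for 4:4:4, 2.0 for 4:2:2, 1.5 for 4:2:0
    return width * height * fps * bits_per_channel * samples_per_pixel / 1e9

for label, bpc, spp in [("8-bit 4:4:4 ", 8, 3.0),
                        ("10-bit 4:4:4", 10, 3.0),
                        ("10-bit 4:2:2", 10, 2.0),
                        ("10-bit 4:2:0", 10, 1.5)]:
    rate = video_gbps(3840, 2160, 60, bpc, spp)
    verdict = "fits" if rate <= HDMI_2_0_EFFECTIVE_GBPS else "exceeds"
    print(f"{label}: {rate:5.2f} Gb/s ({verdict} ~14.4 Gb/s)")
```

On those assumptions, 10-bit 4:4:4 at 4K60 just barely overshoots, which is consistent with drivers falling back to either 8-bit 4:4:4 or 10-bit with chroma subsampling.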
Source: Heise.de

126 Comments on AMD Radeon GPUs Limit HDR Color Depth to 8bpc Over HDMI 2.0

#1
RejZoR
There must be a technical reason behind this, considering Polaris is a new GPU that now has HDMI 2.0 support (unlike the R9 Fury, which was still HDMI 1.4). The question is, why. It can't be cheating or GPU processing savings, since DisplayPort does work with HDR in full-fat mode. So, what is it? Unless the HDMI 2.0 support isn't actually HDMI 2.0. Somehow. That would kinda suck.
#2
bug
This is weird, because HDMI 2.0 offers the same bandwidth as DP 1.2. Probably a bug in the drivers?
#3
MyTechAddiction
I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?
#4
TheLostSwede
News Editor
MyTechAddiction: I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?
The point is HDR, which the human eye can see.
#5
zlatan
Jesus TPU, this is an HDMI limitation, not an AMD one! The only question is what is better ... doing 8-bit with full 4:4:4 chroma, or doing 10-bit with 4:2:2 subsampling. In theory the latter is better, but this hugely depends on the game's tonemapping pipeline. Today it is hard to design a pipeline for both traditional display signals and new HDR output signals. The industry doesn't have enough experience to do this. So the tonemapping pipeline in a game is designed for 8 bits per channel. So even if the hardware can do 10-bit with 4:2:2 at 4K@60 fps, it is not useful to support this option until the game's tonemapping pipeline is mature enough.
#6
ShurikN
German tech publication Heise.de..........
.........The publication also suspects that the limitation is prevalent on all AMD "Polaris" GPUs, including the ones that drive game consoles such as the PS4 Pro.
Something you forgot to mention
In a test at Heise, they checked out Shadow Warrior 2 in HDR on a Radeon RX 480, which showed similar visual results to a GeForce GTX 1080. So it seems this is the case for Nvidia as well, and likely Nvidia is using a similar trick at 8-bit. Nvidia has not yet shared info on this, though. According to Heise, they did see a decrease in performance with Nvidia, whereas the RX 480's performance remained the same.
#7
RejZoR
MyTechAddiction: I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?
Just like the human eye can't see more than 24 fps, right? Things are not so simple for computers, especially before the image actually reaches the output display device. Those 24 bits are just the number of solid colors. You need an additional 8 bits to represent transparency, which is why we address it as 32-bit while the color space is actually 24-bit (+8-bit transparency).

I've been working with 3D raytracing, and the tool had an option for 40-bit image rendering which significantly increased rendering time, but decreased or even eliminated color banding in the output image (especially when that got scaled down later to be stored as an image file, since mainstream formats like JPG, PNG or BMP don't support 40 bits).
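A quick way to picture the banding RejZoR is describing (a hypothetical sketch, not the raytracing tool he mentions): quantize a smooth ramp to different bit depths and count how many distinct steps survive.

```python
# Hypothetical illustration: quantizing a smooth gradient to a lower bit depth
# leaves fewer distinct steps, which shows up on screen as visible banding.
import numpy as np

gradient = np.linspace(0.0, 1.0, 3840)   # smooth ramp across a 4K-wide scanline

def quantize(signal, bits):
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

for bits in (8, 10):
    steps = len(np.unique(quantize(gradient, bits)))
    print(f"{bits}-bit: {steps} distinct steps")   # 256 vs 1024 -> bands ~4x narrower at 10-bit
```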
#8
snakefist
[irony on]

Of course the human eye can see a 10-bit palette accurately - just as the human ear is capable of hearing the frequencies cats and dogs hear, and of clearly distinguishing the debilitating difference between 22.1 kHz and 24 kHz (let alone 48 kHz!). Also, all displays in use accurately show 100% of the visible spectrum, having no technological limitations whatsoever, so this is of utmost importance! All of humanity's vision and hearing are flawless, too... And we ALL need 40 MP cameras for pictures later reproduced on FHD or, occasionally, 4K displays... especially on mobile phone displays...

[irony off]

I've been trying to fight this kind of blind belief in ridiculous claims for many years, but basically it's not worth it...
#9
bug
MyTechAddiction: I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?
The human eye can actually distinguish fewer than a million colours, but the point is, which colours?
On one hand, we have the dynamic range, which is the difference between the darkest and brightest colour. The human eye has a certain dynamic range, but by adjusting the pupil, it can shift this range up and down.
On the other hand, we have the computer, which handles colour in a discrete world. Discrete computing inevitably alters the data, thus we need to have the computer work at a level of detail the human doesn't see, otherwise we end up with wrong colours and/or banding.

In short, for various reasons, computers need more info to work with than the naked eye can see.
#10
heky
Like zlatan has mentioned already, this is an HDMI 2.0 limitation, not an AMD/Nvidia one.
#11
Prima.Vera
MyTechAddiction: I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?
Actually, the human eye can distinguish between 2 and 10 million colors, depending on age, eyesight quality, etc... However, HDR is supposedly good at removing color banding in games/movies...
#12
natr0n
HDR reminds me of NFS: MW 2005; it had an HDR setting way back when.
#13
nemesis.ie
The key here is that we are now getting displays that can output HDR images, which means better contrast, more visible detail in dark areas, and small, very bright points at the same time as dark areas, etc. I.e., the entire pipeline needs to support HDR to get the "full effect".
#14
bug
natr0n: HDR reminds me of NFS: MW 2005; it had an HDR setting way back when.
Eh, it's not the same thing. That was tone mapping (see: en.wikipedia.org/wiki/Tone_mapping). This time we're talking real HDR, which is nice and has been around since forever, but it needs to shift huge amounts of data which video cards are only now starting to be able to handle. Even so, be prepared for the same content availability issues that have plagued 3D and 4K.
#15
FordGT90Concept
"I go fast!1!11!1!"
3840 x 2160 x 60 Hz x 30 bit = 14,929,920,000 or 14.9 Gb/s
HDMI 2.0 maximum bandwidth: 14.4 Gb/s

I said it before and I'll say it again: HDMI sucks. They're working on the HDMI 2.1 spec, likely to increase the bandwidth. Since HDMI is running off the DVI backbone that was created over a decade ago, the only way to achieve more bandwidth is shorter cables. HDMI is digging its own grave and has been for a long time.
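For what it's worth, the figures above do check out under the same simplifying assumptions (active pixels only, no blanking; the 14.4 Gb/s is HDMI 2.0's 18 Gb/s raw rate after 8b/10b encoding). A quick verification of my own:

```python
# Quick check of the post's arithmetic (my own; ignores blanking, as the post does).
required  = 3840 * 2160 * 60 * 30          # 30 bits/pixel = 10 bpc RGB / YCbCr 4:4:4
available = 18_000_000_000 * 8 // 10       # HDMI 2.0: 18 Gb/s raw, 8b/10b -> 14.4 Gb/s effective
print(required, available, required > available)   # 14929920000 14400000000 True
```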
#16
Xuper
bug: This is weird, because HDMI 2.0 offers the same bandwidth as DP 1.2. Probably a bug in the drivers?
Nope. For 10-bit, 4K@60Hz and 4:4:4, you need at least DP 1.3, far above HDMI 2.0. It supports 30/36/48-bit RGB colour, i.e. 10/12/16 bits per channel, at 4K.

Edit: Oh, I found this:

www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx

Look at the chart:
4K@60, 10-bit, 4:2:2: Pass
4K@60, 10-bit, 4:4:4: Fail
#17
Solidstate89
MyTechAddiction: I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?
You don't "remember" anything, because what you just said is bullshit. It's like saying the human eye can't see more than 30 FPS.

Also, this is a limitation of the HDMI spec. The entire purpose of HDMI 2.0a's existence is to add the necessary metadata stream for UHD HDR to work.
#18
bug
behrouz: Nope. For 10-bit, 4K@60Hz and 4:4:4, you need at least DP 1.3, far above HDMI 2.0. It supports 30/36/48-bit RGB colour, i.e. 10/12/16 bits per channel, at 4K.

Edit: Oh, I found this:

www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx

Look at the chart:
4K@60, 10-bit, 4:2:2: Pass
4K@60, 10-bit, 4:4:4: Fail
I was commenting on this:
German tech publication Heise.de discovered that AMD Radeon GPUs render HDR games ... at a reduced color depth of 8 bits per color channel (16.7 million colors, or 32-bit with alpha) if your display (e.g., a 4K HDR-ready TV) is connected over HDMI 2.0 rather than DisplayPort 1.2 (or later).
The assertion is that it only reduces the colour depth on HDMI 2.0 while all is peachy on DP 1.2. Which, as I have said, is weird, because both HDMI 2.0 and DP 1.2 have the same bandwidth. Still, it could be HDMI's overhead that makes the difference once again.
#19
eidairaman1
The Exiled Airman
Not a fan of HDMI anyway, such a weaksauce connector standard.
#20
prtskg
bug: This is weird, because HDMI 2.0 offers the same bandwidth as DP 1.2. Probably a bug in the drivers?
So you are in the drivers. Sorry, couldn't resist it. :p
ShurikN: Something you forgot to mention
In a test at Heise, they checked out Shadow Warrior 2 in HDR on a Radeon RX 480, which showed similar visual results to a GeForce GTX 1080. So it seems this is the case for Nvidia as well, and likely Nvidia is using a similar trick at 8-bit. Nvidia has not yet shared info on this, though. According to Heise, they did see a decrease in performance with Nvidia, whereas the RX 480's performance remained the same.
That's quite the thing he missed.
#21
bug
prtskg: So you are in the drivers. Sorry, couldn't resist it. :p
I meant another bug :blushing:
#22
qubit
Overclocked quantum bit
zlatan: Jesus TPU, this is an HDMI limitation, not an AMD one! The only question is what is better ... doing 8-bit with full 4:4:4 chroma, or doing 10-bit with 4:2:2 subsampling. In theory the latter is better, but this hugely depends on the game's tonemapping pipeline. Today it is hard to design a pipeline for both traditional display signals and new HDR output signals. The industry doesn't have enough experience to do this. So the tonemapping pipeline in a game is designed for 8 bits per channel. So even if the hardware can do 10-bit with 4:2:2 at 4K@60 fps, it is not useful to support this option until the game's tonemapping pipeline is mature enough.
This would mean that NVIDIA would have the same limitation, then. I personally have no idea, as I'm not familiar with the intimate details of the HDMI spec, and am taking your word for it.

@btarunr Do you want to check zlatan's point and update the article if he's right?
#23
Prima.Vera
This time it's not AMD's fault, but the HDMI standard's. nVidia has THE SAME issue with HDMI 2.0, btw.
Also not sure why those 4K HDR TVs are not provided with at least DP 1.2 interfaces as well...
#25
ShurikN
Prima.Vera: Also not sure why those 4K HDR TVs are not provided with at least DP 1.2 interfaces as well...
I believe there are some, but they are not that great in number. Now, I understand a 1080p screen not having DP, but for a 4K one, it's ridiculous.