
AMD Radeon GPUs Limit HDR Color Depth to 8bpc Over HDMI 2.0

Using 4:2:2 at 4K@60 doesn't mean it's 8-bit only; it can be 12-bit. Not sure why there's no 10-bit option (quick math below):

[Image: HDMI 2.0 FAQ table of supported 4K@60 formats]

http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx
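Quick back-of-the-envelope math on why the table looks like that. This is my own rough sketch, assuming the commonly cited 594 MHz CTA pixel clock for 4K60 and HDMI 2.0's ~14.4 Gbps of usable video data after 8b/10b coding (those figures are not from this thread):

Code:
# Rough bandwidth check for 4K60 over HDMI 2.0 (assumed figures, see above).
# HDMI 2.0: 600 MHz TMDS character rate x 3 channels x 8 data bits -> ~14.4 Gbps of video data.
# CTA 4K60 timing: 4400 x 2250 total pixels (incl. blanking) at 60 Hz -> 594 MHz pixel clock.

HDMI20_EFFECTIVE_GBPS = 0.600 * 3 * 8            # ~14.4 Gbps usable for pixel data
PIXEL_CLOCK_MHZ = 4400 * 2250 * 60 / 1e6          # 594 MHz including blanking

def fits(bits_per_pixel):
    gbps = PIXEL_CLOCK_MHZ * bits_per_pixel / 1000
    return gbps, gbps <= HDMI20_EFFECTIVE_GBPS

# 4:4:4 carries three full components per pixel; as I understand the spec, HDMI's
# 4:2:2 packing uses a fixed 24-bit-per-pixel container (luma plus one shared chroma
# sample), so it can carry up to 12-bit components in the same bandwidth as 8-bit 4:4:4.
for name, bpp in [("4:4:4  8-bit", 24), ("4:4:4 10-bit", 30), ("4:2:2 12-bit", 24)]:
    gbps, ok = fits(bpp)
    print(f"{name}: {gbps:5.2f} Gbps -> {'fits' if ok else 'does not fit'} in HDMI 2.0")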

Zlatan answered that!

Jesus, TPU, this is an HDMI limitation, not an AMD one! The only question is which is better: doing 8-bit with full 4:4:4 chroma, or doing 10-bit with 4:2:2 chroma sub-sampling. In theory the latter is better, but it hugely depends on the game's tonemapping pipeline. Today it is hard to design a pipeline that targets both traditional display signals and the new HDR output signals; the industry doesn't have enough experience to do this yet. So the tonemapping pipeline in a game is designed for 8 bits per channel. Even if the hardware can do 10-bit 4:2:2 at 4K@60 fps, it is not useful to expose that option until the games' tonemapping pipelines mature enough.
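To make the 8-bit-pipeline point a bit more concrete, here is a tiny sketch of my own (not from Zlatan's post) counting how many distinct output codes survive a simple Reinhard-style tonemap at 8-bit vs 10-bit; fewer codes over the same gradient means more visible banding:

Code:
# Quantizing a tonemapped gradient to 8-bit vs 10-bit (illustrative only).
def reinhard(x):
    return x / (1.0 + x)                 # simple Reinhard tonemap, HDR -> [0, 1)

def quantize(value, bits):
    levels = (1 << bits) - 1
    return round(value * levels) / levels

hdr_samples = [i / 1000 * 10.0 for i in range(1001)]   # 0..10 in linear HDR units
tonemapped = [reinhard(x) for x in hdr_samples]

for bits in (8, 10):
    distinct = len({quantize(v, bits) for v in tonemapped})
    print(f"{bits}-bit output: {distinct} distinct codes over this gradient")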
 
What kind of noobs write articles here at TPU? What an epic failure of an article from btarunr... just ridiculous. ROFLMAO ¬¬

AMD reaffirms HDR abilities, HDMI 2.0 is the limitation

Read more: http://www.tweaktown.com/news/55012/amd-reaffirms-hdr-abilities-hdmi-2-limitation/index.html

Just as I was lying down, hoping to fall asleep after a massive 18-hour work day, I read a story over at TechPowerUp, sourced from German tech site Heise.de, claiming that AMD Radeon graphics cards were limited in their HDR abilities... well, clickbait can be bad sometimes, and we now know the truth.

The original story can be read here, which claimed that Radeon graphics cards were reducing the color depth to 8 bits per channel (16.7 million colors), or 32-bit, if the display was connected over HDMI 2.0 rather than DisplayPort 1.2 - something that piqued my interest.

10 bits per channel (1.07 billion colors) is a much more desirable target for HDR TVs, but the original article made it seem like this was a limitation of AMD, and not of HDMI 2.0 and its inherent bandwidth limits. Heise.de said that AMD GPUs reduce output sampling from the "desired full YCbCr 4:4:4 color sampling to 4:2:2 or 4:2:0 (chroma sub-sampling) when the display is connected over HDMI 2.0. The publication also suspects that the limitation is prevalent on all AMD 'Polaris' GPUs, including the ones that drive game consoles such as the PS4 Pro," reports TPU.

I reached out to AMD for clarification and spoke with Antal Tungler, Senior Manager of Global Technology Marketing, who said that this was a limitation of HDMI bandwidth. Tungler said "we have no issues whatsoever doing 4:4:4 10b/c HDR 4K60Hz over Displayport, as it's offering more bandwidth".

Tungler added: "4:4:4 10b/c @4K60Hz is not possible through HDMI, but possible and we do support over Displayport. Now, over HDMI we do 4:2:2 12b 4K60Hz which Dolby Vision TVs accept, and we do 4:4:4 8b 4K60Hz, which TVs also can accept as an input. So we support all modes TVs accept. In fact you can switch between these modes in our Settings".

Now, that's settled. This isn't a limitation of AMD Radeon graphics cards or the APU inside the PS4 Pro; instead, it's a limitation of the bandwidth of the HDMI 2.0 standard. DP 1.2 has no issues putting up HDR at the full 4:4:4 10b/c at 4K60, with AMD supporting it all - so be careful when buying your TV or display if you want the best experience from it, because HDMI 2.0 is a limitation right now.

Clickbait articles aren't good, and they tarnish the reputation of everyone in their path. TPU ran the story without fact-checking it either, and while I'm not calling Heise.de or TPU out personally, it would be nice not to have sensationalist headlines for something that has been explained in detail (the limitations of HDMI 2.0 and the superiority of DisplayPort).

DisplayPort offers more bandwidth, and will be driving 4K120 in 2017, as well as 1080p and 1440p at 240Hz. AMD is on the bleeding edge of that, but don't fall for the non-hype of this story. Our original story on the Radeon Technologies Group event in Sonoma, CA last year covers the same ground, discussing DP 1.3 supporting 5K60, 4K120, and 1080p/1440p at 240Hz.

Read more: http://www.tweaktown.com/news/55012/amd-reaffirms-hdr-abilities-hdmi-2-limitation/index.html
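For what it's worth, a quick sanity check of the DisplayPort side of the statement quoted above. This is my own sketch with assumed figures: DP 1.2 HBR2 across four lanes gives roughly 17.28 Gbps of payload after 8b/10b coding, and DP only has to carry the active pixels rather than the full blanking interval:

Code:
# DisplayPort 1.2 (HBR2, 4 lanes): 4 x 5.4 Gbps raw, 8b/10b coding -> ~17.28 Gbps payload.
DP12_PAYLOAD_GBPS = 4 * 5.4 * 0.8

# Active 4K60 pixels only, at 30 bits per pixel for 10 bpc RGB / 4:4:4.
active_gbps_10bit_444 = 3840 * 2160 * 60 * 30 / 1e9

print(f"Needed:    {active_gbps_10bit_444:.2f} Gbps")
print(f"Available: {DP12_PAYLOAD_GBPS:.2f} Gbps")
print("Fits" if active_gbps_10bit_444 <= DP12_PAYLOAD_GBPS else "Does not fit")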
 
What kind of noob posts links that have been posted repeatedly throughout this thread?
 
What if you connect your GPU to the 4K HDR TV using dual HDMI cables? Wouldn't this double the bandwidth, in theory making everything possible?
 
What if you connect your GPU to the 4K HDR TV using dual HDMI cables? Wouldn't this double the bandwidth, in theory making everything possible?
Yes it would; it's like dual-link DVI. Of course, the standard would have to be ratified and implemented for it to work, and I can't see that happening since a single plug is cheaper and more marketable. Instead, HDMI Licensing will just keep bumping the HDMI standard itself, like they've been doing since it came out in 2002.
 
What if you connect your GPU to the 4K HDR TV using dual HDMI cables? Wouldn't this double the bandwidth, in theory making everything possible?

Do we have TVs that take dual HDMI cables? I was thinking: with SFR (why SFR? read this link), it should be possible to carry 50% of a 10-bit, full RGB 4:4:4 signal through HDMI 2.0, with the other 50% of the data going over either DP 1.2 or a second HDMI 2.0 port (rough sketch of the split below). But there are problems:

1) The game engine would have to support SFR. An SFR implementation requires a considerable amount of skill and experience; not every developer can do it.
2) The TV (with Android) would have to support combining two inputs into one picture (I don't know what they call it); for instance, Android would need to display 50% of the data from HDMI plus the other 50% from DP.

Anything else?
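Purely as an illustration of the split idea above (no shipping display stack actually works like this, and everything in the sketch is made up), carving one frame between two links and stitching it back together would conceptually look like this:

Code:
# Hypothetical split of one frame across two links (illustration only).
# Tiny 8x4 stand-in for a 3840x2160 frame, so it runs instantly.
WIDTH, HEIGHT = 8, 4

def split_frame(frame):
    # Left half would go over link A (e.g. HDMI 2.0), right half over link B (e.g. DP 1.2).
    left = [row[:WIDTH // 2] for row in frame]
    right = [row[WIDTH // 2:] for row in frame]
    return left, right

def recombine(left, right):
    # The display side would have to stitch the two halves back together in sync.
    return [l + r for l, r in zip(left, right)]

frame = [[(x, y) for x in range(WIDTH)] for y in range(HEIGHT)]   # dummy "pixels"
assert recombine(*split_frame(frame)) == frame
print("Halves recombine into the original frame")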
 
I mean, the PS4K has HDR and people seem to think it looks good. Furthermore, they have it on the PS3 as well, which has HDMI 1.3 I think.


So either this is easily fixable with a driver update, or 8-bit HDR is decent.
 
Same with Nvidia... I can't run above 8bpc with my 1080. It's bullshit if you ask me. I bought this thing for the top end, not to find out it can't compete with DP.
I'm so sick of false and misleading advertising by Nvidia and AMD.
 
Same with Nvidia... I can't run above 8bpc with my 1080. It's bullshit if you ask me. I bought this thing for the top end, not to find out it can't compete with DP.
I'm so sick of false and misleading advertising by Nvidia and AMD.
Behrouz's post shows that AMD provided data on HDMI's limitation. I'm hopeful Nvidia has similar slides too.
 
Gaming on a TV screen is awful; input lag is rarely under 30 ms for 4K HDR TVs.

The current state of computer monitors is pretty bad; the market is littered with awful 6-bit panels.
 
Same with Nvidia... I can't run above 8bpc with my 1080. It's bullshit if you ask me. I bought this thing for the top end, not to find out it can't compete with DP.
I'm so sick of false and misleading advertising by Nvidia and AMD.

The fact is HDR is f***ing pathetic on PC. We still don't have monitors that support it, apparently only DP works with it, and few games actually support it.


Meanwhile a simple update adds it to most new TVs for consoles. WTF is going on?!?!
 
The fact is HDR is f***ing pathetic on PC. We still don't have monitors that support it, apparently only DP works with it, and few games actually support it.


Meanwhile a simple update adds it to most new TVs for consoles. WTF is going on?!?!
Idk... more money in the consoles/TV market, I guess.
 
Idk... more money in the consoles/TV market, I guess.

Well yes and no. Rich old men are often the ones to buy new display tech first - so I would say the TV thing is mostly true.

However, the PC gaming market is pretty comparable to the console market in many respects at this point, and companies usually use PC footage as their flagship advertisement - so having HDR for the sizzle reels would be a good idea.


Though the dumbest aspect, imo, is how easy it would be to add HDR to monitors. It would take so little effort to make 4K monitors HDR capable, and god knows elite PC gamers are happy to pay $500+ for a display.
 
The next monitor I get will be DP, same with the TV; HDMI is a poor connection in my book.
 
The next monitor I get will be DP, same with the TV; HDMI is a poor connection in my book.

Good luck finding TVs with DP... I actually wanted to change my TV this holiday season, but changed my mind when I couldn't find any with DP, except for some really expensive Panasonic models.
 
Good luck finding TVs with DP... I actually wanted to change my TV this holiday season, but changed my mind when I couldn't find any with DP, except for some really expensive Panasonic models.

I don't use a TV on the PC.
 
The next monitor I get will be DP, same with the TV; HDMI is a poor connection in my book.

Yeah, I'm pretty done with HDMI. I wish everything would use DisplayPort.

In fact, I wish the next-gen consoles would just come with DP 1.4. Though I know that's a long shot, I could actually see it happening due to 8K support.
 
Yay, trolls that join just to start crap. You will be gone soon enough :roll:

To be fair, they are correct. This thread shouldn't exist because it makes TPU look either Nvidia-biased or just flat-out wrong. Neither of those options is good in my book.
 
It also makes Tarun look bad; it shows that, time after time, Tarun has pulled the trigger sooner than he should have.
 
This thread shouldn't exist because it makes TPU look either Nvidia-biased or just flat-out wrong.
And yet there's no correction/retraction on the front page :(
 