
AMD Radeon GPUs Limit HDR Color Depth to 8bpc Over HDMI 2.0

Pretty sure they're talking about Microsoft's generic drivers (GPU and display), not manufacturer drivers.

From Nvidia:

"If your application does not run in full-screen exclusive mode, the desktop compositor will strip the extra range and precision necessary for HDR. It is important to understand that this is a temporary restriction as Microsoft announced plans for OS support for HDR."

https://developer.nvidia.com/displaying-hdr-nuts-and-bolts
 
To Steevo, thanks for confirming you have no experience with HDR 10 or any 4K HDR format. Not that your shitbox could even push 4K anything. Asshole.


Thanks sweetie, I love you too.
 
Ah, yeah, I read that Shadow Warrior 2 had to run full screen to work with HDR. That makes sense: running windowed, it has to match the desktop, which right now is 8 bpc. Microsoft aims to make the whole Explorer shell HDR-compatible so that's no longer a limitation (it improves the look of the OS too).
 
I don't know what's taking Microsoft so long. This is holding up all sorts of goodies for HTPC users. It's... awkward for me to have to switch to my Xbox One S instead of my PC for Forza to enjoy the HDR goodness.
 
HDR is a superset of 10 bit support. Shared graphics mode on Windows 10 does not support HDR 10 yet, nor do any desktop apps. HDR is 10 bit, 10 bit is not HDR.
https://mspoweruser.com/microsoft-bringing-native-hdr-display-support-windows-year/

Shadow Warrior will work with HDR on PC, but only in exclusive mode. Again, there's no proof from the original article that AMD is not pushing 10 bits in HDR mode. I think that's what's so frustrating: the headline makes it sound like a fact, but there's no actual proof whether it's a deficiency on AMD's part or the game is "faking" HDR in some way (not using 10-bit pixels in the entire rendering pipeline).

To Steevo, thanks for confirming you have no experience with HDR 10 or any 4K HDR format. Not that your shitbox could even push 4K anything. Asshole.

To Xzibit, I completely missed the tiny source link, whoops. It looks like the source article is quoting yet another article, which makes this third-hand news. Still a whole bunch of non-news. As for my TV, I have a P series now, and though the 1,000-nit mastering is a big part of HDR, it's only part. The big deal is 10-bit color for better reds and greens and little to no dithering. I play UHD Blu-rays and Forza Horizon 3 on my TV and the difference is shocking. I never realized how much tone mapping they did in 8-bit content until I saw dark shadow detail and the sun in the same shot.

What the hell? Is that allowed on this forum? No need to get angry. Also reported.

I'll concede that HDMI does not have 18 Gbps of bandwidth for just video and audio, but at no point does this prevent HDR from going over HDMI 2.0, as he was suggesting. It is AMD's fault if this isn't working, not HDMI 2.0's. I have sent HDR over HDMI 2.0. There is no bandwidth issue, no matter what the number is.

EDIT:

This is what Vizio says their HDMI 2.0 ports support. The parenthetical notes are my analysis based on the article.
600MHz pixel clock rate:
2160p@60fps, 4:4:4, 8-bit (AMD doesn't support)
2160p@60fps, 4:2:2, 12-bit (PS4 Pro w/ AMD GPU supports, Polaris may not support)
2160p@60fps, 4:2:0, 12-bit (PS4 Pro w/ AMD GPU supports, Polaris may not support)
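
For anyone who wants to sanity-check the bandwidth argument, here's a rough back-of-envelope calculation. It assumes a 594 MHz pixel clock for 2160p60 (standard blanking) and the usual 8b/10b TMDS encoding overhead; those specific numbers are my assumptions, not something from the article. It lines up with the list above: 8-bit 4:4:4 and 12-bit 4:2:2/4:2:0 squeeze into HDMI 2.0's 18 Gbps, while 10-bit 4:4:4 does not.

```python
# Back-of-envelope HDMI 2.0 bandwidth check (assumed: 594 MHz pixel clock for
# 2160p60 with standard blanking, 18 Gbps TMDS rate, 8b/10b encoding overhead).
TMDS_RATE_GBPS = 18.0
EFFECTIVE_GBPS = TMDS_RATE_GBPS * 8 / 10      # ~14.4 Gbps of actual video data
PIXEL_CLOCK_HZ = 594e6                        # 2160p60, full blanking

# Average samples per pixel for each chroma format:
# 4:4:4 carries 3, 4:2:2 averages 2, 4:2:0 averages 1.5.
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def required_gbps(chroma: str, bits_per_sample: int) -> float:
    return PIXEL_CLOCK_HZ * SAMPLES_PER_PIXEL[chroma] * bits_per_sample / 1e9

for chroma, bpc in [("4:4:4", 8), ("4:4:4", 10), ("4:2:2", 12), ("4:2:0", 12)]:
    need = required_gbps(chroma, bpc)
    verdict = "fits" if need <= EFFECTIVE_GBPS else "does NOT fit"
    print(f"2160p60 {chroma} {bpc}-bit: ~{need:.2f} Gbps needed, "
          f"~{EFFECTIVE_GBPS:.1f} Gbps available -> {verdict}")
```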

When the HDMI 2.0 FAQ says it's not possible to carry 4K@60, 10-bit, 4:4:4 over HDMI 2.0, you can't say it's AMD's fault! Also, where did you get that AMD can't do that? Proof?
 
What the hell? Is that allowed on this forum? No need to get angry. Also reported.



When the HDMI 2.0 FAQ says it's not possible to carry 4K@60, 10-bit, 4:4:4 over HDMI 2.0, you can't say it's AMD's fault! Also, where did you get that AMD can't do that? Proof?
Whoa whoa whoa there, we were merely having a spirited discussion.
 
Nvidia cuts corners on quality for performance; they always have.

Looking at TSAA vs. other AA effects, TSAA looks like it removes some lighting passes and some detail while performing AA.
I never understood why game developers are still pushing shitty, crappy technologies like FXAA or TSAA when there are way better solutions out there. COD: IW and DOOM both have beautiful AA settings with very low performance impact.
 
I guess the "nVidia GF GPUs Limit HDR Color Depth to 8bpc Over HDMI 2.0" article will be saved for another slow news day, eh?..
 
I think it would be good if Bta strikes through the article and issues an apology at this point.

Otherwise it will be hard to take any future articles seriously. (IMO, of course)
 
I remember that the human eye can't distinguish more than 16 million colors anyway. What's the point of more, except for marketing purposes?
16.8 million colors gives a maximum of 256 gradations between any two colors, which is clearly not enough. Human vision can distinguish more than 16.8 million colors across the field of vision.
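
The arithmetic behind that, for anyone counting (just the raw per-channel math, nothing fancier):

```python
# Steps per channel and total colors for 8-bit vs. 10-bit per channel.
for bpc in (8, 10):
    steps = 2 ** bpc                  # gradations available on each channel
    total = steps ** 3                # R x G x B combinations
    print(f"{bpc} bpc: {steps} steps per channel, {total:,} total colors")
# 8 bpc:  256 steps per channel,  16,777,216 total colors
# 10 bpc: 1024 steps per channel, 1,073,741,824 total colors
```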

HDR reminds me of NFS: Most Wanted (2005); it had an HDR setting way back when.
Games have supported HDR internally for more than a decade; what's been missing is a way to display it. For this reason games have applied tone mapping (and effects like bloom), which looks awful.
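
In case it helps, a minimal sketch of what tone mapping means here: the renderer works in linear HDR values that can go well above 1.0, and something has to compress that range before it reaches an 8-bit SDR display. The Reinhard operator and the 2.2 gamma below are just common examples picked for illustration, not what any particular game actually uses.

```python
# Minimal tone-mapping sketch: compress linear HDR radiance (can exceed 1.0)
# into the [0, 1] range an SDR display accepts, then quantize to 8 bits.
def reinhard(hdr: float) -> float:
    return hdr / (1.0 + hdr)          # simple global tone-mapping operator

def to_8bit(linear: float, gamma: float = 2.2) -> int:
    return round((linear ** (1.0 / gamma)) * 255)

# From deep shadow detail up to something sun-bright in the same frame:
for radiance in (0.05, 0.5, 1.0, 4.0, 16.0):
    print(f"scene radiance {radiance:>5} -> 8-bit value {to_8bit(reinhard(radiance))}")
```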

Looking at TSAA vs. other AA effects, TSAA looks like it removes some lighting passes and some detail while performing AA.
Any expert in visualization knows that AA techniques utilizing post-processing effects will degrade the overall visual quality; that includes versions of temporal anti-aliasing. Such techniques effectively blur the picture, which defeats the purpose of higher resolutions in the first place. Stick with proper AA techniques like MSAA, CSAA, or, best of all, SSAA.
 
Any expert in visualization knows that AA techniques utilizing post-processing effects will degrade the overall visual quality; that includes versions of temporal anti-aliasing. Such techniques effectively blur the picture, which defeats the purpose of higher resolutions in the first place. Stick with proper AA techniques like MSAA, CSAA, or, best of all, SSAA.

And the explanation for that is really simple: in post-processing, the processor doesn't actually know what constitutes an edge to be anti-aliased; it guesses.
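
A toy sketch of what that guessing amounts to (deliberately simplified, and not the real FXAA or TSAA algorithm): all the pass has is the finished pixels, so it flags an "edge" wherever neighbouring luma differs a lot and blends across it, geometry or not.

```python
# Toy post-process AA: only the final pixel colors are available, so edges are
# guessed from luma contrast and blurred. Not real FXAA/TSAA, just the idea.
def luma(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def post_process_aa(image, threshold=0.1):
    """image: 2D list of (r, g, b) tuples in [0, 1]."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbours = [image[y][x - 1], image[y][x + 1],
                          image[y - 1][x], image[y + 1][x]]
            contrast = max(abs(luma(image[y][x]) - luma(n)) for n in neighbours)
            if contrast > threshold:   # "edge" guessed purely from pixel values
                out[y][x] = tuple(
                    0.5 * image[y][x][c] +
                    0.5 * sum(n[c] for n in neighbours) / len(neighbours)
                    for c in range(3))
    return out
```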
 
Do you even bother to read? I posted that already, five posts ago. How about you do some research lol
i happen to correct my information today so i post it for it maybe a good for someone else and it way yesterday that i read it from techpowerup main page
 
And the explanation for that is really simple: in post-processing, the processor doesn't actually know what constitutes an edge to be anti-aliased; it guesses.
Yes, all the geometry information is lost during rasterization, which is the process of transferring the scene from a 3D world into a sampled 2D picture. After the fragment/pixel shaders you have even less information, basically only the "finished" picture. The realm of possibilities of what you can do then is similar to what you can do in a photo editor, so not much really. You can never regenerate lost information.

MSAA, CSAA and SSAA all work by increasing the sampling during rasterization, which increases the data available for each pixel. That is simply the only way to create a quality rendering.
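
For comparison with the post-process sketch above, here's roughly what "more sampling during rasterization" means in the simplest case, SSAA: shade several sub-pixel positions per output pixel and average them. (MSAA is cheaper because it only takes extra coverage/depth samples, but the extra-data-per-pixel principle is the same.) The shade() callback below is a made-up stand-in for a real renderer.

```python
# Conceptual SSAA: shade a grid of sub-pixel samples per output pixel, then
# average. The extra information exists because it is gathered before the
# image is reduced to one value per pixel.
def ssaa(shade, width, height, factor=2):
    """shade(x, y) -> (r, g, b) sample at sub-pixel coordinates (a stand-in
    for the real rasterizer/shader)."""
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            samples = [shade(px + (i + 0.5) / factor, py + (j + 0.5) / factor)
                       for i in range(factor) for j in range(factor)]
            row.append(tuple(sum(s[c] for s in samples) / len(samples)
                             for c in range(3)))
        image.append(row)
    return image

# Toy "scene": a hard diagonal edge, which SSAA turns into smooth coverage.
edge = lambda x, y: (1.0, 1.0, 1.0) if x > y else (0.0, 0.0, 0.0)
print(ssaa(edge, 4, 4, factor=4)[1][1])   # a partially covered edge pixel
```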
 
This article has become a joke, and others are laughing at it for being a misleading article based on a clickbait article which TPU didn't bother to confirm before posting.
Is TPU still holding a grudge over the "reviews need to be fair" thing and all the other AMD-related whatever?
 
This article has become a joke, and others are laughing at it for being a misleading article based on a clickbait article which TPU didn't bother to confirm before posting.
Is TPU still holding a grudge over the "reviews need to be fair" thing and all the other AMD-related whatever?

Ever since nVidia invited w1zard to that 'CONference' back at the GTX 680 release, there has been a large shift in TPU's 'attitude'.

Gotta keep those favours coming in I guess? ;)

There is clear bias, IMHO. w1z prolly gonna ip ban me now hahaha
 
i happen to correct my information today so i post it for it maybe a good for someone else and it way yesterday that i read it from techpowerup main page
I'm sorry if English isn't your first language, but this sentence makes no sense lol
 
Another reason to prefer modern DisplayPort above all else. Want the best-quality pixels? Use DisplayPort.
Also, there were some issues with Nvidia doing weird HDMI stuff in the early days of HDMI 2.0, when they transmitted 4:2:0 instead of 4:4:4 to get 4K60.
 
I'm sorry if English isn't your first language, but this sentence makes no sense lol
Haha, English is not my first language.
I just basically said that I read the TweakTown post today and it basically corrected the misleading information in this post, which I read yesterday. (I basically wrote it the way I would spell it in my main language, forgetting that in my main language we join a lot of words together to shrink the size of phrases, and I was in a bit of a rush xD)
And sorry if I repeated the TweakTown link; after I read it today I posted it here and didn't read the older post.
 
Haha, English is not my first language.
I just basically said that I read the TweakTown post today and it basically corrected the misleading information in this post, which I read yesterday. (I basically wrote it the way I would spell it in my main language, forgetting that in my main language we join a lot of words together to shrink the size of phrases, and I was in a bit of a rush xD)
And sorry if I repeated the TweakTown link; after I read it today I posted it here and didn't read the older post.
all good my friend, was having an annoying morning was all. thanks for clearing up :D
 
German tech publication Heise.de discovered that AMD Radeon GPUs render HDR games (games that take advantage of new-generation hardware HDR, such as "Shadow Warrior 2") at a reduced color depth of 8 bits per channel (16.7 million colors, i.e. 32-bit color) if your display (e.g. a 4K HDR-ready TV) is connected over HDMI 2.0 and not DisplayPort 1.2 (and above). The desired 10 bits per channel (1.07 billion colors) palette is available only when your HDR display runs over DisplayPort. This could be a problem, since most HDR-ready displays these days are TVs. Heise.de observes that AMD GPUs reduce output from the desired full YCbCr 4:4:4 sampling to 4:2:2 or 4:2:0 (chroma subsampling) when the display is connected over HDMI 2.0. The publication also suspects that the limitation is prevalent on all AMD "Polaris" GPUs, including the ones that drive game consoles such as the PS4 Pro.
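
For anyone wondering what the 4:2:2 / 4:2:0 downgrade mentioned there actually does, here's a rough sketch (standard textbook behaviour, simplified; not AMD's actual implementation): luma stays per pixel while chroma is shared across neighbouring pixels, which is exactly how the signal gets small enough to fit the link.

```python
# Rough sketch of 4:2:0 chroma subsampling: the luma (Y) plane keeps one value
# per pixel, while each chroma plane (Cb, Cr) is averaged over 2x2 blocks, so
# color resolution drops to a quarter while brightness detail is preserved.
def subsample_420(chroma_plane):
    """chroma_plane: 2D list (even dimensions) of Cb or Cr values."""
    h, w = len(chroma_plane), len(chroma_plane[0])
    return [[(chroma_plane[y][x] + chroma_plane[y][x + 1] +
              chroma_plane[y + 1][x] + chroma_plane[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# Average samples per pixel: 4:4:4 = 3, 4:2:2 = 2, 4:2:0 = 1.5 -- which is why
# dropping chroma resolution is what lets 10/12-bit HDR fit over HDMI 2.0.
```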

I really don't know why this is news (AMD told them before launching the RX 480). Here it is, directly from TPU's own article:

[Image: supported display modes table from TPU's RX 480 article]


Look at 3840x2160 @ 60 Hz (4:2:2).

You discovered it?! Here's the footnote:
Under embargo until June 29, 2016 at 9 am EST.
Holy Mother of GOD! They found out on Nov 17, 2016 at 11:07 pm. :eek:
 