
12 Bit HDMI color output in Cat 14.6?

Mussels

Freshwater Moderator
I remember reading someone on the forums asking whether 10-bit HDMI was even possible on modern graphics cards, and forgot about it until I saw this option in the new beta drivers.

I get 8, 10 and 12-bit color depth options, and the HDTV I use as a screen confirms it's working.


Anyone seen this before, or know more about it?

Capture006.jpg
 
That's on my 46" Sony.
 
I'll have to try it on my brand new Panasonic 39" LED/LCD HDTV and see what happens.

Do you think there'd be a performance penalty for using this, similar to the 16bpp vs 32bpp difference back in the day?

Also, some quick reading brings up concerns about whether HDMI cable bandwidth can actually carry those colors.
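A rough back-of-the-envelope check on that cable worry (a minimal Python sketch, assuming the standard 148.5 MHz pixel clock for 1080p60 and the 340 MHz single-link TMDS ceiling of HDMI 1.3/1.4; the numbers are illustrative, not a cable certification):

```python
# Rough HDMI deep-color bandwidth estimate for 1080p60 (RGB / 4:4:4).
# Assumes the standard 148.5 MHz CEA-861 pixel clock and the 340 MHz
# single-link TMDS clock ceiling of HDMI 1.3/1.4.
PIXEL_CLOCK_MHZ = 148.5
TMDS_CEILING_MHZ = 340.0

for bpc in (8, 10, 12, 16):
    # In deep-color modes the TMDS clock scales by bpc/8.
    tmds_clock = PIXEL_CLOCK_MHZ * bpc / 8
    # Three TMDS data channels, 10 bits on the wire per 8 bits of data.
    wire_gbps = tmds_clock * 3 * 10 / 1000
    verdict = "fits" if tmds_clock <= TMDS_CEILING_MHZ else "exceeds the limit"
    print(f"{bpc:>2} bpc: TMDS clock {tmds_clock:6.1f} MHz, "
          f"{wire_gbps:5.2f} Gbps on the wire -> {verdict}")
```

At 1080p even 12 bpc comes in well under the HDMI 1.4 link budget, so the worry is less about the spec itself and more about cheap cables that can't actually sustain their rated clock.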
 
I think it's part of the HDMI 1.4 spec for greater color depth. It may not make a difference if the panel can't support it, but I guess in theory better colors/gradients are possible?
 
One advantage of the larger color depth is the ability to natively view videos encoded in the High 10 (Hi10p) H.264 profile, which is gaining traction due to its ability to produce better-quality images at the same bit rate as 8-bit H.264 video.

Counter-intuitively, for a given video bitrate, a higher bit depth at a lower resolution is perceptibly better than a lower bit depth at a higher resolution. There are a few good example frames of 10-bit vs 8-bit compressed video here, and a quick internet search will find many more, such as this good example. Note the reduced macroblocking and banding in the 10-bit frames, even when downsampled on an 8-bit display.

There's a lot of anime being released in Hi10p because of this compression advantage.
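To see roughly why the extra bits help with gradients, here's a small numpy sketch (illustrative only; it ignores dithering and everything else a real encoder does) that quantises the same dark ramp to 8 and 10 bits:

```python
import numpy as np

# A smooth dark-grey ramp, the kind of content where banding shows up first.
ramp = np.linspace(0.10, 0.20, 1920)   # normalised 0..1 luminance

for bits in (8, 10):
    levels = 2 ** bits
    quantised = np.round(ramp * (levels - 1)) / (levels - 1)
    steps = len(np.unique(quantised))
    worst_err = np.abs(quantised - ramp).max()
    print(f"{bits}-bit: {steps:4d} distinct steps across the ramp, "
          f"max quantisation error {worst_err:.5f}")
```

Four times as many steps over the same brightness range is essentially the reduced banding visible in those comparison frames.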
 
Regardless of the ability of the cable to carry the colors (from my reading, the HDMI 1.4 certification process is dubious at best, with many certified cables failing independent testing even against 1.3 standards), one thing is for certain:

If your monitor is an LCD, you'll be lucky if you can get 8bpc (32bpp) out of it, let alone anything higher.

Many LCDs still produce colors above 18bpp (6 bits per channel) using internal dithering, unfortunately. At least that was the case last time I checked.

You can try a test if you want to feel bad about your monitor; mine flat out fails this one:

http://battletech.hopto.org/html_tutorials/colourtest/

Answer? Get a CRT, or realize that ignorance is bliss, and forget what I just told you.
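In case that link ever dies, here's a rough Python sketch (assuming numpy and Pillow are installed) that generates a similar gradient test image; note it writes an 8-bit PNG, so it tests panel banding/dithering at 8 bpc rather than anything deeper:

```python
import numpy as np
from PIL import Image

WIDTH, BAR_HEIGHT = 1024, 96

def ramp_bar(r_on, g_on, b_on):
    """One horizontal 0..255 ramp with the chosen channels enabled."""
    ramp = np.linspace(0, 255, WIDTH, dtype=np.uint8)
    bar = np.zeros((BAR_HEIGHT, WIDTH, 3), dtype=np.uint8)
    for channel, enabled in enumerate((r_on, g_on, b_on)):
        if enabled:
            bar[:, :, channel] = ramp
    return bar

# Grey, red, green and blue ramps stacked vertically.
bars = [ramp_bar(1, 1, 1), ramp_bar(1, 0, 0), ramp_bar(0, 1, 0), ramp_bar(0, 0, 1)]
Image.fromarray(np.vstack(bars)).save("gradient_test.png")
print("Wrote gradient_test.png - view it full screen at 100% zoom.")
```

If the dark end of the ramps shows distinct bands or crawling noise at 100% zoom, the panel is probably truncating or dithering.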
 
My iPhone actually did well on that test. O.o
I do have the screen adjusted towards red, though, so it shows a darker, warmer color.
 
My phone and TV show the same result on that test (they're both IPS, which makes sense): the grey is solid, but the color gradients are smooth.


Edit: with the brightness up I can actually see the three sections in the grey. It could just be that I like my screens dim.
 
It's even worse with mine; it defaults to 8-bit, although maybe that's the AV or the AMD drivers not reading it right.

My TV is 10bit.

I rolled back to 14.2 a little while ago as 14.6 sucks for Arma 3.
 
It's new to the 14.6 beta:

  • New user controls for video color and display settings
  • Greater control over Video Color Management:
    • Controls have been expanded from a single slider for controlling Boost and Hue to per color axis
  • Color depth control for Digital Flat Panels (available on supported HDMI and DP displays)
    • Allows users to select different color depths per resolution and display
 
Yeah, I know, someone pretty much pointed that out already. Maybe it'll work better when it's out of beta, haha.
 
My TV is 10bit.

I wasn't aware of any panels that were actually 10-bit and not just dithering... granted, I am an audio/video nut who has been out of touch for 3+ years, so I could be really, really off on this, but just to satisfy my own curiosity, I'm going to be Googling this tonight.
 
They are probably unlocking the >8-bit pipeline in GCN cards now that more (and more affordable) monitors are using 10-bit panels, and wider-than-sRGB gamuts are a selling point.

You can now save some money and not have to buy a FirePro or Quadro to get 10-bit.
 
I wasn't aware of any panels that were actually 10-bit and not just dithering... granted, I am an audio/video nut who has been out of touch for 3+ years, so I could be really, really off on this, but just to satisfy my own curiosity, I'm going to be Googling this tonight.

My TV actually shows 12-bit (as shown in the photo), so clearly 12-bit inputs are a thing, even if the panels aren't.


The online specs for my TV don't mention it whatsoever, and Google is finding very little on the subject except a few conversations that seem to be "someone who knows a little bit talking to people who don't have a clue".
 
Pretty much what I got when I googled it as well... It leaves me thinking, though, that it probably isn't available at the panel yet. The manufacturers tend to advertise the shit out of questionable features at any given chance (the yellow color channel added to Sharp Aquos, for example, never mind conventional RGB not supporting it), so how would they not advertise 10+ bit color?
 
Since mine defaulted to 10-bit, I'd have to assume a 12-bit input with a 10-bit IPS panel.

Apparently the HDMI 2.0 spec does 8, 12 or 16-bit, and that's about all the info I can find.
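For what it's worth, the same back-of-the-envelope check as the 1080p one above, this time against the HDMI 2.0 budget (a sketch assuming the standard 594 MHz pixel clock for 3840x2160@60 and the 600 MHz / 18 Gbps ceiling; RGB/4:4:4 only, ignoring the chroma-subsampled modes the spec also allows):

```python
# Rough 4K60 deep-colour check against the HDMI 2.0 link budget (RGB / 4:4:4).
PIXEL_CLOCK_MHZ = 594.0   # 3840x2160@60, standard CTA-861 timing
LIMIT_MHZ = 600.0         # HDMI 2.0 maximum TMDS character rate

for bpc in (8, 10, 12, 16):
    required_mhz = PIXEL_CLOCK_MHZ * bpc / 8   # clock scales with bit depth
    verdict = "fits" if required_mhz <= LIMIT_MHZ else "needs 4:2:2/4:2:0 or a lower refresh"
    print(f"{bpc:>2} bpc 4:4:4: {required_mhz:6.1f} MHz -> {verdict}")
```

So 8-bit 4K60 just squeezes into HDMI 2.0, while the deeper modes only fit with chroma subsampling or lower refresh rates.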
 
That, or it's where the dithering shows the least banding. Anything is possible, I suppose; there isn't much info either way.
 
Interesting. At some point I'll have to connect my tower to my TV to see if deep color works. Theoretically my TV supports it as well, but unlike yours, it doesn't tell you what the current color depth is, and until now there wasn't really any way of choosing it.
 
Mine tells me when I press the info button; it's always showed 10-bit from my 7970, until now with these drivers (the first image in the thread is a photo, not a screenshot).
 
What does your TV say when you uncheck "Enable ITC processing"?
 
They are probably unlocking the >8-bit pipeline in GCN cards now that more (and more affordable) monitors are using 10-bit panels, and wider-than-sRGB gamuts are a selling point.

You can now save some money and not have to buy a FirePro or Quadro to get 10-bit.

I am just going by the manual, but this is the only place it's referenced, lol.
tvman.gif
 
I get 8, 10 and 12-bit color depth options, and the HDTV I use as a screen confirms it's working.

Are you sure about that, Mussels? I have a mere $500 32" Panasonic I bought a few years ago for only $330, and I'm pretty sure it's 8-bit, but I get 8, 10, and 12-bit options showing in CCC too. I think what may be happening is that the TV is showing the input signal depth, but might be down-converting its output to 8 or 10-bit.

I'm trying to avoid unnecessary processing from my TV, as well as needless wear and tear. It would be nice if the color depth of the panel were part of the TV specs. I may call Panasonic about this, but I doubt any of their phone or email staff will know, and it's unlikely I'll get an email response from the engineers, who are likely in Japan.

But if anyone happens to find better sources than I did for determining my TV panel's color depth, it's a TC-L32U3 model made in 2011. BTW Mussels, I made a thread about my dilemma and was hoping you'd chime in. For now I've got it set to 8-bit.
http://www.techpowerup.com/forums/threads/ccc-display-color-depth-setting-bpc.208920/
 
Too bad I don't have my necro card with me.
 