I use my TV with my PC as a second display for movies and some gaming, although mostly I use my monitor.
I've been looking into calibration, and I figured I would set the Pixel Format to YCbCr 4:4:4 because movies and TV are encoded in YCbCr. However, I am assuming games are RGB Full 4:4:4 (0-255).

I've calibrated the brightness on my TV for 16-235 blacks.

There is also the Dynamic Range setting in Catalyst Control Centre, 0-255 or 16-235. Not sure what that does.

My TV is a Panasonic ST50 plasma and doesn't support Full RGB.

What exactly happens when I set it to YCbCr 4:4:4, as far as YCbCr and RGB content go?

There's also an RGB Limited option.
 
I can set the GPU to Full RGB even though the TV doesn't support it, so I assume somewhere it converts that to YCbCr? Would you still calibrate the TV for 16-235 even though you set the GPU to Full RGB?

If I enable ITC Processing, it lets the TV do the conversion rather than the GPU, which I've heard is preferred?
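
For what it's worth, the conversion the GPU would be doing is roughly the standard BT.709 matrix with limited-range output. A rough sketch of the idea (standard maths only, not AMD's actual pipeline):

```python
# Sketch of an RGB -> YCbCr conversion for a limited-range display
# (standard BT.709 coefficients; an illustration, not AMD's code).
def rgb_to_ycbcr_709(r, g, b):
    """Full-range 8-bit RGB (0-255) to 8-bit video-level YCbCr."""
    r, g, b = r / 255, g / 255, b / 255
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    # Quantise to video levels: Y spans 16-235, Cb/Cr centre on 128.
    return (round(16 + 219 * y),
            round(128 + 224 * cb),
            round(128 + 224 * cr))

print(rgb_to_ycbcr_709(255, 255, 255))  # white -> (235, 128, 128)
print(rgb_to_ycbcr_709(0, 0, 0))        # black -> (16, 128, 128)
```

If that's what is happening internally, black still ends up at 16 on the wire, which would explain why the 16-235 calibration still applies even with the GPU set to Full RGB.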
 
I can set the GPU to Full RGB even though the TV doesn't support it...

Me too, and it's the one that says "PC Standard".

Even though my set is not 4:4:4 capable, this setting seems to make brightness, contrast, and sharpness better. I went through and re-tuned the ClearType settings and was able to drop the Sharpness setting on my TV from 40 to 30, and it looks better overall now.

The only bummer is, you have to do this every time you install a new AMD driver, because it reverts the setting to YCbCr.
 
I just had to get a cheapo TV to use as a PC screen because my monitor packed in, and I have found using a DVI to HDMI converter to be the best option.

Just using an HDMI cable, I found the picture to be subpar: text looked bad and brightness/colour were poor.

After changing to a DVI to HDMI converter, everything seemed to be how it would be on a normal monitor. Text is now perfect and clear, and the issues with brightness and colour have gone away.

I also noticed scaling issues at first using HDMI only, but now that I'm using the converter I don't have to mess with the setting, as CCC and the TV think I'm using DVI.

Some other things I have noticed too.

When using HDMI only, CCC would let me choose 8/10/12-bit colour, and none of the options made much difference to the picture quality.

Using the DVI to HDMI converter, CCC auto-set to 8-bit and would not let me change the option, but now the picture is perfect.
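
The 8-bit lock over the adapter actually makes sense if you run the numbers: single-link DVI tops out around a 165 MHz pixel clock and has no deep-colour mode, while HDMI deep colour raises the clock with bit depth. A rough sketch, assuming standard 1080p60 timing:

```python
# Rough TMDS clock needed for 1080p60 at each bit depth. Single-link
# DVI tops out around 165 MHz and has no deep-colour mode, which would
# explain CCC locking to 8-bit through the adapter. (Sketch only;
# 148.5 MHz is the standard 1080p60 pixel clock.)
PIXEL_CLOCK_MHZ = 148.5
for bpc in (8, 10, 12):
    tmds = PIXEL_CLOCK_MHZ * bpc / 8  # HDMI deep colour scales the clock
    fits = "fits single-link DVI" if tmds <= 165 else "needs HDMI 1.3+ deep colour"
    print(f"{bpc:2d} bpc -> {tmds:6.1f} MHz ({fits})")
```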

I found the info here very useful: https://pcmonitors.info/articles/correcting-hdmi-colour-on-nvidia-and-amd-gpus/

I'm using a Samsung UE32H5000 TV.
 
Well, it turns out I spoke too soon on the Pixel Format setting. I had to put it back on YCbCr because in games it was crushing blacks extremely badly, and everything was way too saturated and dark. The inside of liberated trading posts in Far Cry 4 suddenly became so dark I couldn't even see any details on the walls.
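
My guess (and it is only a guess) is that the TV was still treating the signal as video levels while the GPU sent full-range RGB, so everything below 16 clipped to pure black. A quick sketch of that assumption:

```python
# Assumption on my part: the TV kept expecting video levels (16 = black,
# 235 = white) while the GPU was sending full-range RGB (0-255), so
# every shadow value below 16 clips straight to black.
def tv_expecting_video_levels(v):
    return max(0.0, min(1.0, (v - 16) / 219))  # TV's interpretation, 0..1

for signal in (0, 5, 10, 16, 24, 32):
    print(f"GPU sends {signal:3d} -> panel shows {tv_expecting_video_levels(signal):.3f}")
# 0, 5, 10 and 16 all come out as 0.000 -> the wall detail is simply gone
```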
 
Use YCbCr and set the black range to 0-255 for the best image.
HDMI with the scaling setting adjusted to 0 and 12-bit colour will give the best image quality for sure.

Some TV sets may require other settings, but those are generally the really old ones.
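
To put numbers on the scaling point: underscan at 0 means 1:1 pixel mapping, and anything else means the desktop gets resampled to fewer physical pixels. A rough sketch with illustrative values:

```python
# Illustrative underscan arithmetic: CCC's slider shrinks the rendered
# image by a percentage, so a 1920x1080 desktop gets resampled to fewer
# physical pixels unless the slider sits at 0.
def underscanned_resolution(width, height, underscan_pct):
    s = 1 - underscan_pct / 100
    return round(width * s), round(height * s)

print(underscanned_resolution(1920, 1080, 5))  # (1824, 1026) -> soft text
print(underscanned_resolution(1920, 1080, 0))  # (1920, 1080) -> 1:1 mapping
```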
 
On YCbCr I don't have to adjust the black range; there's no excessive darkening like with RGB over HDMI.

animal007uk sent me a convo tip to try a DVI to HDMI adapter though, which I happen to have, and did. Now I can run Full RGB without the darkening, and I get enough colour saturation in games that I don't have to bump up the Color setting on my TV.

The only slight problem I had at first is that CCC was not getting the overscan scaling to fit the screen when I switched resolutions, but now it seems to be OK. I haven't tested it enough to be sure, but anything that helps avoid changing TV settings constantly might help the set last longer.
 
The HDMI-DVI adapters disable half the features of HDMI. The only reason it's helping is that your TV is auto-detecting it as a media connection and turning features on; you're a lot better off going with HDMI and disabling those features on the TV.
 
Then again, you pretty much insisted your TV was 12-bit, and I showed you otherwise.

Not trying to be a smart ass, I just think you assume I'm a bit naive when I'm really not. Animal and I had already discussed the disabling of HDMI features before I even tried it, which I expected, and I responded telling him I have all those extra filters turned off anyway to avoid processing lag.
 
The entire point of another thread I made (which has a very similar discussion) was finding out what the hell that setting did. If I set it to 12-bit and the TV accepts it, what else am I meant to assume? Forcing that setting on my other TVs results in no signal, so this one clearly does accept it as an input.
 
I had assumed you were of the tech level to have known that many TVs can accept a signal they cannot actually produce, such as those advertised as 1080p-ready (some even, very misleadingly, as just 1080p) that actually only downconvert it to 720p.

This, and the fact that all Pixel Format options are selectable in CCC for pretty much any set, indicates many TVs are going to be downconverting the signal rather than using and displaying it as sent.
 