Plasma Vs LCD
I've never owned a plasma, so my information here is second-hand. The short version is that while plasmas can look better (they don't have to be at their native resolution to look their best), they still often fudge what resolution they actually run at, and can even be non-widescreen (I've personally seen dozens of 1024x768 plasmas advertised as '1080p capable').
Most of this article is about LCD HDTVs (the most common kind), so if you have a plasma, some of it may not be relevant.
Size of the screen:
Size means nothing. Just because a screen is bigger doesn't mean it takes more CPU or GPU power to drive.
Even Intel's integrated GPUs can manage a 56" plasma TV without breaking a sweat these days, even for 1080p movie playback. (This really belongs in a separate thread, but even my 1.6GHz Intel Atom netbook can play 720p HD content fine; it's not very demanding at all and runs almost exclusively on the CPU.)
HDTVs work fine for PC gaming, but you need to be aware of motion sickness, which is why console games often use a different FoV (field of view) to PC games.
Resolution (720p, 768p, 1080p etc):
Resolution does matter, but TV manufacturers lie. Just because a TV has a 1080p input doesn't mean it actually has a 1080p panel - it could be a 720p or 768p panel that simply takes the 1080p signal and discards/shrinks part of the image to make it fit. This is quite common on those 'too good to be true' bargain-priced 1080p HDTVs you see heavily discounted, and it's pretty obvious when you actually try a PC on the TV and find the image is blurry and grainy (especially on text) no matter what you do.
If your TV just looks blurry no matter what you do, this is probably why: you'll never be able to feed it its true native resolution, so it will always look bad. Sometimes using a different connection (VGA instead of HDMI) can yield better results.
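To see why that blur is unavoidable, here's a rough sketch of the maths (the 1366x768 panel is just an assumed example): any scale factor other than a clean 1:1 means every pixel, especially text, gets resampled.
```python
# Rough sketch: scale factor when a given signal is squeezed onto a fixed panel.
# The panel resolution is an assumed example (1366x768, common on budget sets).
panel_w, panel_h = 1366, 768

for sig_w, sig_h in [(1280, 720), (1366, 768), (1920, 1080)]:
    scale_w = panel_w / sig_w
    scale_h = panel_h / sig_h
    sharp = (scale_w == 1.0 and scale_h == 1.0)
    print(f"{sig_w}x{sig_h} signal -> scale {scale_w:.3f} x {scale_h:.3f} "
          f"({'1:1, sharp' if sharp else 'resampled, soft text'})")
```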
Input (HDMI, DVI, VGA etc)
Use HDMI when possible, and only try other inputs for troubleshooting. HDMI is the only one that carries audio and video in the one cable, so it's technically superior (and the most common).
It's worth noting that some TVs treat their HDMI ports differently - my Samsung has three, yet only one (HDMI 2) supports its native 1366x768 resolution. The rest only do 720p and 1080p, and even forcing 1366x768 looks blurry because the TV scales it down to 720p automatically.
Because of these oddities, I've found my 40" 1366x768 Samsung actually looks better than most modern cheap 1080p TVs - despite their pixel density advantage, they still look blurrier and very 'soft' for text in Windows.
As mentioned earlier, trying other connections to troubleshoot can be worth it. A TV may stick to the HDMI spec's resolutions despite its non-spec panel, and VGA might be the only way to get a decent, unmodified image at the panel's native resolution. Make sure to test all resolutions, not just 720p/1080p.
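If you happen to be testing from a Linux box, a quick way to see which resolutions the TV actually advertises on each input is to read xrandr's output. This is only a sketch: it assumes an X11 session with the xrandr tool installed, and output names like HDMI-1 or VGA-1 vary by driver.
```python
# Sketch: list the modes each connected output advertises (Linux/X11 only).
# Assumes the `xrandr` command-line tool is installed and on the PATH.
import subprocess

out = subprocess.run(["xrandr"], capture_output=True, text=True).stdout

current = None
for line in out.splitlines():
    if " disconnected" in line:
        current = None
    elif " connected" in line:
        current = line.split()[0]          # output name, e.g. "HDMI-1" or "VGA-1"
        print(f"\n{current}:")
    elif current and line.startswith("   "):
        print("  ", line.split()[0])       # mode name, e.g. "1366x768"
```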
Overscan.
TVs, monitors, and video cards just *love* enabling overscan by default. This crops off the edges of your image and is a bad idea for a PC user. Disable it first of all, before messing around with other settings - it can also be a cause of a blurry image.
On my Samsung 23.6" monitor this was called "AV mode", and it caused me all sorts of headaches while setting up the monitor over HDMI.
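As a rough idea of how much you lose, here's a quick calculation assuming a typical ~5% overscan on a 1080p signal (the exact percentage varies from set to set):
```python
# Rough sketch: pixels cropped by a typical ~5% overscan on a 1080p signal.
w, h = 1920, 1080
overscan = 0.05            # assumed 5% per axis; varies between TVs

visible_w = round(w * (1 - overscan))
visible_h = round(h * (1 - overscan))
lost = w * h - visible_w * visible_h
print(f"Visible area: {visible_w}x{visible_h} of {w}x{h}")
print(f"Pixels lost:  {lost:,} ({100 * lost / (w * h):.1f}% of the image)")
```
That's roughly a tenth of the image gone, which is exactly where Windows puts the taskbar and window edges.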
Image "enhancements"
Manufacturers (I'm glaring at you, Sony) just love adding in features to make TV and movies look better (arguably), but for an uncompressed digital signal such as a PC or game console, they often make things look much, much worse.
So look for any settings like denoise, mosquito noise reduction, deblocking or digital noise reduction (DNR) and turn them the hell off. A good test I found is to look at a still image of a solid colour (I like black) and check for artifacting - DNR on a Sony 42" I tested gave the black a repeating pattern that looked like a Tetris game was going on, and it was clearly visible in dark scenes in TV shows and movies as well.
These enhancements can also increase response times, causing delay and lag for gaming. Some HDTVs (like my Samsung) have a "Game Mode" which forces all enhancements off to minimise lag and make the set better suited to gaming on PC/console.
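To put the lag in perspective, here's a small sketch converting processing delay into whole frames at 60fps - the millisecond figures are made-up examples, not measurements of any particular TV:
```python
# Sketch: what display processing delay costs in frames at 60fps.
# The millisecond figures below are illustrative guesses, not measured values.
frame_time_ms = 1000 / 60          # ~16.7ms per frame at 60Hz

for label, delay_ms in [("Game Mode (processing off)", 20),
                        ("Enhancements left on", 100)]:
    print(f"{label}: {delay_ms}ms = {delay_ms / frame_time_ms:.1f} frames behind")
```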
Refresh rates - 50/60/120/200/400Hz etc
Another misleading term that Sony love to advertise - these sets repeat (or interpolate) frames to make some content look smoother, which is most noticeable on low refresh sources like interlaced TV or movies (DVD, in other words). But when using HDMI or VGA connections you're stuck at 60Hz input, no matter what. So don't get tricked into thinking '400Hz' means 400Hz will show up as an available refresh rate.
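The quick sums below show why the big number is mostly marketing for PC use (assuming a 60Hz input over HDMI/VGA): interpolation only invents in-between frames, it never increases the number of real frames your PC can deliver.
```python
# Sketch: what a "200Hz" or "400Hz" panel does with a 60Hz input signal.
input_hz = 60                      # the most these sets accept over HDMI/VGA

for panel_hz in (120, 200, 400):
    extra = panel_hz / input_hz - 1
    print(f"{panel_hz}Hz panel: still {input_hz} real frames/s from the PC, "
          f"plus ~{extra:.1f} interpolated frames per real frame")
```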
LED HDTVs
Just to clarify a common misconception: LED refers to the lighting behind the panel, replacing the old CCFL method. It can be brighter, and it definitely saves power and runs cooler - but it says nothing about image quality.
A good five-year-old HDTV can look better than a cheap modern LED HDTV. Keep that in mind when purchasing: just because it says LED 1080p on the box doesn't mean it's a quality screen.
Here is an example photo of an HDTV that would be utterly horrible for HTPC use - see what you can find wrong with it: