
LG Display Unveils Next-Generation OLED TV Display, the OLED EX

Joined
Jun 10, 2014
Messages
2,889 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Technically yes, it's burnout, not burn-in, but burn-in is what it's been called for almost a century now. Deal with it.
No, you don't understand the technical details. Only CRTs have burn-in. Burn-in means the pixel is physically burned and is destroyed.
The uneven wear for OLEDs and plasmas can be "reversed" by applying wear to the rest of the screen to even out the problem, or even just by changing your usage of the screen. Uneven wear is much less of a problem for OLEDs than plasmas; on OLEDs it's mostly tied to very bright areas.

And FYI, LCDs also have wear on pixels, typically caused by sharp contrast edges over time, which wear out the TFT panel in those spots, creating lines or shadows. Ironically, this is closer to "burn-in" than anything that can happen on an OLED display. :p
 
Joined
Apr 19, 2018
Messages
957 (0.44/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32GB G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
"algorithm-based EX Technology, which helps boost the innovative display's overall picture quality by enhancing brightness up to 30 percent"

So a software tweak, which could easily be implemented on other LG OLED panels then?

And more seriously, it's really not a good sign that LG aren't talking about colour volume or a wider colour gamut in this awful PR marketing drivel. If they haven't improved this, then Samsung may well win the next-gen OLED race with its QD technology. Not good, LG, not good - you'll be bright, but nowhere near as vibrant as Samsung.
 
Joined
Dec 22, 2011
Messages
3,890 (0.87/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600MHz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
I was tempted to go OLED, but still didn't want to have to worry about burn-in (wear or whatever people want to call it), as I always intended to use the display as a monitor.

So I have zero regrets grabbing a QN90A instead: deep blacks, and it can get incredibly bright; HDR content looks stunning. A great all-rounder.
 

ARF

Joined
Jan 28, 2020
Messages
3,892 (2.56/day)
Location
Ex-usa
Every TV I ever buy is obsolete very quickly. The trick is to ignore the new shiny stuff and just be happy that what you now have is nicer than your old TV :)

My 2015 model Panasonic TX-50CX670E with a VA panel is perfect for what it is.
I like it to this day and do not plan a replacement.
49.5" Panasonic TX-50CX670 - Specifications (displayspecifications.com)


What is disturbing is the market situation: almost everybody has a 4K 3840x2160 TV set, while so few users have a 4K 3840x2160 PC monitor, which is actually more important because the user sits closer to the panel and can see its gigantic individual pixels.

What is even more disturbing is that LG cancelled its mobile phone business, so I don't know what I will buy next time, after 2025 when the guarantee on my LG G8s ThinQ (with its OLED screen) ends... :(
 
Joined
Apr 19, 2018
Messages
957 (0.44/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32GB G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
I was tempted to go OLED, but still didn't want to have to worry about burn in (wear or whatever people want to call it) as I always intended to use the display as a monitor.

So I have zero regrets grabbing a QN90A instead, deep blacks and it can get incredibly bright, HDR content looks stunning. A great all rounder.
I'd never buy one of these over a current-gen OLED. The price for this dinosaur LCD tech is absolutely insane, not to mention the downsides of LCDs, like DSE, uneven backlighting, and severe clouding and haloing. And it doesn't even do Dolby Vision. Total deal breaker.

I've had my LG C9 for over two years, and it's got thousands of hours on it, including gaming. No burn-in, and a beautiful, bright picture. Watching DV content on it is a real experience. The latest models are even better!

Where people go wrong is leaving their Windows desktop on it all day, or for hours at a time unattended. Set the screen to go black after 5-10 minutes of inactivity, and you'll be golden.
 

bug

Joined
May 22, 2015
Messages
13,161 (4.07/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Every time there's a brightness and/or power-efficiency boost (they usually go hand-in-hand with OLED), that also means there's a burn-in reduction boost, though LG never markets this since they like to pretend that burn-in isn't an issue at all. The less power you have to send to a pixel to reach a specific desired brightness level, the less it will deteriorate over time. So if LG's super special deuterium tech allows for 30% more brightness at the same amount of power input, and if you have a specific brightness target in mind, you may need 23% less power to reach it. And the organic pixels will deteriorate less as a result. How much less is something I can't confidently say because this may not be a linear relationship, and other factors such as heat levels affect the stability of the organic compounds as well. Anyway, I would expect some small amount of extra burn-in resistance from this advancement, which compounds with the many other small advancements made over the years. In 2022, OLED panels will likely be considerably more burn-in-resistant than they were 7 years ago.
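To make the arithmetic explicit (a simplification that assumes brightness scales roughly linearly with drive power, which is only approximately true):

```latex
\frac{P_{\text{new}}}{P_{\text{old}}} = \frac{1}{1.30} \approx 0.77
\quad\Rightarrow\quad \text{roughly } 23\%\ \text{less power for the same brightness}
```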

I do think it's worth being cautious about burn-in with OLED panels, even if there are many people who happily use OLED TVs as monitors without any noticeable burn-in. The posters in this thread are right—taking extra steps such as lowering the brightness to 120 or 100 nits, setting a dark background and using dark mode for your apps, and setting your task bar to auto-hide, will all help your display last longer. These aren't options for everyone, though, and OLED displays aren't appropriate for every use case. I work at home, and my work PC is also my leisure PC. It's also in a well-lit room. I'm used to 200 - 250 nits, not 100 - 120. I also have static interface elements on screen for 8 - 12 hours a day, every single day. There's no getting rid of them. And going full dark theme for everything is just not how I want to live my life, lol. I'll choose to hold out on OLED as my main PC display until there's a panel that can last 5+ years for less while being on with static interface elements for 3500+ hours each year. It's a pretty demanding requirement, and I'm guessing we're still quite a few years away from that being possible. In the meantime, I'll happily buy an OLED TV for the living room. :)
That's bad for your eyes. 120 nits is always the target brightness, as it's the same luminance level as a white sheet of paper in daylight. That's what our eyes are naturally used to. It's also why you see professional monitors with hoods: no matter what the ambient luminance is, people still work at 120 nits.
 
Joined
Dec 22, 2011
Messages
3,890 (0.87/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600MHz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
I'd never buy one of these over a current-gen OLED. The price for this dinosaur LCD tech is absolutely insane. And it doesn't even do Dolby Vision. Total deal breaker.

I've had my LG C9 for over two years, and it's got thousands of hours on it, no burn-in, and a beautiful, bright picture. Watching DV content on it is a real experience. The latest models are even better!

Like I said, I have zero regrets. If I only intended to use it as a TV, I would have bought an OLED, but this dinosaur tech boosted with Mini LED backlighting gives OLED a real run for its money.

Besides, I got it for a steal too.
 

bug

Joined
May 22, 2015
Messages
13,161 (4.07/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Like I said, I have zero regrets. If I only intended to use it as a TV, I would have bought an OLED, but this dinosaur tech boosted with Mini LED backlighting gives OLED a real run for its money.

Besides, I got it for a steal too.
That's the problem with miniLED: it costs as much as OLED, if not more :(

Great that you got a good deal on it.
 
Joined
Feb 20, 2019
Messages
7,190 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
No, you don't understand the technical details. Only CRTs have burn-in. Burn-in means the pixel is physically burned and is destroyed.
The uneven wear for OLEDs and plasmas can be "reversed" by applying wear to the rest of the screen to even out the problem, or even just by changing your usage of the screen. Uneven wear is much less of a problem for OLEDs than plasmas; on OLEDs it's mostly tied to very bright areas.

And FYI, LCDs also have wear on pixels, typically caused by sharp contrast edges over time, which wear out the TFT panel in those spots, creating lines or shadows. Ironically, this is closer to "burn-in" than anything that can happen on an OLED display. :p
CRTs don't have pixels; they have triads of R, G, and B phosphor dots that convert non-visible EM radiation from the electron beam into visible light. The phosphors do eventually lose their luminance relative to the electron beam strength, resulting in uneven wear, but ion burn actually blasted deposits off the phosphor onto the glass of the CRT, creating a dark burn mark visible even when the CRT was switched off. I really do fully understand the technical details.

Plasmas are also phosphor-based, so they suffer the exact same mechanism. It just manifests a little differently, but effectively you can still get burn-in that is visible even when the plasma display isn't powered.

QLED is a little different. Quantum dots are not technically phosphors, and not much is documented about their wear, but they effectively do with cadmium or indium and blue light what phosphor did in CRTs and plasmas with non-visible parts of the EM spectrum. They are likely to suffer wear in exactly the same way, but the lower energy of narrow-band blue light, compared to the higher EM energy levels in CRT and plasma displays, will potentially make the burn-in negligible or even undetectable over the useful life of the QLED display.

OLEDs are the only ones that burn out rather than burn in. No material is blasted off (the technical term is ion burn) a phosphor or the cadmium/indium of quantum dots and deposited on the next layer, so it's literally just the LED diodes themselves wearing out. Technically they're not burning; the decline in LED brightness (including OLEDs) is caused by the heat of operation thermally expanding and contracting the material, introducing further dislocations in the crystalline structure of the semiconductor. Over time, more and more dislocations mean that fewer of the electrons jumping across the p-n junction are jumping the right amount, which changes their energy levels and thus shifts the wavelength of the emitted photon to a non-visible wavelength.

So it's not technically "burn", but it is wear caused by heat, and the hotter they get the faster they wear - you can probably understand why so many people call it burnout, as the LEDs wear out even though it's thermally triggered crystal grain breakup.
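A toy model of that gradual decline, purely for illustration: the stretched-exponential form below appears in OLED degradation literature, but the LT50 and beta values here are made-up numbers, not anyone's measured data.

```python
# Toy stretched-exponential luminance decay: L/L0 = exp(-(t/tau)^beta).
# lt50 (hours to 50% brightness) and beta are assumed, illustrative values.
import math

def remaining_luminance(hours: float, lt50: float = 20_000, beta: float = 0.5) -> float:
    """Fraction of original brightness left after `hours` of use."""
    tau = lt50 / (math.log(2) ** (1 / beta))  # chosen so L/L0 = 0.5 at t = lt50
    return math.exp(-((hours / tau) ** beta))

for h in (1_000, 5_000, 20_000):
    print(f"{h:>6} h: {remaining_luminance(h):.1%} of original brightness")
```

Note how, with beta < 1, the early hours cost proportionally more brightness than the later ones, which is why small efficiency gains compound into noticeably slower visible wear.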

What is disturbing is the market situation: almost everybody has a 4K 3840x2160 TV set, while so few users have a 4K 3840x2160 PC monitor, which is actually more important because the user sits closer to the panel and can see its gigantic individual pixels.
Desktop 4K display adoption is not hampered by technology but by inadequacies of OS non-integer scaling.

Unfortunately, even today, so much content is raster-based and targets 96dpi. No matter how you scale it you will always run into downsides if you don't integer-scale.

Even with integer scaling at 200% you gain double the horizontal and vertical pixels, but then lose the 3x horizontal subpixel resolution because you can't use subpixel antialiasing like ClearType. For text it's actually a wash, as 1080p with ClearType has 50% greater horizontal resolution, and text has far more vertical lines in it than horizontal lines, making the horizontal resolution far more valuable than the additional vertical resolution that 4K provides. I know it's only text, but text legibility is the key thing people focus on for picture clarity on desktop monitors at those closer distances.
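The numbers behind that 50% figure, assuming ClearType's idealized best case of tripling horizontal text resolution via the RGB subpixels:

```latex
\underbrace{1920 \times 3}_{\text{1080p + ClearType}} = 5760
\qquad
\underbrace{3840}_{\text{4K, no subpixel AA}}
\qquad
\frac{5760}{3840} = 1.5 \;\Rightarrow\; +50\%
```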

I'm not advocating that 1080p displays are better than 4K displays. I'm just saying that it's very difficult to get good text quality if you move away from 100% scaling at native resolution and have to give up subpixel AA; 200% scaling isn't categorically better than a lower-DPI display at 100%!
 
Joined
Dec 22, 2011
Messages
3,890 (0.87/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600MHz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
That's the problem with miniLED: it costs as much as OLED, if not more :(

Great that you got a good deal on it.

Well, it can push over 1800 nits; not much comes close.
 
Joined
Aug 20, 2007
Messages
20,709 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches
Software Windows 11 Enterprise (legit), Gentoo Linux x64
OLED can suffer from uneven wear, but it has nothing to do with pictures being static. It's caused by areas being significantly brighter over time wearing those pixels more than the others. This will happen regardless if the picture is static or changing.
That's generally referred to as burn-in (even if it technically isn't). If the image is static, patterns will indeed become more noticeable.

EDIT: I see others beat me to this point, ignore.

Well, it can push over 1800 nits; not much comes close.
Coming from a 600 nit peak, I'm guessing that much brightness hurts?

Not that it's used for more than, like, realistic momentary explosions, but just curious. No brand war here, happy for you.
 
Joined
Jun 10, 2014
Messages
2,889 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
That's generally referred to as burn-in (even if it technically isn't).
Just because a misconception is widespread doesn't make it true.
A burned-in pixel no longer works; when that happens on a CRT, the phosphor in that spot is destroyed.
Uneven wear just means some pixels have spent more of their lifetime, slightly altering the pixel's color response, resulting in visible patterns when various bright content is displayed.
If the usage pattern which produced this uneven wear continues, then the unevenness will get worse.
If the usage pattern is changed, the panel will become more even again, over thousands of hours of use.

If the image is static, patterns will indeed become more noticeable.
Listen closely; whether the picture is static or not is irrelevant. That myth needs to end now.
What matters is whether some region is brighter than others over time; that's it.
Most people still seem to think that leaving a static picture on for hours will destroy an OLED panel, but that's completely untrue. "Misusing" your OLED TV a few times, like pausing a movie and forgetting it, will not ruin it, and you needn't worry that a guest or your children use the TV when you're not watching. The only worry should be whether a portion of the screen will be significantly brighter than the rest over time (like certain TV channels, viewing the same web page or streaming layout all day, etc.), not whether the content in that region is static or moving.
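A toy illustration of that distinction, with hypothetical numbers and the simplifying assumption that wear scales with accumulated luminance over time:

```python
# Compare two pixels over the same 1000 hours: one shows a static logo at a
# constant 50% brightness, the other shows varied content that merely
# *averages* 50%. Both accumulate roughly the same brightness-hours.
import random

hours = 1000
static_pixel_wear = 0.5 * hours  # constant 50% brightness, never changes

random.seed(0)
varied_pixel_wear = sum(random.random() for _ in range(hours))  # mean ~0.5

print(f"static logo pixel:    {static_pixel_wear:.0f} brightness-hours")
print(f"varied content pixel: {varied_pixel_wear:.0f} brightness-hours")
# Both land near 500 brightness-hours, i.e. similar wear. A visible pattern
# needs a *region* that averages brighter than its neighbours, static or not.
```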

Please try to grasp this very important difference, because understanding it is necessary to determine if uneven wear is a problem or not for your use case.
 
Joined
Dec 30, 2021
Messages
354 (0.43/day)
Unfortunately, even today, so much content is raster-based and targets 96dpi. No matter how you scale it you will always run into downsides if you don't integer-scale.

Even with integer scaling at 200% you gain double the horizontal and vertical pixels, but then lose the 3x horizontal subpixel resolution because you can't use subpixel antialiasing like ClearType. For text it's actually a wash, as 1080p with ClearType has 50% greater horizontal resolution, and text has far more vertical lines in it than horizontal lines, making the horizontal resolution far more valuable than the additional vertical resolution that 4K provides. I know it's only text, but text legibility is the key thing people focus on for picture clarity on desktop monitors at those closer distances.

I'm not advocating that 1080p displays are better than 4K displays. I'm just saying that it's very difficult to get good text quality if you move away from 100% scaling at native resolution and have to give up subpixel AA; 200% scaling isn't categorically better than a lower-DPI display at 100%!
When was the last time you tried Windows' scaling system? I ask because this has not mirrored my experience. Text doesn't get upscaled with Windows scaling; it gets rendered with more pixels. And ClearType is compatible with it. Even at just 125%, text looks better to my eyes due to the increased text rendering resolution. Make sure you run the ClearType tool after changing the scaling factor.
 

ARF

Joined
Jan 28, 2020
Messages
3,892 (2.56/day)
Location
Ex-usa
Desktop 4K display adoption is not hampered by technology but by inadequacies of OS non-integer scaling.

Unfortunately, even today, so much content is raster-based and targets 96dpi. No matter how you scale it you will always run into downsides if you don't integer-scale.

Even with integer scaling at 200% you gain double the horizontal and vertical pixels, but then lose the 3x horizontal subpixel resolution because you can't use subpixel antialiasing like ClearType. For text it's actually a wash, as 1080p with ClearType has 50% greater horizontal resolution, and text has far more vertical lines in it than horizontal lines, making the horizontal resolution far more valuable than the additional vertical resolution that 4K provides. I know it's only text, but text legibility is the key thing people focus on for picture clarity on desktop monitors at those closer distances.

I'm not advocating that 1080p displays are better than 4K displays. I'm just saying that it's very difficult to get good text quality if you move away from 100% scaling at native resolution and have to give up subpixel AA; 200% scaling isn't categorically better than a lower-DPI display at 100%!

I find 1080p with 96 dpi more than terrible. It is criminal and should be banned from any office use.



EIZO 4K Monitors - High definition and large screen sizes | EIZO (eizoglobal.com)


Confused about HiDPI and Retina display? ― Understanding pixel density in the age of 4K | EIZO (eizoglobal.com)
 

Joined
Apr 18, 2019
Messages
1,863 (1.03/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
I wasn't aware you could pack that many buzzwords in such a short span.
 
Joined
Dec 22, 2011
Messages
3,890 (0.87/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600MHz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Coming from a 600 nit peak, I'm guessing that much brightness hurts?

Not that it's used for more than, like, realistic momentary explosions, but just curious. No brand war here, happy for you.

Well, it's certainly impressive. I'm not saying you need sunglasses or anything, but then I can't really say I've cranked it up in a pitch-black room either. I've dabbled a bit with certain HDR content and games like Shadow of the Tomb Raider, and the results are impressive. Of course, where it really shines is in rooms which are already bright and have loads of natural sunlight; its reflection handling is great too.

What sold me on it was the price. Typical MSRP for the 50" I got here in the UK is £1199, and it's currently on deal for £1099, but I was able to nab one back in September for £890 delivered, so I just pulled the trigger right away. Plenty big enough considering it's on my desk: Mini LED, 120 Hz 4K, HDMI 2.1, VRR, Game Mode giving really low latency, etc. Chuffed to bits.

So yeah, definitely not interested in a brand war. OLED is stunning, no question, and they obviously don't have to get as bright since they have superior blacks in the first place. Different tech, different solutions and all that.
 
Joined
Aug 20, 2007
Messages
20,709 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches
Software Windows 11 Enterprise (legit), Gentoo Linux x64
Listen closely; whether the picture is static or not is irrelevant. That myth needs to end now.
What matters is whether some region is brighter than others over time; that's it.
It does matter though, as varied content will exercise the pixels more evenly and the wear will be less noticeable.
 

ARF

Joined
Jan 28, 2020
Messages
3,892 (2.56/day)
Location
Ex-usa
Screensavers were invented specifically for CRT, plasma, and OLED screens! Always use a non-static picture with as much movement in the image as possible.
 
Joined
Feb 20, 2019
Messages
7,190 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Screensavers were invented specifically for CRT, plasma, and OLED screens! Always use a non-static picture with as much movement in the image as possible.
Auto screen-off is a better screensaver, and that's been the default since Windows 8.
 
Joined
Dec 30, 2021
Messages
354 (0.43/day)
Stick to auto sleep, and if you ever want to intentionally "wear out" your pixels more evenly, that's what LG's pixel refresher effectively does, but in a more measured manner.
 
Joined
Aug 20, 2007
Messages
20,709 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches
Software Windows 11 Enterprise (legit), Gentoo Linux x64
Stick to auto sleep, and if you ever want to intentionally "wear out" your pixels more evenly, that's what LG's pixel refresher effectively does
No, LG's pixel refresher does a lot more than that. It's more akin to programming each pixel's wear level into memory and having the panel attempt to hide weak spots.
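A toy sketch of that kind of wear compensation, purely to illustrate the idea. LG's actual algorithm is not public, and the efficiency estimate, gain map, and headroom figure below are all assumptions:

```python
# Toy per-subpixel wear compensation: estimate each subpixel's remaining
# efficiency (1.0 = new), then drive worn subpixels harder to flatten the
# panel, limited by available drive headroom. Illustrative only.
import numpy as np

def build_compensation(efficiency: np.ndarray, headroom: float = 1.25) -> np.ndarray:
    """Per-subpixel gain map that evens out a worn panel."""
    gain = 1.0 / np.clip(efficiency, 1e-3, 1.0)  # worn pixels get more drive
    return np.minimum(gain, headroom)            # can't exceed drive headroom

# Hypothetical 1D strip of subpixels with a "logo" region worn to 90%:
eff = np.ones(8)
eff[2:5] = 0.9
print(np.round(build_compensation(eff), 3))
# -> worn region gets ~1.111x drive; the rest stays at 1.0
```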
 

ARF

Joined
Jan 28, 2020
Messages
3,892 (2.56/day)
Location
Ex-usa
Auto screen-off is a better screensaver, and that's been the default since Windows 8.

In some cases, auto screen-off isn't an option. For example, when a TV programme with a station logo is being watched, the logo can burn a permanent mark into the panel.
 
Joined
Feb 20, 2019
Messages
7,190 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
In some cases, auto screen-off isn't an option. For example, when a TV programme with a station logo is being watched, the logo can burn a permanent mark into the panel.
We're talking about screensavers vs screen-off though. Whether burn-in happens due to uneven wear has no impact whatsoever on whether you should use a screensaver or allow the OS to turn off the display instead.

Running a screensaver when the TV is not in use is pointless because it's consuming power, generating heat, increasing wear, and preventing the pixel-refresh cycle from starting. The manufacturer's own pixel conditioning cycle can only occur when the display is inactive, and it is vastly more effective than a random screensaver, which has no ability to interact with the per-subpixel brightness calibration in the display's internal lookup tables and no way to read the resistance across each diode to feed that data back into the calibration system. A screensaver is just more use; in other words, it continues to wear out the OLEDs.

What benefit does a screensaver provide that auto-poweroff of the display doesn't? None. There are no advantages to a screensaver unless you like watching pretty screensavers, in which case fine - but don't pretend it's 'saving' your screen.
 

ARF

Joined
Jan 28, 2020
Messages
3,892 (2.56/day)
Location
Ex-usa
Who decides whether the TV is or isn't in use? It is quite annoying when someone is actually watching, and the TV somehow decides that it isn't in use lol and turns off.

There should be another solution that is not a complete turn-off :D
 
Joined
Feb 20, 2019
Messages
7,190 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Who decides whether the TV is or isn't in use? It is quite annoying when someone is actually watching, and the TV somehow decides that it isn't in use lol and turns off.

There should be another solution that is not a complete turn-off :D
What are you talking about? That's completely irrelevant; both screensaver and auto-off occur on the exact same trigger (user inactivity).

If you're now saying that screensavers and auto-off are annoying, then yes, they are - but less annoying than burn-in on an expensive OLED. I'm actually hoping that more TVs come with sensors, like laptops and phones have, that can tell whether anyone is actually watching and shut off after, say, 5 minutes with nobody sat in front of them. Displays typically wake up within a second or two, so it's not exactly a hardship for them to turn off as soon as they're not being used, especially if these sensors have worked out that you're back to watch the screen before you've even finished sitting down.
 