
LG Display Unveils Next-Generation OLED TV Display, the OLED EX

Technically yes, it's burnout, not burn-in, but burn-in is what it's been called for almost a century now, deal with it.
No, you don't understand the technical details. Only CRTs have burn-in. Burn-in means the pixel is physically burned and is destroyed.
The uneven wear for OLEDs and plasmas can be "reversed" by applying wear to the rest of the screen to even out the problem, or even just by changing your usage of the screen. Uneven wear is much less of a problem for OLEDs than for plasmas; in OLEDs it's mostly tied to very bright areas.

And FYI, LCDs also have wear on pixels, typically caused by sharp lines of contrast over time, which wears out the TFT panel in those spots, creating lines or shadows. Ironically, this is closer to a "burn-in" than anything that can happen on an OLED display. :P
 
"algorithm-based EX Technology, which helps boost the innovative display's overall picture quality by enhancing brightness up to 30 percent"

So a software tweak, which could easily be implemented on other LG OLED panels then?

And more seriously, it's really not a good sign that LG are not talking about colour volume or wider colour gamut in this awful PR marketing drivel. If they have not improved this, then Samsung may well win the next-gen OLED race with its QD technology. Not good, LG, not good. You'll be bright, but nowhere near as vibrant as Samsung.
 
I was tempted to go OLED, but still didn't want to have to worry about burn in (wear or whatever people want to call it) as I always intended to use the display as a monitor.

So I have zero regrets grabbing a QN90A instead: deep blacks, and it can get incredibly bright; HDR content looks stunning. A great all-rounder.
 
Every TV I ever buy is obsolete very quickly. The trick is to ignore the new shiny stuff and just be happy that what you now have is nicer than your old TV :)

My 2015-model Panasonic TX-50CX670E with a VA panel is perfect, for what it's worth.
I like it to this day and do not plan a new replacement.
49.5" Panasonic TX-50CX670 - Specifications (displayspecifications.com)


What is disturbing is the market situation: almost everybody has a 4K 3840x2160 TV set, while so few users have a 4K 3840x2160 PC monitor, which is actually more important because the user sits closer to the panel and can see its gigantic individual pixels.

What is even more disturbing is that LG cancelled its mobile phone business, so I don't know what I will buy next time, after 2025 when the warranty on my new OLED LG G8s ThinQ phone ends... :(
 
I was tempted to go OLED, but still didn't want to have to worry about burn in (wear or whatever people want to call it) as I always intended to use the display as a monitor.

So I have zero regrets grabbing a QN90A instead: deep blacks, and it can get incredibly bright; HDR content looks stunning. A great all-rounder.
I'd never buy one of these over a current gen OLED. The price for this dinosaur LCD tech is absolutely insane, not to mention the downsides to LCDs, like DSE, uneven backlighting and severe clouding and haloing. And not to mention it doesn't even do Dolby Vision. Total deal breaker.

I've had my LG C9 for over two years, and it's got thousands of hours on it, including gaming on it. No burn in, and a beautiful, bright picture. Watching DV content on it is a real experience. The latest models are even better!

Where people go wrong, is that they leave their Windows desktop on it all day, or for hours at a time unattended. Set the screen to go black after 5-10 mins of inactivity, and you'll be golden.
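For anyone using an OLED as a Windows monitor, here's a quick way to script that timeout (just a sketch; it shells out to the standard powercfg utility, and the 10/5-minute values are only examples):

```python
# Set the Windows display-off timeout via the built-in powercfg tool:
# 10 minutes on AC power, 5 minutes on battery.
import subprocess

subprocess.run(["powercfg", "/change", "monitor-timeout-ac", "10"], check=True)
subprocess.run(["powercfg", "/change", "monitor-timeout-dc", "5"], check=True)
```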
 
Every time there's a brightness and/or power-efficiency boost (they usually go hand-in-hand with OLED), that also means there's a burn-in reduction boost, though LG never markets this since they like to pretend that burn-in isn't an issue at all. The less power you have to send to a pixel to reach a specific desired brightness level, the less it will deteriorate over time. So if LG's super special deuterium tech allows for 30% more brightness at the same amount of power input, and if you have a specific brightness target in mind, you may need 23% less power to reach it. And the organic pixels will deteriorate less as a result. How much less is something I can't confidently say because this may not be a linear relationship, and other factors such as heat levels affect the stability of the organic compounds as well. Anyway, I would expect some small amount of extra burn-in resistance from this advancement, which compounds with the many other small advancements made over the years. In 2022, OLED panels will likely be considerably more burn-in-resistant than they were 7 years ago.
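As a rough sanity check on that figure (just a sketch, assuming luminance scales linearly with drive power, which real panels only approximate):

```python
# Back-of-the-envelope check: 30% more brightness at the same power means
# roughly 23% less power to hit the same brightness target (assuming a
# naive linear brightness-to-power relationship).
brightness_gain = 0.30                      # "up to 30 percent" from LG's PR
relative_power = 1 / (1 + brightness_gain)  # power needed for the old target

print(f"Relative power needed: {relative_power:.2f}")  # ~0.77
print(f"Power saving: {1 - relative_power:.0%}")        # ~23%
```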

I do think it's worth being cautious about burn-in with OLED panels, even if there are many people who happily use OLED TVs as monitors without any noticeable burn-in. The posters in this thread are right—taking extra steps such as lowering the brightness to 120 or 100 nits, setting a dark background and using dark mode for your apps, and setting your task bar to auto-hide, will all help your display last longer. These aren't options for everyone, though, and OLED displays aren't appropriate for every use case. I work at home, and my work PC is also my leisure PC. It's also in a well-lit room. I'm used to 200 - 250 nits, not 100 - 120. I also have static interface elements on screen for 8 - 12 hours a day, every single day. There's no getting rid of them. And going full dark theme for everything is just not how I want to live my life, lol. I'll choose to hold out on OLED as my main PC display until there's a panel that can last 5+ years for less while being on with static interface elements for 3500+ hours each year. It's a pretty demanding requirement, and I'm guessing we're still quite a few years away from that being possible. In the meantime, I'll happily buy an OLED TV for the living room. :)
That's bad for your eyes. 120 nits is always the target brightness, as it's the same luminance level as a white sheet of paper in daylight. That's what our eyes are naturally used to. It's also why you see professional monitors with hoods: no matter what the ambient luminance, people still work at 120 nits.
 
I'd never buy one of these over a current gen OLED. The price for this dinosaur LCD tech is absolutely insane. And not to mention it doesn't even do Dolby Vision. Total deal breaker.

I've had my LG C9 for over two years, and it's got thousands of hours on it, no burn in, and a beautiful, bright picture. Watching DV content on it is a real experience. The latest models are even better!

Like I said, I have zero regrets. If I only intended to use it as a TV I would have bought an OLED, but this dinosaur tech boosted with Mini LED backlighting gives OLED a real run for its money.

Besides, I got it for a steal too.
 
Like I said, I have zero regrets. If I only intended to use it as a TV I would have bought an OLED, but this dinosaur tech boosted with Mini LED backlighting gives OLED a real run for its money.

Besides, I got it for a steal too.
That's the problem with miniLED: it costs as much as OLED, if not more :(

Great that you got a good deal on it.
 
No, you don't understand the technical details. Only CRTs have burn-in. Burn-in means the pixel is physically burned and is destroyed.
The uneven wear for OLEDs and plasmas can be "reversed" by applying wear to the rest of the screen to even out the problem, or even just by changing your usage of the screen. Uneven wear is much less of a problem for OLEDs than for plasmas; in OLEDs it's mostly tied to very bright areas.

And FYI, LCDs also have wear on pixels, typically caused by sharp lines of contrast over time, which wears out the TFT panel in those spots, creating lines or shadows. Ironically, this is closer to a "burn-in" than anything that can happen on an OLED display. :p
CRTs don't have pixels; they have triads of R, G, and B phosphor dots that convert non-visible EM from the electron beam into visible light. The phosphors do eventually lose their luminance WRT the electron beam strength, resulting in uneven wear, but ion burn actually blasted phosphor deposits onto the glass of the CRT, creating a dark burn mark visible even when the CRT was switched off. I really do fully understand the technical details.

Plasmas are also phosphor-based, so they suffer the exact same mechanism; it just manifests a little differently, but effectively you can still get burn-in that is visible when the plasma display isn't even powered.

QLED is a little different. Quantum dots aren't technically phosphors and not much is documented about their wear, but they effectively do with cadmium or indium and blue light what phosphor did in CRTs and plasmas with non-visible parts of the EM spectrum. They are likely to suffer wear in exactly the same way, but potentially the lower energy of narrow-band blue light, compared to the higher EM energy levels in CRT and plasma displays, will make the burn-in negligible or even undetectable over the useful life of the QLED display.

OLEDs are the only ones that burn out rather than burn in. No material is blasted (the technical term is ion burn) off the phosphor or the cadmium/indium of quantum dots and deposited on the next layer, so it's literally just the LED diodes themselves wearing out. Technically they're not burning; the decline of LED brightness (OLEDs included) is caused by the heat of operation thermally expanding and contracting the semiconductor, introducing further dislocations in its crystalline structure. Over time, more and more dislocations mean that fewer of the electrons crossing the p-n junction jump the right amount, which changes their energy levels and thus shifts the wavelength of the emitted photon outside the visible range.

So, it's not technically "burn", but it is wear caused by heat, and the hotter they get the faster they wear out - you can probably understand why so many people call it burnout, as the LEDs wear out even though it's thermally triggered crystal grain breakup.
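To make the "hotter means faster wear" point concrete, here's a purely illustrative toy model (the exponential-decay form, the Arrhenius-style temperature acceleration and every constant in it are assumptions for the sketch, not measured data for any real panel):

```python
import math

def relative_luminance(hours: float, junction_temp_c: float,
                       base_rate: float = 2e-5, activation_ev: float = 0.4) -> float:
    """Toy luminance-decay model: exponential decay whose rate grows with
    junction temperature in an Arrhenius-like fashion. All constants are
    illustrative only."""
    k_b = 8.617e-5                      # Boltzmann constant, eV/K
    t_ref, t = 298.15, junction_temp_c + 273.15
    accel = math.exp(activation_ev / k_b * (1 / t_ref - 1 / t))
    return math.exp(-base_rate * accel * hours)

for temp in (25, 45, 65):
    print(f"{temp} C junction: {relative_luminance(10_000, temp):.1%} "
          "of original brightness after 10,000 h")
```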

What is disturbing is the market situation: almost everybody has a 4K 3840x2160 TV set, while so few users have a 4K 3840x2160 PC monitor, which is actually more important because the user sits closer to the panel and can see its gigantic individual pixels.
Desktop 4K display adoption is not hampered by technology but by inadequacies of OS non-integer scaling.

Unfortunately, even today, so much content is raster-based and targets 96dpi. No matter how you scale it you will always run into downsides if you don't integer-scale.

Even with integer scaling at 200% you gain double the horizontal and vertical pixels, but then lose 3x the horizontal resolution because you can't use subpixel antialiasing like ClearType. For text it's actually a wash, as 1080p with ClearType has 50% greater horizontal resolution, and text has far more vertical lines in it than horizontal lines, making horizontal resolution far more valuable than the additional vertical resolution that 4K provides. I know it's only text, but text legibility is the key thing people focus on for picture clarity on desktop monitors at those closer distances.

I'm not advocating that 1080p displays are better than 4K displays; I'm just saying that it's very difficult to get good text quality if you move away from 100% scaling at native resolution and have to give up subpixel AA. 200% scaling isn't categorically better than a lower-dpi display at 100%!
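For what it's worth, here's the arithmetic behind that "50% greater" claim spelled out (a simplification that treats each RGB stripe as an independent horizontal sample under subpixel AA):

```python
# Effective horizontal sample count for text rendering (simplified).
def horizontal_text_samples(pixels_wide: int, subpixel_aa: bool) -> int:
    # With subpixel AA each pixel's R, G and B stripes act as separate
    # horizontal samples; without it, one pixel = one sample.
    return pixels_wide * (3 if subpixel_aa else 1)

fhd_cleartype = horizontal_text_samples(1920, subpixel_aa=True)    # 5760
uhd_grayscale = horizontal_text_samples(3840, subpixel_aa=False)   # 3840

print(f"1080p + ClearType: {fhd_cleartype} samples")
print(f"4K, no subpixel AA: {uhd_grayscale} samples")
print(f"1080p advantage: {fhd_cleartype / uhd_grayscale - 1:.0%}")  # 50%
```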
 
That's the problem with miniLED: it costs as much as OLED, if not more :(

Great that you got a good deal on it.

Well, it can push over 1800 nits, not much comes close.
 
OLED can suffer from uneven wear, but it has nothing to do with pictures being static. It's caused by areas being significantly brighter over time wearing those pixels more than the others. This will happen regardless of whether the picture is static or changing.
That's generally referred to as burn-in (even if it technically isn't). If the image is static, patterns will indeed become more noticeable.

EDIT: I see others beat me to this point, ignore.

Well, it can push over 1800 nits, not much comes close.
Coming from a 600 nit peak, I'm guessing that much brightness hurts?

Not that it's used for more than like realistic momentary explosions, but just curious. No brand war here, happy for you.
 
That's generally referred to as burn-in (even if it technically isn't).
Just because a misconception is widespread enough, doesn't make it true.
A burned-in pixel no longer works; when that happens on a CRT, the phosphor in that spot is destroyed.
Uneven wear just means some pixels have spent more of their lifetime, slightly altering the pixel's color response, resulting in visible patterns when various bright content is displayed.
If the usage pattern which produced this uneven wear continues, then the unevenness will get worse.
If the usage pattern is changed, the panel will become more even again, over thousands of hours of use.

If the image is static, patterns will indeed become more noticeable.
Listen closely; whether the picture is static or not is irrelevant. That myth needs to end now.
What matters is some region is brighter than others over time, that's it.
Most people still seem to think that leaving a static picture on for hours will destroy an OLED panel, but that's completely untrue. You will not ruin your OLED TV by "misusing" it a few times, like pausing a movie and forgetting it, and you don't need to worry that a guest or your children will use the TV when you're not watching. The only worry should be whether a portion of the screen will be significantly brighter than the rest over time (like certain TV channels, viewing the same web page or streaming layout all day, etc.), not whether the content in that region is static or moving.

Please try to grasp this very important difference, because understanding it is necessary to determine if uneven wear is a problem or not for your use case.
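If it helps, here's a tiny sketch of that claim (purely illustrative; it assumes pixel wear tracks cumulative emitted light and nothing else):

```python
# Toy comparison: cumulative "wear" of one pixel under two usage patterns
# that share the same average brightness. Wear here is simply integrated
# brightness over time, per the argument above.
hours = 1000

# Pattern A: a static logo pixel held at 40% brightness the whole time.
static_wear = sum(0.40 for _ in range(hours))

# Pattern B: varied content alternating 80% / 0%, averaging 40%.
varied_wear = sum(0.80 if h % 2 == 0 else 0.0 for h in range(hours))

print(static_wear, varied_wear)  # identical totals -> identical ageing of that pixel
```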
 
Unfortunately, even today, so much content is raster-based and targets 96dpi. No matter how you scale it you will always run into downsides if you don't integer-scale.

Even with integer scaling at 200% you gain double the horizontal and vertical pixels, but then lose 3x the horizontal resolution because you can't use subpixel antialiasing like ClearType. For text it's actually a wash, as 1080p with ClearType has 50% greater horizontal resolution, and text has far more vertical lines in it than horizontal lines, making horizontal resolution far more valuable than the additional vertical resolution that 4K provides. I know it's only text, but text legibility is the key thing people focus on for picture clarity on desktop monitors at those closer distances.

I'm not advocating that 1080p displays are better than 4K displays; I'm just saying that it's very difficult to get good text quality if you move away from 100% scaling at native resolution and have to give up subpixel AA. 200% scaling isn't categorically better than a lower-dpi display at 100%!
When was the last time you tried Windows' scaling system? I ask because this has not mirrored my experience. Text doesn't get upscaled with Windows scaling; it gets rendered with more pixels. And ClearType is compatible with it. Even at just 125%, text looks better to my eyes due to the increased text-rendering resolution. Make sure you run the ClearType tuner after changing the scaling factor.
 
Desktop 4K display adoption is not hampered by technology but by inadequacies of OS non-integer scaling.

Unfortunately, even today, so much content is raster-based and targets 96dpi. No matter how you scale it you will always run into downsides if you don't integer-scale.

Even with integer scaling at 200% you gain double the horizontal and vertical pixels, but then lose 3x the horizontal resolution because you can't use subpixel antialiasing like ClearType. For text it's actually a wash, as 1080p with ClearType has 50% greater horizontal resolution, and text has far more vertical lines in it than horizontal lines, making horizontal resolution far more valuable than the additional vertical resolution that 4K provides. I know it's only text, but text legibility is the key thing people focus on for picture clarity on desktop monitors at those closer distances.

I'm not advocating that 1080p displays are better than 4K displays; I'm just saying that it's very difficult to get good text quality if you move away from 100% scaling at native resolution and have to give up subpixel AA. 200% scaling isn't categorically better than a lower-dpi display at 100%!

I find 1080p with 96 dpi more than terrible. It is criminal and should be banned from any office use.



EIZO 4K Monitors - High definition and large screen sizes | EIZO (eizoglobal.com)


Confused about HiDPI and Retina display? ― Understanding pixel density in the age of 4K | EIZO (eizoglobal.com)
 

I wasn't aware you could pack that many buzzwords in such a short span.
 
Coming from a 600 nit peak, I'm guessing that much brightness hurts?

Not that it's used for more than like realistic momentary explosions, but just curious. No brand war here, happy for you.

Well, it's certainly impressive. I'm not saying you need sunglasses or anything, though I can't really say I've cranked it up in a pitch-black room either. I've dabbled a bit with certain HDR content and games like Shadow of the Tomb Raider and the results are impressive. Of course, where it shines is in rooms which are already really bright and have loads of natural sunlight; its reflection handling is great too.

What sold me on it was the price: typical MSRP for the 50" I got here in the UK is £1199, and it's currently on deal for £1099, but I was able to nab one back in September for £890 delivered, so I just pulled the trigger right away. Plenty big enough considering it's on my desk; Mini LED, 120 Hz 4K, HDMI 2.1, VRR, Game Mode giving really low latency, etc. etc. Chuffed to bits.

So yeah, definitely not interested in a brand war. OLED is stunning, no question, and they obviously don't have to get as bright since they have superior blacks in the first place. Different tech, different solutions and all that.
 
Listen closely; whether the picture is static or not is irrelevant. That myth needs to end now.
What matters is some region is brighter than others over time, that's it.
It does matter though, as varied content will exercise the pixels more evenly and the wear will be less noticeable.
 
Screensavers were invented specifically for CRT, plasma and OLED screens! Always use a non-static picture with as much movement in the displayed images as possible.
 
Screensavers were invented specifically for CRT, plasma and OLED screens! Always use a non-static picture with as much movement in the displayed images as possible.
Auto screen-off is a better screensaver, and that's been the default since Windows 8.
 
Stick to auto sleep, and if you ever want to intentionally "wear out" your pixels more evenly, that's what LG's pixel refresher effectively does, but in a more measured manner.
 
Stick to auto sleep, and if you ever want to intentionally "wear out" your pixels more evenly, that's what LG's pixel refresher effectively does
No, LG's pixel refresher does a lot more than that. It's more akin to programming each pixel's wear level into memory and having the panel attempt to hide weak spots.
 
Auto screen-off is a better screensaver, and that's been the default since Windows 8.

In some cases, auto screen-off isn't an option. For example, when you're watching a TV programme whose static logo stays on screen the whole time, and that logo ends up leaving a permanent mark on the panel.
 
In some cases, auto screen-off isn't an option. For example, when you're watching a TV programme whose static logo stays on screen the whole time, and that logo ends up leaving a permanent mark on the panel.
We're talking about screensavers vs screen-off though; whether burn-in happens or not due to uneven wear has no impact whatsoever on whether you should use a screensaver or allow the OS to turn off the display instead.

Running a screensaver when the TV is not in use is pointless because it's consuming power, generating heat, increasing wear, and preventing the pixel-refresh cycle from starting. The manufacturer's own pixel conditioning cycle can only occur when the display is inactive, and it is vastly more effective than a random screensaver that has no ability to interact with the per-subpixel brightness calibration of a display's internal lookup tables and no way to read the resistance across each diode to feed that data back into the calibration system. A screensaver is just more use; in other words, it continues to wear out the OLEDs.

What benefit does a screensaver provide that auto power-off of the display doesn't? None; there are no advantages to a screensaver unless you like watching pretty screensavers, in which case fine - but don't pretend it's 'saving' your screen.
 
Who decides whether the TV is or isn't in use? It is quite annoying when someone is actually watching, and the TV somehow decides that it isn't in use lol and turns off.

There should be another solution that is not a complete turn off :D
 
Who decides whether the TV is or isn't in use? It is quite annoying when someone is actually watching, and the TV somehow decides that it isn't in use lol and turns off.

There should be another solution that is not a complete turn off :D
What are you talking about? That's completely irrelevant; both screensaver and auto-off occur on the exact same trigger (user inactivity).

If you're now saying that screensavers and auto-off are annoying, then yes, they are - but less annoying than burn-in on an expensive OLED. I'm actually hoping that more TVs come with sensors like laptops and phones that can tell if anyone is actually watching them or not, and shut off after, say, 5 minutes where nobody is sat in front of them. Displays typically wake up within a second or two so it's not exactly a hardship for them to turn off as soon as they're not being used, especially if these sensors have worked out that you're back to watch the screen before you've even finished sitting down.
 