Thursday, December 30th 2021

LG Display Unveils Next-Generation OLED TV Display, the OLED EX

LG Display, the world's leading innovator of display technologies, today unveiled its newest OLED TV technology, OLED EX. The next-generation OLED EX display implements LG Display's deuterium- and personalized-algorithm-based EX Technology, which boosts the display's overall picture quality by enhancing brightness up to 30 percent compared to conventional OLED displays.

The OLED EX name is an acronym of "Evolution" and "eXperience", which represents the company's goal of providing customers with new experiences through its ever-evolving OLED technology. OLED displays are self-emissive by nature with their multiple millions of pixels emitting light independently without a separate backlight source. This distinctive characteristic lets OLED EX achieve the perfect black, rich and accurate color expression as well as an extremely fast response time.
Since 2013, the year it became the first to mass produce OLED TV displays, LG Display has been consistently improving its leading OLED technology. OLED EX is the result of the unparalleled knowledge and know-how the company has gained over nearly ten years of developing OLED displays, created to deliver the most lifelike images that transcend the limitations of a conventional display.

The EX Technology applied to the OLED EX displays combines deuterium compounds and personalized algorithms to enhance the stability and efficiency of the organic light emitting diode, thereby improving the overall display performance.

Thanks to EX Technology, OLED EX displays unlock new levels of picture accuracy and brightness to accurately deliver exquisite, realistic details and colors without any distortion - such as the reflection of sunlight on a river or each individual vein of a tree leaf.

Deuterium compounds are used to make highly efficient organic light-emitting diodes that emit stronger light. LG Display has successfully replaced the hydrogen present in the organic light-emitting elements with stable deuterium, and has managed to apply the compounds to OLED EX for the first time.

Deuterium is twice as heavy as ordinary hydrogen, and only a small amount exists in nature: roughly one deuterium atom for every 6,000 ordinary hydrogen atoms. LG Display has worked out how to extract deuterium from water and apply it to organic light-emitting devices. Once stabilized, the deuterium compounds allow the display to emit brighter light while maintaining high efficiency for a long time.

In addition, LG Display's own 'personalized algorithm', based on machine learning technology, gives OLED EX finer control over its light-emitting devices. After learning individual viewing patterns, the algorithm predicts the usage of the up to 33 million organic light-emitting diodes in an 8K display and precisely controls the display's energy input to more accurately express the details and colors of the video content being played.
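As a side note, the "33 million" figure follows directly from 8K pixel arithmetic; a quick sanity check (treating each pixel as one addressable emitter, a simplification of the actual subpixel layout):

```python
# The "up to 33 million" diode count matches the pixel count of an 8K panel.
width, height = 7680, 4320          # standard 8K UHD resolution
pixels = width * height
print(pixels)                       # 33177600, i.e. ~33 million
assert round(pixels / 1_000_000) == 33
```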

LG Display has also upgraded its designs through the new OLED EX technology. Using EX Technology, the company reduced bezel thickness from 6 mm to 4 mm on 65-inch OLED displays. By shrinking the bezel by 30 percent compared to existing OLED displays, the OLED EX display creates an even more immersive viewing experience while delivering a sleeker, more premium design.

LG Display plans to strengthen its leadership and product competitiveness in the large-sized OLED business by integrating OLED EX technology into all OLED TV displays manufactured at its OLED production plants in Paju, South Korea, and in Guangzhou, China, starting from the second quarter of 2022.

In 2013, its first year of mass-producing OLED TV displays, LG Display sold 200,000 units, and by early last year had recorded accumulated sales of 10 million units. In the two years since, the company's accumulated sales have doubled to surpass 20 million units globally.

"Despite the global TV market experiencing a 12 percent decline this year, we still observed a 70 percent growth in OLED sales," said Dr. Oh Chang-ho, Executive Vice President & Head of the TV Business Unit at LG Display. "With our new OLED EX technology, we aim to provide even more innovative, high-end customer experiences through the evolution of our OLED technology, algorithms and designs."

80 Comments on LG Display Unveils Next-Generation OLED TV Display, the OLED EX

#51
stimpy88
"algorithm-based EX Technology, which helps boost the innovative display's overall picture quality by enhancing brightness up to 30 percent"

So a software tweak, which could easily be implemented on other LG OLED panels then?

And more seriously, it's really not a good sign that LG are not talking about colour volume or a wider colour gamut in this awful PR marketing drivel. If they have not improved this, then Samsung may well win the next-gen OLED race with its QD technology. Not good LG, not good - you'll be bright, but nowhere near as vibrant as Samsung.
Posted on Reply
#52
Fluffmeister
I was tempted to go OLED, but still didn't want to have to worry about burn in (wear or whatever people want to call it) as I always intended to use the display as a monitor.

So I have zero regrets grabbing a QN90A instead, deep blacks and it can get incredibly bright, HDR content looks stunning. A great all rounder.
Posted on Reply
#53
ARF
Chrispy_Every TV I ever buy is obsolete very quickly. The trick is to ignore the new shiny stuff and just be happy that what you now have is nicer than your old TV :)
My 2015 model Panasonic TX-50CX670E with VA is perfect for what it is worth.
I like it to this day and do not plan a new replacement.
49.5" Panasonic TX-50CX670 - Specifications (displayspecifications.com)


What is disturbing is this market situation in which almost everybody has a 4K 3840x2160 TV set, while so few users have a 4K 3840x2160 PC monitor, which actually matters more because the user sits closer to the panel and sees its gigantic individual pixels.

What is even more disturbing is that LG cancelled its mobile phone business, so I don't know what I will buy next time, after 2025 when the guarantee on my LG G8s ThinQ OLED phone ends... :(
Posted on Reply
#54
stimpy88
FluffmeisterI was tempted to go OLED, but still didn't want to have to worry about burn in (wear or whatever people want to call it) as I always intended to use the display as a monitor.

So I have zero regrets grabbing a QN90A instead, deep blacks and it can get incredibly bright, HDR content looks stunning. A great all rounder.
I'd never buy one of these over a current-gen OLED. The price for this dinosaur LCD tech is absolutely insane, not to mention the downsides to LCDs, like DSE, uneven backlighting and severe clouding and haloing. And it doesn't even do Dolby Vision. Total deal breaker.

I've had my LG C9 for over two years, and it's got thousands of hours on it, including gaming on it. No burn in, and a beautiful, bright picture. Watching DV content on it is a real experience. The latest models are even better!

Where people go wrong is that they leave their Windows desktop on it all day, or for hours at a time unattended. Set the screen to go black after 5-10 mins of inactivity, and you'll be golden.
Posted on Reply
#55
bug
kongaEvery time there's a brightness and/or power-efficiency boost (they usually go hand-in-hand with OLED), that also means there's a burn-in reduction boost, though LG never markets this since they like to pretend that burn-in isn't an issue at all. The less power you have to send to a pixel to reach a specific desired brightness level, the less it will deteriorate over time. So if LG's super special deuterium tech allows for 30% more brightness at the same amount of power input, and if you have a specific brightness target in mind, you may need 23% less power to reach it. And the organic pixels will deteriorate less as a result. How much less is something I can't confidently say because this may not be a linear relationship, and other factors such as heat levels affect the stability of the organic compounds as well. Anyway, I would expect some small amount of extra burn-in resistance from this advancement, which compounds with the many other small advancements made over the years. In 2022, OLED panels will likely be considerably more burn-in-resistant than they were 7 years ago.

I do think it's worth being cautious about burn-in with OLED panels, even if there are many people who happily use OLED TVs as monitors without any noticeable burn-in. The posters in this thread are right—taking extra steps such as lowering the brightness to 120 or 100 nits, setting a dark background and using dark mode for your apps, and setting your task bar to auto-hide, will all help your display last longer. These aren't options for everyone, though, and OLED displays aren't appropriate for every use case. I work at home, and my work PC is also my leisure PC. It's also in a well-lit room. I'm used to 200 - 250 nits, not 100 - 120. I also have static interface elements on screen for 8 - 12 hours a day, every single day. There's no getting rid of them. And going full dark theme for everything is just not how I want to live my life, lol. I'll choose to hold out on OLED as my main PC display until there's a panel that can last 5+ years for less while being on with static interface elements for 3500+ hours each year. It's a pretty demanding requirement, and I'm guessing we're still quite a few years away from that being possible. In the meantime, I'll happily buy an OLED TV for the living room. :)
That's bad for your eyes. 120 is always the target brightness as it's the same luminance level as a white sheet of paper in daylight. That's what our eyes are naturally used to. It's also why you see professional monitors with hoods: no matter what the ambient luminance, people still work at 120 nits.
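The power arithmetic in the quoted comment checks out; here is a quick sketch, under the simplifying assumption that brightness scales linearly with drive power:

```python
# Toy check of the quoted claim: if a panel delivers 30% more brightness
# at the same power, then hitting a fixed brightness target needs less power.
# Assumes brightness is linear in drive power (a simplification).
efficiency_gain = 1.30              # 30% more light per watt
power_needed = 1 / efficiency_gain  # fraction of old power for same brightness
savings = 1 - power_needed
print(f"{savings:.1%}")             # 23.1% less power for the same brightness
```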
Posted on Reply
#56
Fluffmeister
stimpy88I'd never buy one of these over a current gen OLED. The price for this dinosaur LCD tech is absolutely insane. And not to mention it doesn't even do Dolby Vision. Total deal breaker.

I've had my LG C9 for over two years, and it's got thousands of hours on it, no burn in, and a beautiful, bright picture. Watching DV content on it is a real experience. The latest models are even better!
Like I said, I have zero regrets. If I only intended to use it as a TV I would have bought an OLED, but this dinosaur tech boosted with Mini LED backlighting gives OLED a real run for its money.

Besides, I got it for a steal too.
Posted on Reply
#57
bug
FluffmeisterLike I said, I have zero regrets. If I only intended to use it as a TV I would have bought an OLED, but this dinosaur tech boosted with Mini LED backlighting gives OLED a real run for its money.

Besides, I got it for a steal too.
That's the problem with miniLED: it costs as much as OLED, if not more :(

Great that you got a good deal on it.
Posted on Reply
#58
Chrispy_
efikkanNo, you don't understand the technical details. Only CRTs have burn-in. Burn-in means the pixel is physically burned and is destroyed.
The uneven wear for OLEDs and plasmas can be "reversed" by applying wear to the rest of the screen to even out the problem, or even just by changing your usage of the screen. Uneven wear is much less of a problem for OLED than plasma, where in OLED it's mostly tied to very bright areas.

And FYI, LCDs also have wear on pixels, typically caused by sharp lines of contrast over time, causing the TFT panel to wear out in those spots, creating lines or shadows. Ironically, this is closer to a "burn-in" than anything that can happen on an OLED display. :p
CRTs don't have pixels; they have triads of R, G, and B phosphor dots that convert non-visible EM from the electron beam into visible light. The phosphors do eventually lose their luminance relative to the electron beam strength, resulting in uneven wear, but ion burn actually blasts phosphor deposits onto the glass of the CRT, creating a dark burn mark visible even when the CRT is switched off. I really do fully understand the technical details.

Plasmas are also phosphor-based, so they suffer from the exact same mechanism; it just manifests a little differently, but effectively you can still get burn-in that is visible even when the plasma display isn't powered.

QLED is a little different. Quantum dots are not technically phosphors and not much is documented about their wear, but they effectively do with cadmium or indium and blue light what phosphor did in CRTs and plasmas with non-visible parts of the EM spectrum. They are likely to suffer wear in exactly the same way, but the lower energy of narrow-band blue light compared to the higher EM energies in CRT and plasma displays will potentially make the burn-in negligible or even undetectable over the useful life of a QLED display.

OLEDs are the only ones that burn out rather than burn in. No material is blasted off (the technical term is ion burn) the phosphor or the cadmium/indium quantum dots and deposited on the next layer, so it's literally just the LED diodes themselves wearing out. Technically they're not burning: the decline in LED brightness (including OLEDs) is caused by the heat of operation thermally expanding and contracting the semiconductor, creating further dislocations in its crystalline structure. Over time, more and more dislocations mean that fewer of the electrons jumping across the p-n junction are jumping the right amount, which changes their energy levels and thus shifts the wavelength of the emitted photon to a non-visible wavelength.

So it's not technically "burn", but it is wear caused by heat, and the hotter they get the faster they wear - you can probably understand why so many people call it burnout, as the LEDs wear out even though it's thermally triggered crystal-grain breakup.
ARFWhat is disturbing is this market situation that almost everybody has a 4K 3840x2160 TV-set while so few users have a 4K 3840x2160 PC monitor that is actually more important because the user sits closer to the panel and sees its gigantic individual pixels.
Desktop 4K display adoption is not hampered by technology but by inadequacies of OS non-integer scaling.

Unfortunately, even today, so much content is raster-based and targets 96dpi. No matter how you scale it you will always run into downsides if you don't integer-scale.

Even with integer scaling at 200% you gain double the horizontal and vertical pixels, but you lose 3x the horizontal resolution because you can't use subpixel antialiasing like ClearType. For text it's actually a wash, as 1080p with ClearType has 50% greater horizontal resolution, and text has far more vertical lines than horizontal ones, making horizontal resolution far more valuable than the additional vertical resolution that 4K provides. I know it's only text, but text legibility is the key thing people focus on for picture clarity on desktop monitors at those closer distances.

I'm not advocating that 1080p displays are better than 4K displays; I'm just saying that it's very difficult to get good text quality if you move away from 100% scaling at native resolution and have to give up subpixel AA. 200% scaling isn't categorically better than a lower-dpi display at 100%!
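The horizontal-resolution claim above can be sanity-checked with some quick arithmetic, assuming the idealized case where ClearType addresses all three subpixels independently:

```python
# Effective horizontal text-sampling positions, idealized:
# ClearType drives R, G and B subpixels independently (x3 horizontally);
# without subpixel AA, only whole pixels are addressable.
fhd_with_cleartype = 1920 * 3       # 5760 effective horizontal samples
uhd_without_cleartype = 3840 * 1    # 3840 whole-pixel samples
ratio = fhd_with_cleartype / uhd_without_cleartype
print(ratio)    # 1.5 -> 1080p + ClearType has 50% more horizontal samples
```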
Posted on Reply
#59
Fluffmeister
bugThat's the problem with miniLED: it costs as much as OLED, if not more :(

Great that you got a good deal on it.
Well, it can push over 1800 nits, not much comes close.
Posted on Reply
#60
R-T-B
efikkanOLED can suffer from uneven wear, but it has nothing to do with pictures being static. It's caused by areas being significantly brighter over time wearing those pixels more than the others. This will happen regardless if the picture is static or changing.
That's generally referred to as burn-in (even if it technically isn't). If the image is static, patterns will indeed become more noticeable.

EDIT: I see others beat me to this point, ignore.
FluffmeisterWell, it can push over 1800 nits, not much comes close.
Coming from a 600 nit peak, I'm guessing that much brightness hurts?

Not that it's used for more than realistic momentary explosions, but just curious. No brand war here, happy for you.
Posted on Reply
#61
efikkan
R-T-BThat's generally referred to as burn-in (even if it technically isn't).
Just because a misconception is widespread enough, doesn't make it true.
A burned-in pixel no longer works; when that happens on a CRT, the phosphor in that spot is destroyed.
Uneven wear just means some pixels have spent more of their lifetime, slightly altering the pixel's color response, resulting in visible patterns when various bright content is displayed.
If the usage pattern which produced this uneven wear continues, then the unevenness will get worse.
If the usage pattern is changed, the panel will become more even again, over thousands of hours of use.
R-T-BIf the image is static, patterns will indeed become more noticeable.
Listen closely; whether the picture is static or not is irrelevant. That myth needs to end now.
What matters is whether some region is brighter than others over time; that's it.
Most people still seem to think that leaving a static picture on for hours will destroy an OLED panel, but that's completely untrue. You will not "misuse" your OLED TV a few times and risk ruining it, like pausing a movie and forgetting it, or need to worry that a guest or your children use the TV when you're not watching. The only worry should be whether a portion of the screen will be significantly brighter than the rest over time (like certain TV channels, or viewing the same web page or streaming layout all day), not whether the content in that region is static or moving.

Please try to grasp this very important difference, because understanding it is necessary to determine if uneven wear is a problem or not for your use case.
Posted on Reply
#62
konga
Chrispy_Unfortunately, even today, so much content is raster-based and targets 96dpi. No matter how you scale it you will always run into downsides if you don't integer-scale.

Even with integer scaling at 200% you gain double the horizontal and vertical pixels, but you lose 3x the horizontal resolution because you can't use subpixel antialiasing like ClearType. For text it's actually a wash, as 1080p with ClearType has 50% greater horizontal resolution, and text has far more vertical lines than horizontal ones, making horizontal resolution far more valuable than the additional vertical resolution that 4K provides. I know it's only text, but text legibility is the key thing people focus on for picture clarity on desktop monitors at those closer distances.

I'm not advocating that 1080p displays are better than 4K displays; I'm just saying that it's very difficult to get good text quality if you move away from 100% scaling at native resolution and have to give up subpixel AA. 200% scaling isn't categorically better than a lower-dpi display at 100%!
When was the last time you tried Windows' scaling system? I ask because this has not mirrored my experience. Text doesn't get upscaled with Windows scaling; it gets rendered with more pixels. And ClearType is compatible with it. Even at just 125%, text looks better to my eyes due to the increased text-rendering resolution. Make sure you run the ClearType tool after changing the scaling factor.
Posted on Reply
#63
ARF
Chrispy_Desktop 4K display adoption is not hampered by technology but by inadequacies of OS non-integer scaling.

Unfortunately, even today, so much content is raster-based and targets 96dpi. No matter how you scale it you will always run into downsides if you don't integer-scale.

Even with integer scaling at 200% you gain double the horizontal and vertical pixels, but you lose 3x the horizontal resolution because you can't use subpixel antialiasing like ClearType. For text it's actually a wash, as 1080p with ClearType has 50% greater horizontal resolution, and text has far more vertical lines than horizontal ones, making horizontal resolution far more valuable than the additional vertical resolution that 4K provides. I know it's only text, but text legibility is the key thing people focus on for picture clarity on desktop monitors at those closer distances.

I'm not advocating that 1080p displays are better than 4K displays; I'm just saying that it's very difficult to get good text quality if you move away from 100% scaling at native resolution and have to give up subpixel AA. 200% scaling isn't categorically better than a lower-dpi display at 100%!
I find 1080p with 96 dpi more than terrible. It is criminal and should be banned from any office use.



EIZO 4K Monitors - High definition and large screen sizes | EIZO (eizoglobal.com)


Confused about HiDPI and Retina display? ― Understanding pixel density in the age of 4K | EIZO (eizoglobal.com)
Posted on Reply
#64
LabRat 891
I wasn't aware you could pack that many buzzwords in such a short span.
Posted on Reply
#65
Fluffmeister
R-T-BComing from a 600 nit peak, I'm guessing that much brightness hurts?

Not that it's used for more than realistic momentary explosions, but just curious. No brand war here, happy for you.
Well, it's certainly impressive. I'm not saying you need sunglasses or anything, then again I can't really say I've cranked it up in a pitch-black room either. I've dabbled a bit with certain HDR content and games like Shadow of the Tomb Raider, and the results are impressive. Of course, where it shines is in rooms that are already really bright with loads of natural sunlight; its reflection handling is great too.

What sold me on it was the price. The typical MSRP for the 50" I got here in the UK is £1199, and it's currently on deal for £1099, but I was able to nab one back in September for £890 delivered, so I just pulled the trigger right away. Plenty big enough considering it's on my desk: Mini LED, 120Hz 4K, HDMI 2.1, VRR, a Game Mode giving really low latency, etc. Chuffed to bits.

So yeah, definitely not interested in a brand war. OLED is stunning, no question, and it obviously doesn't have to get as bright since it has superior blacks in the first place. Different tech, different solutions and all that.
Posted on Reply
#66
R-T-B
efikkanListen closely; whether the picture is static or not is irrelevant. That myth needs to end now.
What matters is some region is brighter than others over time, that's it.
It does matter though, as varied content will exercise the pixels more evenly and the wear will be less noticeable.
Posted on Reply
#67
ARF
Screensavers were invented specifically for CRT, plasma and OLED screens! Always use a non-static picture with as much movement in the displayed images as possible.
Posted on Reply
#68
Chrispy_
ARFScreensavers were invented specifically for CRT, plasma and OLED screens! Always use a non-static picture with as much movement in the displayed images as possible.
Auto screen-off is a better screensaver, and it's been the default since Windows 8.
Posted on Reply
#69
konga
Stick to auto sleep; if you ever want to intentionally "wear out" your pixels more evenly, that's effectively what LG's pixel refresher does, but in a more measured manner.
Posted on Reply
#70
R-T-B
kongaStick to auto sleep, and if you ever want to intentionally "wear out" your pixels more evenly, that's what LG's pixel refresher effectively does
No, LG's pixel refresher does a lot more than that. It's more akin to programming each pixel's wear level into memory and having the panel attempt to hide weak spots.
Posted on Reply
#71
ARF
Chrispy_Auto screen-off is a better screensaver, and that's been the default Since Windows 8.
In some cases, auto screen-off isn't an option. For example, when a TV programme with a station logo is being watched, the logo can burn a permanent mark into the panel.
Posted on Reply
#72
Chrispy_
ARFIn some cases, auto screen-off isn't an option. For example, when a given TV programme with its logo is being watched, and the logo prints a permanent burn on the panel.
We're talking about screensavers vs screen-off, though; whether burn-in happens due to uneven wear has no impact whatsoever on whether you should use a screensaver or allow the OS to turn off the display instead.

Running a screensaver when the TV is not in use is pointless because it consumes power, generates heat, increases wear, and prevents the pixel-refresh cycle from starting. The manufacturer's own pixel-conditioning cycle can only occur when the display is inactive, and it is vastly more effective than a random screensaver, which has no ability to interact with the per-subpixel brightness calibration in the display's internal lookup tables and no way to read the resistance across each diode to feed that data back into the calibration system. A screensaver is just more use; in other words, it continues to wear out the OLEDs.

What benefit does a screensaver provide that auto power-off of the display doesn't? None. There are no advantages to a screensaver unless you like watching pretty screensavers, in which case fine - but don't pretend it's 'saving' your screen.
Posted on Reply
#73
ARF
Who decides whether the TV is or isn't in use? It is quite annoying when someone is actually watching, and the TV somehow decides that it isn't in use lol and turns off.

There should be another solution that is not a complete turn off :D
Posted on Reply
#74
Chrispy_
ARFWho decides whether the TV is or isn't in use? It is quite annoying when someone is actually watching, and the TV somehow decides that it isn't in use lol and turns off.

There should be another solution that is not a complete turn off :D
What are you talking about? That's completely irrelevant; both screensaver and auto-off occur on the exact same trigger (user inactivity).

If you're now saying that screensavers and auto-off are annoying, then yes, they are - but less annoying than burn-in on an expensive OLED. I'm actually hoping that more TVs come with sensors like laptops and phones that can tell if anyone is actually watching them or not, and shut off after, say, 5 minutes where nobody is sat in front of them. Displays typically wake up within a second or two so it's not exactly a hardship for them to turn off as soon as they're not being used, especially if these sensors have worked out that you're back to watch the screen before you've even finished sitting down.
Posted on Reply
#75
efikkan
R-T-BIt does matter though, as varied content will exercise the pixels more evenly and the wear will be less noticeable.
Varied content is key. In fact, if you watch a variety of normal TV, movies, fullscreen games etc., you have no reason to worry. Even pausing a movie or game, or forgetting it and falling asleep, is still not a problem for OLED.
But the key aspect of varied content is that the overall brightness evens out, not how long each picture is shown on the screen.
If you watch a variety of content where one section of the screen is significantly brighter, you will get some extra unevenness over time; whether the user watches this content in long sessions or in 5-second sessions is irrelevant.
But if you choose to show a slideshow of static pictures whose brightness balances out overall, even if you show each one for hours at a time, it will be no problem.
So care about the overall brightness of the content to be viewed, not whether pictures are static or not, get it?

For most buyers, this will not be a concern at all. OLED is much more tolerant of uneven wear than plasma ever was (even the last-gen Panasonics), to the point that it's really just the extreme edge cases where OLED is a poor choice.
Examples include:
- TVs showing mostly a single news/sports channel with a "fixed" layout.
- PC users spending the majority of screen time with a bright web browser covering only part of the screen.
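efikkan's point - that cumulative brightness, not static-vs-moving content, drives uneven wear - can be illustrated with a toy model. This is a deliberate simplification (real OLED degradation is nonlinear and temperature-dependent), but it shows why two usage patterns with the same integrated brightness per region accumulate the same wear:

```python
# Toy wear model: per-pixel wear accumulates with emitted brightness over time.
# Two schedules with equal integrated brightness per region produce equal wear,
# whether the content is static or changing.
def accumulate_wear(frames):
    """frames: list of per-region brightness maps (dicts region -> 0..1)."""
    wear = {}
    for frame in frames:
        for region, brightness in frame.items():
            wear[region] = wear.get(region, 0.0) + brightness
    return wear

# A static logo held at constant brightness for 100 frames...
static = [{"logo": 0.5, "rest": 0.25}] * 100
# ...vs changing content that averages the same brightness per region.
moving = [{"logo": 0.25, "rest": 0.25}, {"logo": 0.75, "rest": 0.25}] * 50

assert accumulate_wear(static) == accumulate_wear(moving)  # identical wear
```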
Posted on Reply