
LG Display Unveils Next-Generation OLED TV Display, the OLED EX

Joined
Sep 1, 2009
Messages
1,182 (0.22/day)
Location
CO
System Name 4k
Processor AMD 5800x3D
Motherboard MSI MAG b550m Mortar Wifi
Cooling Corsair H100i
Memory 4x8Gb Crucial Ballistix 3600 CL16 bl8g36c16u4b.m8fe1
Video Card(s) Nvidia Reference 3080Ti
Storage ADATA XPG SX8200 Pro 1TB
Display(s) LG 48" C1
Case CORSAIR Carbide AIR 240 Micro-ATX
Audio Device(s) Asus Xonar STX
Power Supply EVGA SuperNOVA 650W
Software Microsoft Windows10 Pro x64
I've had mine since a month after the C1 came out. Just have a black background or an active background, hide the taskbar, and keep SDR to around 100 nits, and you won't have any problems.
 
Joined
Nov 13, 2007
Messages
10,209 (1.71/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/ 5.5, 4.8Ghz Ring 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
I've had mine since a month after the C1 came out. Just have a black background or an active background, hide the taskbar, and keep SDR to around 100 nits, and you won't have any problems.

I've had mine close to 1.5 years now and have done none of those things; I turned off the static brightness reduction using the factory remote -- use it for 12-16 hours a day of desktop usage.

No burn-in yet... unfortunately, since I've got about 6 months left on that Best Buy burn-in warranty :/
 
Joined
Jun 6, 2021
Messages
602 (0.59/day)
System Name Red Devil
Processor AMD 5950x - Vermeer - B0
Motherboard Gigabyte X570 AORUS MASTER
Cooling NZXT Kraken Z73 360mm; 14 x Corsair QL 120mm RGB Case Fans
Memory G.SKill Trident Z Neo 32GB Kit DDR4-3600 CL14 (F4-3600C14Q-32GTZNB)
Video Card(s) PowerColor's Red Devil Radeon RX 6900 XT (Navi 21 XTX)
Storage 2 x Western Digital SN850 1GB; 1 x Samsung SSD 870EVO 2TB
Display(s) 3 x Asus VG27AQL1A; 1 x Sony A1E OLED 4K
Case Corsair Obsidian 1000D
Audio Device(s) Corsair SP2500; Steel Series Arctis Nova Pro Wireless (XBox Version)
Power Supply AX1500i Digital ATX - 1500w - 80 Plus Titanium
Mouse Razer Basilisk V3
Keyboard Razer Huntsman V2 - Optical Gaming Keyboard
Software Windows 11
Why, when the current ones are blindingly bright? I don't think people realize how bright current-gen OLEDs are.
Because these crazy mofos want HDR9000 and to go blind by 30.
 
Joined
Jan 29, 2012
Messages
6,402 (1.44/day)
Location
Florida
System Name natr0n-PC
Processor Ryzen 5950x/5600x
Motherboard B450 AORUS M
Cooling EK AIO - 6 fan action
Memory Patriot - Viper Steel DDR4 (B-Die)(4x8GB)
Video Card(s) EVGA 3070ti FTW
Storage Various
Display(s) PIXIO IPS 240Hz 1080P
Case Thermaltake Level 20 VT
Audio Device(s) LOXJIE D10 + Kinter Amp + 6 Bookshelf Speakers Sony+JVC+Sony
Power Supply Super Flower Leadex III ARGB 80+ Gold 650W
Software XP/7/8.1/10
Benchmark Scores http://valid.x86.fr/79kuh6
I miss the windows eXperience.

I always turn down the brightness a bit, so I don't know how well HDR is going to work, if at all.
 
Joined
Feb 23, 2008
Messages
1,064 (0.18/day)
Location
Montreal
System Name Aryzen / Sairikiki / Tesseract
Processor 5800x / i7 920@3.73 / 5800x
Motherboard Steel Legend B450M / GB EX58-UDP4 / Steel Legend B550M
Cooling Mugen 5 / Pure Rock / Glacier One 240
Memory Corsair Something 16 / Corsair Something 12 / G.Skill 32
Video Card(s) AMD 6800XT / AMD 6750XT / Sapphire 7800XT
Storage Way too many drives...
Display(s) LG 332GP850-B / Sony w800b / Sony X90J
Case EVOLV X / Carbide 540 / Carbide 280x
Audio Device(s) SB ZxR + GSP 500 / board / Denon X1700h + ELAC Uni-Fi 2 + Senn 6XX
Power Supply Seasonic PRIME GX-750 / Corsair HX750 / Seasonic Focus PX-650
Mouse G700 / none / G602
Keyboard G910
Software w11 64
Benchmark Scores I don't play benchmarks...
We were supposed to have starships and flying cars; instead, you're putting deuterium in TVs...
 
Joined
Sep 15, 2011
Messages
6,457 (1.41/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
I want an 88" for $1,000, please.
 
Joined
Aug 20, 2007
Messages
20,709 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches
Software Windows 11 Enterprise (legit), Gentoo Linux x64
Interesting. How long have you had yours and what changes to your UI have you made to prevent static images?
I'm bumping up against the limit of what VA high-refresh + strobing backlight can achieve, and I'm dissatisfied, but I've accepted that OLED isn't ready for proper 8+ hours a day of desktop use yet.
I've had my B9 since the very earliest days of the pandemic. No habit changes other than dark mode and a screensaver. No burn-in.
 
Joined
Dec 30, 2021
Messages
354 (0.43/day)
People saying OLED is bright enough are missing the point. When LG's W-OLED panels get bright, they lean on their special white subpixels to do so. This results in very bright colors appearing washed out instead of properly saturated. Yes, it's still bright, but it's a desaturated brightness. If this enhances the brightness of all of their subpixels, then they have to rely on the white subpixel less, and colors can pop more and appear more saturated at high brightness levels. That's a good thing even if you aren't going to turn up the brightness level to a retina-searing level. This is the area that Samsung's QD-Display OLED technology will have a strong advantage in, so LG is trying to narrow that advantage with this tech.
 

bug

Joined
May 22, 2015
Messages
13,163 (4.07/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
OLED TVs are good for at most 5 years of use; with this new thing, it might stretch to 7 years. I'm still waiting for that MicroLED monitor, because burn-in on OLEDs is bad for static content. For a TV, OLED is all right as the content is always changing; for a monitor, OLED is bad, since the desktop is static most of the time. Well, you can always change the background picture every minute (maybe 30 seconds would be best), keep no icons on the desktop, and show the taskbar only when you move the mouse cursor to it ("automatically hide the taskbar"). That will help a lot to keep burn-in from happening.
If you frequently use one app or another (a web browser or something work-related), it's going to have a static interface. But, and this is a big but, burn-in happens more at high brightness. The more brightness the panel can sustain, the harder it will be to burn it in at a normal 120 nits.
TVs also have this thing that detects static elements (logos) and dims them. I hope monitors will do the same, but this will interfere with color-critical work - you'll have to turn it off then.

People saying OLED is bright enough are missing the point. When LG's W-OLED panels get bright, they lean on their special white subpixels to do so. This results in very bright colors appearing washed out instead of properly saturated. Yes, it's still bright, but it's a desaturated brightness. If this enhances the brightness of all of their subpixels, then they have to rely on the white subpixel less, and colors can pop more and appear more saturated at high brightness levels. That's a good thing even if you aren't going to turn up the brightness level to a retina-searing level. This is the area that Samsung's QD-Display OLED technology will have a strong advantage in, so LG is trying to narrow that advantage with this tech.
Sounds like you're assuming "bright" is always about whites...
 
Joined
Dec 30, 2021
Messages
354 (0.43/day)
Sounds like you're assuming "bright" is always about whites...
No? On a good LCD, bright colors can be fully saturated and pop well with no white creeping in at all. W-OLED works differently though. They have a fourth white subpixel, and LG uses this to enhance brightness. It's a neat trick, but it does lead to desaturation in bright colors. This is why many reviews point out that bright colors are one of the drawbacks to LG's displays, and why HDR color volume measurements are weaker for LG's displays than competing high-end LCD/QLED displays. The more they have to use their white subpixel, the more desaturated colors will appear when bright. That's just how a W-OLED panel works. By enhancing the panel's overall brightness, they can alleviate that issue. It's a good thing for LG.
 

bug

Joined
May 22, 2015
Messages
13,163 (4.07/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
No? On a good LCD, bright colors can be fully saturated and pop well with no white creeping in at all. W-OLED works differently though. They have a fourth white subpixel, and LG uses this to enhance brightness. It's a neat trick, but it does lead to desaturation in bright colors. This is why many reviews point out that bright colors are one of the drawbacks to LG's displays, and why HDR color volume measurements are weaker for LG's displays than competing high-end LCD/QLED displays. The more they have to use their white subpixel, the more desaturated colors will appear when bright. That's just how a W-OLED panel works. By enhancing the panel's overall brightness, they can alleviate that issue. It's a good thing for LG.
I'm not exactly sure how you would use a white LED to produce bright reds, for example; I'll have to read more into W-OLED. I know about it, but not much.
In the meantime, my CX looks anything but desaturated.
 
Joined
Dec 30, 2021
Messages
354 (0.43/day)
I'm not exactly sure how you would use a white LED to produce bright reds, for example; I'll have to read more into W-OLED. I know about it, but not much.
In the meantime, my CX looks anything but desaturated.
Well, the issue is that a white LED doesn't produce bright reds. So when a highlight gets really bright on a W-OLED display, it loses some color definition because of that.

The way it works is that most ordinary displays have RGB (or BGR) subpixels. W-OLED is unique in that it has a fourth subpixel for white, making it WBGR. The fourth white subpixel serves no purpose in adding to color definition, since it's white. Instead, its sole purpose is to enhance brightness. The colored subpixels are actually heavily filtered white LEDs. The color filtering process is lossy and much brightness is lost to it. The white subpixel is unfiltered white, and it shines more brightly than the others because of this. So as a trick to the human eye, when they want to enhance brightness they make the white subpixels bright, and this makes the scene appear more bright overall. But since it's just the white LED getting brighter, it's not providing a lot of color definition. In most normal scenes of regular brightness, W-OLED displays are perfectly saturated. It's the bright highlights where color definition is lost due to this effect. This is also why color volume graphs, such as in this review, fail to show top-tier results for what is otherwise a great display.
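To put rough numbers on that tradeoff, here's a toy sketch (not LG's actual pixel pipeline; the 3x white-subpixel efficiency and the drive values are made-up assumptions purely for illustration):

Code:
# Toy model of a WRGB pixel rendering a pure-red highlight, with and without
# help from the white subpixel. All weights are illustrative assumptions.

def wrgb_output(r, g, b, w):
    """Return (total_light, saturation) for subpixel drive levels in 0..1.
    The unfiltered white subpixel is assumed ~3x as efficient as a filtered
    color subpixel; 'saturation' here is simply colored light / total light."""
    color_light = r + g + b      # filtered subpixels, equal weight (assumption)
    white_light = 3.0 * w        # unfiltered white subpixel (assumed 3x output)
    total = color_light + white_light
    saturation = color_light / total if total else 0.0
    return total, saturation

# Pure red at full drive, no white assist: dim but fully saturated.
print(wrgb_output(1.0, 0.0, 0.0, 0.0))   # (1.0, 1.0)

# The same red "boosted" by the white subpixel: much brighter overall,
# but the added light is white, so the red reads as washed out.
print(wrgb_output(1.0, 0.0, 0.0, 0.5))   # (2.5, 0.4)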
 
Joined
Nov 15, 2020
Messages
861 (0.70/day)
System Name 1. Glasshouse 2. Odin OneEye
Processor 1. Ryzen 9 5900X (manual PBO) 2. Ryzen 9 7900X
Motherboard 1. MSI x570 Tomahawk wifi 2. Gigabyte Aorus Extreme 670E
Cooling 1. Noctua NH D15 Chromax Black 2. Custom Loop 3x360mm (60mm) rads & T30 fans/Aquacomputer NEXT w/b
Memory 1. G Skill Neo 16GBx4 (3600MHz 16/16/16/36) 2. Kingston Fury 16GBx2 DDR5 CL36
Video Card(s) 1. Asus Strix Vega 64 2. Powercolor Liquid Devil 7900XTX
Storage 1. Corsair Force MP600 (1TB) & Sabrent Rocket 4 (2TB) 2. Kingston 3000 (1TB) and Hynix p41 (2TB)
Display(s) 1. Samsung U28E590 10bit 4K@60Hz 2. LG C2 42 inch 10bit 4K@120Hz
Case 1. Corsair Crystal 570X White 2. Cooler Master HAF 700 EVO
Audio Device(s) 1. Creative Speakers 2. Built in LG monitor speakers
Power Supply 1. Corsair RM850x 2. Superflower Titanium 1600W
Mouse 1. Microsoft IntelliMouse Pro (grey) 2. Microsoft IntelliMouse Pro (black)
Keyboard Leopold High End Mechanical
Software Windows 11
Seems to me the only people who have a problem with OLEDs are those who don't own one.
 
Joined
Apr 15, 2021
Messages
844 (0.78/day)
Heh, and now deuterated TVs... I wonder how this is going to work out. More brightness is fine in an environment with a lot of light, but without even deeper blacks to go with it, this could be a problem when it comes to level of detail. Well, at least if this product flops, we'll already have a name for it: "doodoo-rated" instead of deuterated.
 
Joined
Nov 22, 2020
Messages
185 (0.15/day)
System Name heat
Processor 2990wx
Motherboard MSI X399 SLI PLUS ATX
Cooling Thermaltake Floe Riing RGB 360 TT Premium Edition 42.34 CFM Liquid CPU Cooler
Memory 128gb
Video Card(s) 2 2080s
Storage 100 tb
Case Asus TUF Gaming GT501 ATX Mid Tower Case
Power Supply 1200W Corsair Platinum
Yeah... too bad LotR wasn't shot in HDR :(

Edit: A good showcase for HDR needs contrast, not just light. Perhaps a better scene would be some spell casting (fire) in The Witcher?
At the end of the day, proper HDR is very costly from start to finish, from the software to the hardware side, plus proper calibration too.

Games... hah. Devs really don't like standards to follow, and HDR is one of those that's super rigid, for a reason. So instead, they'll BS it and use Auto HDR (which is fake HDR).
 
Joined
Feb 20, 2019
Messages
7,194 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Seems to me the only people who have a problem with OLEDs are those who don't own one.
I've installed four in conference rooms and despite my protests they were destroyed by burn-in. We now use QLED TVs and they're obviously inferior but still going strong.

In defence of OLED, this was almost 7 years ago now - OLED may have improved dramatically WRT burn-in and that's why I'm interested in feedback on current gen.

I'm not exactly sure how would you use a white led to produce bright reds, for example, I'll have to read more into W-OLED. I know about it, but not much.
In the meantime, my CX looks anything but desaturated.
9 times out of 10, when you want peak brightness it's for white or almost-white content, so W-OLED is fine.

Occasionally you want OMFG colour pop, and that's where white OLED falls short, since it can display a 700-nit white but only a 350-nit colour image. If you have the brightness cranked in a bright room or direct sunlight, W-OLED is going to look wrong.

On the other hand, if you bought an OLED TV for a bright room, you're doing it wrong; OLED's biggest strength by far is black levels and contrast, both of which are ruined in a bright room. Response time is amazing too, but with the primary content for TVs being 60 fps or slower, it really isn't that much of a game changer.
 

bug

Joined
May 22, 2015
Messages
13,163 (4.07/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
At the end of the day, proper HDR is very costly from start to finish, from the software to the hardware side, plus proper calibration too.
It's actually not costly at all. It requires 10 bits per channel, which is a 25% increase in bandwidth over 8 bits, but that's about it.

The hard part about HDR is the actual dynamic range. LCD can't meet that without resorting to local dimming - and nobody has figured out how to do that without breaking the bank. OLED is the only one suitable for HDR (ok, it won't work if you want a TV in the bar on the beach), but OLED isn't cheap either.
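The 25% figure is just the bit-depth ratio (10/8); as a rough sanity check on what that means for uncompressed bandwidth (4K60 RGB is an assumed example, ignoring blanking and chroma subsampling):

Code:
# Uncompressed bandwidth for 8-bit vs 10-bit RGB at an assumed 4K60.
width, height, fps, channels = 3840, 2160, 60, 3

for bits in (8, 10):
    gbps = width * height * fps * channels * bits / 1e9
    print(f"{bits}-bit: {gbps:.1f} Gbit/s")   # ~11.9 vs ~14.9 Gbit/s

# 10 / 8 = 1.25, i.e. the 25% increase mentioned above.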
 
Joined
Dec 30, 2021
Messages
354 (0.43/day)
I've installed four in conference rooms and despite my protests they were destroyed by burn-in. We now use QLED TVs and they're obviously inferior but still going strong.

In defence of OLED, this was almost 7 years ago now - OLED may have improved dramatically WRT burn-in and that's why I'm interested in feedback on current gen.
Every time there's a brightness and/or power-efficiency boost (they usually go hand-in-hand with OLED), that also means there's a burn-in reduction boost, though LG never markets this since they like to pretend that burn-in isn't an issue at all. The less power you have to send to a pixel to reach a specific desired brightness level, the less it will deteriorate over time. So if LG's super special deuterium tech allows for 30% more brightness at the same amount of power input, and if you have a specific brightness target in mind, you may need 23% less power to reach it. And the organic pixels will deteriorate less as a result. How much less is something I can't confidently say because this may not be a linear relationship, and other factors such as heat levels affect the stability of the organic compounds as well. Anyway, I would expect some small amount of extra burn-in resistance from this advancement, which compounds with the many other small advancements made over the years. In 2022, OLED panels will likely be considerably more burn-in-resistant than they were 7 years ago.
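The 23% figure is just the arithmetic of holding the brightness target fixed; a back-of-the-envelope check, under the same caveat that the brightness-to-power relationship may not be linear:

Code:
# If the same drive power now yields 30% more brightness, then hitting the
# old brightness target needs 1/1.3 of the old power (assuming linearity).
efficiency_gain = 0.30
relative_power = 1 / (1 + efficiency_gain)              # ~0.77 of the original power
print(f"power saved: {(1 - relative_power) * 100:.0f}%")  # ~23%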

I do think it's worth being cautious about burn-in with OLED panels, even if there are many people who happily use OLED TVs as monitors without any noticeable burn-in. The posters in this thread are right—taking extra steps such as lowering the brightness to 120 or 100 nits, setting a dark background and using dark mode for your apps, and setting your task bar to auto-hide, will all help your display last longer. These aren't options for everyone, though, and OLED displays aren't appropriate for every use case. I work at home, and my work PC is also my leisure PC. It's also in a well-lit room. I'm used to 200 - 250 nits, not 100 - 120. I also have static interface elements on screen for 8 - 12 hours a day, every single day. There's no getting rid of them. And going full dark theme for everything is just not how I want to live my life, lol. I'll choose to hold out on OLED as my main PC display until there's a panel that can last 5+ years for less while being on with static interface elements for 3500+ hours each year. It's a pretty demanding requirement, and I'm guessing we're still quite a few years away from that being possible. In the meantime, I'll happily buy an OLED TV for the living room. :)
 
Joined
Aug 24, 2004
Messages
217 (0.03/day)
Took the plunge and bought the LG CX 55 a few months ago, and I love this thing! I don't even mind if it's already obsolete. Because if my new TV is awesome, this new tech has got to be incredible!
 
Joined
Feb 3, 2017
Messages
3,475 (1.33/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
People saying OLED is bright enough are missing the point. When LG's W-OLED panels get bright, they lean on their special white subpixels to do so. This results in very bright colors appearing washed out instead of properly saturated. Yes, it's still bright, but it's a desaturated brightness. If this enhances the brightness of all of their subpixels, then they have to rely on the white subpixel less, and colors can pop more and appear more saturated at high brightness levels. That's a good thing even if you aren't going to turn up the brightness level to a retina-searing level. This is the area that Samsung's QD-Display OLED technology will have a strong advantage in, so LG is trying to narrow that advantage with this tech.
LG has been hush-hush about what W-OLED technically is. There have been educated guesses - which I subscribe to - that W-OLED does not actually have subpixels in the traditional RGB-OLED sense. It has a white OLED layer - a patent bought from Kodak - functioning as a backlight, with color filters on top of it, or no color filter in the case of the white subpixel. A number of things hint at this: the ~30% brightness hit compared to RGB-OLED, for one, and the differential OLED subpixel aging problem somehow getting resolved in W-OLED, among others.

Similarly, Samsung is quite hush-hush about what exactly QD-OLED is. They have clearly tried a number of different approaches with varying levels of success. What we know is that QD-OLED uses blue OLED functioning as a backlight. What is going on on top of it is where things get muddy.
- The most official-ish description for now seems to be that there is a QD layer with quantum dots forming the red and green subpixels, while the backlight is used directly for blue. This sounds exciting and viable enough in theory, but there are a couple of nagging problems that I do not really see them having resolved just yet.
- A slightly different method was described a little while ago that I suspect is what Samsung actually does - take the light from the blue OLED, run it through a QD filter to get light with a spectrum that has nice clean RGB peaks, and run that through color filters. Essentially the same idea as W-OLED, but with the white OLED replaced by blue OLED plus a QD filter.
 
Joined
Dec 30, 2021
Messages
354 (0.43/day)
LG has been hush-hush about what W-OLED technically is. There have been educated guesses - which I subscribe to - that W-OLED does not actually have subpixels in the traditional RGB-OLED sense. It has a white OLED layer - a patent bought from Kodak - functioning as a backlight, with color filters on top of it, or no color filter in the case of the white subpixel. A number of things hint at this: the ~30% brightness hit compared to RGB-OLED, for one, and the differential OLED subpixel aging problem somehow getting resolved in W-OLED, among others.

Similarly, Samsung is quite hush-hush about what exactly QD-OLED is. They have clearly tried a number of different approaches with varying levels of success. What we know is that QD-OLED uses blue OLED functioning as a backlight. What is going on on top of it is where things get muddy.
- The most official-ish description for now seems to be that there is a QD layer with quantum dots forming the red and green subpixels, while the backlight is used directly for blue. This sounds exciting and viable enough in theory, but there are a couple of nagging problems that I do not really see them having resolved just yet.
- A slightly different method was described a little while ago that I suspect is what Samsung actually does - take the light from the blue OLED, run it through a QD filter to get light with a spectrum that has nice clean RGB peaks, and run that through color filters. Essentially the same idea as W-OLED, but with the white OLED replaced by blue OLED plus a QD filter.
Yeah, I mentioned the white LED + color filters approach in another post in this thread. With QD-OLED (/QD-Display), I was under the impression that the quantum dots perform a high-efficiency color conversion of blue light into red or green light for the R and G subpixels, with pure blue used for B. The idea is that the quantum dots can take frequencies of blue light and convert them to frequencies of other colors. Much less light is lost in this process, while potentially allowing wider color gamuts to be expressed (light filtering is a subtractive process, while quantum dot color conversion theoretically is not).
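Here's a rough photon-budget sketch of that difference between subtractive filtering and QD color conversion (the 30% filter transmission and 90% quantum yield are made-up round numbers for illustration, not measured values for either panel type):

Code:
# Illustrative comparison: getting "red" light out of the emitter layer.
emitted_photons = 100.0

# Filter path (W-OLED-style): a red color filter simply blocks the non-red
# part of a white emitter's spectrum, passing only an assumed ~30%.
filter_transmission = 0.30
red_via_filter = emitted_photons * filter_transmission

# QD path (QD-OLED-style): blue photons are absorbed and re-emitted as red,
# with an assumed ~90% quantum yield, instead of being thrown away.
qd_quantum_yield = 0.90
red_via_qd = emitted_photons * qd_quantum_yield

print(red_via_filter, red_via_qd)   # 30.0 vs 90.0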

It's an exciting tech for sure. It'll just be very expensive to start with. There's also talk of displays in the future that will be fully quantum dot driven in some manner. I'm not sure what the theory there is (it was brought up in the Linus Tech Tips video where they toured a quantum dot production facility)
 
Joined
Feb 3, 2017
Messages
3,475 (1.33/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
There's also talk of displays in the future that will be fully quantum dot driven in some manner. I'm not sure what the theory there is
Quantum dots can be excited by light - turning blue or UV light into pure green or pure red light. This is what quantum dot filters and QD-OLED are based on.

Alternatively, some types of quantum dots can be excited by electricity, emitting light directly. These are used and driven similarly to OLED, except that the light-emitting part of the LED is quantum dots instead of organic compounds. Only experimental panels have been created so far, and there seems to be a whole host of problems with getting that technology ready for mass production. Exciting tech, but it seems to be further away than MicroLED, or by some accounts too difficult to make viable at all.
 
Joined
Feb 20, 2019
Messages
7,194 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Took the plunge and bought the LG CX 55 a few months ago, and I love this thing! I don't even mind if it's already obsolete. Because if my new TV is awesome, this new tech has got to be incredible!
Every TV I ever buy is obsolete very quickly. The trick is to ignore the new shiny stuff and just be happy that what you now have is nicer than your old TV :)

Sat here on my 4K 120Hz 65" Samsung with really flaky first-gen VRR support, I want a better TV, but this one is genuinely great and would do everything I need even if it were only 60Hz 1080p. All I really care about is black levels in a dark room, and VA's good black levels and high contrast will suffice to quell my "urge to splurge" on OLED for a couple more years, I hope.
 
Joined
Jun 10, 2014
Messages
2,889 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Why, when the current ones are blindingly bright? I don't think people realize how bright current-gen OLEDs are.
Perhaps some are viewing their TVs in bright daylight.
But for those using blinds or watching in a darker setting, OLED is plenty bright.

Used as a monitor rather than as an occasional TV, OLEDs are still a disaster, right? Every test I've encountered of current-gen OLEDs says that they still burn in very quickly, and that you have to make some serious compromises if they're used for any kind of static content like OSDs, HUDs, and OS constants.
OLED doesn't have burn-in.
OLED can suffer from uneven wear, but it has nothing to do with pictures being static. It's caused by areas that are significantly brighter over time wearing those pixels more than the others. This will happen regardless of whether the picture is static or changing.
E.g., if you watch a news channel all day, you will probably see uniformity issues where the news anchors and news tickers are positioned on the screen, even though they are moving.

Still, unless you are taking it to extremes, panel uniformity after several years of use will still beat any LCD.
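A minimal sketch of that idea, that wear tracks cumulative emitted light per area rather than whether the content is static (the decay constant is a made-up assumption; real panels also depend on heat, color, and compensation cycles):

Code:
# Toy per-area wear model: degradation tracks cumulative brightness over time,
# whether the bright content moves around or not. The constant is an assumption.
hours = 10_000
decay_per_nit_hour = 1e-7

def remaining_output(avg_nits):
    """Fraction of original luminance left after `hours` at an average brightness."""
    return max(0.0, 1.0 - decay_per_nit_hour * avg_nits * hours)

# A region averaging 300 nits (say, where a bright ticker or logo usually sits)
# wears visibly faster than 100-nit surroundings -> uneven uniformity.
print(round(remaining_output(300), 2))   # 0.7
print(round(remaining_output(100), 2))   # 0.9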

OLED TVs are good for at most 5 years of use; with this new thing, it might stretch to 7 years.
Why? Have you experienced panels wearing out with normal usage?

The good old CRTs typically lasted 30-40 years of daily use (let's say 3-4 hours/day). Plasma panels generally last >15 years, LCD probably 10-15 years (depending on quality), but with OLED I don't know yet.

My primary concern with TVs today is the software breaking, an intentional hardware failure (usually bad caps or solder), or the TV somehow becoming obsolete before it breaks down. This is one of the reasons I wouldn't pay a lot for a TV: I expect it to fail after ~5 years.

-----

A pro tip for TV buyers:
Generally speaking, there are at this point few year-to-year improvements in OLED TVs, and all OLED TVs have great panels (the same panels). So unless you strictly need a new feature, grab the previous generation at a discount. Nearly every year I've seen them 40-60% off in Q1 of the following year - that's the time to buy!
 
Joined
Feb 20, 2019
Messages
7,194 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
OLED doesn't have burn-in.
And now we have a near-perfect description of burn-in, which also applied to plasma TVs and CRTs. It may even apply to quantum-dot screens in time; they're too new and wear too slowly to know for sure at this point.
OLED can suffer from uneven wear, but it has nothing to do with pictures being static. It's caused by areas that are significantly brighter over time wearing those pixels more than the others. This will happen regardless of whether the picture is static or changing.
E.g., if you watch a news channel all day, you will probably see uniformity issues where the news anchors and news tickers are positioned on the screen, even though they are moving.
Technically, yes, it's burnout, not burn-in, but burn-in is what it's been called for almost a century now; deal with it.
 