
LG Display Claims Samsung's QD OLED More Susceptible to Screen Burn Than LG's WOLED

It's really mostly arithmetic.

See the responses from Chrispy_; he seems to know a lot more about this. In simplified terms, that's what happens: roughly the same energy coming in goes out at a different wavelength.
 
See the responses from Chrispy_; he seems to know a lot more about this. In simplified terms, that's what happens: roughly the same energy coming in goes out at a different wavelength.
All you need to know is that the size of the quantum dot determines the achievable output wavelength. The band gap gets smaller as the dot gets larger, and the smaller the band gap, the lower the energy of the photon released when it fluoresces. The energy levels depend on the size of the object: for a cube of side L, say, the level energies scale as 1/L², so the larger L is, the smaller the spacing between energy levels. Quantum dots are so small (2-10 nm typically) that they only have a few energy levels rather than a continuum like you would have in dots even a micron in size. You need this small number of levels for tunability.
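If you want to see that scaling as actual numbers, here's a toy particle-in-a-box sketch in Python (textbook infinite-well formula with the free-electron mass; real dots add the material's bulk band gap and use effective masses, so the energies below only illustrate the 1/L² trend, they aren't real emission values):

```python
# Toy particle-in-a-box model: E_n = n^2 * h^2 / (8 * m * L^2).
# Illustrates the 1/L^2 scaling only; real quantum dots add the bulk band gap
# and use effective masses, so these are not real emission energies.
H = 6.626e-34     # Planck constant, J*s
M_E = 9.109e-31   # free-electron mass, kg
EV = 1.602e-19    # joules per electronvolt

def level_ev(n: int, size_nm: float) -> float:
    """Energy of level n in a 1D infinite well of width L, in eV."""
    L = size_nm * 1e-9
    return n**2 * H**2 / (8 * M_E * L**2) / EV

for size in (2.0, 5.0, 10.0):
    spacing = level_ev(2, size) - level_ev(1, size)  # gap between the two lowest levels
    print(f"{size:>4.0f} nm dot: E1 = {level_ev(1, size):.3f} eV, E2-E1 = {spacing:.3f} eV")
# Bigger dot -> smaller level spacing -> lower-energy (redder) photon when it fluoresces.
```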
 
dual layer LCD

Great.... Now I'm pining after the Hisense 75U9DG @ Best Buy. It seems the tech (as deployed) is better suited to unprocessed 'gaming/monitor' use than to processed multimedia like TV and cinema.
Maybe BB won't have liquidated them by Holiday Season 2023, and I can get a 4K 75" 120 Hz FreeSync display for well under $1k. :p :kookoo:
 
This is a law that should be adopted in other countries. Unfortunately, here (Brazil) the law only obliges companies to offer a minimum one-year warranty. I think that's too little for some expensive products like big TVs.
One year is practically nothing. Since leaving the EU, the UK has dropped down to one year as well. :(

Whenever I see a product that only offers the legal minimum warranty it suggests to me the manufacturer has no faith in their own product.
 
One year is practically nothing. Since leaving the EU, the UK has dropped down to one year as well. :(

Whenever I see a product that only offers the legal minimum warranty it suggests to me the manufacturer has no faith in their own product.

Agreed. That is one reason the Dell 34" OLED is so appealing with its 3-year burn-in warranty. If it ever drops down to $699 or $799, I will probably get one. I am in no rush though, as I currently love the screen I have. OLED would be better though.
 
So far there is not a single property of QD OLED that's in a better place than WOLED, except maybe peak brightness, which also comes at the expense of pure blacks and therefore static contrast. On top of that: bad subpixel arrangement.
I haven't tried text/ClearType on WOLED (RGBW subpixels). I probably could and should, because we have a couple in the office lobby, but they just loop video presentations.

Does white text on WOLED use the white subpixels, or are they only called into action when the higher brightness is needed for HDR, leaving the usual RGB trio to handle sRGB up to their maximum of 700 nits or so?
 

You're right, the EU mandatory minimum is two years. In Spain and Portugal we have three years, so I thought that was an EU law. Also, there's an obligation to provide replacements and spare parts for 10 years.

However, the point is the same: a one-year warranty on an expensive TV is no good, hehe.

What?! Warranties are two years in Portugal, not three.
 
See the responses from Chrispy_; he seems to know a lot more about this. In simplified terms, that's what happens: roughly the same energy coming in goes out at a different wavelength.
Aye, the quantum dots are passive and simply split photons by absorbing and then emitting some of the original photon's energy to create a new photon.

While it is technically possible with some materials, in some circumstances, to absorb multiple lower-energy photons and emit fewer higher-energy photons, I don't think that's possible with the type of semiconductor QDs used in OLED displays. QDs are photon splitters to the best of my understanding, and photon combiners are more theoretical than practical outside of scientific research labs. I'm not an expert in this field, and the exact chemistry and type of QDs used are guarded trade secrets that only the manufacturer can disclose, but https://www.nature.com/articles/ncomms9210 looks like a half-decent primer on the various types of QD (I only skim-read it, but it looks like roughly the right application for what we're talking about with TVs).

So yeah, it is arithmetic - I just suck at explaining it, clearly.

Blue photons have ~3.0 eV of energy, and red photons have ~1.9 eV of energy. Effectively, a blue photon hits a red quantum dot and the dot absorbs ~1.9 eV of energy from the photon. That photon has ~1.1 eV of energy remaining, which isn't enough to interact with any more quantum dots, so it passes through unchanged. The quantum dot then releases the ~1.9 eV of energy as a second photon. So you get your one-photon-in, two-photons-out mechanism: 3.0 eV is blue, 1.9 eV is red, and 1.1 eV is infra-red. What about the red photon at ~1.9 eV? Surely it will hit more red quantum dots as it passes through, right? Well yes, but in that case you get exactly 1.9 eV in and 1.9 eV out, which is effectively no change (other than the direction of the photon, which is random).
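Putting that same arithmetic into a few lines of Python (the 3.0 and 1.9 eV figures are just the approximate values above, and the wavelengths come from lambda = h*c/E; nothing here is specific to any real panel):

```python
# The arithmetic from the post above: a blue photon's energy is split into the part
# a "red" quantum dot re-emits and the leftover, then each energy is converted to a
# wavelength. The eV figures are approximate values, not measured data for any panel.
HC_EV_NM = 1239.84  # h*c expressed in eV*nm, so wavelength_nm = 1239.84 / energy_eV

def wavelength_nm(energy_ev: float) -> float:
    """Photon wavelength in nm for a given energy in eV (lambda = h*c / E)."""
    return HC_EV_NM / energy_ev

blue_ev = 3.0                   # incoming blue photon
red_ev = 1.9                    # energy a red dot absorbs and re-emits
leftover_ev = blue_ev - red_ev  # what's left of the original photon

print(f"blue in:  {blue_ev:.1f} eV ~ {wavelength_nm(blue_ev):.0f} nm")
print(f"red out:  {red_ev:.1f} eV ~ {wavelength_nm(red_ev):.0f} nm")
print(f"leftover: {leftover_ev:.1f} eV ~ {wavelength_nm(leftover_ev):.0f} nm (infra-red)")
```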

The beauty/magic of quantum dots is that they absorb photons rather than electrons, and are thus light-powered semiconductors rather than electricity-powered semiconductors. For a photon to interact with the quantum dot's molecule, it has to have enough energy (eV) to move an electron "up" to a higher energy state (orbital), so if a red 1.9 eV photon arrives at a blue quantum dot whose molecules require 3.0 eV of energy to raise an electron to a higher energy state, the photon cannot interact with the electrons in the quantum dot and it passes through unchanged. Yes, it can occasionally collide with an atomic nucleus, but this is vanishingly rare. The models we make of atoms show a big blob as the nucleus, but while electrons are best described as "charge clouds" that define how big an atom is, the actual nucleus is tiny, about 1/100,000th the diameter of even the smallest possible electron cloud. Since the geometric cross-section scales with the diameter squared, that works out to roughly a 1-in-10,000,000,000 chance of that actually happening.
 
I can't imagine using an OLED for gaming or YouTube; screen burn is a big issue. That's why I use QLED and LED.
 
Nonsense. You simply get used to anything you use frequently. That also applies to higher resolutions and refresh rates.


Pixel dot pitch is a thing, though. A smaller diagonal at the same resolution means you need smaller pixels, which is the hard part for OLEDs; that's exactly why LG is so slow on smaller diagonals (quick pixel-pitch sketch below).
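To put the pixel-pitch point in numbers, here's a quick sketch for 4K at a few arbitrary example diagonals (just illustrative sizes, not specific products):

```python
# Quick pixel-pitch arithmetic: at a fixed resolution, a smaller diagonal means
# smaller pixels. The diagonals below are arbitrary example sizes, not specific panels.
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_in: float) -> float:
    """PPI = diagonal length in pixels / diagonal length in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

def pixel_pitch_mm(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel pitch in mm: 25.4 mm per inch divided by the PPI."""
    return 25.4 / pixels_per_inch(width_px, height_px, diagonal_in)

for diag in (65, 48, 42, 27):
    ppi = pixels_per_inch(3840, 2160, diag)
    pitch = pixel_pitch_mm(3840, 2160, diag)
    print(f'{diag}" 4K: {ppi:.0f} PPI, pixel pitch {pitch:.3f} mm')
```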
Yeah, you can get used to playing games at 320p; that doesn't mean it would be the best experience.
 
@D007
the stores I worked in over the past 15 years had at least 4-5 LG OLEDs on display, and not a single one experienced burn-in,
while running the same video content (with fixed text/logos).
Unless you mean image retention, but that's not the same thing.
I just can't give those claims any value when it looks like people don't even know the difference (between BI and IR).

@chrcoluk
lol, what utter crap.
You know how many things are sold (mainly on TV) that offer a "lifetime warranty"? You think they actually last a lifetime? Nope.

Maybe start looking into the cost side of it.
Any business making something will incur (higher) costs, even if it has a low failure rate.
Let's say you make just 1M TVs: you have to stock every single part for every model that you make,
and of course it's not enough to have one piece of each.
Ignoring parts cost for a moment (you don't get them for free, yet you aren't making money off them),
you still need a warehouse, a crew to run it, and to pay for power/water/insurance, etc.

@SkySong
OLEDs don't have burn-in; it's not a plasma TV.
Not really an issue, unless you run max brightness/contrast.
 
Yeah, you can get used to playing games at 320p; that doesn't mean it would be the best experience.
Ofc, but as with all developments, diminishing returns happen, and this applies especially to FPS and resolution. 'Need' is a strong term above 60 FPS, I would say. And if you consider movies, there are strong arguments for a framerate even lower: 50, 24... That's also part of the point @Fry178 was making: some stuff just doesn't really gain anything from a high framerate.
 
No wonder LG is going after Samsung; LG is trying to deflect from the problems they have with lousy OLED TVs and their burn-in issues.
 
Ofc, but as with all developments, diminishing returns happen, and this applies especially to FPS and resolution. 'Need' is a strong term above 60 FPS, I would say. And if you consider movies, there are strong arguments for a framerate even lower: 50, 24... That's also part of the point @Fry178 was making: some stuff just doesn't really gain anything from a high framerate.
1080p60 is still plenty good enough to showcase everything the developer put into the game.

Will it look a bit nicer at 4K 120Hz with HDR? Sure - but it won't really make the gameplay any better, fix any bugs, or reveal detail that wasn't at least mostly there at 1080p60.
 
1080p60 is still plenty good enough to showcase everything the developer put into the game.

Will it look a bit nicer at 4K 120Hz with HDR? Sure - but it won't really make the gameplay any better, fix any bugs, or reveal detail that wasn't at least mostly there at 1080p60.
Heh I would even defend the argument that 100 or 120 FPS are in fact never developer intent, but only serve a competitive environment - or games with a high skillcap - and even thén... there are many games also historically that don't thrive on anything but their own fixed framerate, 50 or 60 FPS. Like... beat-em-ups; Fallout games; arcade shooters; platformers; Diablo 2; etc etc.

Somewhere along the way of developments in the PC gamur space, we went from 'high refresh is awesome if you can actually run it' to 'Must have >100 FPS or my world falls apart'. Strange how that works, but always easily recognized as #firstworldproblems. In my view it says more about the person saying it than about it being a thing.
 
Heh I would even defend the argument that 100 or 120 FPS are in fact never developer intent, but only serve a competitive environment - or games with a high skillcap - and even thén... there are many games also historically that don't thrive on anything but their own fixed framerate, 50 or 60 FPS. Like... beat-em-ups; Fallout games; arcade shooters; platformers; Diablo 2; etc etc.

Somewhere along the way of developments in the PC gamur space, we went from 'high refresh is awesome if you can actually run it' to 'Must have >100 FPS or my world falls apart'. Strange how that works, but always easily recognized as #firstworldproblems. In my view it says more about the person saying it than about it being a thing.
Yeah, same argument for resolution as for framerate.

I game at 4K60 mostly, and while the extra resolution gains you sharpness for distant detail, it also exposes textures that aren't high enough resolution, and you can see the polygon angles of curves. Sometimes higher resolution actually hinders immersion and realism rather than helping it.
 
1080p60 is still plenty good enough to showcase everything the developer put into the game.

Will it look a bit nicer at 4K 120Hz with HDR? Sure - but it won't really make the gameplay any better, fix any bugs, or reveal detail that wasn't at least mostly there at 1080p60.
Yeah, OK, maybe I'm more sensitive to these things, but once I saw what 1440p 144 Hz looked like, I couldn't go back to my 1080p 60 Hz monitor, and I made that switch a long time ago. Now that I've moved to QD-OLED with HDR, my laptop feels more bland when I use it. Of course gameplay doesn't get any better (except for competitive gaming); it's about enjoying the extra details. Like listening to stereo versus spatial audio: more details.
 
Apple will use MicroLED, not OLED and...
Apple already uses OLED in their iPhones, and OLED iPads are heavily rumored to come next year. Like you said, MicroLED will take a lot of time; in the meantime they need a temporary compromise, and that's OLED.
 
No wonder LG is going after Samsung; LG is trying to deflect from the problems they have with lousy OLED TVs and their burn-in issues.
I must be missing out on all this burn in fun.

It's way overblown. Funny how the complaints about the tech come almost exclusively from people who have never touched an LG OLED TV of recent (last 4 years) vintage.
 
@rtb
Even more so when I see folks thinking IR = BI.
 
I must be missing out on all this burn in fun.

It's way overblown. Funny how the complaints about the tech come almost exclusively from people who have never touched an LG OLED TV of recent (last 4 years) vintage.
As an owner of an excellent LG CX, I couldn't agree more. Despite what tech papers may say, it's plenty bright, with no burn-in in sight.
 
Yeah, about that. This
 
Update from the Rtings YouTube channel, FYI

 