
LG Display Begins Mass Production of Ultimate Gaming OLED Panel with 4th-Generation OLED Technology

When will they begin mass production of 4K monitors with saner diagonals of 21, 22, and 23.8/24.0 inches? :confused:
 
Utter FUD.


Yep, gotta love ARM for bringing the slowest CPUs and GPUs to some of the most expensive items in the household!
It's not ARM lmao.

Same reason you don't criticize NVIDIA for Nintendo choosing to use a downclocked five-year-old chip on a six-year-old process with the Switch 2.

ARM has plenty of high-performance options available for the segment. The vast majority of TV manufacturers, however, are happy to increase their profit margin by another 10% or so by installing a piece-of-crap SoC with a 5-10 year old core architecture, connected to the slowest possible memory and eMMC storage they can find.
 
I have my eye on these. Apparently a 38/39" ultrawide with 360 Hz will come out; if all goes well, this is the panel I will get.
 
The overwhelming majority of smart TVs are simply equipped with a slow-ass chip; they feel and act like massively underpowered PCs or phones, really. It reminds me of the Android phone world. Or the average car infotainment from Volkswagen lol.
That was true for the Samsung smart TV we had, but I can't say that this LG TV feels slow. I only have a Fire TV Stick 4K Max (1st gen) to compare with, though, and that's hardly powerful, but way better than the non-Max version that I also had. It was a midrange device when released in 2021, but I was a bit shocked that it only had 2 GB of free storage for apps. However, webOS doesn't seem to have nearly as big apps as Android, so it hasn't proven to be an issue as yet.

Yep, gotta love ARM for bringing the slowest CPUs and GPUs to some of the most expensive items in the household!
Sorry, but how is that Arm's fault? Arm licenses IP; they don't make the chips.
If you want to blame someone, blame the TV makers, as they're the ones deciding what hardware goes into their TVs.
Also, one of the biggest chip makers for TVs is MediaTek, who bought MStar, their major competitor.
Yes, if you compare with smartphone SoCs, TV SoCs are way behind, as MediaTek's highest-end TV chips are based on Cortex-A76 cores and the Mali-G57 MC3 GPU.
That said, those parts handle only the application processing, not the rest of the stuff that goes on in a TV.
It seems like all of their slower Pentonic parts are Cortex-A73 at least, not A53 or A55.

LG and Samsung make their own chips, as far as I know.
 
@TheDeeGee
Because they don't experience burn-in, as they aren't plasmas?
Image retention, you mean?
Unfortunately, LG isn't the best when it comes to "preventing" it.
At least back when they were providing Sony with panels (I left retail 4 years ago), LG's own TVs experienced IR within 8-10 months, while the equivalent Sony model did not.
 
That was true for the Samsung smart TV we had, but I can't say that this LG TV feels slow. I only have a Fire TV Stick 4K Max (1st gen) to compare with, though, and that's hardly powerful, but way better than the non-Max version that I also had. It was a midrange device when released in 2021, but I was a bit shocked that it only had 2 GB of free storage for apps. However, webOS doesn't seem to have nearly as big apps as Android, so it hasn't proven to be an issue as yet.


Sorry, but how is that Arm's fault? Arm licenses IP; they don't make the chips.
If you want to blame someone, blame the TV makers, as they're the ones deciding what hardware goes into their TVs.
Also, one of the biggest chip makers for TVs is MediaTek, who bought MStar, their major competitor.
Yes, if you compare with smartphone SoCs, TV SoCs are way behind, as MediaTek's highest-end TV chips are based on Cortex-A76 cores and the Mali-G57 MC3 GPU.
That said, those parts handle only the application processing, not the rest of the stuff that goes on in a TV.
It seems like all of their slower Pentonic parts are Cortex-A73 at least, not A53 or A55.

LG and Samsung make their own chips, as far as I know.
Connect an Nvidia Shield to your Smart TV and feel the difference between actual responsiveness and your Smart TV. Or just a PC with a low-end desktop chip... I mean, come on. It's like night and day.

It's entirely comparable to me opening the door of my VW ID3 and wanting to fire off onto the road within 5 seconds of sitting down, with the whole infotainment still getting to grips with the event. Don't even consider trying to connect your phone over Bluetooth at that point, no no. And DEFINITELY don't try entering a destination on your navigation, because the keyboard will pop up a good 20 seconds after you ask for it ;) Oh yeah, and every letter of your destination will also have a good 500 ms delay before appearing on screen. And yes, that means wrong inputs all the time. And yes, it's absolutely ridiculous considering it's a €39k car.

I have a brand new LG Evo C4 right now and it still has the odd slowdown. And that's far from the bottom of Smart TV chip land.
Nah... really... we should be expecting much more of this, especially as tech nerds. I mean, Apple figured out how to give its consumers bottom-end chips and still have a good UX (or at least, the chip performance is sufficient for a consistent experience). Why haven't TVs figured this out? And don't get me wrong - I'm not entirely complaining about the LG TV's performance; it does what it needs to do. But it's absolutely just that, and not a smidge more - and occasionally it's a tad too little in terms of performance/responsiveness. And that's with a TV not yet filled to the brim with historical data. It works just okay out of the box. But in due time?
 
Connect an Nvidia Shield to your Smart TV and feel the difference between actual responsiveness and your Smart TV.
Bad comparison IMO. I have a Shield TV and "responsive" is not a term I would use to describe it. The device is using a 10-year-old chip and it definitely feels like it. NVIDIA desperately needs to refresh it with the chip from the Nintendo Switch 2 (considering the 2015 Shield uses the chip from the Switch 1).
 
Bad comparison IMO. I have a Shield TV and "responsive" is not a term I would use to describe it. The device is using a 10-year-old chip and it definitely feels like it. NVIDIA desperately needs to refresh it with the chip from the Nintendo Switch 2 (considering the 2015 Shield uses the chip from the Switch 1).
Yeah, I'm old, but the Shield is too at this point. Probably true :)
 
Jesus!! I just wanted a good contrast ratio with low brightness; 1500 nits for me is too much!! Normally I use 600 nits with HDR (the bare minimum) and 300 while browsing. 1500 nits is just unbearable to my eyes!!
 
Jesus!! I just wanted a good contrast ratio with low brightness; 1500 nits for me is too much!! Normally I use 600 nits with HDR (the bare minimum) and 300 while browsing. 1500 nits is just unbearable to my eyes!!
It's 1500 nits on a 1.5% window. Calm down. The average max brightness will be more like 600-900 nits, so it will resemble an OLED HDTV in your living room from 3 years ago. Plus, I hear they have this thing called brightness and contrast controls...
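For anyone wondering why the window size matters: OLED panels run under a shared power budget, so a tiny highlight can be driven much harder than a full white field, and the headline peak only applies to small windows. Here's a minimal toy sketch of that idea; the power-law exponent and all constants are illustrative assumptions for this example, not LG's actual ABL curve or measured specs of this panel:

```python
# Toy model of OLED automatic brightness limiting (ABL).
# Assumption: sustainable luminance falls off as a power law of the
# lit window area. Exponent and constants are guesses for this
# sketch, not measurements of LG's 4th-gen panel.

PEAK_NITS = 1500.0    # headline peak on a tiny highlight window
PEAK_WINDOW = 0.015   # 1.5% of the screen area

def max_nits(window_fraction: float, exponent: float = 0.5) -> float:
    """Estimate sustainable luminance for a given lit-area fraction."""
    w = max(window_fraction, PEAK_WINDOW)  # peak holds at/below 1.5%
    return PEAK_NITS * (PEAK_WINDOW / w) ** exponent

for w in (0.015, 0.10, 0.25, 1.00):
    print(f"{w:>6.1%} window -> ~{max_nits(w):.0f} nits")
```

With these assumed numbers, a 10% window lands around 580 nits and a full white field under 200, which is roughly why real-content brightness sits well below the 1500-nit headline figure.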
 
This, please. And text-rendering quality that I can use all day, without VRR flickering.


2010 called to get its burn-in meme back.
It's not a meme, it's still a problem.
 