Monday, February 14th 2022

Alienware's 34-inch QD-OLED Monitor Gets a Price

Remember that 34-inch QD-OLED monitor that Alienware announced at CES earlier this year? The company has finally worked out how much it's going to charge for it, although there is still no fixed availability date. At US$1,299, the AW3423DW will be $100 pricier than the AW3821DW, which sports a 38-inch Nano IPS panel at 3840x1600, compared to the AW3423DW's 34-inch QD-OLED panel at 3440x1440.

Obviously the two display technologies aren't directly comparable, but it's at least an indication of how pricey QD-OLED will be initially compared to more traditional display technologies. Both displays feature G-Sync Ultimate, so it's not as if Dell has tried to cut any corners here. The AW3423DW does offer a higher refresh rate of 175 Hz vs 144 Hz for the AW3821DW, which may be an advantage to some, but the official HDR certification is oddly enough only HDR 400 vs HDR 600, despite the fact that Dell claims it can deliver up to 1000 cd/m². That said, the black levels of the AW3423DW should be vastly superior, as should the colour gamut. The display is said to be available sometime early this spring, presumably in the US market first.
Sources: @Alienware, via TFT Central

135 Comments on Alienware's 34-inch QD-OLED Monitor Gets a Price

#101
goodeedidid
ChomiqSays who? Plenty of people use the C series primarily for media consumption. Gaming isn't the main marketing aspect on LG's page dedicated to the C1, either. Or do you mean to say that the 65", 77" and 83" C1s are meant to be used for gaming only?

No, filmmaker mode does no such thing. What it does is turn off the majority of image processing and bring the white point to the correct D65 value.
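For reference, D65 is the chromaticity (x, y) ≈ (0.3127, 0.3290); a quick sketch of the standard chromaticity-to-XYZ conversion (plain Python, nothing TV-specific assumed) shows the white point that corresponds to:

# D65: the white point filmmaker mode aims the white balance at.
x, y = 0.3127, 0.3290  # CIE 1931 chromaticity of D65 (2-degree observer)

# Convert chromaticity (x, y) to tristimulus XYZ with luminance Y normalized to 1.
X = x / y
Y = 1.0
Z = (1.0 - x - y) / y

print(f"X={X:.4f}, Y={Y:.4f}, Z={Z:.4f}")  # ~ X=0.9505, Y=1.0000, Z=1.0891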

Also, filmmaker mode is not exclusive to OLED TVs. It also needs (when operating in auto-detection mode) proper metadata (telling the TV it's dealing with CINEMA content) in the media you're playing. That's fine for streaming services like Netflix or Disney+, but it causes problems with physical Blu-ray discs (older discs won't have this metadata).

Out of the box, LG OLEDs have some of the most inaccurate picture settings because the white balance is set to 0, which is too blue (that's why on YouTube you'll see plenty of people gaming on their OLEDs with a picture that's blue AF). You can correct it by setting it to Warm 50.
ValantarNope, this is literally not possible. TVs do not understand the signalling protocols PCs use for auto sleep/wake of monitors, and do not have this functionality built in. PCs do not have HDMI-CEC functionality (though it might be possible to add it through third-party software - I've never checked). But this is not a way in which PCs and TVs communicate.

Uh ... digital signals are also signals. Changing data is signal processing. Over-processing means processing the image data to make it look different - whether it's analogue or digital is irrelevant. For TVs, this typically means motion smoothing/interpolation, drastic color adjustments, contrast adjustments, algorithmic/AI upscaling or "image enhancement" processes, black level adjustments, denoising, and a whole host of other "features" (I'd rather call them bugs, but that's just me). Why do you think TVs have a "game mode"? Because that mode cuts out all this processing to cut down on input lag, as the processing causes significant lag. Monitors have essentially none of these features, in part because such processing is fundamentally unsuitable for crucial PC use cases such as text rendering.

And I never said "all monitors have good calibration". I specifically said that they don't:

So: decent monitors have useful color profiles; the quality of calibration (for all monitors) varies a lot - even for the decent ones, but particularly the bad ones - and is often crap, just as it typically is with most modes on most TVs, just in different ways (TVs tend towards overprocessing, oversaturation and excessive contrast; monitors tend to be muted, dull and just plain inaccurate).

"They have a gaming mode and gaming UI and are marketed for gaming" is in no way proof of burn-in being impossible through gaming. You understand that, right? That marketing can ... mislead? Lie? Just hide inconvenient facts, or gamble on that they don't apply to enough people to be a real problem? Like, there is zero causal relationship between the things you are presenting as an argument here and the issue at hand. The existence of gaming modes is not proof of the absence of burn-in - it literally can't be proof of that, as there is no causal relation between the two.

Now, it takes a lot of time to cause burn-in on a modern OLED if it is used for many different things, but if run at high brightness with static imagery (including portions of an image being static, such as a game UI), it will still burn in - and it can even happen quite quickly. How long it takes, and how severe it gets, depends on the frequency of use, the brightness, the specific design of the static elements, whether those UI elements are covered by anti-burn-in features (such as auto logo dimming), and more. None of this means it won't happen, it just means that it's less likely to happen than it was a few years ago, and that it won't happen to everyone. If you play lots of different games and don't use the TV for web browsing or work, you might never see meaningful retention. But if you play 4+ hours of a single game every day, and don't use the TV for much else? It'll likely happen sooner rather than later.


As for the processing thing and the lack of DP support, thankfully that's no longer an issue:
www.notebookcheck.net/LG-UltraGear-48GQ900-48-inch-gaming-monitor-announced-with-a-WOLED-panel-HDMI-2-1-120-Hz-support-and-a-matte-finish.605806.0.html#8362734


Nope. PC GPUs generally do not support HDMI CEC (how TVs, STBs, BD players and the like signal to each other to turn on/off, adjust volume, etc.), and while TVs might understand PC display sleep/wake signals, they do not respond to them nor have any way of enabling this.
All in all, the LG C1 and C2 are the perfect gaming TVs and, like I said, they are gaming dedicated even for PCs. Why else would they feature Dolby Vision at 4K 120Hz (4 HDMI 2.1 ports), variable refresh rates with Nvidia G Sync and AMD Free-sync Premium, which are all PC features, and 1ms response times? This is definitely gaming dedicated even for PCs, and not to mention that the new C2 is going to be even brighter and will have a 42-inch model that is just going to be the perfect gaming TV and a better option than gaming monitors that charge you super premium prices and have fewer features. Hell, not that it matters that much for gaming, but if you really really want your colors to be super accurate you can get a color calibrator.
#102
Valantar
goodeedididAll in all, the LG C1 and C2 are the perfect gaming TVs and, like I said, they are gaming dedicated even for PCs. Why else would they feature Dolby Vision at 4K 120Hz (4 HDMI 2.1 ports), variable refresh rates with Nvidia G Sync and AMD Free-sync Premium, which are all PC features, and 1ms response times? This is definitely gaming dedicated even for PCs, and not to mention that the new C2 is going to be even brighter and will have a 42-inch model that is just going to be the perfect gaming TV and a better option than gaming monitors that charge you super premium prices and have fewer features. Hell, not that it matters that much for gaming, but if you really really want your colors to be super accurate you can get a color calibrator.
Sorry, but no. If they were "dedicated for PCs", why did LG literally just announce an UltraGear OLED gaming monitor using the same panels? LG's TVs are TVs first. They are 120Hz because motion smoothing on TVs is a wildly popular feature (despite it looking like garbage), as well as this being a better match for 24fps content than 60Hz, reducing the potential for judder in poorly configured settings and generally being more flexible (as well as potentially overcoming the perceived judder of low refresh rate OLEDs, which is caused by their extremely fast pixel response times). 1ms response times are due to these being OLED - they are just that fast. (Also, Xbox consoles have had VRR support for quite a few years, even if poorly optimized and underutilized.) Yes, these TVs are very popular for gaming, both console and PC gaming, but they're TVs first, and the vast majority of people buying them use them as TVs first and foremost. LG has adapted and added features appealing to a broader audience - VRR and other gaming-centric features - simply because this is an easy way for them to sell more TVs, expanding the possible market for their products. If you can spend 5% more on software development and expand your market from "premium TV buyers" to "premium TV buyers and gamers", that's an easy win. Remember, the TV market is vastly larger than any gaming equipment market, but as the saying goes, ¿Por qué no los dos?
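To make the 24fps point concrete, here's a trivial cadence check (plain Python, nothing LG-specific assumed) showing why 120Hz avoids the pulldown judder that 60Hz can introduce:

# How many panel refreshes each film frame gets at 60 Hz vs 120 Hz.
FILM_FPS = 24

for refresh_hz in (60, 120):
    repeats = refresh_hz / FILM_FPS
    cadence = "even cadence (no pulldown)" if repeats.is_integer() else "uneven 3:2 pulldown (judder)"
    print(f"{refresh_hz} Hz: {repeats} refreshes per film frame -> {cadence}")

# 60 Hz : 2.5 -> frames alternate between 3 and 2 refreshes
# 120 Hz: 5.0 -> every frame held for exactly 5 refreshes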

If these were dedicated to PC use, they would have
- DisplayPort
- PC compatible sleep/wake
- An option to bypass the "smarts" entirely, as it would have no use
- other PC-focused hardware features, like USB hubs (which all of LG's monitors except for the very cheapest have)
All of which their newly announced 48" Ultragear monitor has. That one is dedicated to PCs (though they also advertise it for consoles).
lexluthermiesterOne is an Insignia (BestBuy brand) and the 4K is a Samsung. Been doing some reading; you might be right and I might be really lucky. Though I suspect that it isn't a case of one of us being dead wrong and the other right. I think the reality is somewhere in between: TV makers implement certain features or not, and it varies from maker to maker and model to model.

So I apologize for being a bit aggressive and blunt.
No problem :) I'm still under the impression that this is very rare as a TV feature, though maybe it's more common in US models and firmwares? It would definitely be interesting to find out the logic (if there is any!) behind this.
#103
trsttte
goodeedididNvidia G Sync and AMD Free-sync Premium
G-Sync Compatible, which is the same as FreeSync, which is now part of the HDMI 2.1 spec.
ValantarIt would definitely be interesting to find out the logic (if there is any!) behind this.
Just the good old competing standards ¯\_(ツ)_/¯
#104
Valantar
trsttteJust the good old competing standards ¯\_(ツ)_/¯
Probably. Though I doubt there are any license fees or other hurdles towards implementing PC monitor wake/sleep functionality on TVs - outside of the (extremely small) development cost and possible customer confusion. Should be pretty easy to add as a toggleable option like Lex mentions above. So, competing standards and corporate laziness?
#105
goodeedidid
ValantarSorry, but no. If they were "dedicated for PCs", why did LG literally just announce an UltraGear OLED gaming monitor using the same panels? LG's TVs are TVs first. They are 120Hz because motion smoothing on TVs is a wildly popular feature (despite it looking like garbage), as well as this being a better match for 24fps content than 60Hz, reducing the potential for judder in poorly configured settings and generally being more flexible (as well as potentially overcoming the perceived judder of low refresh rate OLEDs, which is caused by their extremely fast pixel response times). 1ms response times are due to these being OLED - they are just that fast. (Also, Xbox consoles have had VRR support for quite a few years, even if poorly optimized and underutilized.) Yes, these TVs are very popular for gaming, both console and PC gaming, but they're TVs first, and the vast majority of people buying them use them as TVs first and foremost. LG has adapted and added features appealing to a broader audience - VRR and other gaming-centric features - simply because this is an easy way for them to sell more TVs, expanding the possible market for their products. If you can spend 5% more on software development and expand your market from "premium TV buyers" to "premium TV buyers and gamers", that's an easy win. Remember, the TV market is vastly larger than any gaming equipment market, but as the saying goes, ¿Por qué no los dos?

If these were dedicated to PC use, they would have
- DisplayPort
- PC compatible sleep/wake
- An option to bypass the "smarts" entirely, as it would have no use
- other PC-focused hardware features, like USB hubs (which all of LG's monitors except for the very cheapest have)
All of which their newly announced 48" Ultragear monitor has. That one is dedicated to PCs (though they also advertise it for consoles).


No problem :) I'm still under the impression that this is very rare as a TV feature, though maybe it's more common in US models and firmwares? It would definitely be interesting to find out the logic (if there is any!) behind this.
What I mean is dedicated for gaming with PCs, not just dedicated for PCs lol, it's a TV. LG announcing a new OLED gaming monitor doesn't change the fact that the C1 and C2 are gaming TVs that support PC features such as Gsync and Freesync.. they announced that monitor because that's what they do lol it's their business to make new things.. Also, a 120Hz panel is either 120Hz or not.. there is no such thing as bad 120Hz or good 120Hz lol, as I told you, remember, it's all digital and it is either completely on or off, no in between lol. You speak of things as if we're still living in the analogue era. What in the world does it mean that 120Hz is garbage?!? Explain that in comparison to other monitors or TVs that have "good" and not "garbage" 120Hz according to you. Even before the C1 came out, long before that, Nvidia (and probably TV manufacturers) started a marketing campaign that gaming-dedicated TVs were coming with the new HDMI 2.1 spec. You keep downplaying the fact that those TVs are gaming dedicated, but they are.
#106
Valantar
goodeedididWhat I mean is dedicated for gaming with PCs, not just dedicated for PCs lol, it's a TV. LG announcing a new OLED gaming monitor doesn't change the fact that the C1 and C2 are gaming TVs that support PC features such as Gsync and Freesync..
They are TVs. TVs with various featuresets suitable for various applications, but TVs first and foremost.
goodeedididthey announced that monitor because that's what they do lol it's their business to make new things..
But that would only make sense if there was meaningful differentiation in order to drive sales. Which the features I mentioned add. Sure, some companies just make new SKUs for no good reason, but most don't operate that way. Creating a new product has a non-trivial cost, especially in a product segment like this.
goodeedididAlso, a 120Hz panel is either 120Hz or not.. there is no such thing as bad 120Hz or good 120Hz lol, as I told you, remember, it's all digital and it is either completely on or off, no in between lol. You speak of things as if we're still living in the analogue era. What in the world does it mean that 120Hz is garbage?!? Explain that in comparison to other monitors or TVs that have "good" and not "garbage" 120Hz according to you.
What on earth are you talking about here? I have never said any such thing. Please re-read. Seriously, what you're saying bears no relation to what I wrote whatsoever. You really need to work on your reading comprehension.

I said motion smoothing looks like garbage. Motion smoothing or interpolation is an image processing feature of TVs, to smooth out motion by adding interpolated "fake" frames in between the actual image data of whatever is being displayed. Remember those "480Hz" and "600Hz" TVs in the mid 2010s? That's motion interpolation. (And those AFAIK all had 60Hz panels.) Motion interpolation relates to the image signal, but not directly to the display panel.
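For illustration only: the crudest possible version of that interpolation is just averaging two neighbouring frames. Real TVs use motion-vector search rather than a plain blend, but a minimal numpy sketch of the principle (function name and toy frames are mine, just for illustration) looks like this:

import numpy as np

def naive_midframe(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Synthesize a 'fake' in-between frame by averaging two real ones.

    Simplest possible stand-in for motion interpolation: the inserted frame
    never existed in the source signal, it is invented by the processor.
    Real TVs estimate per-block motion vectors instead of a plain blend.
    """
    blend = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2
    return blend.astype(np.uint8)

# Toy 4x4 grayscale "frames": a dark frame followed by a brighter one.
a = np.zeros((4, 4), dtype=np.uint8)
b = np.full((4, 4), 200, dtype=np.uint8)
mid = naive_midframe(a, b)  # every pixel lands at 100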

To expand on that point: the OLEDs are 120Hz first and foremost because motion smoothing on a 60Hz OLED doesn't really work - the lightning-fast pixel response times of OLED undermine the effect of any interpolation added to the signal. Doubling the number of frames allows for smoother transitions and thus smoother motion, which aligns better with conventions and expectations for TVs.

I have at no point differentiated between "bad" and "good" 120Hz, or any categorizations of refresh rates whatsoever.
goodeedididEven before the C1 came out, long before that, Nvidia (and probably TV manufacturers) started a marketing campaign that gaming-dedicated TVs were coming with the new HDMI 2.1 spec. You keep downplaying the fact that those TVs are gaming dedicated, but they are.
Okay, we need a dictionary here: Dedicated means "Wholly committed to a particular course of thought or action; devoted." It can also be formulated as "designed for a particular use or function", but in that formulation carries a strong implication of it being designed only or mainly for that use or function (and not simply as one more feature on top of a pile of others). If a TV is dedicated to gaming, then that is its main if not only use. This is not the case for any current LG OLED, despite them being good at gaming and having features that further improve this. Your TV might of course be dedicated to gaming in your usage, if that's your only or main use for it, but that doesn't mean that that type of TV is dedicated to gaming in general.

I'm not downplaying anything, I'm trying to adjust your perspective to match with the reality of these devices being general-purpose TVs with good gaming features that a relatively small subset of users buy them for, but which is not the main purpose of the product in general.
#107
goodeedidid
ValantarThey are TVs. TVs with various featuresets suitable for various applications, but TVs first and foremost.

But that would only make sense if there was meaningful differentiation in order to drive sales. Which the features I mentioned add. Sure, some companies just make new SKUs for no good reason, but most don't operate that way. Creating a new product has a non-trivial cost, especially in a product segment like this.

What on earth are you talking about here? I have never said any such thing. Please re-read. Seriously, what you're saying bears no relation to what I wrote whatsoever. You really need to work on your reading comprehension.

I said motion smoothing looks like garbage. Motion smoothing or interpolation is an image processing feature of TVs, to smooth out motion by adding interpolated "fake" frames in between the actual image data of whatever is being displayed. Remember those "480Hz" and "600Hz" TVs in the mid 2010s? That's motion interpolation. (And those AFAIK all had 60Hz panels.) Motion interpolation relates to the image signal, but not directly to the display panel.

To expand on that point: the OLEDs are 120Hz first and foremost because motion smoothing on a 60Hz OLED doesn't really work - the lightning-fast pixel response times of OLED undermine the effect of any interpolation added to the signal. Doubling the number of frames allows for smoother transitions and thus smoother motion, which aligns better with conventions and expectations for TVs.

I have at no point differentiated between "bad" and "good" 120Hz, or any categorizations of refresh rates whatsoever.

Okay, we need a dictionary here: Dedicated means "Wholly committed to a particular course of thought or action; devoted." It can also be formulated as "designed for a particular use or function", but in that formulation carries a strong implication of it being designed only or mainly for that use or function (and not simply as one more feature on top of a pile of others). If a TV is dedicated to gaming, then that is its main if not only use. This is not the case for any current LG OLED, despite them being good at gaming and having features that further improve this. Your TV might of course be dedicated to gaming in your usage, if that's your only or main use for it, but that doesn't mean that that type of TV is dedicated to gaming in general.

I'm not downplaying anything, I'm trying to adjust your perspective to match with the reality of these devices being general-purpose TVs with good gaming features that a relatively small subset of users buy them for, but which is not the main purpose of the product in general.
But of course a TV is a TV firstly. Then you could say that it has a dedicated gaming featureset that does make it a gaming TV. Just like gaming monitors are considered gaming because they have dedicated gaming features... but still, like I said before, the C1 and especially the C2 (42-inch) will offer great value for an OLED panel with great gaming features. For the same features in an OLED monitor one would probably pay twice as much, until prices go down when they make them on a larger scale.
#108
lexluthermiester
ValantarNo problem :) I'm still under the impression that this is very rare as a TV feature, though maybe it's more common in US models and firmwares? It would definitely be interesting to find out the logic (if there is any!) behind this.
What's interesting to me is that I had a Hannspree 28" TV BITD that had the same features. It would come on when I booted my PC and it would shut off when shutting down. Granted, back then, I was connecting it through the VGA input, so the specs for HDMI/DP didn't apply. The point being that TV makers might be integrating features that they think might appeal to their customer base.
#109
Valantar
lexluthermiesterWhat's interesting to me is that I had a Hannspree 28" TV BITD that had the same features. It would come on when I booted my PC and it would shut off when shutting down. Granted, back then, I was connecting it through the VGA input, so the specs for HDMI/DP didn't apply. The point being that TV makers might be integrating features that they think might appeal to their customer base.
Yeah, it might be gaining ground as TVs are becoming more popular as PC gaming screens. I would definitely love to see this happen, and ideally as a firmware update, as our TV is used >95% of the time with our HTPC. Having it act more like a monitor would as such be really nice.
goodeedididBut of course a TV is a TV firstly.
That, as I showed in the post you just quoted, is a direct contradiction of you saying they are "dedicated for gaming". Make up your mind, please.
goodeedididThen you could say that it has a dedicated gaming featureset that does make it a gaming TV.
Again, no. It makes it a TV that has a good gaming featureset. You can absolutely use it as a gaming TV, and it might in part be marketed as a gaming TV (in addition to other advertising that focuses on other features, such as image quality or smart OS features), but it will never be a dedicated gaming TV. You might use it as a dedicated gaming TV, but again, that doesn't make that description generally applicable to the product as a whole.
goodeedididJust like gaming monitors are considered gaming because they have dedicated gaming features...
Again, no. Those are products expressly designed and marketed towards a single use case. There is literally not a single TV in the world to which the same applies - TVs are marketed broadly, to as many markets as possible.
This image sums this up pretty succinctly, taken from LG's C1 web site:
www.lg.com/us/images/TV-OLED-C1-10-OLED-Value-4S-Desktop.jpg
Gaming is highlighted as one of four use cases showcasing one of four qualities they want to highlight. Beyond that, you have to scroll more than halfway down the page for it to get into gaming. That is not how a dedicated gaming device is marketed.

For contrast: On LG's Ultragear gaming monitors, "gaming" is in the title for every single product, and the first (and often near-exclusive) focus on their product pages is gaming.
goodeedididbut still, like I said before, the C1 and especially the C2 (42-inch) will offer great value for an OLED panel with great gaming features. For the same features in an OLED monitor one would probably pay twice as much, until prices go down when they make them on a larger scale.
I have never contradicted that. I have simply tried to add some nuance to a simplistic perspective, highlighting that while TVs are always cheaper than monitors and often deliver superior image quality, monitors have other features and other qualities that TVs (typically/often) lack, there are UX challenges with using a TV as a monitor, and that not all TV features (image processing especially, hence the need for "game mode") are suited for PC usage, whether that is gaming or web browsing.
#110
goodeedidid
ValantarThat, as I showed in the post you just quoted, is a direct contradiction of you saying they are "dedicated for gaming". Make up your mind, please.
You keep on trying to twist my argument and come out on top with little points that don't really serve your argument. I did say that the C1 is gaming dedicated, but that of course doesn't mean it's only for gaming duh of course it is a freaking TV. It has a dedicated gaming feature set and that is not because the "signal" is processed and that is not why it needs such a mode lol.. it has specific gaming features like Gsync which means it is also meant to be used on PCs... it's simple. PC monitors also have gaming modes, photo modes, and media modes, are their "signals" also "processed"?!? Can you actually explain what you mean by "processed signal"? You could say that there is image post processing, AA post-processing, or something else but I have never ever heard about "signal processing". Those are all settings you can calibrate and change just as you can do on a monitor. There is literally no difference, except that the TV has a smart box in it and a monitor is just a dumb panel box with an LED light for which you pay a premium.
ValantarI have never contradicted that. I have simply tried to add some nuance to a simplistic perspective, highlighting that while TVs are always cheaper than monitors and often deliver superior image quality, monitors have other features and other qualities that TVs (typically/often) lack, there are UX challenges with using a TV as a monitor, and that not all TV features (image processing especially, hence the need for "game mode") are suited for PC usage, whether that is gaming or web browsing.
And again, the C1 and most certainly the C2 aren't going to get any burn-in issues even if used extensively for gaming. Even TV channels have logos that temporarily stay on the screen. There aren't any "UX" challenges at all. If you claim something like this you might as well try to give some proof. Your whole real point - the only point you're right about and that I can agree with - is that the TV doesn't go into standby mode when you turn off the PC. But I will give it a test, as my C1 does have a PC mode or something like that.
#111
Valantar
goodeedididYou keep on trying to twist my argument and come out on top with little points that don't really serve your argument. I did say that the C1 is gaming dedicated, but that of course doesn't mean it's only for gaming duh of course it is a freaking TV.
Sorry, but I'm not twisting anything. I'm literally showing you the dictionary definition of what "dedicated" means, because what you're saying here is in direct contradiction of itself. If a TV is "dedicated to gaming", then it is primarily for gaming to the exclusion of all else - it is its main and primary designed purpose. If not, then it is not gaming dedicated. Period. That is what "dedicated" means.
goodeedididIt has a dedicated gaming feature set and that is not because the "signal" is processed and that is not why it needs such a mode lol..
So ... I know this is going to sound like I'm talking to a toddler, but apparently it's necessary: you know that digital data can be manipulated, right? You keep saying that signal processing is somehow an analogue-only thing, which ... just no. You understand that digital data can be altered, I assume? That 0s can be changed to 1s, etc? That you can put an image signal - a digital one too - through an algorithm to change how the image looks? That is signal processing. When you have a digital image signal, and you run it through various algorithms to change it, that is signal processing.
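To make that concrete, here's a minimal sketch of "running a digital image through an algorithm" - a toy sharpening pass of the kind hiding behind a TV's "enhancement" sliders (numpy only; the kernel, function name and made-up frame data are mine, purely for illustration):

import numpy as np

# A 3x3 sharpening kernel: digital image data goes in, modified digital image
# data comes out. Whatever you call it, the data the panel receives is no
# longer the data the source sent.
KERNEL = np.array([[ 0, -1,  0],
                   [-1,  5, -1],
                   [ 0, -1,  0]], dtype=np.float32)

def sharpen(frame: np.ndarray) -> np.ndarray:
    out = frame.astype(np.float32).copy()
    h, w = frame.shape
    for row in range(1, h - 1):
        for col in range(1, w - 1):
            window = frame[row - 1:row + 2, col - 1:col + 2].astype(np.float32)
            out[row, col] = np.clip((window * KERNEL).sum(), 0, 255)
    return out.astype(np.uint8)

frame = np.random.randint(0, 256, (8, 8), dtype=np.uint8)  # stand-in for one video frame
enhanced = sharpen(frame)                                   # what the panel actually gets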

The prevalence of heavy-handed signal processing and how it both degrades image quality on static PC content and causes massive input lag (to the tune of 100+ ms often) is precisely why TVs today have game modes - because they're borderline unusable for anything requiring reaction speed without such a mode, and desktop use would be quite terrible due to text dithering issues, oversharpening, etc.

I mean, you might as well say it has a "movie dedicated" feature set. Which would be equally nonsensical. It is a multi-purpose product with many features that fit many use cases, and obviously some features align better with some use cases than others - some even only make sense for some uses, like adaptive sync. That does not amount to the TV being dedicated to that one use case.
goodeedididit has specific gaming features like Gsync which means it is also meant to be used on PCs...
Worth noting: it doesn't. It supports VESA adaptive sync over HDMI, an extension made by AMD which Nvidia then came to support as "G-sync compatible". This is a feature also supported by Xbox consoles for quite a few years now.
goodeedididit's simple. PC monitors also have gaming modes, photo modes, and media modes, are their "signals" also "processed"?!?
Yes, but to a much, much, much lesser degree, typically only amounting to minor adjustments of color balance and contrast. They don't add sharpening, motion interpolation, AI upscaling/image enhancements, or any of the other signal processing techniques applied by TVs.
goodeedididCan you actually explain what you mean by "processed signal"?
See above. I have also said this in several previous posts.
goodeedididYou could say that there is image post processing, AA post-processing, or something else but I have never ever heard about "signal processing".
That is literally the same thing, though in slightly different contexts. Post-processing refers to processing an image after it is "done" (a digital image is rendered, a film image is developed and scanned, etc.). As a "finished" image is a requirement for there being an image signal to transfer to a TV or monitor (it's "done" by default as it's being transmitted to the display device), the "post" is omitted as it's just meaningless at that point. All signal processing is post processing.
goodeedididThose are all settings you can calibrate and change just as you can do on a monitor. There is literally no difference, except that the TV has a smart box in it and a monitor is just a dumb panel box with an LED light for which you pay a premium.
This just shows that you have no idea what you're talking about, sorry. Monitors offer color balance and contrast adjustments on a basic level. Some offer black level adjustments, and (ironically) some gaming monitors are increasing the amount of processing available for various "gaming" features (though most of this is just marketing fluff). Some gaming monitors also add in fps counters and crosshairs, though those are simple graphics overlaid on the image - still, that's also a form of processing. But all of this is lightweight. It still has the potential to drastically change how the image looks, but it requires little processing power and adds very little input lag.

TVs, on the other hand, have long since started offering a host of features to "improve" image quality: motion smoothing/interpolation, algorithmic/AI upscaling, denoising, sharpening, all kinds of nebulously named "image enhancement" settings. This is not even close to an exhaustive list, and TVs also have all of the features above (typically except for the gaming-centric ones, though some do). What makes this distinct from what monitors do is the complexity of the processing and the processing power involved. TVs, even "dumb" ones, have relatively powerful display driver chipsets in order to perform this processing. Monitors mostly have very basic display controller chipsets, fundamentally incapable of performing image processing of this kind.
goodeedididAnd again, the C1 and most certainly the C2 aren't going to get any burn-in issues even if used extensively for gaming. Even TV channels have logos that temporarily stay on the screen.
I have given you concrete, real-world examples demonstrating otherwise. If you have proof that these examples are invalid, please provide it.
goodeedididThere aren't any "UX" challenges at all.
Again: I have given you plentiful examples of the user experience niggles and annoyances of using a TV as a monitor. You might not care about them, but they exist.
goodeedididIf you claim something like this you might as well try to give some proof.
Done. Seriously, if you missed this, what have you been reading?
#112
goodeedidid
ValantarSorry, but I'm not twisting anything. I'm literally showing you the dictionary definition of what "dedicated" means, because what you're saying here is in direct contradiction of itself. If a TV is "dedicated to gaming", then it is primarily for gaming to the exclusion of all else - it is its main and primary designed purpose. If not, then it is not gaming dedicated. Period. That is what "dedicated" means.

So ... I know this is going to sound like I'm talking to a toddler, but apparently it's necessary: you know that digital data can be manipulated, right? You keep saying that signal processing is somehow an analogue-only thing, which ... just no. You understand that digital data can be altered, I assume? That 0s can be changed to 1s, etc? That you can put an image signal - a digital one too - through an algorithm to change how the image looks? That is signal processing. When you have a digital image signal, and you run it through various algorithms to change it, that is signal processing.

The prevalence of heavy-handed signal processing and how it both degrades image quality on static PC content and causes massive input lag (to the tune of 100+ ms often) is precisely why TVs today have game modes - because they're borderline unusable for anything requiring reaction speed without such a mode, and desktop use would be quite terrible due to text dithering issues, oversharpening, etc.

I mean, you might as well say it has a "movie dedicated" feature set. Which would be equally nonsensical. It is a multi-purpose product with many features that fit many use cases, and obviously some features align better with some use cases than others - some even only make sense for some uses, like adaptive sync. That does not amount to the TV being dedicated to that one use case.

Worth noting: it doesn't. It supports VESA adaptive sync over HDMI, an extension made by AMD which Nvidia then came to support as "G-sync compatible". This is a feature also supported by Xbox consoles for quite a few years now.

Yes, but to a much, much, much lesser degree, typically only amounting to minor adjustments of color balance and contrast. They don't add sharpening, motion interpolation, AI upscaling/image enhancements, or any of the other signal processing techniques applied by TVs.

See above. I have also said this in several previous posts.

That is literally the same thing, though in slightly different contexts. Post-processing refers to processing an image after it is "done" (a digital image is rendered, a film image is developed and scanned, etc.). As a "finished" image is a requirement for there being an image signal to transfer to a TV or monitor (it's "done" by default as it's being transmitted to the display device), the "post" is omitted as it's just meaningless at that point. All signal processing is post processing.

This just shows that you have no idea what you're talking about, sorry. Monitors offer color balance and contrast adjustments on a basic level. Some offer black level adjustments, and (ironically) some gaming monitors are increasing the amount of processing available for various "gaming" features (though most of this is just marketing fluff). Some gaming monitors also add in fps counters and crosshairs, though those are simple graphics overlaid on the image - still, that's also a form of processing. But all of this is lightweight. It still has the potential to drastically change how the image looks, but it requires little processing power and adds very little input lag.

TVs, on the other hand, have long since started offering a host of features to "improve" image quality: motion smoothing/interpolation, algorithmic/AI upscaling, denoising, sharpening, all kinds of nebulously named "image enhancement" settings. This is not even close to an exhaustive list, and TVs also have all of the features above (typically except for the gaming-centric ones, though some do). What makes this distinct from what monitors do is the complexity of the processing and the processing power involved. TVs, even "dumb" ones, have relatively powerful display driver chipsets in order to perform this processing. Monitors mostly have very basic display controller chipsets, fundamentally incapable of performing image processing of this kind.

I have given you concrete, real-world examples demonstrating otherwise. If you have proof that these examples are invalid, please provide it.

Again: I have given you plentiful examples of the user experience niggles and annoyances of using a TV as a monitor. You might not care about them, but they exist.

Done. Seriously, if you missed this, what have you been reading?
THERE IS NO SUCH THING AS SIGNAL PROCESSING.. lol drop it. You are plainly WRONG. There is image post processing but there is no "signal processing". You're starting to sound foolish. Only analogue devices deal with signals, digital devices deal with data. Any post processing settings can be changed by the user, if your imaginary "signal processing" that you're talking about was real and was applied in real-time then you wouldn't have had the option to change it because it would have already been manipulated, but that is not the case and you are WRONG buddy.
#113
Valantar
goodeedididTHERE IS NO SUCH THING AS SIGNAL PROCESSING.. lol drop it. You are plainly WRONG. There is image post processing but there is no "signal processing". You're starting to sound foolish. Only analogue devices deal with signals, digital devices deal with data.
Any data that is transferred constitutes a signal, no matter the means of transportation. You might want to look up what "signal" means too - it can be pretty much anything carrying any form of information, analog or digital. What you are saying here is pure nonsense. I mean, look at the categories in Wikipedia's article on signal processing...

I also think the entire field of electrical engineering might want to have a word with you.

I mean, what do you imagine a stream of data between a source and receiver constitutes, if not a signal?
goodeedididAny post processing settings can be changed by the user,
Not necessarily. Before the advent of game mode on TVs, many TVs had no way of disabling all processing.
goodeedididif your imaginary "signal processing" that you're talking about was real and was applied in real-time then you wouldn't have had the option to change it because it would have already been manipulated, but that is not the case and you are WRONG buddy.
It's fascinating to see someone who has absolutely zero idea what they are talking about constantly double down on their fundamental misconceptions. Like, do you believe that most consumer electronics grant you access to every possible setting? News flash: they don't. Most lock them down tight, exposing just a few. TV processing is much like this - there is a lot that isn't adjustable. Also, a lot of settings are adjustable but can't be disabled - but this of course depends on the specific TV and firmware. Thankfully, with PC and game modes, many TVs now allow the processing to be disabled entirely.

Also: you asked me for "proof" above, and examples. Not that I hadn't given them before, but they were repeated. Care to respond to that? 'Cause right now it just looks like you're working quite hard to dodge things that contradict your "arguments".
#114
medi01
CrackongI just need Flat OLED 32 inch and 4k and 120Hz (or above)

When?
When OLED TVs shrink further in size.

48" OLEDs sell for about 700 Euro atm.

LG promised 42" C2 to come in March.
n-sterThat's damn cheap for QD-OLED if it lives up to the hype!
Curious sentence, especially given that we are already in "hurts my eyes" territory even with OLED TVs (which are watched from quite a distance) which gets only worse when you sit next to the screen.

So, dropping the promised higher brightness, what is that new stuff that is worth more money than normal OLEDs?
Valantar- DisplayPort
- PC compatible sleep/wake
- An option to bypass the "smarts" entirely, as it would have no use
- other PC-focused hardware features, like USB hubs (which all of LG's monitors except for the very cheapest have)
- Not sure why one needs DisplayPort. To NOT be able to pass sound over it?
- CEC wake-up over HDMI works just fine; can't modern GPUs do that?
- "Gaming mode" bypassing all the smarts is a given these days
#115
Valantar
medi01When OLED TVs shrink further in size.

48" OLEDs sell for about 700 Euro atm.

LG promised 42" C2 to come in March.


Curious sentence, especially given that we are already in "hurts my eyes" territory even with OLED TVs (which are watched from quite a distance) which gets only worse when you sit next to the screen.

So, dropping the promised higher brightness, what is that new stuff that is worth more money than normal OLEDs?


- Not sure why one needs a display port. To NOT be able to pass over sound?
- CEC wakes up over HDMI just fine, can't modern GPUs do that?
- "Gaming mode" bypassing all the smarts is a given these days
- DP supports sound output just fine. It's also licence-free and tends to be slightly more stable (less prone to bugs) in my experience than HDMI on PCs, though the difference is really minuscule. Still, it's nice to have options.
- As has been discussed at length above, PCs don't generally support CEC (it can be bolted on with a separate adapter - see the sketch below).
- Pretty much, yes. I don't know how many times I've brought up Game Mode in the posts above, but it must be dozens. Your point?
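For anyone who wants to experiment: a minimal sketch, assuming libcec's cec-client is installed and a CEC-capable adapter (e.g. a Pulse-Eight USB-CEC dongle) is attached - the GPU's own HDMI output won't carry CEC, and the function name here is just mine for illustration:

import subprocess

def set_tv_power(on: bool) -> None:
    """Send an HDMI-CEC power command to the TV (CEC logical address 0).

    Relies on libcec's cec-client plus a CEC-capable adapter; a normal PC GPU
    does not expose the CEC line, so this won't work over the GPU's HDMI port.
    """
    command = "on 0" if on else "standby 0"
    # -s: single-command mode, -d 1: keep log output quiet
    subprocess.run(["cec-client", "-s", "-d", "1"], input=command.encode(), check=True)

set_tv_power(True)    # wake the TV, e.g. from a login/resume script
# set_tv_power(False) # put it into standby at shutdown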
#116
n-ster
medi01Curious sentence, especially given that we are already in "hurts my eyes" territory even with OLED TVs (which are watched from quite a distance) which gets only worse when you sit next to the screen.

So, dropping the promised higher brightness, what is that new stuff that is worth more money than normal OLEDs?
Consider that this is a new technology and you usually have to pay through the nose at first, if it's even available on a monitor. My understanding is QD-OLED's claim to fame is the purity of the colors (QD vs filters) and perceived brightness of the colors (no white subpixel washing out the color).
#117
Valantar
medi01Curious sentence, especially given that we are already in "hurts my eyes" territory even with OLED TVs (which are watched from quite a distance) which gets only worse when you sit next to the screen.

So, dropping the promised higher brightness, what is that new stuff that is worth more money than normal OLEDs?
Having all emitters be the same color should reduce image retention further, as some emitter materials are simply more efficient and longer-lasting than others, and QD lets you transform blue light into red and green with minimal loss (~10%) compared to other technologies (phosphors + color filters, for example). This makes the panel both brighter and more efficient. It also allows for the creation of wider-gamut panels more easily, which should bring down costs in the future (or at least provide wider gamuts and more accurate colors at the same price points).

As for the brightness itself, you're right that 1000+ nits isn't necessarily something you want in all contexts - but that obviously depends. If you're in a bright room, this will make for usable, good quality OLEDs even there, and if you're not, you get a longer-lasting panel, as you don't have to push the emitters as hard to achieve whatever level of brightness is desirable. The main driver of OLED emitter degradation, and thus image retention, is running the emitters at a high output relative to their peak; the further your desired brightness is from the peak output of each pixel, the longer the panel will last. There's a reason why something like an LG C1 can only sustain ~130 nits full screen white but can do 767 nits on a 2% window or 628 nits in a real scene highlight - LG are clamping down full screen brightness to avoid the panel degrading rapidly (and, of course, most media do not contain significant sections of the full screen being very bright, unlike PC usage). QD-OLED will likely noticeably raise the level at which these limits are set.

As mentioned, this will also make QD-OLED much better suited to regular PC usage, as displaying a near-100% white screen is rather common on PCs (like browsing these forums or working on text documents). Of course this matters less for gaming, but the wider gamut and higher brightness still make for a better and more versatile panel overall. Plus you avoid subpixel trickery like WOLED's white subpixel, which messes with color balance at higher brightnesses and can mess with text rendering (and which is, of course, a band-aid fix for the fundamental issue of those OLED panels otherwise not getting particularly bright).

Tl;dr: it's less about the peak brightness and more about how this extends the range of safe long-term usage of the panel, as well as increased versatility.
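Purely as a toy illustration of that range: a log-log interpolation between the two C1 figures quoted above (2% window ≈ 767 nits, full-screen white ≈ 130 nits). This is not LG's actual ABL curve - only the two quoted numbers are real, everything in between is a made-up interpolation to give a feel for how steeply sustained brightness falls as the bright area grows:

import math

SMALL_WINDOW = (2.0, 767.0)    # (% of screen lit, sustained nits) - quoted figure
FULL_SCREEN  = (100.0, 130.0)  # quoted figure

def toy_sustained_nits(window_pct: float) -> float:
    # Interpolate between the two quoted points on a log-log scale (toy model only).
    (x0, y0), (x1, y1) = SMALL_WINDOW, FULL_SCREEN
    t = (math.log(window_pct) - math.log(x0)) / (math.log(x1) - math.log(x0))
    return math.exp(math.log(y0) + t * (math.log(y1) - math.log(y0)))

for pct in (2, 10, 25, 50, 100):
    print(f"{pct:>3}% window: ~{toy_sustained_nits(pct):.0f} nits (toy estimate)")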

Also worth noting: Samsung's layer diagrams for their QD-OLEDs have very few layers and even omit a polarizing layer, which further increases light transmission efficiency. And of course, in time, the move to a single color of subpixel (compared to RGB AMOLED; WOLED already has this) has the potential to drastically simplify production - printing/growing a pattern of identical dots is dramatically easier than doing the same three times over with highly accurate spacing. WOLED, however, needs a color filter layer, which is much less efficient than quantum dots.
#118
MarsM4N
goodeedididThat is not really true the C1 and C2 are exclusively with gaming in mind and burn-ins aren't really an issue anymore. You can hardly very very hardly cause burn-ins for example with the C1 even if you tried to. So yes the upcoming C2 will be a wicked gaming TV with so much more value than a dumb overpriced monitor.

"not designed for static content": Gaming is not static content.. duhhh lol


Not true, auto-AI brightness can be turned off.
The LG C1 & C2 Series are TV's designed for television purposes & changing content. If it's display would be suitable for PC use, they would have released a monitor long time ago.

And you do have static content in games (bright HUD). Not to mention the static content you have while not gaming (desktop & windowed apps). ;) Linus got burn-in.
#119
goodeedidid
ValantarAny data that is transferred constitutes a signal, no matter the means of transportation. You might want to look up what "signal" means too - it can be pretty much anything carrying any form of information, analog or digital. What you are saying here is pure nonsense. I mean, look at the categories in Wikipedia's article on signal processing...

I also think the entire field of electrical engineering might want to have a word with you.

I mean, what do you imagine a stream of data between a source and receiver constitutes, if not a signal?
Generally speaking everything can be considered a signal, but that is completely besides the point. The point is that you said that the "signal" is "processed" and you can't do anything about it, but that is completely untrue! What you mean to say is that the data or the image has been processed after the TV has received the digital data, which is called post processing, which is a completely different thing from what you wanted to say by "signal processing", which is not a term used for any digital gadget such as mobile phones and TVs. Digital TVs receive packets of data, and after that data has been received the TV applies post processing, and this is an important point because as a user you can choose what processing to apply or not to apply. In the case of the C1 you can probably turn off all post processing.
ValantarNot necessarily. Before the advent of game mode on TVs, many TVs had no way of disabling all processing.
That's besides the point, I'm just talking about the most modern TVs like the C1.
ValantarIt's fascinating to see someone who has absolutely zero idea what they are talking about constantly double down on their fundamental misconceptions. Like, do you believe that most consumer electronics grant you access to every possible setting? News flash: they don't. Most lock them down tight, exposing just a few. TV processing is much like this - there is a lot that isn't adjustable. Also, a lot of settings are adjustable but can't be disabled - but this of course depends on the specific TV and firmware. Thankfully, with PC and game modes, many TVs now allow the processing to be disabled entirely.

Also: you asked me for "proof" above, and examples. Not that I hadn't given them before, but they were repeated. Care to respond to that? 'Cause right now it just looks like you're working quite hard to dodge things that contradict your "arguments".
What is my fundamental misconception? That I keep on telling you that all TVs do is apply post processing, and that the "signal" they receive isn't "processed" as you claim it to be?
MarsM4Ntelevision purposes & changing content
Is this a quotation or your own words?
MarsM4NIf their displays were suitable for PC use, they would have released a monitor a long time ago.
I stated that the C1 is suitable for gaming on PC, not for other uses such as browsing the internet or office work. For that you don't need a big TV.
MarsM4NNot to mention the static content you have while not gaming (desktop & windowed apps). ;) Linus got burn-in.
I watched it and Linus clearly said "It shows no signs of burn-in whatsoever. And actually you would find plenty of testimonials from users online"

Also, they apparently talk about a model called the CX, which I think is not as high-end a TV as the C1 that I'm using. The C1 has even less chance of burn-in, even over incredibly prolonged usage periods. Also, the C1 can be configured to not have auto-dimming or anything auto. On the C1 the auto-dimming and auto-brightness are controlled by the "AI", but you can turn that "AI" off and have almost complete control of the whole image, something that Valantar does not agree with.
#120
Valantar
goodeedididGenerally speaking everything can be considered a signal, but that is completely besides the point. The point is that you said that the "signal" is "processed" and you can't do anything about it, but that is completely untrue! What you mean to say is that the data or the image has been processed after the TV has received the digital data, which is called post processing, which is a completely different thing from what you wanted to say by "signal processing", which is not a term used for any digital gadget such as mobile phones and TVs. Digital TVs receive packets of data, and after that data has been received the TV applies post processing, and this is an important point because as a user you can choose what processing to apply or not to apply. In the case of the C1 you can probably turn off all post processing.
Okay, so you admit that data that is transferred (anywhere) constitutes a signal, but it ... ceases to be a signal when it is being processed? Yeah, sorry, your logic is wildly incoherent. And "signal processing" is a very broadly applicable term across nearly all fields where it might be relevant, including digital AV equipment and computers. It literally does not matter whatsoever whether the processing is done algorithmically by an IC or through some analog device - both are equally well described by "signal processing". I mean, what do you imagine happens after the TV's SoC is done processing it? It's transmitted to the display panel. As a signal. So, you have data that is transferred to a TV as some sort of signal, but it conveniently ceases to be a signal just for the fraction of a second it takes to process it, before once again becoming a signal when transferred from the display controller SoC to the panel? Yeah, sorry, that logic is selective, incoherent, and ultimately arbitrary.

I've also never said that "you can't do anything about it", I've said that TVs up until the advent of game modes have often had no option to entirely turn off their (often quite aggressive) processing, which has resulted in high input lag and usability problems with PCs (some TVs have had "PC modes" but even those often didn't disable all processing, leaving in place things like sharpening and denoising).

Also, your application of the term "post processing" here is ... strange. Post (as in: "after") as to what, precisely? After the image is "done", i.e. finished, encoded, and transmitted? If that is the criterion, then by that definition all post processing is signal processing. However, the conventional use of "post processing" is not in relation to signals, but rather in relation to stored data - photoshopping an image, adding a filter to a video, adding a sharpening filter or AA to a game before that image data is transmitted; the post then refers to after the creation of the data (often an image), but crucially happens before this is considered "finished" and is transmitted anywhere. You could of course insist that any processing done by the TV is "post processing" because it happens after the signal source has sent the signal, but then we're talking about a distinction without a difference, and your argument collapses.
goodeedididThat's besides the point, I'm just talking about the most modern TVs like the C1.
I'm sorry, but you've got this twisted around. You're not arguing a point here. You're arguing against my points. Whatever delineation of your arguments you are claiming is meaningless, as you are arguing against my claims. Thus, the basis of my claims must be the basis for any further claims made - otherwise we're just not talking about the same thing, making the discussion meaningless.
goodeedididWhat is my fundamental misconception?
Let's see:
- The absurd idea that signal processing is a term only applicable to analog signals (which you seem to have abandoned for now, which I guess is progress?)
- The misconception that there is a meaningful difference between your unusual application of "post processing" and my use of signal processing
- The idea that users have total control over their consumer electronics. Heck, even LG's OLEDs have widely reported user-inaccessible settings (that some users are gaining access to through a service menu)
goodeedididThat I keep on telling you that all TVs do is apply post processing, and that the "signal" they receive isn't "processed" as you claim it to be?
Once again: that sentence explicitly contradicts itself. If a TV receives a signal, and then processes the data contained in that signal, before displaying it? It is then processing the signal, which we might perhaps call signal processing? I really can't understand why you're so gung-ho on this immaterial use of broadly applicable terminology. It ultimately doesn't make any kind of difference to my argument.
goodeedididI watched it and Linus clearly said "It shows no signs of burn-in whatsoever. And actually you would find plenty of testimonials from users online"
Like ... what kind of reality distortion bubble do you live in? 1:39 and forward: "This burn-in is a result of snapping my windows into the four corners of my display." While he is saying that, we can clearly see the image retention on-screen. 2:00 and forward, Wendell: "It is amazing, but what's not amazing is that I'm starting to see signs of burn-in." Do you need more?

Yes, the pixel refresher program alleviates the problem - but it's a band-aid that "fixes" it by literally wearing down every pixel so that the localized wear is less apparent. It is not a long-term solution, as it will literally wear out the display. The video makes this abundantly clear. That is in fact the entire message of the video: beware that depending on your use case, even brand-new OLEDs can burn in quite quickly. That doesn't mean it will happen - again, it depends on your use case - but they are by no means image retention-proof.
#121
goodeedidid
ValantarOkay, so you admit that data that is transferred (anywhere) constitutes a signal, but it ... ceases to be a signal when it is being processed?
The "signal" carries data, the "signal itself isn't important". Data arrives in packets right? After you receive those packets you can "process" them and that is called post processing, and you have the option to choose what to process on the LG C1 to such a degree that you can turn off auto brightness and dimming for example, or things like clarity and color tints. I'm specifically talking about the C1, which is still my point that you can't accept. So yes, after the TV receives the data then it is not a "signal" anymore, it is packets of data that are used to be translated into an image for you to see with your eyes. What is it so hard for you to understand??
ValantarI've also never said that "you can't do anything about it"
You suggested that the "processing" has already been baked in and you can't change it, which is untrue because, again, we're talking POST PROCESSING.
ValantarAlso, your application of the term "post processing" here is ... strange.
en.wikipedia.org/wiki/Video_post-processing (I can't see what's so strange about that.)
ValantarLike ... what kind of reality distortion bubble do you live in?
#122
Valantar
goodeedididThe "signal" carries data, the "signal itself isn't important".
No. The signal consists of the data. If there is no data, there is no signal, and if there is no signal, there is no data. The two are identical in this context.
goodeedididData arrives in packets right? After you receive those packets you can "process" them and that is called post processing
As I said, at that point we're discussing a distinction without a difference. Whether you call it signal processing (which frames it through the transmission process from source to display panel, including intermediate decoding and processing steps) or post processing (which frames it as happening specifically after the decoding of the signal, but before it is transferred - as another signal - to the display panel), the operation being described is exactly the same. I don't see how your objection to my use of "signal processing" in any way affects the validity of my arguments. (And, again, what happened to your "signal processing is just for analog!" line?)
goodeedididand you have the option to choose what to process on the LG C1 to such a degree that you can turn off auto brightness and dimming for example, or things like clarity and color tints. I'm specifically talking about the C1, which is still my point that you can't accept.
What? The inability to disable auto dimming (ABL) on LG OLEDs is widely reported. Some parts of the dimming behaviour are user adjustable, but not all of them - unless you gain access to the service menu. That is not intended usage and risks voiding your warranty, so ... yeah. Sorry, but you literally cannot disable it unless you gain access to parts of the configuration menu that LG explicitly blocks user access to. If your argument for "you can do this" is actually "you can do this if you buy not-for-consumer hardware from gray-market sources, potentially void your warranty, and exceed the manufacturer-specified usage parameters of the product", then ... well, no, you can't do that, unless you're really dead-set on doing so. This is not a user-accessible feature.

I mean, this discussion only started going off the rails after I pointed out that LG's filmmaker and game modes aren't an advantage but rather features meant to overcome previous disadvantages of TVs compared to monitors. Whether you are only talking about the C1 is irrelevant in that regard. I've never argued that the C series forces overprocessing on users - that's what filmmaker mode and game mode exist to remove! - I've said that these features exist specifically to alleviate what were previously inherent disadvantages of TVs compared to monitors (and which are still disadvantages of many TVs).
goodeedididSo yes, after the TV receives the data it is not a "signal" anymore; it is packets of data that are translated into an image for you to see with your eyes.
Okay. And this happens in the TV's SoC, right? How, then, does this data reach your eyes? The SoC isn't a display, after all. It has decoded the image signal, processed it, and encoded it into a new format, which is then transmitted, over wires, either through a display controller or - if that is integrated into the SoC - directly to the display panel. Which, again, makes it a signal. Heck, the photons carrying information to your eyes are a signal. This is precisely what I pointed out in a previous post: you're arbitrarily delineating the fraction of a second in which the data is decoded and changed as "not being a signal", despite this being an inherent part of a signal transmission and display pipeline. IMO, it would only meaningfully cease to be a signal if the data were at rest for any significant period of time - i.e. in some form of storage (not caching). That never happens. Thus, what the SoC performs is signal processing, as the purpose of what it does is to process the image signal before it reaches the display. You can call that post-processing, though the shortness of that Wiki article you shared does go some way towards indicating that this is not the most common usage of the term (it's more common in real-time 3D graphics, for example). But back to what I've been saying quite a few times now: this is a distinction without a difference. Whether we call it signal processing or post processing changes nothing.
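To put the same point in code-shaped terms, here's a rough Python sketch of the pipeline I'm describing (the stage names, data types and the trivial "enhancement" are my own illustrations, not any vendor's actual firmware):

from dataclasses import dataclass

@dataclass
class Frame:
    pixels: bytes       # decoded image data
    colour_space: str   # e.g. "BT.709" or "BT.2020"

def decode_input(bitstream: bytes) -> Frame:
    # Incoming HDMI/DP bitstream -> decoded frame (placeholder "decode").
    return Frame(pixels=bitstream, colour_space="BT.709")

def soc_processing(frame: Frame, game_mode: bool) -> Frame:
    # The contested stage: sharpening, motion smoothing, tone mapping and so on.
    # Game/filmmaker modes mostly just skip these steps to cut latency.
    if not game_mode:
        frame.pixels = bytes(min(255, p + 1) for p in frame.pixels)  # stand-in "enhancement"
    return frame

def encode_for_panel(frame: Frame) -> bytes:
    # Re-encoded and sent on to the timing controller / panel - a signal again.
    return frame.pixels

panel_signal = encode_for_panel(soc_processing(decode_input(b"\x10\x20\x30"), game_mode=True))
print(panel_signal)  # the data is "in transit" at every step; it never sits at rest

Whether you label the middle function signal processing or post processing, the data flow on either side of it is identical.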
goodeedididYou suggested that the "processing" has already been baked in and you can't change it, which is untrue because, again, we're talking POST PROCESSING.
What? Where have I suggested that? I've said that many/most TVs don't allow you to disable it, as the settings aren't available to do so. I've also said that the most common usage of post processing, such as in games, is in relation to changes baked into a signal before it is sent (you can't make a TV or monitor remove an AA pass from a game, for example, as the required data to do so doesn't exist at that point in the pipeline). But I've never, ever, said that the processing done by TVs is baked into anything apart from the signal path from TV SoC to display panel.
goodeedidid
Did you seriously miss the part where I gave you specific time codes from that video where both Linus and Wendell confirm that they are seeing burn-in? Again: Linus at 1:39, Wendell at ~2:00. (In case you can't see it, those time codes are both links to the relevant points in the video. Try clicking them, please.) I mean, the entire video is about them both getting burn-in on their new OLEDs after relatively short spans of time (and whether or not this is fixable). You're working really hard to pull out misleadingly selective quotes to support your argument here.
#123
Chomiq
First impressions from a Reddit user:
Monitors/comments/t75xwf
They're already available for order if you're signed up for Dell Premiere (or whatever it's called); the official launch is March 9th in the US.

Subpixel layout (compared to JOLED): [images]
#124
dir_d
Well, here's the review; it looks like Vincent confirmed that this monitor is the real deal. The only drawback seems to be text rendering, which can be helped by ClearType. I just hope Samsung Display makes more sizes because there really isn't a reason to buy any other panel besides this one.
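Worth noting that ClearType assumes a conventional RGB stripe, so the triangular QD-OLED subpixel layout can still show a little fringing even after running the built-in tuner (cttune.exe). As a small sketch - assuming the usual font-smoothing values under HKCU\Control Panel\Desktop, which may differ between Windows versions - you can at least check what the OS is currently configured for:

import winreg  # Windows only

DESKTOP_KEY = r"Control Panel\Desktop"

def read_desktop_value(name):
    # Returns the registry value if present, or None if it has never been set.
    try:
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, DESKTOP_KEY) as key:
            value, _ = winreg.QueryValueEx(key, name)
            return value
    except FileNotFoundError:
        return None

# FontSmoothing "2" = smoothing on, FontSmoothingType 2 = ClearType (1 = standard),
# FontSmoothingOrientation 1 = RGB stripe assumed, 0 = BGR.
for name in ("FontSmoothing", "FontSmoothingType", "FontSmoothingOrientation"):
    print(name, "=", read_desktop_value(name))

Running cttune.exe afterwards lets you pick whichever sample text fringes least on this particular layout.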

#125
lexluthermiester
dir_dI just hope Samsung Display makes more sizes because there really isn't a reason to buy any other panel besides this one.
Sure there is. Not everyone wants a curved panel, nor does everyone want an ultra-wide-aspect screen. So, for those two reasons, this display is a no-go.

Now flat-screen 16:9? Heck yeah! I'd take two.