Yes, there is a lot of speculation happening in your post.
There is also a lot of overcooked language to go with it, such as:
"very real" and "very near" in the same sentence
"silly"
"just absurd"
Who? I thought this was just speculation?
They stopped because the cost of making OLED monitors still prices them out of range for most consumers. Dell stopped making the $5000 monitor because it was not selling in big enough numbers to make it worthwhile.
tftcentral said it was "quickly pulled from the market," which implies there was a technical problem.
The linked article suggests Dell was unhappy with the color-drift problem when the panel is viewed from the sides. But I note that most computer users sit directly in front of their monitors.
Dell never looked at the monitor before it put it on the market? That site wouldn't happen to be partially an evangelism outfit for OLED, right? (Since we're speculating.)
https://www.oled-info.com/dell-brings-back-its-up3017q-30-4k-oled-monitor-and-slashes-price-1500
So IMO, with poor sales, there was no incentive (read: profit $$$) to entice Dell to invest in further R&D.
Yes, this makes great sense. Dell was the first to market with this fabulous new OLED tech and, after spending all that money to bring a real product to market (one that actually went on sale, rather than being prototype/demo vaporware), it pulled it from the market instead of doing the horrible thing of building on its investment.
There is the very real potential for manufacturing costs in the very near future for larger OLED displays to be less than current LCD technologies.
Somehow, large-format OLED is being sold in consumer-grade products, but it's just too costly to sell in prosumer and pro monitor-sized packages? There is at least one flaw in the technology that prevents it from entering those spaces, possibly several. The one lone monitor (has it shipped yet?) following the panel Dell removed from the market has a diagonal of 21.6" or so, and a low brightness level.
The laws of physics are a problem for some products, regardless of how much effort engineers put into working around them. It may be that the blue pixels of OLED, for instance, can't be fixed adequately for the purpose of introducing a prosumer-or-above monitor with HDR-grade brightness, or even something less bright.
The recently announced small monitor makes a point of saying that it is not using the white-subpixel workaround, as I recall. The wording was unclear, so it may refer to something else. Interpreting it as referring to the white-subpixel implementation (a workaround to help alleviate the inherent lifespan problem of blue OLED) helps to explain the panel's low brightness. It also suggests that the white-subpixel workaround is not a great solution for the prosumer/pro monitor space.
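To make that concrete, here is a toy calculation (Python, with made-up numbers; the function name and figures are mine, not from any source) of how an unfiltered white subpixel can take luminance load off the color emitters, and why leaving it out tends to mean either a dimmer panel or harder-driven blue emitters:

# Toy model of the white-subpixel workaround. All numbers are illustrative
# assumptions; real WRGB-style panels are far more complicated than this.

def color_emitter_load(target_nits, white_share):
    """Luminance the color (R/G/B) emitters must supply for a full-white
    field when an unfiltered white subpixel carries `white_share` of it."""
    return target_nits * (1.0 - white_share)

for share in (0.0, 0.3, 0.5):
    load = color_emitter_load(400, share)
    print(f"white subpixel carries {share:.0%}: color emitters supply {load:.0f} of 400 nits")

# With no white subpixel (share = 0.0), the blue emitter carries its full
# portion of the 400 nits, so the panel either runs dimmer or ages faster.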
So when those manufacturing techniques improve enough so makers can produce bigger displays more efficiently, OLED monitors will return in bigger numbers - unless some yet to be discovered better technology comes along first.
In other words, when the tech has improved to be good enough, it will be good enough. Otherwise, something else will sell. This is news?
The idea that just because burn in "can" affect OLED displays is preventing sales is just silly. "Can" does NOT mean "will".
What's silly is the dodging you continue to do around this issue. Two professional sources have been posted that rebut your posts and yet you continue.
And the suggestion there has been no progress in marginalizing the disadvantages identified years ago with early generations OLEDs (like burn in and blue luminance issues) is just absurd.
Overwrought dismissive language doesn't create substance.
And it is important to remember the significant advantages that OLED already has. It supports significantly faster response times, into the µseconds. And unless displaying pure white, OLEDs enjoy significantly lower energy consumption. OLEDs do not use a backlight, so they have thinner panels than LCDs. And because there is no backlight and the individual pixels can be turned off, an OLED panel can produce pure, absolute black for better contrast ratios. And they have better viewing angles.
Plasma had important advantages over LCD and yet where are the plasma desktop monitors? Where are the plasma televisions? Why did plasma get killed off well before OLED became a dominant standard?
No matter how many positives a product has, its negatives also matter. Plasma wasn't very compatible with the fad of selling people increasingly tiny pixel pitches, pixels smaller than they can possibly see from normal TV/film viewing distances. It used more power than the efficiency marketing (which LCD makers used successfully, despite its lack of big-picture relevance) made seem reasonable. It had retention and burn-in problems, like OLED. It was heavy.

OLED may face a significant problem from the pixel-shrinkage obsession that manufacturers have used to convince people to replace their televisions. As the pixels get smaller and the brightness increases (i.e. HDR), those blue pixels face increasing strain on their individual lifespans. It may be that retention and burn-in become bigger problems as the pixels shrink. There is also the problem of color drift. If, for instance, the blue pixels dim from use faster than the other colors, then the color gamut could shrink over time. Not only would the monitor need a lot of calibration (which is common with pros already), but the shrunken gamut could make the panel unusable by graphics pros.
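To illustrate the color-drift worry, here is a rough sketch (Python; the decay rates and luminance split are hypothetical numbers chosen purely for illustration, since real OLED aging depends on drive level, materials, and any compensation circuitry) of what happens to blue's contribution to white if the blue subpixels dim faster than the others:

import math

# Hypothetical exponential decay rates per 1,000 hours of use (made up).
decay_per_khr = {"red": 0.010, "green": 0.010, "blue": 0.040}
# Rough luminance each primary contributes to a 300-nit white (illustrative).
initial_nits = {"red": 65.0, "green": 220.0, "blue": 15.0}

def luminance(channel, khr):
    return initial_nits[channel] * math.exp(-decay_per_khr[channel] * khr)

for khr in (0, 5, 10):
    lum = {c: luminance(c, khr) for c in initial_nits}
    blue_share = lum["blue"] / sum(lum.values())
    print(f"{khr * 1000:>6} hours: blue share of white = {blue_share:.3f}")

# If blue falls off faster, recalibration can only restore the white point by
# pulling red and green down to match, which is the gamut and brightness
# shrinkage described above.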
In the beginning, many felt the significant disadvantages of the LCD monitor would mean the CRT monitor would never go away. It is common for early-generation technologies to have substantial disadvantages at first.
It's also hardly unheard of for technologies to fail, or to become so niche that mass-market corporations don't invest in improving them. The main evangelist of plasma said plasma could be improved to use a lot less power and to work with 4K. Corporations, though, made the decision not to pursue plasma further.
Plasma screens were once used as computer monitors, too. Maybe we'll see OLED displays become commonplace as computer monitors. Perhaps we won't. It all depends on the severity of the drawbacks when compared with the competing technologies. There was big demand for a better-performing PowerPC than the G4 in Mac laptops, and Apple/IBM couldn't get the G5/Power architecture to work within a laptop's power envelope. So it just didn't happen. Apple had to abandon Power in favor of Intel. The Power architecture had plenty of virtues and could have been improved. However, a mobile Power CPU, particularly one for Apple, was never developed.
The things I said in my post have been upheld. I said OLED has not displaced LCD for computer monitors. I said OLED is vulnerable to retention and burn-in, something two independent sources confirmed. If OLED were truly so great, in the sense of not having serious issues along the lines I discussed, it wouldn't have been pulled from the market. We wouldn't see only giant OLED televisions at reachable consumer prices, consumer-grade products and nothing more. We would see pro-level products, halo products, particularly given that laundry list of advantages you posted. Of course people love the contrast ratio of OLED. Pro photo/graphics people certainly aren't happy with the horrible contrast ratios of IPS, or the black crush/angle problems of VA. It's obvious that people want OLED. The question is... is it good enough? Technology typically trickles down. Reference panels, pro-grade monitors for photography... those kinds of things sit above, not below, the $3000 jumbo consumer TVs that have been readily available for years now.
Dell had no problem selling professional IPS screens with low contrast ratios (one popular model was below 800:1 when calibrated). If OLED were truly ready, I seriously doubt something as minor as off-angle viewing performance would have caused the company to pull its product. The contrast-ratio advantage by itself clearly matters more than off-angle consistency, provided that ratio doesn't come with the face-on black crush that VA has. So it's clear that other, more critical flaws are in play.
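For reference, contrast ratio is just white luminance divided by black luminance. The figures below are hypothetical, merely in the ballpark of a calibrated IPS panel like the one mentioned above:

# Contrast ratio = full-white luminance / full-black luminance.
ips_white_nits = 120.0   # hypothetical calibrated white level, cd/m2
ips_black_nits = 0.16    # hypothetical IPS black level, cd/m2
print(f"IPS example: {ips_white_nits / ips_black_nits:.0f}:1")   # ~750:1

# An OLED pixel can be switched fully off, so its black level is effectively
# zero and the measured ratio becomes enormous; that is the advantage being
# weighed against off-angle consistency above.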
About the image retention of OLED, that is not really an issue. The degradation of pixels (because thát is the core issue) happens with EVERY type of panel, to different degrees, and for OLED it's just that blues get weaker faster than the others.
OLED is a lot different from other panel types. We can't just say "Well, they all have problems" and hand-wave away the blue-subpixel longevity issue. That is, unless it truly has been fixed, such as with a new material. The white-subpixel workaround may be enough for televisions at 4K, but what about the 8K propaganda that has already been around for two years or more? It's only going to speed up. People will be convinced that they really can see the difference at normal TV viewing distances, even though they can't (even though 1440p would have been good enough for HDTV/film at normal, not desktop, distances). HDR is being heavily pushed, putting intense pressure on the panel to produce a lot of brightness. There is no conventional backlight to produce that brightness in OLED.
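A back-of-the-envelope calculation shows why 8K plus HDR piles stress onto those emitters. The assumptions are mine: subpixel emitting area scales with pixel count at a fixed panel size, and the fixed overhead of wiring and transistors (which only makes it worse) is ignored.

# 8K has four times the pixels of 4K at the same panel size, so each
# subpixel gets roughly a quarter of the emitting area.
pixels_4k = 3840 * 2160
pixels_8k = 7680 * 4320
area_ratio = pixels_4k / pixels_8k   # relative emitter area per subpixel, 8K vs 4K

print(f"8K has {pixels_8k / pixels_4k:.0f}x the pixels of 4K")
print(f"each 8K subpixel has roughly {area_ratio:.2f}x the area of a 4K one")
print(f"matching panel brightness needs about {1 / area_ratio:.0f}x the output per unit of emitter area")

# Driving a smaller emitter harder per unit area is exactly the condition that
# shortens blue OLED lifespan, and HDR then asks for still more light on top.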
Retention and burn-in are far more of a problem for some panel technologies than others. CRT is more prone to burn-in than TN and VA LCD, for instance. Plasma is really prone to retention and is vulnerable to burn-in. IPS is possibly more vulnerable to burn-in than TN and VA, although I wonder whether the testing that suggested this used a constant-control backlight in the IPS panel rather than PWM. It seems reasonable to assume that a particularly low-duty-cycle PWM would make a panel much less prone to retention and burn-in than a constant-control backlight.
Plasma and CRT are dead now. They're dead because, despite their particular strengths, their drawbacks outweighed those strengths in the marketplace. I'll bet, for instance, that plasma is less prone to fading in its blue and violet output than OLED is. I've never seen anyone have to work around a blue-fading problem in plasma by adding white to the mix. CRT is the best tech, as far as I know, for input lag. Plasma has better uniformity than LCD, better contrast than the vast majority of LCD panels, and better off-angle gamma/color consistency than LCD.
Over time, OLED panels will gradually lose peak brightness, which isn't stellar to begin with. But yes, 10-15 years is what they are rated for and I have no worries they won't make that.
Which ones? The panels in the Dell monitor that Dell took off the market? Or are we talking about the lone replacement, which has a low max brightness fresh out of the box, just as HDR is becoming big in the marketplace?
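As for the "10-15 years" figure, panel lifetime is normally rated in hours to some luminance threshold (for example, hours until brightness halves), so how many years that buys depends entirely on usage. The rated-hours number below is an assumed placeholder for illustration, not a spec for these panels:

# Convert an hours-to-threshold lifetime rating into calendar years for a few
# usage patterns. 30,000 hours is an assumed, illustrative figure only.
rated_hours = 30_000

for hours_per_day in (4, 8, 12):
    years = rated_hours / (hours_per_day * 365)
    print(f"{hours_per_day:>2} h/day -> about {years:.0f} years to the rated threshold")

# Brightness declines continuously before that threshold is ever reached, which
# is why "fresh out of the box" peak brightness matters for HDR work.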