
What is the PC monitors future trend in terms of resolution and aspect ratio?

It is my personal opinion that monitors, much like audio gear, have reached a reasonable maximum in recent years. Both are beyond the physical abilities of even an above-average human, and now it's just marketing gibberish, pointless gimmicks and the usual fight to have a larger number on the box, posing as progress. And, of course, planned obsolescence to force people to buy "new" things, which is why manufacturers push the self-destroying OLED so much.
 
I don't agree with this part:
which is why manufacturers push the self-destroying OLED so much.
That part is nonsense.

See this thread, then read my post #5 where I pointed out:
And while some misinformed folks worry about burn-in, they needn't. As noted here,

burn-in shouldn't be a problem for most people. Burn-in is typically caused by leaving a single, static image element, like a channel logo, on the screen for a long time, repeatedly. That's an issue if you keep Fox News, ESPN or MSNBC on for multiple hours every day and don't watch enough other programming, for example. But as long as you vary what's displayed, chances are you'll never experience burn-in.

As I also noted there,
Plus, even my 7-year-old OLED TV (which has no visible burn-in issues) has built-in features to mitigate burn-in. For example, it uses pixel-shift technology when it senses a static logo being displayed in the same spot: it will quickly move the image 2-3 pixels in different directions to keep the same pixels from being lit with the same color for too long. FTR, I have never noticed this shift in action, which makes sense since there are millions of pixels on this screen.

So yes, burn-in technically is possible with OLED technologies, but TV and monitor makers learned long ago how to prevent it from occurring - at least for most users.

I also don't agree with your comment about planned obsolescence. I think products in general are not made "like they used to be" and many don't have the longevity of those made yesteryear. But that is a consequence of consumers demanding cheaper prices and company shareholders demanding more "immediate" profits. That is, manufacturers are not planning on these products failing after a predetermined, known period of time (the definition of planned obsolescence); it is just happening as a result of cheaper components going inside.

That said, that 7-year-old OLED of mine is still going strong, as is the 16-year-old Samsung LCD TV (now in the basement den) that the OLED replaced.
 
I don't consider curved displays to be displays, sorry. Rather brain damage catalysts.

Slap me when 5120x2160 at 100 Hz with no curvature whatsoever fits into ~35" and is delivered with a decent IPS panel and a price tag of no more than 5 Franklins. I'll be stocking up on those, no doubt.
Ahhh.... what now?
I've been using a curved 34" 3440x1440 OLED for the last 16 months; I'm enjoying it and my brain is better than ever.
Curved panels are just for personal viewing. Not suitable for multi-person viewing.
 
Curved panels are just for personal viewing. Not suitable for multi-person viewing.
^^^This^^^. I agree with Zach.

I personally don't care for them but I sure see the appeal for some computer users who tend to sit, more or less, stationary, front and center of the screen. You can achieve a wrap-around effect without having to block out the bezels encountered with multiple flat screens.

I don't understand them for TV viewing with multiple viewers as anyone sitting off to the side will see viewing angle anomalies. Perhaps that is why curved, big screen TV sales are not doing well.
 
A long, long time ago, we had CRT monitors with a 4:3 aspect ratio that averaged 1024x768 (640x480 to 1600x1200, with possibly a couple of models beyond those boundaries).
Then came the first-gen 4:3 (1024x768) and 5:4 (1280x1024) LCD panels.
After that, 16:10 displays (1280x800; 1440x900; 1680x1050; 1920x1200; 2560x1600).
And then 16:9 (1280x720; 1366x768; 1600x900; 1920x1080; 2560x1440; 3840x2160, etc.).
For about a decade, we have also had side options such as 21:9 (2560x1080 and, more common these days, 3440x1440) and wider panels, but these are fairly niche and expensive options.
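For anyone who wants to sanity-check those ratios, here's a quick, purely illustrative Python sketch that reduces a few of the resolutions above to their simplest whole-number ratios:

```python
from math import gcd

# Resolutions mentioned above, reduced to their simplest whole-number ratios.
resolutions = [
    (1024, 768),   # classic CRT / early LCD
    (1280, 1024),  # first-gen 5:4 LCD
    (1920, 1200),  # 16:10 era
    (1920, 1080),  # 16:9 mainstream
    (2560, 1080),  # marketed as 21:9
    (3440, 1440),  # marketed as 21:9
]

for w, h in resolutions:
    d = gcd(w, h)
    print(f"{w}x{h} -> {w // d}:{h // d}")
```

Note that the reducer prints 8:5 rather than 16:10, and the "21:9" ultrawides actually come out as 64:27 and 43:18; 21:9 is marketing shorthand rather than an exact ratio.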

Is it safe to assume we'll have a shift to 21:9 as the standard? Will 2160p displays of such an aspect ratio emerge?
Nah, I don't think 21:9 is going to be very common outside the monitor world. Perhaps it will expand to gaming beyond the PC (so consoles), and we might see some tiny segment of TVs at 21:9, just like how TVs have bled into the monitor space.

Will we get bigger 21:9? Sure. But I think overall the display market is already adjusting to producing a wild range of aspect ratios and diagonals by now. Things will be more diverse.
 
It is my personal opinion that monitors, much like audio gear, have reached a reasonable maximum in recent years. Both are beyond the physical abilities of even an above-average human, and now it's just marketing gibberish, pointless gimmicks and the usual fight to have a larger number on the box, posing as progress. And, of course, planned obsolescence to force people to buy "new" things, which is why manufacturers push the self-destroying OLED so much.

That's a good point if we're talking resolution. Going higher res was always mostly about mitigating aliasing, and for me, coming from a ~22" 1920x1080 to a 35" 3440x1440 was like finally saying goodbye to those damn jaggies - I see them so seldom now that I'm surprised when I do, so it's a game-graphics problem, not the resolution itself. Pixel-density wise it's not that much of an upgrade, but that ignores the fact that I also had to roughly double my viewing distance, and that is what really does the job here. Now I feel like nothing pushes me toward higher res. It would obviously be nicer, but the performance cost doesn't seem convincing even with my 4090. I do wonder what the bigger 3440x1440 screens are like, such as the popular whopping ~45" ones - a pixelated mess, or would sitting even further back once again do the job?
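The "not much of a density upgrade" part checks out if you run the numbers. A rough sketch of the PPI and viewing-angle arithmetic (the 24"/48" distances are assumed examples, not figures from the post):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(ppi_value, distance_in):
    """How many pixels fit into one degree of the viewer's field of view."""
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi_value * inches_per_degree

old = ppi(1920, 1080, 22)   # ~100 PPI
new = ppi(3440, 1440, 35)   # ~107 PPI - barely denser
print(f"old: {old:.0f} PPI, new: {new:.0f} PPI")

# Doubling the viewing distance (illustrative 24 in -> 48 in) roughly doubles
# pixels-per-degree, which is what actually hides the jaggies.
print(f"old at 24 in: {pixels_per_degree(old, 24):.0f} px/deg")
print(f"new at 48 in: {pixels_per_degree(new, 48):.0f} px/deg")
```

Roughly the same PPI, but at double the distance each pixel covers half the angle, which is where the cleaner image comes from.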
 
Not suitable for multi-person viewing.
That's why I never consider curved displays. Hotseat Worms parties, football simulators, movie watching, you name it. Happens weekly.
Curved displays also ruin your flat-display experience: after getting used to the curve, a flat screen looks like it bulges in the middle.

And please stop discussing OLED and other panel stuff; you've literally got another thread for panel technologies. Resolution debates only.
 
Call me old school, but I prefer a 4:3 or 5:4 display. Maybe it's because of a lack of desk space, or playing older games, or the monitor being in close proximity to my eyes. With a wider aspect ratio you can of course see more, but your eyes fixate on the middle of the screen anyway.

I still think 1080p (or should I say 16:9) is going to stay the mainstream resolution and aspect ratio, since most will target the common resolution. I also still stick to 1080p because I don't wanna spend on the latest and greatest graphics card to drive a higher-resolution monitor when new games come around. Most midrange GPUs like the RX 6600 target that resolution. With GPU prices getting higher and higher, I find little reason to spend more on a higher-resolution monitor.
 
With a wider aspect ratio you can of course see more, but your eyes fixate on the middle of the screen anyway.
Perhaps it's just me, or it's my head traumas doing head-trauma things (I was diagnosed with a concussion 9 times in my life, once after being hit by a truck), but my dead zones are at the top and bottom, so I'm cruising with 21:9 and have unused pixels when it's 16:9 or "taller." I'm currently a 16:9 display owner and I literally can't see the upper ~300 (out of 2160) pixels because of that; or the lower 300 pixels, or 150+150, or whatnot. Still a great experience, I really do like my monitor.
I also still stick to 1080p because I don't wanna spend on the latest and greatest graphics card to drive a higher-resolution monitor when new games come around.
4K display + DLSS/FSR/XeSS Performance = excellent UI (especially outside gaming) + acceptable framerates (of course plain 1080p is significantly faster) + better detail compared to 1080p. I've got a 6700 XT; almost everything is enjoyable at reasonable settings at 60 FPS, with only the most ridiculous titles such as Avatar and Lords of the Fallen put on the "maybe later" list.
Of course if you don't see how anything below 120 FPS can be fine you should go for 1440p or 1080p, avoiding 4K and beyond altogether. But for me, 60 FPS is okay. Wouldn't mind having more but my wallet is the opposite of thick at the moment.
 
I still think 1080p (or should I say 16:9) is going to stay the mainstream resolution and aspect ratio, since most will target the common resolution. I also still stick to 1080p because I don't wanna spend on the latest and greatest graphics card to drive a higher-resolution monitor when new games come around.
Nah, the primary reason we have HiDPI panels is text legibility. This all started with the Retina Display on iPhone.

The benefit of HiDPI panels is blatantly obvious when looking at logographic character systems (Chinese, Japanese, some other Asian scripts, maybe Arabic as well). It is not so important for Western alphabets like the one used here at TPU.

Apple understood the Retina Display would benefit the large part of their user base that doesn't happen to use Roman characters. For many of this planet's inhabitants, HiDPI improves character legibility immensely. You don't even need to be able to read these languages to recognize the obvious improvement.

And modern operating systems have adjusted antialiasing techniques to take advantage of these higher resolution displays.

Yes, this does put a strain on GPUs for gaming but something like 3840x2160p at 200% (1920x1080p effectively) for text is superb on a 27" monitor.

But even if you read/write Western languages, if you're staring at Excel spreadsheets filled with Arabic numerals all day, you'll see how HiDPI displays can make reading easier, especially once you pass 40 years of age and your eyes are old and tired. Let's remember that most monitors and computer displays spend a far larger total amount of time doing productivity related things, not gaming or YouTube video playback.
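For anyone unclear on what "200% scaling" actually buys you, here's a quick sketch of the arithmetic. The 27" 4K line is the example from the post above; the other two rows are assumed comparisons (a same-size non-HiDPI panel and a 14" FHD notebook panel):

```python
import math

def scaled_layout(width_px, height_px, diagonal_in, scale_pct):
    """Physical PPI plus the logical ("effective") resolution the OS lays text out at."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    logical = (round(width_px * 100 / scale_pct), round(height_px * 100 / scale_pct))
    return ppi, logical

configs = [
    (3840, 2160, 27, 200),  # the 27" 4K @ 200% example above
    (1920, 1080, 27, 100),  # assumed comparison: same size, non-HiDPI
    (1920, 1080, 14, 100),  # assumed comparison: a 14" FHD notebook panel
]

for w, h, diag, scale in configs:
    ppi, (lw, lh) = scaled_layout(w, h, diag, scale)
    print(f'{w}x{h} @ {diag}" {scale}%: {ppi:.0f} PPI, UI laid out at {lw}x{lh}')
```

The 4K-at-200% and the 27" FHD cases produce the same logical layout; the 4K panel just renders each logical pixel with a 2x2 block of physical pixels, which is where the text-legibility gain comes from.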
 
3840x2160p at 200% (1920x1080p effectively) for text is superb on a 27" monitor.
This is precisely how I roll. Magnificent. Arabic is legible at 720p though, can't complain about that. Not that I can read it but my dad (native Arabic speaker) doesn't struggle at all.
 
This is precisely how I roll. Magnificent. Arabic is legible at 720p though, can't complain about that. Not that I can read it but my dad (native Arabic speaker) doesn't struggle at all.
HiDPI text legibility superiority is one of the main reasons why my notebook computer (Acer with 14" FHD display, 1920x1080p) spends days -- sometimes weeks -- without being cracked open.

It's simply far gentler on my tired old eyes to read on my 27" 4K monitor (200% UI scaling).

Hell, if I want to read something online on the couch, at the pool, in bed, I'll grab my iPad before I grab the notebook computer. Again, the Retina Display blows doors on the regular non-HiDPI 100% display on the notebook.
 
HiDPI text legibility superiority is one of the main reasons why my notebook computer (Acer with 14" FHD display, 1920x1080p) spends days -- sometimes weeks -- without being cracked open.

It's simply far gentler on my tired old eyes to read on my 27" 4K monitor (200% UI scaling).
Yeah, and I've got some personal issues with that, too. My vision is perfectly fine; however, what most people deem okay is ridiculously small text for me. That's why I sometimes use a lot of scaling, so my eyes don't "wear out."

I view TPU at 110% zoom (27", 4K, 200% global scaling, about 40"/102 cm from the display), for example. Anything lower is not comfortable.
 
I think 16:9 and 16:10 will continue to be standard for laptop screens, at least. Ultra-wides are a niche that fits a few people's use cases - nothing against them - but they don't work for most of us. Your average person has a single window maximized most of the time, rather than two side by side. Until the average movie and YT video are filmed at 21:9 or wider, very few screens will be sold in those dimensions.

Having no personal experience with resolutions higher than 1080p, I can't speak for myself, but I have read a couple of studies about resolution. The tl;dr is: 4K is a very worthwhile upgrade, 8K is not. The average person can tell the difference between 2K and 4K at normal viewing distance, even sometimes on phone-size screens - but only people with unusually good vision (20/15 or better), sitting closer than normal to the screen, can tell 4K from 8K, and even then only sometimes. Expect to see a huge push for 8K from companies like Samsung and Nvidia who have an interest in it, though.
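The back-of-envelope version of those studies: 20/20 vision resolves roughly one arcminute, so extra resolution stops being visible once a single pixel subtends less than that at your seating distance. A rough sketch under those assumptions (the 65" screen size is an assumed example, and 20/15 vision is approximated as ~0.75 arcminute):

```python
import math

ARCMIN = math.radians(1 / 60)  # ~1 arcminute, the classic 20/20 acuity limit

def max_useful_distance(width_px, diagonal_in, aspect=16 / 9, acuity_arcmin=1.0):
    """Distance (inches) beyond which individual pixels can no longer be resolved."""
    screen_width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_in = screen_width_in / width_px
    return pixel_in / math.tan(acuity_arcmin * ARCMIN)

for label, w in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    d = max_useful_distance(w, 65)                            # assumed 65" screen, 20/20 vision
    d_sharp = max_useful_distance(w, 65, acuity_arcmin=0.75)  # roughly 20/15 vision
    print(f"{label}: pixels blend together beyond ~{d / 12:.1f} ft "
          f"(~{d_sharp / 12:.1f} ft with 20/15 vision)")
```

With those assumptions, 4K pixels stop being resolvable beyond roughly four feet on a 65" screen, so 8K only helps viewers who sit closer than that or have sharper-than-20/20 eyes, which lines up with the studies' conclusion.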

I'm not a fan of OLEDs, being too bothered by the nannying you have to do for them to appreciate the admittedly much darker blacks. (I don't think other colors look that much better on OLED vs. a good IPS.) I'm sure we'll get a better technology soon, whether that's improved microLED or miniLED. Though as long as we're limited to RGB color space, display colors can never look truly lifelike.
 
21:9 is more natural to human vision, 16:9 was thrust upon us by those a step up from a boxed view of the world. :laugh:
 
When MicroLED becomes feasible, it will probably be the de facto standard. It's too good to ignore. The thing is, we don't need to keep upgrading resolution. But in the PC world, who knows. Broadcast TV, in my opinion, will take a long time to upgrade. Not so for the PC display industry. 8K is pointless to upgrade to. Why not wait for 16K or 32K or 64K? I guess what I am saying is, if you are going to make another standard like 1080p or 4K... why do it when it's only double? Just wait a decade and let something else settle in to be the standard for the next 20 years. My opinion.
 
Almost any first-person shooter/action game is tenfold better with the extended FOV. Also really nice in racing games. It's really far from niche; a lot of people don't even know they want a 21:9 display. It's also great for movies, alas there's still no consensus to be seen - every director picks their own aspect ratio.
No, it really depends on the game. Open-world FPS yes, corridor shooters (Valorant, CS2, etc.) no. I have tried everything possible: 21:9, 16:9, 32:9. For competitive shooters a 27" 16:9 is the go-to. For single-player games a 42" 4K OLED is the GOAT.
 
I always said I would be happy when we had 4K120 (a decade before it became a reality). I am happy with 4K120 and probably always will be... especially since we have VRR and ever-improving upscaling these days.

That said, I will probably eventually buy something that is 4K240, because that is without a doubt the direction we are headed in the next couple of years or so.

I'm not saying there isn't a place for other aspect ratios and those that need/enjoy them (I always kinda liked 2:1...), only that, wrt my personal requirements, they have been met, and I think many normies are similar.

Now I look forward to things like 4000-nit HDR/1000-nit full-screen on huge, relatively cheap LCDs... with 4K 240 Hz capability (or hell, even bring back 3D in some newfangled way if somebody wants to do that).

That, I feel, will be the next worthwhile upgrade wrt displays. Mostly for the size/price more than anything else.

At least in my house. I know others have different priorities. I'm just here representing liking to be on the cutting-edge of what is eventually mainstream.
 
I don't think 21:9 will become the standard for all entertainment, considering the push for 4K. It's more like a niche that a few of us gamers enjoy.
 
I don't get the HDR concept anyway. If you want HDR you need to increase the brightness, but on an LCD monitor that means you'll lose the blacks. So what's the point? If you ask me, no LCD panel should have HDR.
Why? If it's there and you don't like it, you can turn it off; but if it isn't there, then anyone who likes it can't turn it on.
We already have 8K TVs, with prices trending steadily downward.
Other than demo videos, is there any 8K content available for them (i.e. movies and TV shows; even 4K content, especially TV shows, was pretty rare not so long ago)? There's also very little content shot above 24 fps, which makes anything with fast movement either blurry or stuttery. I don't really get it: it's easier to lower the frame rate of a high-frame-rate movie or TV show than to increase it, since the former requires no extra data while the latter does, so displays could have filters that lower the frame rate for people who prefer the low-fps look.

In theory, I'd like an affordable (under 500 EUR) 16K 32-36" display with a 1000 Hz+ refresh rate, a color gamut slightly wider than human vision, more than just RGB subpixels (I'm pretty sure you'd need that for a gamut slightly wider than human vision), 32 bits per color channel, the ability to show a completely black pixel and an extremely bright pixel right next to each other, enough brightness to be usable in direct sunlight (as in, bright enough to overpower it), no burn-in issues, no loss of brightness over time, and no glossy coating (sorry, if I wanted a mirror I'd buy a mirror; glossy is just ugly under normal conditions). And a display connector that actually has the bandwidth to run it without resorting to lossy compression (even if it's supposedly not noticeable).

In practice that isn't happening any time soon. And when it does happen it won't be affordable.
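To put a number on "not any time soon": the raw, uncompressed link bandwidth for that wish list dwarfs anything shipping today. A rough sketch, taking 16K as 15360x8640 and assuming four subpixels per pixel; the ~80 Gbit/s comparison is the raw link rate of DisplayPort 2.1 UHBR20:

```python
def raw_gbps(width, height, fps, bits_per_channel, channels=3):
    """Uncompressed pixel-data rate in Gbit/s (ignores blanking and link overhead)."""
    return width * height * fps * bits_per_channel * channels / 1e9

today = raw_gbps(3840, 2160, 144, 10)                 # 4K 144 Hz, 10-bit RGB
wish = raw_gbps(15360, 8640, 1000, 32, channels=4)    # "16K", 1000 Hz, 32-bit, 4 subpixels

print(f"4K144 10-bit:      ~{today:,.0f} Gbit/s")
print(f"wish-list display: ~{wish:,.0f} Gbit/s")
# For scale: DisplayPort 2.1 UHBR20 offers roughly 80 Gbit/s of raw link rate.
print(f"vs. an ~80 Gbit/s link: ~{wish / 80:,.0f}x over")
```

That works out to tens of terabits per second, hundreds of times beyond any current display link even before you worry about the panel itself.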
 
Other than demo videos is there any 8k content available for them (aka movies and tv shows... even 4k content, especially tv shows was pretty rare not so long ago)? There's even very few content that's shot above 24 fps which makes anything with fast movement either blurry or stuttery (don't really get it, it's easier to lower the frame rate of a high frame rate movie/tv show than it is to increase it since the first requires no extra data while the other requires extra data, so the displays could have filters that would lower the frame rate for people who prefer the low fps look).

Publicly the cinema industry doesn't release that many 8K movies, but at least there is a list. https://www.imdb.com/list/ls098792662/
There is no reason for them not to shoot everything at 8K.
Look, 20 years ago when we were still using 4:3 CRT TVs, all movies, music videos, etc. were shot at 16:9. Why?
It's the same here and now: they shoot at 8K, 16K or whatever resolution they like, but keep quiet about it and release almost everything at 1080p, with some at 4K...
 
Publicly the cinema industry doesn't release that many 8K movies, but at least there is a list. https://www.imdb.com/list/ls098792662/
There is no reason for them not to shoot everything at 8K.
Look, 20 years ago when we were still using 4:3 CRT TVs, all movies, music videos, etc. were shot at 16:9. Why?
Theatrical presentation. It's not a surprise that 16:9 was selected as the compromise aspect ratio for HD. At roughly 1.78:1, it's very close to the standard widescreen film aspect ratio of 1.85:1. On a 1920x1080 FHD display, 1.85:1 content needs fairly minimal letterboxing, about 21 pixels top and bottom.
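The 21-pixel figure is easy to verify; a quick sketch of the letterbox arithmetic on a 1920x1080 panel (the 2.39:1 "scope" ratio is added purely for comparison):

```python
def letterbox_bars(screen_w, screen_h, content_ratio):
    """Height (px) of each black bar when wider-than-screen content is fit to width."""
    content_h = screen_w / content_ratio
    return (screen_h - content_h) / 2

for name, ratio in [("1.85:1 widescreen", 1.85), ("2.39:1 scope", 2.39)]:
    bar = letterbox_bars(1920, 1080, ratio)
    print(f"{name} on 1920x1080: ~{bar:.0f} px top and bottom")
```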

And a lot of the production choices for content like music videos were forward thinking anyhow. 16:9 was part of the ATSC standard set by the FCC (USA) in 1996 so everyone knew that those TV sets were coming, even if most people didn't have them yet.

Perhaps more importantly, capturing at higher resolutions gives the content creator more freedom during the editing process. When you shoot at 8K with intent to release at 4K, you can choose which pixels to include. Filmmakers have been doing this for decades. For sure, Jim Cameron isn't shooting at 8K, selecting "Scale 50%" and pressing the Export button.

And it's not just resolution. Bits per pixel, colorspace, frames per second, etc.

The music industry has been doing the same thing, capturing at much higher resolutions than the planned output. So 192kHz, 24-bit full digital recordings have been happening for decades even if Joe Consumer is now listening over lossy Bluetooth with a pair of no-name wireless earbuds. And today there are even high resolution recordings (384kHz, 32-bit) using all the latest microphones, etc.
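The raw PCM numbers make the capture-high, release-low point obvious; a quick comparison sketch (stereo assumed throughout):

```python
def pcm_kbps(sample_rate_hz, bit_depth, channels=2):
    """Raw (uncompressed) PCM bitrate in kbit/s."""
    return sample_rate_hz * bit_depth * channels / 1000

for label, rate, bits in [
    ("CD release", 44_100, 16),
    ("192 kHz / 24-bit master", 192_000, 24),
    ("384 kHz / 32-bit master", 384_000, 32),
]:
    print(f"{label}: ~{pcm_kbps(rate, bits):,.0f} kbit/s")
```

And all of that eventually gets funneled down to the few hundred kbit/s of a typical lossy Bluetooth codec on the consumer end.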

It's the same here and now: they shoot at 8K, 16K or whatever resolution they like, but keep quiet about it and release almost everything at 1080p, with some at 4K...

Remember that 4:3 is a holdover from the old Academy aperture standard set by AMPAS in the early 1930s. CRTs weren't commercially feasible as widescreen displays (image accuracy, display weight, manufacturing difficulty, calibration, etc.).

A lot of content is being shot at much higher resolutions than currently available consumer sets can support. Again, this is forward thinking. It's just released as 1080p or 2160p content because that's what Joe Consumer's display can support and the bandwidth requirements for 8K or higher resolution digital distribution don't make it worthwhile.

But for sure, 8K and higher resolutions will hit movie theater screens before Joe Consumer's living room.

Let's not forget that 8K displays aren't just consumer devices. They are used a lot already in enterprise/commercial/military/educational sectors. There are art museums doing high-resolution scans of masterpieces (like Rembrandt's The Night Watch) to be studied by academics on high resolution displays.

It's not just about a FIFA match, an Excel spreadsheet, or the next GTA release. There are plenty of non-consumer 8K usage cases already around.

Sony was ready five years ago to impress people with 8K video displays but COVID-19 scuttled that by delaying the 2020 Tokyo Olympics by a year (and no outside visitors were allowed in 2021).
 