
When will 1440p be mainstream and commonly used?

(Poll — 84 total voters)
8K isn't commonplace like 2K, 4K, 5K.
Of course not. It's newer technology that has a very high bar for entry.

Neither 2K nor 5K is a television resolution; they will never see widespread market adoption, even though 1440p is twice the linear resolution of 720p. No TV manufacturer actively markets 1440p sets to a global audience.

8K resolution is sweet -- I've seen it in person -- but it really benefits from a theatre-like setting.
 
"4k" and with shitty frame rates
Modern consoles are like a Zen2 gaming laptop.

The fact they can do 4K60 in esports titles says a lot about what modern hardware can achieve, since we can turn settings down and they can't.


And seriously, DLSS has come miles since its first introduction. 4K with DLSS looks insanely good vs native 1440p; I'll need to try to take side-by-side photos in DRG to show what I mean.
(AMD's FSR/FSR2 are blurrier in this title, at least)


Sigh, now I want to go make a comparison video... I'll start with screenshots.


To avoid forum compression, I'm focusing on text clarity at a distance, which is a big issue with these techniques.

I'm changing modes, taking a Steam screenshot, then cropping this far-away small text.


4K native — [screenshot]

4K w/ DLSS — [screenshot]
(The lines are an animation on-screen, not an error)


FSR 2.0 — [screenshot]


The only problem is that FSR breaks down where effects like smoke or steam pass in front of heavily aliased objects like a wire-mesh fence. This flickers horribly, although it's not something seen in-game, just in the pre-game lobby. — [screenshot]


Sorry, I ended up gaming as friends joined the lobby I was screenshotting from:


1440p native (it's so much smaller) — [screenshot]


and zoomed in to match the 4K w/ DLSS crop — [screenshot]



4K with DLSS looks better than native 1440p; FSR can as well when it doesn't have motion artifacts.
 
Modern consoles are like a Zen2 gaming laptop. … 4K with DLSS looks better than native 1440p; FSR can as well when it doesn't have motion artifacts.

I also always play my games in portrait mode at 1 fps; it's so much better than 60 fps, 120 fps and that craziness. The eye can't see more than 1 fps, can't even deal with motion.
 
4K TV is indeed great. 4K for gaming... not so much. You would be stuck in a never-ending, expensive, rapid upgrade cycle to keep up with games getting constantly more demanding. If you are willing to constantly throw money at 4K gaming GPUs, then it's great, but the vast majority won't do it, and they probably never will.
I think you're right when talking about PC enthusiasts who game. However, I think many PC gamers will buy 4K and then either run a lower resolution full-screen or FSR/DLSS performance mode when they realize their GPU isn't fast enough. I think there's a very good possibility 1440p gets overlooked.
 
I also always play my games in portrait mode at 1 fps; it's so much better than 60 fps, 120 fps and that craziness. The eye can't see more than 1 fps, can't even deal with motion.
You could try actually saying something
 
I also always play my games in portrait mode at 1 fps; it's so much better than 60 fps, 120 fps and that craziness. The eye can't see more than 1 fps, can't even deal with motion.

Movies run at 24 fps; one would certainly notice 1 fps.
 
You could try actually saying something
He could, but you'd just plug your ears and go "nu uh, DLSS is perfect and consoles do 4k60 guyz". If you enjoy rubbing Vaseline on your screen, then go right ahead and enjoy it. Not everyone will, though; many are going to go for the higher-res screens, and the growth of 1440p screens on Steam is indicative of that, as is the growing number of laptops using 1440p or the far superior 1600p screens.

Oh, and for the record, consoles do NOT do "4K60". They are upsampling 1080p or, in some cases, 1440p up to 4K, and are using checkerboarding on top of that. And only some esports games manage 60 FPS; most titles still run at 30 FPS or, for games like JC3, about 15 FPS.
 
growing number of laptops using 1440p or the far superior 1600p screens.

It doesn't really matter; in a laptop you want a 1080p screen, but a really good one. For now, the ideal laptop display would be OLED and at least 90 Hz. Cheaper Asus laptops come with OLED, but you have to shell out a bit for their 90 Hz versions.

From what I've read, OLED displays do pixel cycling to prevent burn-in, but heat and some other organic factors still cause it (as on the AW3423).

MicroLED, sometime in the future, maybe.
 
Modern consoles are like a Zen2 gaming laptop. … 4K with DLSS looks better than native 1440p; FSR can as well when it doesn't have motion artifacts.
I have to say, I considered DLSS superior to FSR until now. Your 4K native screenshot looks great, closely followed by FSR 2.0. DLSS looks blurry as heck to me, almost like the small cutout picture you took of 1440p. I can hardly read the text there.
 
I have to say, I considered DLSS superior to FSR until now. Your 4K native screenshot looks great, closely followed by FSR 2.0. DLSS looks blurry as heck to me, almost like the small cutout picture you took of 1440p. I can hardly read the text there.

It uses Tensor cores, and that's it. People over on Reddit were altering a few FSR game files in the old Warzone and getting far better results with FSR than with DLSS. Potential depends entirely on the implementation.
 
Console Gamer: I'm playing 4K games on my $500 console. Woot! Woot!

PC Gamer: It's not native 4K. It's upscaled from 1080p and some quality settings are low and you're playing shooters at 30 FPS

Console Gamer: 30 FPS is fine. The human eye can't see more than 24 FPS anyway. You PC gamers are full of poop with your 144 Hz monitors.

PC Gamer: <sigh>
lmao
 
I have to say, I considered DLSS superior to FSR until now. Your 4K native screenshot looks great, closely followed by FSR 2.0. DLSS looks blurry as heck to me, almost like the small cutout picture you took of 1440p. I can hardly read the text there.
The 1440p shot was native, not DLSS. FSR had some motion artifacts that DLSS didn't.


I, uh, hacked DLSS into the game for my 1070 Ti since then - the quality difference on that GPU is massive: 25% more FPS, and that fuzzy text is readable from further back than at native res.

The key point I was making is that 4K with upscaling tech is visually superior to plain 1440p these days - bad implementations exist, but overall it's better. That, combined with consoles supporting 4K but only barely supporting 1440p, has sealed the deal, with 4K 120 Hz OLED coming to market as well as all the high-refresh 4K displays Samsung is throwing out.

I do have the full high-res screenshots from the testing, but TPU can't accept images that large (hence the 100%-size snipping-tool shots).
 
Of course not. It's newer technology that has a very high bar for entry.

Neither 2K nor 5K is a television resolution; they will never see widespread market adoption, even though 1440p is twice the linear resolution of 720p. No TV manufacturer actively markets 1440p sets to a global audience.

8K resolution is sweet -- I've seen it in person -- but it really benefits from a theatre-like setting.
Oh? 1920×1080 isn't a TV resolution? (1080p is 2K, 1440p is 2.5K, 1620p is 3K, 2160p is 4K, and so on.) Drat, my Toshiba 32" FHD (which is kinda horrible) doesn't use a "TV resolution" :roll:

"who can do more can do less", i saw benefits, testing with a friend, for 2.5K over 4k after all "the Xbox Series X, will automatically detect a 1440p resolution and give you the option to support it." (1620p is also easily achievable) with better performances without the upscalling
also
"The maximum native resolution available on Xbox Series X is 3840Ă—2160, while the lowest resolution on console is 2432Ă—1368."
"the PS5 runs at dynamic resolution, with the highest native resolution being 2880Ă—1620 and the lowest being around 1824Ă—1026" (well PS5 is a 3K console :D )

indeed, the issue is not the console, but rather the TV that the console will be hooked on ... and 2k-4k are the most common resolutions

tho for me 1440p "IS" mainstream :laugh: and is supported by consoles ;)
 
As much as I hate to say it, probably never. 1440p is a fantastic sweet spot for fidelity vs. performance without upscaling or interpolation techniques, but what's technically "superior" doesn't always win out. Mainstream, as has been mentioned, is driven by the TV market, which went straight to 2160p (4K) from 1080p (FHD) because it's easy to essentially slap four 1080p panels together to make one big display. 1440p is nice for PC monitors, because you can make something larger than a 1080p display that still has a smaller pixel pitch. But there's no mainstream application for 1440p, at least not to any significant degree. There's HD, FHD and 4K in that space. 2/2.5K may as well not even exist as far as the vast majority of consumers are concerned.
 
There's HD, FHD and 4K in that space. 2/2.5K may as well not even exist as far as the vast majority of consumers are concerned.
Again, FHD = 2K = 1920×1080 (1920×1080 is, indeed, half of 3840×2160, not a quarter).

2K is the most common consumer TV resolution (well, there are still a lot of HD/720p TVs).


For the rest, you are, sadly, right.

Edit: actually, 720p/HD/1.3K (1.2K, depending on whether you round up or down) is the odd one out... I guess 540p never was a thing :laugh: (technically the true 1K, or 0.96K if nitpicking on rounding :laugh:)
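The loose "K" shorthand being traded here (horizontal pixel count divided by 1000, which marketing then rounds to taste) can be sketched in a few lines of Python; the resolution list is just illustrative:

```python
# Loose "K" shorthand: horizontal pixel count / 1000. Marketing then
# rounds to a convenient figure (1.92 -> "2K", 2.56 -> "2.5K").
resolutions = [
    ("540p", 960, 540),
    ("720p", 1280, 720),
    ("1080p", 1920, 1080),
    ("1440p", 2560, 1440),
    ("2160p", 3840, 2160),
]

for name, w, h in resolutions:
    print(f"{name}: {w}x{h} -> {w / 1000:.2f}K")
```

This reproduces the figures in the post: 540p comes out at 0.96K, 720p at 1.28K (the "1.2K or 1.3K depending on rounding"), and 1080p at 1.92K, which rounds up to the marketing "2K".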
 
the 1440p was native, not DLSS … The key point i was making is that 4K with upscaling tech is visually superior to plain 1440p these days.
I know it was native. :)

That's why I think...
4K FSR = a little bit worse than 4K native,
4K DLSS = a little bit better than 1440p native.

The crux of it is that ideally, you don't run a native 1440p image on a 4K screen, and you don't run a native 1080p image on a 1440p screen, so one should be comparing how much worse upscaling looks compared to native, not how much better it looks compared to a lower resolution.

With my 1080p screen, my question is "how much do I lose compared to 1080p native" and not "how much do I gain compared to 720p or 900p".
 
again, FHD=2K=1920x1080p (1920x1080p is, indeed, half 3840x2160p and not quarter) …

I shouldn't have said 2K; that actually means something else (2048×1080, specifically) and isn't really relevant here. But FHD is definitely one fourth of 4K: 2× in two directions (twice the width, twice the height) is 4×.

1920 * 1080 = 2,073,600
2,073,600 * 4 = 8,294,400
3840 * 2160 = 8,294,400
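The arithmetic above can be checked with a short Python sketch (the resolution table is just illustrative):

```python
# Pixel-count comparison: doubling both width and height quadruples pixels.
resolutions = {
    "FHD (1920x1080)": (1920, 1080),
    "QHD (2560x1440)": (2560, 1440),
    "4K UHD (3840x2160)": (3840, 2160),
}

fhd_pixels = 1920 * 1080  # 2,073,600

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / fhd_pixels:.2f}x FHD)")
```

4K UHD comes out at exactly 4.00× FHD, matching the numbers above, while QHD sits at 1.78× - which is why "half of 4K" only holds for the linear dimensions.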
 
I shouldn't have said 2K, that actually means something else (2048 x 1080 specifically) and isn't really relevant here. But FHD is definitely one fourth of 4K. 2X in two directions (twice width, twice height) is 4X.

1920 * 1080 = 2,073,600
2,073,600 * 4 = 8,294,400
3840 * 2160 = 8,294,400
Okay, but that's not intuitive, since 1920×2 = 3840 and 1080×2 = 2160 (edit: well, for me at least).

But in terms of pixel count, alright, I agree (though no consumer or marketing dept. would see it like that, of course).

Alright, I concede :ohwell:
 
2021 already, lmfao. To buy an FHD monitor (if larger than 24") for gaming in 2022, or even 2021, to me means you have a pretty weak GPU and don't plan to upgrade it in the near future. As I'm OK with 60 Hz/60 fps, I don't care too much; resolution means way more than Hz for me. I can't see the point of 1920×1080 at 120 Hz vs 2560×1440 at 60 Hz. I've tried many notebooks and monitors with all these 144 and 240 Hz panels and still can't damn get the difference. For 2K at 60 fps the GPU tax is NOT SO BIG, so stop yelling pointless things. I've played Hitman 3 (a very well optimized game) at 2560×1440 on medium settings even with a 1660 Super! With an RTX 2060, RX 6600 or RTX 3060 there is very stable, comfortable and visually appealing gameplay. Heck, I know dudes who will play even on a 3060 Ti in FHD, but I don't believe many people could tell the difference between 100 and 120+ fps, for example.

Again, the GPU taxing comes from all these marketing "wide" resolutions, like 3440×1440 or whatever. But 2560×1440 is not so different from FHD. Even in daily web use I barely notice a difference between FHD and 2K; 4K is way more real screen estate, lol.
 
I know it was native. :) … The crux in it is that ideally, you don't run a native 1440p image on a 4K screen, and you don't run a native 1080p image on a 1440p screen, so one should be comparing how much worse upscaling looks compared to native, and not how much better it looks compared to a lower resolution.
I have two 32" monitors, one 4K and one 1440p side by side

The 4K runs 150% scaling which makes them perfectly match up - even youtube reports the desktop as 2560x1440, so its pixel perfect for everything but clarity


And yeah, it's super obvious side by side that you simply dont see anywhere near as much. While the speed of my 1440p display is great at 165hz, simply going from clearly visible legible text to blurry smudge is just... nope.

(Also, screenshots negate any issues from non-native res - you're sharing the rendering, not any fuzz or issues from the display)
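The 150% scaling described above works out exactly; a quick sketch (effective_resolution is a made-up helper, not a real API):

```python
def effective_resolution(width, height, scale):
    """Logical desktop size after applying a DPI scale factor."""
    return round(width / scale), round(height / scale)

# A 4K panel at 150% scaling presents the same logical workspace as a
# native 1440p panel, which is why the two desktops line up perfectly.
print(effective_resolution(3840, 2160, 1.5))  # (2560, 1440)
```

At 200% scaling the same 4K panel would instead present a 1920×1080 workspace, with each logical pixel backed by four physical ones.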
 
this is the stupidest poll I have seen on the internet this week
 
Okay, but that's not intuitive, since 1920×2 = 3840 and 1080×2 = 2160.

But in terms of pixel count, alright, I agree (though no consumer or marketing dept. would see it like that, of course).
4K means 4× the number of pixels of FHD. It's the same as camera megapixels: 20 MP is 10 MP × 2 in pixel count, not twice the width and height of 10 MP (which would be 40 MP). 2 × 2 = 4.

2021 already lmfao. To buy FHD monitor (yes, note: if more than 24") for gaming in 2k22 (even in 2021) for me means you have pretty weak gpu and don't plan to upgrade it in near future. …
Very well said!

The reason I recently upgraded to a 6750 XT while keeping a 1080p 60 Hz monitor is that a faster GPU can render my frames more efficiently, and potentially for longer. Even a 6500 XT is enough for my needs right now, but it might not be next year or the year after, while a 6750 XT definitely will be. This is also why super-expensive, ultra-high-end GPUs don't interest me.
 
Going forward, I tend to look at what the broadcasters are doing for TV. After 1080p, we now have 4K broadcasting, with 8K picking up momentum. 1440p can be skipped, as most cards that can do 1440p should be able to do 4K with settings turned down. Also, AFAIK there's no 1440p TV broadcast. I know we're talking about gaming on real gaming monitors that can do 1080p+, but I tend to follow what the broadcasters are doing.
 
I suspect that, over the past couple of years, there was a large correlation between the monitor resolution people play at and the price of proof-of-work cryptocurrency. Now that Ethereum has moved to proof of stake, video card prices have become affordable, so it's likely people will buy higher-resolution screens going forward.
 