
What resolution do you game at

The resolution you game at

  • 720p

    Votes: 2 1.1%
  • 768p

    Votes: 2 1.1%
  • 960p

    Votes: 1 0.6%
  • 1080p

    Votes: 82 45.8%
  • 1440p

    Votes: 59 33.0%
  • 2160p

    Votes: 24 13.4%
  • Beyond 4k

    Votes: 1 0.6%
  • 1600p

    Votes: 6 3.4%
  • 1200p

    Votes: 4 2.2%
  • Other (please state)

    Votes: 21 11.7%

  • Total voters
    179
I know this thread is about gaming, but have any of you low-res guys ;) tried a higher-resolution screen for productivity? I used to have a secondary 1080p screen next to my old 2048x1152 one, but now I don't even need that with 1440p. I can run two applications side by side on the same screen and it really makes life a lot simpler. I don't understand why people wouldn't want higher-resolution screens (not taking budget/cost into consideration), but hey, to each their own.
That is also one of the benefits of ultrawide screens, which I like.
 
1440p@100Hz on my main rig
1080p@60Hz on my secondary
720p@60Hz on my laptop
 
If you have 1440p at 144 Hz and expect anything good from that, you won't get it...

1440p at 144 Hz > 1080p at 144 Hz. I've had both, and 1440p is much better: games look much sharper and less blurry. If you can run it, 1440p is the way to go.

If y'all wanna sit around and convince yourselves otherwise, go ahead; but once you try 1440p, there is really no contest.
 
A more expensive monitor from the start, the need for a more powerful graphics card, faster depreciation of graphics cards: all of that totally doesn't matter, right? If things were that simple, everyone would be at 4K. If you have the money for GTX 1080 SLI, good for you. Most of us simply don't.

Using lower resolution (but still monitor native) is how you game with Ultra settings "on the cheap".
 
Aspect ratio 5:4, resolution 1280x1024 at 75 Hz. A 16-year-old LCD.

That's all that will fit on my desk.
 
Anything from 320x240 to 4K

I know this thread is about gaming, but have any of you low-res guys ;) tried a higher-resolution screen for productivity? I used to have a secondary 1080p screen next to my old 2048x1152 one, but now I don't even need that with 1440p. I can run two applications side by side on the same screen and it really makes life a lot simpler. I don't understand why people wouldn't want higher-resolution screens (not taking budget/cost into consideration), but hey, to each their own.
Even better on 4K or an ultrawide screen with PiP mode.
I can finally run the main PC and a test bench with whatever I'm currently fixing without constantly switching between video sources. Starting to like portrait mode more and more.

And I'd rather play at 1080p with everything at Ultra than at 1440p with settings turned down.
I do the same (can't do better on a GTX 950), but it's still much better to spend that extra cash on a 4K screen for work/productivity rather than narrow your scope to games only.
You don't need a lower-res display to justify spending less on a GPU. All it takes is to stop listening to other people and evaluate your options objectively.
E.g. I know that I work a lot with text, schematics, PDFs, and various PCB/circuit design tools, so having four times more pixels will pay off by reducing my need for laser eye surgery in the future.
I've been working with PCs for the past 16-17 years, spending no less than 6 hours a day in front of a screen (from awful 15" CRTs to my new 24" 4K display), and I still have 1.0 vision (20/20 for Americans).
The only reason I do is that every once in a while (every ~5 years or so), I force myself to buy a new monitor.
 
I know this thread is about gaming, but have any of you low-res guys ;) tried a higher-resolution screen for productivity? I used to have a secondary 1080p screen next to my old 2048x1152 one, but now I don't even need that with 1440p. I can run two applications side by side on the same screen and it really makes life a lot simpler. I don't understand why people wouldn't want higher-resolution screens (not taking budget/cost into consideration), but hey, to each their own.

Because resolution is just one aspect to consider when buying a monitor; others include refresh rate, color accuracy, etc.
 
Yeah, but we're talking games here, not "productivity". Besides, I do "productivity" stuff on my monitor as well, and for my needs it's perfectly good.
 
My philosophy exactly!

If I only have to turn down a few settings, I'll play at 1440p. If the difference is going from high settings to all medium, I'll play at 1080p.
 
And I'd rather play at 1080p with everything at Ultra than at 1440p with settings turned down.
You can still game at 1080p on a 1440p monitor and then go back to 1440p for anything else you like.
For some games, and for browsing, I use 1080p; other times I'm back at 1440p. It just depends on what and where.
Mainly, though, having 1440p is fantastic, and it's on a 27" screen, which is really nice too. Having both options in one package makes it worthwhile, but honestly I use 1440p almost all the time.
And when the deals come up 2-4 times a year at the $200-$300 price point, it's an excellent choice for a 27" screen.
 
1920x1200 @ 60 Hz

"p" denotes an ATSC/PAL TV standard (4:3 or 16:9). WxH denotes a computer monitor which likely follows VESA's standards (usually 4:3, 5:4, or 8:5 but there are literally no limits). Silly numbers like "1080p" come from letterboxing/downsizing 8:5 VESA widescreen resolutions to ATSC/PAL standards.
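The relationship between the "p" shorthand and full WxH notation is easy to check with a little arithmetic. Here's a quick Python sketch (the mode list is illustrative, not exhaustive) that prints each mode's full resolution, reduced aspect ratio, and megapixel count:

```python
# Reduce each display mode to its aspect ratio and pixel count.
from math import gcd

modes = {
    "720p":    (1280, 720),
    "1080p":   (1920, 1080),
    "1200p":   (1920, 1200),
    "1440p":   (2560, 1440),
    "1600p":   (2560, 1600),
    "2160p":   (3840, 2160),
    "5:4 LCD": (1280, 1024),
}

for name, (w, h) in modes.items():
    d = gcd(w, h)  # greatest common divisor reduces the ratio
    print(f"{name:8} {w}x{h}  {w // d}:{h // d}  {w * h / 1e6:.2f} MP")
```

Running it shows, for example, that 1200p and 1600p reduce to 8:5 (the VESA widescreen ratio), while the TV-derived modes reduce to 16:9.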
 
"p" or "i" means progressive scan or interlaced.
 
A more expensive monitor from the start, the need for a more powerful graphics card, faster depreciation of graphics cards: all of that totally doesn't matter, right? If things were that simple, everyone would be at 4K. If you have the money for GTX 1080 SLI, good for you. Most of us simply don't.

Using lower resolution (but still monitor native) is how you game with Ultra settings "on the cheap".

You have a 980; it will game at 1440p just fine. Plus, think of all the older games you play that would bomb along at 80+ fps at 1440p with Ultra settings.

The most popular games don't even need that much juice (Overwatch, CS:GO, League, Dota); even GTA V would run great on Ultra at 1440p with a 980. And all of them look a hell of a lot better with more resolution than with an extra bump in post-processing or ambient occlusion.

I personally prefer 1440p with no AA to 1080p with SMAA/TXAA, and the performance hit from those is sometimes even bigger than bumping the resolution.
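To put rough numbers on the resolution jump being argued here, the sketch below compares total pixel counts, under the simplifying assumption that GPU load in a fully GPU-bound game scales roughly linearly with pixel count (real scaling varies per game and engine):

```python
# Compare total pixel counts of common gaming resolutions relative to 1080p.
BASE = 1920 * 1080  # 1080p pixel count

for name, w, h in [("1080p", 1920, 1080),
                   ("1440p", 2560, 1440),
                   ("2160p", 3840, 2160)]:
    pixels = w * h
    print(f"{name}: {pixels:,} pixels, {pixels / BASE:.2f}x 1080p")
```

So 1440p pushes about 78% more pixels than 1080p, while 4K pushes exactly four times as many; that's the ballpark extra GPU headroom these posts are trading off against AA and post-processing settings.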
 
My philosophy exactly!
OK, and I get it, but what's the point of using very high or ultra textures when you can't see the detail in them? After stepping up to 4K, that's what struck me most: at 1080p you see a much blurrier version regardless of the quality sliders.
Plus, as I said, you don't have to run everything at max resolution.
That said, I only drop resolution for Doom, and that's likely because it doesn't support multi-GPU.
 
OK, and I get it, but what's the point of using very high or ultra textures when you can't see the detail in them? After stepping up to 4K, that's what struck me most: at 1080p you see a much blurrier version regardless of the quality sliders.
Plus, as I said, you don't have to run everything at max resolution.
That said, I only drop resolution for Doom, and that's likely because it doesn't support multi-GPU.

If games have motion blur in them, I shut that crap off.
 
You have a 980; it will game at 1440p just fine. Plus, think of all the older games you play that would bomb along at 80+ fps at 1440p with Ultra settings.

The most popular games don't even need that much juice (Overwatch, CS:GO, League, Dota); even GTA V would run great on Ultra at 1440p with a 980. And all of them look a hell of a lot better with more resolution than with an extra bump in post-processing or ambient occlusion.

I personally prefer 1440p with no AA to 1080p with SMAA/TXAA, and the performance hit from those is sometimes even bigger than bumping the resolution.

There is a difference between "fine" and "excellent". What's the point of a 144 Hz monitor if games won't reach those frame rates because of the higher resolution? And post-process AA is basically free these days...
 
I can't judge 144 Hz monitors, but I'm inclined to say they present a smoother image. So my next monitor will be either a 144 Hz one (which is my current preference) or a 1440p one.
The games I can play at 1440p@60fps are really more enjoyable with VSR on my 1080p monitor.
I voted 1080p, though, since that's the resolution I mostly use.
 
If I only have to turn down a few settings, I'll play at 1440p. If the difference is going from high settings to all medium, I'll play at 1080p.

That's a good thought process I'd like to adopt. I plan to go up to 1440p within 2 years or less, but right now I am happy with my visuals, and I have a monitor that runs perfectly.

OK, and I get it, but what's the point of using very high or ultra textures when you can't see the detail in them? After stepping up to 4K, that's what struck me most: at 1080p you see a much blurrier version regardless of the quality sliders.
Plus, as I said, you don't have to run everything at max resolution.
That said, I only drop resolution for Doom, and that's likely because it doesn't support multi-GPU.

Except it doesn't look blurry to me. Two different sets of eyes, I guess. :) I know graphics aren't all there is; I play plenty of games of lesser graphical quality just because they are that: fun. However, in games where a lot of hard work has gone into making something very visually appealing, I want to see it as detailed as the developers intended. I won't be able to do that consistently with my hardware at 1440p. Getting to that point will involve not only a new monitor but also a new top-tier GPU.
 
I currently play at 2,560 x 1,080 (1080p ultrawide).
I plan on moving to 3,440 x 1,440 as soon as we see panels with good HDR capabilities and higher refresh rates. That will also be when I upgrade my GPU.
 
"p" or "i" means progressive scan or interlaced.
All modern monitors are progressive scan. HDTVs too for that matter. You'd have to go back at least a decade to find a genuine interlaced TV.

Progressive/interlaced refers to the incoming ATSC/PAL signal (480i, 1080i). In a perfect world, "p" and "i" would only refer to displays that have an MPEG-2 decoder and can thus display ATSC/PAL signals.
 
I know this thread is about gaming, but have any of you low-res guys ;) tried a higher-resolution screen for productivity? I used to have a secondary 1080p screen next to my old 2048x1152 one, but now I don't even need that with 1440p. I can run two applications side by side on the same screen and it really makes life a lot simpler. I don't understand why people wouldn't want higher-resolution screens (not taking budget/cost into consideration), but hey, to each their own.

I can see the benefit of a higher resolution and a bigger screen when using Blender as well as GIMP, basically because you have more screen real estate and less need for scrolling.

But those are luxuries.
 