
What display resolution are you using?


  • 3840x2160

    Votes: 64 24.4%
  • 2560x1440

    Votes: 96 36.6%
  • 7680x4320

    Votes: 0 0.0%
  • 1920x1080

    Votes: 70 26.7%
  • 3440x1440

    Votes: 22 8.4%
  • 1440x900

    Votes: 0 0.0%
  • 1280x720

    Votes: 0 0.0%
  • 3840x1080

    Votes: 4 1.5%
  • 5120x1440

    Votes: 6 2.3%

  • Total voters
    262
If someone here has a 4K 32" display, do you use it with scaling other than 100% or 100% is sufficient?
Replying almost a year late, but I use 125% myself.
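For anyone weighing the same choice, the arithmetic behind scaling is easy to sketch. This is a rough illustration, not anyone's measured setup; the function name and figures are assumptions for a 32" 4K panel:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assumed figures: a 32" 3840x2160 panel, as discussed above.
native = ppi(3840, 2160, 32)        # ~137.7 PPI

# At 125% scaling the desktop behaves like a 3840/1.25 x 2160/1.25 workspace.
effective_w, effective_h = 3840 / 1.25, 2160 / 1.25   # 3072 x 1728
effective = ppi(3072, 1728, 32)     # ~110 PPI, similar to a 27" 1440p panel
```

At ~138 PPI, 100% scale is readable but small for many people, which is why 125% is a common middle ground: it gives an effective density close to a 27" 1440p screen while keeping the extra sharpness.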
 
That's why I'm holding off on buying 4K; I like gaming at higher fps.
Depends hella much on the game, I'd say. I play mostly multi-platform games where 60 is more than enough. I feel like getting a 1080p144 was just a pure waste of money for me :D
 
Of course 4K is also nice for the extra desktop real estate. I would run and enjoy it at 100%, no scaling.
 
My problem now isn't the price of the display, but the price of the hardware necessary to drive it in games. Also, what am I gonna do with my current monitor that I absolutely love?
 
4K60? Nope!
I like 4K but I cannot accept 60Hz anymore.

I prefer a lower resolution, like 3440x1440 that I use, and 100+Hz.

My monitor is decent enough. 35inch, 3440x1440, 100Hz, VA. Great colours and contrast but terrible ghosting in very dark games.
It's been easy-ish to drive with most of the GPUs I've had so far. The heavy RT games are the only ones that make all the GPUs sweat. My 2080 Ti was OK - now the 4080, capped at 100 fps, is quite relaxed at the moment.

I would like a monitor change, just for a change, such as a 43inch OLED or similar. But not yet.
 
3440x1440 144Hz on a 34" UW Dell
 
I'm after resolution but don't care for speed (not really a gamer), so 60 Hz works for me.
 
My problem now isn't the price of the display, but the price of the hardware necessary to drive it in games. Also, what am I gonna do with my current monitor that I absolutely love?

Are you going to retire with that monitor?
It really depends on the games. You can run CS:GO at 4K literally on a potato. :roll:
 
1440p@60 Hz.
But I do wish Dell would launch a 24" 1440p monitor with 75 Hz FreeSync...
 
Are you going to retire with that monitor?
It really depends on the games. You can run CS:GO at 4K literally on a potato. :roll:
I don't play CS:GO, or any other fast-paced online shooter, to be fair. :D

As for retiring with the monitor... I dunno... anything seems better than throwing away a perfectly good monitor that I have no problem with. :ohwell:
 
1920x1080 @ 240Hz

most of the time i lock fps to 120 to save watts
 
31.5" FHD 83 Hz as a main display.
43" 4K 60 (actually 50) Hz as the only TV.
24" 1920x1200 60 Hz as a backup monitor.

31.5" 1440p Million Hz would be just perfect, but paying $500 for a display is not my hobby, and I still play Cyberpunk from time to time and it's too heavy for the RX 6700 XT at native 1440p. Will probably get myself a Gigabyte M32U when I get rich enough. 1080p when the 6700 XT isn't enough, 4K when it is.
 
I have two monitors, but I only use one at a time, the other one sits in my closet when not in use.

23.8" 165Hz 1080p IPS, for games like Cyberpunk 2077, because I want a consistent 150-165 fps on the best settings I can get, no ray tracing.

For games that don't need that lower resolution to get high frames, I use my 27" 2560x1440 165Hz IPS.

I am a high refresh whore... Sure, I could use windowed mode on the 1440p, but I find it distracting.
 
i am at 1440p now, down from 2160p, because OLEDs do not come in 2160p yet, and 1440p is too low, please fix.
 
That's not how it works at all though. FHD, QHD, etc. are official naming classifications used to describe specific fixed resolutions:

HD = 1280x720, FHD = 1920x1080, QHD = 2560x1440, UHD/4K = 3840x2160, etc...

This is absolutely no different to how QVGA (320x240), VGA (640x480), SVGA (800x600), XGA (1024x768), SXGA (1280x1024), etc. are still named exactly the same after all these years despite not being mainstream anymore. Nor are PAL (720x576) or NTSC (720x480) going to change just because HDTV / Blu-ray is a thing. Just because someone bought themselves a new 4K monitor doesn't mean the entire naming system of every other resolution has to be renamed. That would be insanely and unnecessarily confusing on an industry level.
I hate all of this, because I just think of "full HD" as "normal HD", and 2160p as Quad HD, or QHD.
I don't even count 720p as HD, lol
 
That's not how it works at all though. FHD, QHD, etc. are official naming classifications used to describe specific fixed resolutions:

HD = 1280x720, FHD = 1920x1080, QHD = 2560x1440, UHD/4K = 3840x2160, etc...

This is absolutely no different to how QVGA (320x240), VGA (640x480), SVGA (800x600), XGA (1024x768), SXGA (1280x1024), etc. are still named exactly the same after all these years despite not being mainstream anymore. Nor are PAL (720x576) or NTSC (720x480) going to change just because HDTV / Blu-ray is a thing. Just because someone bought themselves a new 4K monitor doesn't mean the entire naming system of every other resolution has to be renamed. That would be insanely and unnecessarily confusing on an industry level.

You're describing it the wrong way. 720p is "HD ready", not HD.
Names change, and must change, in order to represent the new reality and make things easier for consumers to understand.
Can you explain why 2560x1440 is Quad HD and, at the same time, 2K? :D It's wrong labelling and must change.

The term high definition once described a series of television systems originating from August 1936; however, these systems were only high definition when compared to earlier systems that were based on mechanical systems with as few as 30 lines of resolution. The ongoing competition between companies and nations to create true "HDTV" spanned the entire 20th century, as each new system became higher definition than the last.
 
shouldn't 4k be QHD as it is literally 4x1080p?
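It's the other way around: the "quad" refers to four tiles of an earlier base resolution, so QHD is four 720p HD panels, while UHD is four 1080p FHD panels (the Q was already taken). A quick sanity check of the pixel counts:

```python
def pixels(w: int, h: int) -> int:
    """Total pixel count of a w x h resolution."""
    return w * h

HD, FHD = pixels(1280, 720), pixels(1920, 1080)
QHD, UHD = pixels(2560, 1440), pixels(3840, 2160)

assert QHD == 4 * HD    # "Quad HD" = four times 720p HD
assert UHD == 4 * FHD   # UHD is four times 1080p, i.e. "quad FHD"
```

Doubling both width and height always quadruples the pixel count, which is where both "quad" names come from.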
 
3840x2160 gang. Like with some other things, sites like these cater to an enthusiast crowd, so some items are massively over-represented compared to the broader market.

Like 2160p monitors for gaming: IIRC, last time I checked the Steam survey we represented ~3.42% of users, but among poll respondents here, 24.9%.

Same with AMD Radeon graphics: on Steam they represent ~15.93% of users, but here respondents indicate that Radeon owners make up 33.7% of users. Less drastic than the monitor resolution responses, but also over a much larger respondent sample size.
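Using the figures quoted in the post above (taken as given, not re-verified against the Steam survey), the over-representation factor works out roughly like this:

```python
def factor(poll_pct: float, steam_pct: float) -> float:
    """How many times a group's poll share exceeds its Steam survey share."""
    return poll_pct / steam_pct

# Figures as quoted above: poll share here vs. Steam survey share.
uhd_factor = factor(24.9, 3.42)      # ~7.3x over-represented (2160p gamers)
radeon_factor = factor(33.7, 15.93)  # ~2.1x over-represented (Radeon owners)
```

So by these numbers, 4K gaming is over seven times as common among respondents here as in the broader Steam population, versus roughly double for Radeon.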
 
The only game I play Windowed on the desktop is Heroes of Might and Magic 2. If you fullscreen that shit you have pixels the size of a Halfling fist, that just doesn't work :D

As for the commentary of sticking to 1080p~ish monitors because performance... I don't know guys, try it and be amazed. Resolution up isn't as big a performance killer as you might think. Quality level up kills perf, resolution not quite as much... I did play @ 3440x1440 even on a GTX 1080 and at medium~high settings virtually everything except new AAA was perfectly playable, as in >50 FPS and much higher.

Still wouldn't advocate moving to 4K 16:9 though... if you want to step up from 1440p, get the ultrawide version, horizontal view expansion is awesome and 21:9 doubles as that dual monitor setup you used to have, WIN key plus directional is your new friend. Two full height web pages side by side... glorious.
 
That counts as closer to zero than to the million FPS I'm aiming for.
Resolution up isn't as big a performance killer as you might think
What if I already tested 1440p in my favourite games and found I have about 20 percent less performance than is needed for a bare minimum of comfort? Not to mention real comfort, which is twice the FPS. Upgrading the GPU is not an option: everything with 2x the performance of the 6700 XT costs wild money where I live, and having less than a 100% uplift is just a waste of effort. Can't afford it, and even if I could... waste. I'd better invest in something else. A better audio system or a new fridge, lol.
3440x1440
Sucks big time. Software engineers are still completely unbothered by scaling issues, and such a small pixel size is unusable at 100% from 50-ish inches away. Yeah, I could easily put the monitor closer so I could use it at 100% scale, but it's darn uncomfortable when the display is that close.
It also doesn't exist in a 31.5"-equivalent format (that would be around 39"). I tried using conventional 34" panels; the vertical size is insufficient. Being 50-plus inches away from the display is the reason why.

This sums it up why I am using 1080p as my daily driver. It's not ideal but other options are either too cost inefficient or just non-existent.
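The pixel-size complaint above can be put in numbers: each pixel subtends a visual angle that shrinks with viewing distance, and normal 20/20 acuity resolves roughly 1 arcminute. A rough sketch, assuming a 34" 3440x1440 panel viewed from 50 inches as described in the post:

```python
import math

def arcmin_per_pixel(width_px: int, height_px: int,
                     diagonal_in: float, distance_in: float) -> float:
    """Visual angle subtended by one pixel, in arcminutes."""
    pixel_pitch = diagonal_in / math.hypot(width_px, height_px)  # inches/pixel
    return math.degrees(math.atan2(pixel_pitch, distance_in)) * 60

# Assumed figures: 34" 3440x1440 viewed from ~50 inches.
angle = arcmin_per_pixel(3440, 1440, 34, 50)   # ~0.63 arcmin per pixel
```

At ~0.63 arcminutes per pixel, individual pixels (and therefore 100%-scale UI text) sit below what the eye comfortably resolves at that distance, which is exactly the usability problem being described.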
 
The only game I play Windowed on the desktop is Heroes of Might and Magic 2. If you fullscreen that shit you have pixels the size of a Halfling fist, that just doesn't work :D

As for the commentary of sticking to 1080p~ish monitors because performance... I don't know guys, try it and be amazed. Resolution up isn't as big a performance killer as you might think. Quality level up kills perf, resolution not quite as much... I did play @ 3440x1440 even on a GTX 1080 and at medium~high settings virtually everything except new AAA was perfectly playable, as in >50 FPS and much higher.

Still wouldn't advocate moving to 4K 16:9 though... if you want to step up from 1440p, get the ultrawide version, horizontal view expansion is awesome and 21:9 doubles as that dual monitor setup you used to have, WIN key plus directional is your new friend. Two full height web pages side by side... glorious.

Just got a 34" UW, and I have to agree with you. Went from two monitors to this, and it's better (imo), and it has PiP too.
 