
How would you rather game?

Total voters: 73

qubit

Not many can play at 4K 144Hz yet: the monitors cost about three grand, and graphics cards that can drive that resolution and framerate aren't quite available yet. So how would you prefer to game when it comes to spatial resolution vs temporal resolution? Vote in the poll!
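For a rough sense of what each option asks of the GPU, here's a quick back-of-the-envelope pixel-throughput comparison (a sketch only: real rendering cost doesn't scale perfectly linearly with pixels per second, and the 1440p/165 Hz line is just the write-in option some voters mention below):

```python
# Rough pixel-throughput comparison of the poll options (sketch only;
# real rendering cost does not scale perfectly linearly with pixel rate).
modes = {
    "1080p @ 144 Hz": (1920, 1080, 144),
    "4K @ 60 Hz": (3840, 2160, 60),
    "4K @ 144 Hz": (3840, 2160, 144),
    "1440p @ 165 Hz": (2560, 1440, 165),  # write-in option several voters mention
}

for name, (w, h, hz) in modes.items():
    gpix_per_s = w * h * hz / 1e9  # billions of pixels per second
    print(f"{name:>15}: {gpix_per_s:.2f} Gpix/s")
```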
 
The ideal situation for me is 60+ fps at either 1080p or, now, 1440p. I've just bought myself my first ever 1440p 144Hz monitor, so I'm a bit excited to try that out.
 
Please vote, kurosagi01!
 
Other: 1440p @ 165 Hz with G-Sync. Perfect balance for me. 1080p is too low, 2160p is too high. My OC'd 1080 Ti is pretty crap at 2160p... in demanding games on high settings, that is.

I think I will stay at 1440p if the upcoming Mini LED backlit gaming panels offer this option, maybe even with 240 Hz. Or I'll just keep my current monitor.
 
2560x1440 at a constant 60fps for me. 60fps is actually very good and acceptable, and I want the top-end visuals.
 
4k @ 60fps

(since I'm doing a lot on a TV). But that is but a dream.

edit: I should say that I'm getting close, since I have one of the newer Samsungs with FreeSync, so I can get by with 50fps games, while some lighter games easily do 60fps without help (Ori, Cuphead). Surprisingly, Forza 7 and DOOM are solid 60 fps, but that's a rarity.
 
1080p is fine for me. If the game dips to 40fps it doesn't bother me; it only becomes noticeable at around 30 fps. I like max details, but if I have to lower some settings because of performance, I go after AA first, and the type used. Then I go after some other questionable features.

With my system I shouldn't have a problem, but you would be surprised: Killing Floor 2 with max details drives my system down to 37fps in some areas if the battle gets too heated.

That doesn't make the game unplayable, but it does ruin some of the experience.

I would like to play on my 4K TV, even if at 1080p, so I can sit back in my underwear and enjoy the big screen.
 
This is a very good poll, qubit.
I'd take 1080p 144 fps over 4K 60 any day, though it's not my preferred #1 choice. I value high framerate more than level of detail, but I just hate aliasing; I like smooth edges and a sharp image. That's why my primary gaming display is a 24" 1440p panel at 165Hz. If I had to choose between a high framerate at 1080p and better sharpness at 4K, I'd choose the smoother framerate without a doubt.
 
I agree with rtwjunkie though: if I didn't want 144 fps and I was okay with 60, I'd much rather turn on all the eye candy at 1440p (e.g. PCSS/HTFS) than run without it at 4K.
 
I see a few of you are picking 2560x1440 at 165Hz. I guess that would have made for a good third option; I didn't think about it. :ohwell:

I can edit the poll to add it, but I think it would remove all your votes if I did so.
 
I only have V-Sync off in Overwatch and Battlefield; everything else is 1440p 60Hz with V-Sync on. Due to GPU prices, not many people can afford a 1440p monitor along with a strong GPU that can push 100fps at that resolution, or even a stable 60fps. I hope the next GPU generation makes this possible, turning 1440p 60Hz into the standard resolution for most gamers. So for me it's 1440p 60Hz, because I can't reach 144 or 100 fps in most games for now, but I would love to be able to play at 144Hz all the time.
So 2560x1440 at 144Hz is my desire;
2560x1440 at 60Hz is what I can achieve, and no complaints, it looks amazing.
 
@Tatty_One I see you've taken the Old Skool option! :p
 
My ideal monitor would be 2560x1600 (8:5), 144 Hz, DisplayHDR 600, FreeSync 2 and enough GPU grunt to maintain it.
 
My ideal monitor would be 2560x1600 (8:5), 144 Hz, DisplayHDR 600, FreeSync 2 and enough GPU grunt to maintain it.

Why HDR 600 when 1000 exists? HDR 400 and 600 are pretty crap compared to my OLED's HDR. I have not seen 1000 on a PC monitor yet; hopefully it's way better.

Might as well add mini LED backlight. For "perfect", micro LED instead.

Not sure I'd want anything other than 16:9 after trying out ultrawide for a few weeks. Support in games is too poor; most of my games were played with black bars or stretched. Most games are made and optimized for 16:9.
 
60+ fps at 4K and I am happy.
 
Why HDR 600 when 1000 exists?
Because 1000 nits is intended for viewing in direct sunlight, which I am not in. It is blindingly bright in a dimly lit room, which I am in.
 
Because 1000 nits is intended for viewing in direct sunlight, which I am not in. It is blindingly bright in a dimly lit room, which I am in.

Well, LCD needs that kind of brightness boost to be able to deliver decent HDR, even in a dim room, and it should have a FALD backlight anyway. This is why mini LED backlights are going to be interesting. Edge-lit LCD sucks for HDR. Most "HDR" PC monitors are a complete joke, though.
 
Been gaming 1440p@144hz for almost 3 years. Love it. Can’t go back to 1080. Since 4K is so expensive I’ll stay on this res until 2020 and then upgrade
 
Well, LCD needs that kind of brightness boost to be able to deliver decent HDR, even in a dim room, and it should have a FALD backlight anyway. This is why mini LED backlights are going to be interesting. Edge-lit LCD sucks for HDR. Most "HDR" PC monitors are a complete joke, though.
HDR is about contrast more than brightness, and DisplayHDR 600 requires a 3000:1 ratio, which is three times better than typical displays.

Here's an example of a 1500 nit presentation/advertising display and it also has a contrast ratio of 3000:1 (via local dimming):
http://www.dynascanusa.com/product/55-1500-nit-high-brightness-lcd-with-super-narrow-bezel-ds55lx3/
The brighter the illumination, the more difficult it is to prevent bleeding.

DisplayHDR 1000 is intended for living rooms or situations where you're sitting many feet away.
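To put that contrast claim into numbers: contrast ratio is just peak luminance divided by black level, so the 1500-nit signage panel above needs a black level around 0.5 nits to hit 3000:1. A minimal sketch (the 300-nit / 0.3-nit figures for a "typical display" are illustrative assumptions, not from the post):

```python
# Contrast ratio = peak luminance / black level, both in cd/m2 (nits).
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

def black_level_for(peak_nits: float, ratio: float) -> float:
    """Black level needed to reach a given contrast ratio at a given peak."""
    return peak_nits / ratio

# The 1500-nit signage display with a 3000:1 ratio implies roughly:
print(black_level_for(1500, 3000))   # 0.5 nits of black level
# An illustrative "typical" desktop LCD (assumed 300 nits peak, 0.3 nit black):
print(contrast_ratio(300, 0.3))      # 1000.0, a third of DisplayHDR 600's 3000:1
```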
 
HDR is about contrast more than brightness, and DisplayHDR 600 requires a 3000:1 ratio, which is three times better than typical displays.

Here's an example of a 1500 nit presentation/advertising display and it also has a contrast ratio of 3000:1 (via local dimming):
http://www.dynascanusa.com/product/55-1500-nit-high-brightness-lcd-with-super-narrow-bezel-ds55lx3/
The brighter the illumination, the more difficult it is to prevent bleeding.

DisplayHDR 1000 is intended for living rooms or situations where you're sitting many feet away.

DisplayHDR is a measurement for PC monitors from what I know. All DisplayHDR certified screens are PC monitors, zero TVs on the list. With TVs, you simply look for Ultra HD Premium to ensure a good HDR experience, and the requirement is 1000 nits peak and 0.05 nits black level, which is on par with DisplayHDR 1000 for LCD panels.

So why should DisplayHDR 1000 be intended for a living room TV? TVs already have a standard.

You're talking about your "dream specs". Pretty strange to settle with crappy HDR then. DisplayHDR 1000 plus a FALD/mini LED backlight and you will get a much better HDR experience than with DisplayHDR 600. 400 is basically useless HDR. There are tons of fake HDR TVs and PC monitors; some even use 8-bit panels with no FRC.

My dream monitor would not use LCD; it would use Micro LED. Backlights are crap, and self-emitting pixels are the future.
Even the best LCD TVs with FALD are not even close to the HDR quality that OLED is capable of.

Nothing below DisplayHDR 1000 or Ultra HD Premium would be able to deliver a good HDR experience, on LCD tech that is. If you go HDR, get proper HDR.
Proper HDR makes a huge difference; crappy HDR does not. When some people say that HDR is not impressive, you just know that they have not seen proper HDR, either because they were using a "fake HDR" monitor or TV... or simply failed to set the software up correctly.
 
1440p w/ AU Optronics 165 Hz IPS Screen, when > 75 fps, G-Sync OFF / ULMB ON. Would not consider any monitor w/o ULMB.
 
So someone voted for 30 FPS!! I know it's very playable, but you will not feel the pleasure of gaming. :(
 
1440p@75/100/120Hz. I vary the refresh rate depending on what game I'm playing and what I can run smoothly without dropping frames. Though I actually can't tell the difference between 75Hz and 120Hz, so I tend to just leave it at that...
 
DisplayHDR is a measurement for PC monitors from what I know.
VESA creates standards for everything display-related. VGA is VESA, DVI is VESA (which the HDMI Forum piggybacked on for HDMI), DisplayPort is VESA, the mounts for panels are VESA, sRGB was designed by VESA, etc.

All DisplayHDR certified screens are PC monitors, zero TVs on the list.
The standard is still new; DisplayHDR TVs will be coming. There are only maybe a dozen DisplayHDR monitors, and most of those were FreeSync 2 compliant (which exceeds DisplayHDR 400, so certification was a breeze). Very few "HDR" TVs on the market today can pass DisplayHDR testing.

With TVs, you simply look for Ultra HD Premium to ensure a good HDR experience, and the requirement is 1000 nits peak and 0.05 nits black level, which is on par with DisplayHDR 1000 for LCD panels.
UHD Premium, where this topic is concerned, is basically HDR10+, which is a video encoding and signaling standard for 10-bit color data with lighting information. It isn't a test/standard for ensuring panels faithfully display what they receive; DisplayHDR is. For example, this UHD Premium TV does not qualify for DisplayHDR 1000 because its maximum luminosity is only 900 cd/m2. Depending on how well it can control lighting across the breadth of the panel, it may not qualify for DisplayHDR 600 or even DisplayHDR 400. It can display HDR10+ encoded video, though (big whoop-dee-doo).
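As an illustration of that qualification logic, here's a deliberately simplified check of just the peak-luminance requirement (the tier names correspond to the minimum peak brightness in cd/m2; real VESA certification also tests black level, color gamut, bit depth and local-dimming behaviour, which this sketch ignores):

```python
# Simplified DisplayHDR tier check: peak luminance only.
# Real certification also covers black level, gamut, bit depth, etc.
TIERS = [("DisplayHDR 1000", 1000), ("DisplayHDR 600", 600), ("DisplayHDR 400", 400)]

def best_tier_by_brightness(peak_cd_m2: float) -> str:
    for name, minimum in TIERS:
        if peak_cd_m2 >= minimum:
            return name
    return "no DisplayHDR tier"

# The UHD Premium TV from the example peaks at 900 cd/m2:
print(best_tier_by_brightness(900))  # "DisplayHDR 600" at best, brightness-wise
```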

You're talking about your "dream specs". Pretty strange to settle with crappy HDR then.
I said "ideal" not "dream." The monitor I want doesn't even exist thanks to the rarity of FreeSync 2 certified monitors (because fast response time, broad frequency response, and infinite dynamic contrast ratio) and 8:5 resolution ratio (because freakin' Hollywood killed it by throwing 16:9 at everyone).
 