
Better monitor vs. higher in-game settings? What's more important for a prettier and more immersive experience, in your opinion?

  • Better monitor: higher resolution/refresh rate/better panel/adaptive sync tech
    Votes: 25 (78.1%)
  • Higher in-game settings
    Votes: 7 (21.9%)
  • Total voters: 32
Is it possible to achieve great visuals while sacrificing the monitor and going all out on the GPU and settings?
Is it possible to achieve the best experience while sacrificing some of the visual settings in favor of a better panel?
What are your thoughts?

A theoretical question: would you prefer a GTX 1070 Ti/V56 paired with a quality 1440p high refresh panel and then dial back a few settings to achieve 90+ fps at all times, or would you take a 1080 Ti/R7/2080 and pair it with a 1080p/60 Hz display to crank up all the visuals in any game to the max and not worry about fps for a few years?

If you were to make a list, how would you prioritize these when it comes to your own preferences:

in game settings
resolution
monitor size
panel type
refresh rate
adaptive sync technologies
blur reduction technologies
aspect ratio

Choose 3 for most important, 2 for secondary importance, 1 for least important, and 0 for not important at all.
 
In game: Ultra 4K, Fast Sync on, fps capped to 60, on a 4K @ 60 Hz monitor.
The static image quality is more noticeable than fast movement quality. I don't think the eye can tell the difference between 60 and 120 Hz; the more natural frame rates we are used to in films and TV are 30-60 Hz.
 
In game: Ultra 4K, Fast Sync on, fps capped to 60, on a 4K @ 60 Hz monitor.
The static image quality is more noticeable than fast movement quality. I don't think the eye can tell the difference between 60 and 120 Hz; the more natural frame rates we are used to in films and TV are 30-60 Hz.
I think the exact opposite is true; that's why I think it's important to have decent resolution but also a fast panel.
Games are constant movement, whether it's you or the surroundings.
 
Interesting question. But definitely monitor.

The reason for this is that games are usually optimized around a 'basic' graphics level that is sufficient to convey the game's message, the experience. Everything beyond 'Medium' is just bonus really, it helps immersion further but it doesn't create immersion. It barely ever happens that a game gets released with quality settings like the 'Low' of the good old days, which basically meant a blurry, indistinct mess of pixels.

Next question is indeed what monitor aspects help immersion. But the basis of your question is focused entirely on 'immersive and pretty'. That is not the category I would file high refresh under. High refresh is an improvement in usability, in feedback from the game that enables you to respond to it better.

It's also game dependent. I'm playing Darkest Dungeon now... quite immersive, but it has no graphics settings and gains zero benefit from a good monitor beyond it having decent colors and contrast.

For monitor specs, the baseline for me, for high refresh/gaming that is:
- 120 Hz; higher is not required or noticeably beneficial
- IPS or VA, with a slight preference for VA
- Variable refresh tech does not command a price premium for me
- Must have strobing/ULMB and the accompanying sufficient luminance of 350 cd/m²
- Resolution can evolve naturally along with the mainstream, which is 1080p at this point, with 1440p being very feasible already, but still expensive.

In game: Ultra 4K, Fast Sync on, fps capped to 60, on a 4K @ 60 Hz monitor.
The static image quality is more noticeable than fast movement quality. I don't think the eye can tell the difference between 60 and 120 Hz; the more natural frame rates we are used to in films and TV are 30-60 Hz.

Fast Sync is meant for 100 Hz and up, where it provides a combo of eliminating tearing at a lower input lag cost than normal V-Sync. It is pointless at 60 fps at high detail; you're better off using Adaptive V-Sync there, as Fast Sync will produce stutter or not do a thing at all. At 60 Hz, Fast Sync's benefit is very low input lag, but only if you push uncapped fps far higher (2x or more) than the refresh rate.
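
To make the rule of thumb concrete, here's a minimal sketch. The function name and the exact 2x cutoff are just an illustration of the advice in this post, not anything from NVIDIA's documentation:

```python
# Sketch of the sync-mode rule of thumb described above. The 2x-refresh
# threshold is the heuristic from this post, not an official figure.

def sync_suggestion(refresh_hz: float, uncapped_fps: float) -> str:
    """Suggest a sync mode for a given refresh rate and uncapped frame rate."""
    if uncapped_fps >= 2 * refresh_hz:
        # Plenty of surplus frames: Fast Sync can discard stale frames and
        # scan out the newest one, cutting input lag without tearing.
        return "Fast Sync"
    if uncapped_fps >= refresh_hz:
        # Near the refresh rate Fast Sync tends to stutter; normal or
        # Adaptive V-Sync paces frames more evenly.
        return "V-Sync / Adaptive V-Sync"
    # Below the refresh rate, variable refresh is the usual answer.
    return "G-Sync / FreeSync"

print(sync_suggestion(60, 150))   # Fast Sync (150 >= 2 * 60)
print(sync_suggestion(144, 100))  # G-Sync / FreeSync
```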

 
Well... I do know that GTA V looks like $h*t @ 60 fps in my eyes...
 
I think the exact opposite is true; that's why I think it's important to have decent resolution but also a fast panel.
Games are constant movement, whether it's you or the surroundings.
Depends on what game you're talking about. FPS games might feel like that because things happen so fast. I don't play them all the time, and I don't choose my hardware specifically for twitch gaming. I really like my 4K display, but I think the panel itself has a lot to do with it. Color reproduction and image quality are bigger concerns for me, so I'll typically go with some form of IPS panel.

FWIW, games on a good panel look great at 4K.
 
Everything beyond 'Medium' is just bonus really, it helps immersion further but it doesn't create immersion.
There are exceptions, but very, very important ones. And they're game dependent.
I'll give you ambient occlusion for one.
When I turned on VXAO in Rise of the Tomb Raider it changed the experience completely. It's like the rocks, trees and whole environments became almost photorealistic.
HFTS in Watch Dogs 2 is another example. Very taxing, but damn, it's absolutely fantastic on the sunny streets of SF. Hard to quit playing once you see how it works.

@Vayra86 you're correct about Fast Sync too. I tried it in various games and the only scenario where I saw it work well was when running a 144 Hz panel at close to 200 fps. In other scenarios, e.g. 144 Hz mode but fps at 100, it's not working like it should. I can't exactly describe what is wrong, but you get the feeling that something is off. For lack of a fancier word, I'd say that Fast Sync just feels weird outside of its intended use scenario.
 
I don't think the eye can tell the difference between 60 and 120 Hz
I can tell the difference between 60 and 75 Hz, and it's clear.
 
I can tell the difference between 60 and 75 Hz, and it's clear.
I can tell 60 vs 65 easily, but the point is that what makes for better immersion depends on the individual. The fact that I can see 65 vs 60 doesn't mean I want to sacrifice settings to run it.
In most cases I'd dial back the graphics slightly to get 70 fps over 60, or 90 over 70. For 80 fps at better IQ vs 90 fps at lower IQ, I'd stay at 80. Above 90 it's entirely game dependent. There are as many games where I'd push for 120-130 fps as there are those where I'd be super comfortable with 90 and wouldn't feel the need for more at all.
I think it's got a lot to do with your vision too. For example, I have great near vision, and I'm very sensitive to fps/Hz and aliasing. But sitting on a couch with a game running on a TV, I couldn't tell 4K from 1080p with a gun to my head. I could see the fps changes easily though, just not the details.
 
A theoretical question: would you prefer a GTX 1070 Ti/V56 paired with a quality 1440p high refresh panel and then dial back a few settings to achieve 90+ fps at all times, or would you take a 1080 Ti/R7/2080 and pair it with a 1080p/60 Hz display to crank up all the visuals in any game to the max and not worry about fps for a few years?
Neither. To me, both options blow. My answer is a 1080 Ti on a 1440p monitor that will do 75 Hz as long as games will output it, but frequently runs at 60.

My reasoning is this: at 1080p, you are CPU bound and not going to get all you can out of the GPU with a 1080 Ti or above. And with larger monitors, going high refresh rate means that in order to keep my maxed out settings, I have to buy a new GPU more frequently. That is not an option I can afford, nor want.
 
Neither. To me, both options blow. My answer is a 1080 Ti on a 1440p monitor that will do 75 Hz as long as games will output it, but frequently runs at 60.

My reasoning is this: at 1080p, you are CPU bound and not going to get all you can out of the GPU with a 1080 Ti or above. And with larger monitors, going high refresh rate means that in order to keep my maxed out settings, I have to buy a new GPU more frequently. That is not an option I can afford, nor want.
So you prefer a balance, like most of us.
I wouldn't say they blow. And if they do, there's one that blows more. I'd certainly prefer a better monitor, even if that meant some sacrifice on the GPU side.

At 1080p you're exactly the same amount of CPU bound as you'd be at 1440p; the CPU's work per frame doesn't change with resolution.
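
To illustrate that with a toy model (all the costs below are made-up numbers, purely to show the mechanism): the CPU's per-frame cost stays roughly flat across resolutions, the GPU's cost scales with pixel count, and whichever is slower sets the frame rate.

```python
# Toy frame-time model: effective fps is limited by whichever of the CPU
# or GPU takes longer per frame. CPU time is roughly resolution-
# independent; GPU time is modeled as proportional to megapixels drawn.

CPU_MS = 7.0           # hypothetical CPU frame time -> ~143 fps ceiling
GPU_MS_PER_MPIX = 3.0  # hypothetical GPU cost per megapixel

def effective_fps(width: int, height: int) -> float:
    gpu_ms = GPU_MS_PER_MPIX * (width * height / 1e6)
    return 1000 / max(CPU_MS, gpu_ms)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: {effective_fps(w, h):.0f} fps")
# 1920x1080: 143 fps (CPU bound -- the GPU only needs ~6.2 ms here)
# 2560x1440: 90 fps  (GPU bound)
# 3840x2160: 40 fps  (GPU bound)
```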

And with larger monitors, going high refresh rate means that in order to keep my maxed out settings, I have to buy a new GPU more frequently. That is not an option I can afford, nor want.
Exactly the area my question is touching.
Firstly, whether or not maxing out settings is in any way more immersive than a better monitor.
And secondly, whether maxing out settings on a lower grade monitor is any more of a long-term strategy than sacrificing a bit of IQ for a better monitor.
The more I follow new game releases, the more I'm inclined to believe that maxing out settings provides much less in return than buying a better monitor and tinkering with the settings if necessary.
 
Give me a large quality screen over higher refresh and more pixels on a cheaper, smaller screen. I'll take my Dell 27" IPS 1080p 75 Hz over a cheap 24" TN 1440p 122 Hz panel.
 
I think there is a point where, no matter how ultra your graphics settings may be, if you don't have a sufficiently adequate monitor to enjoy them, there is little to gain.

Although not a high refresh rate panel or 10-bit color, my current monitor is IPS for one main reason: I don't tolerate the color shifting one bit just because I tilt my head a few degrees to the left, to the right, up or down. I freaking hate TN panels because of that.
 
I just changed my monitor a few months back. I had an Acer 1080p monitor @ 60 Hz and changed to an Electriq 4K @ 60 Hz.
The improvement is very good, but the loss of fps, as in not hitting 60 fps on ultra settings, is disappointing.
I have never used or seen a high refresh rate monitor, so I can't comment on that.
The worst thing I've experienced with the new monitor is the HDR feature; it looks like crap and most games look like crap with it on.

How have you found the effect of HDR on your monitors?

I will say that even though the 4K is better, the Acer 1080p monitor was a better class than my cheap 4K. So in my opinion, get a brand that is known to be good, as cheaping out would be a mistake.
I had to RMA an Acer monitor once: excellent service. They paid to collect it and returned it in 7 days, paying the return shipping too.
 
The more I follow new game releases, the more I'm inclined to believe that maxing out settings provides much less in return than buying a better monitor and tinkering with the settings if necessary.
Well, keep in mind, nowhere did I say anything about preferring a crappier monitor. The IQ of the monitor is important to me too, just not high refresh (75 Hz is fine for me), because I am not forcing my graphics cards into early retirement and can still use max settings in game on my 1440p IPS.

There are so many options because there are no “right” answers. Everyone has their own preferences.
 
Is it possible to achieve great visuals while sacrificing the monitor and going all out on the GPU and settings?
Is it possible to achieve the best experience while sacrificing some of the visual settings in favor of a better panel?
What are your thoughts?

A theoretical question: would you prefer a GTX 1070 Ti/V56 paired with a quality 1440p high refresh panel and then dial back a few settings to achieve 90+ fps at all times, or would you take a 1080 Ti/R7/2080 and pair it with a 1080p/60 Hz display to crank up all the visuals in any game to the max and not worry about fps for a few years?

If you were to make a list, how would you prioritize these when it comes to your own preferences:

in game settings
resolution
monitor size
panel type
refresh rate
adaptive sync technologies
blur reduction technologies
aspect ratio

Choose 3 for most important, 2 for secondary importance, 1 for least important, and 0 for not important at all.
Hell no. You can set the highest settings you can on a 2080 Ti, 4K textures in a game, everything.

But with a 1080p or 1440p monitor, it cannot actually show you a 4K texture. What you see is an interpretation of what you asked for; you have to have the right optics/tool for the job you're going to ask of it, first.

I had crap monitors for years (unknown brands, second hand and what have you, £70), but I learned this after getting a 4K.
 
There is always a weaker link.
The static image quality is more noticeable than fast movement quality,
I agree with this, but will add "most of the time". I am not a hardcore gamer, so for me, most of the time, the background and much of the foreground imagery is static. And I want that to be of superb quality. And even in games, there is NOT constant movement, at least not in the background. Much of the time, yes. Constant? No.

And I bet even hardcore gamers are not gaming all of the time. I suspect some of the time they are surfing the internet, checking email and social media, or posting comments at TPU.

On the flip side, when gaming or with any animation, I sure don't want a bunch of blurry jaggies or artifacts either.

My point is, theoretical or not, I don't think it is a valid question until you (speaking to everyone) define which value you personally feel is more important, based on what you do with your computer. In other words, it is a totally individual decision (I guess budget might be a factor too).

For me, I want it all - except speakers. I don't want speakers in my monitors. I also want thin bezels (because I always go with a dual monitor setup) and a must is height adjustment because I hate putting phone books or reams of paper under monitors. Plus my monitors must still fit under my desk's hutch. And my chair height is set for my legs and back - as it should be.
 
and secondly, whether maxing out settings on a lower grade monitor is any more of a long-term strategy than sacrificing a bit of IQ for a better monitor.
The more I follow new game releases, the more I'm inclined to believe that maxing out settings provides much less in return than buying a better monitor and tinkering with the settings if necessary.

This is a good observation. It has to do with the way games are optimized for the current console crop. They like to put '4K' on the box as the resolution, so tricks are deployed within that render to maintain reasonable FPS. This means that the 'baseline' of graphics quality goes up (because that is what gets optimized towards), but there is little to gain beyond it (the majority of sold games won't see it used). In some cases you see 'Ultra HD texture packs' post-release to cater to the increased capabilities of PCs, and you get easy-to-implement but expensive post effects as well.

With the updated '4K capable' console versions, what you get at 4K is often comparable to a 1080p ~ 1440p native render on the PC. That goes for many parts of the scene, while some others, such as the UI, are true native renders. DLSS is another thing that does this for the PC, making it an enabler for 4K at the cost of a small quality hit. In my view this is useful only because it allows the use of a wider range of monitor resolutions, but I do think it's a mistake to conclude 'there isn't more to gain' out of 4K, for example.

But with a 1080p or 1440p monitor, it cannot actually show you a 4K texture

Euh... what? A '4K' texture is nothing but a detail resolution within that texture. When you walk closer to it, you can appreciate all the detail it has to offer, and it will lose that detail as you move further away from it, regardless of what monitor you use.

This is remarkably similar to the loss of detail you experience from viewing distance: being able to discern individual pixels or not.

You're rendering a 3D environment here... it's not a static painting.
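
For those curious, the mechanism behind this is mipmapping: the renderer picks a lower-resolution copy of the texture based on how many texels land on how many screen pixels. A simplified sketch of the idea (real GPUs derive this per-pixel from screen-space texture-coordinate derivatives; this just uses the overall ratio):

```python
# Simplified mip-level selection: when a texture covers fewer screen
# pixels than it has texels, the renderer samples a downscaled copy.
# Mip 0 is the full-resolution texture; each level halves it.
import math

def mip_level(texture_width_texels: int, screen_width_pixels: float) -> float:
    ratio = texture_width_texels / screen_width_pixels
    return max(0.0, math.log2(ratio))

# A 4096-texel-wide ("4K") texture on a wall:
print(mip_level(4096, 2000))  # up close, ~1.0 -> close to full detail
print(mip_level(4096, 125))   # far away, ~5.0 -> heavily downscaled,
                              # regardless of the monitor's resolution
```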
 
Hell no. You can set the highest settings you can on a 2080 Ti, 4K textures in a game, everything.

But with a 1080p or 1440p monitor, it cannot actually show you a 4K texture. What you see is an interpretation of what you asked for; you have to have the right optics/tool for the job you're going to ask of it, first.

I had crap monitors for years (unknown brands, second hand and what have you, £70), but I learned this after getting a 4K.
What are you talking about, dude?
Like you can't tell a 4K photo from a 1080p photo on a 1080p monitor?
 
What are you talking about, dude?
Like you can't tell a 4K photo from a 1080p photo on a 1080p monitor?
They don't look the same though: a 4K photo on 1080p vs a 4K photo on 4K.
 
They don't look the same though: a 4K photo on 1080p vs a 4K photo on 4K.
That is correct.
But buying a 4K display currently means being forced to run 60 Hz on panels that aren't exactly the fastest, which means you'll get an ultra sharp static image but large amounts of blur once in motion.
I believe the middle-of-the-road solution, 1440p on a high refresh panel, gives you the best balance out of currently available options. My own monitor being 24" with ULMB and very well implemented overdrive only makes it better when it comes to static vs motion sharpness.
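
To put a rough number on the static-vs-motion tradeoff: on a sample-and-hold panel each frame stays lit for the whole refresh interval, so a moving object smears across roughly speed times persistence pixels, and strobing (ULMB) shortens that lit time. The panning speed below is invented for illustration:

```python
# Back-of-the-envelope motion blur on a sample-and-hold panel:
# smear (pixels) ~= object speed (px/s) * time each frame stays lit (s).
# ULMB/strobing flashes the backlight briefly, cutting the lit time.

def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    return speed_px_per_s * persistence_ms / 1000

SPEED = 960  # hypothetical pan: half a 1920-pixel-wide screen per second
print(blur_px(SPEED, 1000 / 60))   # 60 Hz sample-and-hold: ~16 px smear
print(blur_px(SPEED, 1000 / 144))  # 144 Hz: ~6.7 px
print(blur_px(SPEED, 2.0))         # ~2 ms ULMB strobe: ~1.9 px
```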

The IQ of the monitor is important to me too, just not high refresh, because I am not forcing my graphics cards into early retirement and can still use max settings in game on my 1440p IPS.
Empty words.
This would be true only if you could prove that max settings take a lower performance hit over time, let's say at 1440p/60, than raising your target fps to 100 with adjusted settings.
IMO, and beware, this is subjective, it's more of a fool's game to chase after maxing every slider than to set your fps target higher and strike a balance with the IQ settings.

Look at a 1080 Ti at maxed out settings: barely 30 fps.

Now look at plain Ultra: a 1.60x performance uplift.

Is it worth it?
Because the story is always the same: once you get the settings to high/very high, it's usually a lot of performance sacrificed for limited visual returns if you want to go all out on max.
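
The frame-time arithmetic behind that, using this post's numbers:

```python
# Frame-time cost of the last few sliders, using this post's numbers:
# ~30 fps fully maxed vs a 1.60x uplift at plain Ultra.

maxed_fps = 30
ultra_fps = maxed_fps * 1.60     # 48 fps

maxed_ms = 1000 / maxed_fps      # ~33.3 ms per frame
ultra_ms = 1000 / ultra_fps      # ~20.8 ms per frame
print(f"Ultra: {ultra_fps:.0f} fps, {ultra_ms:.1f} ms/frame")
print(f"Maxed: {maxed_fps} fps, {maxed_ms:.1f} ms/frame")
print(f"Max-only eye candy costs {maxed_ms - ultra_ms:.1f} ms per frame")
# ~12.5 ms -- as much GPU time as an entire extra frame at 80 fps.
```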
 
That is correct.
But buying a 4K display currently means being forced to run 60 Hz on panels that aren't exactly the fastest, which means you'll get an ultra sharp static image but large amounts of blur once in motion.
I believe the middle-of-the-road solution, 1440p on a high refresh panel, gives you the best balance out of currently available options. My own monitor being 24" with ULMB and very well implemented overdrive only makes it better when it comes to static vs motion sharpness.
That ignores the fact that I can overclock my panel to 85 Hz at 1440p and just run it at that for Apex, which I do tend to do for latency.

But for any single player game, I would go for max IQ at 4K and lower fps; I've got a FreeSync panel.

So all in all, I'm with Bill: when it comes to monitors it depends on the person and their tasks, but getting the right tool, as I said, increases your effectiveness at getting the image you want, when you want it.

@Vayra86 I know and get that, but I was noting that on a 1080p panel with 4K textures in any scene, you can never see a 4K image; even if you move the viewport directly to a decal or tree or wall, it all looks blurred, even static.
 
That ignores the fact that I can overclock my panel to 85 Hz at 1440p and just run it at that for Apex, which I do tend to do for latency.

But for any single player game, I would go for max IQ at 4K and lower fps; I've got a FreeSync panel.

So all in all, I'm with Bill: when it comes to monitors it depends on the person and their tasks, but getting the right tool, as I said, increases your effectiveness at getting the image you want, when you want it.

@Vayra86 I know and get that, but I was noting that on a 1080p panel with 4K textures in any scene, you can never see a 4K image; even if you move the viewport directly to a decal or tree or wall, it all looks blurred, even static.
Well, I certainly could not have ignored a fact which you hadn't disclosed earlier ;)
 
Well, I certainly could not have ignored a fact which you hadn't disclosed earlier.
Fair enough :)

Also of note then: I do have a 49" LG HDR 4K TV. While it is not terrible to game on, I do not really ever game on it; I use it exclusively for driving games with a wheel and chair, just because for that task it's OK, but a proper PC monitor just feels better to my eyes long term.

And HDR is a crapshoot. In the likes of Shadow of the Tomb Raider it can look good at times, but you also have moments of WTAF, where am I, and where do I go to see something, especially on a sunny day; games can get unplayable, which is not usually the case on a monitor without HDR.
 
The thing I noticed most about 4K was not the textures, even though they are much sharper; it's the models. They have more detail than anything, and you can see that detail from much further away.
Thanks MrK, I was wondering if it was my monitor or HDR that was crap.
 