
GPUs: what is enough?

What resolution do you currently game at?

  • 2160p (3840x2160) 120Hz+

  • 2160p (3840x2160) 60Hz

  • 1440p (2560x1440) 120Hz+

  • 1440p (2560x1440) 60Hz

  • 1080p (1920x1080) 120Hz+

  • 1080p (1920x1080) 60Hz

  • 900p (1600x900) 120Hz+

  • 900p (1600x900) 60Hz

  • 768p (1366x768) 60Hz+

  • Potato (below 768p) 60Hz+


Point and click games yeah....
Not only. I'm currently playing Mass Effect: Andromeda. It fluctuates between 35-55 FPS on my 6500 XT and I find it fine. Not perfect, but fine.
 
I currently have a 4K 160Hz monitor.
The next GPU upgrade (which I'll really need soon if I don't want to play at 4K60 with slightly reduced settings or upscaling) would be a 4090.
I won't pay €2000 for a GPU, so I'd rather go back to 1440p today. I'll sell at least one of my 4K monitors and play games at 1440p again (keeping the GN950).
 
Not only. I'm currently playing Mass Effect: Andromeda. It fluctuates between 35-55 FPS on my 6500 XT and I find it fine. Not perfect, but fine.

I guess it just depends on what you got used to...
 
It just depends on what you got used to...
Exactly. I grew up on DOS and early Windows games, when we didn't even know what FPS was. Then I had a Pentium II for 6 years, so I know very well what stuttering is. :D 30 FPS is not stuttering.
 
Let's see, I got:
1440p 60Hz in 2012
1440p 144Hz in 2014
1440p UW 120Hz in 2017
4K 120Hz in 2020

Maybe 4K 240Hz is in order in the near future :D
 
I play at 1080p 60Hz, so on my main desktop I simply disable one monitor. But when playing MSFS2020, I play across both monitors using Eyefinity.
 
I haven't seen a single game that doesn't look butter smooth at 60 FPS. Most of them are OK even at 30 (to me at least).

Same here; as long as it's 45-50 FPS, I can play pretty much anything without being bothered, as long as it's not filled with stutters and such.
30 depends on the game, but I did finish Mafia 3 with the 30 FPS in-game cap, because whenever I disabled it the game kept randomly crashing and nothing else fixed it, so I just dealt with the 30 FPS mode and played like that. It took me 2-3 days or so to get used to it.

I did play GTA V on my old i3-4160 + GTX 950 system with sub-60 FPS drops, and I honestly wasn't bothered, nor did it feel like a slideshow. When I upgraded to a 1600X I played the game all the same anyway, even though I was getting much better frames, because the i3 was limiting even the 950 in that game when driving fast.

Though yeah, this is an entirely subjective topic and people will most likely never agree, so I'll leave it at that. :laugh:
 
Highest resolution possible @lowest refresh rate

First-person shooters give me a headache. I've always liked games that were, well, like movies, so why bother going above 30 FPS?

I like to watch the scenery and enjoy the music in games, not aim at enemies and shoot them within a fraction of a second. Basically, I play GTA with cheat codes: kill them all and let me drive around town.

As for Forza and NFS, even our brain creates a sort of motion blur in real life. You can't drive in real life without motion blur.

P.S.: I wish the OP could make the votes public.
 
First-person shooters give me a headache. I've always liked games that were, well, like movies, so why bother going above 30 FPS?

I like to watch the scenery and enjoy the music in games, not aim at enemies and shoot them within a fraction of a second.
Same here. :) I think it's worth separating "games as an experience" from "games as sport". GPU and monitor makers tend to say that you need the latest and greatest for games, which I think is only true if you play games to sharpen your reflexes in online battles - which I don't. Even the concept of massively online gaming is alien to me. I spend 40 hours per week among a massive group of people in a relatively fast-paced environment. I don't mind it, but it's more than enough. When I'm gaming, I prefer to be alone and relax. :ohwell:
 
I game on a Samsung QLED 4k screen. It's just 60 Hz. I use an RTX 3090. I managed pretty ok with a GTX 1080 Ti for a good while before finally picking up the 3090. Now I can play many games maxed out at 4K but not all. Even the 3090 can't do it all.
 
I game on a Samsung QLED 4k screen. It's just 60 Hz. I use an RTX 3090. I managed pretty ok with a GTX 1080 Ti for a good while before finally picking up the 3090. Now I can play many games maxed out at 4K but not all. Even the 3090 can't do it all.
Jensen said the 3090 is for 8K, lmao.
What game? Cyberpunk?
60Hz is sweet; is it a TV?
 
1200p60 with maxed out graphics is the way I game. I play single player titles exclusively, usually older games. Immersive sim / open world are among my favorite genres. I spend considerable time in these simply taking in the environment and exploring the game world, while checking out every nook and cranny. I also examine level geometry, textures and models up close. Call me weird :laugh:

Needless to say, I don't need high frame rates, but I do want maximum visual fidelity. I find 40 fps to be a playable minimum in most types of games, while 30 fps already feels choppy. That said, a smooth frame rate with even frame pacing and sufficiently high 1% and 0.1% lows is much more important to me than the maximum fps value or a high refresh rate.
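
For anyone wondering how I actually get those numbers: roughly like this. A minimal sketch, assuming a plain text log with one frametime in milliseconds per line (the kind of thing RTSS or CapFrameX can export); the file name is a placeholder, and I'm using the "average of the slowest 1% / 0.1% of frames" convention, which is only one of several definitions floating around.

```python
# Minimal sketch: average FPS plus 1% / 0.1% low FPS from a frametime log.
# Assumes one frametime in milliseconds per line; "frametimes.txt" is a
# placeholder name, and the "average of the slowest N% of frames" convention
# is only one of several in use.

def fps_stats(frametimes_ms):
    frametimes_ms = [ft for ft in frametimes_ms if ft > 0]
    fps = sorted(1000.0 / ft for ft in frametimes_ms)   # slowest frames first
    n = len(fps)
    avg = n * 1000.0 / sum(frametimes_ms)               # total frames / total seconds
    low_1 = fps[: max(1, n // 100)]                     # slowest 1% of frames
    low_01 = fps[: max(1, n // 1000)]                   # slowest 0.1% of frames
    return avg, sum(low_1) / len(low_1), sum(low_01) / len(low_01)

if __name__ == "__main__":
    with open("frametimes.txt") as f:                   # placeholder file name
        times = [float(line) for line in f if line.strip()]
    avg, p1, p01 = fps_stats(times)
    print(f"avg {avg:.1f} fps | 1% low {p1:.1f} fps | 0.1% low {p01:.1f} fps")
```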

As long as my rig can fulfil these criteria, I see no reason to go higher res. I also like the 16:10 aspect ratio, and as far as I can tell no such displays are being made at present.

EDIT: And I play DOS classics in glorious 200p or 480p. Forgot to pick the "potato" option in the poll :oops:
 
I also examine level geometry, textures and models up close. Call me weird :laugh:
That's exactly what I do.
When Claire Redfield enters the police station, the textures on the left and right sides of the main gate are symmetrical. It was dark and rainy, but I examined the cracks from every possible angle.

I drive at 2 km/h in Forza and look at the ceilings in tunnels. Damn, repetitive textures.
I streamed GTA to my sister on Skype and told her: look, the stains on the pavement repeat here, and here, and again here.

I guess they could use two textures, one for the base material surface and a secondary blended texture for dust and cracks, but instead they copy-paste assets. Multiple UV maps...
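
Something like this, in spirit. This is only a toy sketch of the idea, not how any particular engine does it; the NumPy arrays, the blend mask and the detail_uv_scale knob are all made up for illustration, standing in for a shared grime layer sampled through a second UV set on top of the base material:

```python
import numpy as np

def blend_detail(base, detail, mask, detail_uv_scale=4):
    """Darken a base texture with a tiled grime/detail layer where mask > 0.

    base:   (H, W, 3) albedo in [0, 1]
    detail: (h, w, 3) small, reusable grime texture in [0, 1]
    mask:   (H, W) blend weight in [0, 1], e.g. painted dust/cracks
    """
    H, W = base.shape[:2]
    h, w = detail.shape[:2]
    # Nearest-neighbour tiling at a different rate than the base UVs,
    # standing in for sampling the detail texture with a second UV set.
    ys = (np.arange(H) * detail_uv_scale) % h
    xs = (np.arange(W) * detail_uv_scale) % w
    tiled = detail[ys][:, xs]
    m = mask[..., None]
    return base * (1.0 - m) + base * tiled * m

# Tiny usage example with random data standing in for real textures.
base = np.random.rand(256, 256, 3)
grime = np.random.rand(64, 64, 3)
mask = (np.random.rand(256, 256) > 0.8).astype(float)
result = blend_detail(base, grime, mask)
```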
 
Jensen said the 3090 is for 8K, lmao.
What game? Cyberpunk?
60Hz is sweet; is it a TV?
Yeah. It's a 55" screen. I love it for gaming.
 
My B7 is still quite enjoyable - 4K 60 on easier titles and 1080p 120Hz on more demanding / twitchy games.

Can't wait until I can finally afford a new card (can't wait for the 4050 to go on clearance so I can replace my GTX 960 with something more modern).
 
My B7 is still quite enjoyable - 4K 60 on easier titles and 1080p 120Hz on more demanding / twitchy games.

Can't wait until I can finally afford a new card (can't wait for the 4050 to go on clearance so I can replace my GTX 960 with something more modern).
I don't think a 4050 will have enough VRAM for a 4K display. You'd better pick a 3080 or an AMD card.
 
I play and try to have a 30 fps minimum.

I was content with my GTX 570s in SLI and played across multiple resolutions (1680x1050, 5040x1050, 1080p and finally 5760x1080) for 4.5 years.

The most demanding game I played last with my 570s in SLI was Far Cry 3. I could keep between 30-45 fps at 5760x1080 with lower settings, but with only 1.25GB of effective VRAM the cards were choking. Sure, there were some little jerks here and there while playing, but it didn't bother me. Having only 1.25GB of VRAM was the downfall of those cards. Had I known the GTX 570 would come out in a 2.5GB version, I would have gone with that, and I'm certain it would have given me another year or two. I wanted to keep playing at the ultrawide resolution on 3 monitors, so I opted for a high-end card from Maxwell.

I moved on to a 980 Ti and used it for 6 years, running games across multiple resolutions (1080p, 5760x1080 and 1440p). Had my 980 Ti not developed an issue with its fans randomly ramping up to 100%, I'd have stayed on it longer and skipped Ampere and RDNA 2.

As long as I can get 30+ fps, I'm content. I've never been a graphics whore, so I have no issue dropping settings to keep a framerate I can enjoy. I like having a high-end card from a new generation (it used to be two mid-high-end cards for SLI) and using it for 4-6 years. I'm going to be rocking my 3080 10GB for hopefully another 4-5 years (I've already had it a year now) and it handles games way past my needs.

Currently I'm on a 2560x1440 165Hz FreeSync monitor. I've actually got it set to 120Hz and I set the frame limit in games to 120. Any game I play hits that limit just fine, and sometimes, just because I can, I set the frame limit to 60 and games still look great (slightly smoother at 120 fps, but 60 fps is more than enough for my needs).
 
28" 4K 60Hz from 2015-2018
34" 3440 x 1440 60Hz 2018/19 onwards

My current Asus ROG and LG UW monitors support 100Hz and 165Hz respectively, but I'd only switch to 100Hz for a bit of eyestrain relief; otherwise, I keep the custom loop cooler by running at 60Hz.
 
I play and try to have a 30 fps minimum.

I was content with my GTX 570s in SLI and played across multiple resolutions (1680x1050, 5040x1050, 1080p and finally 5760x1080) for 4.5 years.

The most demanding game I played last with my 570s in SLI was Far Cry 3. I could keep between 30-45 fps at 5760x1080 with lower settings, but with only 1.25GB of effective VRAM the cards were choking. Sure, there were some little jerks here and there while playing, but it didn't bother me. Having only 1.25GB of VRAM was the downfall of those cards. Had I known the GTX 570 would come out in a 2.5GB version, I would have gone with that, and I'm certain it would have given me another year or two. I wanted to keep playing at the ultrawide resolution on 3 monitors, so I opted for a high-end card from Maxwell.

I moved on to a 980 Ti and used it for 6 years, running games across multiple resolutions (1080p, 5760x1080 and 1440p). Had my 980 Ti not developed an issue with its fans randomly ramping up to 100%, I'd have stayed on it longer and skipped Ampere and RDNA 2.

As long as I can get 30+ fps, I'm content. I've never been a graphics whore, so I have no issue dropping settings to keep a framerate I can enjoy. I like having a high-end card from a new generation (it used to be two mid-high-end cards for SLI) and using it for 4-6 years. I'm going to be rocking my 3080 10GB for hopefully another 4-5 years (I've already had it a year now) and it handles games way past my needs.

Currently I'm on a 2560x1440 165Hz FreeSync monitor. I've actually got it set to 120Hz and I set the frame limit in games to 120. Any game I play hits that limit just fine, and sometimes, just because I can, I set the frame limit to 60 and games still look great (slightly smoother at 120 fps, but 60 fps is more than enough for my needs).
Some people would argue that the 10 GB on your card won't be enough for 4-5 years at 1440p, but I'm not sure. I have a 2070 8 GB that can play everything with maxed graphics at 1080p. I currently have a 6500 XT 4 GB in my system (the 2070 is too loud) and even this is enough most of the time. I think you'll be fine. :)
 
Cheated a bit and chose "2560x1440 120Hz", though my monitor is 3440x1440 100Hz. My last monitor was 2560x1440 144Hz.

My system is tuned for quiet operation, so I globally limit my framerates to 95 FPS.

I consider the 40s to be the low end of playable, though some games are fine at 30.

I'd love to upgrade to that QD-OLED ultrawide that Alienware has at some point, but that would become the single most expensive component of my setup...
 
15 people already game at 3840x2160 @ 120Hz+, a category I joined a few months ago.

It's certainly a lot of pixels to fill, and the ongoing requirement to service 4K60 (minimum, aiming for 120) is a hefty bill to keep footing.

Having said that, it boggles my mind when I see people genuinely and disparagingly complain that 3080/3090-tier cards are rubbish for 4K. They're absolutely not; they're already extremely capable at that resolution. But you can't turn every dial in every AAA game to 11 and expect 120 fps, which I guess is what they expect, given the level of entitlement it comes across with.

In fact, I've managed to play many an older/niche title at 8K and get 60+ fps. I'm currently really enjoying Mario Kart 8 Deluxe on Yuzu at 8K render resolution, silky smooth 60Hz with BFI enabled, and the 3080 rarely hits its rated boost clock and hasn't once topped 50% GPU usage.

GTA V looks horrible @ 60 fps
I'd be lost without my RTSS + MSI AB on-screen display with frametime graph; it easily lets me fettle settings to ensure smoothness. Some games just play like shit in borderless etc. I was getting a straight 60 fps in borderless in MK8 and it felt janky and off; in exclusive fullscreen that 60Hz is absolutely silky smooth.

4K and huge refresh hz is pure marketing bs.
Hard disagree there; my time so far on a 4K120 OLED has been absolutely stunning. And, for what I want from games, it's so far from BS that I can't even fathom the notion.

Is it for everyone? Of course not; for most, the rig and monitor are prohibitively expensive for playing the same games that lesser setups can easily also play.

Is it BS? Hell no.
 
Some people would argue that the 10 GB on your card won't be enough for 4-5 years at 1440p, but I'm not sure. I have a 2070 8 GB that can play everything with maxed graphics at 1080p. I currently have a 6500 XT 4 GB in my system (the 2070 is too loud) and even this is enough most of the time. I think you'll be fine. :)
Change the thermal paste on the 2070 to make it quieter, or sell it before it becomes dirt cheap.
 
Change the thermal paste on the 2070 to make it quieter, or sell it before it becomes dirt cheap.
Thermal paste/pads, a custom fan curve and a relatively heavy-handed undervolt would have the card whisper-quiet imo, but like @AusWolf I do enjoy getting the most out of lower-end / power-sipping stuff too. It'd be very interesting to see a drastically undervolted 2070's performance.
 
Is it BS? Hell no.
We just disagree on that then, simple enough. You're gaming on a 42-inch TV, which isn't a desktop setting, and you might see some return on the increased pixel count in your setup, I don't know. But for myself, having experienced both a 1080p TV situation and a much higher-res desktop situation on a 34-inch ultrawide with practically 4K horizontal resolution (3440), I can safely say I see absolutely no point in upping the res to 4K for 16:9. The extra width works in a desktop setting, but extra height would make it uncanny. A 34-inch UW is already at the very limit of what's feasible for gaming: your peripheral vision gets filled horizontally, but you're already not 'seeing everything' on screen, and you need to move your head all the time to get there, OR sit back further; in both cases pixel density is high enough to skip AA. I find myself moving UI elements in games not to the edge but to the middle/bottom/top of the screen because the edges are really too 'far away'. Productivity-wise, on ultrawide, I find myself using WIN + arrow keys to move a (browser) window to half of the screen and put another one next to it. Effectively, I just 'view' half the screen.

On a 42-inch you're looking at similar trade-offs, albeit with a bigger diagonal forcing you to sit back further, OR to put up with bad ergonomics. But if I sit back further, that is, at 2 m or more, which is the very least for anything 40-inch 16:9 and up to feel comfortable, the bonus of those extra pixels in 4K is quickly eliminated versus looking at a similar image at 1080p. I just don't see the difference anymore. My Samsung TV does 4K, but I can safely nudge back to 1080p, and at a normal viewing distance with the same relative scaling of elements on screen, I really can't see anything different. Suffice to say, I run the PC at 1080p on that TV; 4K only adds latency and makes browser content slower.

I've experienced all the different situations over time, and this is what I've arrived at: 4K and up is marketing/innovation for the 'must always have moar' crowd. It's not for me; all it does is massively increase the required processing power for, realistically, no advantage at all.

And let's take a look at the horizon ahead of us. GPU advancements are now slowly but surely being achieved not by making things better, but by making them bigger and more power hungry. Graphical fidelity in games has pretty much plateaued, or is at least suffering from heavy diminishing returns. RT adds yet another performance hog. Is 4K feasible? I think it's beyond that, and you've already attested to that yourself by saying you can't run with all the bells and whistles even on top-end GPUs. Who are we kidding here? Efficiency, in gaming, is absolute king, so the preferable PPI/diagonal is one where you see everything, aren't counting pixels, and aren't wasting pixels you won't see properly. It feels totally counterproductive to me to sacrifice IQ to get higher resolution, while the opposite does show a difference. Let's not forget that games are not getting lighter regardless of resolution, either.
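
To put a very rough number on that viewing-distance point, here's a back-of-the-envelope sketch of pixels per degree of visual angle. It's not vision science; the 42-inch size, the 2 m distance and the ~60 PPD rule of thumb for 20/20 acuity are just the usual ballpark assumptions:

```python
import math

def pixels_per_degree(diagonal_inches, horizontal_px, distance_m):
    """Approximate pixels per degree of visual angle for a 16:9 panel."""
    width_m = diagonal_inches * 0.0254 * 16 / math.hypot(16, 9)   # panel width in metres
    px_per_m = horizontal_px / width_m
    one_degree_m = 2 * distance_m * math.tan(math.radians(0.5))   # width of 1 degree at that distance
    return px_per_m * one_degree_m

# Example numbers only: a 42-inch 16:9 screen viewed from 2 metres.
for res in (1920, 3840):
    print(f'{res} px wide at 2 m on 42": {pixels_per_degree(42, res, 2.0):.0f} PPD')
```

On those assumptions, 1080p already lands around 70 PPD, past the ~60 PPD that 20/20 vision is usually said to resolve, so 4K mostly doubles a number you arguably can't see from the couch anyway.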
 
We just disagree on that then
I suppose so, and your opinion on the matter couldn't really be further from mine, then. I sit roughly 2-4 feet away depending on the type of game (with a controller for more relaxed games, more like ~4 ft; with KB/mouse for shooters, more like ~2 ft), and it's easily evident how much clearer and richer with detail 4K is. As for ergonomics, people can judge that however they see fit, I suppose; I have zero issues, and haven't since I got a chair that actually supports me.

Gaming at 6 ft+ and sitting back on a couch, I can appreciate that 4K means less and less. I'm at the other end of the scale, rendering less intense things at 8K, because the difference between an 8K downsampled render and 4K native is immediately evident and impressive to my eyes, and that really caters to what I want and enjoy from gaming.

I come unstuck a lot with the words people choose when voicing their opinion. For example, I put a lot of emphasis on choosing my words to convey my points as accurately and objectively as possible, like, say, "I've tried 4K/ultra-high refresh and don't see the merit for myself personally, and I'd argue a majority would agree, yet the marketing makes it seem like a necessity", rather than "it's pure marketing bs". But naturally, you do you. Clearly I've butted heads many a time on these forums over that notion before.
 