
GPUs: what is enough?

What resolution do you currently game at?

  • 2160p (3840x2160) 120hz+

  • 2160p (3840x2160) 60hz

  • 1440p (2560x1440) 120hz+

  • 1440p (2560x1440) 60hz

  • 1080p (1920x1080) 120hz+

  • 1080p (1920x1080) 60hz

  • 900p (1600x900) 120hz+

  • 900p (1600x900) 60hz

  • 768p (1366x768) 60hz+

  • Potato (below 768p) 60hz+


I suppose so, and your opinion on the matter couldn't really be further from mine then. I sit roughly 2-4 feet away depending on the type of game (controller for more relaxed games, more like ~4ft; KB/mouse for shooters, more like ~2ft), and it's easily evident how much clearer and richer in detail 4K is. As for ergonomics, people can judge that however they see fit, I suppose; I have zero issues and haven't had any since I got a chair that actually supports me.

Gaming at 6ft+ and sitting back on a couch, I can appreciate that 4K means less and less. I'm at the other end of the scale, rendering less intense things at 8K, because the difference between an 8K down-sampled render and 4K native is immediately evident and impressive to my eyes, and that really caters to what I want and enjoy from gaming.

I come unstuck a lot with the words people choose to use when voicing their opinion. For example, I put a lot of emphasis on choosing my words to most accurately and objectively convey my points, like, say, "I've tried 4K/ultra high refresh and don't see the merit for myself personally, and I'd argue a majority would agree, yet the marketing makes it seem like a necessity", rather than saying "it's pure marketing BS", but naturally, you do you. Clearly I've butted heads many a time on these forums over that notion before.
But I really DO feel in every fiber that it IS utter bullshit, and humans are highly susceptible to bullshit. I choose words with intent.

Sorry. The world is largely built on marketing BS, and we should stop that, because the world's going to shit because of our never-ending urge for more. I'm of the notion that 'enough is enough'. The fact it bothers you is an interesting thing to reflect on. Maybe, unconsciously, you feel some of the same there.
 
Boy, some people here in a tech forum would really appreciate living in the dark age where nothing is being improved for hundreds of years.
 
Boy, some people here in a tech forum would really appreciate living in the dark age where nothing is being improved for hundreds of years.
You should read some history books to see how progress has actually accelerated in our lifetimes, and with that, other processes have also accelerated. It's the reason we have climate problems, and they heavily coincide with commerce. A 'dark age' and simply saying 'this is enough' couldn't be further apart.
 
I think it's beyond that, and you've already attested to that by saying you're not capable of running with all the bells and whistles even on top-end GPUs. Who are we kidding here? Efficiency, in gaming, is absolute king, so the preferable PPI/diagonal is one where you see everything, are not counting pixels, and are not wasting pixels you won't properly see. It feels totally counterproductive to me to sacrifice IQ to get higher resolution, while the opposite does show a difference.
What I'm getting at there is optimised settings: I don't want a 'medium' experience, I want the perfect (for me) balance between framerate and visuals, and it's very easy to achieve ~95% of "all dials at 11" IQ while pumping FPS up 25-50%+.
But I really DO feel in every fiber that it IS utter bullshit, and humans are highly susceptible to bullshit. I choose words with intent.

Sorry. The world is largely built on marketing BS, and we should stop that, because the world's going to shit because of our never-ending urge for more. I'm of the notion that 'enough is enough'. The fact it bothers you is an interesting thing to reflect on. Maybe, unconsciously, you feel some of the same there.
Cool, well then again, you do you. I'm here to say that marketing didn't influence it whatsoever for me; I saw one with my own eyes and knew it would be what I want from a personal gaming experience. Horses for courses.

The things that bother me in life are numerous, people stating opinion as fact, people misrepresenting truths, people being needlessly rude or spiteful, people being imprecise in their speech, people claiming to know how I function and suggesting that I change at a personal level because they don't agree with me, and many many others (note that I'm not accusing you of any of these, just saying).

I'm more than capable of spotting marketing BS when I see it and forming opinions and purchase decisions accordingly. You think it's marketing BS? Power to you. I'm here to say I disagree that 4K/high refresh is BS, and because it isn't for me at least, it can't be a universal truth, only an opinion, and marketing had nothing to do with my purchase decision. So, like so many times here, I end up at: shall we agree to disagree?
 
Thermal paste/pads, a custom fan curve and a relatively heavy-handed undervolt would have the card whisper-quiet imo, but like @AusWolf, I do enjoy getting the most out of lower-end/power-sipping stuff too. It'd be very interesting to see a drastically undervolted 2070's performance.
Nah, undervolting a GPU is too much of a hassle for me. Heck, doing literally anything with an Nvidia GPU requires third-party software, which I'm not a fan of, so I guess I'll just do what @Lei said: repaste the GPU die to see if it improves thermals and noise. I briefly considered selling it too, but it is already way cheaper than what I would comfortably part with it for. It'll be a good addition to my other spare GPUs, I guess (albeit the most expensive of the bunch). :D There might come some impressive Nvidia feature at some point that makes me glad I still have it. :)

Boy, some people here in a tech forum would really appreciate living in the dark age where nothing is being improved for hundreds of years.
Let's just say that not all progress means improvement. ;)
 
3440x1440 @ 100Hz, with a 3090 producing the necessary pixels for it :) It will be okay for the next 3 or more years.
 
Let's just say that not all progress means improvement. ;)

I have enjoyed 20+ years of GPU improvements; hopefully this continues for another 40 years :D
 
Nah, undervolting a GPU is too much of a hassle for me. Heck, doing literally anything with an Nvidia GPU requires third-party software, which I'm not a fan of
I will respect your opinion and choice there, but I would strongly implore you to give that a try first: one piece of very lightweight software and a little testing will be far less time-consuming and risky than a repaste, with better results. Ideally both, though. Perhaps it fits into a case of 'the ends justify the means', my 2c :)
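
For anyone wondering what the 'lightweight software' route can look like in practice, here's a minimal, hypothetical sketch using the nvidia-ml-py (pynvml) bindings to read the board power limit and cap it. Note this is a power-limit cap rather than a true voltage/frequency-curve undervolt (that's usually done in a GUI tuning tool), and it assumes the bindings are installed and the script runs with admin rights - an illustration only, not the exact workflow described above.

```python
# Hypothetical sketch: capping board power with NVML (not a true V/F-curve undervolt).
# Assumes the nvidia-ml-py package is installed and the script runs with admin rights.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Current and allowed power limits are reported in milliwatts.
current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(gpu)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
print(f"Power limit: {current_mw / 1000:.0f} W "
      f"(allowed range {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")

# Example: pull the limit down to 80% of the current value, clamped to the allowed minimum.
target_mw = max(min_mw, int(current_mw * 0.8))
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)

pynvml.nvmlShutdown()
```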
 
I have enjoyed 20+ years of GPU improvements; hopefully this continues for another 40 years :D
Well, 400+ W TDP graphics cards are certainly a sign of some progress, but to call them an improvement without a second thought would be a bit of a stretch.

I will respect your opinion and choice there, but I would strongly implore you to give that a try first: one piece of very lightweight software and a little testing will be far less time-consuming and risky than a repaste, with better results. Ideally both, though. Perhaps it fits into a case of 'the ends justify the means', my 2c :)
I'd be lying if I said a part of me didn't want to fiddle with something curious that may or may not be worth it, but currently, I want to satisfy that side of me with an Intel Arc GPU. :D
 
I use a 43" 3840x2160 120Hz screen paired with a 3080. Funnily enough, I don't really care about games all that much. I got this screen mostly for real estate while working - a reason why I would never get an ultrawide or an OLED - and because my previous screen, a Philips BDM4065, had a backlight which was very dim and prone to failures. The 3080 I got mostly to play Cyberpunk 2077, and it will probably be my last dedicated GPU.
Most of today's consumer needs seem to be motivated purely by marketing bullshit. People "need" the newest, fastest, most expensive things because that's what marketing tells them they need. Actual requirements are lost in the noise.
 
These days I would only accept 120 Hz: 1440p on 24" and above, 4K from 28" up. The frame rate, however, can be as low as 60, because what matters is latency. My 4K60 panel is so slow that it can only really display about 30 distinct frames, i.e. roughly 1000/30 ms gray-to-gray, and a high-refresh monitor is needed to mitigate that. And 120 frames comes at a high cost on the GPU/CPU side of things anyway; that's the price difference between the 60-class and 80-class cards, which then have to be upgraded every 2-3 years to stay relevant.
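
To spell out the arithmetic behind that: the time budget per refresh is 1000 ms divided by the refresh rate, so a panel with a ~33 ms gray-to-gray response (the figure quoted above, not a measured spec) can't fully settle between 60 Hz refreshes and effectively resolves only about 30 distinct images per second. A quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope: refresh interval vs. pixel response time.
def frame_time_ms(refresh_hz: float) -> float:
    """Time budget per refresh, in milliseconds."""
    return 1000.0 / refresh_hz

g2g_ms = 1000.0 / 30.0  # ~33 ms gray-to-gray, the figure quoted in the post above

for hz in (60, 120):
    budget = frame_time_ms(hz)
    # The panel cannot show more distinct images per second than its response time allows.
    resolvable = 1000.0 / max(g2g_ms, budget)
    print(f"{hz} Hz: {budget:.1f} ms per refresh, "
          f"~{resolvable:.0f} distinct images/s with {g2g_ms:.0f} ms g2g")
```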
 
I don't think a 4050 will have enough VRAM for a 4K display. You'd better pick a 3080 or an AMD card.
Sure it will - it's way more than the 2GB I'm making do with.

The only folks who need that kind of VRAM are those who run way more demanding games than me (or insist on running Ultra everything when High looks identical)! It's still 2GB more than my game system's 1060 6GB (and I've hit the compute wall before VRAM).
 
Sure it will - it's way more than the 2GB I'm making do with.

The only folks who need that kind of VRAM are those who run way more demanding games than me (or insist on running Ultra everything when High looks identical)! It's still 2GB more than my game system's 1060 6GB (and I've hit the compute wall before VRAM).

Yeah, also it's not like you can't use FSR or DLSS.
 
So what is enough for you? At what point is the excess performance just wasting electricity and driving up your electric bill? Are you happy with a potato? Are you frustrated with only 400FPS at 4K?
Ideal = 1440p @ 60-144fps.
Minimum I'm happy to game at = 1080p @ 60fps.

As with many things, upgrading over time ends up in diminishing returns. 5120x2880 vs 2560x1440 vs 1280x720 vs 640x480 vs 320x240: is the perceptual "jump" after the 4th iteration of adding +300% more pixels the same as the 1st or 2nd? Of course not. Habituation is also a thing, i.e., you'll get a lot of answers along the lines of "I just spent $2,000 on a 4K setup and refuse to play anything less", whereas in the real world, if an incident took place, e.g., a massive power surge fried the computer to a crisp on a Friday Xmas Eve and replacement parts would take a week to arrive, but you happened to have a 15" 1080p/60 laptop lying around, a lot of people would suddenly be quite flexible again as to what they 'could' play at, and after a couple of days it's nowhere near as 'bad' as originally feared. Likewise, this also reminds me of the amusing advice: if you spend every second of a game / movie worrying about whether it's in 4K HDR or not, then it sounds like a sh*tty game / movie to sit and stare at whilst it fails to 'draw you in'. ;)
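
For concrete numbers behind that chain, each step multiplies the pixel count by roughly 3-4x, yet the perceptual gain shrinks long before the raw counts stop growing. A quick sketch of the counts:

```python
# Pixel counts for the resolution chain in the post above; each step is roughly a 3-4x jump.
steps = [(320, 240), (640, 480), (1280, 720), (2560, 1440), (5120, 2880)]

prev = None
for width, height in steps:
    pixels = width * height
    ratio = f"{pixels / prev:.1f}x the previous step" if prev else "baseline"
    print(f"{width}x{height}: {pixels / 1e6:5.2f} MP ({ratio})")
    prev = pixels
```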
 
Sure it will - it's way more than the 2GB I'm making do with.

The only folks who need that kind of VRAM are those who run way more demanding games than me (or insist on running Ultra everything when High looks identical)! It's still 2GB more than my game system's 1060 6GB (and I've hit the compute wall before VRAM).
Allocated VRAM and memory in use are different things. Having the memory full doesn't necessarily mean that's what the game actually needs.
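
As an aside on how that figure gets measured: monitoring tools typically read the device-level number, which is what the driver has allocated, an upper bound on what the game actively needs. A minimal, hypothetical sketch using the nvidia-ml-py (pynvml) bindings, assuming they're installed:

```python
# Hypothetical sketch: device-level VRAM figures via NVML.
# This reports memory the driver has allocated, not what the game actively needs.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)  # .total / .used / .free, in bytes

gib = 1024 ** 3
print(f"VRAM: {mem.used / gib:.1f} GiB allocated of {mem.total / gib:.1f} GiB "
      f"({mem.free / gib:.1f} GiB free)")

pynvml.nvmlShutdown()
```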


I think the 4050 will not be as strong as a 3070 - just look at the prices. Anyway, the 4050 is more than a year away.

About the 1060 6GB, I understand it's one of the most popular cards. But the 1070 is just a wee bit higher and gives 2GB of extra VRAM. When they were released, OK, maybe the 1070 was overspending, but nowadays I don't understand why people choose a 1060 6GB if there's a 1070 available for just a few dollars more.

In fact, I bought a 1060, cancelled it the next morning and changed to a 1070.
 
I think the 4050 will not be as strong as a 3070 - just look at the prices. Anyway, the 4050 is more than a year away.
Yeah, there's zero chance it will be that fast.

The RTX 3050, for example, was slower than the 3+ year old RTX 2060 and wasn't much faster than the 2016-era GTX 1070...
 
Yeah, there's zero chance it will be that fast.

The RTX 3050, for example, was slower than the 3+ year old RTX 2060 and wasn't much faster than the 2016-era GTX 1070...

The 3050 is about 25% slower than the 3060 at 4K.

Now dream a bit, and the 4050 should be about 45-50% faster than that 3050 - that puts it in 3060 Ti range.

No, not a 3070, but I'll take a 3060 Ti for half the cost.

That 50% performance bump is hardly out of line, as the 3050 is 60% faster than the old 1650 Super at 4K (at the time, the world's fastest 128-bit card, exceeding the 5500 XT 8GB); the 3050 managed this miracle all while having the exact same memory clock speeds as old Turing.
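
Taking those quoted percentages at face value (the post's own numbers, not benchmark data), the chain of multiplications works out like this:

```python
# The post's arithmetic made explicit, using only the percentages quoted above.
rtx_3060 = 1.00                   # reference point at 4K
rtx_3050 = rtx_3060 * (1 - 0.25)  # "about 25% slower than the 3060 at 4K"

for uplift in (0.45, 0.50):       # "about 45-50% faster than that 3050"
    hypothetical_4050 = rtx_3050 * (1 + uplift)
    print(f"+{uplift:.0%} over a 3050 -> {hypothetical_4050:.2f}x of a 3060")
```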
 
Voted 1080p and 900p @ 60Hz. Running a Maxwell-based Quadro laptop (M620 mobile) as an interim solution (I tried to get too fancy with my build and got too busy to finish it). AC: Origins is probably the most intensive game I am running - 35 FPS average with dips to 9 FPS. Personally, I would be happy at a locked 30 FPS with low input lag. Some games I have had to drop to 900p or 720p to prevent a terrible slideshow effect.
 
4k@45fps
 
The 3050 is about 25% slower than the 3060 at 4K.

Now dream a bit, and the 4050 should be about 45-50% faster than that 3050 - that puts it in 3060 Ti range.

No, not a 3070, but I'll take a 3060 Ti for half the cost.

That 50% performance bump is hardly out of line, as the 3050 is 60% faster than the old 1650 Super at 4K (at the time, the world's fastest 128-bit card, exceeding the 5500 XT 8GB); the 3050 managed this miracle all while having the exact same memory clock speeds as old Turing.
I would just say a 4K screen is better handled by AMD.
The type of cards you're aiming for doesn't indicate you need any features Nvidia exclusively offers (like content creation, AI or so).

He's the man. :toast:
You're right, you're always right

 
The 3050 is about 25% slower than the 3060 at 4K.

Now dream a bit, and the 4050 should be about 45-50% faster than that 3050 - that puts it in 3060 Ti range.

No, not a 3070, but I'll take a 3060 Ti for half the cost.

That 50% performance bump is hardly out of line, as the 3050 is 60% faster than the old 1650 Super at 4K (at the time, the world's fastest 128-bit card, exceeding the 5500 XT 8GB); the 3050 managed this miracle all while having the exact same memory clock speeds as old Turing.
To be fair, the 3050 came out 2.5 years after the 1650 Super and almost 3 years after the original 1650.

The xx50 series basically skipped 2 generations; that's why the performance increase was so drastic. The 3050 isn't even a full 9 months old at this time, so to expect a 45-50% jump anytime in the next 12 months is a pipe dream (the 3050 was an outlier due to the long gap between xx50 cards, and even then it wasn't even faster than the 3-year-old 2060...).

I understand the enthusiasm, but at the same time you do have to realize that you basically just typed that it skipped 2 generations and was still slower than a 3-year-old 2060 (I'd hold the expectations to something a little more realistic, especially given Nvidia's xx50 track record).
 
"Enough" could be anything, depending on the game and context. I have a 4K 32" monitor (bought on sale) and honestly I would have preferred a 1440p monitor instead as I feel I have to use scaling to be comfortable with the UI elements. I was perfectly happy with the 24" 1080p monitor though. I play some games on the 1440x900 laptop and it's fine. I played a lot of Terraria, Rimworld and Sunless Sea on an 1366x768 laptop and it was great.

Hz and FPS in games only matter if you measure them, and I don't, so my criterion is "does it feel OK to play", and so I play heavy titles with settings that make the games dip below 60 FPS, and I'm fine with that.
 
To be fair, the 3050 came out 2.5 years after the 1650 Super and almost 3 years after the original 1650.

The xx50 series basically skipped 2 generations; that's why the performance increase was so drastic. The 3050 isn't even a full 9 months old at this time, so to expect a 45-50% jump anytime in the next 12 months is a pipe dream (the 3050 was an outlier due to the long gap between xx50 cards, and even then it wasn't even faster than the 3-year-old 2060...).

I understand the enthusiasm, but at the same time you do have to realize that you basically just typed that it skipped 2 generations and was still slower than a 3-year-old 2060 (I'd hold the expectations to something a little more realistic, especially given Nvidia's xx50 track record).
Personally, I wouldn't put any faith in Nvidia's low-to-mid tier. The last proper cards in this category were the 16-series. The 3050 is just a severely cut-down GA106 at GA106 prices, which makes it not worth it, imo. Nvidia has had all their attention on the high-margin, high-power tiers for the last couple of years.
 