
Do you buy an RTX 4090 for 1080p gaming?



  • Total voters: 47
Some YouTubers use this card for CPU testing at 1080p resolution. I'm interested to know: is it the right choice?
 
While it's excellent for benchmarking CPUs at 1080p, for the majority of games you really need to be cranking RT for it to even break a sweat at 1440p, and it only really stretches its legs at 4K.

For CPU review purposes it makes a lot of sense, but so would a 7900 XTX; you are trying to remove the GPU as a bottleneck.


At the same time, if reviewers didn't include 1080p in GPU reviews, even for high-end cards, people would throw a fit.
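
A rough way to see why low resolutions get used for CPU reviews: the delivered frame rate is capped by whichever of the CPU or GPU takes longer per frame, and dropping the resolution mainly shrinks the GPU's share. A minimal Python sketch with made-up frame times (illustration only, not benchmark data):

    # Toy model: FPS is limited by the slower of the CPU and GPU per-frame cost.
    # All frame-time numbers here are invented purely for illustration.

    def fps(cpu_ms, gpu_ms):
        return 1000.0 / max(cpu_ms, gpu_ms)

    cpu_ms = 4.0                           # hypothetical CPU cost per frame (roughly resolution-independent)
    gpu_ms = {"1080p": 3.0, "4K": 12.0}    # hypothetical GPU cost per frame at each resolution

    for res, g in gpu_ms.items():
        limiter = "CPU-bound" if cpu_ms > g else "GPU-bound"
        print(f"{res}: {fps(cpu_ms, g):.0f} FPS ({limiter})")
    # At 1080p the CPU is the limiter, so CPU differences show up in the numbers;
    # at 4K the GPU hides them, which is also why high-end GPU charts flatten out at 1080p.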
 
Your average rich Joe/Jane only buys 4090 for 1080p if their monitor allows for a million Hz refresh rate and every frame matters.
Were I a rich Joe, I'd buy it just because I can. This guarantees the GPU dies long before becoming useless at my resolution, which is completely fine by me. Invest 2 grand, and after a decade there's still a decent GPU inside (the R9 290X is about a decade old as of now and it can still run new games, and since progress is slower gen by gen...). Cheaper than buying $500 ones every couple of years.
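
For what it's worth, the back-of-the-envelope maths in the post above (one ~$2,000 card kept for a decade versus ~$500 cards replaced every couple of years) works out like this; a minimal sketch using the post's own figures, not real pricing data:

    # Hypothetical 10-year cost comparison, using the figures quoted in the post above.
    years = 10
    flagship_cost = 2000           # one flagship card kept the whole decade
    midrange_cost = 500            # a cheaper card instead...
    replacement_interval = 2       # ...replaced every couple of years

    midrange_total = midrange_cost * (years // replacement_interval)
    print(f"Flagship bought once:   ${flagship_cost}")
    print(f"Mid-range replacements: ${midrange_total}")   # 5 purchases x $500 = $2500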
 
Yes I do. I need those 10,000 FPS to play Pac-Man and Tetris. Relax, it's a joke, jeez. :p

I got it for 1440p so I can run all games with all the eye candy on while still making use of my screen's 144 Hz. Also with the intention that at 1440p, the RTX 4090 should last me longer before I eventually buy a new card.

But if the future is like Starfield, I may have to replace the GPU sooner than expected. Time will tell.
 
Depends. If you're playing some (weird) competitive multiplayer game, maybe/probably.
 
My monitor does 240 Hz, so yes, that horsepower will help, and with AA and lots of VRAM it's a party.

I'm over the 4K BS.
 
If you have a 360/500 Hz monitor, that is a reason to play at 1080p. Otherwise just do 4K 120 Hz.
 
If you have a 360/500 Hz monitor, that is a reason to play at 1080p. Otherwise just do 4K 120 Hz.

I'd say 1440p 240/360 Hz also.

I just have a hard time wrapping my head around someone spending $1,600+ on a GPU and $400+ on a CPU, then skimping on the display, arguably the most important component.

I feel like PC gamers in general skimp on displays regardless, though.

A 4090 isn't going to be future-proof enough to justify keeping it more than a couple of generations, so that's not a smart financial decision either.
 
Less GPU load = less power draw, so net profit :D

On a more serious note:
1) It doesn't matter, since I will buy it in a few years to play at whatever resolution it manages ~120 FPS. I will use it at 1080p, because it's a good comparison point with older stuff (I am an enthusiast/collector, so I'm in the less-than-1% of users ;)).
2) It's 100% pointless what resolution you set as the "base", if all you have to do is enable DLSS/DSR or FSR/VSR to turn the "I only own a 1080p display" argument on its head.
In other words: if you use either of them when gaming, "gaming at 1080p resolution" largely doesn't apply to you.
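
To put a number on that point: upscalers render internally below the output resolution, so "gaming at 1080p" with DLSS/FSR enabled really means rendering even lower. A quick Python sketch using the commonly cited per-axis scale factors for the usual quality modes (roughly 0.67 / 0.58 / 0.50); treat the exact figures as approximate, not vendor specs:

    # Approximate internal render resolution for common upscaler quality modes.
    # The per-axis scale factors below are the commonly cited values, rounded.
    scales = {"Quality": 0.67, "Balanced": 0.58, "Performance": 0.50}

    out_w, out_h = 1920, 1080
    for mode, s in scales.items():
        w, h = round(out_w * s), round(out_h * s)
        print(f"{mode}: renders ~{w}x{h}, upscaled to {out_w}x{out_h}")
    # e.g. "Quality" at a 1080p output renders around 1286x724 internally.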
 
I'd say 1440p 240/360 Hz also.

I just have a hard time wrapping my head around someone spending $1,600+ on a GPU and $400+ on a CPU, then skimping on the display, arguably the most important component.
Idk. I had a 19-inch 85 Hz CRT to play Counter-Strike 1.3 and whatever the most expensive card was around at the time. It made a difference.

At CES, Nvidia had a demo of Overwatch 2 at 360 Hz and it does have an impact on your aim (I tried it). I went from 5/10 hits to 8/10. If you're a pro player, that reflex is already muscle memory, but for everyone else, it helps.
 
Idk. I had a 19-inch 85 Hz CRT to play Counter-Strike 1.3 and whatever the most expensive card was around at the time. It made a difference.

At CES, Nvidia had a demo of Overwatch 2 at 360 Hz and it does have an impact on your aim (I tried it). I went from 5/10 hits to 8/10. If you're a pro player, that reflex is already muscle memory, but for everyone else, it helps.

Must have been you merking me in Quake right as I spawned :laugh::laugh::laugh:
 
Some YouTubers use this card for CPU testing at 1080p resolution. I'm interested to know: is it the right choice?

For CPU testing, they could use 480p or 720p with the same success, or even better.

RTX 4090 is for 4K, 5K, 6K, 8K, but if you don't want these, the lowest sensible resolution should be 1440p.
 
There are no 1080p, 1440p, 4K cards. They don't exist. They never have.

There are only games that either run well or don't on your GPU. And the more GPU you have, the more games will run well for that much longer. It's that simple. However, since you pay quite a price for a 4090 and it won't be able to tap into its full performance at 1080p more often than not, you could question whether it's the best buy in 2023 for 1080p. But you don't buy a 4090 for the bang/buck aspect. You buy it for the bang.
 
What's important is the average FPS across a wide range of titles. If the average is around or above 60 FPS, it's safe to accept that the card is good for the given resolution.
The RTX 4090 is good for 4K 144 Hz gaming.

 
Your average rich Joe/Jane only buys 4090 for 1080p if their monitor allows for a million Hz refresh rate and every frame matters.
Were I a rich Joe, I'd buy it just because I can. This guarantees the GPU dies long before becoming useless at my resolution, which is completely fine by me. Invest 2 grand, and after a decade there's still a decent GPU inside. Cheaper than buying $500 ones every couple of years.

GPUs do not hold up for that long, no matter what you spend. The GTX 690 is in no way a "decent card" today.
 
No, for 1080p you buy a Radeon HD. :)
Seriously though, it depends on the load - just look at how much the normal HD (1080p) gaming bar has been raised since the Radeon HD days (5xxx = 2009, 6xxx = 2010).
 
GPUs do not hold up for that long, no matter what you spend. The GTX 690 is in no way a "decent card" today.
They did not. Now, since both AMD and Nvidia have decided to stop giving us measurable gen-to-gen progress, 10 years of life doesn't seem crazy to me. I guess the RTX 4090 will still have it by 2032. I mean, yes, running 1080p60@Ultra will be a problem, but 1080p60@Med-High most likely will not be. And this is decent. Not good, not bad. Just decent.
 
If I were to buy it, I would do it for 1440p: games at ultra settings where possible, then at high FPS....

Games that aren't hard on the GPU: ultra settings, 4K, high FPS.
 
Nope, as I'm not going to buy any Nvidia ever. The leather man's greed makes me sick.
 
I'd say 1440p at a high refresh rate at least. I tend to go one step lower than its "intended" resolution; that way, I can maximize its performance for years (like I'm on 1080p with my 4070).
 
Depends on the purpose - if you want COD to run at the highest possible FPS with a 1080p 360 Hz monitor and the fastest CPU, then yes, it makes sense to pair it with a 4090.
Although in this case you would rather go with a 7900 XTX, as AMD has less CPU overhead and thus will give you higher FPS while CPU-limited.
 
They did not. Now, since both AMD and Nvidia have decided to stop giving us measurable gen-to-gen progress, 10 years of life doesn't seem crazy to me. I guess the RTX 4090 will still have it by 2032. I mean, yes, running 1080p60@Ultra will be a problem, but 1080p60@Med-High most likely will not be. And this is decent. Not good, not bad. Just decent.

The higher-end RTX 4xxx cards were a huge jump over previous generations. Even the 7900 XTX was a decent jump, just not as big as the 4090.
 
The higher-end RTX 4xxx cards were a huge jump over previous generations.
[attached performance comparison chart]


Is the GTX 1660 a decent low-budget 1080p GPU today? Indeed it is. You can play anything using this card and get reasonable performance in every game. Of course you can forget about Ultra settings and a stable 60+ FPS, but playing at a quote-unquote acceptable 45 to 60 with half the settings set to minimum is still an option.

How much is this in frames per second?

[attached benchmark chart]


At 1080p, the GTX 980 Ti (portrayed by the GTX 1660 Super) is capable of a decent 1080p experience. Of course the 1660 Super has more advanced DX12 support and is more useful for modern game features overall, but the 980 Ti destroys it in older titles. I guess my assumptions are accurate enough.

If the rumours about AMD not making higher-end RDNA4 GPUs come true, this will eliminate all need for Jensen to improve his GPUs. He will postpone new generations, lower gen-to-gen progression, etc. And, knowing AMD, it seems very much possible.

This is why I don't consider the RTX 4090 bad by early-2030s standards. It will run games from then, most of them on medium to high settings, providing a more than playable experience.
 
Nope, as I'm not going to buy any Nvidia ever. The leather man's greed makes me sick.
Riiiight. As if there's an actual caring vendor.
 