
Will you buy an RTX 5090?

  • Yes

    Votes: 33 15.3%
  • No

    Votes: 66 30.7%
  • Will not buy any RTX 50 Series

    Votes: 116 54.0%

  • Total voters
    215
No. Won't buy a 5090. I *might* consider a 4090 if one were available for cheap. Otherwise, I still have my (unused and unopened) 4080.

Currently using my 1080. Before that a 970. Before that a 670. Before that... a lot of other cards. My first one was an AvanceLogic back in '92, very quickly followed by an ATi. After that I had a couple of Matrox cards (remember them?). Also some early GeForce and Radeon cards.
 
I mean, in complete fairness, most of the huge jumps I'm referring to were primarily due to GPU die shrinks (as far as I'm aware; I honestly haven't done my due diligence there yet, tbf), but still, the 40 series left me feeling mixed, which the 40 Supers did redeem. I'm curious to see where the 5070 lands, as that's where this sort of thing matters imo (the midrange).

I really hope the 5060 and 5060 Ti are not duds, because they do more to push gaming forward than a 90-class card ever will.

I'm annoyed Nvidia went with 12GB for the 5070, but it matching or barely beating the 4070 Super is the bigger issue if that comes to pass. Hopefully it at least catches the 4070 Ti Super, but that should have been the low bar... my guess is 5% slower than the 4070 Ti 12G.
 
I think 12GB is pretty much where I'd like it to be for 1440p, and 10GB works for 1080p. To be fair, I think the only reason I'm okay with using 8GB of VRAM for 1080p, or even sometimes 1440p depending on the game, is that I could tweak settings to avoid using too much VRAM, but not everyone does that. I think the B580 and B570 are where it's at in terms of VRAM allocation imo; we don't really need the huge VRAM jumps like we used to get, just a small one should do.

But yeah, I feel ya there. The xx60s, xx60 Tis, and xx70s are where NVIDIA pushes gaming forward the most, for more than just the huge whales.
 

I think the 1080 Ti is, ironically, an example of what I mean. If we were still in the olden days, the 1080 Ti would have been beyond irrelevant before the 50 series even came out, but it wasn't, and it will still probably be considered pretty usable for 1080p even when a potential 5060 drops. (I pray to god the 5060 isn't just another 4060 situation, or I will be fuming.)

Lack of competition, price parity between AMD and Nvidia, the focus on AI bringing them so much money they don't even care about us anymore, more expensive TSMC, tariffs, etc... nothing helps.
 
My CPU/GPU history has been... biased:
Pentium 4 2.8 / X1550 PCI
Pentium 4 HT 3.2 / HD 2400 PRO PCI
i7 2600K / HD 6870
i7 3770K / HD 7970
i7 6700K / R9 290X
i7 6700K / GTX 1070
i7 6700K / GTX 1080 TI
i9 9900K / RTX 2080 Ti
i9 12900K / RTX 3080
i9 14900K / RTX 3080
I plan on buying a 5090, expecting more than a 2x performance uplift in raster.
The early gap was just me trying to update my Dell Dimension 3000 to create a gaming PC, lol. Then I bought a gaming PC, then built my own after.

What's your GPU history, and will you be getting an RTX 5090, or any 50 series for that matter?
My history:

HD3650
HD4850
HD4850+HD4850
GTX295
HD4890+HD4890
GTX590
HD5950+HD5950
HD6950+HD6950
HD7950
GTX 970
GTX 1080ti
RTX 3080ti
7900XT
RTX 4080S
RTX 5090 hopefully

I've always sold my gear before buying new gear; it makes it easier to keep up! I usually stay away from flagships like the Titan etc. Where there are gaps, I had a gaming laptop. I'm in my mid-30s now and this hobby has stuck for over a decade. Cost per performance doesn't matter to me anymore; it's still cheaper than when I got into cars, which can cost a house deposit. This is the only one of my remaining hobbies now (and music), so take my money NVIDIA!

I am still rooting for AMD to make a comeback and compete at the flagship level! ATi > NVIDIA.
 
Waiting for next gen, 4090 still holding strong.
 
I love hardware in general, so I like to see any GPU they put out, but at this point I try to have zero expectations. Their GPU division seems to have an identity crisis, and they've been so up and down over the last decade; mostly down in my book...
I think AMD will achieve 5090-like performance with their next generation of cards, but whether it will be competitive in the high end is where it gets really interesting.
So the waiting joke from above is just a joke :)
 

They'd need to beat the 7900XTX by 100%+ in RT and ~60% in raster...

While I don't think it's impossible, I just feel like they are not going to commit to it. The whole UDNA thing isn't to make better gaming cards; it's to focus more on AI and just sell the leftovers to gamers...

If the 5080 is indeed 15% faster than the 4080, they should have easily been able to beat that if they chose to; they just decided it wasn't profitable. They'd only need RDNA3-like gains and they'd have at least come close to the 4090... in raster, anyway...
 
If it would help AI performance, they'd probably try to improve RT performance; but for gamers only? More likely not.
 
Pffffft hell no. We've gone from:

~£450 flagship models (most generations up to and including GTX 600)
To £550 for GTX 700-900
To £700 for GTX 1000
To £1,000 for RTX 2000
To £1,500+ for RTX 3000-4000
To now £2,000.

You can't blame that on inflation. It should come bundled with a decent CPU, motherboard and QD-OLED in the box for that price.

ATi EGA Wonder 800+ 256KB (I was very young at the time)
Western Digital Paradise something (also EGA and 256KB)
Cirrus Logic GD5422 1MB
S3 Trio 64 1MB
Onboard SiS 6326 (8MB maybe of my 64MB system RAM)
Voodoo 3 3000 16MB
GeForce 4 MX460 64MB
Radeon 9700 128MB
Radeon 9800 Pro 128MB
GeForce 6600 128MB
GeForce 8800 GTX 768MB
Radeon 5970 2GB
Radeon 4870 1GB (temporary after the 5970 died)
GeForce GTX 670 2GB
GeForce GTX 970 4GB
GeForce GTX 980 Ti 6GB
GeForce RTX 4070 Ti 12GB
GeForce RTX 6070 Super 13GB
 
I won't. I probably will buy a 6090, so long as it's a large performance leap, which the 5090 isn't.

ATi Radeon x1300M
NVIDIA GS 9200M
AMD Radeon 5870M
GTX 770M
GTX 960M
GTX 1060 Mobile
GTX 1660 Ti Mobile
RTX 2070 Super Mobile
RTX 3080 Mobile
RTX 4090 Mobile & RTX 4090

Currently have both laptop and desktop 4090, desktop 5090 isn't a big enough performance jump to make me remotely interested.
 
I refuse to pay the nGreedia tax for JH's increasingly shiny leather jackets. I can, but I refuse to.
The only way I'm buying a new GPU is if my beloved 6800XT Red Devil fails, and if my MSI 1080Ti Gaming X also fails,
and even then I'll buy AMD.
As far as I'm concerned, since AMD isn't fighting in the upper segment, this gen is dead.
 
There are really a lot of people buying a card every gen, or changing cards on sequential gens, and they're the same people complaining about prices :D.
I'm pretty sure Nvidia is not the only one distorting the market and causing all the problems.
 
Gigabyte shows off its AORUS GeForce RTX 5090 XTREME WATERFORCE 32G, a dual-slot graphics card... interesting, but no prices yet...
 

No. This summer I will put a 7900XTX in my new rig, then swap it for AMD's high-end card in a few gens. After I swap it, the 7900XTX goes into my sister's rig.
 
No... I'm hoping my current hardware will last. I know Raptor Lake isn't known for longevity, but I've taken every precaution possible and go very easy on it. Anyway, I just want to get out of the habit of constantly buying new hardware; I was happy with my 3570K and 1070 for so damn long. I hope I can get this one to last at least half as long as that one did.

But if there's a failure, or requirements rapidly increase, I may not have a choice. I'm pretty sure I can manage to skip at least one generation of GPU though.

2400 gtx 560
3570k gtx 560
3570k gtx 760
3570k gtx 1070
11600kf 3060, 3070, a different 3070, a 4070, then a 4090 (all in rapid succession, long story).
13600kf 4090
14700kf 4090
 
geforce3 ti 200
geforce4 ti 4600
ati 9800 pro
ati x850 xt platinum
8800 gtx tri sli
gtx 295 quad sli
gtx 480 sli
gtx 680 sli
gtx 780 ti sli
gtx 1080 ti sli
rtx 2080 ti
rtx 3090
rtx 4090

Will I get a 5090... still undecided. From what I can tell, there won't be any massively demanding games I wanna play where I would need that extra 30% performance (which is not that big of a jump) over my 4090 in the coming 2 years (next is Witcher 4 in 2028). So I might skip it and wait for the 6090.
 
There's more to the 5000 series than meets the eye. The transformer model does improve visual quality significantly (to the point you won't want to use CNN). However, it will increase latency much more on the 4000 series than on the 5000 series. So it's not just about raw fps per se. Though if there are no games to play, then it doesn't matter, unless one would like improved quality in older games at the same or better performance/latency.
 
I have a 4090 and I won't be getting a 5090. Here's why:
1. I have yet to play a game that doesn't look and run well on my LG C1 120Hz OLED with the 4090. Sure, I can't max new AAA games out and get a locked 4K120, but I have always been able to get there with very few compromises.
2. The 450W (well, more like ~350-400W in most games) power draw of the 4090 is already a pain in the summer months; 575W would probably make that at least 9 months of the year instead. Gaming in a sauna is not a good experience.
3. MFG has little to no use on a 120Hz display. Even regular FG is pushing it in some games, and in most cases I'd rather play at 90 fps without FG than 120 fps with it enabled.
4. Because of my local currency losing value against the USD since I bought my 4090, the 5090 is ~35% more expensive than the 4090 in my region.
5. I find myself playing and enjoying less demanding indie titles more in the last year than ever before, and I only played two more demanding titles, Indy and Hellblade 2. I don't see that changing much in the next 2-3 years.
 

I agree with all those points.

Another one which is huge for me is coil whine: the more power a card draws, the more likely it is to have coil whine. I bought 5 different 4090s to get one with minimal coil whine, and judging by the reviews that actually mention coil whine (like Guru3D and Hardware Unboxed, unlike TPU), the 5090s all have coil whine, ranging from moderate to terribad. And I absolutely do not want a GPU with coil whine!


Marketing BS from Nvidia; artificially imposed limits to sell the new gen, as always...
 

Eh? The tests clearly show it. Try it out. DLSS4 is available now. And obviously the new model requires more compute.

My point is that just looking at FPS/raw raster is so last year. Fidelity is equally important. If you can run DLSS4 Performance at the same quality as DLSS3 Quality, then obviously you got more performance than before.
 

I have, and it makes zero difference performance-wise. And it does not make DLSS Performance look like DLSS Quality... it just handles motion slightly better.

Raster performance is what matters most for the vast majority of games, namely all multiplayer games.
 
No.

This was a quick thread.


Ah, GPU history? Well, I started with a 512kB Trident ISA card, but that's technically not a GPU, right? So, a few steps ahead (including Riva TNTs, a Voodoo and such), there was the GeForce 2 MX 400, and after that, it just went wild.

The last ~5 years or so looked like this: Sapphire Radeon RX 580 Nitro+ 4 GB ---> Sapphire Radeon Vega 64 Nitro+ ---> Sapphire Radeon Vega 56 @ V64 BIOS ---> MSI GeForce GTX 1080 Ti Armor OC ---> NVIDIA GeForce GTX 980 Ti Founders ---> EVGA GeForce GTX 950 SC+ Gaming ACX 2.0 ---> Sapphire Radeon RX 580 Nitro+ 4 GB, but I might swap the GTX 950 in once again. So I'm not the target audience for the RTX 5090. I might buy a second-hand EVGA RTX 3050 or a Sapphire RX 6600 in the spring, when I swap the R7 2700 for an R7 5700X.
 
No.

Geforce 2MX
Radeon x1800xt
Radeon HD 4850
Radeon HD 6950
Radeon 6800m
 