
Best time to sell your used 4090s is now.

Even now I could still sell my 4090 for the same price I bought it for, or even higher.

I'm not going to, though, because for one, I'd have to deal with the cost and hassle of a new GPU waterblock, and I'd still be at least 600€ in the red at the end of it, since I'm not going to downgrade.
For another, I got lucky and got a GPU that has almost no coil whine at all, and I'm not eager to roll the dice again.

Also, the new generation has no new features I want, and I'm not hurting for performance. At this point, I expect to skip the next generation of GPUs as well. If I can't at least double my performance, it's not worth the bother of upgrading. I know how to use the quality settings when I need to, and DLSS4 is looking pretty damn good.

The RX 9070 is now 599€ new and faster than the RTX 3090, which is worth maybe 400€ used.
Where can you get a 3090 for 400€?

It's still selling for 600~700€ in Germany due to AI peeps looking for cheap Nvidia GPUs with high VRAM.
The alternatives are the 4090/5090 (hellishly expensive) or something like the 5060 Ti 16GB (less than half the memory bandwidth), both of which have their own drawbacks.

I wouldn't buy a 3090 for gaming, but it's still the best value for casual AI usage.
 
I don't know the AI benchmarks, but the RTX 3090 is really falling behind due to lower clocks and high power consumption:


For 699€ you're getting the RX 9070 XT, which is +46% faster, for example. No way should people be buying the RTX 3090 for 700€; it's insane. The better 7900 XTX 24GB is going for 899 to 999€ brand new, with better AI capabilities than the old RTX 3090.

People are just overpaying for Nvidia, it's crazy.
 
For 699€ you're getting the RX 9070 XT, which is +46% faster, for example.

The 9070 is only 3% faster than the 3090, with way less VRAM. People are overpaying for AMD, it's crazy.


 
I don't know the AI benchmarks
But I do know the AI benchmarks and I'm telling you that AI is the reason the 3090 (Ti) is still so popular. The memory capacity to bandwidth to cost ratio is unbeatable even at 700€.

This is particularly true of LLMs, whose generation speed scales pretty much linearly with memory bandwidth (as long as the model fits in memory); there's a quick back-of-the-envelope sketch at the end of this post.
But if you're playing with AI image generation and have any kind of ambition, you'll want the extra headroom of 24GB of VRAM for the ability to use better models (Flux, HiDream), more LoRAs, more ControlNets, and higher resolutions.
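
A minimal sketch of both points, treating LLM decode as purely bandwidth-bound. The prices, bandwidth figures, and model size below are assumptions for illustration, not measured numbers:

```python
# Rough numbers behind the "3090 for casual AI" argument.
# LLM decode speed is roughly bandwidth-bound:
#   tokens/s <= memory bandwidth / bytes read per token (~ model size),
# and only while the whole model fits in VRAM.

cards = {
    # name: (price EUR, VRAM GB, bandwidth GB/s) -- assumed values
    "RTX 3090 (used)":  (700, 24, 936),
    "RTX 5060 Ti 16GB": (450, 16, 448),
}

model_gb = 13.0  # hypothetical ~13 GB quantized model file

for name, (price, vram_gb, bw_gbps) in cards.items():
    if model_gb > vram_gb:
        print(f"{name}: model does not fit in VRAM")
        continue
    print(f"{name}: ~{bw_gbps / model_gb:.0f} tokens/s ceiling, "
          f"{bw_gbps / price:.2f} GB/s per euro")
```

Even at 700€ the 3090 comes out ahead on both the raw ceiling and bandwidth per euro, which is the "unbeatable ratio" point.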
 
Where can you find a 5090 for 2000??

Most 5080 are between 1500-2000.
MSRP

Where can you find a 4090 for $1600?

It's a 25% price increase for an average of 16% more performance (assuming you're running at full wattage) while using 34% more power. The 4090 was overpriced, yes, but by comparison the 5090 is vastly more so.
16% isn't quite accurate. Sure, it varies, so it really depends on what game you're playing. And my Strix 4090 used to pull well over 450W at the time.
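
Spelling out the value math behind that claim (ratios taken straight from the numbers above):

```python
# Perf-per-dollar and perf-per-watt from the claimed 5090-vs-4090 ratios.
price_ratio = 1.25  # +25% MSRP
perf_ratio  = 1.16  # +16% average fps
power_ratio = 1.34  # +34% power draw

print(f"fps per dollar: {perf_ratio / price_ratio:.2f}x")  # ~0.93x, ~7% worse value
print(f"fps per watt:   {perf_ratio / power_ratio:.2f}x")  # ~0.87x, ~13% less efficient
```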
 
MSRP

Where can you find a 4090 for $1600?

I got mine for $1650, but that was when it was current gen. You'd be hard-pressed to find a 4090 anywhere near MSRP now. Like all the higher-end, high-VRAM cards, they've gone up with the demand from AI and applications that make use of the compute without having to shell out for a Quadro or FirePro.

What's your point??? You haven't shown where you can get a 5090 for MSRP. It's because you can't.
 
What's your point??? You haven't shown where you can get a 5090 for MSRP. It's because you can't.

What’s yours? Mine was that, theoretically, the price of a new 5090 should be no more than 25% over what a new 4090 is and that’s give or take the advantage you get in demanding games. For LLMs this looks much better for the 5090 due to the bandwidth increase. IMHO overclocking potential of the 5000 series is also better. The 4090 is a great card but the 5090 is better in every way except for its price.

You can’t get a 4090 for MSRP at the moment. You can get a 5090 for 20% over MSRP but you can get a pre-built system with decent components (9800X3D based) for not much more. If you work out the cost of the GPU then it’s effectively MSRP. This is in Australia and seemingly every country is different. Supposedly they have 5090s below MSRP in Scandinavia.
 
Some eye candy to zen out the noise.
Guess which one is the 5090 or 4090?

(six photo attachments, a through f)
The 4090 is a great card but the 5090 is better in every way except for its price.

I think you could make an argument that the 4090 is best for backwards compatibility, due to the dropping of 32-bit PhysX. I know it's not much, but it's a cool feature I wouldn't want to lose, especially for old games where the source code, or even the studio, may not be around anymore. The 5090 is also dangerously close on the safety margin of its connector. Even the 4090 was, and I think decreasing that margin even more was a mistake. Something like power delivery should be easy to get right for people who design silicon as complicated as Nvidia's GPUs (some rough margin math is sketched at the end of this post).

Without a doubt the 5090 is better for LLMs, and the fact that it has 32GB of VRAM is a dead giveaway it's for more than gaming. I don't like that, since there is already a line of cards for business use, though I guess this is more for the middle market, individuals or small companies, instead of large ones. Still, it's not good for gamers, as it increases demand for the card, driving up the price and taking stock from those who do want it for gaming. The products should be segmented, imho. Since both use the same die, VRAM is the best way to segment; that's why it's so obvious. It's still extremely rare to run into a case where 24GB is needed in gaming, unless there's a VRAM leak or you're running a bunch of inefficient mods. So yeah, that kind of bugs me. GeForce should be gaming. Maybe they could make a new line if the target is small business or whatever.

But you're right about it being better at modern gaming, no doubt about that.

But what about stability? There have been quite a few hardware and driver issues related to the 50 series launch: missing ROPs, black screens, etc. That's one reason why I still haven't gone onto the 57x drivers. I will when I have to, but I haven't found the need yet, though I did briefly try them once; everything I've needed to run still runs fine on the old drivers. It's a little unclear to what degree the software vs the hardware is at fault here. Either way, it's nice to have the option to backtrack on the 4090 if you do suffer issues, like I did one time: the 57x drivers caused a ghosting issue in FF16. I thought it was related to the DLSS upgrade I used, but it wasn't; even after putting the backup DLSS file back, the ghosting continued, and going back to 56x resolved the issue. You can't do that on the 5090.

And yeah, the 5090 MSRP isn't bad relative to the 4090, but you could at least at one point buy the 4090 at MSRP. I did. I thought I was getting ripped off at the time and that I'd regret it when better, cheaper cards came out, but now I realize I was lucky, as the price ended up going up and staying up. And I know it's not like this in the US, or maybe it is now with the tariffs, idk, but I've never even seen an Nvidia Founders Edition card with my own two eyes, ever. That's how rare they are here. If you want a cheap(er) 5090 in my country, you have to buy one of the models with bad reviews, and usually there's a reason for a cheap price and bad reviews. The decent cards cost double what my 4090 cost when I bought it at MSRP. Though I admit, I got it at a good time.

A lot depends on where you live. AMD prices are even worse here, at least at the times I've been shopping.
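
Back to the connector point: the margin is easy to put rough numbers on. A sketch, assuming the commonly cited ~9.5A-per-pin rating for the 12VHPWR/12V-2x6 connector and six 12V pins (spec details vary, so treat these as assumptions):

```python
# Rough 12VHPWR/12V-2x6 safety-margin math (assumed ~9.5 A per pin, 6 pins).
PIN_RATING_A = 9.5
POWER_PINS = 6
RAIL_V = 12.0

def pin_margin(board_power_w: float) -> float:
    amps_per_pin = board_power_w / RAIL_V / POWER_PINS
    return PIN_RATING_A / amps_per_pin

print(f"4090 @ 450 W: {pin_margin(450):.2f}x per-pin margin")  # ~1.52x
print(f"5090 @ 575 W: {pin_margin(575):.2f}x per-pin margin")  # ~1.19x
```

And that assumes even current sharing across the pins, which is exactly what the melting-connector reports suggest you can't always count on.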
 

I remember thinking paying 1699 was stupid for a GPU, the next generation is gonna be so much better... Sucks to be wrong. I only did it because it was the last time I'd be able to buy a stupidly priced GPU without factoring in a toddler lol, and I knew I would be retiring around the time the next generation came out.
 
I put my Asus TUF 4090 back in the box and that's it, maybe in 20 years I can sell it as a collector's item :D
 
I put my Asus TUF 4090 back in the box and that's it, maybe in 20 years I can sell it as a collector's item :D

Up for auction is a card from the last generation Nvidia gave a $#!+
 
16% isn't quite accurate. Sure, it varies, so it really depends on what game you're playing. And my Strix 4090 used to pull well over 450W at the time.

Nonsense, my number is the 1440p average that was pulled right from TPU's review:




Hardware Unboxed's 1440p results are even more dismal, with only a 12% difference between the 4090 and 5090:




It's one thing if you're buying a card for yourself and intend to use it for specific games that happen to outpace the average, but that's more a personal argument than one based on the overall merits of the product itself.
 
I remember thinking paying 1699 was stupid for a GPU, the next generation is gonna be so much better... Sucks to be wrong. I only did it because it was the last time I'd be able to buy a stupidly priced GPU without factoring in a toddler lol, and I knew I would be retiring around the time the next generation came out.
I felt like a sucker paying 1900 for the 3090. For the 4090, on the other hand, I actually felt like I was stealing from Jensen (based on the pricing of the other GPUs on the market, both Nvidia's and the competition's).

Nonsense, my number is the 1440p average that was pulled right from TPU's review:

That's 20%, and most likely that's CPU-bound. I've tested the cards; the difference is 30%+ when the CPU isn't slowing them down.

Edit: Yeah, the Hardware Unboxed video makes it obvious. Some of those games are so heavily CPU limited at 1440p that there's little to no difference between the cards. That's not a proper way to measure performance: War Thunder, Hogwarts Legacy, Starfield, etc.
 
That's 20%, and most likely that's CPU-bound. I've tested the cards; the difference is 30%+ when the CPU isn't slowing them down.

I agree, closer to 35% when not GPU limited in my very unscientific testing. Or 25% when both are limited to 350W/400W, although that could just be the 4 games I tried, and my margin is probably ±5% because I only do 4 runs and throw out the fastest and slowest run (a quick sketch of that trimming is below).

4K is really the only resolution $1000+ GPUs should be tested at, when even the fastest CPUs will limit them in some games at 1440p.

W1z got 32% at 4K, I believe; that is probably about what one should expect.
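
(The 4-run averaging mentioned above is just a trimmed mean; a minimal sketch with made-up fps numbers:)

```python
# Trimmed mean over benchmark runs: drop the fastest and slowest, average the rest.
def trimmed_mean(runs: list[float]) -> float:
    if len(runs) < 3:
        raise ValueError("need at least 3 runs to drop both extremes")
    kept = sorted(runs)[1:-1]  # drop slowest and fastest
    return sum(kept) / len(kept)

fps_runs = [141.2, 138.7, 144.9, 140.1]  # 4 runs of the same scene (made up)
print(f"trimmed mean: {trimmed_mean(fps_runs):.1f} fps")  # averages 140.1 and 141.2
```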
I felt like a sucker paying 1900 for the 3090. For the 4090, on the other hand, I actually felt like I was stealing from Jensen (based on the pricing of the other GPUs on the market, both Nvidia's and the competition's).

Honestly, I only felt that way till I got it home and in some heavy RT 4K scenarios it was 70-90% faster, almost double my 3080 Ti, and I was like damn, and that was without BSgen.

The 4090 doesn't crash and has PhysX.

While I'm annoyed that it got removed, people are making a larger deal about it than it really is. Also, it's mostly Red Kool-Aid drinkers that couldn't even use it in the first place and used to cry whenever a game included it.

Borderlands 2 and the Arkham games are the only ones where I really liked the implementation. I'm sure I'm forgetting some, but I don't regularly play decade-plus-old games.
 
Nonsense, my number is the 1440p average that was pulled right from TPU's review:

Hardware Unboxed's 1440p results are even more dismal, with only a 12% difference between the 4090 and 5090:
Sorry, but no sane person would use 1440p to compare cards at that level. Who cares whether you get 120 or 140 fps unless it's competitive gaming, and I'm sure even cheaper cards manage good fps in Counter-Strike.

I don't know how we got here… The only point I was trying to make was that MSRP is up 25%, memory size is up 33%, and memory bandwidth is up 80%, and those last two significantly benefit AI usage. Gaming performance at 4K is up 20-30%, and that's always nice to have.
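
Those deltas fall straight out of the spec sheets; a quick check (US MSRPs and spec values as I remember them, so verify before quoting):

```python
# 4090 -> 5090 generational deltas from (assumed) spec-sheet values.
specs = {
    #            (MSRP USD, VRAM GB, bandwidth GB/s)
    "RTX 4090": (1599, 24, 1008),
    "RTX 5090": (1999, 32, 1792),
}

for label, old, new in zip(("MSRP", "VRAM", "bandwidth"),
                           specs["RTX 4090"], specs["RTX 5090"]):
    print(f"{label}: +{(new / old - 1) * 100:.0f}%")
# MSRP: +25%, VRAM: +33%, bandwidth: +78%
```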
 

The problem is that for the first 6 months after release it was easy to get a 4090 for $1600-1700; the 5090 is sitting around $2800. So real-world pricing isn't very close during the same time frames post-launch, at least in the States.

I don't think it's as bad in Europe and I'm not familiar with other territories like Australia etc.
 
I don't think it's as bad in Europe and I'm not familiar with other territories like Australia etc.
MSRP for the 4090 was over $3k AU. We get shafted here ALL the time. I paid $3.7k for my Strix because that's all I could get on the morning of release. The 4090 was then mostly available only for over MSRP for the first 6 months, NOT much different from the current situation. The MSRP for the 5090 was over $4k here, which is $2000 times the exchange rate, plus GST (our sales tax), plus another 20% of rip-me-off surcharge. What can I say, these things are definitely luxury items, and luxury items have strange pricing.
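
For anyone wondering how that chain lands over $4k, it multiplies out like this (the exchange rate is an assumption, roughly what it was around launch):

```python
# How a $1,999 USD MSRP becomes "over $4k AU".
usd_msrp    = 1999
aud_per_usd = 1.55  # assumed launch-era exchange rate
gst         = 1.10  # 10% Australian GST
surcharge   = 1.20  # the local "rip-me-off" markup

print(f"~${usd_msrp * aud_per_usd * gst * surcharge:,.0f} AUD")  # ~$4,090 AUD
```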
 

I know for a while the 5090 was 5000 AUD, because a few people in this thread were lamenting it. The cheapest I've seen over there is 4800 AUD.
 
I agree, closer to 35% when not GPU limited in my very unscientific testing. Or 25% when both are limited to 350W/400W, although that could just be the 4 games I tried, and my margin is probably ±5% because I only do 4 runs and throw out the fastest and slowest run.

I've measured a ~22% difference with both locked to 400W. But yeah, your 25% seems on point.
Honestly, I only felt that way till I got it home and in some heavy RT 4K scenarios it was 70-90% faster, almost double my 3080 Ti, and I was like damn, and that was without BSgen.

I've seen up to 110% over a 3090. RDR2 completely maxed took me from 55-90 fps depending on location to 85-90 minimum, and up to 160+ where I hit a CPU bottleneck. The problem with the 3090 was, well, the 3080, 3080s, and 3080 Ti were very close in performance at half the price (ignoring the mining shenanigans).

I don't know how we got here… The only point I was trying to make was that MSRP is up 25%, memory size is up 33%, and memory bandwidth is up 80%, and those last two significantly benefit AI usage. Gaming performance at 4K is up 20-30%, and that's always nice to have.
The big downside, for me, and the reason I didn't keep the 5090, is the power draw. Limited to the power I'm using on the 4090, the performance difference just isn't there.
 
I've seen up to 110% over a 3090. RDR2 completely maxed took me from 55-90 fps depending on location to 85-90 minimum, and up to 160+ where I hit a CPU bottleneck. The problem with the 3090 was, well, the 3080, 3080s, and 3080 Ti were very close in performance at half the price (ignoring the mining shenanigans).

Even during mining, I paid almost half what I could source a 3090 at: it was 1400 vs 2600, lmao. And even at 1400, that 3080 Ti is one of the most meh cards I've ever purchased, with its massive 1 extra GB of VRAM over the 2080 Ti I was using, lol.

I've measured a ~22% difference with both locked to 400W. But yeah, your 25% seems on point.

The slight difference is probably due to me having the 4090 at 350W vs the 5090 at 400W. The only reason I even tested it like that was because that's what the customer was going to run the card at, and I was just curious.

I also only measure power at the wall, so there's likely a small difference vs what the software was reading.
 