
NVIDIA GeForce RTX 5090 Founders Edition

Mid to end of gen is generally the best time to buy, yeah.
Oh I know it is, I got a 6950XT near the end of its release for just over £500 which is ~50% of its RRP.

The fact people are going around going "Hey I got this GPU for a steal near the end of its life at 10% below RRP" just makes me sad that this is what we have to deal with now. I just had a look at the Asus ROG STRIX GAMING OC details.

View attachment 381366

View attachment 381372


The LOWEST it ever went was maybe 30% below its INSANE RRP, and even at its lowest point, 18 months after release, it was still well above the RRP of a Founders Edition. This seems to be fairly standard across everything. If what we have to expect is that a card's RRP will basically be its price for its whole life, then I can see why the high-end gaming area is going to be limited to a select few, because I cannot see many people willing to shell out £2-3k on a GPU every 4-6 years on top of everything else in a PC.
 
The 5090 is a great GPU, just (too) expensive.
The 4090 was just $100 more (~6.6% more expensive) but ~70% faster than the 3090
The 5090 is $400 more (25% more expensive) but around 30-40% faster than the 4090

If the 5090 were $1,600, nobody would be complaining. But Nvidia is marketing DLSS 4 (MFG using AI) to set new "metrics" and blur the lines of performance.
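To put rough numbers on that, here's a minimal Python sketch; the uplift percentages are the ones quoted above, and the ~35% for the 5090 is just an assumed midpoint of the 30-40% range, so treat the output as illustrative only.

```python
# Back-of-the-envelope: how performance-per-dollar shifted across the last
# three flagships, using launch MSRPs and the rough uplift figures quoted
# above (the ~35% for the 5090 is an assumed midpoint, not measured data).

gens = [
    # (name, launch MSRP in USD, performance relative to the RTX 3090)
    ("RTX 3090", 1500, 1.00),
    ("RTX 4090", 1600, 1.70),         # ~70% faster than the 3090
    ("RTX 5090", 2000, 1.70 * 1.35),  # assumed ~35% faster than the 4090
]

prev = None
for name, price, perf in gens:
    if prev is not None:
        p_name, p_price, p_perf = prev
        perf_per_dollar_gain = (perf / price) / (p_perf / p_price) - 1
        print(f"{name} vs {p_name}: price {price / p_price - 1:+.1%}, "
              f"perf {perf / p_perf - 1:+.1%}, perf/$ {perf_per_dollar_gain:+.1%}")
    prev = (name, price, perf)
```

By that rough math, the 4090 generation improved perf-per-dollar by roughly 60%, while the 5090 generation adds single digits.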
 
What's crazy is remembering the 3090 Ti launch around January of '22, with the expected release of the 40 series later that year. The 3090 Ti went for $2,000 and it was less than 10% faster than the 3090. I believe the consensus was that it was just binning to get the Ti version. What a terrible deal that was.
I thought the 3090 Ti got released because of the RX 6950 XT.
 
I thought the 3090 Ti got released because of the RX 6950 XT.

The Nvidia RTX 3090 Ti got released in March 2022 at an MSRP of $2,000 - but it was barely faster than the RTX 3090 from September 2020, released at an MSRP of $1,500. Crypto had fallen by that time from its December 2021 high, but it kind of bounced around - though not for long; soon after release it crashed completely, ruining all the ROI calculations of poor miners.

View attachment 381390


But it enabled Nvidia to release the RTX 4080 for $1,200 and the RTX 4090 for $1,600 - even here on TechPowerUp they wrote how the RTX 4080 was good value compared to the slower RTX 3090 Ti at $2,000, or the scalped prices the RTX 3080 was going for at the height of crypto...
 

The Nvidia RTX 3090 Ti got released in March 2022 at an MSRP of $2,000 - but it was barely faster than the RTX 3090 from September 2020, released at an MSRP of $1,500. Crypto had fallen by that time from its December 2021 high, but it kind of bounced around - though not for long; soon after release it crashed completely, ruining all the ROI calculations of poor miners.

View attachment 381390

But it enabled Nvidia to release the RTX 4080 for $1,200 and the RTX 4090 for $1,600 - even here on TechPowerUp they wrote how the RTX 4080 was good value compared to the slower RTX 3090 Ti at $2,000, or the scalped prices the RTX 3080 was going for at the height of crypto...

Nvidia has gotten very good at controlling the perception of its products. They stopped 4000-series production prior to the 5000 series and drained the channel of stock precisely to make the 5000 series appear to be better value.
 
Exactly. 'Muh fantastic FE'
Lmao. Gonna be nice for the lifetime of this product.

I keep harping on about the wall that silicon lithography has hit, and people keep ignoring me, then make surprised Pikachu faces when there is no efficiency gain generation on generation. Because efficiency, by and large, comes from the node size, and that isn't getting appreciably smaller. If y'all are crying this hard about the lack of generational performance uplift now, you're gonna be drowning in your tears for a long time, because there ain't any good solutions in sight in the next half-decade at best. Physics is a harsh mistress.
Zen X3D has convincingly shown you could not be more wrong about that.

And we know that RT on current GPUs has literally created an extra consumption hog on top of the power needed to reach X FPS. Similarly, there will likely still be a gap between RDNA4, Intel Xe and GeForce even if they are baked on the same node.

You are being ignored on that point because it's only 'right' in the moment where the monopolist stops truly innovating its hardware. Moore's Law is similarly dead until someone proves Huang wrong.
 
Just noticed this.
View attachment 381322
That’s… some interesting behavior from the Swarm Engine. Doesn’t show up on higher resolutions.
This seems to be too small a workload to put enough load on the GPU.
The 4090 was probably already at the limit of the amount of data the game engine puts out.
More data isn't available, and the 5090 wants more data but gets nothing. It doesn't seem to be a CPU problem; it's common in some other games with the 4090 too, below 1080p.
This would be like when people run 720p on the 4090 and it ends up below cards like the 4080 or 4070.
 
Oh I know it is, I got a 6950XT near the end of its release for just over £500 which is ~50% of its RRP.

The fact people are going around going "Hey I got this GPU for a steal near the end of its life at 10% below RRP" just makes me sad that this is what we have to deal with now. I just had a look at the Asus ROG STRIX GAMING OC details.

View attachment 381366

View attachment 381372

The LOWEST it ever went was maybe 30% below its INSANE RRP, and even at its lowest point, 18 months after release, it was still well above the RRP of a Founders Edition. This seems to be fairly standard across everything. If what we have to expect is that a card's RRP will basically be its price for its whole life, then I can see why the high-end gaming area is going to be limited to a select few, because I cannot see many people willing to shell out £2-3k on a GPU every 4-6 years on top of everything else in a PC.
'The cost of RT'

Next chapter:

'The lacking RT adoption rate'
 
HWUB's Steve was calling it a 4090 Ti on purpose. I mean, I see where he was coming from. A 25% price bump for 26% average performance? Different sites show different performance increases, but it's still not much of an increase, and then there's the price bump. The power draw is way higher as well. I must say, this does not look so good to me. Very average, on the lower side of average, and I'm being generous here.
I think this one is pretty bad, and considering the 5090's performance, I doubt we can expect huge gen-to-gen differences from the lower-tier NV 5000-series cards. I'm kind of disappointed.
AMD, you're up. Let's see what you've got. Odds are nothing groundbreaking really, but let's just wait and see.
 
But it's perfectly in line with Jensen's Law:

"Moore's Law is dead … It's completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past."
 
I feel this is a fallacy; it should totally be compared to other products, and depending on the usage, a choice must be made. Purely for gaming? You must have money to burn. AI, game development, video editing, modeling and rendering, plus gaming? Here's your best option.
Then why are we comparing a B580 (MSRP $250) to a 5090 (MSRP $2,000)?

It's the best price-to-performance card (if you can get it for that price) against the strongest card currently on the market.

Who thinks:

Hmmm... I can buy this card from Intel for $250 (with performance between the 4060 Ti 8GB and 16GB versions), or the 5090 for $2,000.

The comparison is just nonsensical.
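For what it's worth, the value argument people make here boils down to a perf-per-dollar ratio. A minimal sketch of that math, using the MSRPs mentioned in the thread and a made-up performance index (the 3.5x figure is a placeholder, not review data; plug in real relative-performance numbers):

```python
# Perf-per-dollar framing of the B580 vs 5090 comparison.
# Prices are the MSRPs mentioned in the thread; the performance indices are
# HYPOTHETICAL placeholders - substitute real relative-performance figures.

cards = {
    "Arc B580": {"price": 250,  "perf": 1.0},   # placeholder baseline
    "RTX 5090": {"price": 2000, "perf": 3.5},   # placeholder, not measured
}

baseline = cards["Arc B580"]
for name, c in cards.items():
    relative_value = (c["perf"] / c["price"]) / (baseline["perf"] / baseline["price"])
    print(f"{name}: relative perf per dollar = {relative_value:.2f}")
```

Even with a generous placeholder ratio, the halo card's perf-per-dollar lands well under half of the budget card's, which is really all the "best value vs. strongest card" framing is saying.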
 
Power consumption during video playback? Multi-monitor? Idle?! IT'S HORRIBLE!
People literally hated AMD and the XTX because of this, but I guess... everyone and their mom will love Nvidia now. For the record, the XTX was high too. I watch a lot of movies; can you imagine draining 50-60 W for no reason at all, while my 10 W CPU can nicely play movies on my laptop without any issue, and I can even output them to my 4K TV? What's going on with Nvidia now? Are they becoming AMD?
For productive use such as research computations, or making money with this card by using it for rendering, etc., this idle/low-use power draw is not very relevant.

Rich people buying this card only for occasional gaming will not care either.

The 5080 and below, for normal people, will hopefully have much lower power draw thanks to their lower RAM capacity and silicon area. But I believe that even my tiny-chip 4070 with 12 GB of RAM draws at least 13 W while doing very little.

I just checked the TPU numbers for video playback: the 4070 draws 15 W and the 5090 54 W, and these cards are massively different in silicon area, RAM capacity and power circuitry. I do not expect these cards to draw exactly the same power in those three low-usage scenarios.
 
Then why are we comparing a B580 (MSRP $250) to a 5090 (MSRP $2,000)?

It's the best price-to-performance card (if you can get it for that price) against the strongest card currently on the market.

Who thinks:

Hmmm... I can buy this card from Intel for $250 (with performance between the 4060 Ti 8GB and 16GB versions), or the 5090 for $2,000.

The comparison is just nonsensical.
Of course it isn't. The 5090 will eventually be a similar class of paperweight, it just takes more time. The comparisons are great to make; they provide you with much-needed perspective on how silly it is to spend 2K on a GPU. Or how useful, given your use case. Fact is, the B580 also offers 16GB, so if it's just VRAM you need...
 
Of course it isn't. The 5090 will eventually be a similar class of paperweight, it just takes more time. The comparisons are great to make; they provide you with much-needed perspective on how silly it is to spend 2K on a GPU. Or how useful, given your use case.
Yes, I also think the price is stupid for the 50 series; also, availability of the B580 at MSRP is quite bad. (But the same goes for the 50 series.)
Fact is, the B580 also offers 16GB, so if it's just VRAM you need...
There are 16 GB models of the B580? I thought they were all 12 GB.
 
Yes, I also think the price is stupid for the 50 series; also, availability of the B580 at MSRP is quite bad. (But the same goes for the 50 series.)

There are 16 GB models of the B580? I thought they were all 12 GB.
You're right! My bad, that was the A770.
 
'The cost of RT'

Next chapter:

'The lacking RT adoption rate'
or "The Rise of the PS6 for upscaled path traced AAA gaming." The PS5 Pro has been selling pretty well despite being expensive. Console players are not that fussy; low framerate and (bad) upscaling have been their standards.

And seriously, PC gaming playtime (and a big chunk of the revenue) is being carried by competitive/online games that could run on a potato. I have a hunch that the "PC GAMING MASTERRACE premium experience" becoming increasingly expensive isn't going to make that much of a dent overall.
 
I just checked the TPU numbers for video playback: the 4070 draws 15 W and the 5090 54 W, and these cards are massively different in silicon area, RAM capacity and power circuitry. I do not expect these cards to draw exactly the same power in those three low-usage scenarios.

A bit more apples to apples:

4090 idle: 22W
5090 idle: 30W, +36%

4090 multi monitor: 27W
5090 multi monitor: 39W, +44%

4090 video playback: 26W
5090 video playback: 54W, +108%

It's quite horrible. AMD "We'll fix it in drivers (but doesn't)" horrible.

But making excuses for Nvidia that this card isn't meant for gamers or home users is silly. Nvidia spent quite a big chunk of their RTX 5090 presentation on how good it is in gaming - since it's apparently the only card that will have any significant performance uplift compared to its Lovelace equivalent without using "frame quadrupling". Delegate this card to the "Quadro" lineup, or a "home and small business AI accelerator" lineup, and what are you left with? Cards within 10-15% of their predecessors? That's within overclocking margin, as measly as it is now.
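If anyone wants to sanity-check those percentages, they're just the ratios of the wattages quoted above; a trivial sketch:

```python
# Sanity check of the idle / multi-monitor / video-playback deltas quoted
# above (wattages are the TPU review figures cited in this post).

scenarios = {
    "idle":           (22, 30),   # (RTX 4090 W, RTX 5090 W)
    "multi-monitor":  (27, 39),
    "video playback": (26, 54),
}

for name, (w4090, w5090) in scenarios.items():
    print(f"{name}: 4090 {w4090} W -> 5090 {w5090} W ({w5090 / w4090 - 1:+.0%})")
```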
 
A bit more apples to apples:

4090 idle: 22W
5090 idle: 30W, +36%

4090 multi monitor: 27W
5090 multi monitor: 39W, +44%

4090 video playback: 26W
5090 video playback: 54W, +108%

It's quite horrible. AMD "We'll fix it in drivers (but doesn't)" horrible.

But making excuses for Nvidia that this card isn't meant for gamers or home users is silly. Nvidia spent quite a big chunk of their RTX 5090 presentation on how good it is in gaming - since it's apparently the only card that will have any significant performance uplift compared to its Lovelace equivalent without using "frame quadrupling". Delegate this card to the "Quadro" lineup, or a "home and small business AI accelerator" lineup, and what are you left with? Cards within 10-15% of their predecessors? That's within overclocking margin, as measly as it is now.
High idle power is generally horrible, but... it's a trade-off for having the fastest. I didn't particularly like my 4090's idle draw either (it was hitting 27 watts), but again, I was willing to pay that price for the fastest. Comparing it to AMD GPUs ain't fair, because there you are not really making a trade-off: you aren't getting the fastest card, you are just getting a card with high power draw.
 
A bit more apples to apples:

4090 idle: 22W
5090 idle: 30W, +36%

4090 multi monitor: 27W
5090 multi monitor: 39W, +44%

4090 video playback: 26W
5090 video playback: 54W, +108%

It's quite horrible. AMD "We'll fix it in drivers (but doesn't)" horrible.

But making excuses for Nvidia that this card isn't meant for gamers or home users is silly. Nvidia spent quite a big chunk of their RTX 5090 presentation on how good it is in gaming - since it's apparently the only card that will have any significant performance uplift compared to its Lovelace equivalent without using "frame quadrupling". Delegate this card to the "Quadro" lineup, or a "home and small business AI accelerator" lineup, and what are you left with? Cards within 10-15% of their predecessors? That's within overclocking margin, as measly as it is now.
I do not think that this low-usage power draw is good, but the 5090 draws almost 600 W in 4K. That is a real concern. I do not think that this card should be considered a common consumer product. This power draw for a "normal home PC" is INSANE.
 
I do not think that this low-usage power draw is good, but the 5090 draws almost 600 W in 4K. That is a real concern. I do not think that this card should be considered a common consumer product. This power draw for a "normal home PC" is INSANE.
Don't worry, they won't be common; there will be 20-30 of them at launch. And they will heat your room in the meantime (mostly kidding).
 
If it's 1% better than the most efficient card on earth, then it is, by definition, the new most efficient card on earth.

Pulling 600w on its own does not make something inefficient.

To clarify, the 4090 was not the most efficient card on Earth. The 4080 was at the time, and now it's the 4080 Super, at least according to TPU charts.

The issue is that if a new generation product has the same efficiency as the previous gen, it's not a good thing. Obviously it's mainly because of the 4N node, but it shows that the architecture doesn't actually improve anything, at least for existing titles. There are some features that have to be implemented in games (neural rendering, mega geometry), but current games don't really benefit from any of the architectural changes.
It's always worth bringing up Maxwell. That was a crazy new architecture which improved efficiency by 50% on the same node. It was unbelievable. It can be done.

Raising power targets like this is a problem for the following generation. What's going to happen with the 60 series? A new node will improve efficiency, but to get a significant performance improvement they'll need to keep the same power levels. If they lower the power to 40 series levels, there will be a small performance improvement, or none at all. Do you really want 70-class cards at 250-300 W? I certainly don't, at least not in the summer.
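To make that concrete with a toy model: treat performance as roughly efficiency × power, and assume a hypothetical 30% perf/W gain from the next node (the 30% is purely illustrative, not a known figure).

```python
# Toy model of the 60-series dilemma described above: performance scales
# roughly as efficiency * power. The 30% node efficiency gain is an
# illustrative assumption, not a known figure for any future node.

node_eff_gain = 1.30   # hypothetical perf/W improvement from a new node
p_blackwell = 575      # W, current flagship board power
p_lovelace = 450       # W, previous flagship board power

perf_same_power  = node_eff_gain                               # keep 575 W
perf_lower_power = node_eff_gain * (p_lovelace / p_blackwell)  # drop to 450 W

print(f"Keep 575 W:    ~{perf_same_power - 1:+.0%} performance")
print(f"Back to 450 W: ~{perf_lower_power - 1:+.0%} performance")
```

Which is the point: once the power budget has been spent, dropping back to last generation's wattage eats almost the entire node gain.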
 
Raising power targets like this is a problem for the following generation. What's going to happen with the 60 series? A new node will improve efficiency, but to get a significant performance improvement they'll need to keep the same power levels. If they lower the power to 40 series levels, there will be a small performance improvement, or none at all. Do you really want 70-class cards at 250-300 W? I certainly don't, at least not in the summer.
Well, they could lower the prices a bit (a 60-series card for $1,000 with almost 4090 performance wouldn't be bad), maybe something similar to the 20 series -> 30 series.

Or they just lock really cool software features behind the new cards...
 
To clarify, the 4090 was not the most efficient card on Earth. The 4080 was at the time, and now it's the 4080 Super, at least according to TPU charts.

The issue is that if a new generation product has the same efficiency as the previous gen, it's not a good thing. Obviously it's mainly because of the 4N node, but it shows that the architecture doesn't actually improve anything, at least for existing titles. There are some features that have to be implemented in games (neural rendering, mega geometry), but current games don't really benefit from any of the architectural changes.
It's always worth bringing up Maxwell. That was a crazy new architecture which improved efficiency by 50% on the same node. It was unbelievable. It can be done.

Raising power targets like this is a problem for the following generation. What's going to happen with the 60 series? A new node will improve efficiency, but to get a significant performance improvement they'll need to keep the same power levels. If they lower the power to 40 series levels, there will be a small performance improvement, or none at all. Do you really want 70-class cards at 250-300 W? I certainly don't, at least not in the summer.
Of course it can be done, but that would mean they need to spend R&D effort on improving shader performance rather than their Tensor cores/AI. And they won't do that, as it will not benefit them in the enterprise AI craze. From the RT benchmarks it seems like they have also stopped caring that much about it. It's AI all the way.
Ada also didn't practically improve shaders at all; the performance gains came simply from the better node and the ability to run at 2.7-2.8 GHz, rather than under 2 GHz on Ampere.
If there's a new development in that market, or the bubble bursts, then they will be in a bad place, but that probably won't happen soon.
Now all you get in GeForce are leftovers from their enterprise architecture. Radeon will go the same way with UDNA.
 
Now i want the same perf for 75W :)
In hope of a better power-efficiency improvement for Blackwell, I did a calculation and got a 33% power-efficiency improvement on average over the last 4 generations. Let's assume this trend continues (I think TSMC's own data for 3N and 2N suggests that it does not?); it would mean we'd be there in 7 generations (7.14 = ln(575/75)/ln(1.33)). If the 5090 had a 75 W TDP, I'd consider getting it.
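Spelled out, the calculation is just a log ratio; a quick sketch so the assumptions are easy to tweak (the 33% per-generation gain and the 75 W target are the inputs stated above):

```python
import math

# How many generations until a 575 W card's performance fits into 75 W,
# assuming efficiency keeps improving ~33% per generation (the average
# estimated above). A trend extrapolation, not a prediction.

total_gain_needed = 575 / 75   # overall perf/W improvement required
per_gen_gain = 1.33            # assumed per-generation efficiency gain
gens_needed = math.log(total_gain_needed) / math.log(per_gen_gain)

print(f"~{gens_needed:.2f} generations")   # ~7.14
```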
 