
NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

What that diagram tells me is that the 3090 should be a 250-260 Watt card. There is no need for it to eat more than that out-of-the-box. Overclockers would be happy with that, too.
Haven't you noticed how overclocking has diminished in popularity lately? Manufacturers are pushing components with higher clocks (and power, as a consequence) out of the box to try and get an edge and bigger numbers for marketing reasons.
Consumers have already shown they don't care about sane power consumption, they want that extra performance out of the box. Just look at what happened to the 9000 series from AMD, where they had to push for a bios with a higher default TDP to appease their consumers. Or Intel, where most people didn't give a damn about the great efficiency vs the previous gen.
 
Haven't you noticed how overclocking has diminished in popularity lately? Manufacturers are pushing components with higher clocks (and power, as a consequence) out of the box to try and get an edge and bigger numbers for marketing reasons.
Except that it doesn't bring bigger numbers. 71 vs 73 FPS is not a difference in my books. It's just margin of error. This is why overclocking is dead. Everything is overclocked out-of-the-box for no reason.

Consumers have already shown they don't care about sane power consumption, they want that extra performance out of the box. Just look at what happened to the 9000 series from AMD, where they had to push for a bios with a higher default TDP to appease their consumers. Or Intel, where most people didn't give a damn about the great efficiency vs the previous gen.
I see that, but I still find it weird, especially with all the propaganda about going green by using less power and stuff. Where is all that greenness in the home PC industry? You're constantly being reminded to use LED lights in your home and turn them off to save 5 W, only so that your high-end GPU can waste another 100 W on nothing? :confused:
 
That's awesome! :) Sometimes it's nice to be proven wrong. :ohwell:

It raises the question, though: why does the 4090 have to be a 450 W card by default if the extra power doesn't bring any extra performance to the table? What is Nvidia aiming at with such high power consumption?
The actual workloads it's intended for. The x090 series cards are spec'd and priced for those. They aren't gaming products.
 
Except that it doesn't bring bigger numbers. 71 vs 73 FPS is not a difference in my books. It's just margin of error. This is why overclocking is dead. Everything is overclocked out-of-the-box for no reason.
People really are willing to go to great lengths for that extra 2~5% performance; it is what it is.
I see that, but I still find it weird, especially with all the propaganda about going green by using less power and stuff. Where is all that greenness in the home PC industry? You're constantly being reminded to use LED lights in your home and turn them off to save 5 W, only so that your high-end GPU can waste another 100 W on nothing? :confused:
Eh, I do have opinions on that, but I guess it'd be pretty off-topic.
The actual workloads it's intended for. The x090 series cards are spec'd and priced for those. They aren't gaming products.
Most prosumers will actually power-limit those GPUs. Just look at the TDP of the enterprise lineup: it's way lower than the GeForce counterparts, since perf/watt is an important metric in that space.
 
So that for some very specific cases it can stretch its legs all the way, using an extra 100 W for a 100 MHz bump for those synthetic benchmark scores.
Same goes for the 600 W limit some models have, really pushing the power envelope for minor clock gains. Reminder that past a certain point, the power needed for each extra bit of performance grows steeply; the returns diminish fast.

Both my 3090s have a default power limit of 370 W, whereas at 275 W I lose less than 10% performance.

Here's a simple example of power scaling for some AI workloads on a 3090; you can see that past a certain point you barely get any extra performance when increasing the power limit:
[Attachment 378225: power limit vs. throughput chart for an AI workload on a 3090]

That has been the case since... always. Here's another example with a 2080 Ti:
[Attachment 378224: the same kind of power-scaling curve on a 2080 Ti]
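
If anyone wants to reproduce that kind of curve themselves, here's a rough sketch of how such a power-limit sweep could be scripted. It assumes nvidia-smi is on the PATH and is run with admin/root rights, and the benchmark step (the "train_step.py" script) is just a placeholder for whatever workload you actually care about:

```python
# Rough sketch of a power-limit sweep: cap the board power at several values
# and measure relative throughput at each one.
import subprocess
import time

def set_power_limit(watts: int, gpu: int = 0) -> None:
    # nvidia-smi -pl sets the software power cap (within the board's allowed range).
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(watts)], check=True)

def run_benchmark() -> float:
    # Placeholder workload: time a hypothetical script and return 1/duration
    # as a crude throughput number. Swap in your real benchmark here.
    start = time.time()
    subprocess.run(["python", "train_step.py"], check=True)
    return 1.0 / (time.time() - start)

if __name__ == "__main__":
    results = {}
    for watts in (200, 250, 275, 300, 350, 370):
        set_power_limit(watts)
        results[watts] = run_benchmark()
    baseline = results[max(results)]  # treat the highest cap as 100%
    for watts, perf in sorted(results.items()):
        print(f"{watts} W: {perf / baseline:.1%} of max performance")
```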

Games often don't really push a GPU that hard, so the consumption while playing is usually well below the actual limit.
Biscuits. Path tracing etc., i.e. the whole point of buying too much GPU at the moment, pushes a GPU just fine. Or do people buy these over-the-top-priced cards just for ray-traced Fortnite and Roblox?

Games now often do push high loads.
 
But I'd also run the 5080 at 320 W, so the performance difference will still be whatever it ends up being.
The problem is that the 5080's VRAM is already struggling above 2K in Star Wars Outlaws with its secret settings.
I don't know which games will use over 16 GB of VRAM maxed out, or what insane textures could fill 32 GB of VRAM. The Witcher 4 and GTA 6 might be among them; no word yet on AC Shadows and Hexe.
And no, the 5080 underpowered is not the same as a 5090, because under the hood it's a totally different card. That's why overclocking won't make a huge difference except in benchmarks.
And seriously, anyone buying a 5090 should water-cool it in summer; summers keep getting hotter, way past the 2-degree climate target.
 
The problem is that the 5080's VRAM is already struggling above 2K in Star Wars Outlaws with its secret settings.
I don't know which games will use over 16 GB of VRAM maxed out, or what insane textures could fill 32 GB of VRAM. The Witcher 4 and GTA 6 might be among them; no word yet on AC Shadows and Hexe.
As long as the latest gen consoles have 16 GB RAM (not VRAM - total system RAM also used as VRAM), I'd be cautious about making predictions on this front.
 
Except that it doesn't bring bigger numbers. 71 vs 73 FPS is not a difference in my books. It's just margin of error. This is why overclocking is dead. Everything is overclocked out-of-the-box for no reason.


I see that, but I still find it weird, especially with all the propaganda about going green by using less power and stuff. Where is all that greenness in the home PC industry? You're constantly being reminded to use LED lights in your home and turn them off to save 5 W, only so that your high-end GPU can waste another 100 W on nothing? :confused:
Yes, that's why I also no longer buy OC models, unless they come with dual BIOS or better VRMs/phases, etc. Makes no sense to pay more for what, a name and maybe 2% more performance? lol
 
Yes, that's why I also no longer buy OC models, unless they come with dual BIOS or better VRMs/phases, etc. Makes no sense to pay more for what, a name and maybe 2% more performance? lol
Not to mention made-by-AMD and Nvidia FE cards look so much better than all the plastic bling-bling gamery shit made by board partners! :rolleyes:
 
Yes, that's why I also no longer buy OC models, unless they come with dual BIOS or better VRMs/phases, etc. Makes no sense to pay more for what, a name and maybe 2% more performance? lol
I wish better models had 2% more performance :D

They don't; just look at their clock speeds, they're all within 0.01% of each other. Higher-end models just have better PCB power delivery, etc.

Not to mention made-by-AMD and Nvidia FE cards look so much better than all the plastic bling-bling gamery shit made by board partners! :rolleyes:
The FE design was nice when it was used on smaller models (I got a 3060 Ti FE, it's really nice), but look at the 4090 FE: it's ugly as hell. There are a few designs that still look decent; usually it's the high end that goes full bling-bling gamerz OC RGB, while the base models are nice. Check the 4090 Windforce, for example.
 
Biscuits. Path tracing etc., i.e. the whole point of buying too much GPU at the moment, pushes a GPU just fine. Or do people buy these over-the-top-priced cards just for ray-traced Fortnite and Roblox?

Games now often do push high loads.
I don't think that's the case; just take a look at Steam's hardware survey and see how the 3090 and 4090 fare in the list, way below the other products.
Most gamers won't be buying those products. Reminder that this forum is a niche with some enthusiasts who can afford it, but your average buyer won't even consider that product as an option.
That's debatable; the workloads you're talking about would still be VRAM-limited.
And that's why you buy multiple of those :p
 
I don't think that's the case; just take a look at Steam's hardware survey and see how the 3090 and 4090 fare in the list, way below the other products.
Most gamers won't be buying those products. Reminder that this forum is a niche with some enthusiasts who can afford it, but your average buyer won't even consider that product as an option.
x90 is a niche even on this forum, I'd dare to say. It's for professionals and 4K high-refresh gamers.
 
Undervolting is the new overclocking. You get a card juiced way past the sweet spot of its V/F curve out of the box, and the game is now to keep 95% of the performance for 70% of the power.

It's honestly for the best this way. Overclocking was fun (especially when you had to draw your own traces), but most buyers are fundamentally getting ripped off by not getting all of the performance out of the card they paid for.

Now you get all the performance, and it's on you to find the power/heat level that you're OK with.
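
For what it's worth, a true undervolt on NVIDIA still means editing the V/F curve in something like MSI Afterburner; the closest scriptable approximation I know of is capping the power limit and locking the clock range via nvidia-smi. A rough sketch of that approach is below; the 275 W cap and 210-2400 MHz range are example values for a hypothetical card, not recommendations, and both calls need admin/root rights:

```python
# Sketch: approximate an "undervolt" by capping board power and locking the
# GPU clock range, then reading back what the card actually draws.
import subprocess

GPU = "0"

# Cap board power (nvidia-smi -pl).
subprocess.run(["nvidia-smi", "-i", GPU, "-pl", "275"], check=True)

# Lock the GPU clock range (nvidia-smi -lgc min,max) so boost stops chasing
# the last few inefficient MHz; undo later with "nvidia-smi -rgc".
subprocess.run(["nvidia-smi", "-i", GPU, "-lgc", "210,2400"], check=True)

# Read back power draw and SM clock to see where the card settles under load.
out = subprocess.run(
    ["nvidia-smi", "-i", GPU,
     "--query-gpu=power.draw,clocks.sm",
     "--format=csv,noheader,nounits"],
    check=True, capture_output=True, text=True,
)
power_w, sm_mhz = (v.strip() for v in out.stdout.split(","))
print(f"Power draw: {power_w} W, SM clock: {sm_mhz} MHz")
```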
 
It's honestly for the best this way. Overclocking was fun (especially when you had to draw your own traces), but most buyers are fundamentally getting ripped off by not getting all of the performance out of the card they paid for.
I kind of agree and disagree. I like using my card to its full potential, but 1. I'm not keen on the top 5% at all costs, and 2. we're constantly being fed that the planet needs to be saved, we're using too much energy and whatnot, so then why do GPUs have to consume hundreds of Watts more out-of-the-box for the extra 5% performance? Is it really worth it? I'm not sure it is.
 
That's debatable; the workloads you're talking about would still be VRAM-limited. Unless you meant selling them in China, bypassing some of the usual sanctions?

They really aren't; the xx90 cards are neither here nor there, IMO.
In art and design, there are lots of freelancers who prefer the GeForce RTX xx90 over the workstation RTX cards. CAD and scientific-simulation users are pretty much the only crowd I've really seen have a hard-on for those professional cards; otherwise, I've seen professionals say that in their domain, the ROI is just higher with a GeForce. And if a project really needs more than that, they'd use a cloud farm anyway (and charge the client accordingly).
 
Will this be the next Fermi?
 
I kind of agree and disagree. I like using my card to its full potential, but 1. I'm not keen on the top 5% at all costs, and 2. we're constantly being fed that the planet needs to be saved, we're using too much energy and whatnot, so then why do GPUs have to consume hundreds of Watts more out-of-the-box for the extra 5% performance? Is it really worth it? I'm not sure it is.

- Could just go with the accelerationism approach and try to burn down the planet faster... better a quick death than a long and drawn out battle against the inevitable :rockout:

But that's the beauty of undervolting... you don't have to get that last 5% at all costs, you can reduce power consumption and your own carbon footprint. The choice has simply shifted from having to gain performance to having to conserve power and heat, other side of the same coin.

I can't speak to newer Nvidia cards, but AMD definitely has some one-click-and-done settings in their software that will bias the card one way or another, so it doesn't even require a whole lot of tweaking and noodling to get decent results.
 
- Could just go with the accelerationism approach and try to burn down the planet faster... better a quick death than a long and drawn out battle against the inevitable :rockout:
Interesting thought. I wouldn't say I entirely disagree, but let's not go there for the sake of other forum members' sanity. :D

But that's the beauty of undervolting... you don't have to get that last 5% at all costs, you can reduce power consumption and your own carbon footprint. The choice has simply shifted from having to gain performance to having to conserve power and heat, other side of the same coin.
Sure, but how many people do that? I'd bet at least 9 out of 10 people just plug their cards in and let them run full blast. What we see here on the forum is a tiny minority.

I can't speak to newer Nvidia cards, but AMD definitely has some one-click-and-done settings in their software that will bias the card one way or another, so it doesn't even require a whole lot of tweaking and noodling to get decent results.
It's not that great. The last time I tried the "auto undervolt" button (if that's what you mean) when I briefly had a 7800 XT on my hands, it shaved maybe 10 W off of it. The power slider works nicely, though.

Does Nvidia have an equivalent function in their new software? I haven't tried it, yet (my HTPCs are running old drivers at the moment).
 
You seem to be under the assumption that performance scales linearly with power.
A 5090 at a lower power budget than a 5080 is still going to have almost double the memory bandwidth, and way more cores, even if those are clocked lower.

Your assumptions are also wrong. A 5090 at 320W is likely to only be 10~20% slower than the stock setting.
The 5080 math is also not that simple because things (sadly) often do not scale linearly like that.

Example - my 4090 at a 90% power limit loses 2% performance.
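
Just to spell out the math on that anecdote (the 90% power / 2% loss figures are from this post; the assumption that nothing else changes is mine):

```python
# Perf-per-watt arithmetic for a 90% power cap that costs 2% performance.
# Numbers are one card's anecdote, not a general rule.
stock_power, stock_perf = 1.00, 1.00
capped_power, capped_perf = 0.90, 0.98

efficiency_gain = (capped_perf / capped_power) / (stock_perf / stock_power) - 1
print(f"Perf lost: {1 - capped_perf:.0%}")            # 2%
print(f"Power saved: {1 - capped_power:.0%}")         # 10%
print(f"Perf/W improvement: {efficiency_gain:.1%}")   # ~8.9%
```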

and people with more money than brain.

That‘s where I have a problem with your thinking. Who are you to judge how I spend my money, what my values are, and what I derive pleasure from?

Seriously, just fuck off with this bullshit attitude. I thought this was an enthusiast website.

Just as an FYI I upgraded from a GTX 960. The reason I can afford a 4090 is because I saved the money by not upgrading constantly.
 
The actual workloads it's intended for. The x090 series cards are spec'd and priced for those. They aren't gaming products.

Nvidia has other products for non-gaming purposes. They are called Quadro, Tesla and the like.
They are very bad for gaming, while the x090 cards are the highest-performing for gaming. They are gaming cards.
 
That‘s where I have a problem with your thinking. Who are you to judge how I spend my money, what my values are, and what I derive pleasure from?

Seriously, just fuck off with this bullshit attitude. I thought this was an enthusiast website.

Just as an FYI I upgraded from a GTX 960. The reason I can afford a 4090 is because I saved the money by not upgrading constantly.
I listed several (even legitimate) use cases for the 4090. Did I say which kind of buyer you personally were? ;)

Also, how do you define an "enthusiast"?

Nvidia has other products for non-gaming purposes. They are called Quadro, Tesla and the like.
There's no more Quadro, just RTX Axxx. And who said they're not good at gaming?

They are very bad for gaming, while the x090 cards are the highest-performing for gaming. They are gaming cards.
That's pure marketing, not the full picture.
 
Seriously, just fuck off with this bullshit attitude. I thought this was an enthusiast website.
Judging by your own comments here, I can safely say that it is not.
 
There's no more Quadro, just RTX Axxx. And who said they're not good at gaming?

Me. You know it, but... ?



 