NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

It's totally excessive and unnecessary, just like a 4090 is for most people.
Why does it matter to you though?

You made your point about sticking with AMD midrange and Linux...

5090 isn't made for "gamers"
 
The 5090 is almost double the power usage of my 7900XT which already heats up my tiny gaming room. The room would be hotter than a sauna with the 5090!
In the same game, with the same settings and the same fps, the 5090 will likely need half the power of your 7900 XT, though. That's what matters when it comes to power: it's called efficiency.
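
To put rough numbers on what efficiency means here, a quick back-of-envelope sketch; every figure in it is made up purely to illustrate the point, not a benchmark result:

```python
# Toy perf-per-watt comparison; every number here is hypothetical,
# chosen only to illustrate the argument, not measured.
fps = 120                  # same game, same settings, same capped frame rate
watts_7900xt = 300         # hypothetical draw at that frame rate
watts_5090 = 150           # hypothetical draw at the same frame rate

print(fps / watts_7900xt)  # 0.4 fps per watt
print(fps / watts_5090)    # 0.8 fps per watt -> same frames, half the heat
```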
 
The 5090 is almost double the power usage of my 7900XT which already heats up my tiny gaming room. The room would be hotter than a sauna with the 5090!
A 600 W GPU is ridiculous unless it's undervolted, but I expect 5090 buyers won't care and can probably afford the AC to handle the heat load. It's also silly to me how the 5090 might have 2x 16-pin connectors, detracting from the whole point of having a more "convenient" power connector.
To me it's like asking "who's gonna buy a TV that comes with 100% brightness out of the box".
Honestly, probably most people do. However, with a graphics card, I shouldn't need to undervolt and tweak the power usage out of the box. Manufacturers are pushing parts too hard if I have to undervolt things right away in order for the power use to be reasonable.
 
just pick any game and see the frame rate drop when you exit a building or enter a large open world area
That's a 0.1% hitch. Nothing to write home about, imo.

Why does it matter to you though?
It doesn't. I answered a question. The hostility... Geez. :confused:

Should everybody who doesn't intend to buy a 5090 just shut the F up and leave? :confused:
 
It's also silly to me how the 5090 might have 2x 16-pin connectors, detracting from the whole point of having a more "convenient" power connector.

It's more convenient than 4 or 5 x 8-pin connectors :D
 
Honestly, probably most people do. However, with a graphics card, I shouldn't need to undervolt and tweak the power usage out of the box. Manufacturers are pushing parts too hard if I have to undervolt things right away in order for the power use to be reasonable.
Nobody said anything about undervolting. I agree you shouldn't need to undervolt out of the box, and you don't need to. But adjusting the power usage, why not? Nvidia, AMD, Intel, and whatever other company exists CANNOT, by definition, know how you want to use a product better than you do, therefore they cannot ship it with settings that fit your needs. The same applies to most products, be it TVs, ACs, monitors, etc.

Some GPUs have a dual-BIOS switch for noise / performance, but that's still just two options; it can't cast a wide enough net. It's the same way laptops come with turbo / silent / performance modes, etc. Should I be complaining that my laptop came set to turbo out of the box? Who cares, it takes a second to change it.
 
None at all mate, I was just curious that's all :)
Then please be careful with words. "What does it matter to you" and "you made your point" sure sound hostile.

Sometimes, people come here out of some genuine interest for tech, not because they have a personal stake in the matter. Not everybody in a 5090 thread is a potential buyer. Sometimes, one is just curious. :)
 
I would let it receive the full glory of 600 W. If that is TGP, then it's only ~200 W more than what my Ti is doing right now at full load.

Then please be careful with words. "What does it matter to you" and "you made your point" sure sound hostile.

Sometimes, people come here out of some genuine interest for tech, not because they have a personal stake in the matter. Not everybody in a 5090 thread is a potential buyer. Sometimes, one is just curious. :)
I get it...
 
Nobody said anything about undervolting. I agree you shouldn't need to undervolt out of the box, and you don't need to. But adjusting the power usage, why not? Nvidia, AMD, Intel, and whatever other company exists CANNOT, by definition, know how you want to use a product better than you do, therefore they cannot ship it with settings that fit your needs. The same applies to most products, be it TVs, ACs, monitors, etc.

Some GPUs have a dual-BIOS switch for noise / performance, but that's still just two options; it can't cast a wide enough net. It's the same way laptops come with turbo / silent / performance modes, etc. Should I be complaining that my laptop came set to turbo out of the box? Who cares, it takes a second to change it.
Like I said in another thread, I'd much rather spend less on a card that uses less power by default. Choosing the more expensive option and then limiting it by software sounds excessive to me.

But each to their own.
 
Then please be careful with words. "What does it matter to you" and "you made your point" sure sound hostile.

Sometimes, people come here out of some genuine interest for tech, not because they have a personal stake in the matter. Not everybody in a 5090 thread is a potential buyer. Sometimes, one is just curious. :)
For someone who doesn't care, it sounds a lot like they do, to be honest. Everyone here has their own opinions, unless these threads should be Nvidia users only lol.

But the 5090 is for "gamers", as it has the GeForce branding. It definitely works, as gamers with deep wallets think they need the latest flagship.
It's more convenient than 4 or 5 x 8-pin connectors :D
A 5090 wouldn't need more than 3 of the 8-pin connectors; 8-pin Molex can handle a lot more than what it's rated for.
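
For reference, here's how the on-paper budget works out, using only the official spec ratings (whether a connector tolerates more than its rating, as argued above, is a separate question):

```python
# Spec-rated power budget (watts); what a connector survives beyond
# its rating is a separate question from what the spec allows.
SLOT = 75          # PCIe x16 slot
EIGHT_PIN = 150    # per 8-pin PCIe power connector
SIXTEEN_PIN = 600  # 12VHPWR / 12V-2x6 at its highest rating

print(SLOT + 3 * EIGHT_PIN)   # 525 W -- on paper, short of a 575 W card
print(SLOT + 4 * EIGHT_PIN)   # 675 W -- in-spec with four 8-pins...
print(SLOT + SIXTEEN_PIN)     # 675 W -- ...or with a single 16-pin
```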
 
Like I said in another thread, I'd much rather spend less on a card that uses less power by default. Choosing the more expensive option and then limiting it by software sounds excessive to me.

But each to their own.
But this thread is about the most ridiculous card that money can buy. It is supposed to be excessive; that is what you are paying for... The best.
 
For someone who doesn't care, it sounds a lot like they do, to be honest. Everyone here has their own opinions, unless these threads should be Nvidia users only lol.
Should I write "in my opinion" in front of every post I make? :confused:

I am an Nvidia user, by the way, just not in my main gaming rig at the moment. I've got two HTPCs that both have Nvidia GPUs in them. Does that make me more qualified to comment here?

But the 5090 is for "gamers", as it has the GeForce branding. It definitely works, as gamers with deep wallets think they need the latest flagship.
Let me disagree there. The 5090 has double everything compared to the 5080 (shaders, VRAM, etc.), which is already gonna be a stupidly expensive card. The 5090 is only GeForce by name, to sell it to gamers. But it is not a card that your average gamer needs. Otherwise, there wouldn't be such a gigantic gap between it and the 5080 in specs.
 
Like I said in another thread, I'd much rather spend less on a card that uses less power by default. Choosing the more expensive option and then limiting it by software sounds excessive to me.

But each to their own.
So if Nvidia was selling a 5070 at 200 W you'd buy one, but if it's shipping at 500 W you wouldn't (even though you can limit it to 200 W), and you'd rather buy a 5060... It doesn't make sense to me.
 
In the same game, with the same settings and the same fps, the 5090 will likely need half the power of your 7900 XT, though. That's what matters when it comes to power: it's called efficiency.
And I can run an iGPU to play Tetris FTW. Some Intel fans tried to play the iso card over and over but were ultimately ratioed out of the comments.

No one is buying a $2000 GPU to run it at the speed of a $500 GPU. There's a reason review sites measure gaming power at max settings and resolution: that is how the product will be used by the vast majority of users.

It seems that every comment to do with Nvidia relates to poor performance (RT), blurry image quality (DLSS) and now lower settings / limiting power for even worse performance and image quality.

What the hell is going on with Nvidia users?!?!
 
But this thread is about the most ridiculous card that money can buy. It is supposed to be excessive; that is what you are paying for... The best.
Totally. :) And that's what makes it not made for the average gamer, imo.

So if Nvidia was selling a 5070 at 200 W you'd buy one, but if it's shipping at 500 W you wouldn't (even though you can limit it to 200 W), and you'd rather buy a 5060... It doesn't make sense to me.
As long as I find the performance of the 5060 acceptable, yes. Spending more money on something that I'm not fully using is what doesn't make sense to me.
 
And I can run an iGPU to play Tetris FTW. Some Intel fans tried to play the iso card over and over but were ultimately ratioed out of the comments.

No one is buying a $2000 GPU to run it at the speed of a $500 GPU. There's a reason review sites measure power: that is how the product will be used by the vast majority of users.
An iGPU will be much slower than your 7900 XT.

Of course no one is buying a $2000 GPU to run it at the speed of a $500 GPU; nobody argued that. I'm arguing that at the same power as your 7900 XT it will be vastly faster (and it should be), so I don't get the notion of complaining about its power.

As long as I find the performance of the 5060 acceptable, yes.
Well, if you find the performance of the 5060 acceptable, you wouldn't even be looking at the 5070 in the first place, regardless of the power draw. Anyways, it just doesn't make sense to me, but whatever, I'm not the arbiter of what makes sense.
 
Well, if you find the performance of the 5060 acceptable, you wouldn't even be looking at the 5070 in the first place, regardless of the power draw.
Why wouldn't I? What's wrong with getting a clear picture of the full market before buying something?

Don't take it personally, but this is the difference between a value-conscious buyer and a moron, imo. A value-conscious buyer looks at every option and chooses the one that is the closest to satisfying their needs at their given budget, while a moron buys the most expensive shit available without thinking about it.
 
An iGPU will be much slower than your 7900 XT.

Of course no one is buying a $2000 GPU to run it at the speed of a $500 GPU; nobody argued that. I'm arguing that at the same power as your 7900 XT it will be vastly faster (and it should be), so I don't get the notion of complaining about its power.


Well, if you find the performance of the 5060 acceptable, you wouldn't even be looking at the 5070 in the first place, regardless of the power draw. Anyways, it just doesn't make sense to me, but whatever, I'm not the arbiter of what makes sense.
Okay guy, buy a 5090 for $2000+ and run it at half power for the entire time you own it. I hope it’s worth it.

Btw, the 4090 is only 5% more efficient than a 7900 XTX at raster, and the 5090 at best will be 30% faster for almost 50% higher power than the 4090.
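
Taking those claimed figures at face value, the relative perf-per-watt is simple arithmetic; this just restates the numbers above, it isn't a measurement:

```python
# Relative perf-per-watt from the claimed figures above:
# ~30% faster at almost 50% higher power.
perf_scale = 1.30
power_scale = 1.50

print(perf_scale / power_scale)  # ~0.87 -> roughly 13% less efficient than a 4090
```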

 
Okay guy, buy a 5090 for $2000+ and run it at half power for the entire time you own it. I hope it’s worth it.
Don't know if I'm buying one, but if I do, it's going to be locked to 320 W, exactly like my 4090.
 
More so, the 5090 won't use the fully enabled version, either. 5090 Ti coming later, perhaps? Or will it be reserved for industrial cards?
I believe it's the latter, just like what happened to the 4090 (it wasn't the fully enabled die either).
No reason for a Ti version.
Who's gonna buy a 575 W GPU unless they live in Iceland or Greenland? Imagine gaming in summer when it's +43 °C outside, which is like every year now o_O I'm not gonna overload my AC just to game.
Well, I might get one just for the winter months, as I'm heating my PC room with a 600 W IR panel + 4070 Ti SUPER atm. Maybe I could replace the IR heater with a 5090 :rolleyes:
I personally just power limit my GPUs. Both my 3090s run at 275 W each.
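
For anyone who'd rather script that than set it in Afterburner, a minimal sketch using NVML's Python bindings (the nvidia-ml-py package) should do it; the 275 W target is just my figure from above, and applying the limit needs admin rights:

```python
# Minimal sketch: cap a GPU's power limit via NVML's Python bindings.
# pip install nvidia-ml-py; applying the limit needs admin/root, and it
# resets when the driver reloads (same caveat as `nvidia-smi -pl`).
# The 275 W target is just the figure from the post above.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# NVML works in milliwatts; clamp the target to what the card accepts.
lo, hi = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
target = min(max(275_000, lo), hi)

pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target)
print(pynvml.nvmlDeviceGetPowerManagementLimit(gpu) / 1000, "W")

pynvml.nvmlShutdown()
```

The same thing works from the command line with `nvidia-smi -pl 275` if you don't want Python.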
 
Good for the 5080, but it's expected that the 5080 will be about equal to the 4090 in performance, too.
Nearly 600 W stock for the 5090 is bad...
 
Don't know if I'm buying one, but if I do, it's going to be locked to 320 W, exactly like my 4090.
Just buy a 5080 and save $1000+. The performance between a 5090 at 320 W and a 5080 at 360 W is going to be about the same. Maybe, and this is a big maybe, the 5090 will be a little faster, but don't forget that Nvidia is using the same node as the 4000 series. This means the efficiency of the 5000 series will go down as more transistors are added.

This is a buyer-beware situation, and no company logo on the box beats physics.
 