
NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

Again, people voting with their asses instead of brains/wallets. "It's the fastest crap in the world, so I must have it" - bah. (I'm not talking about people doing professional work on their GPU, of course)

Good luck paying £4000 for this "fastest".

You should know energy efficiency and power usage are two different things.

Maxwell and Pascal are just bad.
The 4000 series is the most energy-efficient GPU lineup there is ATM.

Performance divided by power usage = energy efficiency.

If a GPU uses 300 W or more, that doesn't mean it can't be energy efficient.

The RTX 4090, a 350 W+ GPU, is more efficient than the RTX 3050.

[Chart: energy efficiency comparison (energy-efficiency.png)]
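To put rough numbers on the point above (the figures below are illustrative assumptions, not measured benchmarks), efficiency is just performance divided by power, so a high-wattage card can still come out on top:

```python
# Illustrative sketch: efficiency = performance / power.
# FPS and wattage numbers are made-up examples, not real measurements.
cards = {
    "RTX 4090": {"fps": 140, "watts": 420},
    "RTX 3050": {"fps": 30, "watts": 130},
}

for name, c in cards.items():
    efficiency = c["fps"] / c["watts"]  # frames rendered per watt
    print(f"{name}: {efficiency:.3f} FPS/W")
```

With numbers like these, the 420 W card delivers roughly 0.33 FPS/W versus roughly 0.23 FPS/W for the 130 W card, which is the whole point: wattage alone tells you nothing about efficiency.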


Does this graph include undervolted cards or simply stock? If stock, it is not representative.
 
My Maytag Microwave is good for 1000 watts. My Lian-Li Edge PSU is good for 1300 watts. We will fear no GPU.
The difference between the men and the boys is the price of their toys. Saddle up kids. It's gonna be a rough ride.
But will you be able to run both before your fuses melt in the box!
 
A good AC keeps the room temp just the same.
So the impact on room temps is 0 °C because of that.

You don't know what AC is, right?
It's not a heavy metal band...
Dump 1000+ W from your PC into your room, then turn on your 2000 W AC to dissipate it. Way to go, champ! :rockout:
 
As I said, people don't aim for 50% load just for efficiency, but also to allow for transient spikes as well as the option to upgrade to the next GPU tier without needing to change PSUs later on.
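The 50% rule of thumb above can be sketched as a quick calculation (the rest-of-system draw, spike multiplier, and target load are illustrative assumptions, not a sizing standard):

```python
# Rough PSU sizing sketch: target ~50% load at steady state, while also
# covering transient spikes well above the GPU's rated TDP.
# rest_of_system, spike_factor, and target_load are assumed values.
def recommended_psu(gpu_tdp, rest_of_system=250, spike_factor=1.8, target_load=0.5):
    steady = gpu_tdp + rest_of_system            # typical full-load draw
    spike = gpu_tdp * spike_factor + rest_of_system  # brief transient excursion
    return max(steady / target_load, spike)      # whichever constraint is bigger

print(recommended_psu(575))  # rumored 5090 TDP -> 1650.0
```

Under these assumptions a 575 W GPU lands around a 1600 W class PSU, which matches why existing 1000–1200 W owners may or may not feel the need to upgrade depending on how much headroom they want.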


I'll tell you when it gets released and reviewed. As with previous flagship GPUs, I can see existing 1000-1200 W PSU owners having no problems at all without needing a 2000 W PSU upgrade. Also not sure where you got the "bias" from, as this is my first post in the thread. I'm not in the market for a "flagship" GPU myself, but I also see little value in arguing in pre-release threads vs. "just wait and see" and then basing your build on actual measurements...
of course they will need to upgrade, because of 5090 ti and whatever special cable 6090 ti will require /s
 
Yes, that's why I also no longer buy OC models, unless they come with dual BIOS, better VRMs/phases, etc. Makes no sense to pay more for what, a name and maybe 2% more performance? lol
I disagree with some of the comments here. I just signed up to mention this. I have the Suprim X 4090 (already OC), and the extra OC (20% on clock, 15% on mem) made a difference with minimal TBP increase. I get about 10 FPS extra with only 450 to 480 W max TBP. I run it at the 120% max power limit. When you play at 4K, that's when the 4090 shines and the power difference comes into play. At anything less than 4K, the 4090 is literally useless. CP2077, Jedi: Survivor, and A Plague Tale: Requiem are all power hungry, yet the max TBP I've seen is 480 W spikes, with a 400 to 420 W average.
 
I disagree with some of the comments here. I just signed up to mention this. I have the Suprim X 4090 (already OC), and the extra OC (20% on clock, 15% on mem) made a difference with minimal TBP increase. I get about 10 FPS extra with only 450 to 480 W max TBP. I run it at the 120% max power limit. When you play at 4K, that's when the 4090 shines and the power difference comes into play. At anything less than 4K, the 4090 is literally useless. CP2077, Jedi: Survivor, and A Plague Tale: Requiem are all power hungry, yet the max TBP I've seen is 480 W spikes, with a 400 to 420 W average.
10 extra FPS on a 4090 at 120% power? That sounds like a total waste to me. No offense.

On a lower class card, the differences are even smaller.
 
Funny that back then people went "Nvidia is so much better because it's more efficient". And now "Nvidia is so much better because... um... 4090!!!" :laugh:

The people now really want AMD to seriously compete with Nvidia :roll:
IMO, XX90 is meant for gaming but buying this card aimed at gaming is beyond clueless.
 
The people now really want AMD to seriously compete with Nvidia :roll:
IMO, XX90 is meant for gaming but buying this card aimed at gaming is beyond clueless.
Personally, I just want a decent mid-range card at a decent price. I remember when owning x90 used to be a normal, everyday thing for normal, everyday monitor resolutions, but those days are long gone. It's gone way overboard in both performance and price for me to be interested.
 
Dump 1000+ W from your PC into your room, then turn on your 2000 W AC to dissipate it. Way to go, champ! :rockout:
I have a new house built in 2024; I can adjust each room's temperature individually however I like.
Good system, in winter or summer.

Winter costs a lot more than summer, but I really don't care anymore because I sold a lot of those XLM/XRP at +500% profit.
So I could buy 100x 5090s ATM if the price is €2,490 each.

10 extra FPS on a 4090 at 120% power? That sounds like a total waste to me. No offense.

On a lower class card, the differences are even smaller.
If you knew anything about OC...
It's the 120% power limit slider.
It gives more headroom for OC.
 
If you can afford a 5090 you can afford a new psu ;)
Agree! I bought a Seasonic PRIME TX-1600 Noctua Edition for the occasion! Lol.
Also if the 5090 really has a 575W TDP then you can be sure that AIB OC models will get 2x 12V-2x6 connectors with a ~700W BIOS (or more for Overclocking).
 
Agree! I bought a Seasonic PRIME TX-1600 Noctua Edition for the occasion! Lol.
Also if the 5090 really has a 575W TDP then you can be sure that AIB OC models will get 2x 12V-2x6 connectors with a ~700W BIOS (or more for Overclocking).

There are 900 W XOC BIOSes for the 4090; my bet is that the people with burnt connectors flashed their 4090 with an XOC BIOS.

On the stock BIOS, my TUF 4090 already goes up to 600 W.
 
of course they will need to upgrade, because of 5090 ti and whatever special cable 6090 ti will require /s
What is funny is that Nvidia pushed to get that 12VHPWR/16-pin out so people only have 1 cable, but here we are with a GPU of 575W (almost the max 600W) and we're going to need 2 connectors & cables soon...

There are 900 W XOC BIOSes for the 4090; my bet is that the people with burnt connectors flashed their 4090 with an XOC BIOS.

On the stock BIOS, my TUF 4090 already goes up to 600 W.
I would never install a non-original BIOS on my GPU, but my 4090 SUPRIM LIQUID X has a 530 W max TDP, yeah. I know the FE and ASUS have 600 W, but I didn't want that because of all the melting connectors. I bought a TX-1600 Noctua to get 2x 12V-2x6 (ATX 3.1 & PCIe 5.1) connectors & cables and get ready for the 5090.
 
What is funny is that Nvidia pushed to get that 12VHPWR/16-pin out so people only have 1 cable, but here we are with a GPU of 575W (almost the max 600W) and we're going to need 2 connectors & cables soon...
Maybe they pushed so that people can get 1200 W through 2 cables instead of 300?
 
Maybe they pushed so that people can get 1200 W through 2 cables instead of 300?
The previous 8-pin connectors were rated for 150 W but could actually sustain 300 W on each cable. Whereas the 12VHPWR is rated for 600 W and its max power draw before melting is ~630 W... definitely not as safe as the old 8-pin certification was.
1200W for a GPU would be nonsense though. I hope we never get there! It's going to create so much heat...
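Taking the figures in this post at face value (the rated and failure wattages are the poster's claims, treated here as approximate), the safety margins work out like this:

```python
# Safety-margin comparison using the figures claimed above (approximate):
#   8-pin PCIe: rated 150 W, sustains ~300 W per cable
#   12VHPWR:    rated 600 W, fails around ~630 W
connectors = {
    "8-pin PCIe": {"rated": 150, "limit": 300},
    "12VHPWR": {"rated": 600, "limit": 630},
}

for name, c in connectors.items():
    margin = c["limit"] / c["rated"]  # headroom above the official rating
    print(f"{name}: {margin:.2f}x margin over rating")
```

By this back-of-the-envelope math the old 8-pin had roughly 2x headroom over its rating, while the 12VHPWR has only about 1.05x, which is the crux of the safety argument here.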
 
The previous 8-pin connectors were rated for 150 W but could actually sustain 300 W on each cable. Whereas the 12VHPWR is rated for 600 W and its max power draw before melting is ~630 W... definitely not as safe as the old 8-pin certification was.
1200W for a GPU would be nonsense though. I hope we never get there! It's going to create so much heat...
I thought even 600 W was nonsense for a GPU and look where we are now (I still think it's nonsense, though).
 
I thought even 600 W was nonsense for a GPU and look where we are now (I still think it's nonsense, though).
Yeah, agreed. I remember when high-end GPUs had a 250 W TDP... and now it's 575 W. Just like Intel and their 13th/14th-gen CPUs that can reach ~300 W... This world is going nuts!
 
Maybe they pushed so that people can get 1200 W through 2 cables instead of 300?
My (game) theory is that the latest HEDT Seasonic PSUs have two 12VHPWR connectors to run 5090s in SLI/NVLink. It makes a lot of sense for r/LocalLLaMA and a few other niche communities, such as Flux AI and AI video generation.
There's basically no VRAM usage limit when using big models for AI image & video generation, and those dudes run a lot of GPUs in the same rig.
 
My (game) theory is that the latest HEDT Seasonic PSUs have two 12VHPWR connectors to run 5090s in SLI/NVLink. It makes a lot of sense for r/LocalLLaMA and a few other niche communities, such as Flux AI and AI video generation.
There's basically no VRAM usage limit when using big models for AI image & video generation, and those dudes run a lot of GPUs in the same rig.
It looks like you'll need both of those connectors for a 5090, at least with some models. Nvlink will need a second PSU (or one with 4 connectors). Even just the thought makes me shiver.
 
It looks like you'll need both of those connectors for a 5090, at least with some models. Nvlink will need a second PSU (or one with 4 connectors). Even just the thought makes me shiver.
4 connectors would be insane... even the Seasonic PX ATX 3.0 (2200 W) only has 2x 12V-2x6 connectors!

 
So the 5090 has 25 W for the VRAM and other components, but the 5080 has 40 W? That doesn't compute, even with the 28/30 Gbps speed difference. The 5090 has double the VRAM chips (or density?), a much more complex PCB and a much beefier power delivery.
IDC; I'd just say that 360 W is a "nearly acceptable" TDP, but 575 W is nonsense.
 
My (game) theory is that the latest HEDT Seasonic PSUs have two 12VHPWR connectors to run 5090s in SLI/NVLink. It makes a lot of sense for r/LocalLLaMA and a few other niche communities, such as Flux AI and AI video generation.
There's basically no VRAM usage limit when using big models for AI image & video generation, and those dudes run a lot of GPUs in the same rig.
Nvidia did away with NVLink on the 4000 series; I doubt the 5090 will have it.
But yeah, running two of those over PCIe 5.0 is gonna be really nice and still really useful for the cases you mentioned.
 