
NVIDIA GeForce RTX 5090 Runs on 3x8-Pin PCI Power Adapter, RTX 5080 Not Booting on 2x8-Pin Configuration

AleksandarK

News Editor
NVIDIA's flagship GeForce RTX 5090 demonstrated flexibility in power compatibility, while its sibling, the RTX 5080, struggled with stricter requirements. Recent tests by German tech outlet ComputerBase reveal that the RTX 5090 can operate with three 8-pin PCI power connectors instead of the recommended four, albeit with a performance trade-off, whereas the RTX 5080 fails to boot when using only two 8-pin connectors. The RTX 5090, with a default TDP of 575 W, officially requires a 600 W 12V-2×6 connector or an adapter with four 8-pin PCI cables. However, tests on the ASUS ROG RTX 5090 Astral and Zotac RTX 5090 Solid show the GPU boots even with three 8-pin cables, capping its TDP at 450 W, which matches the three connectors' combined 150 W-per-cable spec. Performance losses are modest: benchmarks indicate a 5% drop in average FPS at 450 W compared to full power.

In contrast, the RTX 5080's 360 W TDP proves less forgiving. Attempts to run the Founders Edition and the Zotac RTX 5080 AMP Extreme Infinity with two 8-pin connectors (300 W total) resulted in failure: the screen remained blank, and the card refused to initialize. NVIDIA's firmware appears to lack a lower power-limit threshold for the RTX 5080, unlike the RTX 5090, which automatically adjusts when it detects insufficient power delivery. This forces users to adhere strictly to the three 8-pin or 12V-2×6 power connectors. While the RTX 5090 offers flexibility for users upgrading from older systems, the RTX 5080's limitation may frustrate owners of less powerful PSUs. For the RTX 5090, the 5% performance penalty at 450 W may be a reasonable trade-off for avoiding a costly PSU upgrade, but RTX 5080 users have no such recourse. Verifying power supply compatibility is a must, as underpowered setups risk instability or hardware damage; when your GPU costs $2,000+, you should at least power it properly. This experiment is more a "for science" type of run.
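To make the observed behavior concrete, here is a minimal Python sketch of the power-budget logic described above. The 150 W-per-cable figure comes from the 8-pin PCIe spec; the boot rule (the 5090 down-caps, the 5080 refuses to start below its TDP) is inferred from ComputerBase's observations, not from NVIDIA's actual firmware:

```python
PCIE_8PIN_WATTS = 150  # per-cable budget from the PCIe CEM spec

def power_cap(num_8pin_cables: int) -> int:
    """Total board power available from the 8-pin adapter cables."""
    return num_8pin_cables * PCIE_8PIN_WATTS

def boots(card_tdp: int, num_8pin_cables: int, can_downcap: bool) -> str:
    """Inferred boot behavior; can_downcap mirrors the 5090/5080 difference."""
    available = power_cap(num_8pin_cables)
    if available >= card_tdp:
        return f"boots at full {card_tdp} W TDP"
    if can_downcap:  # RTX 5090 behavior per the article
        return f"boots with TDP capped at {available} W"
    return "does not boot"  # RTX 5080 behavior per the article

print("RTX 5090, 3x8-pin:", boots(575, 3, can_downcap=True))   # capped at 450 W
print("RTX 5080, 2x8-pin:", boots(360, 2, can_downcap=False))  # blank screen
```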



View at TechPowerUp Main Site | Source
 
So much for all those "but you can undervolt it" arguments.
 
How does the card know what's at the other end of the adapter?
 
How does the card know what's at the other end of the adapter?
Maybe the PSU won't supply the power it needs?
 
575W @ 100% versus 450W @ 95%, it's crazy how far above their optimal range these chips are clocked.
 
575W @ 100% versus 450W @ 95%, it's crazy how far above their optimal range these chips are clocked.
That's the average case; the worst case in that small test suite is more concerning. Warhammer 40K: Space Marine 2 saw average FPS increase by 16% when the power limit was raised from 450 W to 575 W.

 
It will just heat up those 3 cables more. Ask me how I know.



Was running some maps in POE 2 when the distinct smell of burning plastic filled the air. I was only running 3 cables because, when I first bought the card, I read that doing so would limit the card in my build to 450 W instead of 450 W+, and would keep it in spec.

I would not recommend it.
 
16% is still not a lot for a 27% reduction in power.
It's a 16% increase with a 28% increase in power, not a 28% reduction in power. 99th percentile frame times also improved by 20% with the higher power limit. I would say that it's highly usage dependent; for games that don't run into the card's power limit, a lower limit doesn't really hurt.
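Since the percentages keep getting mixed up in this thread, a quick check with the two power limits from the article:

```python
# 450 W limit vs. the stock 575 W limit: the same change, two percentages.
low, high = 450, 575
print(f"going up:   {(high / low - 1) * 100:.1f}% more power")   # ~27.8%
print(f"going down: {(1 - low / high) * 100:.1f}% less power")   # ~21.7%
# So "28% increase" and "22% reduction" describe the same change;
# "27% reduction" conflates the two directions.
```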
 
It's a 16% increase with a 28% increase in power, not a 28% reduction in power. 99th percentile frame times also improved by 20% with the higher power limit. I would say that it's highly usage dependent; for games that don't run into the card's power limit, a lower limit doesn't really hurt.
It's also a 28% increase in the maximum possible consumption, but the actual average power consumption will scale much more closely with the increase in performance.
 
It will just heat up those 3 cables more. Ask me how I know.

Was running some maps in POE 2 when the distinct smell of burning plastic filled the air. I was only running 3 cables because, when I first bought the card, I read that doing so would limit the card in my build to 450 W instead of 450 W+, and would keep it in spec.

450 watts with 3 PCIe 8-pins means 12.5 amps per connector.
150 watts, the theoretical limit of a single 8-pin, likewise means 12.5 amps.
450 watts over 4 PCIe 8-pins means 9.375 amps per connector.

Something doesn't add up in your experiment. Are you sure something isn't wrong in your case?
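The same arithmetic as a small Python helper, assuming all the current is carried on the 12 V rail as the PCIe spec intends:

```python
def amps_per_connector(watts: float, cables: int, volts: float = 12.0) -> float:
    """Current per 8-pin connector for board power split evenly across cables."""
    return watts / cables / volts

print(amps_per_connector(450, 3))  # 12.5 A, right at the 150 W/cable ceiling
print(amps_per_connector(450, 4))  # 9.375 A, a comfortable margin
```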

So much for all those "but you can undervolt it" arguments.

That's an ugly artificial software limitation, not a case of the hardware being incapable of supporting it. Blame Nvidia, vote with your wallet, and don't buy.
 
Clearly they had to push it that far in power to make it worth buying. Otherwise it would have been so close to the 4080 Super that it probably would have hurt sales in the long run (I guarantee it would have sold out immediately regardless).

We really are not going to see much this generation overall, even if a 5080 Ti comes out. It's really going to be 6XXX that makes a much more major difference all around.
 
That's an ugly artificial software limitation, not a case of the hardware being incapable of supporting it. Blame Nvidia, vote with your wallet, and don't buy.
Well, I was never looking for a $1000+ graphics card in the first place, so I think I'm fine. :)

It's really going to be 6XXX that makes a much more major difference all around.
Don't bet on that, yet.
 
Clearly they had to push it that far in power to make it worth buying. Otherwise it would have been so close to the 4080 Super that it probably would have hurt sales in the long run (I guarantee it would have sold out immediately regardless).

We really are not going to see much this generation overall, even if a 5080 Ti comes out. It's really going to be 6XXX that makes a much more major difference all around.

I bet Nvidia will charge first with a super-duper RTX 5000 lineup, and then, if we're lucky, will release a "new architecture" on the same old 4 nm TSMC node, because 2 nm and 3 nm will remain prohibitively expensive, and because Nvidia doesn't want to sell graphics cards any longer.
 
Maybe the adapter isn't just wires but contains some electronics, and sends a "600W available" signal through the sensor pins only if all four 8-pin cables are connected.
 
Well, I was never looking for a $1000+ graphics card in the first place, so I think I'm fine. :)


Don't bet on that, yet.
You could be right; they do seem more focused on software, and that could be their future.
I bet Nvidia will charge first with a super-duper RTX 5000 lineup, and then, if we're lucky, will release a "new architecture" on the same old 4 nm TSMC node, because 2 nm and 3 nm will remain prohibitively expensive, and because Nvidia doesn't want to sell graphics cards any longer.
Oh, I don't think they hate selling cards; they just want to focus on stuff that is more proprietary and make it part of the cards they sell.
 
Wait, aren't these cards pulling 75 W from the PCIe slot, as the standard allows them to?
 
It's also a 28% increase in the maximum possible consumption, but the actual average power consumption will scale much more closely with the increase in performance.
Well, they didn't measure average power draw under the lower limit, so we can only rely on how silicon typically scales with increased power. Average power consumption will increase by more than the increase in performance. Note that the example I gave was a game that was running close to the power limit at stock, so limiting it to 450 W resulted in a significant decrease in performance.
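To illustrate why, here is a toy DVFS model, with the assumption that dynamic power scales roughly as f·V² and that voltage must rise roughly linearly with frequency near the top of the curve. Real Blackwell voltage/frequency curves will differ, so treat the slope here as an arbitrary illustration:

```python
# Toy model: performance ~ frequency, dynamic power ~ f * V^2.
def relative_power(perf_gain: float, v_slope: float = 0.5) -> float:
    f = 1.0 + perf_gain            # clock needed for the performance gain
    v = 1.0 + v_slope * perf_gain  # assumed voltage bump for that clock
    return f * v * v               # power relative to the baseline of 1.0

print(relative_power(0.16))  # ~1.35x power for a 16% performance gain
```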
 
Something doesn't add up in your experiment. Are you sure something isn't wrong in your case?

You are right, we do not know all the details.

Some power supplies have Y-cables: two graphics card connectors on the same cable, which runs to a single connector on the power supply unit. It looks like the letter Y.

Regardless, I feel sorry for anyone who gets their expensive hardware damaged by an unfinished product.

From what I saw a few weeks ago, I think those sense pins just check whether 12 V DC is present on 5 different pins. I wouldn't call them sensor pins; most likely they just check whether there is a high signal there or not.

Gamers Nexus has equipment they put a lot of effort into calibrating; they can measure the PEG slot and the other wires in total.

From what I remember, those 5000-series graphics cards draw around 40 watts at idle. Just check the video first, please. Thank you. It does not matter whether it's 32, 40, or 35 watts; it's far off the 6 to 12 watts it should draw in idle mode.

At the end of the day, only the power consumption at the wall socket counts. That includes the efficiency curve of the power supply unit, which is rarely measured in the 1 to 75 watt output range.

--

Assuming that information is correct, you just need to pull two pins to ground for the 600 watts. That should not be difficult for an adapter cable.
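For reference, the sideband works roughly like this: the card reads two SENSE pins (each grounded or open) and derives its power budget. The 150/300/450/600 W tiers are from the published ATX 3.0 / PCIe CEM 5.0 materials; the exact pin-to-tier assignment in this sketch should be checked against the spec tables before wiring anything:

```python
# Hypothetical lookup of the 12VHPWR/12V-2x6 SENSE-pin encoding.
# Verify the pin-to-tier mapping against the actual spec before relying on it.
SENSE_TO_MAX_WATTS = {
    ("gnd",  "gnd"):  600,
    ("gnd",  "open"): 450,
    ("open", "gnd"):  300,
    ("open", "open"): 150,
}

def advertised_limit(sense0: str, sense1: str) -> int:
    """Maximum sustained power the PSU or adapter advertises to the card."""
    return SENSE_TO_MAX_WATTS[(sense0, sense1)]

# An adapter could ground both pins only when all four 8-pin cables are
# plugged in, which matches the behavior speculated about above.
print(advertised_limit("gnd", "gnd"))  # 600 W
```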
 
Regardless, I feel sorry for anyone who gets their expensive hardware damaged by an unfinished product.
It didn't damage the card; it just melted 2 of the cables... and melted a bit of the plastic on one of the 3 connectors on the PSU.

I went ahead and ordered a Corsair 12VHPWR cable, and she's been running great at 450 W.

It was just too much heat for those 3 8-pin cables, I guess? They melted pretty evenly... they may have been defective, or maybe there were some other issues, who knows... but running anything at the absolute theoretical max is never a great idea. There's very little room for error; the connector from the NVIDIA dongle to the PSU cable also melted and exuded some sort of clear resin.
 
That's an ugly artificial software limitation, not that the hardware is no capable to support it. Blame Nvidia, vote with your wallet and don't buy.
This is the most absurd pseudo-objection I've ever heard. The 2x8 configuration isn't simply a small undervolt; it's an attempt to supply the card only 2/3 of normal power. No one buys a $1000 video card to undervolt it to the point that it runs like a $500 card.
 
This is the most absurd pseudo-objection I've ever heard. The 2x8 configuration isn't simply a small undervolt; it's an attempt to supply the card only 2/3 of normal power. No one buys a $1000 video card to undervolt it to the point that it runs like a $500 card.
What 2/3 are you talking about?
This is a 5080 with a 360 W power limit, and 2x8-pin can supply 300 W + the 75 W from the PCIe slot, a total of 375 W.
Compared to a 5090 that requires 575 W and only gets 450 W + 75 W (525 W), the 5080 should be "easier" to run with 2x8-pin.

Clearly this is a limitation somewhere that could/should(?) have been avoided. Maybe avoiding it would require a more complex PCB/power-delivery subsystem, which the 5090 most likely has.
In all honesty, when you buy a $1000~1500+ GPU, you don't try to cheap out on power...

The 12VHPWR has a built-in sensing system for 450 W and 600 W options.


 