
NVIDIA GeForce RTX 5090 Runs on 3x8-Pin PCI Power Adapter, RTX 5080 Not Booting on 2x8-Pin Configuration

So how much power does a 3090 need? Because there are 2x 8-pin and 3x 8-pin models. Same with the 3080 and probably the 3070 Ti. There is no such thing as need; hardware doesn't have needs. They take as much power as the user decides to give them.
So can I give my 6750 XT just 150 Watts through a single 6-pin coming from a noname 300 W PSU and the PCI-e slot? :p
 
What 2/3 are you talking about?
This is a 5080 with a 360 W power limit, and 2x 8-pin can supply 300 W plus the 75 W from the PCI-e slot, a total of 375 W.
Compared to a 5090 that requires 575 W while only getting 450 W + 75 W (525 W), the 5080 should be "easier" to run on 2x 8-pin.
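To put rough numbers on that (a back-of-the-envelope sketch; the 75 W slot and 150 W per 8-pin figures are the usual spec limits, and the function is just mine for illustration):

```python
# Rough budget math for the 8-pin argument above.
# Assumes the usual spec limits: 75 W from the PEG slot, 150 W per 8-pin.
PEG_SLOT_W = 75
EIGHT_PIN_W = 150

def board_budget(num_8pin: int) -> int:
    """Total spec-compliant power available to the card."""
    return PEG_SLOT_W + num_8pin * EIGHT_PIN_W

print(board_budget(2))  # 375 W -- covers a 360 W RTX 5080 on paper
print(board_budget(3))  # 525 W -- short of the 575 W the RTX 5090 wants
```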

Clearly this is a limitation somewhere that could/should(?) have been avoided. Maybe avoiding it would require a more complex PCB/power-delivery subsystem, which the 5090 most likely has.
In all honesty, when you buy a $1,000~1,500+ GPU, you don't try to cheap out on power...

The 12VHPWR connector has a built-in sensing system with 450 W and 600 W options.


Well, maybe the card just doesn't look at slot power and 'wants' all of it from the other end? Is that possible? We've seen in the past that it's not just 'oh, this gives me juice, I'll just take it', as we've also seen cards pull too much from the slot.

I can see why; 450 W+ is starting to get crazy. I'm getting comfier in the 300 W range though :twitch:

2 W I'd say seems too low; maybe 10-40 W seems more reasonable if it really wants all its power through the plug-in cables.
300 W was always my hard limit and it hasn't changed, with 250 W being optimal to me in a regular system. If you go bigger, you need to worry a lot more about case cooling and case choice, but also PSU sizing, etc. It's a straight-up price increase for almost the entire system. And those systems also tend to be louder.
 
So can I give my 6750 XT just 150 Watts through a single 6-pin coming from a noname 300 W PSU and the PCI-e slot? :p
If you limit it to 150 W, sure, why not. I've used my 4090 on a 650 W PSU.
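For what it's worth, capping the card before putting it on a small PSU is a one-liner; here's a rough sketch (assumes nvidia-smi is on the PATH, you have admin/root rights, and the driver accepts the requested limit, since it clamps to a min/max range):

```python
# Sketch: capping board power from software before running a big card
# on a small PSU.
import subprocess

def set_power_limit(watts: int) -> None:
    # `nvidia-smi -pl <W>` sets the software power limit (needs admin/root).
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

set_power_limit(150)
```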
 
Heh, I'd like to see someone try.


650 W is enough for a 4090 as long as you don't have an unlocked i9 CPU and ten HDDs, pumps, fans and such.
If 650 is enough then what are we even talking about?
 
If 650 is enough then what are we even talking about?
This:
There is no such thing as need, hardware doesn't have needs. They take as much power as the user decides to give them.
Obviously hardware has needs. You need a certain voltage to maintain certain clock speeds, which means power is being used. There's also no power-saving limit in effect the moment you press the power button.

The only hardware that doesn't have needs is the one sitting on your shelf.
 
If you can run the until-recently highest-end GPU that costs over $2k on a 650 W PSU, then it's a non-issue.
 
I don't know what you mean by "argument"? Does undervolting not work when using software?

I think you misunderstood the basics.

In the direct-current world, wattage equals voltage multiplied by current, or current squared multiplied by resistance, or voltage squared divided by resistance: P = V × I = I² × R = V² / R. Basics.
(Blame this forum for not having proper software to render basic mathematical formulas from the 3rd school year.)
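Since the forum can't render formulas, here's the same identity as a quick script, with made-up numbers (V = 12 V, R = 0.5 Ω, purely illustrative):

```python
# The three equivalent DC power formulas from the post above,
# cross-checked against each other with illustrative numbers.
V = 12.0          # volts
R = 0.5           # ohms
I = V / R         # amps, from Ohm's law (24.0 A)

print(V * I)      # P = V * I     -> 288.0 W
print(I * I * R)  # P = I^2 * R   -> 288.0 W
print(V * V / R)  # P = V^2 / R   -> 288.0 W
```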

Auswolf stated that the card will not boot without all the proper connectors attached during bootup.

You need to boot the hardware first.
Then you can make software hacks and tweaks.
When the hardware does not boot with fewer connectors, you cannot undervolt.

I hope this is clear now.


Remember:

It's like the OSI layered model (that is very important; you may want to read and learn it, please).

Physical layer = hardware first; missing connector = nope.
Much later comes the software.
Much later comes the application.

edit: Don't be angry. These are basics in electronics, physics, and mathematics. The other stuff is basics from my education. Nothing new.

edit: "Get another power supply unit" is not an argument. When the card takes fewer watts, an older power supply could be a better fit. The real reason is those power spikes, which are insane and which previous cards most likely did not generate. See igorslab; sometimes he is right, sometimes there is room for improvement. Why should someone need a new power supply unit when the older card had similar wattage to the newer card? The only reason is those power spikes.
 
I wouldn't expect a card with 2x8-pin to work with only one connected for example, even if a power target was set needing less than 225w.

In my point of view it should be possible.

The connector can distinguish between 4 different wattage modes for the 600 W connector. The card should boot.

Check the link in post #22: the card should even boot with 100 W, the initial permitted power at system power-up, and it should run fine with 150 W. Any questions?
Assuming that spec in #22 is correct, Nvidia has either a design error or a software error on that particular graphics card board.

Sense0 | Sense1 | Initial Permitted Power at System Power-Up | Maximum Sustained Power after Software Configuration
Gnd    | Gnd    | 375 W | 600 W
Open   | Gnd    | 225 W | 450 W
Gnd    | Open   | 150 W | 300 W
Open   | Open   | 100 W | 150 W
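The same table as a lookup, roughly how firmware would conceptually decode the sense pins (the wattages come straight from the table above; the function name and structure are my own):

```python
# Sense-pin decode sketch for the 600 W connector.
SENSE_TABLE = {
    # (sense0, sense1): (initial W at power-up, max sustained W after config)
    ("gnd",  "gnd"):  (375, 600),
    ("open", "gnd"):  (225, 450),
    ("gnd",  "open"): (150, 300),
    ("open", "open"): (100, 150),
}

def permitted_power(sense0: str, sense1: str) -> tuple[int, int]:
    return SENSE_TABLE[(sense0, sense1)]

print(permitted_power("open", "open"))  # (100, 150) -- worst case still boots
```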

When you reply, please provide datasheets for all components involved, especially for the controller ICs, full schematics for a recent power supply unit, the ATX 3.1 spec, and full schematics for those cables and adapters.

Naturally, cards with external power connectors don't use the PCI-e slot to its full 75 W specification, but to say they use 2 W is a bit daft.

I doubt the wattage is limited over the PEG slot. I forgot which card it was; I think one card even drew over 125 watts over the PEG slot.

Anyway, Nvidia does not contribute to open source software; see the recent Gamers Nexus video.

I invite AMD, Nvidia, Intel, the power supply companies, and the others: please publish the full specifications, schematics, datasheets, and register and programming sheets for all components being used in PC parts.

Well, it's easy to sell garbage when you do not publish the full specifications and no one can check whether the firmware, software, and hardware are working according to the specifications. Check e.g. the Linux kernel (and it is only the kernel!) for all the workarounds for hardware and firmware bugs. A generic statement.

I really want to be able to find, in less than 2 minutes, the full Nvidia graphics card connector specs on the Nvidia homepage in English. That includes datasheets, application notes, and the other common documents needed to design a device.
 
I think you misunderstood the basics.

In my point of view it should be possible.
I'm thinking it's you who's misunderstood the basics. I'm not providing you anything. It's reasonable to expect hardware to be connected as the manufacturer intended. I have zero intention of proving or justifying anything to you, as if you're some gatekeeper whose approval decides whether what I've said is valid. The hubris.
 
No. As a rule Nvidia only pulls about 2W from the PCIe slot.

I wrote on igorslab for ages that I want to see proper measurements.

I only know that Gamers Nexus measures the PEG slot and all the cables with internally checked and externally calibrated measurement equipment.

I doubt that 12 watts on the PEG slot is really 2 watts. I doubt the Windows software reads these values out correctly.

source:

NVIDIA GeForce RTX 5090 Founders Edition Review & Benchmarks: Gaming, Thermals, & Power


[screenshot: power measurement from the review]


That measurement is plausible.
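For context, this is roughly how those software readouts are obtained; a minimal sketch using the pynvml bindings (assuming GPU index 0 and that the nvidia-ml-py package is installed). It reads whatever the driver reports, which is exactly what external shunt rigs like Gamers Nexus's exist to double-check:

```python
# Read the driver-reported board power via NVML (returned in milliwatts).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0
print(f"driver-reported board power: {watts:.1f} W")
pynvml.nvmlShutdown()
```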

I also wrote about those CPU tests in the past. I want to see the wattage for every CPU pin, not the whole-mainboard nonsense, like igorslab did in the past, for example. Igor cannot differentiate between CPU-only consumption and CPU + mainboard + mainboard peripherals + RAM consumption + whatever else may be on the mainboard that I forget right now.
 
That's completely wrong. The card not starting because a cable is missing has nothing to do with power draw or undervolting. There are 3080 models with 2x 8-pin and 3x 8-pin. They all draw the same power (locked to 360 W), but the 3x 8-pin ones don't start with only 2 cables. That has nothing to do with them requiring more power; it's just the way they were designed.

We'll see what happens with the 3x 8-pin 9070 XT, I guess. By some people's thinking, it won't be able to be undervolted.

Nvidia does not contribute to open source software - see recent gamers nexus video.

You should get your information from better sources than techtubers that have their own agendas.

 
This is nonsense, the cards must boot with close to idle power settings, not at maximum peak clocks, etc...
Right, you mean like your PC booting up with an idle CPU? OH wait

Let's reflect a little bit on us talking about how a card should behave power-wise when it's one of the rare cards with odd behaviour and a... oh! 600 W power allowance! Gosh
 
You should get your information from better sources than techtubers that have their own agendas.

Just - don't

I had an Nvidia 9800M GTS (Asus G70SG) with a "BAR" error in the firmware. I patched every kernel source I used with that graphics card by hand, for years. It was several years, not just weeks or months, because the card had a hardware/firmware error with the binary nvidia drivers.
I had an Nvidia GTX 660M (Asus G75VW); barely any open source stuff worked.
In 2023 I tested the Windows 10 Pro and GNU/Linux driver state again with a second-hand MSI GTX 960 4 GB card, which I used for several months. That is one of the reasons why I had a Radeon 6600 XT before, bought a Radeon 6800 (non-XT) after that, and later a Radeon 7800 XT. I'm one of the few users who did not test Intel but re-evaluated the other GPU manufacturers for driver quality in a daily Windows 10 Pro and Linux userspace scenario. I was also interested in the 4 GB VRAM and how slow that card really is.

Just do not tell stories that nobody believes. I used notebooks with Nvidia graphics with the same GNU/Linux installation for a very long time.

Years ago I checked this page; no progress, not even the bare minimum works.
 