
NVIDIA RTX 5090 "Blackwell" Could Feature Two 16-pin Power Connectors

btarunr

Editor & Senior Moderator
NVIDIA CEO Jensen Huang never misses an opportunity to remind us that Moore's Law is cooked, and that future generations of logic hardware will only get larger and hotter, or hungrier for power. NVIDIA's next generation "Blackwell" graphics architecture promises to bring certain architecture-level performance/Watt improvements, coupled with the node-level performance/Watt improvements from the switch to the TSMC 4NP (4 nm-class) node. Even so, the GeForce RTX 5090, or the part that succeeds the current RTX 4090, will be a power hungry GPU, with rumors suggesting the need for two 16-pin power inputs.

TweakTown reports that the RTX 5090 could come with two 16-pin power connectors, which would give the card the theoretical ability to pull 1200 W (continuous). This doesn't mean the GPU's total graphics power (TGP) is 1200 W, but it suggests a number close to or greater than 600 W, which calls for two of these connectors. Even if the TGP is exactly 600 W, NVIDIA would want to deploy two inputs to spread the load between two connectors and improve the physical resilience of each. It's likely that both connectors will have 600 W input capability, so end-users don't mix up the connectors should one of them be 600 W and the other keyed to 150 W or 300 W.
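For context, the headline numbers here are simple arithmetic on the 12V-2x6 connector (600 W per connector at 12 V across six current-carrying pin pairs). A quick sketch of the per-pin load, where the per-pin figures are rough averages for illustration rather than official ratings:

```python
# Rough per-connector math for a 12V-2x6 (600 W) input.
# Assumes: 12 V rail, 6 current-carrying 12 V pins per connector.
RAIL_VOLTAGE = 12.0        # volts
PINS_PER_CONNECTOR = 6     # current-carrying 12 V pins

def per_pin_current(connector_watts: float) -> float:
    """Average current through each 12 V pin for a given connector load."""
    total_amps = connector_watts / RAIL_VOLTAGE
    return total_amps / PINS_PER_CONNECTOR

# One connector at its full 600 W rating:
print(round(per_pin_current(600), 2))   # 8.33 A per pin
# Two connectors splitting a 600 W TGP evenly:
print(round(per_pin_current(300), 2))   # 4.17 A per pin
```

This is why two connectors improve resilience even at a 600 W TGP: splitting the load halves the average current through every pin.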



Above is a quick Photoshop job by TweakTown of what such a card could look like. The requirement of two 16-pin connectors should rule out older PSU types, and NVIDIA will likely include only one adapter that converts two or three 8-pin PCIe power connectors to a 16-pin, with the other input expected to be a native 600 W input from an ATX 3.0 or ATX 3.1 PSU. Most newer-generation ATX 3.0 or ATX 3.1 PSUs on the market have only one native 16-pin connector, plus three or four additional 8-pin PCIe power connectors. As for the connector itself, this could very likely be a 12V-2x6 with compatibility for 12VHPWR.

Some PSU manufacturers are beginning to release high-wattage models with two native 12V-2x6 connectors. These typically have a capacity of over 1300 W. The Seasonic Prime PX-2200 (2200 W), released earlier this week, is an extreme example of this trend. Besides its high wattage, this PSU puts out as many as four 12V-2x6 connectors. Another recent example is the MSI MEG AI1600T PCIE5 (1600 W), with two native 600 W 12V-2x6 connectors.

View at TechPowerUp Main Site | Source
 
Well, let's hope it's only for the OC models. I'll stick with a single 12VHPWR cable.
 
And there really is no proper source or even a rumor behind this. It's a deduction from:
1. The rumor that the 5090 is a 550 W card.
2. Big power supplies coming with 2 or more 12V2x6 cables.
 
I'll wait to see if it's true before jumping on the inevitable hate/clowning train. Personally, I don't see the 5090 wanting for 600 W, let alone more, at stock. So even if some cards carry two connectors (and let's face it, some might, purely to chase world OC/benchmark records), if the 5090 wants 550 W or less (my money is on 450-500 W stock), many if not most would come with a single connector.
 
Maybe there is someone with sense at Nvidia after all.
 
Better than actually trying to pull, say, 550W through a connector "rated for 600W," but known to burn out with rather less.

What comes next?
 
What kind of PSU do you need, a minimum of 1600 watts? And a case with 7 expansion slots, since the card itself needs 7 expansion slots for the cooler? And cases can be vertical so you don't break your PCIe o_O
 
What kind of PSU do you need, a minimum of 1600 watts? And a case with 7 expansion slots, since the card itself needs 7 expansion slots for the cooler? And cases can be vertical so you don't break your PCIe o_O
Put it in a workstation and you need a redundant 2 kW or higher PSU, since there's a chance the user will be running more than one GPU for their work. If it's Intel Xeon-W, then you really need 2.5 kW or higher PSUs.
 
So, those new motherboards with an extra 6-pin power connector just for GPU PCIe slot power, pushing the maximum slot delivery from 75 W to 220 W...

Is there a correlation?
 
Last edited:
NVIDIA CEO Jensen Huang never misses an opportunity to remind us that Moore's Law is cooked...
And here I thought Moore's Law was baked. Silly me.

Apart from that, two 16-pin connectors seem insane.
 
I'll wait to see if it's true before jumping on the invariable hate/clowning on it train. Personally I don't see the 5090 wanting for 600w let alone more as stock, so even if some cards carry two connectors (and let face it, some might purely to chase world OC / benchmark records), if the 5090 wants for 550w or less (my money is on 450-500w stock), many if not most would come with a single connector.
It won't be long before Nvidia reveals Blackwell gaming GPUs, but if the rumoured specs are correct, it does sound like a very power-hungry card: more CUDA cores, a 512-bit memory bus (which likely won't be fully enabled on retail-level GPUs), GDDR7, and the same TSMC node. Nvidia can reduce clock speeds to keep power draw in check, but that will also impact performance. So it all boils down to how hard they want to push a product that has no competition.
 
Man! That's a badly Photoshopped image to show how it could look, lol
 
I'm curious when this high-wattage 12 V craziness will come to an end and the logical era of 24 V will begin.
 
This won't even be aimed at gamers.

I predict they will compare it to Titan lineup, show all the AI productivity you could do with it for the fraction of the cost of dedicated AI accelerators, and then reveal some utterly ridiculous price like $7499, and call it a bargain!
 
Previously, Huang: "The more you buy, the more you save."
Huang, now: "The hungrier the GPU, the more energy efficient it is."
 
"NVIDIA CEO Jensen Huang never misses an opportunity to remind us that Moore's Law is cooked, and that future generations of logic hardware will only get larger and hotter, or hungrier for power."

In other words he doesn't give a rat's about efficiency or our power bills as long as they sell and he profits.
 
I'm curious when this high-wattage 12 V craziness will come to an end and the logical era of 24 V will begin.
Why? To move the conversion out of the cheap PSU, which you'll most likely use for 10+ years, and into the expensive GPU, making it more expensive and easier to break? And on top of that, the GPU will surely be replaced more often than the PSU.
 
"NVIDIA CEO Jensen Huang never misses an opportunity to remind us that Moore's Law is cooked, and that future generations of logic hardware will only get larger and hotter, or hungrier for power."

In other words he doesn't give a rat's about efficiency or our power bills as long as they sell and he profits.
Frankly speaking, it's not Jensen's fault. If you can overvolt a piece of silicon, you will do it for sure, won't you? At least as long as you need expensive EUV tools to create that expensive piece of silicon. In that case, power efficiency is a clear victim.
 
Why? To move the conversion out of the cheap PSU, which you'll most likely use for 10+ years, and into the expensive GPU, making it more expensive and easier to break? And on top of that, the GPU will surely be replaced more often than the PSU.
Arguably, bumping up the voltage worked for the HVDC folks, and for the automotive industry, which tried to go straight to 48 V for a lot of the same reasons.

The way things are going, it might be necessary soon enough.
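The argument for higher rail voltages comes down to conduction loss: for a fixed delivered power, resistive loss in cables and connectors scales with the square of the current (P_loss = I²R). A quick illustrative sketch, where the 20 mΩ round-trip cable resistance is an assumed round number, not a measured figure:

```python
# Illustrative I^2 * R loss for delivering 600 W at different rail voltages.
# CABLE_RESISTANCE is an assumed round-trip figure for illustration only.
CABLE_RESISTANCE = 0.020  # ohms (assumed)

def conduction_loss(power_watts: float, rail_volts: float) -> float:
    """Resistive loss dissipated in the cable for a given delivered power."""
    current = power_watts / rail_volts
    return current ** 2 * CABLE_RESISTANCE

for volts in (12, 24, 48):
    print(volts, "V:", round(conduction_loss(600, volts), 1), "W lost")
# 12 V: 50.0 W lost, 24 V: 12.5 W lost, 48 V: 3.1 W lost
```

Doubling the voltage quarters the loss, which is exactly the trade-off the 24 V (or automotive 48 V) camp is pointing at.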
 
I see the progressively increasing peak power requirements mainly as a way of justifying higher prices through indirect means (I'm not sure how many people realize what it means to have a 600 W heater inside the case) and of making multi-GPU setups as cumbersome as possible (they'd really prefer that people buy their datacenter systems for compute needs; too bad not just anybody can have them).
 
Isn't this the same crap that flew around last cycle? Like a 900 W TDP and needing two power connectors...

Sounds like BS to me, meant to sell whales big fat PSUs.
 
Just put 8x 8-pin on it. I mean, the GPU is half a PC case in size at this point anyway.

Alternatively, just standardize it:



Well, let's hope it's only for the OC models. I'll stick with a single 12VHPWR cable.
That's clearly user error when the 5090 wants 2x :)
 