
NVIDIA RTX 5090 "Blackwell" Could Feature Two 16-pin Power Connectors

Why? To move the conversion away from the cheap PSU that you will most likely use for 10+ years and onto the expensive GPU, making it more expensive and easier to break, and on top of that the GPU will surely be replaced more often than the PSU.
The reason is obvious - a more power-hungry GPU fits well with more power supply phases, and that fits well with a higher input voltage.
Look how many phases we have on decent mobos these days.
The same trend will come to power-hungry add-in boards sooner or later imho, like GPU or NPU discrete accelerators.

Demanding and aware users will prefer a silent, well-designed power section over one that's a couple of bucks cheaper but noisy and less reliable.
Especially when they have to spend a grand or more to get the card.
 
"Some PSU manufacturers are beginning to release high-Wattage models with two native 12V-2x6 connectors. These would typically have a Wattage of over 1300 W. The Seasonic Prime PX-2200 W, released earlier this week, is an extreme example of this trend. Besides its high Wattage, this PSU puts out as many as four 12V-2x6 connectors. Another recent example would be the MSI MEG AI1600T PCIE5 (1600 W), with two native 600 W 12V-2x6."

If a PC gamer uses one of those 1300-2200 W PSUs just for an i9 14th/15th gen or R9 9000/10000 series CPU with an RTX 5090 48 GB, how much power should they plan for, and what type of UPS can handle all the power from that PSU?
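As a rough sketch of how that sizing usually works: the PSU rating is a ceiling, not the actual draw, so a UPS is sized from the estimated load. All the wattage figures below are illustrative assumptions, not measurements.

```python
# Rough UPS sizing sketch. The PSU rating is a ceiling, not the actual
# draw, so size the UPS from the estimated load. All component wattages
# below are illustrative assumptions, not measurements.

def estimate_ups_va(component_watts, power_factor=0.9, headroom=1.25):
    """Minimum UPS VA for a load, with 25% margin and an assumed
    power factor typical of active-PFC PSUs."""
    total_w = sum(component_watts.values())
    return total_w * headroom / power_factor

# Hypothetical high-end build with a rumored 600 W RTX 5090.
build = {"cpu": 250, "gpu": 600, "rest": 150}

print(f"Estimated load: {sum(build.values())} W")
print(f"Suggested UPS:  ~{estimate_ups_va(build):.0f} VA")
```

So even behind a 2200 W PSU, a build like this would only ask a UPS for roughly 1400 VA.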
 
Real RTX GPU connector.

(image attachment)
 
Actually there is a water-cooled version of the 4090 that has two 16-pin connectors, if I am not mistaken.
 
The reason is obvious - a more power-hungry GPU fits well with more power supply phases, and that fits well with a higher input voltage.
Look how many phases we have on decent mobos these days.
The same trend will come to power-hungry add-in boards sooner or later imho, like GPU or NPU discrete accelerators.

Demanding and aware users will prefer a silent, well-designed power section over one that's a couple of bucks cheaper but noisy and less reliable.
Especially when they have to spend a grand or more to get the card.
The GPUs already have as many power phases as the motherboard, and no, 24V won't make anything less noisy, because you will need to convert the power anyway. The GPUs already have conversion from 12V; switching to 24V will make it much bigger and more expensive, so you don't gain anything.
Are you so spoiled that you want a single power connector to the GPU? Where were you the past 15 years, when almost every x60-class card was equipped with two 8-pin connectors and the high-end cards sometimes came with 3x8, sometimes even 4x8? The 16-pin connector is smaller than a single 8-pin power connector, so what is the problem?
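For context, the per-pin current is easy to compare with a back-of-envelope calculation, assuming the load shares evenly across the supply pins:

```python
# Back-of-envelope per-pin current at 12 V, assuming the load is
# shared evenly across the +12V supply pins of each connector.

connectors = {
    # name: (rated watts, number of +12V supply pins)
    "8-pin PCIe": (150, 3),
    "12V-2x6":    (600, 6),
}

for name, (watts, pins) in connectors.items():
    amps = watts / 12.0
    print(f"{name:10s}: {amps:5.1f} A total, {amps / pins:4.1f} A per pin")
```

The smaller 16-pin part runs each pin roughly twice as hard as an 8-pin (about 8 A vs 4 A), which is why contact quality matters so much more on it.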
 
The power requirements are crazy. I hope that the card is going to be a monster of a card when it comes to performance.
 
Arguably, bumping up the voltage worked for the HVDC folks, and the automotive industry went straight to 48V for a lot of the same reasons.

The way things are going, it might be necessary soon enough.
I see I'm not alone here at least.
 
Didn't Nvidia say the reason for these was to avoid multiple connectors? They need to get the power draw in check.

2030

"Nvidia have just announced their upcoming 7090 GPU, comes with 64pin connector to mean no more multiple cables, and even comes with its own 3000W PSU to ensure it has the power it needs. All for the great price $4000, The cooler has also been beefed to a 5 slot solution".

PSU vendors love Nvidia right now.
 
They can come with a dedicated power plant for all I care. What I'm looking for is a decent mid-range. Haven't seen that in years.
 
Actually it will make the GPUs much safer, because even if one connector fails, the other will take the load. The reason for burning connectors is bad contact, which forces all the power through a few pins, and they heat up.
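That failure mode is easy to put numbers on: dissipation per contact scales with the square of the current. A sketch, with an assumed illustrative contact resistance:

```python
# Heat concentration on a 600 W 12V-2x6 as contacts drop out.
# contact_r is an assumed, illustrative contact resistance per pin.

RATED_W, VOLTS, TOTAL_PINS = 600, 12.0, 6
contact_r = 0.005  # ohms per mated pin (illustrative assumption)

total_amps = RATED_W / VOLTS  # 50 A across the whole connector

for good_pins in (TOTAL_PINS, 4, 2):
    amps_per_pin = total_amps / good_pins
    heat_w = amps_per_pin**2 * contact_r  # I^2 R heating per contact
    print(f"{good_pins} good pins: {amps_per_pin:5.1f} A/pin, "
          f"{heat_w:.2f} W per contact")
```

Halving the number of good contacts quadruples the heat in each one, while a second connector halves the baseline current before anything even goes wrong.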
 
The GPUs already have as many power phases as the motherboard, and no, 24V won't make anything less noisy, because you will need to convert the power anyway. The GPUs already have conversion from 12V; switching to 24V will make it much bigger and more expensive, so you don't gain anything.
Are you so spoiled that you want a single power connector to the GPU? Where were you the past 15 years, when almost every x60-class card was equipped with two 8-pin connectors and the high-end cards sometimes came with 3x8, sometimes even 4x8? The 16-pin connector is smaller than a single 8-pin power connector, so what is the problem?
24V alone won't make anything less noisy - I agree - but coupled with a good design culture it can lead to a silent, reliable and efficient power section.

If you still can't imagine why 12V doesn't help achieve these goals, think about what would happen if you tried to supply GPU cards with 5V like in the vintage days.
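The cable-side arithmetic behind that point, as a sketch (the round-trip resistance is an illustrative assumption):

```python
# Cable-side current and resistive loss for a fixed 600 W GPU budget
# at different rail voltages. cable_r is an illustrative assumption.

BUDGET_W = 600.0
cable_r = 0.01  # ohms of round-trip cable + connector resistance (assumed)

for volts in (5.0, 12.0, 24.0):
    amps = BUDGET_W / volts
    loss_w = amps**2 * cable_r  # I^2 R loss in the cabling
    print(f"{volts:4.0f} V rail: {amps:6.1f} A, ~{loss_w:5.1f} W lost in cabling")
```

At 5 V a 600 W card would need 120 A; at 24 V it needs only 25 A, and the resistive losses shrink with the square of the current.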
 
24V alone won't make anything less noisy - I agree - but coupled with a good design culture it can lead to a silent, reliable and efficient power section.

If you still can't imagine why 12V doesn't help achieve these goals, think about what would happen if you tried to supply GPU cards with 5V like in the vintage days.
You have the SAME power conversion no matter if it is in the GPU or the PSU, and if you hear noises from your GPU now, what will happen when you have even more power phases and heavy electrical components? I will tell you - an even noisier GPU. If they are using trash components now, they won't use better ones with 24V. I trust the PSU manufacturers much more, and I can spend a few more bucks to get a better power supply that will be quiet, instead of spending more time and more money on a top halo GPU to get better power delivery.
By the way, I have never heard noises from the electronics of a GPU, you know, and even if there are any, when the GPU is loaded the fans are at high RPM and mute them. What trashy GPU do you have to hear such noises?

Edit: One more thing: currently the PSUs have 3.3, 5 and 12V outputs. Do you want to know what will happen when you add a 24V output? They will become more expensive and more complex than now, and your favorite - noisier.
 
Nvidia invents new connector that would eliminate the need for multiple power cables and then plans to release a GPU that needs multiple power cables.
 
Actually it will make the GPUs much safer, because even if one connector fails, the other will take the load. The reason for burning connectors is bad contact, which forces all the power through a few pins, and they heat up.
It's just a meme.
 
Soon people will have to get a dedicated 20A/2400W circuit installed in their home just for their computer (15A/1800W is standard for residential in the US).

I guess if you can afford these top-end cards it wouldn't be a problem to get done though.
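Worth noting that those nameplate numbers overstate what you can draw all day: the NEC caps continuous loads at 80% of the breaker rating. A quick sketch:

```python
# Usable continuous power on common US residential circuits.
# The NEC limits continuous loads to 80% of the breaker rating.

VOLTS = 120
for breaker_amps in (15, 20):
    nameplate_w = breaker_amps * VOLTS
    continuous_w = nameplate_w * 0.8
    print(f"{breaker_amps} A circuit: {nameplate_w} W nameplate, "
          f"{continuous_w:.0f} W continuous")
```

So a standard 15 A circuit is really good for about 1440 W of continuous load, which a 1600-2200 W PSU could exceed if actually driven hard.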
 
Soon people will have to get a dedicated 20A/2400W circuit installed in their home just for their computer (15A/1800W is standard for residential in the US).

I guess if you can afford these top-end cards it wouldn't be a problem to get done though.

I already use my RTX 3090 with a 250W limit for several reasons, one of them being that my uninterruptible power supply begins beeping due to overload above 500W. Performance is only slightly affected by decreasing the limit from the default 370W. Der8auer also did some related testing with an RTX 4090:


(image attachment: der8auer RTX 4090 power scaling results)


I expect the same to happen with this supposedly 600+W RTX 5090. These high-end GPUs don't really need to use that much power for great performance; it is primarily commercial factors deciding that they just have to.
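For anyone curious about the arithmetic, here's a sketch of the tradeoff using the 370 W -> 250 W figures above; the performance loss is an illustrative assumption, not a measurement:

```python
# Perf-per-watt tradeoff of power limiting, using the RTX 3090 figures
# above (370 W default -> 250 W limit). The performance loss here is an
# illustrative assumption, not a measured number.

default_w, limited_w = 370, 250
assumed_perf_loss = 0.08  # ~8% slower at 250 W (assumption)

power_saved = 1 - limited_w / default_w
eff_gain = (1 - assumed_perf_loss) / (limited_w / default_w) - 1

print(f"Power saved:        {power_saved:.0%}")
print(f"Perf-per-watt gain: {eff_gain:.0%} (if perf drops {assumed_perf_loss:.0%})")
```

If the perf cost really is single digits, you trade roughly a third of the power for a third more efficiency.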
 
2030

"Nvidia have just announced their upcoming 7090 GPU, comes with 64pin connector to mean no more multiple cables, and even comes with its own 3000W PSU to ensure it has the power it needs".

"And the PSU cable is hard-mounted to the card to remove possible user errors!"
 
You have the SAME power conversion no matter if it is in the GPU or the PSU, and if you hear noises from your GPU now, what will happen when you have even more power phases and heavy electrical components? I will tell you - an even noisier GPU. If they are using trash components now, they won't use better ones with 24V. I trust the PSU manufacturers much more, and I can spend a few more bucks to get a better power supply that will be quiet, instead of spending more time and more money on a top halo GPU to get better power delivery.
By the way, I have never heard noises from the electronics of a GPU, you know, and even if there are any, when the GPU is loaded the fans are at high RPM and mute them. What trashy GPU do you have to hear such noises?
Now I see you have completely no idea what you are talking about.
It is not a matter of trust - it is a matter of design constraints.

Do you have the tiniest idea why a PMIC is mounted on each DDR5 module instead of in the power supply or on the mobo itself?

In short, the 12V solution is archaic* these days and still exists as a matter of industry inertia.

A higher input supply voltage will force designers to spread a higher power budget across more physically installed phases because of the higher conversion ratio.
Now we have 12V --> ~1V, so a 12:1 ratio. With a 24V supply we will have a 24:1 ratio instead.
For design culture it is a game changer.
But some folks like you are unable to get it, I see.

Edit:
*Archaic for 400W+ power budgets. For eGPU power budgets of 80-150W, the 12V voltage level is still fine and will be forever.
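To put a number on that ratio: in an ideal buck converter the duty cycle is D = Vout/Vin, so the on-time per switching cycle shrinks as the input voltage rises. A quick sketch (the ~1 V core voltage is an approximation):

```python
# Ideal buck-converter duty cycle D = Vout / Vin for a ~1 V GPU core rail.
# Real controllers deviate from the ideal, but the ratio is the point.

V_OUT = 1.0  # approximate GPU core voltage

for v_in in (5.0, 12.0, 24.0):
    duty = V_OUT / v_in
    print(f"Vin = {v_in:4.0f} V -> ratio {v_in / V_OUT:4.1f}:1, "
          f"ideal duty cycle {duty:.1%}")
```

At 24 V the ideal duty cycle is about 4%, half of the 12 V case, which is exactly the design constraint being argued about; whether that nets out quieter or cheaper is the disputed part.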
 
I was not disappointed by the jokes in this thread :D :D

Nvidia can make efficient cards. Their RTX A2000 was phenomenal and now the RTX A4000 SFF is a little marvel out there, like a 3060ti @70W, fantastic efficiency.

Clearly they know how to create these gems.

I guess it's just not worth the effort, plus there's the premium feeling of gaming on a huge gas-guzzler you can look at through the glass and feel better about the money spent :P
 
Now I see you have completely no idea what you are talking about.
It is not a matter of trust - it is a matter of design constraints.

Do you have the tiniest idea why a PMIC is mounted on each DDR5 module instead of in the power supply or on the mobo itself?

In short, the 12V solution is archaic these days and still exists as a matter of industry inertia.

A higher input supply voltage will force designers to spread a higher power budget across more physically installed phases because of the higher conversion ratio.
Now we have 12V --> ~1V, so a 12:1 ratio. With a 24V supply we will have a 24:1 ratio instead.
For design culture it is a game changer.
But some folks like you are unable to get it, I see.
Strange to hear this from someone so clueless that he can't answer basic questions, wants the GPUs to be more expensive because he "hears" electrical noises from his cheap GPU, and wants a single power connector. The problem is not in the GPUs and the 12V; the problem is in you.
 
I already use my RTX 3090 with a 250W limit for several reasons, one of them being that my uninterruptible power supply begins beeping due to overload above 500W. Performance is only slightly affected by decreasing the limit from the default 370W. Der8auer also did some related testing with an RTX 4090:


(image attachment: der8auer RTX 4090 power scaling results)

I expect the same to happen with this supposedly 600+W RTX 5090. These high-end GPUs don't really need to use that much power for great performance; it is primarily commercial factors deciding that they just have to.

You do know there is not that much of a performance difference between the RTX 4080 Super and RTX 4090 to just throw it away, right?

In the latest reviews it's:

14.6% at 1080p (192 vs 230 FPS)
22% at 1440p (148 vs 180.4)
27.8% at 4K (89.2 vs 114)

By "going green" you might be throwing away half the reason you spent 1750 EUR instead of 1000 EUR?
 
We get 75W from the PCIe slot, even more with newer motherboards (with that extra PCIe PSU input).
We get 600W from the 12V2x6.

How in hell are they going to use more than that in a consumer-grade GPU?
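For reference, the theoretical ceiling under the current spec is easy to add up:

```python
# Theoretical board-power ceiling under the PCIe spec: 75 W from the
# slot plus up to 600 W per 12V-2x6 connector.

SLOT_W, CONNECTOR_W = 75, 600

for n in (1, 2):
    print(f"{n} x 12V-2x6 + slot: {SLOT_W + n * CONNECTOR_W} W ceiling")
```

So two connectors plus the slot would allow up to 1275 W on paper, far beyond any rumored consumer TGP.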

If you are pumping that many watts, it's the Intel way: it means efficiency innovation isn't moving fast enough and you need to pump the wattage up to mark a clear improvement over Ada Lovelace.

In that case, IMHO, there's no point gaming on Blackwell and I'd wait for the RTX 6000; I'm tired of companies using 300W CPUs and up to 1200W GPUs because they want to use cheaper TSMC nodes that are more power hungry.

I sure hope it's either a Titan, a 5090 Ti or some OC AIB cards... I'm sticking with one 12V2x6 for my upgrade (I really need to replace that RTX 2060).
 