Monday, October 11th 2021

PCIe Gen5 "12VHPWR" Connector to Deliver Up to 600 Watts of Power for Next-Generation Graphics Cards

Upcoming graphics cards based on the PCIe Gen5 standard will use an interface with double the bandwidth of the Gen4 standard in use today, and with them comes a new power connector for the next generation of GPUs. According to information exclusive to Igor's Lab, the new connector is called 12VHPWR (12-volt high power) and carries as many as 16 pins: 12 deliver power, while the remaining four are sideband signal pins that coordinate the delivery. The connector is specified to carry as much as 600 Watts of power through its 12 power pins.

The new 12VHPWR connector should work exclusively with PCIe Gen5 graphics cards and will not be backward compatible with anything else. It is said to replace three standard 8-pin power connectors found on some high-end graphics cards and will likely result in power supply manufacturers adopting the new standard. The official PCI-SIG specification defines each pin as capable of sustaining up to 9.2 Amps, which across the six +12 V supply pins translates to a total of 55.2 Amps at 12 Volts. Theoretically, this works out to 662 Watts; however, Igor's Lab notes that the connector is limited to 600 Watts. Additionally, the 12VHPWR power pins sit on a 3.00 mm pitch, while the contacts in a legacy 2×3 (6-pin) or 2×4 (8-pin) connector lie on a larger 4.20 mm pitch.
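As a quick check on those figures, here is a minimal sketch of the arithmetic in Python, assuming the 55.2 A total comes from six +12 V supply pins (the other six power pins being ground returns):

    # Theoretical power budget of the 12VHPWR connector, from the figures above.
    # Assumption: six +12 V supply pins, each rated for up to 9.2 A.
    PINS_12V = 6
    AMPS_PER_PIN = 9.2
    VOLTS = 12.0

    total_amps = PINS_12V * AMPS_PER_PIN         # 55.2 A
    theoretical_watts = total_amps * VOLTS       # 662.4 W
    rated_watts = min(theoretical_watts, 600.0)  # the spec caps the connector at 600 W
    print(f"{total_amps:.1f} A -> {theoretical_watts:.1f} W theoretical, rated {rated_watts:.0f} W")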
Implementations of this connector already exist, and one comes from Amphenol ICC. The company has designed a 12VHPWR connector and lists it as ready for sale; you can check that out on the company website.
Source: Igor's Lab

93 Comments on PCIe Gen5 "12VHPWR" Connector to Deliver Up to 600 Watts of Power for Next-Generation Graphics Cards

#1
ZoneDymo
so this is not the same as that new standard Nvidia is trying to push?

and also....great....let's just embrace GPUs using more and more power....
#2
Metroid
This is good and bad at the same time.
#3
lynx29
All of it confuses me. So will we all need to buy new PSUs for these cards, or just use adapters?
#4
Ferrum Master
I really don't like the bottom section.

I cannot watch YouTube videos at work; what are the exact pinouts? The first datasheet is for the PCIE 2x, not the hybrid PEG12.
#5
Flanker
Burn mother****er burn
#6
nguyen
um...So I can power another PC with this cable? 2 for 1 baby :roll:
#7
InVasMani
This is like pouring gasoline on a fire.
#8
xenocide
ZoneDymo: so this is not the same as that new standard Nvidia is trying to push?

and also....great....let's just embrace GPUs using more and more power....
Either you get more powerful GPUs, or they're more energy efficient. Pick one, because you cannot do both realistically.
#9
Dammeron
ZoneDymo: so this is not the same as that new standard Nvidia is trying to push?

and also....great....let's just embrace GPUs using more and more power....
Pin sizing (3 mm spacing between centers) and shapes do resemble NVIDIA's 12-pin plug, so let's hope both are under the same standard.
#10
Richards
Any GPU with a power consumption of 500 watts should be banned.. you don't need that much performance
#11
londiste
ZoneDymo: so this is not the same as that new standard Nvidia is trying to push?
The 12-pin part is identical. The additional 4 pins for control are not, which means they might not be interchangeable :(
#12
ZoneDymo
Richards: Any GPU with a power consumption of 500 watts should be banned.. you don't need that much performance
You might need the performance; what you don't need/want is the consumption. We evolve in many areas by making products more efficient, and while GPUs are also getting more efficient, they also consistently use more power.
I am in favor of some law that puts a limit on the power consumption of such a product; let the manufacturers find other ways to squeeze out more performance.
#13
MentalAcetylide
Richards: Any GPU with a power consumption of 500 watts should be banned.. you don't need that much performance
Well unless there's some kind of breakthrough in GPU tech, there's really no other route for them to go besides "bigger" and "moar wattage".
#14
cst1992
My GTX 970 only consumes a max of 145 watts, despite being the monster that it is (for its time).
My 3060Ti is almost 4x as powerful as the 970, and STILL only consumes 200 watts maximum.
WHY THE HELL do we need individual 600W PCIe power connectors??!! For 1200W graphics cards? Or are there some monster mining chips coming that pack the hash rate of multiple 3090s that I don't know about?
Metroid: This is good and bad at the same time.
How the hell is it good?
nguyen: um...So I can power another PC with this cable? 2 for 1 baby :roll:
"New in stock - Power Supply Risers!"
InVasMani: This is like pouring gasoline on a fire.
Right now I'm feeling like standing in the midst of a fire and pouring gasoline on myself.
#16
lemoncarbonate
for the upcoming RTX 5090 Ti, 700W TDP!

Someday we will get a free external power brick in the GPU box.
xenocide: Either you get more powerful GPUs, or they're more energy efficient. Pick one, because you cannot do both realistically.
I remember Pascal was a leap in terms of power efficiency. GTX 1080 with only one 8-pin was a hot topic back then at launch.
Even 1080 Ti with 8+6 pin was considered very efficient.
#17
TheDeeGee
How on earth are you even going to cool a 600 Watt GPU... even a 360 Rad will struggle.

What is even going on... we're going backwards again.
#18
cst1992
lemoncarbonate: for the upcoming RTX 5090 Ti, 700W TDP!

Someday we will get a free external power brick in the GPU box.
Like the ones on a laptop? Except this one would literally be the size and weight of an actual brick.
#19
b4psm4m
I imagine its primary intention is for things like animation workstations or AI machines where you have many top-tier GPUs running in parallel; this would cut the number of cables from the PSU to the GPU quite considerably. Just because something like this will exist doesn't mean AMD/NVIDIA have to use it in the consumer space; obviously NVIDIA are pushing their own 12-pin connector atm, but at least that is backwards compatible with existing PSUs.

Also, I don't see why anyone would care that it is rated for up to 600W of load? The 3090 consumes up to 400W of power under load; realistically, it's nice to have a delivery system with a decent amount of overcapacity so that the wires/connector don't heat up and melt/cause a fire.
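To put rough numbers on that headroom point, here is a minimal sketch in Python, assuming the six +12 V supply pins and the 9.2 A per-pin rating quoted in the article (the 400 W figure is the commenter's):

    # Headroom check: a 400 W card on a 600 W-rated 12VHPWR connector.
    # Assumption: six +12 V supply pins, each rated for 9.2 A (per the article).
    CARD_WATTS = 400.0
    VOLTS = 12.0
    PINS_12V = 6
    PIN_RATING_AMPS = 9.2

    amps_per_pin = CARD_WATTS / VOLTS / PINS_12V    # ~5.6 A per pin
    utilization = amps_per_pin / PIN_RATING_AMPS    # ~60% of the per-pin rating
    print(f"{amps_per_pin:.1f} A per pin, {utilization:.0%} of rating")

Under those assumptions, a 400 W card loads each pin to roughly 60% of its rating, which is exactly the kind of margin the comment above argues for.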
#20
qubit
Overclocked quantum bit
Christ, that's enough to start a fire!
#21
lemoncarbonate
cst1992: Like the ones on a laptop? Except this one would literally be the size and weight of an actual brick.
In case you need extra power when your 1600W PSU is not sufficient for RTX 9090 Super Ti
#22
Khonjel
It is said to replace three standard 8-pin power connectors found on some high-end graphics cards and will likely result in power supply manufacturers adopting the new standard.
Why?! Why only the high-wattage ones, I mean. Replace the 6-pin with this as well. Are we at the dawn of two different PCI-SIG power connectors, 6-pin for low power and 12-pin for high power? I know there's 6-pin and 8-pin today as well, but 8-pins are mostly 6+2 pins, so they are compatible.
#23
b4psm4m
looniam: ick. why not the same gauge? - yeah whatever; this needs to die.
Not the same gauge, as the signal wires are not carrying any significant power.
#24
cst1992
b4psm4m: it's nice to have a delivery system with a decent amount of overcapacity so that the wires/connector don't heat up and melt/cause a fire.
It sets a bad precedent. It's 600W today. What next? 800W? 1000W?
lemoncarbonate: In case you need extra power when your 1600W PSU is not sufficient for RTX 9090 Super Ti
That thing is going to require a direct extension from the power meter and its own fuse!
#25
Bomby569
600W just for a GPU is insane