Thursday, July 16th 2020

The Curious Case of the 12-pin Power Connector: It's Real and Coming with NVIDIA Ampere GPUs

Over the past few days, we've heard chatter about a new 12-pin PCIe power connector for graphics cards being introduced, particularly from Chinese-language publication FCPowerUp, which included a picture of the connector itself. Igor's Lab also did an in-depth technical breakdown of the connector. TechPowerUp has new information on this from a well-placed industry source: the connector is real, and will be introduced with NVIDIA's next-generation "Ampere" graphics cards. The connector appears to be NVIDIA's brainchild, and not that of any other IP or trade group such as the PCI-SIG, Molex, or Intel. It was designed in response to two market realities: high-end graphics cards inevitably need two power connectors, and it is neater for consumers to deal with a single cable than to wrestle with two; while lower-end (<225 W) graphics cards can make do with a single 8-pin or 6-pin connector.

The new NVIDIA 12-pin connector has six 12 V pins and six ground pins. Its designers specify higher-quality contacts on both the male and female ends, which can handle more current than the pins on 8-pin/6-pin PCIe power connectors. Depending on the PSU vendor, the 12-pin connector can even split in the middle into two 6-pin halves, and could be marketed as "6+6 pin." The point of contact between the two 6-pin halves is kept level so they align seamlessly.
As for power delivery, we have learned that the designers will also specify the cable gauge; with the right combination of wire gauge and pins, the connector should be capable of delivering 600 watts of power (so it is not 2 × 75 W = 150 W, nor a simple scaling of the 6-pin rating). Igor's Lab published an investigative report yesterday with some numbers on cable gauge that help explain how the connector could deliver far more power than two common 6-pin PCIe connectors combined.
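The arithmetic behind that 600 W figure can be sketched with some back-of-the-envelope numbers. The per-pin current rating below is an assumption for illustration (the article only states that the new contacts handle more current than standard PCIe pins); roughly 8.33 A per 12 V pin would reach the quoted total:

```python
# Back-of-the-envelope check of the 12-pin connector's power budget.
# The per-pin current figure is an assumption, not from the article.

def connector_power(power_pins: int, amps_per_pin: float, volts: float = 12.0) -> float:
    """Deliverable power: live pins x current rating per pin x rail voltage."""
    return power_pins * amps_per_pin * volts

# NVIDIA 12-pin: six +12 V pins; ~8.33 A per pin reaches the quoted 600 W.
print(f"12-pin at 8.33 A/pin: {connector_power(6, 8.33):.0f} W")

# Two classic 8-pin PCIe connectors are spec-limited to 150 W each,
# a deliberately conservative rating versus raw pin capacity.
print(f"Two 8-pin connectors (spec): {2 * 150:.0f} W")
```

This also shows why the connector is not just "two 6-pins stuck together": the spec-limited total of two 8-pin connectors (300 W) is half of what the new contacts and heavier wire gauge are being specified for.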

Looking at the keying, we can see that it will not be possible to connect two classic 6-pins to it. For example, pin 1 is square on the PCIe 6-pin, but on NVIDIA's 12-pin it has one corner angled. It also won't be possible to use odd combinations like 8-pin + EPS 4-pin or similar; NVIDIA made sure people won't be able to connect their cables the wrong way.

On the topic of the connector's proliferation, in addition to PSU manufacturers launching new generations of products with 12-pin connectors, most prominent manufacturers are expected to release aftermarket modular cables that plug into their existing PSUs. Graphics card vendors will include ketchup-and-mustard adapters that convert 2x 8-pin to 1x 12-pin, while most case/power manufacturers will release fancier aftermarket adapters with better aesthetics.

Update 08:37 UTC: I made an image in Photoshop to show the new connector layout, keying, and voltage lines in a single, easy-to-understand graphic.
Sources: FCPowerUp (photo), Igor's Lab

175 Comments on The Curious Case of the 12-pin Power Connector: It's Real and Coming with NVIDIA Ampere GPUs

#76
BoboOOZ
True, but Amazon is not a manufacturer, it's just a reseller. Small margins at high volume work fine for reselling; for manufacturers it's harder. A few faulty products are enough to throw away your entire margin.
#78
L'Eliminateur
londiste
The connector pinouts are not quite accurate.
While technically most PSUs provide +12 V on pin 2 of the 6-pin connector, that is not the spec, and pin 5 accordingly is a sense pin. The 6-pin connector officially has 2 +12 V pins.
Similarly, the 8-pin connector includes 2 sense pins (4 and 6) and has 3 +12 V pins.
I bet the 12-pin connector will end up with 5 +12 V pins.
I don't think so. The current connectors have sense pins because they can be joined together, and without sense pins the card has no way of knowing whether you plugged a 6-pin or 8-pin cable into it, so pins have to be sacrificed for that.
Since this new 12-pin connector seems to be monolithic (although it's mentioned it can be split in half, which has me puzzled), it wouldn't need any sense pins, as a full 12-pin is always plugged in.
Or, if it can indeed be split for smaller GPUs, then you're right, it would require one sense pin on the second half.
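The sense-pin detection described above can be sketched as a small lookup. The pin names and semantics here are illustrative assumptions, not taken from the PCI-SIG spec; the point is just that the card maps sense-pin states to a power budget:

```python
# Hypothetical sketch of how a card could use sense pins to tell which
# cable is attached. Pin semantics are illustrative, not from the spec.

def detect_connector(sense_a_grounded: bool, sense_b_grounded: bool) -> str:
    """Assume an 8-pin cable grounds both sense pins; a 6-pin grounds neither.

    Returns the power budget the card may assume for that cable.
    """
    if sense_a_grounded and sense_b_grounded:
        return "8-pin attached: up to 150 W"
    if not sense_a_grounded and not sense_b_grounded:
        return "6-pin attached: up to 75 W"
    return "invalid or partially seated cable"

print(detect_connector(True, True))
```

A monolithic 12-pin would indeed make this lookup unnecessary, since there is only one possible cable state.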
#79
tomc100
Soon the gpu will just plug directly into the wall with an AC adapter.
#80
L'Eliminateur
tomc100
Soon the gpu will just plug directly into the wall with an AC adapter.
Haha, cannot happen; the power consumption of GPUs means the PC PSU is the only viable option, or the power brick would essentially be a dedicated PC PSU, which doesn't make sense :D
#81
Jism
L'Eliminateur
Haha, cannot happen; the power consumption of GPUs means the PC PSU is the only viable option, or the power brick would essentially be a dedicated PC PSU, which doesn't make sense :D


3dfx actually shipped a card with its own power brick that plugged into the wall to feed the GPU.

Apart from that, it's a weird decision. 2x 8-pin should be more than enough for most GPUs, unless this thing is supposed to feed GPUs in enterprise markets or machines.

A lot of the wires are overrated as well. A single yellow wire on a well-built PSU can easily push 10 A, up to 13 A.
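Those per-wire figures make the headroom easy to check, assuming six +12 V conductors in the new connector (the current ratings are the poster's numbers, not a spec):

```python
# Rough headroom check using the per-wire currents quoted above
# (10-13 A on a well-built PSU) and six +12 V conductors.

VOLTS = 12.0
WIRES = 6  # six +12 V conductors in the 12-pin connector

def bundle_capacity(amps_per_wire: float) -> float:
    """Total power the six +12 V wires could carry at a given current."""
    return WIRES * amps_per_wire * VOLTS

for amps in (10.0, 13.0):
    print(f"{amps:.0f} A/wire -> {bundle_capacity(amps):.0f} W total")
```

Even the conservative 10 A figure gives 720 W of raw wire capacity, comfortably above the connector's reported 600 W rating.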
#82
Legacy-ZA
Vya Domus
Yeah I realized that, still, they aren't exactly small.
Don't believe their waffle for a moment, they make moooooooooooooore than enough. Not "small profits" at all. I could tell you stories that would leave you infuriated at the injustice of what people pay for today's products. Suffice it to say, most consumers are extremely unaware of how badly they are being ripped off.
#83
Overclocker_2001
I bet this connector will be used in the server/HPC space, where one cable is better than two, and where PSUs are designed to be fitted inside a case with special connectors.

Remember the NVIDIA HGX GPU? Well, that's a 400 W BEAST... but it doesn't have any connector to power it because a 4x 8-pin PEG cable is not an option.

600 W for a single VGA card (single or dual GPU doesn't matter) is definitely too much for consumers, but not for pro/server/HPC applications.
#84
Vya Domus
Overclocker_2001
but not for PRO / server / HPC application
Yeah it is, when you have thousands of them the cost of electricity and cooling stacks up.
#85
Dirt Cheap
What about LED??!
Must have RGB lighting in this new spec and adapters!
#86
Flanker
Verpal
(Translated from Chinese:) It seems some readers, especially readers outside China, can't understand Chinese humor, so I've updated the latest diagram and a summary of the news. The news is real. Since TechPowerUp posted the diagram, I'll post it too.

You guys can't read Chinese?

(Translated from Chinese:) Everything above was made up by me.
Anyone with reasonably competent Chinese would know it is literally a joke, nothing more.
FCPowerUp is legit; they have been reviewing power supplies since the birth of Jesus Christ. If you don't believe it, at least try to read the write-up from Igor's Lab.
I read and speak Chinese and I don't see the joke? Some historical reference with FCPowerup?
I get that FCPowerup is legit, I'm just confused AF
#87
Mistral
So now we have to have different nVidia and AMD connectors on PSUs, or what?
#88
Krzych
Vya Domus
Funny how when AMD has a power hungry GPU every one thinks power consumption is everything but when Nvidia hints at an upcoming power hungry atrocity everyone's cool with it.
This is entirely relative. There is a difference between pointlessly drawing way more power for the same performance and pushing the limits of performance, and we are talking about the latter here. To do as badly as AMD did back in the days you refer to, they would have to release a new 300 W+ card that's slower than the RTX 2070.
#89
Th3pwn3r
Krzych
This is entirely relative. There is a difference between pointlessly drawing way more power for the same performance and pushing the limits of performance, and we are talking about the latter here. To do as badly as AMD did back in the days you refer to, they would have to release a new 300 W+ card that's slower than the RTX 2070.
Exactly, in terms of power consumption relative to performance, AMD wasn't doing well.
#90
Verpal
Flanker
I read and speak Chinese and I don't see the joke? Some historical reference with FCPowerup?
I get that FCPowerup is legit, I'm just confused AF
TF? I thought the joke was like..... obvious?
To be fair, if it's late at night and you are just reading stuff literally...... sure.
But context is important here; the piece is too high-effort and too well-referenced to be a joke, especially when the punchline is just a single line at the bottom of the page.

Anyway, he has already put in additional material. Although I like the write-up from Igor's Lab more, FCPowerUp's info is about the same, go check it out :D
#91
Parn
Does that mean any potential RTX 3080/3080 Ti buyers would have to buy one of those shiny new PSUs to benefit from less cable clutter? Unless this 12-pin connector can provide double the wattage of the 8-pin, what's the point?
#92
thebluebumblebee
Okay, I didn't read the thread, but....

I can't see this coming to desktops. AI-focused cards? Yes. Desktops? No. Why? Because places like California would have a fit over the power usage. To me, the industry is very conscious of its power usage and NOT trying to draw attention to itself.
www.techpowerup.com/225808/new-california-energy-commission-regulation-threatens-pre-built-gaming-desktops
www.techpowerup.com/249605/all-asus-motherboards-meet-stringent-new-california-energy-commission-standards
#94
BoboOOZ
thebluebumblebee
I can't see this coming to desktops. AI-focused cards? Yes. Desktops? No. Why? Because places like California would have a fit over the power usage. To me, the industry is very conscious of its power usage and NOT trying to draw attention to itself.
If I understand correctly, all these regulations stipulate efficiency, not power draw. That's why the ATX format is evolving, so that there are fewer losses. There will still be 1500 W PSUs; it's just that the voltage conversions will be made at a single point, to avoid waste.
#95
nickbaldwin86
I hope this is true; it would be great to have a single plug. I have two 8-pins, which is still more wires and involves two plugs and two cables.

Wondering when cards will just have 2x 12-pin LOL
#96
Assimilator
One more thing: the schematic for that connector lists a durability of only 25 insertion/removal cycles.
RH92
Ehhhhhhhhhhhhhh .........

[MEDIA=twitter]1283502759292162051[/MEDIA]
TPU should at the very least check what the Chinese text says instead of running with whatever BS leak comes to their hands, just saying!
You should at the very least check the previous posts in this thread before you post something that has already been addressed, thereby making yourself look ignorant.

Just saying!
#97
W1zzard
Assimilator
The schematic for that connector lists a durability of only 25 insertion/removal cycles
Hard to believe, and it would be terrible if true. I must have plugged my PSU cables several hundred times and they're still fine. Slightly less force is needed to connect, but np otherwise. Didn't we hear similar claims the first time LGA sockets were announced?
#98
Assimilator
W1zzard
Hard to believe, and it would be terrible if true. I must have plugged my PSU cables several hundred times and they're still fine. Slightly less force is needed to connect, but np otherwise. Didn't we hear similar claims the first time LGA sockets were announced?
I had a look at Molex's page for current PCIe connectors afterwards, and those are only rated at 30 cycles! Guess they just take the worst-worst-worst case...
#99
Vya Domus
Krzych
This is entirely relative. There is a difference between pointlessly drawing way more power for the same performance and pushing the limits of performance, and we are talking the latter here. To do as badly as AMD did back in the days you refer to, they would have to release a new 300W+ card thats slower than RTX 2070.
"To do as badly" is entirely relative, gotcha. You can try to justify it all you want; an unacceptable amount of power should remain unacceptable no matter what.
#100
Chrispy_
I'm not really interested in a GPU that needs more than 250 W. Even if the graphics are phenomenal, I don't need that noise or heat, and after the lacklustre RTX game library I can honestly say that awesome graphics do not fix bad or rehashed gameplay.