
8-pin PCIe to ATX 12VHPWR Adapter Included with RTX 40-series Graphics Cards Has a Limited Service-Life of 30 Connect-Disconnect Cycles

btarunr

Editor & Senior Moderator
PSUs with native 12+4 pin ATX 12VHPWR connectors are few and far between, which means some of the first adopters of the GeForce RTX 4090 "Ada" graphics cards will rely on the adapter cable that converts 8-pin PCIe power connectors into one 12VHPWR connector that plugs into the graphics card. Cards that stick to the baseline specs include adapters that convert three 8-pin PCIe connectors (for 450 W output, matching the RTX 4090 reference specs), whereas some premium overclocked RTX 4090 cards, such as the ZOTAC RTX 4090 AMP Extreme, include adapters that convert as many as four 8-pin PCIe connectors to a 12VHPWR, maxing out its 600 W power delivery capability. The product page of the ZOTAC AMP Extreme has an interesting sentence describing this in-box adapter: "Limited service life with up to 30 connect / disconnects."

Apparently the adapter is only good for up to 30 connect/disconnect cycles before you'll need another one. For most gamers who'll install the card and forget about it for years, this shouldn't be an issue. However, for overclockers and enthusiasts who use the card on an open-air bench and move cards around a lot, this could be an irritant. Tech journalists (reviewers) swap graphics cards out a lot, too, but they're likely to have several such adapters lying around from multiple samples, or a PSU with a native 12+4 pin connector.



Our best guess is that this is a mechanical limit assessed by NVIDIA: the maximum number of connection cycles the adapter can handle before its contacts begin to wear out and its safety is compromised. If you look closely at the picture above, the adapter has an NVIDIA logo, which means NVIDIA is directly supplying this adapter to AIC partners to include with their custom-design cards (and not counting on them to develop their own adapters). The 12VHPWR connector may look diminutive, but it's capable of not just delivering 600 W continuously, but also handling 200% excursions (brief spikes in power draw), which means 1200 W. That is a lot of current (at 12 V, 100 A, enough to crank an automobile), so NVIDIA isn't taking any chances with safety.
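To put those figures in perspective, here is a quick back-of-the-envelope check (a rough sketch assuming the connector's six +12 V power pins share the load evenly, which is an idealization):

```python
# Rough current figures for the 12VHPWR connector at 12 V.
# Assumption: the six +12 V power pins share the load evenly.

RAIL_VOLTAGE = 12.0   # volts
POWER_PINS = 6        # 12VHPWR carries power on six +12 V pins (plus six grounds)

for label, watts in [("continuous", 600.0), ("200% excursion", 1200.0)]:
    total_amps = watts / RAIL_VOLTAGE
    per_pin = total_amps / POWER_PINS
    print(f"{label}: {watts:.0f} W -> {total_amps:.0f} A total, ~{per_pin:.1f} A per pin")

# continuous: 600 W -> 50 A total, ~8.3 A per pin
# 200% excursion: 1200 W -> 100 A total, ~16.7 A per pin
```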

View at TechPowerUp Main Site | Source
 
My concern is not so much about the 30 cycles, but more about the quality and safety of a connector that will deliver 600 W to the card with such a short lifespan.
 
I'm almost wondering why they didn't go with larger-gauge wiring, but that would only make things melt faster. More bad news for NVIDIA if this ends up being an issue in gaming rigs with bad airflow; something needs to carry the heat away from the wires, or it can lead to melted connectors.
 
Class action lawsuit against the house-burning connector, or against whoever conceived and promoted it.

Something always gives when corners are cut, and quite a few have been.

I am afraid the new NVIDIA power connector standard has no useful life as it is. I won't feel bad if proven wrong (kudos to them if so).
 
It's about as much power as a travel hairdryer, which comes with a similar-sized 2-pin AC plug. It won't burn houses. Besides, 12VHPWR is an ATX standard; NVIDIA only implemented it.
 
The CPU socket on a motherboard is rated for even fewer cycles, if you read the Intel datasheet.

Makes sense, given most Intel boards will have the CPU changed zero times due to the short socket support. Mind you, if the socket fails it doesn't result in an extreme hazard; the PC just becomes unusable. A 600 W PCIe cable failing, on the other hand, can melt, smoke, etc. For a CPU you don't have to remove it from the socket to replace the paste or cooler, either. With a GPU you should be replacing the paste once per year.
 
My concern is not so much about the 30 cycles, but more about the quality and safety of a connector that will deliver 600 W to the card with such a short lifespan.
That lifespan you are talking about is purely mechanical. Everything wears out when you repeatedly cause friction between two contact points. If you only plug it in once or twice, the wear and tear is minimal and it more or less slows down or stops there. It will last as long as any other cable that carries electricity; it can last forever if you wish. You can just leave it plugged in and it will stay that way. Your worry is misplaced; you should be worrying about something else.
 
*chuckles*
"Pedantic" details are going to end up being the root of so many issues this incoming generation.

Joy.
 
Most small plugs like that, including USB etc., typically have a low insert/remove cycle life.
 
It's about as much power as a travel hairdryer, which comes with a similar-sized 2-pin AC plug. It won't burn houses. Besides, 12VHPWR is an ATX standard; NVIDIA only implemented it.
That's not true at all. Your hairdryer uses either 120 V or 230 V, so that's between 2 and 5 amps. At 12 V, 600 W is 50 amps. Spread across six wires it's still not that much, though.
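To make that comparison concrete, here is a minimal sketch (assuming a roughly 600 W load in each case, which is where the 2-5 A figure for a hairdryer comes from):

```python
# Same nominal 600 W load, very different current depending on supply voltage.
POWER_W = 600.0

for name, volts in [("230 V mains", 230.0), ("120 V mains", 120.0), ("12 V rail", 12.0)]:
    amps = POWER_W / volts          # I = P / V
    print(f"{name}: {amps:.1f} A")

# 230 V mains: 2.6 A
# 120 V mains: 5.0 A
# 12 V rail: 50.0 A
```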
 
Maybe this is the reason why EVGA terminated its partnership with NVIDIA :). Ridiculous...
 
I'm confused as to why it's a 12+4 connector. Why not just 16 pin? Have we seen any designs that use only the 12 pin? It seems like it would be cheaper and easier to manufacture to use only the 16 pin design... even if the card is only physically wired for 12 pins, just leave the 4 additional pins unpopulated.
 

Some of the GeForce RTX 30 Series Founders Edition cards have 12-pin power connectors. The 12+4 pin connector can still work with those cards. If it were a single 16-pin connector, it wouldn't fit.

In the same way, PSUs often ship with 6+2 PCIe power cables so they can plug into graphics cards that have six-pin and/or eight-pin ports. My RTX 2070 Super FE has one of each.
 
Right, but PCI-E 6 pin was around for a long time before they introduced the 8 pin connector. Now that you mention it, I do remember the 12VHPWR connector originally being 12 pins, now it's 16... well, that escalated quickly.
 
It's not about "escalation."

The four additional pins are for sense signals. They don't deliver extra power.

If I understand correctly, these new GeForce 40 series cards don't require ATX 3.0 PSUs with the 12+4 pin power connectors. You can use an older, "dumb" PSU with the adapter pigtail. You won't be getting any of the "smarter" power communications between the GPU and PSU, but the card will still get adequate power.
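For anyone curious how those sense pins work: as far as I understand it, they simply get grounded or left open to advertise a power limit to the card. A rough sketch of that idea follows; the exact Gnd/Open-to-wattage mapping is my recollection of commonly published PCIe CEM 5.0 tables, so treat it as an assumption rather than the authoritative spec.

```python
# Conceptual model of 12VHPWR sideband signalling: the PSU (or adapter) grounds
# or floats SENSE0/SENSE1, and the graphics card reads that as a maximum
# sustained power limit for the connector.
# NOTE: the mapping below is an assumption based on commonly published
# CEM 5.0 tables; verify against the actual spec before relying on it.

SENSE_TO_MAX_WATTS = {
    ("gnd", "gnd"): 600,
    ("gnd", "open"): 450,
    ("open", "gnd"): 300,
    ("open", "open"): 150,
}

def connector_power_limit(sense0: str, sense1: str) -> int:
    """Return the maximum sustained power (W) the card should draw."""
    return SENSE_TO_MAX_WATTS[(sense0, sense1)]

print(connector_power_limit("gnd", "gnd"))    # 600
print(connector_power_limit("open", "open"))  # 150
```

The in-box adapters presumably set these pins based on how many 8-pin inputs are actually plugged in, but that behaviour is adapter-specific.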
 
*chuckles*
"Pedantic" details are going to end up being the root of so many issues this incoming generation.

Joy.
Some of them, perhaps.

When it comes to electricity, one can never be *too* safe.
Yeah, that's exactly how the engineers who made these things think, guys. These are not fire hazards. They have a mechanical use limit. Wait until you find out the connection rating on standard PCIe plugs.

I do remember the 12VHPWR connector originally being 12 pins, now it's 16... well, that escalated quickly.
The 4 extra pins are sense pins and have nothing to do with power delivery.

BTW, I am using one of these connector adapters with my 3090 Ti. It is perfectly safe.

Have we seen any designs that use only the 12 pin?
The 3090 Ti and all FE cards.

Maybe this is the reason why EVGA terminated its partnership with NVIDIA :). Ridiculous...
Considering I own an EVGA-branded one of these adapters with my 3090 Ti, I really doubt it.

That's not true at all. Your hairdryer uses either 120 V or 230 V, so that's between 2 and 5 amps. At 12 V, 600 W is 50 amps. Spread across six wires it's still not that much, though.
power/wattage != amps/current
 
For resistive loads like a hair dryer and DC loads it is. Unless you think the fan on a hairdryer is a significant part of the load. Then sure...
 
Most small plugs like that, including USB etc., typically have a low insert/remove cycle life.
USB is rated much higher, but internal connectors like this do expect a low rating, yes.

For resistive loads like a hair dryer and DC loads it is. Unless you think the fan on a hairdryer is a significant part of the load. Then sure...
Amps × voltage = wattage, even in DC, my friend.

With AC we also have power factor (PF), indeed, but that's off-topic.
 
That's what I said. Why did you quote me, then?
You know what, I'm not entirely sure. I think I misread your post.

All good as long as the facts are, lol.
 
The CPU socket on a motherboard is rated for even fewer cycles, if you read the Intel datasheet.
Fair point...

However, how many times are you going to realistically remove and install an Intel CPU? Especially when virtually every new processor line requires a new motherboard. In some respects, Intel might as well have soldered the CPU to the motherboard...

Not so with video cards...
 