Monday, March 20th 2023

16-pin Power Connector Optional for GeForce RTX 4070, Partners Free to Use 8-pin PCIe

NVIDIA is reportedly giving its add-in card (AIC) partners flexibility in their choice of power connectors for the upcoming GeForce RTX 4070 graphics card. It stands to reason that this flexibility could extend even further down the lineup, to the RTX 4060 series and below. Igor's Lab reports that NVIDIA is allowing partners to choose between the newer 16-pin ATX 12VHPWR connector and the older (and more reliable) 8-pin PCIe power connectors in their custom-design products. Sources tell Igor's Lab that the RTX 4070 could come in two TGP classes: a 225 W-rated class that is likely to feature the 16-pin connector, and a 200 W class that will stick to 8-pin PCIe power connectors.

With typical graphics power (TGP) values for the upcoming RTX 4070 expected to be well under 300 W, graphics cards can make do with either two 8-pin PCIe connectors on the card, or a 16-pin connector paired with an adapter that converts two 8-pin connectors into a 300 W-rated 16-pin input. A pair of 8-pin inputs on the card is the more cost-effective solution, as it spares AICs from having to include the NVIDIA-designed power adapter, as well as the more exotic 16-pin input on the board. We have seen pictures of RTX 4070 and RTX 4060-series Founders Edition cards, which are sure to feature 16-pin connectors. For the RTX 4070, NVIDIA could take an unconventional launch path, with custom-design graphics cards priced at NVIDIA's MSRP launching a day earlier than those priced at a premium. It's possible that the MSRP cards could come with 8-pin PCIe connectors.
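For a rough sense of the power budgets involved, the sketch below compares the two rumored TGP classes against the possible connector configurations. It assumes the standard PCIe CEM ratings of 75 W from the slot and 150 W per 8-pin connector, plus the 300 W sense-pin configuration of the 16-pin 12VHPWR connector; the figures are illustrative, not NVIDIA-confirmed board specifications.

# Back-of-the-envelope power-budget check (illustrative only; values are
# standard PCIe CEM connector ratings, not confirmed RTX 4070 board specs).

SLOT_W = 75          # PCIe x16 slot delivers up to 75 W
PIN8_W = 150         # each 8-pin PCIe connector is rated for 150 W
PIN16_300_W = 300    # 16-pin 12VHPWR in its 300 W sense-pin configuration

tgp_classes = {"225 W class": 225, "200 W class": 200}

configs = {
    "1x 8-pin + slot": PIN8_W + SLOT_W,             # 225 W total
    "2x 8-pin + slot": 2 * PIN8_W + SLOT_W,         # 375 W total
    "16-pin (300 W) + slot": PIN16_300_W + SLOT_W,  # 375 W total
}

for name, tgp in tgp_classes.items():
    for cfg, budget in configs.items():
        headroom = budget - tgp
        print(f"{name} on {cfg}: {headroom:+d} W of headroom")

Either the 16-pin connector or a pair of 8-pin inputs leaves ample headroom for both rumored TGP classes, which is why the choice comes down to cost and cabling convenience rather than power delivery.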
Source: Igor's Lab

13 Comments on 16-pin Power Connector Optional for GeForce RTX 4070, Partners Free to Use 8-pin PCIe

#1
wolf
Performance Enthusiast
Interesting move, but it makes sense if true: if the 4070 really is ~200 W, it could make do with a single 8-pin, and the 4060 with a single 6-pin.
#2
Chaitanya
Still won't be surprised to find some "conditions" that have to be satisfied in order to use the 8-pin PCIe power connector.
#3
Crackong
I bet there are some T&Cs for using the 8-pin.
Maybe they have to restrict the 8-pin cards so they never outperform the 12VHPWR version.
#4
nguyen
Makes sense; when 1x 8-pin is enough, there is no need for a 16-pin connector.

Single cable masterrace :D
#5
W1zzard
nguyen: "Single cable masterrace :D"
Yeah .. having swapped GPUs like 100 times this year, I feel like a single cable is the best solution; plugging in 3x 8-pin feels like such a waste of time, and having to fiddle around in that area doesn't help.
#6
enb141
W1zzard: "Yeah .. having swapped GPUs like 100 times this year, I feel like a single cable is the best solution; plugging in 3x 8-pin feels like such a waste of time, and having to fiddle around in that area doesn't help."
So after plugging and unplugging so many 4090s, aren't you afraid of one catching fire?
#7
W1zzard
enb141: "So after plugging and unplugging so many 4090s, aren't you afraid of one catching fire?"
Not at all .. just plug it in and verify it's really plugged in. Once you know how the "click" sounds and feels, it's a total non-issue.
#8
wNotyarD
nguyen: "Makes sense; when 1x 8-pin is enough, there is no need for a 16-pin connector. Single cable masterrace :D"
I mean, doesn't the 12VHPWR specification range from 150W to 600W depending on wire gauge?
#9
DeathtoGnomes
Just when you thought NVIDIA was inflexible... methinks they were somehow forced into this move by their board partners. Would you say it's a common-sense type of move?
#10
docnorth
The option to choose the connector is positive. The option to buy at a sane price? :nutkick:
#11
nguyen
wNotyarD: "I mean, doesn't the 12VHPWR specification range from 150W to 600W depending on wire gauge?"
What is the point of going from 8-pin on the PSU side to 16-pin on the GPU side anyway? :D

A 300 W 16-pin cable would connect to 2x 8-pin on the PSU side anyway; at 150 W, you might as well just use an 8-pin cable.
#12
wNotyarD
nguyen: "What is the point of going from 8-pin on the PSU side to 16-pin on the GPU side anyway? :D"
One day every PSU on the market will have its 16-pin cable/connector. But I know that day isn't today.
#13
trsttte
nguyen: "What is the point of going from 8-pin on the PSU side to 16-pin on the GPU side anyway? :D"
Because PCI-SIG is either dumb, lazy, or both, and didn't bother to update the spec. The only thing limiting a GPU 8-pin connector to 150 W is the spec itself, nothing else.
wNotyarD: "One day every PSU on the market will have its 16-pin cable/connector. But I know that day isn't today."
The connector has already been demonstrated to not handle bends as well as the previous 8-pin, and it would put a lot more power through the same surface area on the PSU side, where manufacturers like to use cheaper PCBs. It's not a good solution; maybe everyone involved in these decisions will come to their senses and we won't have to deal with it (needing to pay for a better PCB for the connectors, just because someone decided they wanted a smaller connector with no advantages, is dumb).