
The Curious Case of the 12-pin Power Connector: It's Real and Coming with NVIDIA Ampere GPUs

I'm not really interested in a GPU that needs more than 250 W. Even if the graphics are phenomenal, I don't need that noise or heat, and after the lacklustre RTX game library I can honestly say that awesome graphics do not fix bad or rehashed gameplay.
 
Well, graphics cards' power consumption is only going up, so they might as well ensure that power delivery is flawless. Otherwise, many users will end up blaming crashes on "buggy drivers" when the real culprit is power delivery.

Still, I wouldn't like to change my PSU just yet.
A cheap multi-rail PSU plus adapter cables is a sure way to imitate buggy drivers, in my experience.

Not good, Ampere is really looking like 400-450 watts at the top end.

After biting, RTX ended up like PhysX: shit and underutilized, a disappointment.
 
A cheap multi-rail PSU plus adapter cables is a sure way to imitate buggy drivers, in my experience.

Not good, Ampere is really looking like 400-450 watts at the top end.

After biting, RTX ended up like PhysX: shit and underutilized, a disappointment.

Usually, the first gen of anything is a disappointment, but now with consoles using RTX, there will actually be games that use it.
 
This has to be aimed at server environments. Think about how stiff a bundle of twelve 16 AWG wires would be.
 
You need a connector on the PSU, not just the cable.
All modern PSUs have 8-pin connectors for PEG, not 6-pin.
It's a nice thing for sure, but if it gets released there will no doubt be normal 2x 8-pin versions too.
That is true, though I'm wondering what the options are when you have two 8-pin connectors on a single cable. Would it be possible for the PSU manufacturer to supply a cable with one 12-pin instead?
 
That is true, though I'm wondering what the options are when you have two 8-pin connectors on a single cable. Would it be possible for the PSU manufacturer to supply a cable with one 12-pin instead?
Even daisy-chaining two 8-pins is not recommended.
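Rough numbers on why daisy-chaining makes people nervous: a minimal sketch, assuming both plugs on the cable share the same three +12 V conductors back to the PSU (the wire count and loads here are illustrative, not from any spec sheet):

```python
# One cable feeding two 8-pin plugs: the shared conductors carry both loads.
RAIL_VOLTAGE = 12.0       # volts on the +12 V rail
LOAD_PER_8PIN_W = 150.0   # each plug loaded to its full 150 W spec rating
SHARED_HOT_WIRES = 3      # a single 8-pin run has three +12 V conductors

total_current = 2 * LOAD_PER_8PIN_W / RAIL_VOLTAGE   # 25.0 A on one cable
current_per_wire = total_current / SHARED_HOT_WIRES  # ~8.3 A per conductor

print(f"{total_current:.1f} A total, {current_per_wire:.2f} A per wire")
```

At full spec load that's roughly 8.3 A per conductor on a run that was sized for half of that, which is why the usual advice is one cable per connector.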
 
The current 8-pin connector is rated for 150 watts, or 50 watts per hot wire. This 12-pin connector is rated for either 9 or 9.5 amps per hot wire, which puts it at 108 watts or more per hot wire. Still don't see a problem?
 
The current 8-pin connector is rated for 150 watts, or 50 watts per hot wire. This 12-pin connector is rated for either 9 or 9.5 amps per hot wire, which puts it at 108 watts or more per hot wire. Still don't see a problem?
The comparable spec for the 8-pin connector is 8 amps per hot wire, 96 W or so.
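If anyone wants to sanity-check the per-wire math being thrown around here, a quick back-of-the-envelope sketch (the per-pin current figures are the ones quoted in this thread, not official spec documents):

```python
# Connector power from per-pin current: P = V * I * number of hot wires
RAIL_VOLTAGE = 12.0  # volts on the +12 V rail

def connector_power(hot_wires: int, amps_per_wire: float) -> float:
    """Electrical capacity of a connector's +12 V pins."""
    return RAIL_VOLTAGE * amps_per_wire * hot_wires

print(connector_power(3, 8.0))   # 288.0 W for an 8-pin, though PCIe rates it at 150 W
print(connector_power(6, 9.0))   # 648.0 W for the 12-pin at 9 A per pin
print(connector_power(6, 9.5))   # 684.0 W at 9.5 A per pin
```

So 12 V x 9 A is the 108 W per hot wire mentioned above, and six hot pins is where the ~600 W connector rating comes from.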
 
Usually, first-gen of anything is a disappointment but now with consoles using RTX, there will actually be games that actually use it.
Yeah except they're not, soo
That's not necessarily going to happen.
 
The difference is, Nvidia requires more power on the top-end, max-tuned cards, whereas AMD needs the same at midrange.
 
Hard to believe, and it would be terrible if true. I must have plugged my PSU cables in several hundred times and they're still fine. Slightly less force needed to connect, but no problem otherwise. Didn't we hear similar claims the first time LGA sockets were announced?

I'm finding the whole story about people buying new PSUs for a GPU a bit terrible, if true. And hard to believe.
 
Well, I imagine adapters won't be ideal, so you will need a new PSU to really run Ampere properly. And PSU shortages have been a real thing for the last 6 months... I also just bought my EVGA 700 W Gold last year, and it works great for me. I think I'm just going to risk Big Navi and hope Lisa Su heard the community loud and clear, so that Big Navi drivers won't be as much of an issue this round. So a Ryzen 4800X and Big Navi it is for me. Now I just have to keep my fingers crossed on the drivers, but I suppose if they're super horrible I can just return the card for a refund.
 
Well, it's not the worst of ideas to have even the 3x 8-pin cards like the Lightning run off a single connector instead of this mess.


It would run off of two 12-pins, not a single one; not sure that would be a step up, more of a sidestep.
 
What was wrong with 2x 6-pin? Almost like Apple inventing unique connectors instead of sticking with standards.
 
OK, just because the connector is able to do 600 watts doesn't mean the GPU you put on it will use all of that. I do see the idea that this single 600-watt connector is a way to eliminate the need for two cables to power the GPU. It would also help on the GPGPU side for compute cards that need a ton of power, going from two power cables per card down to one. So to me this is a way to remove up to one power cable per GPU from a PC, and it removes many more on the server side, where a box with 5 or 6+ GPUs could otherwise need two power cables per card.

On some of the comments worrying about needing a new PSU: I don't think the power that 2x 8-pin provides is going to be exceeded on the consumer side any time soon. Even though an 8-pin is rated at 150 watts per the spec, it can do more than that, provided the PSU you own isn't some cheaply made pile of junk. AMD had a GPU that was pulling something like 250+ watts per 8-pin.

It would run off of two 12-pins, not a single one; not sure that would be a step up, more of a sidestep.
It would be one, since the 12-pin can do 600 watts. Those 3x 8-pins are 450 watts total.
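A quick tally of that comparison, using the official 150 W per 8-pin rating and the rumored ~600 W for the 12-pin (the PCIe slot's extra 75 W is ignored for simplicity):

```python
EIGHT_PIN_W = 150    # PCIe spec rating per 8-pin connector
TWELVE_PIN_W = 600   # rumored rating of the new 12-pin

print(3 * EIGHT_PIN_W)   # 450 W from a triple 8-pin card like the Lightning
print(TWELVE_PIN_W)      # 600 W from a single 12-pin
```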
 
If this is true, do you think that when buying the GPU from, say, ASUS, Gigabyte, MSI, etc., they would include an adapter for current PSU connectors?
 
Nvidia bought 3dfx. Is "just add moooore" the solution?

 
I really don't understand why we need so many 12 V inputs on a graphics card. The current is coming from the PSU: one rail delivering the needed amps. Even three 12 V wires can deliver the max amps that cards need, and the wires support it. On the PCB they create 3-6 entry points with different usage within the card, but in the end they are all connected to the same PSU rail, which either delivers or doesn't. You won't get cleaner power to the different subsystems once everything is physically interconnected at the PSU...
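To put rough numbers on why the +12 V input is split across several pins at all, a minimal sketch (the per-pin rating is the figure quoted earlier in this thread; the point is per-pin current limits, not "cleaner" power):

```python
import math

TARGET_POWER_W = 600.0  # rumored rating of the 12-pin connector
RAIL_VOLTAGE = 12.0
AMPS_PER_PIN = 9.0      # per-pin current figure quoted earlier

total_current = TARGET_POWER_W / RAIL_VOLTAGE            # 50.0 A in total
hot_pins_needed = math.ceil(total_current / AMPS_PER_PIN)

print(f"{total_current:.0f} A total -> {hot_pins_needed} hot pins needed")
# 50 A total -> 6 hot pins needed, matching the 6 +12 V pins of the 12-pin
```

Fifty amps is far more than any single small contact can carry, so the split is about current sharing between pins; electrically, they do all end up on the same PSU rail, as the post above says.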
 
GPUs should go the way of efficiency. 600 W power requirements for GPUs? It should have been the opposite.
Plus this connector is too heavy, and it will sag the cards down.

Genius!
Time to get yourself a brace. They are cheap and barely visible while holding up a lot of the weight.

My complaint is that I would rather "wrestle" with two connector cords that are easy to get out of the way than one thick connection cord that will ACTUALLY be tough to "wrestle" out of the way. A thick cord will not make for a clutter-free case that doesn't impede airflow.
 
Is Nvidia aware that nobody can buy power supplies at the moment?
Making your new product dependent on something that's currently unobtainable seems like a poorly planned idea.
 
Is Nvidia aware that nobody can buy power supplies at the moment?
Making your new product dependent on something that's currently unobtainable seems like a poorly planned idea.
I'm not at all sure that this connector, even if the rumor is correct, is arriving with the 30XX series of cards, or that there won't at least be some type of converter available. New standards take a while to implement, and this looks more like a long shot to me.
 
I'm not at all sure that this connector, even if the rumor is correct, is arriving with the 30XX series of cards, or that there won't at least be some type of converter available. New standards take a while to implement, and this looks more like a long shot to me.
I agree. The last 3-4 generations have all delivered 1.5-2x the performance with the same power consumption as the previous one. I don't believe (consumer) Ampere is about to break this trend - especially at a time when they're making the jump from 12 to 7 nm as well.
 
Funny how when AMD has a power-hungry GPU, everyone thinks power consumption is everything, but when Nvidia hints at an upcoming power-hungry atrocity, everyone's cool with it.
Oh, they paid for it with the GTX 480, my egg cooker, as I called it back then. Mine would nearly overheat in Dead Island in a HAF X case, hitting about 100 °C against the 105 °C throttle stop. And I consume less power with this 1080 Ti than I did with that... I regretted not getting the HD 5870 instead.

Every generation is a shit-flinging contest; nothing new here.
 