Sunday, August 23rd 2020

Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters

Leading PSU manufacturer Seasonic is shipping a modular cable that confirms NVIDIA's proprietary 12-pin graphics card power connector for its upcoming GeForce "Ampere" graphics cards. Back in July we did an in-depth analysis of the connector, backed by confirmation from various industry sources that the connector is real, that it is a proprietary NVIDIA design (and not a PCI-SIG or ATX standard), and that its power output limit could be as high as 600 W. Seasonic's adapter converts two 12 V 8-pin PSU-side connectors into one 12-pin connector, which lends weight to that power-output figure. On typical Seasonic modular PSUs, the included cables convert one PSU-side 8-pin 12 V connector into two 6+2 pin PCIe power connectors along a single cable. HardwareLuxx.de reports that it has already received the Seasonic adapter in preparation for its "Ampere" Founders Edition reviews.
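As a rough sanity check on that 600 W figure, here is a back-of-the-envelope sketch. It assumes the 12-pin uses Micro-Fit 3.0-class terminals rated at roughly 8.5 A per circuit with 16 AWG wire (an assumption based on the Molex part discussed in the comments below, not an NVIDIA-confirmed rating):

```python
# Back-of-the-envelope ceiling for the rumored 12-pin connector.
# Assumption (not confirmed by NVIDIA): Micro-Fit 3.0-class terminals
# rated at ~8.5 A per circuit when paired with 16 AWG wire.
PINS_12V = 6        # a 12-pin connector carries six 12 V pins and six grounds
AMPS_PER_PIN = 8.5  # Molex Micro-Fit 3.0 per-circuit rating
VOLTS = 12.0

ceiling_w = PINS_12V * AMPS_PER_PIN * VOLTS
print(f"Theoretical ceiling: {ceiling_w:.0f} W")  # ~612 W, consistent with the 600 W figure
```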

kopite7kimi, an NVIDIA leaker with an extremely high hit rate, however predicts that the 12-pin connector was designed by NVIDIA exclusively for its Founders Edition (reference design) graphics cards, and that custom-design cards may stick to industry-standard PCIe power connectors. We recently spied a custom-design RTX 3090 PCB, which features three 8-pin PCIe power connectors. This seems to be further proof that a single 12-pin connector is a really fat straw for 12 V juice. The label on the Seasonic cable's box recommends using it with PSUs rated for at least 850 W (which could very well be the system requirement for the RTX 3090). Earlier this weekend, pictures of the RTX 3090 Founders Edition surfaced, and it is huge.
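For context, the PCIe specification rates each 8-pin connector at 150 W and the slot at 75 W, so the triple 8-pin custom board lands in the same power class as the rumored 12-pin; a quick comparison (the 600 W figure is the rumor above, not a confirmed spec):

```python
# Spec-rated input of the leaked triple 8-pin custom board vs. the rumored 12-pin.
PCIE_8PIN_W = 150      # per-connector limit in the PCIe specification
PCIE_SLOT_W = 75       # power available through the slot itself
RUMORED_12PIN_W = 600  # rumored limit, not an official figure

custom_board_w = 3 * PCIE_8PIN_W + PCIE_SLOT_W
print(f"3x 8-pin + slot:       {custom_board_w} W")                 # 525 W
print(f"Rumored 12-pin + slot: {RUMORED_12PIN_W + PCIE_SLOT_W} W")  # 675 W
```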
Source: VideoCardz

119 Comments on Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters

#76
Dave65
Ok so, this is supposed to be a step forward in technology?
Sorry, not seeing it..
#77
freeagent
Shouldn’t have bought those 550w Titanium PSU’s :slap:

:laugh:
#78
ppn
Most PSUs already have 16 AWG cables, even the SF450. Plus it has three of those 8-pins, not just six pins populated with wires out of the eight possible in the Molex, so it wouldn't make a difference whether the cable used is 2x6 or 1x12, since the ability to pull that power is already there.
#79
R-T-B
AsRockMobo's going 12 V? I was sure it was the 5 V and 3.3 V Intel is trying to push.
Nope. They want to plop more vrms on the mobo and go 12V only.
#80
PowerPC
Dave65Ok so, this is supposed to be a step forward in technology?
Sorry, not seeing it..
You can't really make much more of a step forward with a simple power connector. One thing this article doesn't mention is that this connector is significantly smaller than a normal 12-pin connector would be. I think it's about the size of a normal 8-pin, or maybe even a bit smaller. It's also probably a lot easier to insert and disconnect for people who would struggle with a normal-sized 12-pin connector; these connectors can be pretty tough sometimes. It doesn't make much sense to only offer it on the Founders Edition, though.
#81
moproblems99
PowerPCWe don't even know yet how hot it will get. That bigger cooler might be there for a good reason.
Where do you think that heat goes that the bigger cooler dissipates? It doesn't vaporize or go to Neverland Ranch. The more effective the cooler, the more heat that is dissipated into the room. Big coolers and big wattage means bad news for room temps.
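Putting rough numbers on that: sustained electrical draw all ends up as heat in the room, and the conversion to air-conditioning terms is straightforward (a quick sketch; the 500 W draw below is purely illustrative):

```python
# Sustained system power draw expressed in room-heating terms.
# The 500 W draw is illustrative, not a measurement of any particular build.
WATTS_TO_BTU_PER_HOUR = 3.412

system_draw_w = 500
btu_per_hour = system_draw_w * WATTS_TO_BTU_PER_HOUR
print(f"{system_draw_w} W sustained ~= {btu_per_hour:.0f} BTU/h into the room")
# ~1,700 BTU/h -- roughly a small space heater on its low setting.
```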
#82
Fluffmeister
TheDeeGeeSeems to be as big as an 8-pin.

Hmm, very interesting, not long now either way. And damn, this forum loves a pitchfork.
#83
tygrus
Here's a video of me taking delivery of my next PSU for my next triple-SLI GPU setup with a 64-core CPU ...
#85
PowerPC
moproblems99Where do you think that heat goes that the bigger cooler dissipates? It doesn't vaporize or go to Neverland Ranch. The more effective the cooler, the more heat that is dissipated into the room. Big coolers and big wattage means bad news for room temps.
Never in my life have I wasted a thought on what my pc means for my room temperature... I thought most people only care about how hot the actual components are. This is a new one for me, I must admit.
#86
Fluffmeister
Divide OverflowNvidia's new Ampere cooler, or its power source?

Are you on glue?
#87
AsRock
TPU addict
R-T-BNope. They want to plop more vrms on the mobo and go 12V only.
Aah yes, they're doing away with 3.3 V and 5 V. Well, they're going to try; it might take a while for everyone to move to that.

As for efficiency, we will have to see; I'm thinking it will depend on the motherboard manufacturer, which to me will make mobos even more expensive. Let's face it, it only makes a real difference for those who actually leave the PC on, so I guess it'll be good for businesses.
#88
Mussels
Freshwater Moderator
Okay, I'm all for the concept of a 12-pin connector smaller than a single 8-pin.

That at least seems a step forward, even if it's annoying to have a new proprietary design (and I think the wattage it's capable of may be deliberately overkill).
#89
moproblems99
PowerPCNever in my life have I wasted a thought on what my pc means for my room temperature... I thought most people only care about how hot the actual components are. This is a new one for me, I must admit.
Live in a hot climate?
FluffmeisterAre you on glue?
AMD has lots of glue.
#90
Darmok N Jalad
RedelZaVednoFermi Deja Vu all over again :(
Leather jacket not required!
#91
techboj
dj-electricIsn't this just a standard 0430251200 Molex connector? That would suck.
It does look like a standard 0430251200, nothing proprietary here. Wiring this would take a quick minute.
I guess you could use the 0430251200 Molex since the keyed plugs look identical. However, the Seasonic part plugs directly into the 8-pin of the PSU's modular connector instead of being an adapter.
TheLostSwedeI have to ask, how did you know that connector even exists?
Unfortunately, it looks like the actual "Molex" plug is physically smaller.
#92
kayjay010101
RedelZaVednoIt's not just the price, it's the heat such GPU emits too. Imagine playing something like FS 2020 which stresses GPU to the max all the time plus add 140W for a typical intel 8 core K CPU gaming power consumption and you're at +500W real power consumption. This makes gaming impossible in warmer climate during summer months and I'm definitely not installing AC just to use PC.
Given how CPU limited FS 2020 is, and how it seems most cards above ~2060 go unutilised, I would say your example falls flat. FS 2020 is not a GPU hog in any way. Most 2080 Ti users report around 40% usage.
I agree with you, just funny to use FS 2020 as an example.
#93
Emu
ZoneDymoso motherboards are going 12v and now videocards as well, interesting
Errr, PCIe power has always been 12V only. There are 4 x 3.3V power pins on the PCIe slot connector but every other power connection is 12V.
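For reference, the slot's own power budget for a full-size graphics card breaks down roughly like this (figures quoted from memory of the PCIe CEM spec, so treat them as illustrative):

```python
# Approximate PCIe slot power budget for a full-height x16 graphics card
# (figures recalled from the PCIe CEM spec; treat as illustrative).
rails_w = {
    "12 V":  12.0 * 5.5,  # 66 W
    "3.3 V": 3.3 * 3.0,   # ~9.9 W
}
for rail, watts in rails_w.items():
    print(f"{rail}: {watts:.1f} W")
print(f"Slot total: ~{sum(rails_w.values()):.0f} W")  # ~76 W -- the familiar 75 W from the slot
```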
#94
Vayra86
PowerPCThis is the price you have to pay for that smooth 4K gaming experience. People who complain about how big this card is or how much power it will suck at max or even the price, don't seem to understand what this card is geared to. This might be the first real 4K gaming card that actually deserves the name. But I'll reserve judgment until launch, which should be very soon.
There are no 4K cards and there never will be. There still isn't a 1080p card. The goal posts move and viewport resolution is already far from the only major influence. What really will matter is how you render within the viewport. We already have many forms of pseudo 4K with internal render res that is far lower, and even dynamic scaling, on top of all the usual lod stuff etc. etc. On top of even that, we get tech like DLSS and obviously RT.

Basically with all that piled up anyone can say he's running 4K, or 1080p, whatever seems opportune at the time :)
PowerPCGood note and something that I would actually like to know. The fact that it's so big and draws so much power, is called 3090 (above the max xx80 from before) and has the new architecture must surely mean that its performance is beast. Do we even know anything about its performance yet? I feel like most of the posts are just critiquing this card without any actually relevant information.

Because if rumors of this are true, and it really offers something ridiculous like 50% more performance at half the power of Turing, which would mean it'll be like 100% faster per Watt than 2080 ti, using way more Watts... It could literally demolish the 2080 ti. Why is nobody talking about this possibility? Even if it's just a rumor, it also kinda makes sense to me so far looking at the leaks of the size, cooling, and price of this thing. If it's like 90% faster than 2080 ti, many people won't be able to hold on to their wallets.
It's not really the right perspective, perhaps.

If Nvidia has deemed it necessary to make a very big, power hungry GPU that even requires new connectors, it will royally step outside their very sensible product stack. If that is something they iterate on further, that only spells that Nvidia can't get a decent performance boost from node or architecture any more while doing a substantial RT push. It means Turing is the best we'll get on the architecture side, give or take some minor tweaks. I don't consider that unlikely, tbh. Like CPU, there is a limit to low hanging fruit.

This is not good news. It is really quite bad because it spells stagnation more than it does progress. The fact it is called 3090 and is supposed to have a 2k price tag tells us they want to ride that top-end for quite a while and that this is their big(gest) chip already. None of that is good news if you ask me.

Another option though is that they could not secure the optimal node for this, or the overall state of nodes isn't quite up to what they had projected just yet. After all, 7nm has been problematic for quite some time.
Agentbb007Wow I’m glad I bought a 1000 watt power supply. The EVGA 1000 P2 is going to be in for a workout.
Lol yeah you might use all of 600W from it at peak :laugh:
#95
PowerPC
moproblems99Live in a hot climate?
I lived in a hot climate 20 years ago. Still, the idea that my PC will make my room hotter, and that that's why I should buy a lower-tier PC, never crossed my mind. Unless you live in a closet-sized apartment in Africa, but then you're not buying a $1400 card.
#96
rtwjunkie
PC Gaming Enthusiast
PowerPCI lived in a hot climate 20 years ago. Still, the idea that my PC will make my room hotter, and that that's why I should buy a lower-tier PC, never crossed my mind. Unless you live in a closet-sized apartment in Africa, but then you're not buying a $1400 card.
LOL, try living in near-bath-level humidity where every day is over 92°F, it rarely goes below 80 at night, and the air conditioning is already straining. You really do notice the difference from the heat being expelled by a hot system.
#97
PowerPC
rtwjunkieLOL, try living in near-bath-level humidity where every day is over 92°F, it rarely goes below 80 at night, and the air conditioning is already straining. You really do notice the difference from the heat being expelled by a hot system.
Then buying a new AC should probably be a higher priority than gaming on max settings.
#98
Aquinus
Resident Wat-man
PowerPCThen buying a new AC should probably be a higher priority than gaming on max settings.
You're assuming that he can get a bigger AC that runs on the same circuit. There comes a point where you either need to consider a 208/230 V split-phase unit or go with a ductless split. Both of those will cost you a whole lot more money than a new gaming PC.
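A quick sketch of why the branch circuit is often the real limit, assuming a typical North American 120 V / 15 A circuit and the usual 80% continuous-load rule (the AC and PC wattages below are illustrative):

```python
# Why "just buy a bigger AC" can run into the branch-circuit limit first.
# Assumes a typical North American 120 V / 15 A circuit and the 80% continuous-load rule;
# the appliance wattages are illustrative.
VOLTS = 120
BREAKER_AMPS = 15
CONTINUOUS_FACTOR = 0.8

continuous_budget_w = VOLTS * BREAKER_AMPS * CONTINUOUS_FACTOR  # 1440 W
window_ac_w = 900   # mid-size window unit (illustrative)
gaming_pc_w = 500   # high-end system under load (illustrative)

headroom_w = continuous_budget_w - (window_ac_w + gaming_pc_w)
print(f"Continuous budget: {continuous_budget_w:.0f} W, headroom left: {headroom_w:.0f} W")
# Only ~40 W to spare -- anything bigger points toward a 208/230 V circuit or a ductless split.
```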
#99
nickbaldwin86
Can't wait for this; it's going to be nice to just have a single bundle of wires going to the GPU.
#100
PowerPC
Vayra86There are no 4K cards and there never will be. There still isn't a 1080p card. The goal posts move and viewport resolution is already far from the only major influence. What really will matter is how you render within the viewport. We already have many forms of pseudo 4K with internal render res that is far lower, and even dynamic scaling, on top of all the usual lod stuff etc. etc. On top of even that, we get tech like DLSS and obviously RT.

Basically with all that piled up anyone can say he's running 4K, or 1080p, whatever seems opportune at the time :)
The way I view a real 4K card (and I think the way most people view it) is that it can run the newest AAA games in 4K today. None of that nonsense you mentioned that some people may count as 4K. I could tell you I'm running 4K in Minecraft with a reduced field of view on my old R9 280X; that doesn't make my 280X a true 4K card. Sure, AAA games get more demanding over time, and people will argue about the ideal frame rate, but all in all, hitting at least 60 fps on high or ultra settings in all the new AAA titles out today is what 4K means to most people. I think the 3090 could be the first card that actually offers that, which is why I called it the first true 4K card.
Vayra86It's not really the right perspective, perhaps.

If Nvidia has deemed it necessary to make a very big, power hungry GPU that even requires new connectors, it will royally step outside their very sensible product stack. If that is something they iterate on further, that only spells that Nvidia can't get a decent performance boost from node or architecture any more while doing a substantial RT push. It means Turing is the best we'll get on the architecture side, give or take some minor tweaks. I don't consider that unlikely, tbh. Like CPU, there is a limit to low hanging fruit.

This is not good news. It is really quite bad because it spells stagnation more than it does progress. The fact it is called 3090 and is supposed to have a 2k price tag tells us they want to ride that top-end for quite a while and that this is their big(gest) chip already. None of that is good news if you ask me.

Another option though is that they could not secure the optimal node for this, or the overall state of nodes isn't quite up to what they had projected just yet. After all, 7nm has been problematic for quite some time.
CPUs are very different from GPUs when it comes to parallel processing. You can do a lot more with GPUs by just increasing core counts and improving memory bandwidth. Graphics is the best-known use case for parallelism, and increasing memory bandwidth is something they are clearly doing; that's actually what they have been doing forever. So yeah, if you keep increasing cores, at some point the cards have to become bigger no matter what, and they have already been getting gradually bigger over the years. Nothing new. The new part is this massive jump in size from the last generation, which suggests they are going to increase the core count by more than usual and clock them much higher. They may even bake in some new stuff that takes advantage of all that bandwidth (over 1 TB/s), which is also a huge jump from the 2080 Ti (620 GB/s).

But none of this really tells you anything about the architecture improvement of Ampere. None of the info on power consumption or size really shows you anything about that. Also, the price means very little here. It could still be some totally crazy performance increase per Watt, we don't know. People just like to speculate and be negative without actual info.