
Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters

Wow I’m glad I bought a 1000 watt power supply. The EVGA 1000 P2 is going to be in for a workout.
 
Ok so, this is supposed to be a step forward in technology?
Sorry, not seeing it..
 
Shouldn’t have bought those 550W Titanium PSUs :slap:

:laugh:
 
Most PSUs already have 16AWG cables, even the SF450. Plus it has three of those 8-pin connectors, and not just 6 of the 8 possible pins in the Molex housing populated with wires. So it wouldn't make a difference whether the cable is 2x6 or 1x12, since the capacity to pull that power is already there.
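To put rough numbers on that, here is a quick Python sketch. The ~8.5A per-pin figure assumes high-current Mini-Fit Jr-style terminals with 16AWG wire; that's an assumption, not an official rating, since the real limit depends on terminal plating, housing, and temperature:

```python
# Rough connector capacity math. The per-pin current is an assumption
# (~8.5 A for high-current Mini-Fit Jr terminals with 16AWG wire).
PIN_RATING_A = 8.5  # amps per live pin (assumed)
RAIL_V = 12.0       # GPU aux power is all 12 V

def capacity_w(live_pins: int) -> float:
    """Theoretical max draw for a connector with `live_pins` 12 V pins."""
    return live_pins * PIN_RATING_A * RAIL_V

print(f"1x 8-pin  (3x 12 V pins): {capacity_w(3):.0f} W")  # ~306 W
print(f"2x 8-pin  (6x 12 V pins): {capacity_w(6):.0f} W")  # ~612 W
print(f"1x 12-pin (6x 12 V pins): {capacity_w(6):.0f} W")  # same copper
```

Same number of live conductors either way, so the adapter doesn't change what the wiring can deliver.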
 
Mobos going 12V? I was sure it was 5V and 3.3V that Intel was trying to push.

Nope. They want to plop more VRMs on the mobo and go 12V only.
 
Ok so, this is supposed to be a step forward in technology?
Sorry, not seeing it..
You can't really make more of a step forward with a simple power connector. One thing this article doesn't mention is that this connector is significantly smaller than a normal 12-pin connector would be; I think it's about the size a normal 8-pin would be, or maybe even a bit smaller. It's also probably a lot easier to insert and disconnect for weaker people who might have trouble with normal-sized 12-pin connectors; these connectors can be pretty tough sometimes. But it doesn't make much sense to only offer it on the Founders Edition.
 
We don't even know yet how hot it will get. That bigger cooler might be there for a good reason.

Where do you think that heat goes that the bigger cooler dissipates? It doesn't vaporize or go to Neverland Ranch. The more effective the cooler, the more heat that is dissipated into the room. Big coolers and big wattage mean bad news for room temps.
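That's just conservation of energy: virtually every watt the PC draws ends up as heat in the room. A worst-case sketch in Python (the room size and load are assumptions, and it ignores walls, ventilation, and AC, so a real room warms far more slowly):

```python
# Sealed-room worst case: all PC power becomes heat, nothing escapes.
ROOM_VOLUME_M3 = 4 * 4 * 2.5  # modest bedroom (assumed)
AIR_DENSITY = 1.2             # kg/m^3 near room temperature
AIR_CP = 1005.0               # J/(kg*K), specific heat of air
PC_DRAW_W = 500.0             # heavy gaming load (assumed)

air_mass = ROOM_VOLUME_M3 * AIR_DENSITY             # ~48 kg of air
rise_per_hour = PC_DRAW_W * 3600 / (air_mass * AIR_CP)
print(f"~{rise_per_hour:.0f} C per hour with zero heat loss")  # ~37 C/h
```

Real rooms leak most of that through walls and doors, but the direction of the effect is not in question.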
 
Seems to be as big as an 8-pin.

[Image: NVIDIA's new 12-pin connector is super small on GeForce RTX 30 series]

Hmm, very interesting. Not long now either way. And damn, this forum loves a pitchfork.
 
Nvidia's new Ampere cooler, or its power source?

 
Where do you think that heat goes that the bigger cooler dissipates? It doesn't vaporize or go to Neverland Ranch. The more effective the cooler, the more heat that is dissipated into the room. Big coolers and big wattage mean bad news for room temps.
Never in my life have I wasted a thought on what my pc means for my room temperature... I thought most people only care about how hot the actual components are. This is a new one for me, I must admit.
 
Nope. They want to plop more VRMs on the mobo and go 12V only.

Ah yes, they're doing away with 3.3V and 5V. It might take a while for everyone to move over to that, though.

As for efficiency, we'll have to see; I'm thinking it will vary from motherboard to motherboard and manufacturer to manufacturer, which to me will make mobos even more expensive. Let's face it, it only makes a real difference for those who actually leave the PC on, so I guess it'll be good for businesses.
 
Okay, I'm all for the concept of a 12-pin connector smaller than a single 8-pin.

That at least seems a step forward, even if it's annoying to have a new proprietary design (and I think the wattage it's capable of may be deliberately overkill).
 
Never in my life have I wasted a thought on what my pc means for my room temperature... I thought most people only care about how hot the actual components are. This is a new one for me, I must admit.

Live in a hot climate?

Are you on glue?

AMD has lots of glue.
 
Isn't this just a standard 0430251200 Molex connector? That would suck.
It does look like a standard 0430251200. Nothing proprietary here. Wiring this would take a quick minute.
I guess you could use the 0430251200 Molex since the keyed plugs look identical. However, the Seasonic part plugs directly into the 8-pin of the PSU's modular connector instead of being an adapter.
I have to ask, how did you know that connector even exists?
Unfortunately, it looks like the actual "Molex" plug is physically smaller.
 
It's not just the price, it's the heat such a GPU emits too. Imagine playing something like FS 2020, which stresses the GPU to the max all the time, then add 140W for a typical Intel 8-core K CPU's gaming power consumption, and you're at 500W+ of real power consumption. That makes gaming impossible in a warmer climate during the summer months, and I'm definitely not installing AC just to use a PC.
Given how CPU-limited FS 2020 is, and how most cards above a ~2060 seem to go underutilised, I would say your example falls flat. FS 2020 is not a GPU hog in any way; most 2080 Ti users report around 40% usage.
I agree with you, just funny to use FS 2020 as an example.
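For what it's worth, the 500W figure above is easy to sanity-check with a ballpark budget (Python; every number here is an assumption for illustration, not a measurement):

```python
# Ballpark gaming-load power budget (all values assumed for illustration).
draw_w = {
    "big Ampere GPU (rumored)": 350,
    "8-core K-series CPU":      140,
    "board, RAM, fans, drives":  40,
}
dc_total = sum(draw_w.values())
print(f"DC total: {dc_total} W")                           # 530 W
print(f"At the wall (~90% eff.): {dc_total / 0.9:.0f} W")  # ~590 W
```

So 500W+ at the wall is plausible for a worst-case load, even if FS 2020 specifically won't get there.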
 
So motherboards are going 12V and now video cards as well? Interesting.

Errr, PCIe power has always been 12V only. There are 4x 3.3V power pins on the PCIe slot connector, but every other power connection is 12V.
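For context, here is roughly how the slot's own budget breaks down under the PCIe CEM spec for an x16 graphics card (a small Python sketch; figures quoted from memory, so verify against the spec before relying on them):

```python
# PCIe x16 graphics slot power limits (CEM spec, quoted from memory):
# the 3.3 V rail is capped at 3 A, the 12 V rail at 5.5 A.
slot_12v_w = 5.5 * 12.0  # 66 W
slot_3v3_w = 3.0 * 3.3   # 9.9 W
print(f"12 V rail : {slot_12v_w:.1f} W")
print(f"3.3 V rail: {slot_3v3_w:.1f} W")
print(f"Slot total: {slot_12v_w + slot_3v3_w:.1f} W")  # the familiar 75 W
```

Everything beyond that 75W comes in over the aux connectors, which are 12V only.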
 
This is the price you have to pay for that smooth 4K gaming experience. People who complain about how big this card is, how much power it will suck at max, or even the price don't seem to understand what this card is geared toward. This might be the first real 4K gaming card that actually deserves the name. But I'll reserve judgment until launch, which should be very soon.

There are no 4K cards and there never will be; there still isn't a 1080p card. The goal posts move, and viewport resolution is already far from the only major influence. What will really matter is how you render within the viewport. We already have many forms of pseudo-4K with an internal render resolution that is far lower, and even dynamic scaling, on top of all the usual LOD stuff, etc. And on top of even that, we get tech like DLSS and obviously RT.

Basically with all that piled up anyone can say he's running 4K, or 1080p, whatever seems opportune at the time :)

Good note and something that I would actually like to know. The fact that it's so big and draws so much power, is called 3090 (above the max xx80 from before) and has the new architecture must surely mean that its performance is beast. Do we even know anything about its performance yet? I feel like most of the posts are just critiquing this card without any actually relevant information.

Because if rumors of this are true, and it really offers something ridiculous like 50% more performance at half the power of Turing, that would mean roughly three times the performance per Watt of the 2080 Ti; combined with way more Watts, it could literally demolish the 2080 Ti. Why is nobody talking about this possibility? Even if it's just a rumor, it also kind of makes sense to me so far, looking at the leaks of the size, cooling, and price of this thing. If it's like 90% faster than the 2080 Ti, many people won't be able to hold on to their wallets.
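It's worth writing that arithmetic out explicitly (Python; the inputs are the rumor, not data):

```python
# Rumor arithmetic only, relative to a 2080 Ti baseline of 1.0 perf / 1.0 power.
perf = 1.5   # "50% more performance" (rumored)
power = 0.5  # "half the power" (rumored)
print(f"Perf per Watt vs Turing: {perf / power:.1f}x")  # 3.0x, i.e. +200%
# If that efficiency held while scaling back up to a 2080 Ti-class power
# budget, it would read as roughly 3x the performance -- which is exactly
# why the rumor sounds too good to be true.
```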

It's not really the right perspective, perhaps.

If Nvidia has deemed it necessary to make a very big, power-hungry GPU that even requires new connectors, it steps royally outside their very sensible product stack. If that is something they iterate on further, it only spells that Nvidia can't get a decent performance boost from node or architecture any more while doing a substantial RT push. It means Turing is the best we'll get on the architecture side, give or take some minor tweaks. I don't consider that unlikely, tbh. Like with CPUs, there is a limit to low-hanging fruit.

This is not good news. It is really quite bad because it spells stagnation more than it does progress. The fact it is called 3090 and is supposed to have a 2k price tag tells us they want to ride that top-end for quite a while and that this is their big(gest) chip already. None of that is good news if you ask me.

Another option though is that they could not secure the optimal node for this, or the overall state of nodes isn't quite up to what they had projected just yet. After all, 7nm has been problematic for quite some time.

Wow I’m glad I bought a 1000 watt power supply. The EVGA 1000 P2 is going to be in for a workout.

Lol yeah you might use all of 600W from it at peak :laugh:
 
Live in a hot climate?
I lived in a hot climate 20 years ago. Still, the idea that my PC will make my room hotter, and that I should therefore buy a lower-tier PC, never crossed my mind. Unless you live in a closet-sized apartment in Africa, but then you're not buying a $1400 card.
 
I lived in a hot climate 20 years ago. Still, the idea that my PC will make my room hotter, and that I should therefore buy a lower-tier PC, never crossed my mind. Unless you live in a closet-sized apartment in Africa, but then you're not buying a $1400 card.
LOL, try living in near-bath-level humidity where every day is over 92F and it rarely goes below 80 at night, and the air conditioning is already straining. You really do notice the difference from the heat being expelled by a hot system.
 
LOL, try living in near-bath-level humidity where every day is over 92F and it rarely goes below 80 at night, and the air conditioning is already straining. You really do notice the difference from the heat being expelled by a hot system.
Then buying a new AC should probably be a higher priority than gaming on max settings.
 
Then buying a new AC should probably be a higher priority than gaming on max settings.
You're assuming that he can get a bigger AC that runs on the same circuit. There comes a point where you either need to consider a 208/230V split-phase unit or go with a ductless split. Both of those will cost you a whole lot more money than a new gaming PC.
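To put the PC's share into AC terms (Python; the watts-to-BTU/h conversion is a fixed constant, but the loads are assumptions):

```python
# Express PC heat output in the BTU/h units air conditioners are rated in.
BTU_H_PER_WATT = 3.412   # 1 W of continuous heat = 3.412 BTU/h
pc_draw_w = 500          # assumed gaming load from earlier in the thread
small_ac_btu_h = 5000    # typical small window unit (assumed)

pc_btu_h = pc_draw_w * BTU_H_PER_WATT
print(f"PC heat: {pc_btu_h:.0f} BTU/h")                              # ~1706 BTU/h
print(f"Share of a 5000 BTU/h AC: {pc_btu_h / small_ac_btu_h:.0%}")  # ~34%
```

A 500W gaming rig eats roughly a third of a small window unit's capacity, which is why an already-straining AC notices it.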
 
Can't wait for this. It's going to be nice to just have a single bundle of wires going to the GPU.
 