Sunday, August 23rd 2020

Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters

Leading PSU manufacturer Seasonic is shipping a modular cable that confirms NVIDIA's proprietary 12-pin graphics card power connector for its upcoming GeForce "Ampere" graphics cards. Back in July we did an in-depth analysis of the connector, backed by confirmation from various industry sources that the connector is real, that it is a proprietary NVIDIA design (and not a PCI-SIG or ATX standard), and that its power output limit could be as high as 600 W. Seasonic's adapter converts two versatile 12 V 8-pin PSU-side connectors into one 12-pin connector, which lends weight to that power-output figure. On typical Seasonic modular PSUs, included cables convert one PSU-side 8-pin 12 V connector into two 6+2 pin PCIe power connectors along a single cable. HardwareLuxx.de reports that it has already received the Seasonic adapter in preparation for its "Ampere" Founders Edition reviews.

kopite7kimi, an NVIDIA leaker with an extremely high hit-rate, however predicts that the 12-pin connector was designed by NVIDIA exclusively for its Founders Edition (reference design) graphics cards, and that custom-design cards may stick to industry-standard PCIe power connectors. We recently spied a custom-design RTX 3090 PCB, and it's shown featuring three 8-pin PCIe power connectors. This seems to be additional proof that a single 12-pin connector is a really fat straw for 12 V juice. The label on the Seasonic cable's box recommends using it with PSUs rated for at least 850 W (which could very well be a system requirement for the RTX 3090). Earlier this weekend, pictures of the RTX 3090 Founders Edition surfaced, and it is huge.
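To put those figures side by side, here is a rough back-of-the-envelope budget sketch in Python. It assumes the official PCIe ratings of 75 W for the slot and 150 W for an 8-pin connector, and the Seasonic two-6+2-pins-per-PSU-side-8-pin wiring described above; the ~600 W ceiling for the 12-pin is the rumoured limit from our July analysis, not a published spec.

```python
# Rough power-budget sketch using official PCIe connector ratings.
# Spec limits only -- real connectors and cables can usually carry more.

PCIE_SLOT_W = 75    # power the PCIe x16 slot itself can supply
PCIE_8PIN_W = 150   # official 8-pin PCIe connector rating

# On Seasonic modular units, one PSU-side 8-pin normally feeds two
# card-side 6+2-pin plugs, i.e. roughly 2 x 150 W of budget per socket.
psu_side_8pin_budget_w = 2 * PCIE_8PIN_W            # 300 W

# The new adapter merges two PSU-side 8-pins into a single 12-pin plug,
# which lines up with the rumoured ~600 W ceiling for the connector.
twelve_pin_budget_w = 2 * psu_side_8pin_budget_w    # 600 W

# A custom board with three 8-pin PCIe inputs plus the slot, for comparison.
triple_8pin_budget_w = 3 * PCIE_8PIN_W + PCIE_SLOT_W  # 525 W

print(f"12-pin fed by 2x PSU-side 8-pin: ~{twelve_pin_budget_w} W")
print(f"3x 8-pin PCIe + slot:            ~{triple_8pin_budget_w} W")
```

The point of the comparison is only that both wiring schemes budget well above a typical dual 8-pin card's 375 W (2 × 150 W + 75 W from the slot).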
Source: VideoCardz

119 Comments on Picture Proof of NVIDIA 12-pin Power Connector: Seasonic Ships Modular Adapters

#101
rtwjunkie
PC Gaming Enthusiast
PowerPC: Then buying a new AC should probably be a higher priority than gaming on max settings.
LOL. Let's see: a $6,000 (+) system that is not broken, versus playing on a computer, just to keep you happy, so you can dictate to us how to spend our money? LOL, what hubris!

How presumptuous of you to decide that my AC, which is specced correctly for the house size and is not broken, should be replaced. I didn't complain; I merely told you that it can be felt in the summer months.
#102
PowerPC
rtwjunkie: LOL. Let's see: a $6,000 (+) system that is not broken, versus playing on a computer, just to keep you happy, so you can dictate to us how to spend our money? LOL, what hubris!

How presumptuous of you to decide that my AC, which is specced correctly for the house size and is not broken, should be replaced. I didn't complain; I merely told you that it can be felt in the summer months.
Dude, 20 years ago I lived on the border with Africa for three years and I didn't have AC, and having a PC in my room was the least of my problems. Chill.

If it's that important to you, go write a complaint to Nvidia lol. Maybe they'll write you a letter back apologizing that their GPUs don't cool your room yet. Your expectations are way too overblown. Do you want them to make this GPU 50% slower just so your room stays cooler? I mean, what kind of point are you even trying to make?
#103
moproblems99
PowerPC: Do you want them to make this GPU 50% slower just so your room stays cooler?
Since when did power requirements double for generational improvements?

Or why blow 300 extra watts on RTRT that many people find unappealing, or don't even notice?
#104
rtwjunkie
PC Gaming Enthusiast
PowerPC: Dude, 20 years ago I lived on the border with Africa for three years and I didn't have AC, and having a PC in my room was the least of my problems. Chill.

If it's that important to you, go write a complaint to Nvidia lol. Maybe they'll write you a letter back apologizing that their GPUs don't cool your room yet. Your expectations are way too overblown. Do you want them to make this GPU 50% slower just so your room stays cooler? I mean, what kind of point are you even trying to make?
AGAIN, I didn’t say it was important or a major issue. I simply told you that a high performing system DOES make a difference inside the house, contrary to your stated view. That is it, nothing more.

You are the one who assumed it meant more than it did, or that I was complaining, and presumed to tell me what to do with my house.
#105
Aquinus
Resident Wat-man
PowerPC: Your expectations are way too overblown.
That's your opinion, and considering
PowerPC: having a PC in my room was the least of my problems.
I suggest you consider that not everyone has the same standards and priorities that you do. I just spent $4,000 USD on an Apple laptop. Where do you think I fall in that spectrum? Without knowing anything about me or what I do, would you judge me just as you're judging @rtwjunkie?

Have some perspective, my friend. The world is a diverse place and I'd be careful about judging others unless you want to be judged yourself.
#106
R-T-B
PowerPC: Still this idea that my PC will make my room hotter and that's why I'm going to buy a lower tier PC never crossed my mind.
Either you have AC or a big room. My PC can heat up my room within 5 minutes, which is why I run it remotely in a dedicated closet.

YPMD

(Your Priorities may differ)
#107
John Naylor
I feel like I'm watching White House press releases ...

2010: Ah, you can fry an egg on [GeForce 400 series], ha ha ha

2013 - 2020: (29x - Vega) heat doesn't matter

2020: Ah, ya can fry an egg on [card that has not been released]
#108
R-T-B
John Naylor: I feel like I'm watching White House press releases ...

2010: Ah, you can fry an egg on [GeForce 400 series], ha ha ha

2013 - 2020: (29x - Vega) heat doesn't matter

2020: Ah, ya can fry an egg on [card that has not been released]
Heck, it is pretty ironic. But that happens on both sides, really. They are called fanboys and are best ignored.

On that note, heat does not matter to me MUCH (seeing as my computer lives in a data closet). But it's not an encouraging development either.
#109
Caring1
nickbaldwin86: Can't wait for this; it's going to be nice to just have a single bundle of wires going to the GPU.
I can't wait for an electrostatic induction card where no wires are visible. ;)
#110
Mussels
Freshwater Moderator
PowerPC: Then buying a new AC should probably be a higher priority than gaming on max settings.
How.... American. Why not work to reduce the heat output instead? Lower electrical cost from the PC and the AC unit, more comfort.

Brute force wastefulness is not a good ideal.
#111
moproblems99
R-T-B: Heck, it is pretty ironic. But that happens on both sides, really. They are called fanboys and are best ignored.

On that note, heat does not matter to me MUCH (seeing as my computer lives in a data closet). But it's not an encouraging development either.
I live with what I have to live with, but celebrating potentially ridiculous wattage because we'll get two RT rays instead of one is not my cup of tea. If we were doubling the 2080 Ti's raster and RTRT performance, then I might be inclined to agree...
#112
John Naylor
Chloe Price: Technically, 6-pin and 8-pin are the same; the two extra pins are just a sense pin and a ground pin. And the sense pin is a ground too.
Technically, an 8-pin cable is "officially" rated for 150 watts and a 6-pin is "officially" rated for only 75 watts... so if making a warranty claim, it's a good idea not to mention you used a 6-pin cable beyond its official rating. One reason it may not matter is that quite often the 8 pins are not really necessary and are provided just in case of heavy overclocking. Another issue to be concerned with is that cheap cable manufacturers use undersized wiring. Some quotes from the pinout sites:

"Some video cards come with the 8 pin PCI express connector to support higher wattage than the 6 pin PCI Express connectors. It's okay to plug a 6 pin PCI Express power cable into an 8 pin PCI Express connector. It's designed to work that way but will be limited to the lower wattage provided by the 6 pin version of the cable. The 6 pin cable only fits into one end of the 8 pin connector so you can't insert it incorrectly but you can sometimes force the 6 pin cable in the wrong way if you try hard enough. Video cards can sense whether you have plugged a 6 pin or 8 pin cable into an 8 pin connector so the video card can impose some kind of restriction when running with only a 6 pin power cable. Some cards will refuse to run with only a 6 pin cable in an 8 pin socket. Others will work with a 6 pin cable at normal speeds but will not allow overclocking. Check the video card documentation to get the rules. But if you don't have any other information then just assume that if your video card has an 8 pin connector then you must plug in an 8 pin cable. "

"The PCI Express 2.0 specification released in January 2007 added an 8 pin PCI Express power cable. It's just an 8 pin version of the 6 Pin PCI Express connector power cable. Both are primarily used to provide supplemental power to video cards. The older 6 pin version officially provides a maximum of 75 watts (although unofficially it can usually provide much more) whereas the new 8 pin version provides a maximum of 150 watts. It is very easy to confuse the 8 pin version with the very similar-looking EPS 8 pin power cable "

That sense wire, when used, lets the card detect which type of cable is connected and limit power as described above.

There's good info here on the subject

electronics.stackexchange.com/questions/465726/what-are-sense-pins-in-8-pin-pci-express-power-plug

Another thing worth looking at is this video:

Around the 18-minute mark, they talk about the RX 480 ... and what happened with that card was that with the 6-pin connector, power delivery through the cable was limited to 75 watts ... Because the card was drawing more than 150 watts, the extra wattage was not being drawn through the cable, it was pulled from the PCI slot ... which they explain is a very bad thing. So it's not so much a matter of whether the cable can stand it without melting, it's whether the PSU will deliver more than 75 watts. They show how the 6-pin 480 pushes way more than 75 watts through the slot ... and then later, with an 8-pin connection, more powerful cards and bigger loads don't do that.

So yes, a 6-pin cable isn't going to melt under normal circumstances. However ...
Your PSU may not deliver 150 watts to the card ... it may be limited to 75 watts, and this will impact performance.
You could wind up in a situation like with the 480 ... whereby the MoBo and / or card vendor did not provide adequate protections and the 75 watt limit of the card slot could be exceeded.

Advice?

1. Don't throw away your power cables.
2. If you do, spend the $4 on an OEM replacement cable.
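The RX 480 example above boils down to simple budget arithmetic. Below is a minimal, naive sketch of that split in Python: it assumes the cable supplies up to its official rating and the slot picks up the remainder (real cards balance their power phases differently), and the 160 W total draw is an illustrative round number, not a measurement.

```python
# Naive illustration of the budget split described above: the cable
# supplies up to its official rating and the PCIe slot covers the rest.
# Real cards distribute load across their power phases differently.

PCIE_SLOT_LIMIT_W = 75
CABLE_RATING_W = {"6-pin": 75, "8-pin": 150}

def slot_load_w(card_draw_w: float, cable: str) -> float:
    """Watts left over for the slot once the cable's rating is used up."""
    from_cable = min(card_draw_w, CABLE_RATING_W[cable])
    return card_draw_w - from_cable

for cable in ("6-pin", "8-pin"):
    slot_w = slot_load_w(card_draw_w=160, cable=cable)  # ~160 W total is illustrative
    verdict = "over the 75 W slot limit" if slot_w > PCIE_SLOT_LIMIT_W else "within spec"
    print(f"{cable}: ~{slot_w:.0f} W falls on the slot -> {verdict}")
```

With a 6-pin cable the leftover lands on the slot and blows past its 75 W budget; with an 8-pin cable the same total draw stays within spec, which is the point the comment is making.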
#113
Fluffmeister
John Naylor: I feel like I'm watching White House press releases ...

2010: Ah, you can fry an egg on [GeForce 400 series], ha ha ha

2013 - 2020: (29x - Vega) heat doesn't matter

2020: Ah, ya can fry an egg on [card that has not been released]
Bingo!

A mod told me this was REALLY NAUGHTY:
That looks like a good idea, but AMD fans want lots of connectors so they can be upset.
Welcome to the AMDsuperTechpowerUp forums. Your access may be denied.
#114
hat
Enthusiast
You can definitely heat a room with a power hungry computer. My PC doubles as a space heater for my room. The rest of the house is kept very cold... but not in here. My crappy secondary computer kicks out a little bit of heat too, but not nearly as much. I run WCG on my CPUs and mine with my graphics cards. This keeps my teeth from chattering, gives me some fake internet money, and does a little good for everyone else, too.
#115
PowerPC
moproblems99: Since when did power requirements double for generational improvements?

Or why blow 300 extra watts on RTRT that many people find unappealing, or don't even notice?
How do you know that these 300 extra watts are only going to ray tracing?
rtwjunkie: AGAIN, I didn’t say it was important or a major issue. I simply told you that a high performing system DOES make a difference inside the house, contrary to your stated view. That is it, nothing more.

You are the one who assumed it meant more than it did, or that I was complaining, and presumed to tell me what to do with my house.
I just never heard or read anyone making that claim before. That is all I was responding to. Apparently I was wrong, and most people here actually do experience this, and everybody feels physical discomfort from it. Now 100 people want to correct me. I'll just keep worrying about other things.
Aquinus: That's your opinion, and considering

I suggest you consider that not everyone has the same standards and priorities that you do. I just spent $4,000 USD on an Apple laptop. Where do you think I fall in that spectrum? Without knowing anything about me or what I do, would you judge me just as you're judging @rtwjunkie?

Have some perspective, my friend. The world is a diverse place and I'd be careful about judging others unless you want to be judged yourself.
Ok, you're really gonna read me a sermon against judging others (which I wasn't even really doing, if you turn on your brain and don't just pile on) in a forum that judges everything all the time based on some shady leaks and has actual internet wars over their favorite GPU company over said leaks.

There's a really weird "Nvidia bad" vibe going around lately. And AMD hasn't even leaked anything yet... How does that even make sense? It's like some people just don't want 4K gaming and want to live in the past. Look, I just want 4K gaming to come as soon as possible, but hey that's just me. This card has a prohibitive price for a reason. It's about pushing the envelope and about pushing 4K gaming, not about saving your electricity bill or the planet (although the price will be in its favor). It's not about keeping your room cool, either. I just hope people can understand that.
R-T-B: Either you have AC or a big room. My PC can heat up my room within 5 minutes, which is why I run it remotely in a dedicated closet.

YPMD

(Your Priorities may differ)
Ok, if it's really that fast I'll think about replacing my heating system with a 3090. By the way, that joke is hilarious, let's never stop making this joke. At least I thought it was a joke all these years, but now I'm starting to take this whole sausage grill thing seriously.
#116
R-T-B
PowerPC: in a forum that judges everything all the time based on some shady leaks and has actual internet wars over their favorite GPU company over said leaks.
This is not what we consider a positive quality of the forum, so yeah.
PowerPC: Ok, if it's really that fast I'll think about replacing my heating system with a 3090.
It only works if you live in a small space with poor ventilation. Guess where poor froggy lives? We call it the "gamer hive".

Some day, I'll escape.
PowerPC: but now I'm starting to take this whole sausage grill thing seriously.
You could certainly manage it with some clever engineering, but I'd rather just use it as a GPU.
#117
CubanB
I find this whole power-cable thing that NVIDIA has done very odd; I'm not sure what to think of it. Maybe in time it will make sense. If custom AIB GPUs come with the traditional connectors, it will be a non-issue, I guess. I'm currently using Corsair HX750i units (with 10-year warranties) on two systems and won't be changing anytime soon.
#118
kiriakost
John Naylor: Around the 18-minute mark, they talk about the RX 480 ... and what happened with that card was that with the 6-pin connector, power delivery through the cable was limited to 75 watts ... Because the card was drawing more than 150 watts, the extra wattage was not being drawn through the cable, it was pulled from the PCI slot ... which they explain is a very bad thing. So it's not so much a matter of whether the cable can stand it without melting, it's whether the PSU will deliver more than 75 watts. They show how the 6-pin 480 pushes way more than 75 watts through the slot ... and then later, with an 8-pin connection, more powerful cards and bigger loads don't do that.

So yes, a 6-pin cable isn't going to melt under normal circumstances. However ...
Your PSU may not deliver 150 watts to the card ... it may be limited to 75 watts, and this will impact performance.
You could wind up in a situation like with the 480 ... whereby the MoBo and / or card vendor did not provide adequate protections and the 75 watt limit of the card slot could be exceeded.

Advice?

1. Don't throw away your power cables.
2. If you do, spend the $4 on an OEM replacement cable.
I can hardly take their wattage analysis seriously; the method and tools they used are poor.
There is a special current clamp probe available for oscilloscopes, with the necessary 500 kHz bandwidth, which can measure the current draw at the wires alone.
Either way, the heat and the high current draw will puzzle all those planning 4K gaming even more.
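For completeness, a clamp probe of that sort only reports current; the wattage per wire bundle still has to be computed as P = V × I on the 12 V rail. A tiny illustrative sketch follows, with made-up readings (the connector names and amp values are placeholders, not measurements):

```python
# Converting clamp-probe current readings on the 12 V wires into watts.
# The readings below are made-up placeholders, not real measurements.

RAIL_VOLTAGE_V = 12.0  # nominal 12 V rail

def bundle_power_w(amps: float, volts: float = RAIL_VOLTAGE_V) -> float:
    """P = V * I for one bundle of 12 V wires."""
    return volts * amps

readings_a = {"8-pin #1": 11.0, "8-pin #2": 10.5, "slot (via riser)": 5.5}

for name, amps in readings_a.items():
    print(f"{name}: {amps:.1f} A -> {bundle_power_w(amps):.0f} W")

total_w = sum(bundle_power_w(a) for a in readings_a.values())
print(f"Total 12 V draw: ~{total_w:.0f} W")
```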
#119
John Naylor
There is a nice video out about thermal design, crosstalk, and other technical issues ... and yes, a good part of it is about the 12-pin power connector, which appears to have been included because it's significantly smaller than two 6-pins.


Don't go looking for performance numbers or something to crow about for or against ... but it provides a good tool for a beginner to start understanding how heat and other design considerations are addressed...

Thermal Design - 0:00 ==> 3:00

Mechanical Design - 3:00 ==> 4:45 (Cooling System Hold Down)

Power Handling - 4:46 ==> 6:05 (Crosstalk, PCB)

Power Connector - 6:06 ==> 6:25

Form and Function Blah Blah - 6:26 ==> end
kiriakost: I can hardly take their wattage analysis seriously; the method and tools they used are poor.
You should. This site is one that half of the YouTube hosts use to parrot "their findings".

en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect