Wednesday, August 26th 2020

NVIDIA Shares Details About Ampere Founders Edition Cooling & Power Design - 12-pin Confirmed

NVIDIA today shared the design philosophy behind the cooling solution of its next-generation GeForce "Ampere" RTX 3080 / 3090 graphics cards, which we'll hopefully learn more about on September 1, when NVIDIA has scheduled a GeForce Special Event. Part of the new video presentation shows the evolution of NVIDIA's cooling solutions over the years. NVIDIA explains the four pillars behind the design, stressing that thermals are at the heart of its innovation, and that the company looks to explore new ways to use air cooling more effectively on graphics cards. To this end, the cooling solution of the upcoming GeForce Ampere Founders Edition graphics cards features an airflow-optimized design focused on taking in fresh air, transferring heat to it, and exhausting the warm air in the most effective manner.

The next pillar of NVIDIA's cooling technology innovation is mechanical structure: minimizing the structural components of the cooler without compromising on strength. The new Founders Edition cooler introduces a low-profile leaf spring that leaves more room for a back cover. Next up is reducing electrical clutter, with the introduction of a new 12-pin power connector that is more compact, consolidates cabling, and yet does not affect the card's power-delivery capability. The last pillar is product design, which puts NVIDIA's innovations together in an airy new industrial design. The video presentation includes commentary from NVIDIA's product design engineers, who explain the art and science behind the next GeForce. NVIDIA is expected to tell us more about the next-generation GeForce Ampere at a Special Event on September 1.
Although the video does not reveal any picture of the finished product, the bits and pieces of the product's wire-frame model and the PCB wire-frame confirm the design of the Founders Edition that has been extensively leaked over the past few months. NVIDIA mentioned that all its upcoming cards with the 12-pin connector will include a free adapter that converts standard 8-pin PCIe power connectors to 12-pin, which means there's no additional cost for you. We've heard from several PSU vendors who are working on adding native 12-pin cable support to their upcoming power supplies.

The promise of backwards compatibility has further implications: there is no technical improvement other than the more compact size. If the connector works through an adapter cable with two 8-pins on the other end, its maximum power capability must be 2x 150 W, at the same current rating defined in the PCIe specification. The new power plug will certainly make graphics cards more expensive, because it is produced in smaller volume, which drives up BOM cost, plus the cost of the adapter cable. Several board partners hinted to us that they will continue using traditional PCIe power inputs on their custom designs.
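If the adapter simply parallels two 8-pin inputs, the card's power ceiling follows directly from the PCIe limits. A minimal sketch of that arithmetic (the function name is just for illustration; the wattage figures are the PCIe spec limits cited above):

```python
# Power ceiling implied by a 12-pin fed from standard PCIe inputs.
# Per the PCIe spec: 150 W per 8-pin connector, 75 W through the slot.
PCIE_8PIN_W = 150
PCIE_SLOT_W = 75

def board_power_ceiling(num_8pin_inputs: int) -> int:
    """Max board power when the 12-pin is adapted from N 8-pin connectors."""
    return num_8pin_inputs * PCIE_8PIN_W + PCIE_SLOT_W

print(board_power_ceiling(2))  # 2 x 150 W + 75 W = 375 W
```

This is why the adapter cable caps the connector at 375 W total board power, regardless of what the 12-pin itself could physically carry.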
The NVIDIA presentation follows.

Add your own comment

143 Comments on NVIDIA Shares Details About Ampere Founders Edition Cooling & Power Design - 12-pin Confirmed

#26
Caring1
john_
So, they did this?


Yes, they reversed the picture so Sapphire was spelt backwards. :laugh:
Posted on Reply
#27
AnarchoPrimitiv
QUANTUMPHYSICS
I'm gonna sell my 2080Ti on Ebay and get as much as possible for it.
Definitely going for the 3090, as my PSU can handle it.
Gonna buy it on my card, get the Rewards Flyer points for it and then write the whole thing off as a business expense.
Wow, that's a relief.... I've been losing sleep from constantly worrying over what you intended to do with your old graphics card and whether you were going to get that 3090. I think I speak for the entire community when I say that.
Vayra86
I'm curious, what is your take on their reason to increase the TDP so heavily?
Well, I've read at various sources that the 3090 will use 390+ watts, with AIBs going above that.... There are already leaks showing three 8-pin connectors on AIB models. According to Moore's Law Is Dead (if you have a negative opinion of him, keep it to yourself unless it's based on some empirical data or evidence that clearly warrants it), Nvidia had to up the TDP significantly because they're that concerned about RDNA2. Furthermore, he says that while AMD may not take the top performance crown, RDNA2 should be more efficient, and the second-biggest RDNA2 chip, based on what his leaks tell him, should be the star of the next-gen cards, offering the best performance per dollar and per watt and allowing AMD to lock up the upper mid-tier/lower top tier of video cards. He also said that he strongly encourages everyone to wait until RDNA2 is released before making any purchases, and that this is looking like Radeon's Zen 2 moment.
Posted on Reply
#28
Vayra86
AnarchoPrimitiv
Wow, that's a relief.... I've been losing sleep from constantly worrying over what you intended to do with your old graphics card and whether you were going to get that 3090. I think I speak for the entire community when I say that.



Well, I've read at various sources that the 3090 will use 390+ watts, with AIBs going above that.... There are already leaks showing three 8-pin connectors on AIB models. According to Moore's Law Is Dead (if you have a negative opinion of him, keep it to yourself unless it's based on some empirical data or evidence that clearly warrants it), Nvidia had to up the TDP significantly because they're that concerned about RDNA2. Furthermore, he says that while AMD may not take the top performance crown, RDNA2 should be more efficient, and the second-biggest RDNA2 chip, based on what his leaks tell him, should be the star of the next-gen cards, offering the best performance per dollar and per watt and allowing AMD to lock up the upper mid-tier/lower top tier of video cards. He also said that he strongly encourages everyone to wait until RDNA2 is released before making any purchases, and that this is looking like Radeon's Zen 2 moment.
Well, if all that is true, I suppose 2020 really is full of surprises.
Posted on Reply
#29
bug
Vayra86
I'm curious, what is your take on their reason to increase the TDP so heavily?
What increase? This is still equivalent to using 2 8-pin connectors which we've had for years.
Posted on Reply
#30
jesdals
Well I did not see the new cooler design in that video?
Posted on Reply
#31
Vayra86
bug
What increase? This is still equivalent to using 2 8-pin connectors which we've had for years.
Max power draw perhaps, but it also looks like a pretty beefed-up cooler, which doesn't suggest we'll stick to 250 W... There is 375 W available.
jesdals
Well I did not see the new cooler design in that video?
A tiny, tiny glimpse in the last few seconds. This presentation is a whole lot of nothing :D Gotta get that hype train rollin
Posted on Reply
#33
ppn
Part of the backside is shown for 1 second at the very end. There may be other hidden messages, apart from the 12-pin standing up ridiculously next to the PCB's fishtail cutout.
Posted on Reply
#34
bug
Vayra86
Max power draw perhaps, but it also looks like a pretty beefed-up cooler, which doesn't suggest we'll stick to 250 W... There is 375 W available.
Yeah, that cooler... I don't buy 3-slot video cards on principle.

One more week of guessing and then we'll see.
My money's on the 3090 being some freak like the 2080Ti and rest of the cards being much saner. But still priced close to their Turing counterparts :(
Posted on Reply
#35
Vayra86
bug
Yeah, that cooler... I don't buy 3-slot video cards on principle.

One more week of guessing and then we'll see.
My money's on the 3090 being some freak like the 2080Ti and rest of the cards being much saner. But still priced close to their Turing counterparts :(
Yeah, it's the only way out for them lol. Agreed
Posted on Reply
#36
TheDeeGee
Standing up 12-Pin, R.I.P. Arctic Cooling.
Posted on Reply
#37
Jinxed
I'd like to remind you that high power draw, unlike in AMD's case, does not mean low power efficiency. It may very well be that with Ampere, Nvidia increased power efficiency by 50% or even 100% AND at the same time decided to feed a lot more power to the GPU, just because they can. What would that mean in the end? Extreme performance. If you double your performance per watt and at the same time double the power input, you get quadruple the performance. All you need is a GPU that can take it.
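The claim above is just multiplication. Treating performance as perf-per-watt times watts (a deliberately naive model; real GPUs do not scale this linearly, so take it as an upper bound):

```python
# Naive model: performance = efficiency (perf/W) x power (W).
# Doubling both multipliers quadruples the product.
def perf_multiplier(efficiency_gain: float, power_gain: float) -> float:
    return efficiency_gain * power_gain

print(perf_multiplier(2.0, 2.0))   # 4.0: double perf/W at double power
print(perf_multiplier(1.5, 1.25))  # 1.875: +50% perf/W at +25% power
```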

It remains to be seen how much Nvidia really improved perf/watt and how much they increased power input (all we have now are just rumours). But just because AMD has to drive the clocks of their GPUs well past their optimal efficiency point in order to be even remotely competitive, it doesn't mean Nvidia is doing the same.

Edited to remove trolling/name-calling that does not conform with forum guidelines. Please keep this in mind when posting here in the future. - TPU Moderation Team
Posted on Reply
#38
chodaboy19
People are always afraid of change; the initial reaction always follows the Kübler-Ross model, with emotions in the following order: denial, anger, bargaining, depression, and acceptance. :)
Posted on Reply
#39
Jinxed
chodaboy19
People are always afraid of change; the initial reaction always follows the Kübler-Ross model, with emotions in the following order: denial, anger, bargaining, depression, and acceptance. :)
So true!
Posted on Reply
#41
Jism
kiriakost
You are in luck: I am an electrician (plus electronics), and I recently did my research on wire and connector wattage.
Each pair of wires (per Molex pin in use) is designed to carry 5 A in continuous 24/7 operation.
NVIDIA intends to draw up to 30 A from the new connector, but that is planning ahead.
They mention a supplied adapter (cable) that can use two 8-pins, each able to deliver 15 A.

The absolute maximum current specification for a wire pair on a Molex pin is 7 A.
I do not think they will push the envelope that far for now.
The PCI-E specifications are so conservative that you can push a lot more through than intended. A capable PSU, connectors, wires, and video card can pull way more than the advertised 75/150 W. I mean, even my OC'ed 580 managed to pull 22 A from one single 8-pin connector; it got warm, yes, lol.
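Taking the per-pair current figures quoted above at face value, the connector's envelope is just pairs x amps x 12 V. A rough sketch (the six power pairs come from the leaked 12-pin pinout, and the current ratings are the ones quoted in this thread, not Molex datasheet values):

```python
# Rough connector wattage from per-pair current ratings (figures from this
# thread, not a datasheet). The 12-pin carries six 12 V / ground pairs.
RAIL_V = 12.0

def connector_watts(power_pairs: int, amps_per_pair: float) -> float:
    return power_pairs * amps_per_pair * RAIL_V

print(connector_watts(6, 5.0))  # 360.0 W at the 5 A continuous rating
print(connector_watts(6, 7.0))  # 504.0 W at the 7 A absolute maximum
```

So even at the conservative continuous rating, the connector itself has headroom well beyond what the dual-8-pin adapter allows it to draw.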



But if you think about it, the reason Nvidia introduced this one connector is that it was designed for the enterprise market and simply carried over to the gaming side. They no longer have to make two different models of cards for enterprise and consumer. The cards that do not pass enterprise quality checks are moved over to the gamer ones. Nothing really gets lost in either market.
Posted on Reply
#42
M2B
According to the leaked benchmarks (Time Spy Extreme), the 3090 is up to 56% faster than the 2080 Ti; even if it consumes 350 W, it should end up being 15% or so more efficient, which is not bad if Samsung's 10 nm is being used, a process miles behind TSMC's N7P in performance and efficiency.
If their cooling solution can indeed handle the 350 W TDP with acceptable thermals and noise levels, then they've done an amazing job.
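That 15% figure can be sanity-checked by dividing relative performance by relative power draw. A sketch using the rumored numbers (the ~260 W 2080 Ti baseline is an assumption; every input here is a leak, not a spec):

```python
# Perf-per-watt comparison from rumored figures; nothing here is official.
def perf_per_watt_gain(perf_ratio: float, new_w: float, old_w: float) -> float:
    """Relative perf/W improvement of the new card over the old one."""
    return perf_ratio / (new_w / old_w) - 1.0

# Rumored: 3090 is 1.56x a 2080 Ti (~260 W) while drawing 350 W.
gain = perf_per_watt_gain(1.56, 350.0, 260.0)
print(f"{gain:.0%}")  # ~16%, in line with the "15% or so" estimate
```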
Posted on Reply
#43
Jinxed
M2B
According to the leaked benchmarks (Time Spy Extreme), the 3090 is up to 56% faster than the 2080 Ti; even if it consumes 350 W, it should end up being 15% or so more efficient, which is not bad if Samsung's 10 nm is being used, a process miles behind TSMC's N7P in performance and efficiency.
If their cooling solution can indeed handle the 350 W TDP with acceptable thermals and noise levels, then they've done an amazing job.
And those results are also a bit dated now, probably made with pre-production versions of the card with testing drivers. It may have improved a lot since then.
Posted on Reply
#44
Turmania
If they can make the RTX 3070 deliver around 2080 Ti performance at around 150 W power consumption, then it's game over for AMD.
Posted on Reply
#45
Emu
Jism
The PCI-E specifications are so conservative that you can push a lot more through than intended. A capable PSU, connectors, wires, and video card can pull way more than the advertised 75/150 W. I mean, even my OC'ed 580 managed to pull 22 A from one single 8-pin connector; it got warm, yes, lol.



But if you think about it, the reason Nvidia introduced this one connector is that it was designed for the enterprise market and simply carried over to the gaming side. They no longer have to make two different models of cards for enterprise and consumer. The cards that do not pass enterprise quality checks are moved over to the gamer ones. Nothing really gets lost in either market.
If people are worried about their cards pulling over 150 W per 8-pin connector, then they really shouldn't look up the R9 295X2, which pulled up to 225 W per 8-pin connector lol
Posted on Reply
#46
KarymidoN
W1zzard
2x 8-pin. 150 W max power for the card (+75 W slot) would just not be enough
Basically they're saying the max power draw from a card with the new 1x 12-pin connector is 375 W (2x 150 W 8-pin + 75 W from the PCIe slot), right?

Let's see what the real power draw is (after reviews); I hope they just left a lot more capacity on the connector to be used.
Posted on Reply
#47
chodaboy19
KarymidoN
Basically they're saying the max Power draw from a card with 1x 12pin new connector is 675W? (2x 300W 8pin + 75W From PCIE Power)?
lets see what the real power draw is, i hope they just left a lot more capacity on the connector to be used.
No, Nvidia hasn't released any power consumption data. These numbers are just what people are guessing.

But it's assumed the power consumption can reach: (150W x 2 ) + 75W = 375W
Posted on Reply
#48
Dante Uchiha
Even though it is an abnormally expensive product, it will be interesting to see how efficient this exotic cooling is. Where's AMD's answer?
Posted on Reply
#49
psyclist
So they stole Sapphire's design, kudos Nvidia lol. For those tower-cooler folks, this feeds the hot air directly into the tower's intake; I wonder how that affects CPU temps. 375 W is gonna make for some pretty hot air being fed directly into the CPU cooling solution.
Posted on Reply
#50
Xuper
So all the heat will be dumped onto the CPU? Wow, what an innovation!
Posted on Reply
Add your own comment