
NVIDIA Shares Details About Ampere Founders Edition Cooling & Power Design - 12-pin Confirmed

The 12-pin still looks ridiculously stupid and a huge waste given the card's freakishly large size. Like they couldn't find the real estate on there?
 
I'm gonna sell my 2080 Ti on eBay and get as much as possible for it.
Definitely going for the 3090, as my PSU can handle it.
Gonna buy it on my card, get the Rewards Flyer points for it, and then write the whole thing off as a business expense.

Wow, that's a relief... I've been losing sleep from constantly worrying over what you intended to do with your old graphics card and whether you were going to get that 3090. I think I speak for the entire community when I say that.

I'm curious, what's your take on their reason for increasing the TDP so heavily?

Well, I've read at various sources that the 3090 will use 390+ watts, with AIB cards going above that... there are already leaks showing three 8-pin connectors on AIB models. According to Moore's Law Is Dead (if you have a negative opinion of him, keep it to yourself unless it's based on empirical data or evidence that clearly warrants it), Nvidia had to up the TDP significantly because they're that concerned about RDNA2. Furthermore, he says that while AMD may not take the top performance crown, RDNA2 should be more efficient, and the second-biggest RDNA2 chip, based on his leaks, should be the star of the next-gen cards, offering the best performance per dollar and per watt and letting AMD lock up the upper mid-tier/lower top tier of video cards. He also strongly encourages everyone to wait until RDNA2 is released before making any purchases, saying this is looking like Radeon's Zen 2 moment.
 

Well, if all that is true, I suppose 2020 really is full of surprises.
 
What increase? This is still equivalent to using two 8-pin connectors, which we've had for years.
 
Well, I didn't see the new cooler design in that video?
 
Max power draw perhaps, but it also looks like a pretty beefed-up cooler, which doesn't suggest we'll stick to 250 W... there's 375 W available.

A tiny, tiny glimpse in the last few seconds. This presentation is a whole lot of nothing :D Gotta get that hype train rollin'.
 
Part of the backside is shown for one second at the very end. There may be other hidden messages, apart from the 12-pin standing up ridiculously next to the PCB's fishtail cutout.
 
Yeah, that cooler... I don't buy 3-slot video cards on principle.

One more week of guessing and then we'll see.
My money's on the 3090 being some freak like the 2080 Ti and the rest of the cards being much saner, but still priced close to their Turing counterparts :(
 
Yeah, it's the only way out for them lol. Agreed.
 
Standing-up 12-pin, R.I.P. Arctic Cooling.
 
I'd like to remind you that high power draw, unlike in AMD's case, does not mean low power efficiency. It may very well be that with Ampere, Nvidia increased power efficiency by 50% or even 100% and at the same time decided to feed a lot more power to the GPU, just because they can. What would that mean in the end? Extreme performance. If you double your performance per watt and at the same time double the power input, you get quadruple the performance. All you need is a GPU that can take it.

It remains to be seen how much Nvidia really improved performance per watt and how much they increased power input (all we have now are rumours). But just because AMD has to drive the clocks of their GPUs well past the optimal efficiency point to be even remotely competitive, it doesn't mean Nvidia is doing the same.
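
To make that arithmetic explicit (purely hypothetical numbers, not actual Ampere figures):

\[ \text{perf} = \frac{\text{perf}}{\text{W}} \times \text{W}, \qquad \left(2 \times \frac{\text{perf}}{\text{W}}\right) \times (2 \times \text{W}) = 4 \times \text{perf}. \]

For example, a 250 W card with efficiency E delivers 250E worth of performance; at double the efficiency and 500 W, it delivers 1000E, i.e. 4x.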

Edited to remove trolling/name-calling that does not conform with forum guidelines. Please keep this in mind when posting here in the future. - TPU Moderation Team
 
Last edited by a moderator:
People are always afraid of change; the initial reaction always follows the Kübler-Ross model, with emotions in this order: denial, anger, bargaining, depression, and acceptance. :)
 

It's right there at ~2:25. Also, no crazy airflow apocalypse in the case as some suggested; the back fan is in a pull config. Edit: Or they revised the design and put the second fan on the front like a sane person would do...
(attached image: 3080.png)
 
You're in luck, I'm an electrician (plus electronics) and I did my research on wire and connector power ratings very recently.
Each pair of wires (per Molex pin in use) is designed to carry 5 A in continuous 24/7 operation.
NVIDIA intends to draw up to 30 A from the new connector, but that is planning ahead.
They mention a supplied adapter (cable) that can use two 8-pins, each able to deliver 15 A.

Per the absolute maximum specifications, a pair of wires on a Molex pin tops out at 7 A.
I do not think they will push the envelope that far for now.
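
A quick back-of-the-envelope sketch of those budgets, using the 5 A continuous / 7 A absolute-max figures above; the pin counts (six +12 V pairs on the 12-pin, three on a PCIe 8-pin) are common assumptions on my part, not official Molex/NVIDIA specs:

```python
# Rough connector power budgets from per-pair current ratings.
# Current figures are the poster's (5 A continuous, 7 A absolute max per
# wire pair); pin counts are assumed, not official specs.

RAIL_VOLTAGE = 12.0  # PCIe power connectors carry +12 V

def connector_watts(power_pairs: int, amps_per_pair: float) -> float:
    """Deliverable power for a connector with the given number of wire pairs."""
    return power_pairs * amps_per_pair * RAIL_VOLTAGE

for name, pairs in (("12-pin", 6), ("8-pin", 3)):
    continuous = connector_watts(pairs, 5.0)  # continuous rating
    absolute = connector_watts(pairs, 7.0)    # absolute maximum
    print(f"{name}: {continuous:.0f} W continuous, {absolute:.0f} W absolute max")

# Output:
# 12-pin: 360 W continuous, 504 W absolute max
# 8-pin: 180 W continuous, 252 W absolute max
```

Note that the 12-pin's six pairs at 5 A each is exactly the 30 A mentioned above.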

The PCI-E specifications are so super safe that you can push a lot more through than intended. A capable PSU, connectors, wires, and video card can pull way more than the advertised 75/150 W. I mean, even my OC'd 580 managed to pull 22 A from one single 8-pin connector. It got warm, yes, lol.

But if you think about why Nvidia introduced this "one" connector: it's designed for the enterprise market and simply pushed over to the gaming side. They no longer have to make two different models of cards for enterprise and consumer. The cards that don't pass enterprise quality get moved over to the gamer ones. Nothing really gets lost in either market.
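
For scale, that 22 A on the 12 V rail works out to

\[ P = I \times V = 22\,\text{A} \times 12\,\text{V} = 264\,\text{W}, \]

roughly 1.76x the 150 W an 8-pin nominally advertises.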
 
According to the leaked benchmarks (Time Spy Extreme), the 3090 is up to 56% faster than the 2080 Ti. Even if it consumes 350 W, it should end up being 15% or so more efficient, which is not bad if Samsung's 10 nm-class process is being used, as it's miles behind TSMC's N7P in performance and efficiency.
If their cooling solution can indeed handle the 350 W TDP with acceptable thermals and noise levels, then they've done an amazing job.
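
Rough math behind that 15% figure, assuming roughly 260 W board power for the 2080 Ti FE (that baseline is my assumption, not part of the leak):

\[ \frac{(\text{perf}/\text{W})_{3090}}{(\text{perf}/\text{W})_{2080\,\text{Ti}}} = 1.56 \times \frac{260\,\text{W}}{350\,\text{W}} \approx 1.16, \]

i.e. about 15-16% better performance per watt.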
 
And those results are also a bit dated now, probably made with pre-production versions of the card and early drivers. It may have improved a lot since then.
 
If they can make the RTX 3070 with around 2080 Ti performance at around 150 W power consumption, then it's game over for AMD.
 
If people are worried about their cards pulling over 150 W per 8-pin connector, then they really shouldn't look up the R9 295X2, which pulled up to 225 W per 8-pin connector lol
 
2x 8-pin: 150 W max power for the card (+75 W from the slot) would just not be enough.

Basically they're saying the max power draw from a card with the new 1x 12-pin connector is 375 W (2x 150 W 8-pin + 75 W from the PCIe slot), right?

Let's see what the real power draw is (after reviews); I hope they just left a lot more capacity on the connector to be used.
 
Last edited:
Basically they're saying the max power draw from a card with the new 1x 12-pin connector is 675 W (2x 300 W 8-pin + 75 W from the PCIe slot)?
Let's see what the real power draw is; I hope they just left a lot more capacity on the connector to be used.

No, Nvidia hasn't released any power consumption data. These numbers are just what people are guessing.

But it's assumed the power consumption can reach (150 W × 2) + 75 W = 375 W.
 
Even though it is an abnormally expensive product, it will be interesting to see how efficient this exotic cooling is. Where's AMD's answer?
 
So they stole Sapphire's design, kudos Nvidia lol. For the tower cooler folks: this feeds that hot air directly into the tower cooler's intake, so how does that affect CPU temps, I wonder? 375 W is gonna make for some pretty hot air being fed straight into the CPU cooling solution.
 