Wednesday, April 27th 2022

NVIDIA Allegedly Testing a 900 Watt TGP Ada Lovelace AD102 GPU

With the release of Hopper, NVIDIA's cycle of new architecture releases is not yet over. Later this year, we expect to see the next-generation gaming architecture, codenamed Ada Lovelace. According to @kopite7kimi, a well-known leaker of NVIDIA products on Twitter, the green team is reportedly testing a potent variant of the upcoming AD102 SKU. As the leak indicates, we could see an Ada Lovelace AD102 SKU with a Total Graphics Power (TGP) of 900 Watts. While we don't know where this SKU is supposed to sit in the Ada Lovelace family, it could mark the comeback of the most powerful, Titan-like design. Alternatively, it could be a GeForce RTX 4090 Ti SKU. Alongside the monstrous TGP, it carries 48 GB of GDDR6X memory running at 24 Gbps. Feeding the card are two 16-pin power connectors.

Another confirmation from the leaker is that the upcoming RTX 4080 GPU uses the AD103 SKU, while the RTX 4090 uses AD102. For further information, we will have to wait a few more months to see what NVIDIA decides to launch in the upcoming generation of gaming-oriented graphics cards.
Sources: @kopite7kimi (Twitter), via VideoCardz

102 Comments on NVIDIA Allegedly Testing a 900 Watt TGP Ada Lovelace AD102 GPU

#26
Slizzo
Vya Domus: No "G" in the codename most likely means this isn't supposed to be a consumer product; A100, for example, was a completely different architecture from the Ampere used in consumer products. That doesn't mean I don't believe they are capable of introducing cards with such ridiculous TGPs to the masses.
I believe they're skipping the G naming of GPUs this time around to avoid confusion with GA (Ampere).
#27
ppn
People complained the same way at the time of the Fermi GTX 480, but they got used to it. Nobody cares, just like with deficit spending.
Just give me a 2 kW power supply and 2 kW of solar power on the roof.
#28
napata
Vya Domus: No "G" in the codename most likely means this isn't supposed to be a consumer product; A100, for example, was a completely different architecture from the Ampere used in consumer products. That doesn't mean I don't believe they are capable of introducing cards with such ridiculous TGPs to the masses.
The AD chips are definitely the consumer products. No "G" doesn't mean anything; see TU102, for example.
#29
Vya Domus
Slizzo: I believe they're skipping the G naming of GPUs this time around to avoid confusion with GA (Ampere).
Even if that's the case there is no guarantee this is in fact a chip meant for consumer products.
#30
defaultluser
ppn: People complained the same way at the time of the Fermi GTX 480, but they got used to it. Nobody cares, just like with deficit spending.
Just give me a 2 kW power supply and 2 kW of solar power on the roof.
Sure they did; why, then, do the vast majority of shipping gaming systems use a 3070-level TGP or less?

We only accepted more powerful cards once they upgraded the shit out of the cooling systems (post-Pascal), and I wouldn't expect anything air-cooled to exceed 450 W in three slots anytime soon! Fermi popularized stock dual-slot coolers, but triples are not going to become commonplace anytime soon!
#31
Chrispy_
Punkenjoy: It's true that more energy-intensive machines now do more. But they do WAY more and are still efficient when you compare the amount of energy in versus the work done.

With these cards, if a card running at 900 W were two times faster than a card running at 300 W, I could say, OK, maybe we are onto something. But it's more likely that this card will be less than 50% faster than a 300 W card, and that's not progress.

I really doubt that it will be more than 50% more powerful than the same chip running at 300 W, but I may be wrong. Also, we need to consider that these are gaming cards. They don't build infrastructure or houses. They are just entertainment machines.

To me, 250 W is the sweet spot and 300-350 W is really the upper limit of what I can tolerate. I wouldn't want a 900 W heater in my PC. I would have to cool the case, then find a way to cool the room. If you don't have air conditioning in your room, forget it during the summer.

To me, it's just Nvidia not wanting to lose the max-performance crown next generation by being slow on chiplets, and taking crazy decisions to stay on top.
Diminishing returns is the phrase you are looking for.
Especially with graphics cards, you get less additional performance for each Watt of power that's added.

I've graphed it in the past and the sweet spot of maximum efficiency is usually around 2/3rds of the clockspeed and half the power consumption. That's been true for both AMD and Nvidia GPUs since about 2016 so there's no reason to suspect that it won't also apply to Lovelace on the new process node.
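A toy model illustrates the shape of that curve. Assuming dynamic power scales roughly as frequency times voltage squared, and that voltage tracks clock linearly down to a floor (standard approximations, with made-up numbers rather than measured data from any card):

```python
# Toy GPU power model: dynamic power ~ f * V^2, with a voltage floor.
# All numbers are illustrative, not measured from real hardware.
def relative_power(clock_frac: float, v_floor: float = 0.7) -> float:
    """Power at a given fraction of stock clock, relative to stock."""
    v = max(v_floor, clock_frac)  # voltage can't drop below the floor
    return clock_frac * v * v

stock = relative_power(1.0)
sweet = relative_power(2 / 3)
print(f"2/3 clock -> {sweet / stock:.0%} of stock power")
print(f"perf/W gain at 2/3 clock: {(2 / 3) / (sweet / stock):.2f}x")
```

In this crude model most of the savings come from the V² term, which is why the last few hundred MHz of clock speed are disproportionately expensive in Watts.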
#32
ppn
Absolutely; the die size is the same as GA102, and N5 technology provides about 20% faster speed than N7, or about 40% power reduction.
And compared to its 10 nm FinFET process, TSMC's 7 nm FinFET features 1.6x logic density, ~20% speed improvement, and ~40% power reduction; 10 nm in turn offers 2x the logic density of its 16 nm predecessor, along with ~15% faster speed and ~35% less power consumption.

So I don't know exactly how many times faster and/or lower-power that makes in total, but since Nvidia skipped 10 nm and 7 nm, AD102 should be running at 1.20 * 1.20 * 1.15, or about 1.65x. Chips were already running at 2 GHz on 16 nm, so we get about 3.3 GHz. AD103 at the same power/size as a 1080 Ti: 0.6 * 0.6 * 0.65 takes 250 W down to about 60 watts when downclocked to 2 GHz.
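Taking the per-node figures quoted in the comment above at face value, the compounding works out as below. Note these are TSMC marketing numbers as relayed in the comment, and at each node the speed gain and the power reduction are alternatives (same power vs. same speed), not both at once; the values here are purely illustrative.

```python
# Compound the quoted per-node gains across 16nm -> 10nm -> 7nm -> 5nm.
# Figures are the marketing numbers cited in the comment, not measurements.
speed_gains = [1.15, 1.20, 1.20]  # "faster at the same power", per node
power_cuts = [0.65, 0.60, 0.60]   # "power at the same speed", per node

speed_factor = 1.0
for g in speed_gains:
    speed_factor *= g

power_factor = 1.0
for c in power_cuts:
    power_factor *= c

print(f"compound speed: {speed_factor:.2f}x")  # ~1.66x
print(f"compound power: {power_factor:.2f}x")  # ~0.23x
print(f"2 GHz on 16nm -> {2.0 * speed_factor:.1f} GHz")
print(f"250 W at 2 GHz -> {250 * power_factor:.1f} W")
```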
#33
CyberCT
Hopefully these power-hungry cards undervolt well. My 3070 Ti is undervolted to 0.875 V max at 187X MHz. I can barely hear the GPU fans from under the TV, and the GPU doesn't get hotter than 62°C. Using the Kill-A-Watt, it consumes roughly 70 watts less than stock for a 3-4% decrease in stock performance. Definitely worth the trade-off to me.
#34
phanbuey
This is like anti-marketing.

"How can we get people to instantly dislike our products without them even launching yet"

"Leak out some 900W board tests..."
#35
AusWolf
Vya Domus: No "G" in the codename most likely means this isn't supposed to be a consumer product; A100, for example, was a completely different architecture from the Ampere used in consumer products. That doesn't mean I don't believe they are capable of introducing cards with such ridiculous TGPs to the masses.
Not necessarily. They couldn't use GA again because they already used it for Ampere, just like they couldn't use GT for Turing because of Tesla, hence TU.
PapaTaipei: This should be illegal.

On a more serious note, this will definitely be an xx90 chip. Buyers of those don't give a damn about power consumption or heat. The majority of us, mere mortals, should focus on xx60 and xx70 cards instead.
#36
Pumper
Why is this shit allowed? If cars are getting stricter fuel economy and pollution requirements, why is PC hardware any different? You'd think that progress would lead to lower TDP, not the other way round.
#37
Hjadams973514
PapaTaipei: This should be illegal.
It does not have to be illegal. Anyone dumb enough to buy a 900 W card for gaming on residential power deserves whatever happens to them as a result of the purchase. Which is why I don't believe this rumor at all. You simply can't cool a 900 W card. You simply can't power a 900 W card on residential power unless you are willing to turn off everything else in the room other than your PC, monitor, and internet modem. And let's not even get started on your power bill if you game any more than 2 hours a week, which I assume you do if you are interested in this card... The card simply won't sell, it's as simple as that.
#38
PapaTaipei
Pumper: Why is this shit allowed? If cars are getting stricter fuel economy and pollution requirements, why is PC hardware any different? You'd think that progress would lead to lower TDP, not the other way round.
Correct
#40
freeagent
defaultluser: Fermi popularized stock dual-slot coolers, but triples are not going to become commonplace anytime soon!
My Fermi is a triple slot card, my other one was too :D

Honestly.. they lost me at 400w. I have no interest in the next gen cards from either manufacturer.. don’t care how many times stronger it is compared to my 3070 Ti. If you have no problems with running a 500w+ GPU.. all the power to you no pun intended :D

So much for going greener.. the boys running those companies have been dipping into the green a bit too much if you ask me, and I’m not talking about money either lol..
#41
Unregistered
Graphics cards are starting to get into the custom-loop-only realm.
#42
ppn
If you strive for silent operation, 400 W means 4 slots; a 4080 on 4 nm is perfectly doable on air. They just have to reuse the 3090 Ti coolers.
And so on: 300 W is triple-slot, and 900 watts would be 9 slots, or a wall of 8x8x8 fans like the one Captain Workplace made fun of.
And the 4070 is such a joke again with its 192-bit bus that I have to opt for a 4080 now.
#43
TheDeeGee
I won't go above 250 Watts, so I guess that means an RTX 4030.
#44
ARF
This is a psychological / political war against all of us; yes, there is politics in technology. The ultimate goal is to keep us all under control and enslaved.

You know, the world is changing towards "eco" modes, electric cars, etc. The world is turning its back on fossil fuels... and this is the elites' revenge - your choice now is to hate electricity and its prices...

I have no other explanation.

This is ridiculous, and if I were in power, I would put Nvidia in the courts and not allow them to leave until they become human again.
#46
anachron
Hjadams973514: It does not have to be illegal. Anyone dumb enough to buy a 900 W card for gaming on residential power deserves whatever happens to them as a result of the purchase. Which is why I don't believe this rumor at all. You simply can't cool a 900 W card. You simply can't power a 900 W card on residential power unless you are willing to turn off everything else in the room other than your PC, monitor, and internet modem. And let's not even get started on your power bill if you game any more than 2 hours a week, which I assume you do if you are interested in this card... The card simply won't sell, it's as simple as that.
I certainly won't buy a 900 W card, but I don't see how it's an issue for residential power. A single socket can deliver 3520 W here; even with the whole computer and screens, you will still have no issue. You may eventually need a higher subscription than the medium 6 kVA I have if you intend to cook and heat at the same time as you are playing, but that's pretty much it. I don't think the kind of people who would be interested in this card would mind the price difference.
#47
80251
R0H1T: It's not increasing that much with these stupidly overclocked cards; look at Apple ~ go wider or go home! This is why HBM(3) is the way to go, it's a lot more efficient than these monster power-hungry GDDR6X cards.
What you said makes perfect sense, but why did AMD abandon HBM for their consumer cards? And while the bandwidth on HBM is incredible relative to any GDDRx flavour what is the latency like?
#48
TheinsanegamerN
DeathtoGnomes: I wonder if Nvidia considered that consumers have to pay for electricity. I could not justify paying for an additional 1 kW of service.
I wonder if Ferrari considered that consumers have to pay for gasoline. I could not justify paying for an additional 10 gallons of gas.

I wonder if pool companies considered that consumers have to pay for water. I could not justify paying for an additional 100 gallons of service.

Etc., etc. If the power draw of a GPU means your power bill will be a concern, you shouldn't be buying a likely $2500+ video card. Compared to a 400 W GPU like the 3090, it's literally a $5-6 difference over an entire YEAR unless you are playing 8+ hours a day, in which case your streaming will more than make up the difference.
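For anyone who wants to sanity-check electricity-cost claims like the one above, the arithmetic is simple. The sketch below assumes a hypothetical $0.12/kWh rate; substitute your local price and hours.

```python
# Rough yearly cost of a GPU's *extra* power draw over a baseline card.
# The $/kWh rate is a placeholder; electricity prices vary widely.
def extra_cost_per_year(extra_watts, hours_per_day, usd_per_kwh=0.12):
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# A 900 W card versus a ~400 W card, i.e. 500 W of extra draw:
print(f"2 h/day: ${extra_cost_per_year(500, 2):.2f}/yr")
print(f"8 h/day: ${extra_cost_per_year(500, 8):.2f}/yr")
```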
#49
80251
mashie: That picture is of an Nvidia H100 CNX DPU: www.nvidia.com/en-gb/data-center/h100cnx/
Why does that HPC GPU have on-board 200 Gb and 400 Gb Ethernet ports?

Never mind, the marketing material has an answer for that:

The H100 CNX alleviates this problem. With a dedicated path from the network to the GPU, it allows GPUDirect® RDMA to operate at near line speeds. The data transfer also occurs at PCIe Gen5 speeds regardless of host PCIe backplane. Scaling up GPU power in a host can be done in a balanced manner, since the ideal GPU-to-NIC ratio is achieved. A server can also be equipped with more acceleration power, because converged accelerators require fewer PCIe lanes and device slots than discrete cards.
#50
TheinsanegamerN
80251: What you said makes perfect sense, but why did AMD abandon HBM for their consumer cards?
Simple: their two HBM card lines, the Fury and Vega, lost to their respective Nvidia rivals, the GDDR5-powered 980 Ti and the GDDR5X-powered 1080/1080 Ti. Neither card was powerful enough for HBM to help in any way.
80251: And while the bandwidth on HBM is incredible relative to any GDDRx flavour, what is the latency like?
The latency is high, but GDDR memory tends to have high latency anyway. GPUs are typically more about pure bandwidth for gaming applications. IIRC HBM3 latency is comparable to GDDR6X.