
NVIDIA Allegedly Testing a 900 Watt TGP Ada Lovelace AD102 GPU

As much as I hate the idea of greater-than-500 W GPUs, this will never happen, and honestly it shouldn't. It's not the state's place to tell end users what wattages are acceptable for their electronics; otherwise we'd end up banning all sorts of kitchen appliances.

There is the EU energy efficiency labeling, though. It basically tells you which appliances are more efficient: the closer the rating is to A+++, the better.

 
As much as I hate the idea of greater-than-500 W GPUs, this will never happen, and honestly it shouldn't. It's not the state's place to tell end users what wattages are acceptable for their electronics; otherwise we'd end up banning all sorts of kitchen appliances.

Any GPU above 500 watts is dangerous for power supplies. Nvidia should accept defeat in efficiency to AMD. Chiplets will beat the monolithic design.

No, that's completely down to the design choices of the cores, not a chiplet vs. monolithic problem.
 
I think the math might be off here, going by the wiki example (because I have to use a source to prove my point).

So that's 1 kW × 4 hours = 4 kWh per day, which costs 80 cents (someone else's figure); times 365, that's $292 per year, or about $24 per month. Of course your current system usage is not figured in, so let's assume you currently use 450 W; that works out to roughly a $12-per-month increase. Not a huge sum for 4 hours of gameplay. Yeah, I'd be OK with that, but I don't usually game for just 4 hours; being retired means I have more time to game if I want to.

EDIT: I do know that after 17 kWh per day I incur an additional rate on top of the base rate. Rates also change during peak hours.
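For anyone who wants to double-check those numbers, here is a minimal Python sketch. Assumptions, not facts from the post: a flat $0.20/kWh rate inferred from the "4 kWh costs 80 cents" figure, 4 hours of gaming per day all year, and the 450 W current draw the post assumes; tiered and peak rates are ignored.

```python
# Sanity check of the figures above. Assumed: flat $0.20/kWh (inferred from
# "4 kWh costs 80 cents"), 4 hours of gaming per day, 365 days a year.
RATE_USD_PER_KWH = 0.20
HOURS_PER_DAY = 4

def yearly_cost(watts: float) -> float:
    """Yearly electricity cost in USD for a given sustained draw while gaming."""
    kwh_per_day = watts / 1000 * HOURS_PER_DAY
    return kwh_per_day * RATE_USD_PER_KWH * 365

new_system = yearly_cost(1000)   # ~1 kW total draw -> $292.00/yr, ~$24.33/mo
old_system = yearly_cost(450)    # assumed current 450 W draw -> $131.40/yr
print(f"1 kW system:  ${new_system:.2f}/yr (${new_system / 12:.2f}/mo)")
print(f"450 W system: ${old_system:.2f}/yr")
print(f"Increase:     ${(new_system - old_system) / 12:.2f}/mo")  # ~$13/mo
```

The increase comes out around a dollar higher than the rough $12/month quoted above, but the ballpark holds.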


The way inflation is increasing, by the time a 900 W card is bought, electricity will likely cost much more. The Ferrari owner will be paying $8 a gallon (by the time this card comes out), but when he bought the car it was $3; $50 might mean something to some people.
Yes, $50 will mean something to some people. But if $50 means a lot to you, then you are likely not buying $2,500 video cards in the first place.

A 900 W GPU, under your math, would be 3.6 kWh per day, which is 61.2 cents per day, or $223.38 per year.

A 250 W GPU would be 1 kWh per day, which is 17 cents per day, or $62.05 per year.

If the $161 difference per year is a big issue, a $2,500 GPU is not for you; hence the Ferrari thing. The guys with the cash for something like this are NOT going to care about a $160-a-year difference. The guy who bought a several-hundred-thousand-dollar Ferrari doesn't care that gas went from $3 to $8, just like the guy dumping $2,500+ on a big GPU doesn't care if his electric bill goes up $100 a year.
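The comparison above can be reproduced the same way; the rate implied by "1 kWh ... 17 cents per day" is $0.17/kWh. Again a sketch only: 4 hours/day, flat rate, no peak pricing.

```python
# The 900 W vs. 250 W comparison above: 4 hours/day at an implied $0.17/kWh.
def yearly_cost(watts: float, hours_per_day: float = 4, rate: float = 0.17) -> float:
    return watts / 1000 * hours_per_day * rate * 365

big, small = yearly_cost(900), yearly_cost(250)
print(f"900 W GPU:  ${big:.2f}/yr")          # $223.38
print(f"250 W GPU:  ${small:.2f}/yr")        # $62.05
print(f"Difference: ${big - small:.2f}/yr")  # $161.33
```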
 
Thanks for doing the extended math. :rockout: It should be noted that that is per 4-hour block per day.

I expect some gamers play well over 25 hours over a weekend (Fri-Sun) raiding. I can attest to that, as I still occasionally play that long on weekends on some of the more intensive games.
If the $161 difference per year is a big issue, a $2,500 GPU is not for you,
Highly agree with this.
 
Nvidia trying to get themselves banned in Europe?
 
I mean... I did say I was expecting this, but I was actually just memeing...
 
How do you cool a 0.9 kW graphics card? Liquid nitrogen?
Liquid nitrogen could be risky for a GPU, as the PCB is much more exposed. I could probably design a wildly expensive and complicated water block with integrated heatpipes, but air cooling? "Look at my E-ATX tower for my GPU and my ITX on top for everything else!"

Edit: Autocorrect
 
Yeah, I don't think so.
 
I don't believe all these reports about 900 W, etc.
Regarding reference cards, we will probably get 220 W for the cut-down AD104, hopefully with close to 6900 XT performance at QHD, and 350 W for the near-full AD103 version at nearly 1.6× the cut-down AD104 at 4K.
I don't care about the AD102 options, too pricey; still, those reference cards will probably land between 450 and 600 W.
 
Design choice like?
The core design itself?

Yes IF is efficient as heck for a chiplet. This does not make it beat an otherwise theoretically identical monolithic core, but that's really an academic point when the cost savings are so high.

Any GPU above 500 watts is dangerous for power supplies.
No, it's dangerous for underrated power supplies. Same as ever. RTFM.

There is the EU energy efficiency labeling, though. It basically tells you which appliances are more efficient: the closer the rating is to A+++, the better.
That would be fine. Just don't blatantly outlaw the things. I'm all for consumer awareness.

How do you cool a 0.9 kW graphics card? Liquid nitrogen?
This is my main concern. As the owner of a GPU with a 450 W TDP, I can say this thing is going to roast your other components. No question.
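For a rough sense of what air cooling 900 W means, here is a back-of-the-envelope airflow estimate in Python. It is a sketch only: it assumes every watt ends up in the air stream, standard air density and specific heat, and that you tolerate a 15 °C intake-to-exhaust temperature rise.

```python
# How much airflow does it take to carry away 900 W of heat?
# Assumptions: all heat is dumped into the air stream, air density ~1.2 kg/m^3,
# specific heat ~1005 J/(kg*K), and a tolerated 15 K exhaust temperature rise.
P_WATTS = 900
RHO_AIR = 1.2      # kg/m^3
CP_AIR  = 1005.0   # J/(kg*K)
DELTA_T = 15.0     # K, intake-to-exhaust temperature rise

flow_m3_s = P_WATTS / (RHO_AIR * CP_AIR * DELTA_T)  # required volumetric flow
flow_cfm  = flow_m3_s * 2118.88                     # 1 m^3/s ~= 2118.88 CFM
print(f"~{flow_m3_s:.3f} m^3/s, roughly {flow_cfm:.0f} CFM through the cooler")
```

That works out to on the order of 100 CFM through the card's heatsink alone at a fairly generous ΔT, before the case fans even try to keep that exhaust away from the CPU, which is exactly the heat-soak worry above.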
 
Feels like a thing that happens over and over again when chipmakers are desperate to squeeze the last bit out of an architecture.
 
Nvidia is just going in the wrong direction. 900 watts!!! They might as well sell it with its own power supply unit plugged into the back of it, sort of like the old Voodoo 5 6000 had.

That's too much. It reminds me of the old GeForce FX 5800 Ultra; remember how that thing sounded like a vacuum cleaner turning on?

Nvidia really needs to start thinking again about lower power and better GPU performance per watt.
 
Design choice like?

Moving data through cores, caches, dies, or chiplets is arguably the biggest power hog these days, and that's where AMD excels with IF; this is especially evident in processors with more than 12-16 cores.
[Attached charts: 64-core package power and 64-core power vs. frequency]
This is actually a double-edged sword, though, because you always have that baseline IF power, which scales proportionally with the number of links. That's most of why 8-core EPYC chips are still rated for 180 W, and why AMD can't really compete in the crappy sub-10 W segment. If you look at the equivalent core vs. uncore power draw charts for Intel, the numbers are very different.
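To illustrate that point about a fixed fabric/uncore power floor, here is a toy model; every number in it is a made-up illustration, not measured EPYC or Intel data.

```python
# Toy model: why a fixed uncore/fabric power floor hurts low-core-count parts.
# All numbers below are illustrative assumptions, not real measurements.
UNCORE_WATTS   = 60.0   # assumed fixed IF/IO/uncore power for the package
WATTS_PER_CORE = 1.9    # assumed average per-core power under load

for cores in (8, 16, 32, 64):
    total = UNCORE_WATTS + cores * WATTS_PER_CORE
    share = UNCORE_WATTS / total
    print(f"{cores:2d} cores: {total:6.1f} W total, {share:5.1%} on uncore/fabric")
```

The fixed floor is a modest slice of the budget at 64 cores but dominates it at 8, which is the shape of the argument above about low-core-count ratings and the sub-10 W segment.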
 
Surely the title has a typo, you do mean 90 W, right?
 
No "G" in the codename means this isn't supposed to be a consumer product most likely, A100 for example was a completely different architecture from the Ampere used in consumer products. That doesn't mean I don't believe that they are capable of introducing cards with such ridiculous TGPs to the masses.

Ada uses AD for the same reason Turing used the TU nomenclature: the usual scheme would repeat an already existing internal processor name:

GT = Tesla
TU = Turing
GA = Ampere
AD = Ada

I think this is a meme rumor, though; at the very worst this is an early ES of a ~500 W SKU or an SXM2 specialty processor.
 

It could have been called GL - GeForce (Ada) Lovelace.
 
It could have been called GL - GeForce (Ada) Lovelace.

I think it's being called just Ada though, both internally and externally... GL is also maybe a bad pick because it could cause confusion with GL subvariants? (Quadro/RTX enterprise)

Either way it does not really matter I suppose, it's just an internal oddity/curiosity to account for :)
 
I think it's being called just Ada though, both internally and externally... GL is also maybe a bad pick because it could cause confusion with GL subvariants? (Quadro/RTX enterprise)

Either way it does not really matter I suppose, it's just an internal oddity/curiosity to account for :)

"Ad" from "Ada" literally means hell, and "Ada" means "the hell" in at least one European language. Very weird naming.
 
"Ad" from "Ada" literally means hell, and "Ada" means "the hell" in at least one European language. Very weird naming.
Ada King, Countess of Lovelace - aka Ada Lovelace - is widely regarded as having written the first computer program in the mid 1850's.
 
Ada King, Countess of Lovelace - aka Ada Lovelace - is widely regarded as having written the first computer program in the mid 1850's.

Probably a little earlier; if I remember correctly she passed at the age of 36 from uterine cancer in 1852. What a way to go... With today's medicine she would probably have lived to 100 and beyond.
 