
NVIDIA Allegedly Testing a 900 Watt TGP Ada Lovelace AD102 GPU

That picture is of an Nvidia H100 CNX DPU: https://www.nvidia.com/en-gb/data-center/h100cnx/
Why does that HPC GPU have on-board 200Gb and 400Gb Ethernet ports?

Never mind, the marketing material has an answer for that:

The H100 CNX alleviates this problem. With a dedicated path from the network to the GPU, it allows GPUDirect® RDMA to operate at near line speeds. The data transfer also occurs at PCIe Gen5 speeds regardless of host PCIe backplane. Scaling up GPU power in a host can be done in a balanced manner, since the ideal GPU-to-NIC ratio is achieved. A server can also be equipped with more acceleration power, because converged accelerators require fewer PCIe lanes and device slots than discrete cards.
 
What you said makes perfect sense, but why did AMD abandon HBM for their consumer cards?
Simple: their two HBM card lines, the Fury and Vega, lost to their respective Nvidia cards, the GDDR5-powered 980 Ti and the GDDR5X-powered 1080 Ti/1080 respectively. Neither card was powerful enough for HBM to help in any way.
And while the bandwidth on HBM is incredible relative to any GDDRx flavour, what is the latency like?
The latency is high, but GDDR memory tends to have high latency anyway. GPUs are typically more about pure bandwidth for gaming applications. IIRC HBM3 latency is comparable to GDDR6X.
 
@TheinsanegamerN
Do you think AMD will give HBM a shot again? They have GPUs competitive with Team Green now; maybe a modern GPU using HBM would push them over the top and give them the clear-cut lead in 4K-8K gaming/mining perf?

Maybe you could vent the 900 Watts of heat into a clothes drier or central furnace heating system or a water heater?
 
I certainly won't buy a 900W card, but I don't see how it's an issue for residential power. A single socket can deliver 3520W here; even with the whole computer and screens you will still have no issue. You may eventually need a higher subscription than the medium 6 kVA I have if you intend to cook and heat at the same time as you play, but that's pretty much it. I don't think that the kind of people who would be interested in this card would mind the price difference.
In Europe maybe, but in North America household sockets are on 120 V and most of the time have a maximum rating of 1800 W for a 15-amp circuit. Note that it's not every single socket that can deliver either 3500 W or 1800 W; it's each circuit. Depending on how your house is wired, you can have multiple things on the same circuit. It's not uncommon to have multiple household sockets on the same circuit. It's even worse for older houses.

So let's say you have a 900 W GPU, a 170 W+ CPU (I doubt you would run that card on a low-end CPU), plus 30 W for the rest of the system; you are at 1100 W. Power supplies aren't 100% efficient, so you can add another 100 W of loss on top of that. Then you have monitors: some can easily use more than 50 W each, and if you have more than one, you start to get closer to the circuit breaker's limit.
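To put rough numbers on that, here is a quick back-of-the-envelope sketch; the GPU/CPU wattages, the ~92% PSU efficiency and the two 50 W monitors are assumptions from this post, not measured figures:

```python
# Rough wall-power budget vs. a 15 A / 120 V North American circuit (assumed figures).
BREAKER_LIMIT_W = 15 * 120                    # 1800 W
CONTINUOUS_LIMIT_W = 0.8 * BREAKER_LIMIT_W    # 1440 W, the usual 80% rule for continuous loads

gpu_w, cpu_w, rest_w = 900, 170, 30           # rumoured TGP + high-end CPU + fans/drives/RAM
dc_load_w = gpu_w + cpu_w + rest_w            # 1100 W drawn from the PSU

psu_efficiency = 0.92                         # assumed; adds roughly 100 W of loss at the wall
wall_w = dc_load_w / psu_efficiency

monitors_w = 2 * 50                           # two ~50 W monitors on the same circuit
total_w = wall_w + monitors_w

print(f"PC at the wall: {wall_w:.0f} W")
print(f"PC + monitors:  {total_w:.0f} W of {BREAKER_LIMIT_W} W "
      f"(continuous-load limit ~{CONTINUOUS_LIMIT_W:.0f} W)")
```

That lands around 1300 W, so one more appliance sharing the circuit is enough to trip the breaker.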

And you haven't plugged anything else in yet. Normally, for kitchens, they design it so that each socket has its own circuit; if that trend continues, we might have to give home offices the same treatment.

But before we get there, we need to consider how much heat those kitchen appliances output. I mean, that PC would have more heat output than the heating system of the room I would put it in.

I wonder if Ferrari considered that consumers have to pay for gasoline. I could not justify paying for an additional 10 gallons of gas.

I wonder if pool companies considered that consumers have to pay for water. I could not justify paying for an additional 100 gallons of service.

Etc., etc. If the power draw of a GPU means your power bill will be a concern, you shouldn't be buying a likely $2500+ video card. Compared to a 400W GPU like the 3090 it's literally a $5-6 difference over an entire YEAR unless you are playing 8+ hours a day, in which case your streaming will more than make up the difference.
It all depends where you live.

Let's say you are not a hardcore gamer, but you still play 4 hours per day on average: a bit more on the weekend, a bit less during the week, etc. Something normal for someone spending that much on equipment. If you live in an area where electricity is around $0.20 per kWh, it would cost you around $146 per year more.

Where I live, it's still almost 60 bucks. I mean, it's not that much, but way more than a $5-6 difference.

And it depends: some areas have peak-hour rates that are much higher, and some have different tariffs based on consumption.
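If you want to redo the math with your own numbers, this is all the calculation is; the 500 W delta vs. a 3090-class card, the 4 hours/day, and the example rates are the assumptions from this post:

```python
# Extra yearly electricity cost of a ~900 W card vs. a ~400 W card (assumed figures).
extra_kw      = (900 - 400) / 1000               # 0.5 kW of additional draw while gaming
hours_per_day = 4
kwh_per_year  = extra_kw * hours_per_day * 365   # 730 kWh/year

for rate in (0.08, 0.20, 0.30):                  # $/kWh: cheap hydro, mid-range, expensive/peak
    print(f"${rate:.2f}/kWh -> about ${kwh_per_year * rate:.0f} extra per year")
```

That is roughly where the ~$60 and ~$146 figures above come from, depending on the rate.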

But still, my main concern would be the additional 500 W of heat that it would dump into my room. It would be the equivalent of having my other computer plus this one running at maximum power while I game.
 
This should be illegal.
I think using that much wattage to game is a bit mental, but it's worth remembering that 1 kW air heaters are also quite popular.

Actually not that dissimilar, besides it could play Crysis.
 
48 GiB of VRAM wouldn't be useful in any gaming context, even 8K, right?

Would there be any advantage for crypto-miners in having a 48 GiB VRAM footprint?
 
I wonder if Ferrari considered that consumers have to pay for gasoline. I could not justify paying for an additional 10 gallons of gas.

I wonder if pool companies considered that consumers have to pay for water. I could not justify paying for an additional 100 gallons of service.

Etc., etc. If the power draw of a GPU means your power bill will be a concern, you shouldn't be buying a likely $2500+ video card. Compared to a 400W GPU like the 3090 it's literally a $5-6 difference over an entire YEAR unless you are playing 8+ hours a day, in which case your streaming will more than make up the difference.
I think the math might be off here, going by the wiki example (because I have to use a source to prove my point):

An electric heater consuming 1000 watts (1 kilowatt), and operating for one hour uses one kilowatt-hour of energy.
So that's 1 kWh per hour × 4 hours, which costs 80 cents (someone else's figure); times 365, that's $292, or about $24 per month. Of course your current system usage is not figured in, so let's assume you use 450 W currently; that works out to roughly a $12 per month increase. Not a huge sum for 4 hours of gameplay. Yeah, I'd be OK with that, but I don't usually game for just 4 hours. Being retired means I have more time to game if I wanted to.

EDIT: I do know that after 17 kWh per day, I incur an additional rate on top of the base rate. Rates also change for peak hours.
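Sanity-checking those figures with the same assumed $0.20/kWh and 4 hours/day (the 450 W baseline is the assumption from my post above):

```python
# Sanity check of the $292/year, $24/month and ~$12/month figures (assumed $0.20/kWh).
rate, hours, days = 0.20, 4, 365

full_kw = 1.0                            # a ~1 kW system, like the heater example
print(f"Whole system:   ${full_kw * hours * days * rate:.0f}/year, "
      f"about ${full_kw * hours * days * rate / 12:.0f}/month")

baseline_kw = 0.45                       # what the current system is assumed to draw
delta_kw = full_kw - baseline_kw
print(f"Increase only:  about ${delta_kw * hours * days * rate / 12:.0f}/month")
```

The increase comes out nearer $13 than $12 by my rounding, but it's the same ballpark.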


The way inflation is increasing, by the time a 900-watt card is bought, electricity will likely cost much more. The Ferrari owner will be paying $8 a gallon by the time this card comes out, but when he bought the car it was $3; $50 might mean something to some people.
 
For some reason I read that as Over 9000!!!
 
FX5000 series and Fermi 4.0?
 
Anyone else got a pocket nuclear power pack on order? :roll:
 
There are plenty of played out jokes to be made about this. GTX 480 looks like an icebox now.

I'd just like to caution that testing a 900W card doesn't mean the final product will take 900W. Especially knowing what we know about this chip, that it uses a Samdung node and die size is near the reticle limit, you're way up the (presumably terrible) v/f curve there. I would expect the final product to slot in between 500-600W. Still unacceptable for most, but a lot more reasonable and practical. I personally don't have much issue with it as long as the performance justifies it and efficiency is still improved. Back in the day people who wanted the best ran 2 or 3 GPUs at 200-300W+ in CFX/SLI. Now that's dead and you get your half-kilowatt, $1500 monsters instead, I have more of a problem with the price honestly.

I certainly won't buy a 900W card, but I don't see how it's an issue for residential power. A single socket can deliver 3520W here; even with the whole computer and screens you will still have no issue. You may eventually need a higher subscription than the medium 6 kVA I have if you intend to cook and heat at the same time as you play, but that's pretty much it. I don't think that the kind of people who would be interested in this card would mind the price difference.
Many people in the US would probably encounter issues. I get the impression most homes/buildings haven't had electrical work done since the 1970s. Standard outlets here deliver a maximum of 1800W, which is enough, but definitely cutting it close for a full system with such a card. But if you ran the PC and another appliance on the same fuse, it'd blow nonetheless... it's not just the individual outlet that needs to be considered.
 
Granted, this is all rumour, but I do generally resent that such a product would demand so much from a system. Essentially, a company making such a product is requiring further sums to be spent on cooling and powering the thing. It better be worth it!
 
It's the drawback of going with one big monolithic chip design. AMD, on the other hand, opts for MCM-based GPUs: multiple smaller GPU dies combined as one.
Monolithic would technically be more efficient power-wise than a chiplet design of equal performance. The thing a chiplet design helps with is cost and yields, not power consumption...

Anyway, as the owner of an RTX 3090 Ti, I seriously question how this will even work in a PC case...
 
I think using that much wattage to game is a bit mental, but it's worth remembering that 1 kW air heaters are also quite popular.

Actually not that dissimilar, besides it could play Crysis.
A bit?

If you need 4 slots for 450 watts, what kind of size would a 900-watt card be? Could any shroud handle that?
 
The solution is simple... only game in winter, and use an open-air chassis in your ductwork. I read someone did that with cryptomining (Bitcoin, hold the pitchforks) and netted a profit from the heating it displaced, even though the cost of electricity was higher than the Bitcoin earned.

Seriously though, to cool the thing must take a custom loop and an outdoor chiller.
 
After reading replies here I'd be more interested in seeing how they're planning on cooling that monstrosity than the actual hardware itself (which will be priced wwwwwwwwwwwaaaaaaaaayyyyyyyyyyyyyyyy out of my price range anyway).

Could they use a thermosiphon to cool it? Or maybe some new NASA tech like the oscillating heatpipe?
 
Are we sure that's enough Watts? At this point, might as well round it up to a 1000.
 
Monolithic would technically be more efficient power-wise than a chiplet design of equal performance. The thing a chiplet design helps with is cost and yields, not power consumption
Not necessarily, it's mostly about clocks & inter-chip communication ~ this is why Zen 3 is still more efficient than ADL in a lot of tasks. Yes, if 12th-gen chips are clocked lower they do match or beat AMD, but that's with DDR5 & a slight node advantage; Zen 4 would likely swing that firmly to AMD's side once it releases. This of course doesn't apply to GPUs, because we have no working MCM models in the mainstream as yet.
 
Legislation must be made to ban client GPUs that go over 500 watts... Nvidia is panicking, they know RDNA 3 will crush Lovelace.
 
Legislation must be made to ban client GPUs that go over 500 watts... Nvidia is panicking, they know RDNA 3 will crush Lovelace.

Yes.

Certain prebuilt Alienware gaming PCs can no longer legally be sold in half a dozen US states due to recently passed power consumption laws.

As reported first by The Register (spotted by Vice), some of Dell's Alienware Aurora R10 and Aurora R12 gaming PCs are no longer available for sale in California, Colorado, Hawaii, Oregon, Vermont, or Washington. Heading over to Dell's website and looking to purchase certain configurations will display a warning message to buyers, indicating that it will not be shipped to those provinces due to power consumption regulations that have been adopted in those states. Dell notes that any orders that are slated to ship to those states will now be canceled.
US States Ban Certain Alienware PC Sales Because They Use Too Much Power - IGN
 
Legislation must be made to ban client GPUs that go over 500 watts
As much as I hate the idea of greater-than-500 W GPUs, this will never happen, and honestly shouldn't. It's not the state's place to tell end users what electronic wattages are acceptable; otherwise we'd end up banning all sorts of kitchen appliances.

this is why Zen 3 is still more efficient than ADL in a lot of tasks
No, that's completely down to the design choices of the cores, not a chiplet vs monolithic problem.
 
That ban is based on compute efficiency though, not wattage. And honestly, it's a really dumb piece of legislation in its own right. It uses a lot of antiquated metrics for its required "per-watt" efficiency requirement (or at least Washington's does).


Won't they just drive out of state to buy it? Transporting it back across state lines in the car is no problem, is it?
The idea is to pressure manufacturers to get more efficient so they can use one design everywhere. But as I said, that ban is not rooted in raw wattage.
 
Nobody is panicking or crushing anything just yet.
You can glean this from the rumour that the 4080 is based on AD103, as opposed to the 3080, which was carved out of GA102 at the last minute to be able to compete with the 6800 XT at 4K; a full GA103 would have been 5-10% slower, so it got canceled. And presumably the 4070 would contain exactly 7680 cores, as the 3080 was originally supposed to, just like the 3070 and the 2080 are the same 2944 cores, doubled to 5888 in Ampere's counting.

So Nvidia must be pretty confident that a narrow 192-bit 7800 XT is an AD103 counterpart. For all we know, AMD may be counting GPU cores differently now, so a 7680-core part is actually shown as 15360 and performs like 10240.

And the beefy GPU is being developed just for the hell of it, just because they can. The sky is the limit.
 
No, that's completely down to the design choices of the cores, not a chiplet vs monolithic problem.
Design choices like what?

Moving data through cores, caches, dies or chiplets is arguably the biggest power hog these days, and that's where AMD excels with IF (Infinity Fabric); this is especially evident in processors with over 12-16 cores.
[Charts: 64-core package power comparison and 64-core power vs. frequency]
 