Monday, June 20th 2022

NVIDIA RTX 40 Series Could Reach 800 Watts on Desktop, 175 Watts for Mobile/Laptop

Rumors about NVIDIA's upcoming Ada Lovelace graphics cards keep appearing, and with every new update the total power consumption seems to climb higher. Today we are getting information about different SKUs, covering both mobile and desktop variants. According to the well-known leaker kopite7kimi, we now have the power limits of the upcoming GPUs. The RTX 40 series will launch with a few initial SKUs: AD102, AD103, AD104, and AD106. Every SKU except the top AD102 will also be available in a mobile variant. The first in line, AD102, is the most power-hungry SKU, with a maximum power limit of 800 Watts. It will require multiple power connectors and a very beefy cooling solution to keep it running.

Going down the stack, the AD103 SKU is limited to 450 Watts on desktop and 175 Watts on mobile. The AD104 chip is limited to 400 Watts on desktop, while its mobile version is also capped at 175 Watts. Finally, the AD106 SKU is limited to 260 Watts on desktop and 140 Watts on mobile.
Distinguishing between a power limit and TGP is essential. While Total Graphics Power is what NVIDIA rates its GPUs to run at, the power limit represents the maximum amount of power that board partners and overclocking attempts can apply. It does not necessarily translate into TGP, so the final TGP values should be significantly lower.
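As a rough illustration of that distinction, here is a minimal sketch that treats the leaked figures as hard power limits and plugs in placeholder TGP values; the TGP numbers are purely illustrative assumptions, not part of the leak:

```python
# Illustration only: leaked power limits vs. hypothetical TGP values.
# The TGP figures below are placeholders, NOT leaked or confirmed numbers.
leaked_power_limit_w = {"AD102": 800, "AD103": 450, "AD104": 400, "AD106": 260}
assumed_tgp_w = {"AD102": 450, "AD103": 320, "AD104": 285, "AD106": 200}  # placeholders

for sku, limit in leaked_power_limit_w.items():
    tgp = assumed_tgp_w[sku]
    headroom = limit - tgp  # room left for board partners and overclocking attempts
    print(f"{sku}: power limit {limit} W, assumed TGP {tgp} W, headroom {headroom} W")
```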
Sources: @kopite7kimi (Twitter), via VideoCardz

133 Comments on NVIDIA RTX 40 Series Could Reach 800 Watts on Desktop, 175 Watts for Mobile/Laptop

#76
DBGT
AlwaysHopeGlad I'm picking up a new 1kw PSU tomorrow in preparation for upcoming high power draw dGPUs from either AMD or Nvidia... :)
Why not wait for the upcoming ATX 3.0?
#77
R-T-B
jesdalsHmmm, gen 4 riser cables and a mount outside the case? Perhaps that's the new 2022 look. An 800 watt toaster inside my Fractal Design Meshify C would be a challenge - wonder how high AMD's next gen is going to be?
You joke, but I am already doing this with my 3090 Ti.

I wish I were kidding. A 450 W TDP just needs it, even with my blower fans.

There is no sane way AD102 is going in a consumer case. If I had to guess, there will probably be no consumer AD102 GPUs.
#78
AlwaysHope
DBGTWhy not wait for the upcoming ATX 3.0?
We are speculating here until they are actually on the market, but I have no plans for anything above a 300 W card (2x 8-pin connectors, if I can help it) for my next GPU upgrade. Besides that, I'll bet they will produce adapters for non-ATX 3.0 compliant PSUs. Cards that need up to 600 W delivered over a single connector will be way outside my budget anyway.
Ideally, 50-60% load is the peak efficiency point for a PSU. So at the moment, if my OC rig is drawing around 500 W on average from the wall plug in a gaming session, a 1 kW PSU is ideal. And to complicate things further... lol... no one knows for sure what next-gen AMD or Intel CPUs will drink when overclocked either.
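A rough back-of-the-envelope version of that math, assuming ~90% PSU efficiency at that load (just a guess, not a measured figure):

```python
# PSU sizing sketch for the scenario above; the 90% efficiency figure is an assumption.
wall_draw_w = 500        # average draw at the wall plug while gaming
psu_efficiency = 0.90    # assumed efficiency at this load point
psu_rating_w = 1000      # candidate PSU capacity

dc_load_w = wall_draw_w * psu_efficiency   # power actually delivered to the components
load_fraction = dc_load_w / psu_rating_w   # fraction of the PSU's rated DC capacity

print(f"~{dc_load_w:.0f} W DC load -> {load_fraction:.0%} of a {psu_rating_w} W PSU")
# ~45% of rated capacity, close to the 50-60% sweet spot mentioned above
```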
#79
Gungar
AusWolfDoes this mean that even the 4060 will be out of my heat/noise target range?
Those rumors are probably wrong. It's a new node; they don't need to push the cards that hard to get performance gains. Last gen's GPU rumors were all wrong, btw.
#80
GhostRyder
I mean, I am not against more power draw to an extent as long as:
A: It can be cooled relatively easily
B: Its performance matches the power draw

Though I must say, if this is true and it gets anywhere near that, it's definitely going to be a water-cooled card for me. It's kinda funny, though; I remember the heat output of the triple and quad GPU setups I've done, but I never thought one single GPU would be like the 3x R9 290X setup I used to have.
#81
Blaeza
What about SLI as well? The top SKU still has that... :rolleyes:
#82
64K
Power up your gaming rig and cause a neighborhood brownout.
#83
medi01
I suspect the EU will step in and set limits at some point; this is getting out of hand.
#84
Blaeza
medi01I suspect the EU will step in and set limits at some point; this is getting out of hand.
Nooooo, fuck it! Soon you'll get a mini power station with your GPU!
#85
80251
It'll be like SLI but with a single card. Ugh. It'll be great during the winter though, not so much during the summer.

Instead of jacking up power requirements, why doesn't Nvidia look into using HBM over GDDR6X? I thought HBM used a lot less power for a lot more performance than any flavour of GDDRx?
#86
TheDeeGee
Minus InfinityOr you buy a 7700XT, don't need to do anything, and get much faster performance at the same power.

I'm abandoning Nvidia this gen.
I need Nvidia for whenever I play a retro game and need Sparse Grid Supersampling, D3D 4x4, 16xS and whatnot.

AMD doesn't offer that, and I don't want AMD in my PC ever again after the 5800X fiasco.
#87
Unregistered
People who buy the highest-end GPUs don't give a hoot anyway; they just convince people they do.

Just buy a 4090 and put 4060 in specs, problem solved
#88
R-T-B
TiggerPeople who buy the highest-end GPUs don't give a hoot anyway; they just convince people they do.
I do. I wouldn't have said I did before, but I am firmly convinced anything over 450W becomes a thermal dynamics problem for your standard ATX case.... Big time.
#89
ModEl4
Just for example, the cut-down AD104 reference card will probably have 16 Gbps GDDR6 at 220 W TBP; anything more would be suicidal from Nvidia at that price range.
Also, I really can't think of a single reason why a heavily cut-down part (184 TC) on TSMC N5 or N4, below 300 mm² (296?, if AD102 is 600 mm²), with plain GDDR6 would need more, since a near-full chip (the 3070) at close to 400 mm² (392) on the subpar Samsung 8 nm node (a 10 nm derivative) has a 220 W TBP reference card.
It just does not compute!
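A quick sketch of that comparison using the numbers above (the die areas are my estimates, and 400 W is the rumored desktop power limit, not a TBP):

```python
# Crude power-density comparison using the figures above: board power divided by
# GPU die area (ignores the memory/VRM share, so treat it as a rough proxy only).
cards = {
    "GA104 / RTX 3070, Samsung 8nm":    {"die_mm2": 392, "power_w": 220},  # reference TBP
    "cut-down AD104 at a sane 220 W":   {"die_mm2": 296, "power_w": 220},  # suggested above
    "AD104 at the rumored 400 W limit": {"die_mm2": 296, "power_w": 400},
}

for name, c in cards.items():
    density = c["power_w"] / c["die_mm2"]
    print(f"{name}: {density:.2f} W/mm² of die area")
```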
#90
ratirt
TiggerJust buy a 4090 and put 4060 in specs, problem solved
Buy a card for $2k and make it run like a $400 card :) That is a brilliant idea... if you don't have a brain, that is.
It does not solve anything. People pay for performance, but with those high TDPs all over, it will be hard to sustain that performance.
#91
80251
R-T-BI do. I wouldn't have said I did before, but I am firmly convinced anything over 450W becomes a thermal dynamics problem for your standard ATX case.... Big time.
But R-T-B, weren't there peeps who ran SLI/Crossfire rigs that drew more than 500 Watts just for the video cards alone? Even on air?
#92
Unregistered
ratirtBuy a card for $2k and make it run like a $400 card :) That is a brilliant idea... if you don't have a brain, that is.
It does not solve anything. People pay for performance, but with those high TDPs all over, it will be hard to sustain that performance.
In your system specs <-----
It's not about making a $2k card run like a $400 one at all. Guess you did not fully understand the comment. It was about having a high-power card but making people think you don't, as though you care about the "environment".
#93
ratirt
TiggerIn your system specs <-----
It's not about making a $2k card run like a $400 one at all. Guess you did not fully understand the comment. It was about having a high-power card but making people think you don't, as though you care about the "environment".
What about my system specs? I've got a 6900 XT, so...?
Anything below spec is unacceptable when you have to lower the performance to make it run cooler because your cooling can't keep up in your rig.
It does not matter what the reason is. Paying a lot and then lowering power consumption and performance (it does not matter by how much) just to make it work properly in your case is unacceptable. Odds are it will happen, considering the power required to run these things. Time will tell.
You don't need to buy this one and lower its spec. Just buy a lower model with lower power needs. Save money and the environment.
#94
R-T-B
80251But R-T-B, weren't there peeps who ran SLI/Crossfire rigs that drew more than 500 Watts just for the video cards alone? Even on air?
Yes. But the thermal density of two cards is far lower than one.
#95
80251
R-T-BYes. But the thermal density of two cards is far lower than one.
And the blower-style cards exhausted most of the heat outside of the case as well, right? Are blower-style cards obsolete for the high end now due to the high TBP?
#96
R-T-B
80251And the blower-style cards exhausted most of the heat outside of the case as well, right? Are blower-style cards obsolete for the high end now due to the high TBP?
I would not be surprised if that is the case.
#97
Unregistered
ratirtWhat about my system specs? I've got a 6900 XT, so...?
Anything below spec is unacceptable when you have to lower the performance to make it run cooler because your cooling can't keep up in your rig.
It does not matter what the reason is. Paying a lot and then lowering power consumption and performance (it does not matter by how much) just to make it work properly in your case is unacceptable. Odds are it will happen, considering the power required to run these things. Time will tell.
You don't need to buy this one and lower its spec. Just buy a lower model with lower power needs. Save money and the environment.
Buy a 3090 Ti, do not modify it in any way, and run it balls-out at full power. In your system specs put a 3060, so if people look at your specs they think you have a lower-power card, while in the meantime you are putting the V up to the people whining about high power users. You are not lowering the power consumption, just fooling other forum users into thinking you care by having a lower-power card.
#98
ratirt
TiggerBuy a 3090 Ti, do not modify it in any way, and run it balls-out at full power. In your system specs put a 3060, so if people look at your specs they think you have a lower-power card, while in the meantime you are putting the V up to the people whining about high power users. You are not lowering the power consumption, just fooling other forum users into thinking you care by having a lower-power card.
Well it is about power consumption though. You are missing the point.
#99
Unregistered
ratirtWell it is about power consumption though. You are missing the point.
I know it is about power consumption, you are missing my point it seems. It is not about modifying a graphics card for lower power, it is about making yourself look better by seemingly having a lower power card when you have not. Do you not get that?

Makes no difference, chances are the people in the forum who have a 3090/ti will buy a 4090/ti and use whatever means to justify it while making out they care about high power use.
#100
AusWolf
TiggerI know it is about power consumption, you are missing my point it seems. It is not about modifying a graphics card for lower power, it is about making yourself look better by seemingly having a lower power card when you have not. Do you not get that?

Makes no difference, chances are the people in the forum who have a 3090/ti will buy a 4090/ti and use whatever means to justify it while making out they care about high power use.
I see the opposite as a bit more likely: teenagers bragging about their new 3090 when daddy won't even buy them a 3050.