
NVIDIA RTX 40 Series Could Reach 800 Watts on Desktop, 175 Watt for Mobile/Laptop

If the RTX 4060 really ends up at 320 W, I will simply downclock it so it only uses around 220 W, as that's my limit in terms of noise and heat.
But what about the 600 W+ cards? If you downclock those to under 300 W so they stay cool in most existing cases, you'd lose too much performance.

Most here are thinking about downclocking cards of 350 W and under, since that has been the usual upper limit. It won't work the way you think for insanely high-power cards.

Has anyone downclocked a 450 W 3090 Ti to 300 W or below? If so, how much performance was lost on this $2000 GPU?
 
lol, 800 W is insane considering we're supposedly going green, but it seems the green team completely misunderstood that. 5 hours of gaming at 800 W is 4 kWh a day, more than any other appliance in the house.
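For reference, the arithmetic behind that figure is simple. A minimal sketch, assuming the card really sustains its full 800 W limit while gaming (real games rarely do) and an illustrative electricity price of 0.30 per kWh; both numbers are assumptions:

```python
# Energy used per day by a GPU at a given sustained draw; illustrative only.
def daily_energy_kwh(watts: float, hours: float) -> float:
    """Kilowatt-hours consumed for a given draw and play time."""
    return watts * hours / 1000.0

gpu_kwh = daily_energy_kwh(800, 5)   # 800 W for 5 hours -> 4.0 kWh
price = 0.30                         # assumed price per kWh
print(f"{gpu_kwh:.1f} kWh/day, ~{gpu_kwh * price * 365:.0f} per year")
```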
 
I want to see what Gamers Nexus can get out of it wattage-wise. Can you imagine actually OC'ing the thing? My 1660 drew 128 watts yesterday, so this should pull at least five times as much.
 
Then what's the point? If they have to increase the TDP that much, it means there are zero efficiency improvements; if you limit it to 220 W, you might as well buy a 3060 Ti because it's going to perform the same.
Core count: even at lower clock speeds, the bigger chip would still come out ahead.
 
Actually, it's better to buy a higher-end GPU if you want to undervolt and use the card for a longer period. Lower-end GPUs need their full power limit to reach, or get close to, their full potential (higher clock speeds need more power), while higher-end GPUs are much more flexible: having many more cores running at lower speed/voltage is far more efficient. Igor from igor'sLAB did some testing of the 3090 Ti at 313 W (66% of the base 466 W) and 4K performance plunged to an abysmal ;) 90%, making it one of the most efficient GPUs of the current generation. And with execution unit counts going up massively next generation, this trend will become even more obvious.

I am no friend of high power consumption, and I would like to see companies limit power on their products right from the start, but until politics or the majority of the market demands lower power consumption, I will have to do it manually. I'd rather pay more for the product, use it 3-4 years, manually reduce the power limit to something I consider acceptable (max. 200 W +/- 10%), and keep my power bill down while performance still goes up quite substantially, coming from a 175 W RTX 2070.

The question is: are you willing/able to pay a higher price for the GPU to save money in the long term (power bill)? The alternative is using your current GPU longer, which is also an option thanks to FSR/DLSS etc.

True, a 3080 tuned down to 170 W will certainly still outperform a 3060 running at its native 170 W. You still get the benefit of more cores, a wider memory bus, and more VRAM.
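A quick perf-per-watt comparison shows why. The numbers below are illustrative, loosely based on the 3090 Ti figures quoted earlier in the thread (roughly 90% performance at 66% power); they are not measurements:

```python
# Perf-per-watt of a big GPU at a reduced power limit vs. the same GPU at stock.
def perf_per_watt(relative_perf: float, watts: float) -> float:
    return relative_perf / watts

stock   = perf_per_watt(1.00, 466)  # big die at full board power
limited = perf_per_watt(0.90, 313)  # same die, ~66% power limit, ~90% perf
print(f"{limited / stock:.2f}x perf/W at the lower limit")  # ~1.34x
```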
 
Pretty sure this means the 4090 (Ti) will see short 800 W spikes lasting a few milliseconds, while average power will be up to 600 W as reported previously, with some cards having a "quiet" BIOS setting that brings that down to 450 W. They can't simply double power use in one generation; they have to give component manufacturers time to catch up first.
 
TSMC's N5+ process will be a great improvement over Samsung's 10nm-class 8N process used for the RTX 30 generation.
Why does NVIDIA see the need to push power through the roof? Maybe they know something about AMD that scares them?
 
Instead of buying a 320 W graphics card and undervolting it to 220 W, losing 30-40% performance in the process, better to buy a lower performance tier card, save some cash, and call it a day.

Instead of an RTX 4060 at 320 W undervolted to 220 W for $X, better an RTX 4050 at 200 W undervolted to 180 W for $(X-Y).
There is so much misleading stuff in your comments, man. When undervolting, your main focus is to be more efficient with your hardware, and losing performance with lower power is not linear. For example, I have a 3080 Ti at 0.75 V on the core with overclocked memory; my consumption dropped from 360 W to 250 W while performance dropped 1-5% at most, and in some titles it's even better than at default. You're repeating stuff from the internet without any experience with these things.
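The nonlinearity being described has a textbook explanation: dynamic CMOS power scales roughly as frequency times voltage squared, so dropping voltage saves power much faster than it costs clocks. A rough sketch of core switching power only; the example ratios are made up, and board power also includes memory and static draw, so real-world savings are smaller than the model suggests:

```python
# Rough dynamic-power model: P ~ f * V^2 (core switching power only).
def relative_power(f_ratio: float, v_ratio: float) -> float:
    """Power relative to stock for given clock and voltage ratios."""
    return f_ratio * v_ratio ** 2

# Example: give up ~5% clock, drop to 87% of stock voltage.
print(f"{relative_power(0.95, 0.87):.3f}")  # ~0.719 -> ~28% core power saved
```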
 

Your example is misleading because you are assuming that will be true for the new cards. What if it is not true?
 
Okay, now we know this sort of news is just clickbait material, scoring well on ad revenue.

We've read this how many times now? 800 W, 650 W... who the f. cares what the slot and 12-pin connector are capable of? Come on, TPU.

Here's the real source:
A truth. The power limits:
AD102, 800W;
AD103 (DT), 450W, AD103 (Mobile), 175W;
AD104 (DT), 400W, AD104 (Mobile), 175W;
AD106 (DT), 260W, AD106 (Mobile), 140W.
But I don't think we need to use the full power cap.
— kopite7kimi (@kopite7kimi) June 18, 2022

Also, explain to us the madness of having an AD103 at 450 W (which is already close to ridiculous, or way past it) and then nearly doubling the board power for the SKU one level up, which usually doesn't deliver more than 30% extra performance over the one below. That is quite literally the shittiest power curve I've ever seen on silicon, and yes, that includes Intel's latest.

:twitch:
 
Your example is misleading because you are assuming that will be true for the new cards. What if it is not true?
What are you talking about? It's been the same for 20 years, and the next gen will be the same too. It's just physics, not random mumbo jumbo. Next-gen NVIDIA will be the same story from an undervolting point of view because it uses GDDR6X.
 
The official RTX 3090 TGP is 350 W, while the max board power limit of the ASUS STRIX is 480 W, and that's not even the craziest 3090 design, so take that into account as well...

[Attached image: tdp-adjustment-limit.png]
 
What are you talking about? It's been the same for 20 years, and the next gen will be the same too. It's just physics, not random mumbo jumbo. Next-gen NVIDIA will be the same story from an undervolting point of view because it uses GDDR6X.

For 20 years? I have never seen or experienced anything like what you claim; any undervolting I tried resulted in instability and BSODs.
Maybe things have changed in the meantime, but it definitely wasn't like this before.

That means NVIDIA and AMD push TDPs unjustifiably high.

Stock 2050 MHz / 1200 mV:
110 fps, 185 W max, 160 W avg, 2950 rpm, GPU 74 °C, junction 93 °C
1900 MHz / 1000 mV:
110 fps, 149 W max, 130 W avg, 2700 rpm, GPU 70 °C, junction 81 °C
1800 MHz / 950 mV:
108 fps, 133 W max, 115 W avg, 2300 rpm, GPU 66 °C, junction 74 °C
1750 MHz / 910 mV:
106 fps, 134 W max, 110 W avg, 2300 rpm, GPU 66 °C, junction 74 °C
1700 MHz / 890 mV:
104 fps, 126 W max, 105 W avg, 2200 rpm, GPU 65 °C, junction 73 °C
Testing undervolting with my 5700XT | TechPowerUp Forums

Why did they need to push from 120-130 W up to 180-185 W for a mere 5% higher performance?

Something doesn't add up.
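Running fps-per-watt on the numbers posted above makes the point concrete; this is just arithmetic on the quoted frame rates and average draws:

```python
# fps per average watt for each 5700 XT setting listed above.
results = {
    "2050 MHz / 1200 mV (stock)": (110, 160),
    "1900 MHz / 1000 mV": (110, 130),
    "1800 MHz / 950 mV": (108, 115),
    "1750 MHz / 910 mV": (106, 110),
    "1700 MHz / 890 mV": (104, 105),
}
for name, (fps, avg_w) in results.items():
    print(f"{name}: {fps / avg_w:.2f} fps/W")
# Stock comes out worst (0.6875 fps/W); the deepest undervolt is near
# 0.99 fps/W while giving up only ~5% of the frame rate.
```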
 
No need for oil or natural gas this winter.
 
Why did they need to push from 120-130 W up to 180-185 W for a mere 5% higher performance?

Something doesn't add up.
Just because you got a good chip doesn't mean they're all like that. They publish specs that they know will run for sure on every chip that passes QC; if every GPU were individually tuned at the factory, you'd then complain about prices, not TDP.
 
Have any of you talking crap about downclocking/undervolting your GPUs actually tried it? :rolleyes:

I've spent probably 20 hours finding the best stable undervolt for my 3070 Ti. I'm at work, so I can't remember the exact values, but I run the card in the high 1800 MHz range at 0.87 V max GPU core voltage via MSI Afterburner's V/F curve. Benchmarking with Shadow of the Tomb Raider, Heaven, and various tests in the 3DMark suite, I lose 1-4% performance, but I also draw 60-90 watts less at the outlet (measured with a Kill-A-Watt meter, noting max and rough average draw per run). The card runs much cooler and quieter because the fans don't have to ramp up like they do at the stock 1.081 V.

I have to do this because my mini-ITX case is very small. With the case cover off, I can run at 2,000 MHz in the mid-0.9 V range and actually score a few percent higher in these benchmarks than at the stock factory setting. I also check the MSI Afterburner graph after each run to make sure the GPU voltage stays consistent and doesn't dip or climb (which it will if there's enough voltage headroom).

Save your best profile in any of MSI Afterburner's 1-5 preset slots, and check the graph every time you reboot your PC. Afterburner sometimes tweaks the curve slightly, causing it to rise a bit, so reapply the profile while watching the graph to make sure it's the exact V/F curve you initially saved.

I never have crashes or issues in my games, so it's 100% stable. WELL worth the few percentage points of performance.
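For anyone wondering whether those 60-90 W matter on the bill, a small sketch; the 3 hours/day of gaming and the 0.30-per-kWh price are my assumptions, not figures from the post:

```python
# Yearly cost saved by drawing fewer watts at the wall.
def yearly_savings(watts_saved: float, hours_per_day: float,
                   price_per_kwh: float = 0.30) -> float:
    return watts_saved * hours_per_day * 365 / 1000 * price_per_kwh

low, high = yearly_savings(60, 3), yearly_savings(90, 3)
print(f"~{low:.0f} to ~{high:.0f} per year")
```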
 
My thought is that the top-end card will come with two of those new 12-pin connectors, which would explain the 800 W number. That said, if you need a 1000 W PSU for a 12900K and 3090 Ti combo, does that mean a 4090 user may need a 1200 W+ PSU? The other option (some cases can do it) is two PSUs, but either way it's not going to be cheap to keep up with the top end of Team Green.
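A back-of-envelope PSU sizing sketch for that scenario; the CPU and "rest of system" wattages and the 30% headroom rule are assumptions for illustration, not vendor guidance:

```python
# Crude PSU sizing: combined component draw plus a headroom margin.
def recommended_psu_watts(gpu_w: float, cpu_w: float,
                          rest_w: float = 100, headroom: float = 0.30) -> float:
    return (gpu_w + cpu_w + rest_w) * (1 + headroom)

# Hypothetical 800 W flagship paired with a ~250 W CPU:
print(f"{recommended_psu_watts(800, 250):.0f} W")  # -> shop for ~1500 W
```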
 
If you decrease the wattage by 50%, the performance will also fall correspondingly. In which case the whole point of buying a brand new, shiny 40-series simply evaporates.
I have a 2080 Ti. Even when I play a not-so-demanding game at 60 fps, for some reason it still wants to use a lot of power. I set the power limit to 40% and it holds 60 fps with no problem and no change in graphics settings. This also keeps the fans at 20%, which is almost inaudible.

In Assassin's Creed Odyssey I also play at 60 fps and noticed the same thing. I set the slider to about 60% power and didn't notice any difference in how the game looks, but I did notice a big drop in power consumption and noise.

So I'm very happy to have the ability to alter power limits, and a card that can handle everything when I need it to.
 
My thought is that the top-end card will come with two of those new 12-pin connectors, which would explain the 800 W number. That said, if you need a 1000 W PSU for a 12900K and 3090 Ti combo, does that mean a 4090 user may need a 1200 W+ PSU? The other option (some cases can do it) is two PSUs, but either way it's not going to be cheap to keep up with the top end of Team Green.

It all still seems surreal to me. Let's assume the gaming consumption of that max-800 W card is around 500-550 W; considering how hard even the 3090 Ti is to cool, there will be nothing left but water cooling for the 4090 or 4090 Ti.
 
Hope the 4070 is decent and has reasonable power consumption/heat/TDP; these power figures are insane.
 
But what about the 600 W+ cards? If you downclock those to under 300 W so they stay cool in most existing cases, you'd lose too much performance.

Most here are thinking about downclocking cards of 350 W and under, since that has been the usual upper limit. It won't work the way you think for insanely high-power cards.

Has anyone downclocked a 450 W 3090 Ti to 300 W or below? If so, how much performance was lost on this $2000 GPU?
I've done it on a 470 W 3090. It currently runs at 280 to 320 watts. The difference is 9% compared to the stock 470 W.
 
It all still seems surreal to me. Let's assume the gaming consumption of that max-800 W card is around 500-550 W; considering how hard even the 3090 Ti is to cool, there will be nothing left but water cooling for the 4090 or 4090 Ti.
One of the things that struck me about the reference 30-series cards is that the entire card is basically a finned radiator with fans. I'd guess the high-end cards will all be 3.5-to-4-slot designs. I'm sure at least one vendor is talking to fan manufacturers about replicating the Noctua sub-variants, as it might be absolutely necessary for any OC headroom from an AIB's perspective. Water cooling isn't painful nowadays either, and there are ways to get into it without a build you have to completely disassemble to make changes. For those of us already on water cooling with Alphacool Quick Connect fittings, it's just a matter of ordering the block, the block with Quick Connect tubing, or the block with tubing and radiator. For up to 200 euros it's a pretty painless way to get that hardware appreciation smile.
 

So, it's confirmed then.
Thanks.
 