
NVIDIA GeForce RTX 4090 with Nearly Half its Power-limit and Undervolting Loses Just 8% Performance

Yawn.

Look, 4090 was a necessary beast. Being the top dog serves so many purposes from brand image to corporate valuation.

Problem is, low-power designs get reserved for mobile solutions, whereas desktop is left with power hogs. It's been too many years since we've seen a good sub-75W, connector-less, low-profile card. Both NVIDIA and AMD... Such a shame, when they are very much able to make one.
Yep... there is so much low-hanging fruit left...

Back in the 28nm era there was this lingering promise that things could keep getting smaller and more efficient alongside overall progress in performance. Honestly... I still value a quieter rig over one that gets 10 FPS more and/or can push a game slider to very high... and we shouldn't have to buy a higher-end card and undervolt it massively to achieve that.

At some point we got cards released both ways: not only higher-clocked AIB products, but also less performant, smaller form factor variants. Nobody ever complained about that segment, but somehow it vanished. Hope we see a return to it; Ada and its 3-slot behemoths definitely open up a niche.
 
I'm surprised this is news. This data was available on launch day; people didn't seem to care then.


[Image: power-scaling chart]


For 10% performance loss you cut its PL from 450W to 300W.
 
According to their graphs:

PL 60% = 268 W
268 / 347 = 0.77

Undervolting:
232 / 347 = 0.67

So the "PL 60%" label is super deceiving when the card actually consumes 77% of the default power consumption.
60% of 347 W should be ~210 W; the measured 268 W is 28% more than that.
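The ratios above can be checked with a quick sketch, using only the numbers quoted in this thread (347 W measured stock average, 268 W at "PL 60%", 232 W undervolted):

```python
stock_avg = 347      # measured stock average power draw in the test (W)
measured_pl60 = 268  # draw observed at the "PL 60%" setting (W)
measured_uv = 232    # draw observed when undervolted (W)

# Fraction of the measured stock average, per setting
print(round(measured_pl60 / stock_avg, 2))  # 0.77
print(round(measured_uv / stock_avg, 2))    # 0.67

# What 60% of the measured average would actually be
print(round(0.60 * stock_avg))              # 208, i.e. the "~210 W" figure
```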

No, the base case is not fully utilizing the GPU.
This is 60% of the 450 W power limit, which is exactly 270 W.

This test is flawed, since the GPU was not properly loaded in the first place. Undervolting is still quite beneficial, but no more so than it has always been on other GPUs.
 
Yep... I'm at full performance at 325 W. To be fair, though, I have a 0.950 V undervolt and it does hit 340 W in CB2077.

Same here, but in New World (I'll have to replay CB2077 because I finished it two years ago at 4K 35 fps lol). Seeing around 260-340 W max with 0.95 V @ 2700 MHz and 62-63°C with fans on auto :D
 
The card basically ships in an inefficient overclocked mode, with Nvidia chasing every % in reviews. Nowadays the best thing to do is undervolt, not overclock.
 
I don't see any problems with it; actually, I applaud nVidia for doing so, as this is an enthusiast product.
What I disagree with is the price: this is not a Titan, plus it costs about twice as much as the 3090, making it mediocre performance- and feature-wise.
 
The 1.0 V setting is maybe even more comical since it achieves over 100% performance while still burning less power.
 
Just shows how far up its efficiency curve it's running, i.e. at the very top end. Plus, as others have said, fully load it up and see what's what. I would say max RTX settings and DLSS, but I doubt those would use all the resources either.
 
Undervolting cards yields better efficiency, more at 11.
I can undervolt my 6800 XT while overclocking it and get ~60 W lower power draw at ~6% higher performance than stock. This is nothing new.

The 1.0 V setting is maybe even more comical since it achieves over 100% performance while still burning less power.
Pretty standard. All cards come overvolted from the factory so that even the worst bins are capable of operating at the advertised boost clocks... but I'm sure you already know this :)
 
I said comical because it produces higher performance on less power. And apparently 1.0 V is the factory voltage. The difference is in how the automatic power management interacts, I guess.
 
The concept is obvious and well-known, but a lot of the specific data here is highly suspect. I've seen my 4090 reach 450W in a number of games, so I don't know how their average stock power draw is so low to begin with. I also lose much more than 8% performance at half power (more like 20%). Personally, I apply an 80% power limit combined with a small core offset. This keeps performance at stock while keeping to a 360W power limit.
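As a trivial sanity check on the 360 W figure (assuming the stock 450 W power limit the thread has been using):

```python
default_pl = 450   # stock RTX 4090 power limit (W)
limit_pct = 80     # power-limit slider setting (%)

# 80% of the 450 W default, i.e. the 360 W cap mentioned above
print(default_pl * limit_pct // 100)  # 360
```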

But none of this is new, so I'm a little confused why it's suddenly news here and at videocardz (and wherever else picked this up). Maybe because the numbers this time are so shocking? But they're bogus.
 
Personally, I apply an 80% power limit combined with a small core offset. This keeps performance at stock while keeping to a 360W power limit.
That's not what is special here. The way I read it, they've disabled the power limit entirely (as in unlimited) and just gone with a fixed voltage instead.

And then they've also compared with setting various power limits as well. Which highlights the lopsidedness.
 
No, the base case is not fully utilizing the GPU.
This is 60% of the 450 W power limit, which is exactly 270 W.

This test is flawed, since the GPU was not properly loaded in the first place. Undervolting is still quite beneficial, but no more so than it has always been on other GPUs.
PL 100% is 450W

Yes.
It was deceiving because they drew the 8% loss conclusion from a setting that actually consumed around 67-77% of the stock power, not 60%.
If they could find a workload that utilizes the full 450 W power limit and then compare it against the PL 60% setting,
I am pretty sure the loss would be more than 8%.
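To make the baseline complaint concrete: the same 268 W reading sits near the advertised 60% against a hypothetical fully loaded 450 W baseline, but at 77% against the partially loaded 347 W test average.

```python
rated_pl = 450       # full RTX 4090 power limit (W)
stock_avg = 347      # measured stock average in the test (W)
measured_pl60 = 268  # draw observed at the "PL 60%" setting (W)

# Same reading, two different baselines
print(round(measured_pl60 / rated_pl, 2))   # 0.6  vs. a fully loaded 450 W baseline
print(round(measured_pl60 / stock_avg, 2))  # 0.77 vs. the partially loaded average
```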
 
Yet another aggressively factory-overclocked card! The rest of the Ada lineup also draws slightly more power than Ampere, but the performance/watt increase should still be similar to the 4080's!
 
But none of this is new, so I'm a little confused why it's suddenly news here and at videocardz (and wherever else picked this up). Maybe because the numbers this time are so shocking? But they're bogus.
Could be just plain old marketing via pointless visibility ("look, this Nvidia GPU is using less power!!! omg!!!") - anything that gets people to think of Nvidia is mindshare.
AMD was doing the same a couple of weeks ago - any time Nvidia announced something, or reviews started piling up, AMD would announce a rehash of the same information they'd posted before (VRAM capacity, clock speeds, etc.) - all for the sake of getting people to think of their products via simple visibility.

And look, it's working - you're sitting here wondering what this article's point is, while others can't stop gushing over generic GPU stuff: less voltage = much less power draw at minimal performance loss, up until a certain point. WOW.
 
It's an average power draw over a standard five-game test, so of course it's not 100% load all the way. The fact that the results differ tells us 100% load is happening some of the time. That's what matters. That, and repeatability.
 
That's not what is special here. The way I read it, they've disabled the power limit entirely (as in unlimited) and just gone with a fixed voltage instead.

And then they've also compared with setting various power limits as well. Which highlights the lopsidedness.
This is also rather confusing, though, because the behavior I and many others have observed when undervolting is that it is actually less effective than using power limits this time around. With Ampere, you definitely wanted to undervolt. I have not seen a single other person replicate these results with a 4090, though. I've tried, and I've seen a few YouTubers try (such as der8auer and Optimum Tech), but everyone has come to the same conclusion: undervolting behaves oddly, and performance is reduced at the same clock and power draw compared to simple power limiting. Everyone except these guys, I guess.
 
A 350 W 4090 with a reasonably sized cooler would have made a lot of sense, a 4090U or something like that. To me, the actual card is just too much. Too big, too much power, etc. It's a great card, but man, it's huge.
 
The RTX 4090 was not properly loaded, which is clear when you check the default power consumption of only 347 W.
From W1zzard's review.
[Screenshot: power consumption chart from W1zzard's review]
That Korean test used 5 games; Cyberpunk 2077 was part of it, but RT and DLSS 3 were enabled, so such low power consumption is not surprising.
More interesting is the TPU review, where power consumption is only 76 W with V-Sync enabled in Cyberpunk without RT or DLSS, but the performance hit was ~55-60%, because the clock speed was only ~787 MHz.
 
Every design choice is simpler when you release your product after your competitor ;)
Nonsense, GPUs are designed years in advance. It seems like leather jacket man had the engineers design these 4000-series cards with clocks pushed past the point of any efficiency for miners, who would've bought cards no matter what, and well, rich gamers still bought 4090s anyway lol.
 
Nonsense, GPUs are designed years in advance. It seems like leather jacket man had the engineers design these 4000-series cards with clocks pushed past the point of any efficiency for miners, who would've bought cards no matter what, and well, rich gamers still bought 4090s anyway lol.

How can you know the actual characteristics of silicon before it has been produced? At best you would have estimates.

Nvidia probably estimated that the 4090 would easily use 450 W+, so they asked AIBs to build 450 W+ coolers, but AD102 turned out to be more efficient than Nvidia had thought.
 
How can you know the actual characteristics of silicon before it has been produced? At best you would have estimates.

Nvidia probably estimated that the 4090 would easily use 450 W+, so they asked AIBs to build 450 W+ coolers, but AD102 turned out to be more efficient than Nvidia had thought.
Well, you are right: making a card takes multiple years, and you don't steer it that much to meet competitors. In this case, they probably asked AIBs to make sure no cooler is loud. Nvidia doesn't want to be seen as the loud card. Being bulky could be seen by some as an advantage, and consuming a lot of power will be seen by most as a non-issue, as long as it's not loud.

The other thing to take into consideration is the safety margin. Maybe 20% of the cards would perform like that with less power, but not all. By using that amount of power, they get higher yields by making sure as many chips as possible run properly. Some cards may have even more headroom to undervolt; some might not.
 