
NVIDIA GeForce RTX 4090 with Nearly Half its Power-limit and Undervolting Loses Just 8% Performance

btarunr

Editor & Senior Moderator
The monstrous NVIDIA GeForce RTX 4090 "Ada" has the potential to be a very efficient high-end graphics card with a daily-use undervolt and its power limit nearly halved, finds an undervolting review by Korean tech publication Quasar Zone. The reviewer tested the RTX 4090 at a number of GPU core voltage settings and at lowered software-level power limits (down from the 450 W default).

It's important to note that 450 W is a fairly arbitrary number for the RTX 4090's power limit; the GPU rarely draws that much power in typical gaming workloads. Our own testing at stock settings puts its gaming power draw around the 340 W mark. Quasar Zone tested the RTX 4090 with a power limit as low as 60% (270 W). With the most aggressive power management they could muster (i.e., the 270 W power limit), the card was found to lose just around 8% of its performance at 4K UHD, averaged across five AAA games at maxed-out settings. The story is similar with undervolting the GPU down to 850 mV, from its 1 V stock. In both cases, the performance loss appears well contained, while providing a reduction in power draw (and, in turn, heat and noise).
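For anyone curious to repeat the power-limit side of this experiment, the software limit can be adjusted from the command line with nvidia-smi, which ships with the NVIDIA driver (setting it needs administrator/root rights). A minimal sketch, assuming the 450 W stock limit; the helper names are illustrative, not from the review, and the voltage-curve undervolt itself still needs a tool such as MSI Afterburner:

```python
# Sketch: apply Quasar Zone-style power limits via nvidia-smi.
# Requires administrator/root rights; nvidia-smi ships with the NVIDIA driver.
import subprocess

STOCK_LIMIT_W = 450  # RTX 4090 default board power limit

def set_power_limit(percent: int) -> None:
    """Cap board power at a percentage of the 450 W stock limit."""
    watts = round(STOCK_LIMIT_W * percent / 100)
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

def read_power() -> str:
    """Return current draw and active limit for a quick sanity check."""
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw,power.limit",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

if __name__ == "__main__":
    set_power_limit(60)   # the most aggressive setting tested: 270 W
    print(read_power())   # e.g. "268.34 W, 270.00 W"
```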



View at TechPowerUp Main Site | Source
 
That's quite a lot less power for very little performance loss.
 
Looks like Nvidia has maxed out the cheap approaches to gaining performance.
 
Quite impressive. Nvidia could have developed smaller, more efficient, and slightly less powerful graphics cards, but instead went for a monstrous three-slot, high-powered design. It's a shame they didn't opt for a leaner, more modern approach.
 
Yep... I'm at full performance at 325 W. To be fair, though, I have a 0.950 V undervolt and it does hit 340 W in CB2077.
 
Which raises another question: Do these cards' dimensions really need to be so big?
The morons working at Nvidia's R&D department have it in their minds to make these products as big & power-hungry as possible for some other weird arcane reason. Sure, higher MSRPs and profit, but that alone doesn't make much sense to me.
 
Der8auer already showed how the 4090 was pushed beyond optimum efficiency on launch day itself.

Which raises another question: Do these cards' dimensions really need to be so big?
The morons working at Nvidia's R&D department have it in their minds to make these products as big & power-hungry as possible for some other weird arcane reason. Sure, higher MSRPs and profit, but that alone doesn't make much sense to me.
My cousin works as a GPU architect at NVIDIA, and we'll be talking about this stupid decision when we meet in a few weeks' time.
 
Which raises another question: Do these cards' dimensions really need to be so big?
The morons working at Nvidia's R&D department have it in their minds to make these products as big & power-hungry as possible for some other weird arcane reason. Sure, higher MSRPs and profit, but that alone doesn't make much sense to me.
You have answered your own question. If these cards weren't pushed so far, they wouldn't require that much power; but since they are pushed that far and advertised as "that fast," they do.
 
No shit, Sherlock... Nothing new here.
Chasing the crown of "fastest GPU possible" has always had a huge impact on power consumption, especially on GPUs, all to claim the crown by a few percent over the competition.
 
Which raises another question: Do these cards' dimensions really need to be so big?
The morons working at Nvidia's R&D department have it in their minds to make these products as big & power-hungry as possible for some other weird arcane reason. Sure, higher MSRPs and profit, but that alone doesn't make much sense to me.

Every design choice is simpler when you release your product after your competitor ;)
 
Der8auer already showed how the 4090 was pushed beyond optimum efficiency on launch day itself.


My cousin works as a GPU architect at NVIDIA, and we'll be talking about this stupid decision when we meet in a few weeks' time.
It seems like the bean counters in the marketing department have the louder voice in the company if the architects find these specs just as absurd, while Nvidia couldn't care less.
 
Which raises another question: Do these cards' dimensions really need to be so big?
The morons working at Nvidia's R&D department have it in their minds to make these products as big & power-hungry as possible for some other weird arcane reason. Sure, higher MSRPs and profit, but that alone doesn't make much sense to me.

R&D department isn't at fault for the CEO's direction. I'm sure their hands were tied.
 
R&D department isn't at fault for the CEO's direction. I'm sure their hands were tied.

As much as I dislike Nvidia, the CEO's hands are often tied too, by the shareholders. Nvidia needed to show they had the most badass card around by a large margin, to prove dominance to the shareholders.
 
It's probably because not all the silicon can hit these clocks. I'm sure there are some doggo 4090s out there that need 400 W for 2700 MHz.
 
It's probably because not all the silicon can hit these clocks. I'm sure there are some doggo 4090s out there that need 400 W for 2700 MHz.

I just assumed bad silicon would have been sent down the line for the 4080 launch.
 
I just assumed bad silicon would have been sent down the line for the 4080 launch.

That's a different chip though; there's no harvested 4090, AFAIK. Ah wait, never mind, I was confusing it with the other 4080...
 
Their midrange and budget options are going to be very impressive efficiency-wise.
Hopefully AMD can deliver on their claims and compete with them.
 
Their midrange and budget options are going to be very impressive efficiency-wise.
Hopefully AMD can deliver on their claims and compete with them.

I think the chiplet-design GPU is going to surprise all of us. I expect the 7900 XTX to beat the 4080 in several games, just not all (with ray tracing off).
 
Which raises another question: Do these cards' dimensions really need to be so big?
Bigger is better! - a member of the marketing team :) Imagine if the 4090 were the size of a low-profile GTX 1650. But it's a 4090, and you're looking at your mega-build with a $600 motherboard and the biggest RGB RAM sticks you can get... how would that make you feel? Your sound card and your RAM are bigger than your video card? It's like buying a Ford with a 1.0 Eco engine... it looks like it's built from Lego bricks. :D
 
A backwards move from Nvidia. It would have been preferable on practically every front to release at the lower power level, with still-incredible performance. They'd probably have priced it lower too, though not by much, I imagine.

Instead they've adopted the CPU strategy - max out the silicon at all costs for max clocks and performance.
 
This isn't news. The wide GPUs have always been like this, whereas the small GPUs definitely need the MHz cranked up.
 
The power consumption may be down to silicon quality variability and Nvidia wanting to ensure as many cards as possible can hit rated frequencies.
 
According to their graphs:

PL 60% = 268 W → 268/347 = 0.77

Undervolting: 232/347 = 0.67

So the "PL 60%" setting is super deceiving when the card actually consumes 77% of the default power draw.
60% would be ~208 W, so the measured 268 W is about 29% more than advertised.
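To make the arithmetic explicit, here's a quick back-of-the-envelope check in Python (the 347 W stock draw and the 268 W / 232 W readings are the figures quoted above from Quasar Zone's graphs):

```python
# Sanity check of the figures quoted from Quasar Zone's graphs
baseline_w = 347  # measured stock gaming power draw
pl60_w = 268      # measured draw at the "60%" power limit (270 W cap)
uv_w = 232        # measured draw with the 850 mV undervolt

print(f"PL 60% draws {pl60_w / baseline_w:.0%} of stock")   # 77%
print(f"Undervolt draws {uv_w / baseline_w:.0%} of stock")  # 67%

expected_w = 0.60 * baseline_w  # ~208 W, if "60%" meant 60% of real draw
print(f"Expected ~{expected_w:.0f} W, measured {pl60_w} W "
      f"({pl60_w / expected_w - 1:+.0%})")  # about +29%
```

The catch, of course, is that the percentage slider references the 450 W board limit, not the ~347 W the card actually draws in games, which is where the gap comes from.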
 
Yawn.

Look, the 4090 was a necessary beast. Being the top dog serves so many purposes, from brand image to corporate valuation.

The problem is, low-power designs get reserved for mobile solutions, whereas the desktop is left with power hogs. It's been too many years since we've seen a good sub-75 W, connector-free, low-profile card from either NVIDIA or AMD... Such a shame, when they are very much able.
 