
NVIDIA GeForce RTX 4090 with Nearly Half its Power-limit and Undervolting Loses Just 8% Performance

Clearly the 4090 is exactly 450 watts when properly loaded.
477 watts with ray tracing on.
277 watts with RT on plus DLSS only because of one hell of a CPU bottleneck; it should be avoided as an example.
 
no no no, the proper way of testing the 4090's efficiency would be running games at a resolution of 800x600, with render scale set to lowest, and all the settings to minimum :^)
 
I'm surprised this is news. This data was available on launch day; people didn't seem to care then.


[Attached chart: Powerscaling-scaled.jpg]


For a 10% performance loss you can cut its PL from 450 W to 300 W.
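Rough arithmetic on that claim: 90% of the performance at two thirds of the power works out to roughly 35% better performance per watt. A quick back-of-the-envelope sketch in Python, using the approximate figures from the chart above rather than new measurements:

```python
# Back-of-the-envelope perf-per-watt comparison for the power-limit scaling quoted above.
# The 10% loss / 300 W figures come from the chart, not from new measurements.
stock = {"perf": 1.00, "power_w": 450}
limited = {"perf": 0.90, "power_w": 300}

ppw_stock = stock["perf"] / stock["power_w"]
ppw_limited = limited["perf"] / limited["power_w"]

print(f"Perf-per-watt gain at 300 W: {ppw_limited / ppw_stock - 1:.0%}")  # ~35%
```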

This is where you lost your RTX 4080 at 300 W.
Instead, you received a stupid RTX 4090 with an insane overclock and oversized cooling, rather than custom water-cooled cards as the higher-tier models offering up to 15% higher performance.

Smart engineering would have been a 300 W reference RTX 4090 and a 450 W, super-overclocked, water-cooled RTX 4090 Ti.
 
I don't know why any of this is news. This is the case for almost 100% of PC CPUs and GPUs on the market today, with the exception of low-power laptop chips. Most GPUs and CPUs can be undervolted by some percentage and still perform within 5-10% of their maximum, but not all of them can, so everything is binned to the lowest common denominator so that as many chips as possible fit the performance threshold they need to meet.
 
For a 10% performance loss you can cut its PL from 450 W to 300 W.

Should have used the dual BIOS, 300 W stock & 450 W OC. :shadedshu: They need to make it more user-friendly for the "average Joe" to run their hardware more economically & ecologically.
 
Anybody mention that this would also help keep them from burning up?

Been doing undervolting/power limiting for years now. Wouldn't run them any other way.

If I did purchase a 4080/4090 this would be a no-brainer for me.
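For anyone who wants to script a lower power limit instead of leaving an overclocking tool running, the power cap itself is exposed through NVML. A minimal sketch assuming the nvidia-ml-py (pynvml) bindings and a driver/OS that allows changing the limit (usually needs admin/root); the 300 W target is just an example, not a recommendation:

```python
# Minimal sketch: cap the GPU's power limit via NVML (nvidia-ml-py / pynvml).
# Changing the limit usually requires admin/root; all NVML power values are in milliwatts.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)           # first GPU in the system

min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
target_mw = 300_000                                   # example: 300 W cap
target_mw = max(min_mw, min(target_mw, max_mw))       # clamp to what the board allows

pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)
print(f"Power limit set to {target_mw / 1000:.0f} W "
      f"(allowed range {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")

pynvml.nvmlShutdown()
```

The limit typically does not survive a reboot, so people either reapply it from a startup task or just use the card's quieter/lower-power BIOS where one exists.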
 
The way things are going, GPUs are bound to reach 1 kW at some point, and undervolting might as well be the factory default.
 
Should have used the dual BIOS, 300 W stock & 450 W OC. :shadedshu: They need to make it more user-friendly for the "average Joe" to run their hardware more economically & ecologically.
This is actually how I'm running my GPUs; one of the BIOSes includes a green plan. I don't want to fire up a GPU OC tool every time I turn on my PC.
The RTX 4080 is sort of the same story, but with a slightly less dramatic drop. With it, you get your 90% performance at around 270 W.
In terms of perf-per-watt, the RTX 4090 is unsurprisingly much better here. It's easier to run more processing hardware at a lower power demand and get higher parallel compute performance. It's what we see on regular CPUs too.

For those who aren't fully updated - yes, you can edit values on Ada GPU BIOSes now
 
Edit with what? This is only a flash tool. How do I set -100 mV, for example? I'm not looking to drop more than 1% performance, which in this case means 17% lower power.
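As far as I know NVML doesn't expose a voltage offset, so a -100 mV style curve undervolt still needs a tool like MSI Afterburner; the closest thing you can script is capping the boost clock so the card stays on a lower, more efficient point of its voltage/frequency curve. A rough sketch with pynvml, where the clock values are arbitrary example numbers rather than tested settings:

```python
# Rough approximation of an undervolt by capping boost clocks via NVML.
# A true voltage-offset curve still needs a tool like MSI Afterburner;
# this just stops the card from boosting into its least efficient range.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

MIN_CLOCK_MHZ = 210    # example lower bound (roughly idle clock)
MAX_CLOCK_MHZ = 2520   # example cap, not a tested value
pynvml.nvmlDeviceSetGpuLockedClocks(gpu, MIN_CLOCK_MHZ, MAX_CLOCK_MHZ)
print(f"GPU clocks locked to {MIN_CLOCK_MHZ}-{MAX_CLOCK_MHZ} MHz")

# To undo the cap later:
# pynvml.nvmlDeviceResetGpuLockedClocks(gpu)
pynvml.nvmlShutdown()
```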
 
For those who aren't fully updated - yes, you can edit values on Ada GPU BIOSes now
Edit or just crossflash? Editing seems quite the lofty claim.
 
Der8auer already showed how the 4090 was pushed beyond its optimum efficiency point on launch day itself.


My cousin works as a GPU architect at Nvidia, and I will be talking to him about this stupid decision when we meet in a few weeks' time.
It really isn't that stupid, though I'd be curious to know what your cousin says. Shipping it out of the box at a higher voltage is basically Nvidia's engineers playing it safe. Not all GPUs could run at some of these lower voltages out of the box and be 100% stable, so Nvidia has to account for the worst GPU of the lot. Imagine the cluster fuck that would ensue if people got $1k+ GPUs that weren't stable. Giving some voltage headroom covers all cards and makes sure they are stable, especially for less savvy people who don't want to, or don't know how to, tweak their hardware. Undervolting/power limiting is then at the customer's discretion, as it should be really. Also, the voltage scaling is handled by Nvidia's boost algorithm, so they are purposefully driving voltages in excess for more than just maximum performance.

The topic of this article should be no surprise to literally anyone, based on Turing and Ampere GPUs. Both of those generations could undervolt like crazy with damn near no performance loss. Why would it be any different for the RTX 4000 series?
 
Der8auer already showed how the 4090 was pushed beyond its optimum efficiency point on launch day itself.


My cousin works as a GPU architect at Nvidia, and I will be talking to him about this stupid decision when we meet in a few weeks' time.
Stupid what??????

Are you sure your cousin works at Nvidia as a GPU architect?
Before they decided on the ideal power for the 4090, they had already tested it with well over a thousand attempts under non-ideal conditions to make sure this GPU runs as stably as it can with all its features active (full ray tracing, the heaviest rendering workloads, etc.).
And if your cousin truly works in Nvidia's GPU R&D department, then he knows that this "450 watts" was already the win-win wattage (rather than a 600 W peak) to make everybody happy ;)

Don't always trust kids or media outlets chasing attention with controversial news. What conditions did they test with? Cyberpunk with RTX on already hits more than 450 W, and the card needs to run stable under that workload; many rendering workloads actually need even more wattage than that.

You can tell this to your "virtual cousin" ;)
so he can tell the morning meeting that "every team member is a moron and stupid." LMAO!
 
I feel like 600 W is perfect for the 4090. I grabbed a 450 W version and it was hitting the power limit and crashing/dropping performance. My Asus TUF goes up to 490-510 W during heavy RT titles like Control or Cyberpunk, Hitman 3 especially, and I expect more of this further down the line as more AAA titles take advantage of UE5 Nanite and Lumen tech on top of RT. I'm using a 5800X and not seeing bottlenecking, but I play at 4K 120+ on an LG C1.
 
You could look at this the other way: at 25% over the power target (600 watts) you can get an 8% performance boost. And here I am running my 4090 Suprim Liquid below the recommended PSU, on my 750 W SFX Platinum unit, since launch. :toast::cool:
 
Most of the time my total system power consumption is 250-350 W. It has only ever gone up to 500 W running benchmarks.

No game has ever gone over 400 W, and that is the entire system load.

RTX 4090 power consumption has been blown out of proportion by team red and random weirdos. This card uses less power than my 3090!

I will say DLSS is still a hot mess.
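For anyone who wants to verify board power themselves rather than trusting an overlay, NVML also reports the instantaneous draw and the enforced limit. A small logging sketch with pynvml (the one-second interval is chosen arbitrarily):

```python
# Log the GPU's reported board power draw once per second (NVML values are in milliwatts).
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        draw_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000
        limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000
        print(f"{draw_w:6.1f} W / {limit_w:.0f} W limit")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```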
 