Thursday, November 8th 2018

NVIDIA Finally Fixes Multi-Monitor Power Consumption of Turing GeForce 20. Tested on RTX 2070, 2080 and 2080 Ti.

Today, NVIDIA released their GeForce 416.81 drivers, which, among other fixes, contain the following changelog entry: "[Turing GPU]: Multi-monitor idle power draw is very high. [2400161]". Back at launch in September, Turing was plagued by very high non-gaming power consumption, in both single-monitor and multi-monitor idle.

The company was quick to fix single-monitor power consumption, which we tested promptly. Unfortunately, at the time, multi-monitor power draw wasn't improved, and people were starting to worry that some unfixable hardware issue on Turing might prevent NVIDIA from ever bringing multi-monitor power draw down.

Using today's 416.81 drivers, I did a few quick power-consumption runs on all three existing GeForce 20 SKUs: the RTX 2070, RTX 2080 and RTX 2080 Ti.
As you can see, multi-monitor power consumption is finally back at normal levels. Good job, NVIDIA! While the power draw of Pascal cards is still a little bit lower, the differences are now negligible, and won't have any significant effect on power usage, power bill, temperatures, or the environment.

I also did a run measuring single-monitor idle power — no changes here, but none were needed either. In case you're wondering: since these changes are part of the driver, nothing special has to be done for custom boards from the various NVIDIA partners. No BIOS update is needed; just install the 416.81 drivers and multi-monitor power consumption will drop by more than half.

31 Comments on NVIDIA Finally Fixes Multi-Monitor Power Consumption of Turing GeForce 20. Tested on RTX 2070, 2080 and 2080 Ti.

#26
cucker tarlson
INSTG8R: "Mine says 7W at the moment ;)"
TPU charts might be outdated, and you seem to have only one monitor.
Posted on Reply
#27
INSTG8R
Vanguard Beta Tester
cucker tarlson: "tpu charts might be outdated. and you seem to have one monitor only."
Yes, and my Vega is a revision 2: it only has two 8-pins, and on the "Econo" BIOS it maxes out at 242W, not the insane 300+ most charts show. The "Hot" BIOS tops out at 275W.
Posted on Reply
#28
cucker tarlson
It also loses 10% performance, so there's that. If I set my 1080 Ti to a 70% power limit, it'd lose 10% too but could run fanless. Plus, I'm still not sure if you're referring to multi-monitor.
Posted on Reply
#29
INSTG8R
Vanguard Beta Tester
cucker tarlson: "it also loses 10% performance so there's that, if I set my 1080Ti to 70% power limit it'd lose 10% but could be run fanless. Plus I'm still not sure if you're referring to multi monitor."
I've benched both BIOSes; the difference is practically margin of error, so why run it hot for no reason? Power isn't Vega's issue for performance — it's temperatures that dictate the clocks.
Posted on Reply
#30
TRIPTEX_CAN
Wow, lots of toxic red-vs-green commentary here. Shame.

I didn't know GPU purchases came with a license to be a dick.
Posted on Reply
#31
John Naylor
qubit: "I do wonder why something simple like this is a problem in the first place. Surely power issues could have easily been ironed out before the product is released."
It's SOP... we see the same issues with every new generation... that's why they call it the "bleeding edge":

P67 pre-B3 Intel chipset
Asus Z87 pre-C.1 steppings (external devices wouldn't wake up from sleep)
EVGA 970 SC (HS missed GPU)
EVGA 1060-1080 SC & FTW (missing thermal tape)
MSI 9xx series (adhesive tape)
Every version of Windows (ever)
Every piece of hardware / software that has ever been patched

Simply put, technology advances so fast that by the time all the bugs are out, the next generation is already on the way. Beta testing is essentially done post-release.
Posted on Reply