
RTX 3080 high power usage with high resolution / high refresh multi-monitor setups

I recently received a Gigabyte 3080 Vision OC as a birthday present, replacing my Asus Dual RTX 2080. The card is very nice and I'm very happy with it, except for one thing: compared to the 2080, it uses a lot of power and generates a lot of heat when idle at the Windows desktop. As a result, the fan almost never stops, since it has to keep working to hold the temperature under 50 degrees Celsius.

According to the Gigabyte tuning/monitoring tools, the memory stays at full speed, 19,004 MHz (1188 MHz x 16), even when idle, and the idle power usage reads 21% (22% with the fan running). I'm not entirely sure, but assuming this 21-22% refers to its 350W TDP, the card wastes about 75W when idle, which seems a bit extreme.
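For reference, this is the back-of-the-envelope conversion I'm doing (assuming, without having confirmed it, that the tool's percentage is relative to a 350W limit):

Code:
# Rough conversion of the monitoring tool's power percentage into watts.
# Assumption on my part: the percentage is relative to a 350 W limit.
POWER_LIMIT_W = 350

for percent in (21, 22, 8, 5, 2):
    watts = POWER_LIMIT_W * percent / 100
    print(f"{percent:>2}% of {POWER_LIMIT_W} W ~= {watts:.0f} W")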

After noticing that the card's power usage actually drops to just a couple of percent when Windows turns the displays off, I did some testing this morning and discovered that the high power usage is caused by having 3 displays connected to my computer:

Display 1: 3440 x 1440 @ 100 Hz - DisplayPort
Display 2: 3840 x 2160 @ 60 Hz - DisplayPort
Display 3: 3840 x 2160 @ 60 Hz HDR - HDMI (OLED TV), it's usually turned off.

After disconnecting some of the displays, the memory speed dropped as low as 810 MHz (51 MHz x 16) when idle, and the power usage dropped to 2-5%, so probably somewhere around 12W, which is fine and similar to what I see in the 3080 reviews from TPU. This surprised me. I'm aware that video cards tend to use more power in multi-monitor setups, but I didn't expect it to use 6 times more power when idle, 75W, just because I have 3 displays connected, one of which is turned off.

The TPU reviews were also confusing in this regard. I didn't realize that the multi-monitor power usage and the memory clock in the Clock Profiles section only apply to the specific configuration used in the TPU multi-monitor test (2 monitors, 1920x1080 and 1280x1024), especially since all the numbers looked the same when not gaming, on all 3080 cards:

[screenshot: Clock Profiles comparison from the TPU 3080 reviews]


And, in my case, those clocks were higher when idle, with the memory actually running at full speed. Anyway, @W1zzard clarified this for me in the 6900 XT thread, here: https://www.techpowerup.com/forums/threads/amd-radeon-rx-6900-xt.275652/page-6#post-4410461

Since then, I did a lot of testing to determine exactly what about my display configuration causes the high power usage when idle. The conclusion is that high resolutions, high refresh rates, and having HDR enabled all contribute to it.
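To get a feel for why these factors matter, I put together a rough per-display pixel-throughput estimate. This is only my own heuristic for comparing configurations, not how the driver actually decides when it can downclock the memory, and the 10-bit figure for HDR is an assumption:

Code:
# Rough per-display pixel throughput: width * height * refresh * bits per pixel.
# Heuristic only; blanking intervals and the driver's real clock-switching
# logic are not modelled here.

def pixel_rate_gbps(width, height, refresh_hz, bits_per_channel=8):
    bits_per_pixel = 3 * bits_per_channel  # RGB
    return width * height * refresh_hz * bits_per_pixel / 1e9

modes = [
    ("3440 x 1440 @ 100 Hz",      3440, 1440, 100, 8),
    ("3440 x 1440 @  50 Hz",      3440, 1440,  50, 8),
    ("3840 x 2160 @  60 Hz",      3840, 2160,  60, 8),
    ("3840 x 2160 @  30 Hz",      3840, 2160,  30, 8),
    ("3840 x 2160 @  60 Hz, HDR", 3840, 2160,  60, 10),  # assuming 10-bit for HDR
]

for name, w, h, hz, bpc in modes:
    print(f"{name}: ~{pixel_rate_gbps(w, h, hz, bpc):.1f} Gbit/s of active pixel data")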

I was sure that the 3rd display, the TV, didn't contribute to this, since I keep it off. I was wrong. Even though the other displays flicker when I turn the TV on or off, the video card doesn't actually ignore it while it's off; in fact, it doesn't matter whether the TV is on or off, the power usage is the same unless I physically disconnect the HDMI cable.

Anyway, these are some of the test results:

Code:
GPU  Memory   Power 
MHz     MHz    Used  Display 1            Display 2           Display 3
225  19,004     21%  3440 x 1440 @ 100Hz  3840 x 2160 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
210     810      8%  3440 x 1440 @ 100Hz  3840 x 2160 @ 60Hz  3840 x 2160 @ 60Hz HDR=off
225  19,004     21%  3440 x 1440 @  95Hz  3840 x 2160 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @  90Hz  3840 x 2160 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @  85Hz  3840 x 2160 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @  80Hz  3840 x 2160 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @  60Hz  3840 x 2160 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
210   1,820      8%  3440 x 1440 @  50Hz  3840 x 2160 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
210   1,820      5%  3440 x 1440 @ 100Hz  3840 x 2160 @ 30Hz  3840 x 2160 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @ 100Hz  3840 x 2160 @ 60Hz  3440 x 1440 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @ 100Hz  3840 x 2160 @ 60Hz  2560 x 1600 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @ 100Hz  3840 x 2160 @ 60Hz  2560 x 1440 @ 60Hz HDR=on
210   1,820      8%  3440 x 1440 @ 100Hz  3840 x 2160 @ 60Hz  2048 x 1536 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @ 100Hz  3440 x 1440 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @ 100Hz  2560 x 1600 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @ 100Hz  2560 x 1440 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @ 100Hz  2048 x 1536 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @ 100Hz  1920 x 1440 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @ 100Hz  1920 x 1200 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @ 100Hz  1680 x 1050 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @ 100Hz  1600 x 1200 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @ 100Hz  1600 x 1024 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  19,004     21%  3440 x 1440 @ 100Hz  1600 x  900 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  18,504     21%  3440 x 1440 @ 100Hz  1440 x  900 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  10,004     18%  3440 x 1440 @ 100Hz  1366 x  768 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
225  10,004     18%  3440 x 1440 @ 100Hz  1280 x  768 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
210   1,820      8%  3440 x 1440 @ 100Hz  1176 x  664 @ 60Hz  3840 x 2160 @ 60Hz HDR=on
210     810      5%  3440 x 1440 @ 100Hz   800 x  600 @ 56Hz  3840 x 2160 @ 60Hz HDR=on
210     810      5%  3440 x 1440 @ 100Hz        disconnected  3840 x 2160 @ 60Hz HDR=on
210     810    2-5%  3440 x 1440 @ 100Hz        disconnected               disconnected
210     810    2-5%  3440 x 1440 @  50Hz        disconnected               disconnected
210     810    2-5%  2560 x 1440 @  50Hz        disconnected               disconnected
210     810    2-5%  1920 x 1440 @  50Hz        disconnected               disconnected
210     810    2-5%  1920 x 1080 @  50Hz        disconnected               disconnected
210     810    2-5%  1280 x  768 @  50Hz        disconnected               disconnected
210     810    2-5%   800 x  600 @  60Hz        disconnected               disconnected

The first thing I tried was disabling HDR on the TV. That reduced the card's power usage from 21% to 8%, less than half. The GPU and memory frequencies also started to match those from the TPU reviews.

After re-enabling HDR on the TV, I reduced the refresh rate on my primary display from 100 Hz to something lower, to see if it would help. It did. Reducing it to 50Hz had the same effect as disabling HDR on the TV: the power usage dropped to 8%.

I then set my primary display back to 100Hz, kept HDR enabled on the TV, and reduced the refresh rate on my secondary 4K monitor from 60Hz to 30Hz. This worked even better: power usage got as low as 5%.

I then experimented with reducing the resolutions on the TV and monitors, while keeping the maximum refresh rates and HDR enabled on the TV. This helped as well, so clearly resolution is also a factor, not just refresh rate and HDR.

Then I physically disconnected my secondary display and the TV, keeping just my primary display, and the power usage got even lower, oscillating between 2% and 5%.

Then I reduced the refresh rate and resolution on the remaining monitor, going as low as 800x600@60Hz. The power usage kept oscillating between 2% and 5%, but at the lower resolutions it sat at 2% more often. Whenever it indicated 2%, the GPU voltage and GPU frequency both read 0.

It seems that at a certain point, after reducing all the GPU and memory frequencies to the minimum, the card starts to fully power down the GPU intermittently, probably between two screen refreshes, to reduce power usage further. Only the memory remains powered, for obvious reasons, and when that happens the power usage is only 2%, which would probably be about 7W. This is how it looks in the monitoring tool when it's powering down the GPU periodically:

[screenshot: AORUS monitoring tool showing the GPU intermittently powering down at idle]
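If anyone wants to log the same thing without the Gigabyte tool, a quick sketch like this should do it (it just polls nvidia-smi, which needs to be on the PATH; when the GPU is power-gated some fields may read N/A):

Code:
# Poll nvidia-smi once per second and print power draw plus GPU/memory clocks,
# to catch the moments when the GPU powers down at idle. Stop with Ctrl+C.
import subprocess
import time

QUERY = "power.draw,clocks.gr,clocks.mem"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    power_w, gpu_mhz, mem_mhz = [field.strip() for field in out.split(",")]
    print(f"power={power_w} W  gpu={gpu_mhz} MHz  mem={mem_mhz} MHz")
    time.sleep(1)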


So, to conclude, I need to disconnect the HDMI cable from the TV when I don't need it, or at least disable HDR, to make the power usage reasonable.
 
I thought this was common(ish) knowledge? Clocks don't drop with high-Hz multi-monitor setups? I experience the same thing... I don't change a thing. :)

I run 2x 1440p monitors (one 165 Hz the other 75Hz).
 
Yes, it makes sense, but I hoped it would just keep the memory clock a bit higher when needed, not at the maximum frequency. Apparently they don't bother making the frequency adjustment that granular, probably because it would be more expensive, so past a certain point they just push the frequency to the maximum.

And I didn't have the same problem on the RTX 2080. When idle and with the TV off, but connected, it did reduce clocks, including the memory clock, if I remember correctly.

Hmm, I think I know what the difference might be. On the old card I didn't have HDMI ports, only DisplayPort, so I connected the TV with an HDMI-to-DisplayPort adapter, through which HDR only worked when I reduced the refresh rate to 30Hz. The new card has an HDMI port, so I connected it to the TV without the adapter, and that enabled HDR at 60Hz.

I just tested using the adapter again, instead of connecting the HDMI cable directly. HDR no longer got enabled, and the power usage dropped from 21% to 8%. So this is the explanation: HDR is very expensive.
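My guess as to why HDR didn't work at 60Hz through the adapter (assuming the adapter tops out at HDMI 2.0-class bandwidth, which I haven't verified): 4K60 at 10 bits per channel needs more video bandwidth than an HDMI 2.0 link can carry in full RGB. A rough estimate, ignoring blanking intervals:

Code:
# Rough check: does 4K60 RGB fit in the ~14.4 Gbit/s of usable video bandwidth
# of an HDMI 2.0 link at 8-bit vs 10-bit (HDR)? Assumption on my part that the
# adapter is HDMI 2.0 class; blanking intervals are ignored, so the real
# requirement is somewhat higher than these numbers.
HDMI20_USABLE_GBPS = 14.4

def data_rate_gbps(width, height, refresh_hz, bits_per_channel):
    return width * height * refresh_hz * 3 * bits_per_channel / 1e9

for bpc in (8, 10):
    rate = data_rate_gbps(3840, 2160, 60, bpc)
    verdict = "fits" if rate <= HDMI20_USABLE_GBPS else "does not fit"
    print(f"4K60 RGB {bpc}-bit: ~{rate:.1f} Gbit/s -> {verdict} in {HDMI20_USABLE_GBPS} Gbit/s")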
 