
What AMD Didn't Tell Us: 21.4.1 Drivers Improve Non-Gaming Power Consumption By Up To 72%

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,356 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
AMD's recently released Radeon Software Adrenalin 21.4.1 WHQL drivers lower non-gaming power consumption, our testing finds. AMD did not mention these reductions in the changelog of its new driver release. We did a round of testing, comparing the previous 21.3.2 drivers with 21.4.1 on Radeon RX 6000 series SKUs, namely the RX 6700 XT, RX 6800, RX 6800 XT, and RX 6900 XT. Our results show significant power-consumption improvements in certain non-gaming scenarios, such as system idle and media playback.

The Radeon RX 6700 XT shows no idle power-draw reduction, but the RX 6800, RX 6800 XT, and RX 6900 XT posted big drops in idle power consumption at 1440p, going down from 25 W to around 7 W (down by about 72%). There are no changes with multi-monitor. Media playback sees up to 30% lower power consumption for the RX 6800, RX 6800 XT, and RX 6900 XT. This is a huge improvement for builders of media PC systems, as it affects not just power, but heat and noise, too.





Why AMD didn't mention these huge improvements is anyone's guess, but a closer look at the numbers could drop some hints. Even with media-playback power draw dropping from roughly 50 W to 35 W, the RX 6800/6900 series chips still use more power than competing NVIDIA GeForce RTX 30-series SKUs; the RTX 3070 pulls 18 W, while the RTX 3080 draws 27 W, both lower. We tested the driver on the older-generation RX 5700 XT and saw no changes. The Radeon RX 6700 XT already had very decent power consumption in these states, so our theory is that AMD fixed certain power-consumption shortcomings of the Navi 22 GPU on the RX 6700 XT that were found after the RX 6800's release, and since those fixes turned out to be stable, they were backported to the Navi 21-based RX 6800/6900 series, too.
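For anyone who wants to sanity-check idle or playback power on their own card, below is a minimal software-only sketch that polls the GPU's own power telemetry. It assumes a Linux system with the amdgpu driver exposing a hwmon power1_average sensor (reported in microwatts; the exact sensor name can vary by kernel and board), and it reads chip telemetry rather than measuring the card the way our dedicated test equipment does, so treat the output as a trend indicator only.

```python
# Rough software-only sketch: sample the amdgpu hwmon power sensor while idle
# or during media playback, so before/after-driver numbers can be compared.
# Assumes Linux with the amdgpu driver; the sensor file (power1_average,
# in microwatts) can differ per kernel/board, so treat it as an example.
import glob
import statistics
import time

def find_power_sensor(card="card0"):
    """Locate a hwmon power sensor for the given DRM card, if one exists."""
    pattern = f"/sys/class/drm/{card}/device/hwmon/hwmon*/power1_average"
    matches = glob.glob(pattern)
    if not matches:
        raise FileNotFoundError("no amdgpu power sensor found for " + card)
    return matches[0]

def sample_power(sensor_path, seconds=60, interval=0.5):
    """Poll the sensor and return the average reported power in watts."""
    samples = []
    end = time.time() + seconds
    while time.time() < end:
        with open(sensor_path) as f:
            samples.append(int(f.read().strip()) / 1_000_000)  # µW -> W
        time.sleep(interval)
    return statistics.mean(samples)

if __name__ == "__main__":
    sensor = find_power_sensor()
    print(f"average reported power over 60 s: {sample_power(sensor):.1f} W")
```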

View at TechPowerUp Main Site
 
Joined
Mar 15, 2015
Messages
10 (0.00/day)
System Name R36XT RX6800
Processor Amd Ryzen 5 3600XT
Motherboard MAG X570 TOMAHAWK WIFI
Cooling Cooler Master MasterLiquid ML240L RGB V2 240mm AiO Liquid CPU Cooler
Memory Adata XPG Gammix D10 32GB (2x 16GB) @3200MHz DDR4
Video Card(s) AMD Radeon™ RX 6800
Storage ADATA SX8200PNP 1TB, m2 860 2 TB, SG Pro 1TB
Display(s) ASUS TUF Gaming VG27WQ1B 27" QHD VA Curved Monitor
Case Lian Li Lancool II Gaming Case - Black
Audio Device(s) Sound Blaster Omni Surround 5.1
Power Supply Gigabyte AORUS P750GM 750W Modular 80+ Gold PSU
Mouse SteelSeries Rival 310 eSports Mouse, Steelseries XAI Ruse Edition
Keyboard Razer Chroma Ornata
Software Win 10 Pro
How about fixing the 165 Hz refresh rate on monitors? The memory clock stays at full speed! Changing to 144 Hz fixes the problem!
 
Joined
Dec 5, 2020
Messages
159 (0.13/day)
Maybe they didn't brag about it because the numbers before the driver were absolutely terrible and they just fixed it. 25+ W at full idle means something isn't working properly, imo.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,901 (4.58/day)
Location
Kepler-186f
AMD does what Nvidia doesn't ~ 7nm nom nom bois!!!

How about fixing the 165 Hz refresh rate on monitors? The memory clock stays at full speed! Changing to 144 Hz fixes the problem!

did you report it through amd software? i did. and i know one other did. the more who report this issue the sooner it gets fixed. you can't complain unless you do the bug report tool in amd software.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
What changed to drop the power consumption? Lower clocks on the GPU/VRAM?
 
Joined
Feb 16, 2014
Messages
39 (0.01/day)
Finally, a driver that fixes my idle screen artifacts on my Ryzen 4750G.
Go AMD (normally you'd expect this to work at launch, but a year after, I'll take it!)
 
Joined
Apr 10, 2020
Messages
480 (0.33/day)
There are no changes with multi-monitor.

Why is this still happening? I use a 4-screen setup and measure no difference between 1 and 4 monitors plugged into my good old EVGA 1080 Ti (10 W), but my Gigabyte 5700 XT draws 50+ W at idle with the same 4-screen setup. I want to keep my power draw to a minimum when not gaming, so I can have my open-case system completely passively cooled at idle (I hate PC noise). While the 1080 Ti's fans never spin at idle with the case open, the 5700 XT's fans do turn on from time to time and are quite audible. AMD should have fixed this ever-persistent problem with their GPUs a long time ago. It shouldn't be all that hard to fix, given that Nvidia nailed it six years ago on a much bigger die.
 
Joined
Sep 4, 2019
Messages
133 (0.08/day)
Haven't updated yet, but hopefully the issue with memory clocks being unable to idle with a 144 Hz display has been resolved. Other than that, my Aorus Master version of the 6800 XT has been one of the most stable, non-issue GPUs I've ever owned. It crushes every game out there and does so quietly with low power consumption.
 

las

Joined
Nov 14, 2012
Messages
1,533 (0.37/day)
System Name Obsolete / Waiting for Zen 5 or Arrow Lake
Processor i9-9900K @ 5.2 GHz @ 1.35v / No AVX Offset
Motherboard AsRock Z390 Taichi
Cooling Custom Water
Memory 32GB G.Skill @ 4000/CL15
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 280 Hz + 77" QD-OLED @ 144 Hz VRR
Case Fractal Design Meshify C
Audio Device(s) Asus Essence STX / Upgraded Op-Amps
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
AMD does what Nvidia doesn't ~ 7nm nom nom bois!!!



did you report it through amd software? i did. and i know one other did. the more who report this issue the sooner it gets fixed. you can't complain unless you do the bug report tool in amd software.

Nvidia has had this stuff working for ages, plus proper 2D clocks at 144+ Hz (Nvidia fixed this years ago).

Simply look up power consumption in earlier reviews to verify; AMD has always used way too much power outside of 3D.

So I'm not sure what you are celebrating.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Nvidia has had this stuff working for ages, plus proper 2D clocks at 144+ Hz (Nvidia fixed this years ago).
Both Nvidia and AMD have fixed this several times, and not only for new generations; it does seem to break every now and then. Multi-monitor is worse.
 
Joined
May 31, 2016
Messages
4,324 (1.50/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 16GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
What changed to drop the power consumption? Lower clocks on the GPU/VRAM?
Maybe VRAM, since the GPU will drop to as low as 0 MHz when idle. So I guess the fixes are somewhere else, since the 0 MHz says it all. Maybe optimization of the power delivery or something.
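One way to see where the savings come from is to check which clock state the card actually sits in at idle. A minimal sketch, assuming Linux with the amdgpu driver (which lists the available DPM states in pp_dpm_sclk/pp_dpm_mclk and marks the active one with a trailing asterisk); on Windows, tools like GPU-Z show the same core and memory clocks.

```python
# Quick check of which GPU core and VRAM clock states the card is sitting in
# at idle. Assumes Linux with the amdgpu driver, which lists DPM states in
# pp_dpm_sclk / pp_dpm_mclk and marks the active one with a trailing '*'.
from pathlib import Path

DEVICE = Path("/sys/class/drm/card0/device")

def active_state(filename):
    """Return the line marked active ('*') from a pp_dpm_* file."""
    for line in (DEVICE / filename).read_text().splitlines():
        if line.strip().endswith("*"):
            return line.strip()
    return "unknown"

if __name__ == "__main__":
    print("core :", active_state("pp_dpm_sclk"))  # e.g. "0: 500Mhz *"
    print("vram :", active_state("pp_dpm_mclk"))  # ideally the lowest state at idle
```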
 
Joined
Sep 28, 2012
Messages
963 (0.23/day)
System Name Poor Man's PC
Processor AMD Ryzen 5 7500F
Motherboard MSI B650M Mortar WiFi
Cooling ID Cooling SE 206 XT
Memory 32GB GSkill Flare X5 DDR5 6000Mhz
Video Card(s) Sapphire Pulse RX 6800 XT
Storage XPG Gammix S70 Blade 2TB + 8 TB WD Ultrastar DC HC320
Display(s) Mi Gaming Curved 3440x1440 144Hz
Case Cougar MG120-G
Audio Device(s) MPow Air Wireless + Mi Soundbar
Power Supply Enermax Revolution DF 650W Gold
Mouse Logitech MX Anywhere 3
Keyboard Logitech Pro X + Kailh box heavy pale blue switch + Durock stabilizers
VR HMD Meta Quest 2
Benchmark Scores Who need bench when everything already fast?
TBH 20 W doesn't make any difference, still love 'em. Besides, I already applied an undervolt.
nVidia might be better because they know most of their users are sticking with power-hungry CPUs :rolleyes:
 
Joined
Jan 8, 2017
Messages
8,929 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
There are no changes with multi-monitor.

Why is this still happening? I use a 4-screen setup and measure no difference between 1 and 4 monitors plugged into my good old EVGA 1080 Ti (10 W), but my Gigabyte 5700 XT draws 50+ W at idle with the same 4-screen setup. I want to keep my power draw to a minimum when not gaming, so I can have my open-case system completely passively cooled at idle (I hate PC noise). While the 1080 Ti's fans never spin at idle with the case open, the 5700 XT's fans do turn on from time to time and are quite audible. AMD should have fixed this ever-persistent problem with their GPUs a long time ago. It shouldn't be all that hard to fix, given that Nvidia nailed it six years ago on a much bigger die.
You have to understand that this can't always be fixed. The cards ramp up clocks based on certain triggers and workloads; my 1080 ramps up memory clocks for no obvious reason all the time.
 
Joined
Apr 18, 2013
Messages
1,260 (0.31/day)
Location
Artem S. Tashkinov
AMD does what Nvidia doesn't ~ 7nm nom nom bois!!!



did you report it through amd software? i did. and i know one other did. the more who report this issue the sooner it gets fixed. you can't complain unless you do the bug report tool in amd software.

AMD normally addresses bugs only when they are inundated by bug reports, not just from users but from prominent tech figures, e.g. media outlets or YouTubers.

E.g. they removed gamma/proper color configuration from the Radeon drivers over five years ago. The topic has been raised numerous times. Have they done anything? F no.

Don't overestimate their eagerness to fix anything unless it's burning under them. Oh, that's funny, I'm actually ignoring you.

Oh, and NVIDIA and Intel are no different, unfortunately.
 
Joined
Nov 4, 2005
Messages
11,681 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs and over 10TB spinning
Display(s) 56" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
AMD normally addresses bugs only when they are inundated by bug reports, not just from users but from prominent tech figures, e.g. media outlets or YouTubers.

E.g. they removed gamma/proper color configuration from the Radeon drivers over five years ago. The topic has been raised numerous times. Have they done anything? F no.

Don't overestimate their eagerness to fix anything unless it's burning under them. Oh, that's funny, I'm actually ignoring you.

Oh, and NVIDIA and Intel are no different, unfortunately.
Exactly, where are the video acceleration controls? Now we get a couple of choices that feel like they were made for kindergartners.

I used to go into the registry and XML files to change it, but recently they get overwritten on reboot.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
You have to understand that this can't always be fixed. The cards ramp up clocks based on certain triggers and workloads; my 1080 ramps up memory clocks for no obvious reason all the time.
There probably is an obvious reason, just maybe not too straightforward to notice. As long as you have some software running that uses the GPU, it can and probably will regularly wake it up and run at a higher clock bin. If you happen to have monitoring software running or a browser open - browsers all use HW acceleration these days, including for video decoding - ramping up clocks is expected.

Why is this still happening? I use a 4-screen setup and measure no difference between 1 and 4 monitors plugged into my good old EVGA 1080 Ti (10 W), but my Gigabyte 5700 XT draws 50+ W at idle with the same 4-screen setup.
Multi-monitor is not a trivial use case. The display controller and memory need higher clocks when pushing more pixels to more monitors. This also means that the higher the resolution and refresh rate(s) of your monitor(s), the more complex it becomes. Plus, mismatched resolutions and refresh rates seem to be a bit of a problem of their own.

For example, in theory a 2160p monitor should need the same clocks as 4x 1080p monitors; in reality there is obviously some overhead to running more monitors.
I have two monitors, 1440p@165Hz and 1440p@60Hz. GPU drivers do not really seem to figure out nice low clocks for this combination. I have had the same monitor setup over the last 5-6 GPUs, and all of them have been running at increased clocks at one point or another. I'm currently using an Nvidia GPU, and the last fix for this was somewhere around the middle of last year, if I remember correctly.
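A quick back-of-the-envelope comparison of raw scan-out throughput illustrates the "one 2160p panel vs. four 1080p panels" point. The 4 bytes per pixel and the active-pixels-only math are simplifying assumptions (real timings include blanking), so this is only meant to show that the raw pixel rates are similar even when the driver treats the setups very differently.

```python
# Back-of-the-envelope comparison of raw scan-out throughput for a few monitor
# setups. The 4 bytes/pixel framebuffer format and active-pixels-only timing
# are assumptions for illustration; real scan-out also includes blanking.
BYTES_PER_PIXEL = 4

def scanout_gbps(monitors):
    """monitors: list of (width, height, refresh_hz); returns GB/s of scan-out."""
    pixels_per_s = sum(w * h * hz for w, h, hz in monitors)
    return pixels_per_s * BYTES_PER_PIXEL / 1e9

setups = {
    "1x 3840x2160 @ 60": [(3840, 2160, 60)],
    "4x 1920x1080 @ 60": [(1920, 1080, 60)] * 4,
    "4x 1920x1200 @ 60": [(1920, 1200, 60)] * 4,
    "1440p 165 + 1440p 60": [(2560, 1440, 165), (2560, 1440, 60)],
}

for name, monitors in setups.items():
    print(f"{name:>22}: {scanout_gbps(monitors):5.2f} GB/s scan-out")
```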
 
Joined
Sep 28, 2012
Messages
963 (0.23/day)
System Name Poor Man's PC
Processor AMD Ryzen 5 7500F
Motherboard MSI B650M Mortar WiFi
Cooling ID Cooling SE 206 XT
Memory 32GB GSkill Flare X5 DDR5 6000Mhz
Video Card(s) Sapphire Pulse RX 6800 XT
Storage XPG Gammix S70 Blade 2TB + 8 TB WD Ultrastar DC HC320
Display(s) Mi Gaming Curved 3440x1440 144Hz
Case Cougar MG120-G
Audio Device(s) MPow Air Wireless + Mi Soundbar
Power Supply Enermax Revolution DF 650W Gold
Mouse Logitech MX Anywhere 3
Keyboard Logitech Pro X + Kailh box heavy pale blue switch + Durock stabilizers
VR HMD Meta Quest 2
Benchmark Scores Who need bench when everything already fast?
E.g. they removed gamma/proper color configuration from the Radeon drivers over five years ago. The topic has been raised numerous times. Have they done anything? F no.

I think this is a good decision, because currently there are no monitors that support DDC/CI anymore; most of them support VRR with their own MPRT, including noise/blur reduction, which all makes overriding at the GPU level meaningless. In the future, DSC will become increasingly popular, with different techniques for each manufacturer; whether it's chroma subsampling, primary/secondary separation, or any other type of compression, it will render this feature completely useless.
 
Joined
Apr 18, 2013
Messages
1,260 (0.31/day)
Location
Artem S. Tashkinov
I think this is a good decision because currently there are no monitor that support DDC / CI anymore, most of them support VRR with their own MPRT, including noise / blur reduction which all makes overriding at the GPU level meaningless. In the future, DSC will become increasingly popular, with different techniques for each manufacturer, whether chroma subsampling, primary and secondary separation or any other type of compression will render this feature completely useless.

I'm talking about something completely different:




This is how it looked in old Radeon Drivers:

https://www.reddit.com/r/Amd/comments/etfm7a (that's for video)

Or this:



And now we have just this: https://www.amd.com/en/support/kb/faq/dh3-021
 
Joined
Apr 10, 2020
Messages
480 (0.33/day)
There probably is an obvious reason, just maybe not too straightforward to notice. As long as you have some software running that uses the GPU, it can and probably will regularly wake it up and run at a higher clock bin. If you happen to have monitoring software running or a browser open - browsers all use HW acceleration these days, including for video decoding - ramping up clocks is expected.

Multi-monitor is not a trivial use case. The display controller and memory need higher clocks when pushing more pixels to more monitors. This also means that the higher the resolution and refresh rate(s) of your monitor(s), the more complex it becomes. Plus, mismatched resolutions and refresh rates seem to be a bit of a problem of their own.

For example, in theory a 2160p monitor should need the same clocks as 4x 1080p monitors; in reality there is obviously some overhead to running more monitors.
I have two monitors, 1440p@165Hz and 1440p@60Hz. GPU drivers do not really seem to figure out nice low clocks for this combination. I have had the same monitor setup over the last 5-6 GPUs, and all of them have been running at increased clocks at one point or another. I'm currently using an Nvidia GPU, and the last fix for this was somewhere around the middle of last year, if I remember correctly.
The problem is I'm getting around 50 W with the 5700 XT at idle (not running anything except some background apps like Kaspersky and Steam/Epic). There really is no logical reason for such consumption.

A mismatch in resolutions and frequencies between monitors could well be the problem, but I use four identical 1920x1200 60 Hz IPS monitors, so a resolution/frequency mismatch should be ruled out, at least in my case.
I still believe this could probably be fixed at the driver level, but it might be some architectural under-optimization, since Polaris, RDNA1, and RDNA2 all suffer from this overconsumption problem while the Vega-based GPUs (56/64/Radeon VII) do not, and Nvidia's GPUs haven't seemed to suffer from it since Kepler.
 
Joined
Sep 28, 2012
Messages
963 (0.23/day)
System Name Poor Man's PC
Processor AMD Ryzen 5 7500F
Motherboard MSI B650M Mortar WiFi
Cooling ID Cooling SE 206 XT
Memory 32GB GSkill Flare X5 DDR5 6000Mhz
Video Card(s) Sapphire Pulse RX 6800 XT
Storage XPG Gammix S70 Blade 2TB + 8 TB WD Ultrastar DC HC320
Display(s) Mi Gaming Curved 3440x1440 144Hz
Case Cougar MG120-G
Audio Device(s) MPow Air Wireless + Mi Soundbar
Power Supply Enermax Revolution DF 650W Gold
Mouse Logitech MX Anywhere 3
Keyboard Logitech Pro X + Kailh box heavy pale blue switch + Durock stabilizers
VR HMD Meta Quest 2
Benchmark Scores Who need bench when everything already fast?
I'm talking about something completely different:




This is how it looked in old Radeon Drivers:

https://www.reddit.com/r/Amd/comments/etfm7a (that's for video)

Or this:



And now we have just this: https://www.amd.com/en/support/kb/faq/dh3-021

And I firmly believe that I responded accordingly: Display Data Channel (Wikipedia).
Suppose you want a manual setup: do you know your monitor supports DDC/CI, or do you just blindly trust nVidia? Just like G-Sync Ultimate: the graphics card supports it, but the monitor is only "Compatible"; are you sure it works just because the control panel says so? As I said, manual settings are useless, as most of them do not conform to ICC profiles, which come from the monitor driver, are constructed through manual calibration with an X-Rite / Datacolor SpyderX, or come with a factory pre-calibrated monitor. The idea of a manual override is good, but there's no guarantee its values won't change in games or apps with built-in ICC toggles (e.g. Adobe, DaVinci).
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
15,901 (4.58/day)
Location
Kepler-186f
I'm talking about something completely different:




This is how it looked in old Radeon Drivers:

https://www.reddit.com/r/Amd/comments/etfm7a (that's for video)

Or this:



And now we have just this: https://www.amd.com/en/support/kb/faq/dh3-021


fyi I have used those Nvidia gamma settings a lot on my gaming laptop (cause it was the only way to adjust colors on the laptop screen) and it would only enforce the colors in half the games... nvidia is just as broken as AMD here. its just easier to use an ICC profile and a color profile enforcer. nvidia doesn't work either on this, at least not on laptops. fullscreen witcher 3 reset the nvidia color gamut every time... annoyed the crap out of me... so eh, your point is moot.
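For what it's worth, the kind of thing a "color profile enforcer" keeps re-applying on Windows is a gamma ramp pushed through GDI, and any game that sets its own ramp in exclusive fullscreen will promptly overwrite it again. Below is a minimal sketch, assuming Windows and a simple per-channel gamma curve; it is not how AMD's or Nvidia's control panels are implemented internally.

```python
# Minimal sketch of what a "color profile enforcer" keeps re-applying on
# Windows: a gamma ramp pushed through GDI. Assumes Windows; a game that sets
# its own ramp in exclusive fullscreen will simply overwrite this again.
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32
user32.GetDC.restype = ctypes.c_void_p   # keep the handle pointer-sized on x64

def set_gamma(gamma=1.0):
    """Build a 3x256 16-bit ramp for the given gamma and apply it to the screen."""
    ramp = (ctypes.c_ushort * 256 * 3)()
    for i in range(256):
        value = int(min(65535, ((i / 255.0) ** (1.0 / gamma)) * 65535 + 0.5))
        for channel in range(3):          # same curve for R, G and B
            ramp[channel][i] = value
    hdc = user32.GetDC(None)              # device context for the whole screen
    ok = gdi32.SetDeviceGammaRamp(ctypes.c_void_p(hdc), ctypes.byref(ramp))
    user32.ReleaseDC(None, ctypes.c_void_p(hdc))
    return bool(ok)

if __name__ == "__main__":
    print("applied:", set_gamma(1.1))     # mild brightening; 1.0 restores linear
```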
 
Joined
Apr 21, 2010
Messages
562 (0.11/day)
System Name Home PC
Processor Ryzen 5900X
Motherboard Asus Prime X370 Pro
Cooling Thermaltake Contac Silent 12
Memory 2x8gb F4-3200C16-8GVKB - 2x16gb F4-3200C16-16GVK
Video Card(s) XFX RX480 GTR
Storage Samsung SSD Evo 120GB -WD SN580 1TB - Toshiba 2TB HDWT720 - 1TB GIGABYTE GP-GSTFS31100TNTD
Display(s) Cooler Master GA271 and AoC 931wx (19in, 1680x1050)
Case Green Magnum Evo
Power Supply Green 650UK Plus
Mouse Green GM602-RGB ( copy of Aula F810 )
Keyboard Old 12 years FOCUS FK-8100
TPU, can you check CRU in multi-monitor, at least for the 6800 series? Some claim they get a lower memory clock by using CRU.
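For context on why CRU tweaks can help: VRAM generally retrains to a lower clock only during vertical blanking, and at higher refresh rates that window shrinks, which is commonly why the driver pins the memory clock. Below is a rough illustration using assumed CVT-RB-style example timings (the vertical-total figure is not a measured value); stretching the blanking interval in CRU is one way to win that time back.

```python
# Rough illustration of why refresh rate matters for idle memory clocks: VRAM
# generally retrains to a lower clock only during vertical blanking, and at
# higher refresh rates that window shrinks. The vertical-total figure below is
# an assumed CVT-RB-style example, not a measured timing.
def vblank_us(v_active=1440, v_total=1500, refresh_hz=144):
    """Microseconds of vertical blanking per frame for the given timing."""
    frame_time_us = 1e6 / refresh_hz
    return frame_time_us * (v_total - v_active) / v_total

for hz in (60, 120, 144, 165):
    print(f"{hz:3d} Hz: ~{vblank_us(refresh_hz=hz):5.1f} µs of vblank per frame")
```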
 