If I enable any 2nd monitor, including the little baby 480p system panel, my idle is 79 watts. It's always 79. If I enable the rest of my monitors, each one adds about a watt. So with the whole setup running it's about 81 to 82 watts.
You might get something out of the CRU presets on the Acer or AOCs.
It does include a way to quickly reset everything to default settings, so no worries about permanently messing things up. You may find it a bit annoying if you have orientation settings, though...
I'm gonna be honest with you, chief. If AMD's recent track record is anything to go by, in 7 years you'll have no driver support whatsoever, with your card considered "vintage": no security driver updates, just... forgotten, along with all the bugs that were reported over the years, just like the R9 Fury X and the Radeon VII - which, mind you, is not even 5 years old yet and already discontinued. By then, don't expect any community help either. The fanbase at large will tell you that your card is old, that its architecture is mature and there was nothing more they could do, that you had no reasonable demands to make of them anyway ("the RDNA family is, what, 11 years old by now?"), and that you should just suck it up.
I know, but I really don't see anything other than the 7900XT(X) that's worth buying, considering that my side of the world is hit by the 4090 ban, which also pushes 4080 (and potentially 4080 Super) pricing to the ridiculous side, and I probably won't be satisfied with a 4070Ti (Super) anyway. And, as a more personal opinion, I don't value DLSS.
Essentially, anything above the 4070Ti became way too costly to purchase, and the 4070Ti and 7900XTX ended up in the same price bracket. Also, being overly pessimistic here, there's a remote possibility that the 4070Ti Super will also get scalped for AI reasons, just because it's NVIDIA + 16GB. (No, the 4060Ti 16GB doesn't count.)
In a more ideal world where the 4080 is ~US$1000 and there is no 4090 ban, I would be much more inclined to choose the 4080 (but would still pick a then-significantly-cheaper 7900XTX anyway).
How is your power draw with only this 4K monitor plugged in?
I posted a few findings back in post #67, when I first had lots of time to fiddle.
If I remember what I meant correctly, it should be around 10W (not single monitor; the 4K150Hz + 1080p60Hz together).
EDIT: 4K150 + 1080p60 is around 21W. 4K150 only is around 15W. 4K60 only is around 11W. CRU didn't make enough of a difference.
You still used a budget cable. Ugreen is Amazon Basics tier. If I were you, I would have bought a cable with 8K support. The DP ports on the 7900 series cards are DP 2.1. You should use one of those cables and see. With modern GPUs on the AMD side, the cable spec matters more.
On my side of the world (Macau, if you are curious) it's either UGREEN or unbranded/wacky-Chinese-branded stuff (be it Taobao or a local store)... or ordering from Amazon overseas with a ridiculous shipping fee. (That link does look wacky-Chinese-branded to my untrained eyes; if that's actually a good brand, then forgive my ignorance! But even then, if I really have to buy another cable, I will probably choose a UGREEN DP 2.1 cable anyway.)
For information, if I remember correctly, the cable I bought has VESA certification (it is not listed in the product information currently), and claims support for up to 8K60Hz / 4K240Hz.
Besides, the cable did help a little bit with power draw, if I look at the averages.
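For anyone curious why the cable spec matters at 4K150, here's a back-of-the-envelope bandwidth check. The blanking overhead and link-efficiency figures below are rough approximations I'm assuming (CVT-RB-style blanking, standard DP line coding), not exact timings from any spec sheet:

```python
# Rough sketch: bandwidth a display mode needs vs. what a DP link can carry.

def mode_gbps(w, h, hz, bpp=24, blanking=1.05):
    """Approximate data rate in Gbps; ~5% blanking overhead is an assumption."""
    return w * h * hz * bpp * blanking / 1e9

# Approximate effective payload rates after line coding:
DP14_HBR3 = 32.4 * 0.8        # 8b/10b encoding -> ~25.92 Gbps
DP21_UHBR20 = 80 * 128 / 132  # 128b/132b encoding -> ~77.6 Gbps

need = mode_gbps(3840, 2160, 150)
print(f"4K150 8-bit needs roughly {need:.1f} Gbps")               # ~31.4 Gbps
print("fits DP1.4 HBR3 uncompressed:", need <= DP14_HBR3)         # False
print("fits DP2.1 UHBR20 uncompressed:", need <= DP21_UHBR20)     # True
```

So 4K150 at 8-bit already overshoots a plain DP 1.4 link without DSC, which is roughly why an 8K-rated / DP 2.1 cable is the safer buy for these cards, even if the marginal idle-power effect is a separate question.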
Side note: to my brother's further shock, horror, and relief, he sorted out his idle power draw issues without messing with refresh rates, this time by enabling DSR on his Philips 170Hz. Yes, really.
For the lolz and science, I tried similar things (VSR'ing the 4K to, ehh, 5760x3240, and the AOC to 1440p/4K/whatever) and there was virtually no change in idle power draw. (Well, it forces the AOC to 144Hz, so 90W again.)
I also ran FM8 at 5760x3240 for a while and it did look super great... until I spotted some shimmering and got overly annoyed. Yes, I know FSR is not very good.