
Does the 7900XT still have high idle power consumption on multiple monitors? And should I wait for the Supers even when I'm leaning towards AMD cards?

Joined
Sep 17, 2014
Messages
22,278 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Now I've got my hands on the card, and after a cleaning and a messy cable-running session...
Here are the numbers that a few people have somehow been waiting for. Saaaaaaadly...
(screenshot of the power readings)
The numbers hover in the 80~100W range. Using the latest driver at the time (23.11.1) and the quiet BIOS, if that matters.
Without the M27U (the 4K monitor) it averages ~22W, with lows of 7W if I wait hard enough.
Also, the screenshot function doesn't work perfectly, probably because of the various monitor scalings / positions, but I digress.

Other things I tried that didn't help the situation:
Changing the color depth of the M27U from 10bpc to 8bpc
Changing the resolution of the M27U from 4K to 1080p (I can guess from the Windows settings screen that the signal output is still 4K)
Changing the refresh rate of the M27U (default 150Hz; tried 144Hz, 60Hz, 59.94Hz)
EDIT: Advanced Power Settings -> PCIe -> changing from Moderate(?) to Maximum power savings
Flipping the switch for Eyefinity for the second time (see below)

Things that helped the situation in the wrong way:
Flipping the switch for Eyefinity for the first time
(Because of the wacky positioning, of course that was going to be a horror show. Maybe because I touched some other settings for the monitors, the second time I flipped the switch the power draw stayed high.
The physical space constraints mean that I can't put 3 monitors side by side horizontally. But moar monitors are moar monitors, hence the wacky positioning / config.)




I really didn't think this through, huh? What an idiot I am. : /
Now if only Windows always remembered to turn off the monitors...
(I guess it's the trivial things I run that keep stealing Windows' attention.)


I swear the SG really isn't 61mm tall; I measured it and put both cards next to each other for comparison.
Also, I really hate alcoholic beverages (an allergy, I guess, but also general dislike), but I appreciate the thoughts out there...
Well shit.

You're getting the full 3D memory clock there. That's what's pulling those watts.

You can also try enabling "maximum power savings" for PCIe under the advanced power settings of the Windows power profile.

3 monitors seem to be the tipping point between relatively decent idle usage and basically doubled idle power draw, especially with higher resolutions, refresh rates, and bpc.
Yep, the card (or the driver) thinks it just needs to keep the full memory clock.

That's probably the thing. For decent idle power, you need all of your monitors to be relatively low res and low refresh rate. For example, dropping my 144 Hz display to 60 Hz drops idle power from 36-40 W to about 25 W.
It's really just whether or not the total load you put on the card in 2D is enough to trigger a higher power state. You can intermittently see clock speeds jump up whenever you do something on the desktop that's graphically 'something', even dragging a window. Most of the time it's too low to move to that higher power state, but if the base resolution output plus the refresh rate is high, your base load is already on the edge of where the card thinks it can stay in a low power state. I also think it's plausible that monitors running at different refresh rates hurt efficiency.

You can play with this for hours in Adrenalin, but all settings approach this from a similar angle: you're trying to limit power usage, and you can't change the way the card detects the need for higher clocks, apart from perhaps BIOS edits.
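For a rough feel of why refresh rate pushes things over that edge: a common explanation (my assumption here, not something confirmed above) is that the driver can only retrain the VRAM clock during vertical blanking, and that window shrinks as refresh climbs. A minimal sketch, with an assumed CVT-reduced-blanking-style V-total of 1481 lines rather than anything read from a real EDID:

```python
# Illustrative only: per-frame vertical blanking time for a 1440p panel.
# The driver needs this window to retrain VRAM without visible artifacts;
# higher refresh rates leave less of it per frame.

def vblank_us(v_active: int, v_total: int, refresh_hz: float) -> float:
    """Vertical blanking duration per frame, in microseconds."""
    frame_us = 1_000_000 / refresh_hz
    return frame_us * (v_total - v_active) / v_total

# assumed V-total of 1481 lines for a 2560x1440 mode
for hz in (60, 100, 144):
    print(f"1440p @ {hz} Hz -> ~{vblank_us(1440, 1481, hz):.0f} us of blanking per frame")
```

Tools like CRU attack the same relationship from the other direction, by stretching the blanking interval in the modeline.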
 
Last edited:
Joined
Jun 25, 2020
Messages
140 (0.09/day)
System Name The New, Improved, Vicious, Stable, Silent Gaming Space Heater
Processor Ryzen 7 5800X3D
Motherboard MSI B450 Tomahawk Max
Cooling be quiet! DRP4 (w/ added SilentWings3), 4x Noctua A14x25G2 (3 @ front, 1 @ back)
Memory Teamgroup DDR4 3600 16GBx2 @18-22-22-22-42 -> 18-20-20-20-40
Video Card(s) PowerColor RX7900XTX HellHound
Storage ADATA SX8200Pro 1TB, Crucial P3+ 4TB (w/riser, @Gen2x4), Seagate 3+1TB HDD, Micron 5300 7.68TB SATA
Display(s) Gigabyte M27U @4K150Hz, AOC 24G2 @1080p100Hz(Max144Hz) vertical, ASUS VP228H@1080p60Hz vertical
Case Phanteks P600S
Audio Device(s) Creative Katana V2X gaming soundbar
Power Supply Seasonic Vertex GX-1200 (ATX3.0 compliant)
Mouse Razer Deathadder V3 wired
Keyboard Keychron Q6Max
One more observation:

Plugging only the primary display into the GPU (3440x1440, 144 Hz in my case), and the secondary (1024x600, 43 Hz) into the motherboard sets idle power consumption right: 18 W sitting on the Windows desktop, which is less than half of what it used to be. No more need for an ultra-low display switch-off time! :)

Video playback power is still high at roughly 50 W, but at least GPU behaviour in gaming is now correct.
Welp, my CPU is a 5800X3D, so no motherboard video output for me...
But it looks like I run too many GPU-accelerated low-load things at the same time (well, actually, too many Firefox pages and the Chromium thingy), enough to cause instability in heavy gaming workloads, so it does make me want to think about a cheapo AMD card. (EDIT: like an RX6400 @ ~USD 100, this time really on a x1 slot + a riser)
Well shit.

You're getting the full 3D memory clock there. That's what's pulling those watts.
I did get the idle power down to about 49W after lots of tinkering in CRU, so it's mostly fine now.
 
Last edited:
Joined
Dec 10, 2022
Messages
484 (0.70/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
@Avro Arrow
->brands : Well, I know a few people who believe in the TUF/STRIX religion, so...
So.... what? I know a few people who think Elvis is still alive! Some people are just plain :kookoo: but are too :kookoo: to realise that they're :kookoo:. :roll:
->Cooler : I have gone out of my way to buy lots of be quiet! and Noctua stuff for the sake of noise levels. And my current home has a pretty low noise floor, so a really quiet cooler is kinda important. But hey, preferences.
I think the Hellhound probably has a slightly better cooler (not sure!) with a few added features (BIOS switch, LED switch to change color / turn off RGB entirely) which the Pulse doesn't have. The Hellhound ended up only ~3% pricier than the Pulse instead of ~10%, which is worth it IMO. And hey, every card is cool, but Hellhound is cooler.
It might be slightly cooler, but you're really splitting hairs at this point, because I've never had a card that ran hot or loud except my first HD 4870, and that's because it had a blower cooler. I've never had any thermal or noise issues with a triple-fan card (which is all I ever buy anyway because I have a gigantic case). You wouldn't be able to tell even as much as a 10°C difference between two cards; as long as they're both within spec, they'll both just run at their respective temperatures with the same fan speed without issue.
-> Undervolt : I don't see undervolting being useful in helping idle power draw. Single-digit power draw is only seen when all the monitors are turned off by Windows. General non-gaming usage for my 7900XTX started at 80W (before all the tinkering below), so yeah.
Undervolting affects all aspects of power use because you're running the card at a lower voltage at all times. Sure, it will make a bigger difference under load but it still makes a difference at idle as well.
 
Joined
Sep 17, 2014
Messages
22,278 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Undervolting affects all aspects of power use because you're running the card at a lower voltage at all times. Sure, it will make a bigger difference under load but it still makes a difference at idle as well.
Not sure that's true; the undervolt you set is for peak voltage only, and while the V/F curve does change along with it, at the bottom it barely changes, if at all. Idle is idle.
 
Joined
Dec 28, 2012
Messages
3,813 (0.88/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
This is why I push the reasoning, not the value, of what AMD does (raw compute one notch higher per market segment, a larger framebuffer that adds potential longevity vs the 4070 Ti). It's not that I don't appreciate nVIDIA's features; it's that I want people to realize they are a choice: perks in current titles at the cost of that longevity. They won't age well. If that's your jam (you want playable 1440p or really high-end 1080p for RIGHT NOW), buy a 4070 Ti, awesome. If you want the compute (which is increasingly being used for the equivalent of nVIDIA's fixed-function hardware) and the buffer of a 4080 for wayyy less, buy a 7900xt. If you want what the 4070 Ti should have been (16GB for >1080p longevity), buy a 7800xt.

They're for two different kinds of consumers, but the value should not be as lopsided as it is, and it truly is this way because people are generally short-sighted / uninformed / mesmerized by nVIDIA's marketing (until the next product launch, when they're left in the cold, complain about it, and then buy the card nVIDIA wants them to upgrade to in order to maintain that level of performance). It's only by educating people on these realities that perception changes and we bring nVIDIA's pricing down to earth. It's not about being any kind of fanboy of a company; it's about wanting better prices and choices for everyone, and about people realizing what each purchase means and where it fits with their (short- or long-term) goals.

Like I said, I want to minimize the number of people who are pissed when their $800 4070 Ti won't play a future game at cranked 1440p60, or at any kind of decent upscaled 4K resolution. If they're cool with 1440p DLSS Quality (960p) or 4K Performance (1080p) for $800 because of comparative DLSS quality (at each res) or RT features that allow higher quality at a lower rez (and that will age badly as newer archs launch with improved capability per tier), that's wonderful. Given that the 7900xt is priced less and will sustain real higher resolutions (1440p / decent 4K upscaling) much longer on the whole, I am not, and I don't think most people should be, but that's their choice. Across the stack, now and (conceivably) forever.
This argument of "longevity" falls apart when you realize that nvidia provides game ready drivers for 10 year old cards yet AMD cuts support off for 6 year old hardware. But we just ignore that because MuH fInE wInE. Anything with sufficient VRAM will work fine for the rest of this console generation and likely 1-2 years into the next at the settings you buy it to play on today.
 
Joined
Jun 25, 2020
Messages
140 (0.09/day)
System Name The New, Improved, Vicious, Stable, Silent Gaming Space Heater
Processor Ryzen 7 5800X3D
Motherboard MSI B450 Tomahawk Max
Cooling be quiet! DRP4 (w/ added SilentWings3), 4x Noctua A14x25G2 (3 @ front, 1 @ back)
Memory Teamgroup DDR4 3600 16GBx2 @18-22-22-22-42 -> 18-20-20-20-40
Video Card(s) PowerColor RX7900XTX HellHound
Storage ADATA SX8200Pro 1TB, Crucial P3+ 4TB (w/riser, @Gen2x4), Seagate 3+1TB HDD, Micron 5300 7.68TB SATA
Display(s) Gigabyte M27U @4K150Hz, AOC 24G2 @1080p100Hz(Max144Hz) vertical, ASUS VP228H@1080p60Hz vertical
Case Phanteks P600S
Audio Device(s) Creative Katana V2X gaming soundbar
Power Supply Seasonic Vertex GX-1200 (ATX3.0 compliant)
Mouse Razer Deathadder V3 wired
Keyboard Keychron Q6Max
So.... what? I know a few people who think Elvis is still alive! Some people are just plain :kookoo: but are too :kookoo: to realise that they're :kookoo:. :roll:

It might be slightly cooler but you're really splitting hairs at this point because I've never had a card that ran hot or loud except my first HD 4870 because it had a blower cooler. I've never had any thermal or noise issues from a triple-fan card (which is all I ever buy anyway because I have a gigantic case). You wouldn't be able to tell even as much as a 10°C difference between two cards because as long as they're both within spec, they'll both just run the same at the two temperatures with the same fan speed without issue.
"Elvis is dead" is a fact. "TUF/STRIX is good/bad" is at best subjective. I don't want to do as far to discredit them.
Besides, I remember lots of the coolers in TUF/STRIX cards are best in their classes in TPU reviews.
The TUF 7900XTX got a much better deal just before I pushed the BUY IT button that I would have considered if not for the pesky 4-slot form factor.

And yes, the 3070 SG that I used is triple-slot and triple-fan, but a very cheap one. I really have to choose between "quiet but hot enough to crash" or "cool enough but very noticable" in the fairly quiet place I live in. Not noisy enough to seriously bother me but still.
I'm probably splitting hairs right now. I know. I know. That's probably how the brain of a quiet enthusiast works, if that is a thing.
This argument of "longevity" falls apart when you realize that nvidia provides game ready drivers for 10 year old cards yet AMD cuts support off for 6 year old hardware. But we just ignore that because MuH fInE wInE. Anything with sufficient VRAM will work fine for the rest of this console generation and likely 1-2 years into the next at the settings you buy it to play on today.
Okay, looks like a fair argument. But...Hmm...
Looking back at this link, the wording is "putting (Vega / Polaris) on a slower driver update track".
Then I dug a bit further (kinda randomly picked high/top-end models of the time) and learned about:
- R9 390 (launched 2015/06, discontinued 2022/06 -> 7yrs)
- GTX980Ti (launched 2015/06, still supported)
- 780Ti (launched 2013/07, last game ready driver 2021/09 -> 8yrs, security support ongoing)
Not exactly 6 yrs, but I can see your point. Pricing on the AMD side is not that top-end to start with, but anyway...
But hey, my 5700XT did well for me in its later days, and I would guesstimate that the 7900XTX will fare better than the 4080 over such a 6~7 yr lifespan, so I'm personally okay with the "fine wine" argument.

The 1070 (launched 2016) I previously owned was really struggling when I replaced it in ~2021. It was very stable, but I learned the term "high refresh monitors". And it was more mid-range to start with. I know. I know...
Higher-end cards like the 1080 (Ti) are gonna fare better for sure. But a 980Ti in 2021? I don't know why, but I winced a bit as I typed.
........
Okay, this is the first top-end card I've owned. I don't know what to expect of how a top-end card will fare beyond 7 years, other than being Captain Obvious and saying "reasonably well". My brain is really murky and ranty here.
 
Last edited:

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,105 (2.37/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
"Elvis is dead" is a fact. "TUF/STRIX is good/bad" is at best subjective. I don't want to do as far to discredit them.
Besides, I remember lots of the coolers in TUF/STRIX cards are best in their classes in TPU reviews.
The TUF 7900XTX got a much better deal just before I pushed the BUY IT button that I would have considered if not for the pesky 4-slot form factor.

And yes, the 3070 SG that I used is triple-slot and triple-fan, but a very cheap one. I really have to choose between "quiet but hot enough to crash" or "cool enough but very noticable" in the fairly quiet place I live in. Not noisy enough to seriously bother me but still.
I'm probably splitting hairs right now. I know. I know. That's probably how the brain of a quiet enthusiast works, if that is a thing.

Okay, looks like a fair argument. But...Hmm...
Looking back at this link, the wording is "putting (Vega / Polaris) on a slower driver update track".
Then I dug a bit further (kinda randomly picked high/top-end models of the time) and learned about:
- R9 390 (launched 2015/06, discontinued 2022/06 -> 7yrs)
- GTX980Ti (launched 2015/06, still supported)
- 780Ti (launched 2013/07, last game ready driver 2021/09 -> 8yrs, security support ongoing)
Not exactly 6 yrs, but I can see your point. Pricing on the AMD side is not that top-end to start with, but anyway...
But hey, my 5700XT did well for me in its later days, and I would guesstimate that the 7900XTX will fare better than the 4080 over such a 6~7 yr lifespan, so I'm personally okay with the "fine wine" argument.

The 1070 (launched 2016) I previously owned was really struggling when I replaced it in ~2021. It was very stable, but I learned the term "high refresh monitors". And it was more mid-range to start with. I know. I know...
Higher-end cards like the 1080 (Ti) are gonna fare better for sure. But a 980Ti in 2021? I don't know why, but I winced a bit as I typed.
........
Okay, this is the first top-end card I've owned. I don't know what to expect of how a top-end card will fare beyond 7 years, other than being Captain Obvious and saying "reasonably well". My brain is really murky and ranty here.

For TUF/Strix specifically, though, it varies wildly for every given GPU, and the only real constant is that it's expensive as hell relative to other brands' versions of said GPU. Asus also has a track record of visibly neglecting Radeon in the past few years.

On the AMD side, Asus doesn't really do anything unique. PowerColor, Sapphire and XFX all use ball-bearing fans, and basically all the AIBs build on the same respectable AMD reference PCB, so any advantage from Asus' custom PCB design is minimal.
 
Joined
Feb 24, 2023
Messages
2,922 (4.74/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / R9 380 2 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / MSi G2712
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
the 7900XTX will fare better than the 4080 over such a 6~7 yr lifespan
It really looks very much like HD 7970 VS GTX 680 and RX 480 VS GTX 1060.

A little advantage on the nVidia side in the first year, parity throughout the next couple of years, then the nVidia GPU falls off in recent games a bit harder than the AMD one, but... 25 FPS versus 21, is that worth debating? Both GPUs are basically dead.

I very much agree with discontinuing support for obsolete GPUs because it leaves more headroom for supporting the ones that still matter, id est the current gen and the last 1.5 gens. With AMD's software department being so limited, it's either "we got all GPUs supported but each GPU works like trash because we couldn't realistically optimise the driver" or "sorry, but your RX 480 is no longer getting new drivers." RX 480 owners ain't gonna play the latest AAA titles and expect 30+ FPS anyway.
 
Joined
Dec 10, 2022
Messages
484 (0.70/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
"Elvis is dead" is a fact. "TUF/STRIX is good/bad" is at best subjective. I don't want to do as far to discredit them.
What I meant was that just because some people say things, it doesn't make what they say true, even if they themselves fully believe it.
Besides, I remember a lot of the coolers on TUF/STRIX cards being best in their class in TPU reviews.
Perhaps, but your focus is on "good" when it should be on "good enough". Sure, the Strix might have the best cooler (the TUF does NOT), but it's like I said: you're splitting hairs at that point. Consider how many cards I've owned, and I've been 100% satisfied with all of them (except the one RX 5700 XT that was defective). I slot them into my motherboard and away I go without a second thought.

I've been doing this for so long that I see through the marketing BS and the kinda-sorta BS from reviewers. I say kinda-sorta because it's their job to show differences between products to help you select the one that's best for you, and that's fair enough. The problem is that when the differences between products are insignificant, they have to talk about differences that you would never notice in 1000 years.

Newbies have this insane level of overthinking their purchases because it's a lot of money to part with, and if you're not used to it already (like I am), then you're going to worry. Researching things to death leads to data, but if you don't understand the data, it's a lot to figure out.

Don't get me wrong, having the data is better than not having it, but you should never let it drag you down a rabbit hole, and that seems to be what you're doing.
 
Joined
Sep 17, 2014
Messages
22,278 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Okay, this is the first top-end card I've owned. I don't know what to expect of how a top-end card will fare beyond 7 years, other than being Captain Obvious and saying "reasonably well". My brain is really murky and ranty here.
Cards simply last as long as they perform. I have a pretty good track record of keeping cards for a long time. The 1080 lasted a full 6.5 years as my primary gaming card, and it's still completely usable for some undeterminable period of time. The GPU itself still looks like it did on day one, temps are still good, and performance is still fine for many, many games. I never considered whether it would actually last that long when I bought it, though. The fact is, GPU chips just don't die, or barely ever do, much like CPUs. If anything goes, it's the memory or something on the PCB, and when it goes, it just goes. I often see people open up their hardware more than once in its lifetime; that's something I would only ever do if the card were out of warranty and clearly underperforming from a lack of cooling. Similar things apply to changing the BIOS.

Matter of fact, it doesn't even matter one bit whether you own a mid-range 250-dollar budget-cooled card or a high-end 300W 3-slot monstrosity. Either one could go at any time of the day, any year into the future. You'll even come across ancient AGP cards that still run, regardless of their performance bracket.

If it ain't broke, don't fix it, and if it works, just enjoy it.
 
Joined
Jan 14, 2019
Messages
12,167 (5.74/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Cards simply last as long as they perform. I have a pretty good track record of keeping cards for a long time. The 1080 lasted a full 6.5 years as my primary gaming card, and it's still completely usable for some undeterminable period of time. The GPU itself still looks like it did on day one, temps are still good, and performance is still fine for many, many games. I never considered whether it would actually last that long when I bought it, though. The fact is, GPU chips just don't die, or barely ever do, much like CPUs. If anything goes, it's the memory or something on the PCB, and when it goes, it just goes. I often see people open up their hardware more than once in its lifetime; that's something I would only ever do if the card were out of warranty and clearly underperforming from a lack of cooling. Similar things apply to changing the BIOS.

Matter of fact, it doesn't even matter one bit whether you own a mid-range 250-dollar budget-cooled card or a high-end 300W 3-slot monstrosity. Either one could go at any time of the day, any year into the future. You'll even come across ancient AGP cards that still run, regardless of their performance bracket.

If it ain't broke, don't fix it, and if it works, just enjoy it.
I think the no. 1 component that dies on a GPU is the fan, which, luckily, can be replaced, although getting replacement parts can be a bit of a pain. That's why I keep a passively cooled 1050 Ti as a backup.

With that said, I've been lucky enough never to have a GPU die on me, and I've had many! :)
 
Joined
Feb 24, 2023
Messages
2,922 (4.74/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / R9 380 2 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / MSi G2712
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
no.1 component that dies on a GPU is the fan
That is why my RX 6700 XT looks like a Frankenstein's beast rather than a GPU as of now.
 
Joined
Dec 10, 2022
Messages
484 (0.70/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
I think the no. 1 component that dies on a GPU is the fan, which, luckily, can be replaced, although getting replacement parts can be a bit of a pain. That's why I keep a passively cooled 1050 Ti as a backup.

With that said, I've been lucky enough never to have a GPU die on me, and I've had many! :)
Usually, you can find replacement fans on eBay for most cards. Quite often, a fan can be used on several cards, because card manufacturers often outsource their cooling solutions to companies like Cooler Master. In cases like that, the fan mounts are often exactly the same. Like, I'm pretty sure that the fans on my R9 Furies would fit my RX 7900 XT or RX 5700 XT if I needed them to. It does pay to have older flagship cards lying around that can be pillaged for parts. A Sapphire R9 Fury Nitro+ has a TDP of 275W with hot-running HBM, which means its cooler is no joke and has high-quality components.

Now, my reference RX 6800 XT might be a different story because it has an all-metal shroud and those fans look completely different. :D
That is why my RX 6700 XT looks like a Frankenstein's beast rather than a GPU as of now.
Hey, if it works, it works! Who cares what it looks like? To me, someone with a card that looks like yours is someone who knows what they're doing. Most people wouldn't dare try changing their card fans on their own. ;)
 
Joined
Jun 25, 2020
Messages
140 (0.09/day)
System Name The New, Improved, Vicious, Stable, Silent Gaming Space Heater
Processor Ryzen 7 5800X3D
Motherboard MSI B450 Tomahawk Max
Cooling be quiet! DRP4 (w/ added SilentWings3), 4x Noctua A14x25G2 (3 @ front, 1 @ back)
Memory Teamgroup DDR4 3600 16GBx2 @18-22-22-22-42 -> 18-20-20-20-40
Video Card(s) PowerColor RX7900XTX HellHound
Storage ADATA SX8200Pro 1TB, Crucial P3+ 4TB (w/riser, @Gen2x4), Seagate 3+1TB HDD, Micron 5300 7.68TB SATA
Display(s) Gigabyte M27U @4K150Hz, AOC 24G2 @1080p100Hz(Max144Hz) vertical, ASUS VP228H@1080p60Hz vertical
Case Phanteks P600S
Audio Device(s) Creative Katana V2X gaming soundbar
Power Supply Seasonic Vertex GX-1200 (ATX3.0 compliant)
Mouse Razer Deathadder V3 wired
Keyboard Keychron Q6Max
Drowning in the data rabbit hole is kind of an occupational disease...
And yes, I checked Taobao and there are replacement fans if I really need them.
Now I hope nothing blows up on the card during its lifespan, and I really mean nothing...

Update:
After a loooong look at the RX7000 series owners club thread, I thought to myself, sure, a new cable won't hurt...
So I bought a new UGREEN DisplayPort 1.4 cable to replace the (unbranded, probably bundled) cable on the AOC 24G2. The monitor only supports up to DP1.2, and was previously running at 100Hz instead of its maximum supported 144Hz.
The cable on the GB M27U is the bundled cable from GB, running DP2.1.
The cable on the cheapo ASUS is a new HDMI 2.0 cable. I could try DP-to-DVI if needed, but it is not likely causing any bump in the power draw numbers anyway.

The new cable only helped a very little bit on default settings (moderate power savings, max refresh or 100Hz; run-to-run variance, I guess).
But if I flip the PCIe link state switch to high power savings (or whatever it is called) while pushing back to max refresh rate, the VRAM clock fluctuates a lot more on the new cable, allowing 5~10% power savings compared to the old cable.
If I set the AOC to 120Hz, idle power draw goes down to ~60W on average.
Going back to 100Hz allows the VRAM to drop below 909MHz and yields a 30~35W idle power draw on average!

Unfortunately, the high power savings setting destabilizes gaming workloads, so I have to flip the PCIe link state switch back to moderate power savings when I actually game.
Now, if only there were a way to flip that switch without jumping through all those settings hoops...
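For what it's worth, that switch can apparently be flipped from a script: powercfg exposes the same "Link State Power Management" setting through its sub_pciexpress / ASPM aliases, where 1 = moderate and 2 = maximum power savings. A minimal sketch, untested on my setup, to be run from an elevated prompt:

```python
# Hedged sketch: toggle "Link State Power Management" via powercfg's
# documented aliases (sub_pciexpress / ASPM). 0 = off, 1 = moderate,
# 2 = maximum power savings. Needs an elevated prompt on most systems.
import subprocess
import sys

def set_pcie_aspm(index: int) -> None:
    # write the AC value for the active scheme, then re-apply the scheme
    subprocess.run(["powercfg", "/setacvalueindex", "scheme_current",
                    "sub_pciexpress", "aspm", str(index)], check=True)
    subprocess.run(["powercfg", "/setactive", "scheme_current"], check=True)

if __name__ == "__main__":
    # e.g. `python aspm.py 2` before idling, `python aspm.py 1` before gaming
    set_pcie_aspm(int(sys.argv[1]))
```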

I have wasted too much time on this unscientific experiment, so until a new driver drastically improves things, I will let things be for now. 45~50W is not a horrible number for this wacky setup and by big-card standards, though I really hope a new driver will further improve these numbers... or stabilize the high power savings setting.

Side note: to the shock and horror of myself and my brother, his setup (3070ti, MSI 27" 1440p165Hz QD thingy, Philips 27" 1440p170Hz IPS) uses ~75W idle from the graphics card alone, no matter the link state settings. He worked around it with similar methods (in the NVIDIA Control Panel: use a 144Hz profile on the Philips, push to 164Hz, bam, ~25W idle).
He really doesn't like the MSI (something in the software, and it really hates custom resolutions) and plans to swap it for a 27" 1440p240Hz OLED, probably with a 4080. Funny numbers ensue, I guess, but that's a few months away and another story.
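Side side note: if anyone wants to actually log what an NVIDIA card is doing while tweaking those profiles, nvidia-smi can poll power draw and memory clock from a script. A small sketch, assuming nvidia-smi is on PATH:

```python
# Poll GPU power draw and memory clock once per second via nvidia-smi.
# power.draw and clocks.mem are standard nvidia-smi query fields.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw,clocks.mem",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(time.strftime("%H:%M:%S"), out)  # e.g. "12:00:00 24.51 W, 405 MHz"
    time.sleep(1)
```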
 
Joined
Jan 14, 2019
Messages
12,167 (5.74/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Try the new 23.12.1 driver! It claims to help with multi-monitor idle power draw, and it actually does with my system. :)
 
Joined
Jun 25, 2020
Messages
140 (0.09/day)
System Name The New, Improved, Vicious, Stable, Silent Gaming Space Heater
Processor Ryzen 7 5800X3D
Motherboard MSI B450 Tomahawk Max
Cooling be quiet! DRP4 (w/ added SilentWings3), 4x Noctua A14x25G2 (3 @ front, 1 @ back)
Memory Teamgroup DDR4 3600 16GBx2 @18-22-22-22-42 -> 18-20-20-20-40
Video Card(s) PowerColor RX7900XTX HellHound
Storage ADATA SX8200Pro 1TB, Crucial P3+ 4TB (w/riser, @Gen2x4), Seagate 3+1TB HDD, Micron 5300 7.68TB SATA
Display(s) Gigabyte M27U @4K150Hz, AOC 24G2 @1080p100Hz(Max144Hz) vertical, ASUS VP228H@1080p60Hz vertical
Case Phanteks P600S
Audio Device(s) Creative Katana V2X gaming soundbar
Power Supply Seasonic Vertex GX-1200 (ATX3.0 compliant)
Mouse Razer Deathadder V3 wired
Keyboard Keychron Q6Max
Try the new 23.12.1 driver! It claims to help with multi-monitor idle power draw, and it actually does with my system. :)
I tried it (EDIT: the day it released); it didn't change anything on my setup regarding idle power draw, so I didn't bother to write anything in this thread. : /
To be fair to AMD, the exact words in the release notes are "dual monitor display setups".

(EDIT: I will post as soon as something got improved. And maybe a "in hindsight..." post after the Super launch.)
 
Last edited:

OptimalMayhem

New Member
Joined
Dec 18, 2023
Messages
1 (0.00/day)
I thought I'd add my own personal experience as some data for this conversation.

I've had a reference model 7900xt since launch, and I've always had the high idle power consumption; for me it still exists now. I've always kept my drivers up to date and I've never seen any difference as drivers have come and gone.

I have a 4-monitor setup, but my monitors are shared between 2 systems, so I don't always have them all being run by my 7900xt. If I need them for my laptop, I just disable them in Windows on my 7900xt PC and enable them in Windows on the laptop.

One of the 4 monitors is a small 60hz 480p display that I use as my system monitor. My main monitor is a 165hz 1440p Acer Nitro display. The other 2 are 144hz 1440p AOC displays. The little guy uses HDMI; the rest of them use DP.

If I have only the Acer monitor enabled, my idle power is about 10 watts, and it jumps around between that and 20 watts if I'm just doing light tasks like web browsing.

If I enable any 2nd monitor, including the little baby 480p system panel, my idle is 79 watts. It's always 79. If I enable the rest of my monitors, each one adds about a watt, so with the whole setup running it's about 81 to 82 watts.
 
Joined
Dec 25, 2020
Messages
6,509 (4.63/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Okay, this is the first top-end card I've owned. I don't know what to expect of how a top-end card will fare beyond 7 years, other than being Captain Obvious and saying "reasonably well". My brain is really murky and ranty here.

I'm gonna be honest with you, chief. If AMD's recent track record is anything to go by, in 7 years you'll have no driver support whatsoever, with your card considered "vintage" and receiving no security driver updates; it'll just be... forgotten, with all the bugs that were reported over the years, just like the R9 Fury X and the Radeon VII - which, mind you, is not even 5 years old yet and already discontinued. By then, don't expect any community help either. The fanbase at large will tell you that your card is old, that its architecture is well-developed, that there was nothing they could do or no reasonable demand you could ask of them anyway ("the RDNA family is what, 11 years old by now?"), and that you should just suck it up.

The cable on the GB M27U is the bundled cable from GB, running DP2.1.

How is your power draw with only this 4K monitor plugged in? My RTX 4080 is at around 8 W idling with a 4K120, 10-bit HDR signal. I have EGS and Discord open and that's about it. Only one panel, of course, since this is a 55-inch OLED and it's basically intended to be a universal solution to the multiple-screen issue (I like the screen real estate, dislike having multiple monitors).

 
Joined
Feb 24, 2023
Messages
2,922 (4.74/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / R9 380 2 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / MSi G2712
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
I'm gonna be honest with you, chief. If AMD's recent track record is anything to go by, in 7 years you'll have no driver support whatsoever, with your card considered "vintage" and receiving no security driver updates; it'll just be... forgotten, with all the bugs that were reported over the years, just like the R9 Fury X and the Radeon VII - which, mind you, is not even 5 years old yet and already discontinued. By then, don't expect any community help either. The fanbase at large will tell you that your card is old, that its architecture is well-developed, that there was nothing they could do or no reasonable demand you could ask of them anyway ("the RDNA family is what, 11 years old by now?"), and that you should just suck it up.



How is your power draw with only this 4K monitor plugged in? My RTX 4080 is at around 8 W idling with a 4K120, 10-bit HDR signal. I have EGS and Discord open and that's about it. Only one panel, of course, since this is a 55-inch OLED and it's basically intended to be a universal solution to the multiple-screen issue (I like the screen real estate, dislike having multiple monitors).

Slightly off-topic but a single 4K60 + RX 6700 XT + latest WHQL drivers:
(two screenshots of idle power readings)

You just don't compare AMD to nVidia in this regard. AMD is orders of magnitude behind.
 
Joined
Mar 1, 2021
Messages
482 (0.36/day)
Location
Germany
System Name Homebase
Processor Ryzen 5 5600
Motherboard Gigabyte Aorus X570S UD
Cooling Scythe Mugen 5 RGB
Memory 2*16 Kingston Fury DDR4-3600 double ranked
Video Card(s) AMD Radeon RX 6800 16 GB
Storage 1*512 WD Red SN700, 1*2TB Curcial P5, 1*2TB Sandisk Plus (TLC), 1*14TB Toshiba MG
Display(s) Philips E-line 275E1S
Case Fractal Design Torrent Compact
Power Supply Corsair RM850 2019
Mouse Sharkoon Sharkforce Pro
Keyboard Fujitsu KB955
You also don't compare single-monitor setups vs. multi-monitor setups.

That aside, they both need improvements in regard to multi-monitor usage, especially with mixed resolutions and refresh rates.
 
Joined
Feb 24, 2023
Messages
2,922 (4.74/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / R9 380 2 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / MSi G2712
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
You also don't compare single-monitor setups vs. multi-monitor setups.
Of course, but the dude asked about a single-monitor setup.
 
Joined
Jun 2, 2017
Messages
8,969 (3.31/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Drowning in the data rabbit hole is kind of an occupational disease...
And yes, I checked Taobao and there are replacement fans if I really need them.
Now I hope nothing blows up on the card during its lifespan, and I really mean nothing...

Update:
After a loooong look at the RX7000 series owners club thread, I thought to myself, sure, a new cable won't hurt...
So I bought a new UGREEN DisplayPort 1.4 cable to replace the (unbranded, probably bundled) cable on the AOC 24G2. The monitor only supports up to DP1.2, and was previously running at 100Hz instead of its maximum supported 144Hz.
The cable on the GB M27U is the bundled cable from GB, running DP2.1.
The cable on the cheapo ASUS is a new HDMI 2.0 cable. I could try DP-to-DVI if needed, but it is not likely causing any bump in the power draw numbers anyway.

The new cable only helped a very little bit on default settings (moderate power savings, max refresh or 100Hz; run-to-run variance, I guess).
But if I flip the PCIe link state switch to high power savings (or whatever it is called) while pushing back to max refresh rate, the VRAM clock fluctuates a lot more on the new cable, allowing 5~10% power savings compared to the old cable.
If I set the AOC to 120Hz, idle power draw goes down to ~60W on average.
Going back to 100Hz allows the VRAM to drop below 909MHz and yields a 30~35W idle power draw on average!

Unfortunately, the high power savings setting destabilizes gaming workloads, so I have to flip the PCIe link state switch back to moderate power savings when I actually game.
Now, if only there were a way to flip that switch without jumping through all those settings hoops...

I have wasted too much time on this unscientific experiment, so until a new driver drastically improves things, I will let things be for now. 45~50W is not a horrible number for this wacky setup and by big-card standards, though I really hope a new driver will further improve these numbers... or stabilize the high power savings setting.

Side note: to the shock and horror of myself and my brother, his setup (3070ti, MSI 27" 1440p165Hz QD thingy, Philips 27" 1440p170Hz IPS) uses ~75W idle from the graphics card alone, no matter the link state settings. He worked around it with similar methods (in the NVIDIA Control Panel: use a 144Hz profile on the Philips, push to 164Hz, bam, ~25W idle).
He really doesn't like the MSI (something in the software, and it really hates custom resolutions) and plans to swap it for a 27" 1440p240Hz OLED, probably with a 4080. Funny numbers ensue, I guess, but that's a few months away and another story.
You still used a budget cable; Ugreen is AmazonBasics-tier. If I were you, I would have bought a cable with 8K support. The DP ports on the 7900 series cards are DP 2.1, so you should use one of those cables and see. With modern GPUs on the AMD side, the cable spec matters more.

 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,105 (2.37/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
Try the new 23.12.1 driver! It claims to help with multi-monitor idle power draw, and it actually does with my system. :)

Unfortunately, that's the issue, isn't it? AMD's approach to multi-monitor idle is to cast a wide net with full VRAM clock first, then individually approve specific resolution/Hz combinations for dropping clocks later in the driver. The good thing is that it's safe and avoids checkerboarding (Nvidia) due to inadequate VRAM clock; the bad thing is that the most you can hope for is popular 2-monitor setups.

You still used a budget cable; Ugreen is AmazonBasics-tier. If I were you, I would have bought a cable with 8K support. The DP ports on the 7900 series cards are DP 2.1, so you should use one of those cables and see. With modern GPUs on the AMD side, the cable spec matters more.


It's been pretty thoroughly debunked as a reddit-ism at this point; cable choice does not affect how the VRAM and SOC behave on Navi31, which is the true factor behind multi-monitor power consumption. Nor is it responsible for the MBA vapour chamber issue (which most of Reddit also believed in the first few days of that).

Also, plenty of Amazon brands claim VESA certification, but the actual VESA list does not agree.
 
Joined
Jan 14, 2019
Messages
12,167 (5.74/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Unfortunately, that's the issue, isn't it? AMD's approach to multi-monitor idle is to cast a wide net with full VRAM clock first, then individually approve specific resolution/Hz combinations for dropping clocks later in the driver. The good thing is that it's safe and avoids checkerboarding (Nvidia) due to inadequate VRAM clock; the bad thing is that the most you can hope for is popular 2-monitor setups.
True. Although, at this point, I'm just happy that it got sorted out in my case, which is not typical at all with a 3440x1440, 144 Hz ultrawide and a 1024x600, 43 Hz mini display. I'm also glad that nearly all Zen 4 CPUs have an iGPU in them, so your secondary screen can be connected there to have a smaller effect on your card's VRAM.

It's been pretty thoroughly debunked as a reddit-ism at this point; cable choice does not affect how the VRAM and SOC behave on Navi31, which is the true factor behind multi-monitor power consumption. Nor is it responsible for the MBA vapour chamber issue (which most of Reddit also believed in the first few days of that).

Also, plenty of Amazon brands claim VESA certification, but the actual VESA list does not agree.
Yep. Quality cables help with display issues - they don't do much for power usage.
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,105 (2.37/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
True. Although, at this point, I'm just happy that it got sorted out in my case, which is not typical at all with a 3440x1440, 144 Hz ultrawide and a 1024x600, 43 Hz mini display. I'm also glad that nearly all Zen 4 CPUs have an iGPU in them, so your secondary screen can be connected there to have a smaller effect on your card's VRAM.

I miss the days of having a free display output from 4790K............and quicksync

Surprised your GPU even considers 1024x600 a second screen, it's so small :D
 