
Intel Quietly Fixes High Multi-monitor Power Draw of Arc GPUs

btarunr

Editor & Senior Moderator
Without mentioning it in its driver change-log, Intel Graphics has quietly addressed the issue of unusually high power-draw for its Arc A-series GPUs in multi-monitor setups. The older 101.4091 drivers had a typical single-monitor idle power-draw of around 11 W, which would shoot up to 40 W idle in multi-monitor setups. In our own launch-day review of the Arc A770, we logged a 44 W multi-monitor power-draw. Intel now claims that the multi-monitor idle power-draw has been tamed, with the latest 101.4146 drivers the company released last week lowering it to "8 to 9 W" for multi-monitor setups, and "7 to 8 W" for single-monitor ones.



View at TechPowerUp Main Site | Source
 
This is awesome news as Intel continues to improve its discrete GPUs.
 
Mmm, multi... two monitors out of some possible combinations is not exactly "multi". Intel still has a lot of work to do before the drivers are good enough for normal power consumption in a real multi-monitor setup at "idle".
 
This is just good stuff, and it's really smart imo to keep this up, as it will make the next generation more and more of a viable choice, with the added incentive of actually wanting to support a third player in the space.
 
Good to see Intel still improving drivers!
They don't give up, and it's far from perfect... but who knows?
I was gently poking fun at them when they started with Arc, and now it seems they're doing a serious job with it.
 
I recommend that everyone reading this should click on the videocardz link and actually read the table in that article.

In short, No, Intel did not fix high multi-monitor power draw for most multi-monitor users.
 
I recommend that everyone reading this should click on the videocardz link and actually read the table in that article.

In short, No, Intel did not fix high multi-monitor power draw for most multi-monitor users.
Emm.. Are you sure about that?

[Attached image: idle power-draw table]


It was confirmed by people who use 2 screens... yeah, it is still not fixed for 3- and 4-screen builds, but small steps like this are better than nothing.
 
Good to see Intel still improving drivers!
They don't give up, and it's far from perfect... but who knows?
I was gently poking fun at them when they started with Arc, and now it seems they're doing a serious job with it.
After the last HWU video about Intel Arc (with the latest updates), these cards really went up in people's estimation.

It is nice to see that, with its current performance (and price), the A750 is literally a better pick for 1440p than the RX 6650 XT in a "cost per frame" chart.
 
So it's time to totally and officially reduce the prices of the entire RDNA 2 generation of video cards, once again! It's also time for the mid-range RDNA 3 graphics cards to hit the market.
 
They fix their driver issues faster than AMD, like a decade faster :D
 
They fix their driver issues faster than AMD, like a decade faster :D
Yeah well, Intel, like NVIDIA, actually has a large software team.
 
Yeah well, Intel, like NVIDIA, actually has a large software team.

I get what you are saying but can we please stop acting like AMD is some mom and pop shop?
 
I get what you are saying but can we please stop acting like AMD is some mom and pop shop?
They're not, which is why there's no excuse.
 
Mmm, multi... two monitors out of some possible combinations is not exactly "multi". Intel still has a lot of work to do before the drivers are good enough for normal power consumption in a real multi-monitor setup at "idle".
Indeed, it's not totally fixed, but it's still a positive response.
 
Was this issue mentioned in the review of the product? To me that's quite nasty in the era of the energy crisis.

But it's good it's been fixed now; shameful that it wasn't mentioned, though, as without a public record of the problem the ignorant won't realise the impact on their bills.
 
Higher power draw is not occurring only on Intel cards, so what's the general reason? Is the framebuffer bandwidth too high for the VRAM and IMC to enter a lower power state? It's possible that Intel can't fix everything in software, or even with a hardware revision.
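As a rough back-of-the-envelope sketch of the commonly cited explanation (the resolutions, refresh rates and 4 bytes/pixel are assumptions for illustration, not Intel's actual power-management logic):

```python
# Approximate continuous read bandwidth needed just to scan out displays.
# Illustrative numbers only; real drivers also weigh blanking intervals,
# compression and panel timings when deciding whether VRAM can downclock.

def scanout_gbps(width, height, refresh_hz, bytes_per_pixel=4):
    """Bandwidth in GB/s to refresh one display from the framebuffer."""
    return width * height * refresh_hz * bytes_per_pixel / 1e9

setups = {
    "1x 1440p @ 60 Hz": [(2560, 1440, 60)],
    "2x 4K @ 60 Hz":    [(3840, 2160, 60), (3840, 2160, 60)],
}

for name, modes in setups.items():
    total = sum(scanout_gbps(*m) for m in modes)
    print(f"{name}: ~{total:.1f} GB/s of constant framebuffer reads")
```

The scan-out traffic itself is small; the usual explanation is that VRAM reclocking has to happen inside vertical-blanking windows, and with several unsynchronised displays those windows shrink or stop lining up, so the driver keeps the memory at a higher clock, which is where most of the extra idle power goes.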
 
Was this issue mentioned in the review of the product? To me that's quite nasty in the era of the energy crisis.

But it's good it's been fixed now; shameful that it wasn't mentioned, though, as without a public record of the problem the ignorant won't realise the impact on their bills.
If 30 W impacts you, you don't have the budget for a $350 GPU in the first place. KTHNX.
 
If 30 W impacts you, you don't have the budget for a $350 GPU in the first place. KTHNX.
For reference, a year's worth of 30 watts pays for an Arc A750.
 
They're not, which is why there's no excuse.
Amen, I just cannot understand why they drop the ball every other gen. It's so, so stupid; they might as well start tossing bricks at their own offices instead.
 
For reference, a year's worth of 30 watts pays for an Arc A750.
*sigh* Nope. Not even close. People are REALLY bad at math.

If you used your computer at idle for 8 hours a day, 365 days a year, that would be 87,600 Wh vs. the new driver, i.e. 87.6 kWh. At current German electricity prices of $0.40 per kWh, that is $35 in electricity per year.

A new A750 is $250.

I shouldn't have to do the math every single time the use of electricity comes up. Can people please learn basic multiplication and division already? If electric bills are a major concern for you, you cannot afford the hardware in question. PERIOD. The math DOES. NOT. WORK. OUT.
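For anyone who wants to check the arithmetic, a minimal sketch of the same calculation (assuming the ~30 W delta between old and new drivers, 8 hours of idle per day and $0.40/kWh):

```python
# Annual cost of the extra multi-monitor idle draw, using the figures above.
extra_watts = 30        # old driver vs. new driver at multi-monitor idle
hours_per_day = 8       # assumed idle time per day
price_per_kwh = 0.40    # USD, roughly current German household pricing

kwh_per_year = extra_watts * hours_per_day * 365 / 1000
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/year -> ${cost_per_year:.2f}/year")
# 87.6 kWh/year -> $35.04/year, against a ~$250 card.
```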
 
I wish everyone would sort out their idle consumption. My 5600X draws about 40 W at idle for some reason, and letting it drop the clocks low or idle at 4.5 GHz seems to make little difference. The 3600 is about 30 W at idle.

The 3080 pulls 15-20 W idle on a single 4K screen, but my 6700 XT pulls about 7-9 W idle driving 2x 4K screens.

Radeon 7000 still has stupidly high consumption on dual displays, doesn't it?

*sigh* Nope. Not even close. People are REALLY bad at math.

If you used your computer at idle for 8 hours a day, 365 days a year, that would be 87,600 Wh vs. the new driver, i.e. 87.6 kWh. At current German electricity prices of $0.40 per kWh, that is $35 in electricity per year.

A new A750 is $250.

I shouldn't have to do the math every single time the use of electricity comes up. Can people please learn basic multiplication and division already? If electric bills are a major concern for you, you cannot afford the hardware in question. PERIOD. The math DOES. NOT. WORK. OUT.
They did say a year's worth, not 8 hours a day for a year. But still nowhere close.
 
They did say a year's worth, not 8 hours a day for a year. But still nowhere close.
Then it's still wrong. At 24 hours a day it's still only $105 in a year. And that's idling 24 hours a day. If you are buying a machine, letting it idle 24 hours a day, and then whining about power use, there is something seriously wrong with you.
 
Hi,
If it wasn't a problem, they wouldn't have fixed it :laugh:
 
which is why there's no excuse.
How do you know that? My God, so many software engineers on here who know better than the ones the multi-billion-dollar companies hire.

There may very well be good reasons why some of these issues are never fixed: hardware or software changes that would be too costly or complicated no matter how large their software development teams are. There are plenty of things which have never been addressed by all of these companies, like how NVIDIA still doesn't support 8-bit color dithering to this very day on their GPUs. I was shocked when I switched to AMD at how much less noticeable the color banding was. Why is that? Supposedly NVIDIA has many more resources, yet this has never been changed; you buy a $2,000 card that has worse color reproduction than a $200 one. What excuse is there for that?
 
Emm.. Are you sure about that?

View attachment 286626

It was confirmed by people who use 2 screens... yeah, it is still not fixed for 3- and 4-screen builds, but small steps like this are better than nothing.

This table explains why this otherwise relevant improvement was made "quietly".
 