
Intel Overclocks Arc A380 with a Driver Update

btarunr

Editor & Senior Moderator
Intel Graphics has figured out a unique way to step up the performance of its entry-level Arc A380 graphics cards. While other companies prevent BIOS updates and generally discourage overclocking, Intel has given the A380 a free vendor overclock. With the latest Arc GPU Graphics drivers 101.4644 WHQL, Intel has increased the base frequency of the GPU; the driver installer includes a firmware update in addition to the driver itself. The A380 has a reference base clock of 2000 MHz, which boosts up to 2050 MHz. With the latest 101.4644 drivers (specifically, the new firmware), the base frequency has been increased to 2150 MHz. It's not much, but hey, who doesn't want a free 7.5% overclock to go with the recent 19% performance increase in DirectX 11 games, and over 40% in DirectX 9 ones?
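For reference, the quoted 7.5% figure works out directly from the two base clocks in the article (a quick sanity check, nothing more):

```python
# Sanity check on the clock increase quoted above.
old_base = 2000  # MHz, original reference base clock
new_base = 2150  # MHz, base clock after the 101.4644 driver/firmware update

increase_pct = (new_base - old_base) / old_base * 100
print(f"{increase_pct:.1f}% base clock increase")  # prints "7.5% base clock increase"
```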



View at TechPowerUp Main Site | Source
 
Not seeing a point in bumping up the frequency on such a low-end part...
 
Now we need a retest
 
Not seeing a point in bumping up the frequency on such a low-end part...
This is the segment where buyers tend to buy and forget about upgrades for a very long while. Every single percent of speed matters.

And heck, Intel is currently in a very fragile spot, so they need everything they can get to stay competitive. A couple percent of free speed will definitely improve their reputation.

"Our GPU is a single digit % faster than competition" doesn't sound as convincing as "our GPU is a double digit % faster than competition" after all.
 
Does the A380 (and I suppose all ARC gpus) only boost up to that max boost clock value typically? Always thought that all modern GPUs will just continue to push along the frequency curve if power/thermals/voltage allow it. Seemed to me that base and boost clocks were more of a potential value that a card should be able to hit with reasonable ambient temperatures given the model's choice of cooler. Or does this just mean the frequency curve was essentially bumped up across the board?
 
Does the A380 (and I suppose all ARC gpus) only boost up to that max boost clock value typically? Always thought that all modern GPUs will just continue to push along the frequency curve if power/thermals/voltage allow it. Seemed to me that base and boost clocks were more of a potential value that a card should be able to hit with reasonable ambient temperatures given the model's choice of cooler. Or does this just mean the frequency curve was essentially bumped up across the board?
TPU's review of the A380 found it to clock at 2450 MHz in many games. We don't know if it will clock closer to 2600 MHz after this update.
 
Does the A380 (and I suppose all ARC gpus) only boost up to that max boost clock value typically? Always thought that all modern GPUs will just continue to push along the frequency curve if power/thermals/voltage allow it. Seemed to me that base and boost clocks were more of a potential value that a card should be able to hit with reasonable ambient temperatures given the model's choice of cooler. Or does this just mean the frequency curve was essentially bumped up across the board?
It can (and will) boost past that clock. It works very similarly to how Nvidia does it.
 
Curious about power usage after the bump.

But yeah, given the target market, this is a change that will be appreciated even if consumers are unaware of it.
 
Lipstick on a pig....

It's still just a pig, just a tad prettier, hehehe :)
 
Lipstick on a pig....

It's still just a pig, just a tad prettier, hehehe :)
I considered the A380 at some point, even with the buggy drivers. However, I had bought a used RX 570 earlier, and the benchmarks show it's better. I wonder how the A380 will perform now.
 
Lipstick on a pig....

It's still just a pig, just a tad prettier, hehehe :)
This kinda feels more like they put some extra spice on the pork chop
 
This kinda feels more like they put some extra spice on the pork chop
Ok then, so prettier AND taste a little yummier too, hehehe :)
 
Because at 1080p it can make a difference? Most gamers are still gaming at 1080p. It would have been better if they had done a 10% or bigger bump in clock speed, but still, this is kinda cool.

This card's problem, even at low res, is the VRAM of only 6 GB. It's only good for older games, nothing new.
 
This card's problem, even at low res, is the VRAM of only 6 GB.
At 1080p? No it's not. It's no more or less a problem than it was for cards with 4GB at 1080p. No one is buying this GPU for 1440p/2160p gaming. At 1080p, 6GB is a healthy and reasonable amount of VRAM.
 
This card's problem, even at low res, is the VRAM of only 6 GB.
4 GB with a 33% wider bus would be better considering the core performance. This GPU loses to the 1650 Super, so it's not the VRAM that's to blame. Just like the 12 GB in an RTX 3060: nice to have, but even scenarios that need just over 7 GB drive that card below a comfortable 60 FPS.

6 GB is a problem when we're talking about GPUs above the lowest tiers, where the core can provide playable performance but is handicapped by an anemic amount of VRAM (e.g. the 3070 Ti, which is capable of outperforming the RX 6800... but only if the game doesn't need more than 8 GB of VRAM). The A380 is not such a case. It has more VRAM than it can reasonably use. This GPU needs higher clocks and better drivers, not more VRAM.
 
It really shows they flubbed the launch if these sorts of gains were possible with a few months' extra work.

VRAM bandwidth is as important as the amount; it's just that every time RAM modules double in density, the same bus width forces double the capacity, and that's hard to justify on cheaper cards.

The A380 has 6 GB of GDDR6 on a 96-bit bus.
To go to 128-bit they'd need 8 GB of VRAM, and then it's the same issue: the bandwidth is still low for the amount of RAM.
To go to 192-bit with 6 GB, they'd have to use twice as many RAM chips, increasing the cost of the PCB design as well as the VRAM itself, changing the price point drastically; even little things like extra power consumption stack up.
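The bandwidth side of that tradeoff can be sketched numerically. Theoretical GDDR6 bandwidth is just bus width (in bytes) times the per-pin data rate; the 15.5 Gbps rate below is an assumption based on typical A380 boards, not a figure from this thread:

```python
# Theoretical GDDR6 bandwidth = (bus width in bytes) * (per-pin data rate).
DATA_RATE_GBPS = 15.5  # Gbps per pin -- assumed typical for the A380, varies by board

def bandwidth_gbs(bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s for a given bus width."""
    return bus_width_bits / 8 * DATA_RATE_GBPS

# The three configurations discussed above (32-bit interface per GDDR6 chip):
for bus_bits, note in [(96, "6 GB, 3 chips"), (128, "8 GB, 4 chips"), (192, "6 GB, 6 chips")]:
    print(f"{bus_bits:3d}-bit ({note}): {bandwidth_gbs(bus_bits):.0f} GB/s")
```

At the assumed data rate this gives 186, 248, and 372 GB/s respectively, which is why the 192-bit option is so much more attractive on paper and so much more expensive in chips and PCB routing.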
 
At 1080p? No it's not. It's no more or less a problem than it was for cards with 4GB at 1080p. No one is buying this GPU for 1440p/2160p gaming. At 1080p, 6GB is a healthy and reasonable amount of VRAM.

I didn't mention anything about 1440p/2160p. The card is bad for any gaming, as claimed and demonstrated by the authors quoted below.

Gaming performance for the Arc A380 ends up being pretty mediocre. It's faster than the GTX 1650 and RX 6400, usually, though there are occasions where it comes up short. Drivers remain a concern, so if you play a lot of indie games or more esoteric options, we'd stick with the tried and true AMD and Nvidia drivers and hardware. Big name games on the other hand seem to be getting a decent level of tuning and testing from Intel.

Ultimately, the Arc A380 offers too little too late for most people. Anyone with an older PC looking to upgrade the graphics might be tempted, but Arc can be finicky on older platforms. There's still hope for the higher end Arc models, however, especially if Intel can keep pricing competitive. Intel's own testing suggests the Arc A750 can beat the RTX 3060 on performance, but what will it cost? Or how about the rumored Arc A580, which Intel hasn't said much about? And will any of those other Arc GPUs work better on slightly older PCs? Those are all important questions.

Look at the stutter show in actual gameplay:

 
Seems like it was just a change to the reported clock speed, not an actual change.

From ArsTechnica - "In a recent driver update, we changed the reported graphics clock of the A380," an Intel spokesperson told Ars. "Actual performance and frequency were not affected and we are working on an update to revert the change in a future driver update."
 
My A380 is working fine.
 
I didn't mention anything about 1440p/2160p. The card is bad for any gaming as claimed and proved by the following authors.



Look at the stutter show in actual gameplay:

The cards have already gotten massive performance boosts; look at the original news posts.

The changes to Vulkan were the biggest, and things like this just add more on top. This is an entry-level gaming card, designed to add more monitor outputs, encoding/decoding capabilities and so on... but the performance buffs are definitely worth knowing about, as day-one reviews won't have them in their results.


Pity the clock speed was just a reporting error. Maybe they can OC higher now?
 