Monday, August 21st 2023

Intel Overclocks Arc A380 with a Driver Update

Intel Graphics has figured out a unique way to step up the performance of its entry-level Arc A380 graphics cards. While other companies lock down BIOS updates and generally discourage overclocking, Intel has given the A380 a free vendor overclock. With the latest Arc GPU Graphics drivers 101.4644 WHQL, Intel has increased the base frequency of the GPU; the driver installer includes a firmware update in addition to the driver itself. The A380 has a reference base clock of 2000 MHz, which boosts up to 2050 MHz. With the latest 101.4644 drivers and the new firmware, the base frequency has been increased to 2150 MHz. It's not much, but hey, who doesn't want a free 7.5% overclock to go with the card's recent 19% performance increase in DirectX 11 games, and over 40% increase in DirectX 9 ones?
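The 7.5% figure follows directly from the two base clocks quoted above; a quick sanity check (values taken from the article):

```python
# Sanity-check the clock-bump percentage quoted in the article.
old_base = 2000  # MHz, original A380 reference base clock
new_base = 2150  # MHz, base clock after the 101.4644 driver's firmware update

overclock_pct = (new_base - old_base) / old_base * 100
print(f"Base clock bump: {overclock_pct:.1f}%")  # prints "Base clock bump: 7.5%"
```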
Source: Neowin.net

32 Comments on Intel Overclocks Arc A380 with a Driver Update

#1
Assimilator
Not seeing a point in bumping up the frequency on such a low-end part...
Posted on Reply
#2
P4-630
AssimilatorNot seeing a point in bumping up the frequency on such a low-end part...
An extra point. As in 3DMark...
Posted on Reply
#4
Beginner Macro Device
AssimilatorNot seeing a point in bumping up the frequency on such a low-end part...
This is the segment where buyers tend to buy and forget about upgrades for a very long while. Every single percent of speed matters.

And heck, Intel is currently in a very fragile spot, so they need anything they can get to stay competitive. A couple percent of free speed will definitely improve their reputation.

"Our GPU is a single digit % faster than competition" doesn't sound as convincing as "our GPU is a double digit % faster than competition" after all.
Posted on Reply
#5
notaburner
Does the A380 (and I suppose all ARC gpus) only boost up to that max boost clock value typically? Always thought that all modern GPUs will just continue to push along the frequency curve if power/thermals/voltage allow it. Seemed to me that base and boost clocks were more of a potential value that a card should be able to hit with reasonable ambient temperatures given the model's choice of cooler. Or does this just mean the frequency curve was essentially bumped up across the board?
Posted on Reply
#6
AnotherReader
notaburnerDoes the A380 (and I suppose all ARC gpus) only boost up to that max boost clock value typically? Always thought that all modern GPUs will just continue to push along the frequency curve if power/thermals/voltage allow it. Seemed to me that base and boost clocks were more of a potential value that a card should be able to hit with reasonable ambient temperatures given the model's choice of cooler. Or does this just mean the frequency curve was essentially bumped up across the board?
TPU's review of the A380 found it to clock at 2450 MHz in many games. We don't know if it will clock closer to 2600 MHz after this update.
Posted on Reply
#7
Luke357
notaburnerDoes the A380 (and I suppose all ARC gpus) only boost up to that max boost clock value typically? Always thought that all modern GPUs will just continue to push along the frequency curve if power/thermals/voltage allow it. Seemed to me that base and boost clocks were more of a potential value that a card should be able to hit with reasonable ambient temperatures given the model's choice of cooler. Or does this just mean the frequency curve was essentially bumped up across the board?
It can (and will) boost past that clock. It works very similarly to how Nvidia does it.
Posted on Reply
#8
chrcoluk
Curious about the power usage after the bump.

But yeah given the target market, this will be a change that will be appreciated even if consumers are ignorant of it.
Posted on Reply
#9
bonehead123
Lipstick on a pig....

It's still just a pig, just a tad prettier, hehehe :)
Posted on Reply
#10
tommo1982
bonehead123Lipstick on a pig....

It's still just a pig, just a tad prettier, hehehe :)
I considered the A380 at some point, even with the buggy drivers. However, I had bought a used RX 570 earlier, and the benchmarks show it's better. I wonder how the A380 will perform now.
Posted on Reply
#11
Vayra86
bonehead123Lipstick on a pig....

It's still just a pig, just a tad prettier, hehehe :)
This kinda feels more like they put some extra spice on the pork chop
Posted on Reply
#12
bonehead123
Vayra86This kinda feels more like they put some extra spice on the pork chop
Ok then, so prettier AND taste a little yummier too, hehehe :)
Posted on Reply
#13
Quitessa
These are still awesome AV1 video encoders for cheap
Posted on Reply
#14
lexluthermiester
AssimilatorNot seeing a point in bumping up the frequency on such a low-end part...
Because at 1080p it can make a difference? Most gamers are still gaming at 1080p. It would have been better if they had done a 10% or greater bump in clock speed, but still, this is kinda cool.
Posted on Reply
#15
ARF
lexluthermiesterBecause in 1080p it can make a difference? Most gamers are still gaming 1080p. Would have been better if they had done a 10% or better bump in clock speed, but still, this kinda cool.
This card's problem, even at low resolutions, is having only 6 GB of VRAM. It's only good for older games, nothing new.
Posted on Reply
#16
lexluthermiester
ARFThis card's problem, even at low res, is the VRAM of only 6 GB.
At 1080p? No it's not. It's no more or less a problem than it was for cards with 4GB at 1080p. No one is buying this GPU for 1440p/2160p gaming. At 1080p, 6GB is a healthy and reasonable amount of VRAM.
Posted on Reply
#17
Beginner Macro Device
ARFThis card's problem, even at low res, is the VRAM of only 6 GB.
4 GB with a 33% wider bus would be better considering the core performance. This GPU loses to the GTX 1650 Super, so the VRAM isn't to blame. It's just like the 12 GB in an RTX 3060: nice to have, but even scenarios that need just over 7 GB drive that card below a comfortable 60 FPS.

6 GB is a problem when we're talking about GPUs above the lowest tiers, where the core can provide playable performance but is handicapped by an anemic amount of VRAM (e.g. the RTX 3070 Ti, which is capable of outperforming the RX 6800... but only if the game doesn't need more than 8 GB of VRAM). The A380 is not that case. It has more VRAM than it can reasonably use. This GPU needs higher clocks and better drivers, not more VRAM.
Posted on Reply
#18
Mussels
Freshwater Moderator
It really shows they flubbed the launch if these sorts of gains were possible with a few months' extra work.

VRAM bandwidth is as important as the amount - it's just that every time RAM modules double in density, the same bus width gives you double the capacity, and that's hard to justify on cheaper cards.

The A380 has 6 GB of GDDR6 on a 96-bit bus.
To go 128-bit they'd need 8 GB of VRAM, and then it's the same issue: the bandwidth is still low for the amount of RAM.
To go 192-bit with 6 GB they'd have to use twice as many RAM chips, increasing the cost of the PCB design as well as the VRAM itself, changing its price point drastically - even little things like extra power consumption stack up.
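The bandwidth side of this trade-off is simple arithmetic: peak GDDR6 bandwidth is bus width times per-pin data rate, divided by 8 bits per byte. A minimal sketch, assuming 15.5 Gbit/s-per-pin GDDR6 (the rate commonly listed for the A380; treat it as an assumption):

```python
# Peak theoretical bandwidth for the bus-width options discussed above.
def gddr6_bandwidth_gbps(bus_width_bits: int, pin_rate_gbit: float = 15.5) -> float:
    """GB/s = (bits per transfer * transfers per second per pin) / 8 bits per byte."""
    return bus_width_bits * pin_rate_gbit / 8

for width in (96, 128, 192):
    print(f"{width:>3}-bit bus: {gddr6_bandwidth_gbps(width):.0f} GB/s")
# prints:
#  96-bit bus: 186 GB/s
# 128-bit bus: 248 GB/s
# 192-bit bus: 372 GB/s
```

Doubling the chip count (96-bit to 192-bit at the same capacity) doubles the bandwidth, which is exactly why it costs so much more in PCB traces and power.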
Posted on Reply
#19
ExcuseMeWtf
AssimilatorNot seeing a point in bumping up the frequency on such a low-end part...
Free performance?
Posted on Reply
#20
ARF
lexluthermiesterAt 1080p? No it's not. It's no more or less a problem than it was for cards with 4GB at 1080p. No one is buying this GPU for 1440p/2160p gaming. At 1080p, 6GB is a healthy and reasonable amount of VRAM.
I didn't mention anything about 1440p/2160p. The card is bad for any gaming, as claimed and demonstrated by the following reviewers.
Gaming performance for the Arc A380 ends up being pretty mediocre. It's faster than the GTX 1650 and RX 6400, usually, though there are occasions where it comes up short. Drivers remain a concern, so if you play a lot of indie games or more esoteric options, we'd stick with the tried and true AMD and Nvidia drivers and hardware. Big name games on the other hand seem to be getting a decent level of tuning and testing from Intel.

Ultimately, the Arc A380 offers too little too late for most people. Anyone with an older PC looking to upgrade the graphics might be tempted, but Arc can be finicky on older platforms. There's still hope for the higher end Arc models, however, especially if Intel can keep pricing competitive. Intel's own testing suggests the Arc A750 can beat the RTX 3060 on performance, but what will it cost? Or how about the rumored Arc A580, which Intel hasn't said much about? And will any of those other Arc GPUs work better on slightly older PCs? Those are all important questions.
www.tomshardware.com/reviews/intel-arc-a380-review

Look at the stutter show in actual gameplay:

Posted on Reply
#21
notaburner
Luke357It can (and will) boost past that clock. It works very similar to how Nvidia does it.
AnotherReaderTPU's review of the A380 found it to clock at 2450 MHz in many games. We don't know if it will clock closer to 2600 MHz after this update.
Thanks for clearing it up. Hopefully there are some gains for A380 owners.
Posted on Reply
#22
geniekid
Seems like it was just a change to the reported clock speed, not an actual overclock.

From Ars Technica - "In a recent driver update, we changed the reported graphics clock of the A380," an Intel spokesperson told Ars. "Actual performance and frequency were not affected and we are working on an update to revert the change in a future driver update."
Posted on Reply
#23
Scrizz
My A380 is working fine.
Posted on Reply
#24
Mussels
Freshwater Moderator
ARFI didn't mention anything about 1440p/2160p. The card is bad for any gaming as claimed and proved by the following authors.


www.tomshardware.com/reviews/intel-arc-a380-review

Look at the stutter show in actual gameplay:

The cards already got massive performance boosts; look at the original news posts.

The changes to Vulkan were the biggest; things like this just add more on top. This is an entry-level gaming card, designed to add more monitor outputs, encoding/decoding capabilities and such... but the buffs to performance are definitely worth knowing about, as day-one reviews won't have them in their results.


Pity the clock speed was an error; maybe they can OC higher now?
Posted on Reply
#25
lexluthermiester
ARFI didn't mention anything about 1440p/2160p. The card is bad for any gaming as claimed and proved by the following authors.


www.tomshardware.com/reviews/intel-arc-a380-review

Look at the stutter show in actual gameplay:

You are seriously using Hogwarts Legacy as a benchmark for a mid-tier card that is 3 years old? :rolleyes::slap::kookoo:

Hell, let's just benchmark a GTX 1050 Ti with Cyberpunk 2077 while we're at it...

Context is important. You're missing it, or maybe you're deliberately shitposting, who knows...
Posted on Reply