
Intel's Arc A380 Performs Even Worse With an AMD CPU

TheLostSwede

News Editor
According to fresh benchmark numbers from someone on Bilibili, Intel's Arc A380 performs even worse when paired with an AMD CPU than when paired with an Intel CPU. The card was tested using an AMD Ryzen 5 5600 on an ASUS TUF B550M motherboard paired with 16 GB of DDR4-3600 memory. The Intel system it was tested against consisted of a Core i5-12400 on an ASUS TUF B660M motherboard with the same type of memory. Both test systems had Resizable BAR support set to auto and Above 4G Decoding enabled, and both were running Windows 11 21H2.

In nine of the 10 games tested, the only exception being League of Legends, the AMD system trailed the Intel system by anywhere from a mere one percent to as much as 15 percent. The worst deficits were in Forza Horizon 5 and Total War: Three Kingdoms, both 14 to 15 percent behind. The games tested, in the order of the graph below, are: League of Legends, Dota 2, Rainbow Six Extraction, Watch Dogs: Legion, Far Cry 6, Assassin's Creed Valhalla, Total War: Three Kingdoms, Shadow of the Tomb Raider, CS:GO and Forza Horizon 5. For comparison, an NVIDIA GeForce GTX 1650 was also included, but only tested on the Intel-based system, and the Arc A380 only beat it in Total War: Three Kingdoms, albeit by a seven percent margin. It appears Intel has a lot of work to do when it comes to its drivers, but at least right now, mixing Intel Arc graphics cards and AMD processors seems to be a bad idea.



View at TechPowerUp Main Site | Source
 
Set the 5600 to 4.6 GHz or 4.7 GHz and get back to us with the results...
 
Set the 5600 to 4.6 GHz or 4.7 GHz and get back to us with the results...

this is not about the 5600 vs the 12400 really, it's about some weird mechanic where the 5600 just underperforms when paired with the A380.
If you look at, for example, Gamers Nexus' review of the 5600, you will see that paired with an RTX 3080 in Far Cry 6 the difference should be less than 5% favoring the 12400, yet here there is a 10+% difference between them.

and in CS:GO the 5600 should win with ease: again referring to GN's review, with the RTX 3080 the 5600 averages 329.1 fps vs the 12400's 265.8, but here it loses by a bit (quick percentage check below)...

so yeah odd stuff...
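For what it's worth, here is a minimal sketch of the percentage math being referenced, using the FPS figures quoted above from GN's review (the percent_ahead helper is made up for illustration):

```c
#include <stdio.h>

/* Percent advantage of `a` over `b` (hypothetical helper, for illustration only). */
static double percent_ahead(double a, double b)
{
    return (a - b) / b * 100.0;
}

int main(void)
{
    double r5_5600  = 329.1;  /* CS:GO average FPS with an RTX 3080, per GN's 5600 review */
    double i5_12400 = 265.8;

    printf("5600 ahead of 12400 by %.1f%%\n", percent_ahead(r5_5600, i5_12400));
    /* prints roughly 23.8%, which is why the 5600 losing CS:GO with the A380 looks odd */
    return 0;
}
```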
 
Intel sandbagging the GPU when paired with AMD? Nah, they wouldn't do that...:eek:
 
You know... not too long ago AMD came out with GPUs that are sandbagged with AMD CPUs, while Intel's counterparts handled them better.
Needs a reminder?
Yes please, I was probably on an Xbox then. :respect:
 
You know... not too long ago AMD came out with GPUs that are sandbagged with AMD CPUs, while Intel's counterparts handled them better.
Needs a reminder?

I would also like a link to that information.
 
I guess Intel really thought that the games early adopters play are just benchmarks.
 
We really need more information on these tests. We are jumping to conclusions with as little information as bars on a chart; so much could explain that difference in performance.
 
Someone on TPU needs to get hold of one of these things and do tests themselves. Then we can all laugh. :D
 
We really need more information on these tests. We are jumping to conclusions with as little information as bars on a chart; so much could explain that difference in performance.
The source video is linked, you're free to watch it any time.

Someone on TPU needs to get hold of one of these things and do tests themselves. Then we can all laugh. :D
The launch price was $600, so the bossman wasn't interested for that price.
 
W1zzard will eventually be able to get one hopefully and we will get a thorough review.
 
My bad, I only opened the Twitter link.
Well, it linked to the video too...

W1zzard will eventually be able to get one hopefully and we will get a thorough review.
These guys are usually quite good.
 
A shame... I had high hopes for Intel... we really needed a third player in the GPU market.

My only hope now is that I can get a next gen card this winter at MSRP
 
W1zzard will eventually be able to get one hopefully and we will get a thorough review.
I think it's telling that Intel didn't send the maker of GPU-Z one; they don't want it directly and fairly compared, it's that simple. I have not seen any of the reviewers get one tbh, a bad sign.
 
I think it's telling that Intel didn't send the maker of GPU-Z one; they don't want it directly and fairly compared, it's that simple. I have not seen any of the reviewers get one tbh, a bad sign.

I suppose Intel could make the excuse that the GPU is region locked right now but I also think it's more likely that they knew the GPU wouldn't get favorable reviews.
 
I don't do social media; I see that was a link now, had no idea.
Bilibili is social media in the PRC...

A shame... I had high hopes for Intel... we really needed a third player in the GPU market.

My only hope now is that I can get a next gen card this winter at MSRP
Well, this is still the bottom of the barrel part, so hopefully the higher-end tiers will do better.

Current cards are dropping like crazy in price now. Oddly enough, an RX 6900 XT costs the same as an RX 6800 XT here...
 
Raja's parting shot? Seriously, this guy has morphed into Peter Molyneux. Great bombast and promises that never materialise.

These cards are going to be utter crap, as I promised 2 years ago when Raja started making claims and marketing he could never keep up with.

Someone needs to spoof the CPUID vendor string to GenuineIntel rather than AuthenticAMD and see if that changes anything. Then I can get my pitchfork :laugh:
They are probably using some CPU optimization which is not available/disabled in the compiler when it detects AMD CPUs. If Intel is actually motivated to do such things... I personally wouldn't rule it out.
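For anyone wondering what "spoofing the CPUID" would actually involve, here is a minimal sketch that reads the CPUID vendor string a vendor-keyed dispatcher could check. It assumes x86 with GCC/Clang's <cpuid.h> and is purely illustrative; it is not Intel's driver or compiler code:

```c
#include <stdio.h>
#include <string.h>
#include <cpuid.h>   /* GCC/Clang CPUID helpers, x86 only */

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    /* CPUID leaf 0: the 12-byte vendor string comes back in EBX, EDX, ECX. */
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;

    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    printf("CPUID vendor: %s\n", vendor);  /* "GenuineIntel" or "AuthenticAMD" */

    /* A vendor-keyed dispatch branches on this string rather than on actual
     * feature bits -- this is the kind of check a CPUID spoof would target. */
    if (strcmp(vendor, "GenuineIntel") == 0)
        puts("vendor-specific fast path would be taken here");
    else
        puts("vendor-specific fast path would be skipped here");

    return 0;
}
```

Running under a hypervisor that rewrites the reported vendor string would change what this prints, which is roughly the experiment being suggested.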
 
Intel GPUs perform worse on AMD CPUs
AMD GPUs perform worse on AMD CPUs

what is wrong with these :')
AMD GPUs perform worse than nGreedia GPUs.

There, I made it complete for you.
 