
B580 tanks performance with low-end CPUs

Hey, seems like nobody is talking about this "little" problem? o_O Arc B580 Overhead Issue, Ryzen 5 3600, 5600, R7 5700X3D & R5 7600: CPU-Limited Testing - YouTube

apparently the otherwise good B580 has a big problem with "non-ultrafast" CPUs (I mean, I wouldn't call the 7600 slow, necessarily)

at first it seemed like it was a ReBAR problem with old CPUs; turns out it's a problem with ALL CPUs below a certain speed...

this seriously puts into question the viability of the whole Battlemage architecture...


tl;dr
b580Overheadproblem.png


waddaya think?
 
What happens when overclocking them? I pin my 3600 right at 4.0 GHz all day long.
 


Few games tested and results vary (didn't watch the video, just scrolled through a bit) but good to know.
 
Then it's pretty bad value for people in the market for a ~$250 card. Though I have to add that in Europe it costs more than the cheapest 4060s and sometimes more than the 7600 XT 16GB, making it pointless anyway.
 
Then it's pretty bad value for people in the market for a ~$250 card. Though I have to add that in Europe it costs more than the cheapest 4060s and sometimes more than the 7600 XT 16GB, making it pointless anyway.

yepp. Although, at 1440p it should be less noticeable. Still, kinda bad that a budget GPU doesn't pair up well with budget CPUs at budget resolution.

Few games tested and results vary (didn't watch the video, just scrolled through a bit) but good to know.

defo needs more tests to have a clearer picture, but it doesn't bode well.
 
yepp. Although, at 1440p it should be less noticeable. Still, kinda bad that a budget GPU doesn't pair up well with budget CPUs at budget resolution.



defo needs more tests to have a clearer picture, but it doesn't bode well.
Might be "Genuine Intel" coding in the firmware, like how Windows has had it for eons.

It could be CPU-brand biased.
 
i wonder why he didn't test a single intel cpu @.@
 
Might be "Genuine Intel" coding in the firmware, like how Windows has had it for eons.

It could be CPU-brand biased.
No it's not. Tests with Alchemist on Intel CPUs showed the same problem. It's an overhead issue. The driver still needs a LOT of work. Some of the problems were hardware on Alchemist, but Battlemage seems to be even worse in this regard despite improvements, pointing to software as the cause.

This is the result of a finance-focused company that went all out to sell its CPUs even to the detriment of other product lines, treating their GPUs like they were the future of HD Audio, plus hectic and inconsistent management combined with a toxic culture.

The decades of ignoring GPUs are now biting them hard now that they are actually trying to get a decent GPU out.

A comment by a Reddit user:
Intel drivers use two threads for draw call submissions, which is an ancient holdover from their iGPU. I remember having an Intel laptop with an iGPU that couldn't benefit from the hyperthreading on an i3, no matter the resolution or GPU usage.
So you need a processor with two super fast cores, or you have to run up against the GPU limit and accept frametime dips.
Yea, it's going to take a lot of work to fix that. I mean the entire driver needs to be overhauled.
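If that Reddit comment is accurate, here's why it hurts budget chips so much. A toy C++ sketch (my own stand-in, NOT Intel's driver code; the per-draw CPU cost is invented) showing that with a hard two-thread cap, per-core speed is the entire ceiling:

```cpp
// Toy model of the two-thread-submission claim above. "Submitting a
// draw call" is replaced by a fixed chunk of CPU work; the driver cap
// is simulated by limiting how many threads may submit at once.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Invented stand-in for the CPU cost of validating/encoding one draw.
static void fake_drawcall_work() {
    volatile unsigned x = 0;
    for (int i = 0; i < 2000; ++i) x += i;
}

// Run `threads` submission threads for 500 ms and count finished draws.
static long long submit_for_500ms(int threads) {
    std::atomic<long long> draws{0};
    std::atomic<bool> stop{false};
    std::vector<std::thread> pool;
    for (int t = 0; t < threads; ++t)
        pool.emplace_back([&] {
            while (!stop.load(std::memory_order_relaxed)) {
                fake_drawcall_work();
                draws.fetch_add(1, std::memory_order_relaxed);
            }
        });
    std::this_thread::sleep_for(std::chrono::milliseconds(500));
    stop = true;
    for (auto& th : pool) th.join();
    return draws.load();
}

int main() {
    int hw = (int)std::thread::hardware_concurrency();
    if (hw < 2) hw = 4;  // fallback if the runtime can't tell us
    std::printf("2 threads : %lld draws/500ms\n", submit_for_500ms(2));
    std::printf("%d threads: %lld draws/500ms\n", hw, submit_for_500ms(hw));
    return 0;
}
```

On a 12-thread 3600 the second line should land roughly 6x above the first; a driver pinned to two threads never sees that headroom, so only raw per-core clocks help.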
 
No it's not. Tests with Alchemist on Intel CPUs showed the same problem. It's an overhead issue. The driver still needs a LOT of work. Some of the problems were hardware on Alchemist, but Battlemage seems to be even worse in this regard despite improvements, pointing to software as the cause.

This is the result of a finance-focused company that went all out to sell its CPUs even to the detriment of other product lines, treating their GPUs like they were the future of HD Audio, plus hectic and inconsistent management combined with a toxic culture.

The decades of ignoring GPUs are now biting them hard now that they are actually trying to get a decent GPU out.

A comment by a Reddit user:

Yea, it's going to take a lot of work to fix that. I mean the entire driver needs to be overhauled.
They fudged the i740 back in the day, so it's just interesting that there were results shown on AMD CPUs but none on Intel.
 
They fudged the i740 back in the day, so it's just interesting that there were results shown on AMD CPUs but none on Intel.
It doesn't matter. Wendell and Hardware Canucks noted slowdowns on 10th Gen Intel CPUs.

This is mostly a driver issue. Their drivers are two-threaded, so CPUs with E-cores aren't going to fix this either. The whole driver will need a rewrite to take advantage of multi-threading, which is going to be a ton of work. This explains the rumors that they were working on drivers to improve DX12 performance as well.

This is ALL due to being stuck in an iGPU mentality. The ReBAR requirement is due to this too. And the fact that the management and culture are horrid.
 
The whole driver will need a rewrite to take advantage of multi-threading, which is going to be a ton of work.

I generally ignore you trolling the Intel subforums, but I do wonder where you get this stuff. Assuming it's true, and let's do just that; assume. You know they do that already, right? Even now we are in a merge window where two sets of drivers are being released as one package. This will go on for several cycles before they finally merge them into one package. Anyone can look at Intel's driver release history on TPU and see it, and I would certainly know, since I have to touch like all of them since forever.


Not saying it's not difficult, but it certainly isn't as doom and gloom as you make it out to be; they literally do this every time a product is released. They did it for:

Iris XE & Alchemist
Alchemist & Meteor Lake
Alchemist & Lunar Lake

We are now on

Alchemist & Battlemage

So this is the 4th cycle; each one has had multiple tandem driver releases before the eventual merge.

EDIT: Anyway, I was thinking about this, and while it certainly does suck, it almost feels like they are blowing it out of proportion. I am not aware of Intel supporting ReBAR except on very select combos prior to 12th Gen, and prior to the AMD 5xxx series. Intel said from the beginning that they need ReBAR on for the cards to perform as intended; showing "bad" results on platforms that don't support it seems disingenuous. That's like saying "GUYS, the cards don't run well without ReBAR enabled; ALSO GUYS, they don't perform well on platforms without ReBAR!" Like, no shit?
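For anyone who wants to sanity-check their own platform: there's no official "is ReBAR on?" API, but a common heuristic is to look for a DEVICE_LOCAL memory type that is also HOST_VISIBLE on a heap larger than the classic 256 MiB BAR window. A minimal Vulkan sketch of that heuristic (assumes the Vulkan SDK is installed; heuristic only, not a guarantee):

```cpp
// ReBAR heuristic: with Resizable BAR enabled, drivers typically expose
// a DEVICE_LOCAL + HOST_VISIBLE memory type on a heap much larger than
// the legacy 256 MiB BAR aperture. Link against vulkan-1.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    VkInstance inst;
    if (vkCreateInstance(&ici, nullptr, &inst) != VK_SUCCESS) return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(inst, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(inst, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        VkPhysicalDeviceMemoryProperties mem;
        vkGetPhysicalDeviceMemoryProperties(gpu, &mem);

        bool rebarLikely = false;
        for (uint32_t i = 0; i < mem.memoryTypeCount; ++i) {
            const VkMemoryType& t = mem.memoryTypes[i];
            const VkMemoryPropertyFlags want =
                VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT |
                VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT;
            // >256 MiB of host-visible VRAM suggests a resized BAR.
            if ((t.propertyFlags & want) == want &&
                mem.memoryHeaps[t.heapIndex].size > (256ull << 20))
                rebarLikely = true;
        }
        std::printf("%s: ReBAR %s\n", props.deviceName,
                    rebarLikely ? "likely ON" : "likely OFF");
    }
    vkDestroyInstance(inst, nullptr);
    return 0;
}
```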
 
the entire driver needs to be overhauled

Which API is this regarding? The way draw calls are handled in DirectX 11, DirectX 12, and Vulkan is quite different, and the approach actually differs even between vendors.
 
Which API is this regarding? The way draw calls are handled in DirectX 11, DirectX 12, and Vulkan is quite different, and the approach actually differs even between vendors.
For all of it. DX11 is worse at it, but DX12 shows this too.

https://imgur.com/gallery/arc-b580-api-overhead-comparison-u3UHMyZ

The API overhead tests show DX11 ST, MT, DX12, and Vulkan all having far lower draw call throughput than AMD/Nvidia GPUs. DX11 MT gains 2% on Arc, while the 7900 XTX and 4070 Super both get 34%.
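Those MT numbers line up with something D3D11 itself will tell you: the runtime reports whether the driver natively supports command lists, and when it doesn't, deferred contexts get emulated in software and the MT path gains almost nothing. A quick check (Windows + d3d11.lib assumed; I haven't verified what Arc's driver actually reports here):

```cpp
// Query D3D11 threading capabilities. DriverCommandLists == FALSE means
// deferred contexts are emulated by the runtime, which caps DX11 MT
// scaling regardless of how many CPU cores are available.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* dev = nullptr;
    D3D_FEATURE_LEVEL fl;
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION, &dev, &fl, nullptr);
    if (FAILED(hr)) return 1;

    D3D11_FEATURE_DATA_THREADING threading = {};
    dev->CheckFeatureSupport(D3D11_FEATURE_THREADING,
                             &threading, sizeof(threading));
    std::printf("Driver concurrent creates: %s\n",
                threading.DriverConcurrentCreates ? "yes" : "no");
    std::printf("Driver command lists     : %s\n",
                threading.DriverCommandLists ? "yes" : "no");
    dev->Release();
    return 0;
}
```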
I generally ignore you trolling the Intel subforums, but I do wonder where you get this stuff. Assuming it's true, and let's do just that; assume. You know they do that already, right? Even now we are in a merge window where two sets of drivers are being released as one package. This will go on for several cycles before they finally merge them into one package. Anyone can look at Intel's driver release history on TPU and see it, and I would certainly know, since I have to touch like all of them since forever.
Talk about being in denial. It goes from being a value leader with the 9800X3D to losing even with a modern Ryzen 7600.
EDIT: Anyway, I was thinking about this, and while it certainly does suck, it almost feels like they are blowing it out of proportion. I am not aware of Intel supporting ReBAR except on very select combos prior to 12th Gen, and prior to the AMD 5xxx series. Intel said from the beginning that they need ReBAR on for the cards to perform as intended; showing "bad" results on platforms that don't support it seems disingenuous. That's like saying "GUYS, the cards don't run well without ReBAR enabled; ALSO GUYS, they don't perform well on platforms without ReBAR!" Like, no shit?
Nope. ReBAR is a common excuse. HW Unboxed tested it with ReBAR on. It underperforms significantly against the 4060 even with a modern 7600, and a Ryzen 5600 is basically the minimum requirement. 1% lows tank on Arc. On Nvidia, the 5600 performs almost identically to the 9800X3D.

And they tested the Ryzen 2600 with ReBAR on, and verified that turning it off hampers performance even further, meaning ReBAR is working.

So pointing out flaws so they can improve and people won't suffer the consequences is "trolling"? Got it. No, I don't make excuses for a company that is run by adults and is capable of fixing its own problems. None of these guys owe me anything.
 
For all of it. DX11 is worse at it, but DX12 shows this too.

https://imgur.com/gallery/arc-b580-api-overhead-comparison-u3UHMyZ

The API overhead tests show DX11 ST, MT, DX12, and Vulkan all having far lower draw call throughput than AMD/Nvidia GPUs. DX11 MT gains 2% on Arc, while the 7900 XTX and 4070 Super both get 34%.

Talk about being in denial. It goes from being a value leader with the 9800X3D to losing even with a modern Ryzen 7600.

Nope. ReBAR is a common excuse. HW Unboxed tested it with ReBAR on. It underperforms significantly against the 4060 even with a modern 7600, and a Ryzen 5600 is basically the minimum requirement. 1% lows tank on Arc. On Nvidia, the 5600 performs almost identically to the 9800X3D.

And they tested the Ryzen 2600 with ReBAR on, and verified that turning it off hampers performance even further, meaning ReBAR is working.

So pointing out flaws so they can improve and people won't suffer the consequences is "trolling"? Got it. No, I don't make excuses for a company that is run by adults and is capable of fixing its own problems. None of these guys owe me anything.

This benchmark was retired by UL because it isn't representative of a GPU's real-world performance. As you can see, it's completely lopsided towards Nvidia, because they support both deferred contexts and driver command lists, with CPU scheduling scaling to pretty much n threads. Given the scale here, BMG must be doing things similar to how AMD does it: immediate contexts, but processed by a hardware command scheduler instead of on the CPU. This approach lowers CPU overhead (the source of the "Radeon is better for low-end CPUs" thing), but it also lowers the ceiling of theoretical draw calls, theoretical being the operative word: by the time scene complexity has risen to the point where you need that many, developers should have started optimizing their code a very long time ago.

The one edge case is Creation Engine 1 games, namely Skyrim, Fallout 4 and Fallout 76. They really go wild on instancing, especially if mesh precombining is disabled (unfortunately required for many, if not most, mods). These will run better on Nvidia as long as ample CPU power is available, simply due to the scalability of their driver.
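For anyone unfamiliar with the terms in that post: in D3D11's multithreaded model, worker threads record into deferred contexts and the main thread replays the finished command lists on the immediate context. A bare-bones sketch of that flow (error handling omitted; not tied to any particular vendor):

```cpp
// D3D11 deferred-context flow: record on a worker, replay on the
// immediate context. On drivers without native command list support,
// the replay step is where single-threaded overhead re-appears.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

void record_and_replay(ID3D11Device* dev, ID3D11DeviceContext* immediate) {
    // Worker-thread side: record draws without touching the GPU queue.
    ID3D11DeviceContext* deferred = nullptr;
    dev->CreateDeferredContext(0, &deferred);
    // ... deferred->Draw(...) calls and state changes go here ...
    ID3D11CommandList* cmdList = nullptr;
    deferred->FinishCommandList(FALSE, &cmdList);

    // Main-thread side: one call submits the whole recorded batch.
    immediate->ExecuteCommandList(cmdList, TRUE);

    cmdList->Release();
    deferred->Release();
}
```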

FYI, this is on an ancient GTX 580, just to show you how utterly inconsequential this particular aspect is:


The figures here might seem low, but I don't particularly believe that this is the problem. Regardless, BMG is new. It will get lots of optimizations and fixes over the years, and especially throughout 2025.
 
No it's not. Tests with Alchemist on Intel CPUs showed the same problem. It's an overhead issue.
Are any of the games DX11? If so, that's not an overhead problem they are going to fix, because of Arc's use of DXVK. Translation layers have overhead.

Also, this does not surprise me. Keep in mind Alchemist drivers bordered on unusable for a long, long time. They're taking baby steps. Now instead of unusable, we are seeing high driver overhead. These things take time to get right.
 
So the people most likely to buy one of these actually disproportionately get the least performance out of it. Nice move, Intel.
 
So the people most likely to buy one of these actually disproportionately get the least performance out of it. Nice move, Intel.

IMHO the requirement for ReBAR support is probably the worst thing about the Arc cards. The B580 would fit like a glove in an older machine on the X79 or X99 platform.
 
So the people most likely to buy one of these actually disproportionately get the least performance out of it. Nice move, Intel.
There was something someone said Arc did really well. Some specific type of encoding; I don't recall what it was at the moment.
 
Not to be overly alarmist about this, but the one goddamn thing Arc cards had going for them just went out the window.

If you are spending $250 on a GPU, you probably didn't spend more than $250 on your CPU... Maybe even 3/4/5 years ago.

The card has to work well under those circumstances, and it just doesn't, no matter the reason.

I've been a proponent of TPU doing an annual "reality check" and benching popular cards on real systems that actual humans use. Maybe bench at the $500/$1000 system marks; it can uncover stuff like this, where just because one manufacturer or card is the best in a 9800X3D system, it doesn't mean it's going to be the best card for your 8600K or whatever.
 

Few games tested and results vary (didn't watch the video, just scrolled through a bit) but good to know.
When I saw these results, I felt a lot better about my choice pairing a 4060 with my 5000-series CPU.
 
Seems like it needs more threads.
The 24% drop vs. the 25% fewer threads between the 7600X and the 5700X3D shows it likes threads, as there's also a clock deficit between the two.
The data is inconclusive without both a dual-CCD CPU and 16-thread, 20-thread, and 32-thread Intel CPUs in the chart.
 
That not even one Intel CPU was tested is quite strange.
But it's a useful warning sign anyway; I would avoid those cards.
 