
Intel Arc B580

I don't think it's the drivers at this point in time; there seem to be certain workloads that just tank performance, and something about their architecture makes it awfully inefficient at times.


People are really dumb and don't understand how this works. If a game is sponsored by AMD or Nvidia, they are going to include their sponsor's features; everything else is extra work. It's not a matter of blocking anything but rather a matter of including the feature that your sponsor wants, with everything else being optional from your perspective. FSR runs on everything, DLSS and XeSS don't, so it's no surprise FSR is probably the first upscaler that gets implemented.

Not to mention how laughable it is to think that the GPU maker with a tiny market share gets to tell Microsoft and Bethesda what to do.
I really don't want to derail this thread, but when CP2077 had DLSS, did AMD users make a huge cry about no FSR support?

To bring it back: this card has launched to rave reviews from all the main sites. Of course HUB were the most bombastic with "The GPU that will save PC Gaming". The narrative is strong for this card, but we are exactly a month away from a response from the other players. I love it, because if someone can break that mindshare and bring prices back into line with the rest of the ecosystem, then we as consumers benefit. I fear though that the DLSS (already mentioned in the review) and RT arguments are going to be strong when things like "Our Nvidia contacts have been very good to us" from KitGuru colour the reviews of the next card from Team Green.

The world is indeed upside down. Even people who have been on TPU for years are repeating false narratives put forth by talking heads, like the people that use day-one reviews to judge products for the rest of time. It is not that there are more of them, just that their arguments used to stand out as obtuse on TPU. Just like how the Steam charts became the Bible of PC conversation and real product reviews (from purchasers) became "all fanboys". I used the Spy vs Spy analogy on someone the other day and they jumped at me with how BS that is. But put on the news, or go for a drive in any city in North America, with the gravity of what is to come mixed in with a growing housing crisis, and waxing on about Team Green or Red or Blue really is just Spy vs Spy, because none of it matters when the user is gaming. I guess that is what happened: I went down the rabbit hole with my current PC, and when I came back up, Nvidia was the only thing to buy and every single YouTuber used (and still uses) Nvidia GPUs in all of their builds.
 
Yes, absolutely. I don't need to see irrefutable proof to hold this belief (although I'd happily consume said proof if it ever surfaced, no matter what outcome it dictated), and I don't need to justify my belief to you or any of the other AMD volunteer marketing department fans on the internet. So unless you or anyone else can show said proof that they didn't do it, my belief holds firm.

In any event, I also believe the outcome of the fiasco (no matter what the whole truth even was) is a net benefit to all PC gamers.
So you'd rather hold that belief because it fits your bias, despite there not being any concrete proof to support your claim. You and the rest of the Nvidia volunteer marketing department fans would rather hold that belief despite FSR being good for PC gamers.
The most amusing part to me of the entire ordeal is that if AMD had just promptly released a media statement denying it, I'd have believed them. You can claim mindshare and marketing all you like, but understand that even if that's the case, then it also goes both ways ;)
A media statement would only cause Nvidia fans to double down and insist AMD had some sort of malicious intentions. It is a matter of mindshare and marketing, and it doesn't work the other way around, because even the reviewers find some reason to bash the competition for not having support for a proprietary feature. The B580 is a very solid card despite the weird complaints about it not having DLSS or Nvidia RT.
 
Correct. It's mostly just to shut up people who keep nagging about slow memory speeds

Any chance you could do a separate article featuring your swanky new test rig like this article before :D
 
So you'd rather hold that belief because it fits your bias, despite there not being any concrete proof to support your claim. You and the rest of the Nvidia volunteer marketing department fans would rather hold that belief despite FSR being good for PC gamers.
There is no bias involved; I consumed all the available information and evidence and came to my own conclusion, which happens to be the one that other neutral, reputable tech press also reached. It also has nothing to do with "FSR being good for gamers". From my perspective, you might appreciate that the inverse bias seems strong in denying it. Heck, you were quick to say in this very thread that you think Nvidia would do such a thing, but because there isn't absolutely irrefutable proof AMD did it, then that's that... From what I see across the community at large (and this isn't aimed at you in particular), there are some double standards at play, and AMD just gets a pass.

A: I reckon Nvidia would pull something like this, they're shady
B: looks like AMD did, or at least was accused of it; it was a whole thing
A: I need to see irrefutable proof or that's bogus!
B:....
A media statement would only cause Nvidia fans to double down and insist AMD had some sort of malicious intentions. It is a matter of mindshare and marketing, and it doesn't work the other way around, because even the reviewers find some reason to bash the competition for not having support for a proprietary feature. The B580 is a very solid card despite the weird complaints about it not having DLSS or Nvidia RT.
I can't speak for these Nvidia fans you mention, but I'd have absolutely accepted it 100% at face value if they had handled it well and promptly and directly responded saying they didn't do it, and yet they absolutely, colossally messed up their response to it. That's a considerable part of the information leading to my conclusion: the lack of a direct denial. The best case scenario here is that their PR team is woefully incompetent, and the worst case is they did the thing and still handled it badly. It seems like one of the biggest no-brainers ever that if you didn't do something you'd been accused of, you would simply and promptly just say so - that's at least a good start.
 
Let's try to stick to the topic. Let's talk about how the B580 even managed to play Starfield, instead of whatever anyone else thinks about AMD and Starfield.
 
After watching LTT's review and this thread, it's a success. But mini PCs are exploding onto the scene now.
Imagine an OCuLink eGPU version of this with a power brick for, say, 300 EUR. It would sell like hotcakes.
 
and no word about the excellent video encoder as a positive
Look again .. you'll find it .. and I wonder why suddenly everybody is doing so much video encode .. and why they couldn't do it before?

@W1zzard in every resolution A580 is faster than A750, can you check again?
Yeah, something's not right here.. will retest the A750 .. removed from the charts for now
Added RX 5700 XT and RX 6600 to charts
 
The lack of DLSS™ is the real issue with this card, and one must know that THIS will crush it. Switch 2 will crush the PS5 and Xbox Series for this single reason.
By the way, I'm still waiting for DirectX support on my MBA M1.
 
Hi, why does A580 beat A750 in averages in every resolution in your charts? Besides that, the difference between them is surprisingly small.
 
Hi, why does A580 beat A750 in averages in every resolution in your charts? Besides that, the difference between them is surprisingly small.
Something was wrong with the A750 test run, it has been removed half an hour before you made your post. I'm retesting A750 right now and will update the charts later today
 
Funny how “bad drivers” is barely mentioned, especially in the comments….:laugh:

I truly hate Intel, but the DLSS Con is unfair and I don't care whether you acknowledge it.

If anything, that crap (DLSS) is responsible for taking away the openness of PC gaming and should be a Con every time a game includes it.

I miss the days when gamers and reviewers trashed lock-in proprietary tech like this.

and I wonder why suddenly everybody is doing so much video encode .. and why they couldn't do it before?

In the same way that "apparently" everyone has a 4090 and must run everything with RT at 8K@120 Hz. :D
 
Multi-monitor power draw... ugh. Compared to my current card, it's looking bad. Almost double for nothing. Same for video playback. This card is decently priced, but its performance... is not where it should be imho. It really had to beat the 4060 Ti or at least get much closer to it. Idk if this will be enough to convince people. I know it didn't convince me at all. If my card were to die tomorrow, I ain't picking this. Decent doesn't sell anymore.
 
I must say - I'm impressed. This was not expected. The B580 does surprisingly well in 4K. Efficiency is way better than its predecessor's. I don't care about RT, but that improved a lot too. The card beats the A770, RX 7600 and RTX 4060 in performance, but is definitely not as efficient as stated in this review. The review takes ONLY Cyberpunk to calculate efficiency.

When I take power draw during gaming vs. average fps and vs. rel. perf., I get this:
View attachment 375463
It is actually very clear when you look at relative performance vs. power draw during gaming. The RTX 4060 has 95% of the B580's performance but consumes 125 W, which is 32% less (the B580 draws 185 W).
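For anyone who wants to redo this for other cards, here's a minimal sketch of the same arithmetic; the 185 W, 125 W and 95% figures are just the ones quoted above, everything else would come from the review charts:

```python
# Minimal sketch of the arithmetic above; the wattages are the ones quoted in this post
# (B580 ~185 W, RTX 4060 ~125 W at ~95% relative performance), not new measurements.

def power_saving(baseline_w: float, other_w: float) -> float:
    """Fraction of power the other card saves relative to the baseline card."""
    return (baseline_w - other_w) / baseline_w

b580_w, rtx4060_w = 185.0, 125.0
print(f"RTX 4060 draws {power_saving(b580_w, rtx4060_w):.0%} less power than the B580")
# -> RTX 4060 draws 32% less power than the B580
```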

Anyway, good job Intel. The B770 will surely eat more than 300 W, but if the performance jump is similar to the B580's, it will be good. The price of the B580 in the EU is really bad right now.
Maybe I'm reading it wrong but:

shouldn't the last column say "Watts per Frame" instead of " Frames per Watt"

Currently it implies that, per Watt you get 6,23 Frames on a RX 7600 8gb in 2160p
 
Maybe I'm reading it wrong but:

shouldn't the last column say "Watts per Frame" instead of " Frames per Watt"

Currently it implies that, per Watt you get 6,23 Frames on a RX 7600 8gb in 2160p
Exactly! My bad. Wrong column titles sometimes occur when you mess with alcohol. Thanks for pointing it out. Now it should be okay:

1734082294367.png

(Also replaced picture in former post.)
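Since the two metrics are just reciprocals of each other, a tiny sanity check makes the unit mix-up obvious; the fps and power numbers below are made up for illustration, not taken from the review or the attached table:

```python
# Illustration only: a hypothetical 4K result of 24 fps at an average draw of 150 W.
avg_fps, power_w = 24.0, 150.0

watts_per_frame = power_w / avg_fps   # ~6.25, plausible as "Watts per Frame"
frames_per_watt = avg_fps / power_w   # ~0.16, far too small to be read as "Frames per Watt"

print(f"{watts_per_frame:.2f} W/frame vs {frames_per_watt:.2f} frames/W")
```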

BTW, there is only one Arc B580 SKU available to purchase in my country and it's the SPARKLE Intel Arc B580 TITAN OC 12GB.
Average price in e-shops (incl. VAT): RTX 4060 292€; B580 351€; RX 7600 279€. Even with a stupid 1:1 conversion ratio ($:€), the B580 should cost around 300€ incl. VAT.
So, the Arc B580 should cost 308€ incl. VAT to be on par with the RTX 4060's bang-for-buck ratio, and anything less than 308€ is in favor of the B580. Of course, this applies to my country.
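For what it's worth, that ~308€ figure falls straight out of a perf-per-euro parity calculation; here is a rough sketch, assuming the ~95% relative-performance figure quoted earlier in the thread and the 292€ RTX 4060 street price (assumptions taken from this thread, not review data):

```python
# Rough price-parity sketch: what the B580 would have to cost to match the RTX 4060's
# performance per euro. Inputs are the street price and relative-performance figures
# quoted in this thread, so treat them as assumptions rather than review data.
rtx4060_price_eur = 292.0
rtx4060_rel_perf = 0.95   # RTX 4060 at roughly 95% of the B580's performance

parity_price = rtx4060_price_eur / rtx4060_rel_perf
print(f"B580 price for equal perf/euro: ~{parity_price:.0f} EUR")   # ~307 EUR
```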
 
BTW, there is only one Arc B580 SKU available to purchase in my country and it's the SPARKLE Intel Arc B580 TITAN OC 12GB.
Average price in e-shops (incl. VAT): RTX 4060 292€; B580 351€; RX 7600 279€. Even with a stupid 1:1 conversion ratio ($:€), the B580 should cost around 300€ incl. VAT.
So, the Arc B580 should cost 308€ incl. VAT to be on par with the RTX 4060's bang-for-buck ratio, and anything less than 308€ is in favor of the B580. Of course, this applies to my country.
In my region it starts at 324,50 € (B580), which is a pretty good price for an MSRP model. The cheapest 4060 is currently 299,00 € (and was never below 269,00 €).
 
@W1zzard Will GPU compute test results be added later?
 
Alright alright, I'm late to the party. The B580 looks good. Not that I was expecting something amazing anyway. Intel literally could not disrupt the market even if they very much wanted to, imo.

I'll say I am still pleasantly surprised; Intel did actually deliver what they said they would, to an extent. It wasn't the holy savior of mid-range GPUs like some were claiming, but this is definitely a good step in the right direction. Whether it'll pull in regular consumers, I'm not sure.
Excited to see what the rest of the Battlemage cards bring, especially after the B570. I think what excites me more than anything is seeing how AMD will respond. Nvidia will do what Nvidia does, obviously. Who knows, maybe the RX 8000 series will have even better value. I'm excited though.
 
It's a good card if it's sold for a lower price than its direct competitors, but it's an entry-level card for 1080p only, nothing more, competing with the RTX 4060/4060Ti and Radeon 7600XT.
But it's very likely that when the Radeon RDNA4 comes out (which will be launched in early 2025), Intel's new VGAs will lose their cost-benefit once again.

@W1zzard Will GPU compute test results be added later?
Which compute applications would you like to see tests with?
 
It's a good card if it's sold for a lower price than its direct competitors, but it's an entry-level card for 1080p only, nothing more, competing with the RTX 4060/4060Ti and Radeon 7600XT.
But it's very likely that when the Radeon RDNA4 comes out (which will be launched in early 2025), Intel's new VGAs will lose their cost-benefit once again.
I assume the lower/lowest-end models won't launch in early 2025, but I might be wrong.
 
It's a good card if it's sold for a lower price than its direct competitors, but it's an entry-level card for 1080p only, nothing more, competing with the RTX 4060/4060Ti and Radeon 7600XT.
But it's very likely that when the Radeon RDNA4 comes out (which will be launched in early 2025), Intel's new VGAs will lose their cost-benefit once again.
Probably. I"m hoping it gives AMD a push as well. I think its a stretch to say its competing with the 4060 TI or 7600XT, when those cards blow the B580 out of the water, and it seems like Intel wasn't even trying to compete with those higher tier cards regardless. I haven't taken the chance to actually read too much of what this thread has had to offer though so I could be making myself look like a clown rn lol.
 
They screwed up in the power department. ASPM is disabled by default for a reason: it is buggy on certain platforms and devices, like Intel NICs, and it causes NVMe issues. ASPM is tailored for laptops, where device changes are minimal and the firmware can be patched up. On desktops it is often hit or miss, and not only on the high-performance side; relying on ASPM is not an optimal choice.

Basically, another try, Intel... it renders the card unusable as an accelerator in small servers or a NAS.
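Since the B580's low idle power depends on ASPM actually being active end to end, a quick way to sanity-check it on a Linux box is to read the kernel's global policy; a minimal sketch assuming the standard sysfs layout (firmware settings can still override what the kernel reports):

```python
# Read the kernel's global PCIe ASPM policy on Linux; the active policy is shown
# in [brackets], e.g. "default [performance] powersave powersupersave".
from pathlib import Path

policy = Path("/sys/module/pcie_aspm/parameters/policy")
if policy.exists():
    print("ASPM policy:", policy.read_text().strip())
else:
    # If the file is missing, ASPM support is likely disabled in firmware or kernel config.
    print("pcie_aspm policy not exposed on this system")
```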
 
I know most people say that RT is not relevant in this segment, but isn't it possible that more games like the new Indiana Jones will pop up in years to come? The game simply won't run without RT hardware. At this moment the B580 should be more future-proof than the competition in that regard. Or is this a moot point?
Nope, your point is spot on. More and more game devs are going that direction.

Also, in Linus' review the B580 overall did better than is reflected on TPU.
This raises the question: what did LTT do differently?
 
I know most people say that RT is not relevant in this segment, but isn't it possible that more games like the new Indiana Jones will pop up in years to come? The game simply won't run without RT hardware. At this moment the B580 should be more future-proof than the competition in that regard. Or is this a moot point?
Add to that the several RT games I've played on my RTX A2000, which is basically the equivalent of an RTX 3050. The popular narrative is that RT is irrelevant at this performance tier, but from my perspective this performance tier is more suited to certain resolutions, like 1080p, and at 1080p the RT performance can be quite usable.

Honestly, the RT chops of Battlemage are off to a cracking start with just the B580.
 
Maybe we will see another era of secondary GPUs in computers. Just like some 12-14 years ago, when someone would buy a second GPU to compute PissX, whereas the primary GPU was used for everything else.
 