
Intel Arc B580

Starting to see a pattern here:

[screenshot attachment]
 
This thread looks like a warzone, LMAO. Good on Intel. I love what they're doing lately, and they're delivering at a pace not even Nvidia seems to keep up with. Can't wait for the upper tier Arc cards, those are gonna kick butt! :nutkick:
 
Stop defending corporations trying to rip you off; it's not in your best interest. And yes, 2013 is ages ago. Memory modules are dirt cheap: 1 GB of GDDR6 seems to be around $3 at the moment, which is peanuts. The fact that 12/16 GB isn't bog standard even on lower-end cards is inexcusable.

Developers shouldn't have to be stuck developing games for 8GB buffers forever, this is getting ridiculous.
I miss the days when AIBs could decide how much extra VRAM to slap on. I always bought the version that came with double the VRAM, and those cards lasted me significantly longer.
 
  • No support for DLSS (yes I know it's an NV exclusive, still doesn't change the fact that you can have it on one option and not on others)

Hah, what? Complaining that a non-Nvidia card doesn't run an Nvidia-exclusive tech is absolute nonsense.

Complaining about something Intel literally has no way of implementing is wild.
 
Try to bring a technical argument
You're not a developer

Computational and memory capacity often have to scale together; otherwise you simply cannot make efficient use of either. Since 2013, computing power has gone up by roughly 16×, but memory capacity has grown nowhere near that much, and that's a problem.

This is just classic Nvidia-fanboy Stockholm syndrome: "uhm, actually it's good that GPUs don't have a lot of VRAM." Most games run fine with 8 GB of VRAM because that's what developers are forced to work with, genius. Ask any programmer whether they'd rather work with less memory or more. It's nearly impossible to find a computing problem where execution time, memory, and problem size don't all scale together; you're forced to work with a serious handicap when the problem size keeps growing but you're constrained by the same memory limit over and over. This isn't a game-development thing, it's a programming thing in general. Stop pretending you know anything about this subject.

People keep asking for more detailed and complex worlds in games, which among many things requires more resources, i.e. memory, but then you have people like this guy who think you can just magically "optimize" games ad infinitum.
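The "more detail needs more memory" point is easy to ballpark. A back-of-the-envelope sketch in Python (the texture sizes and uncompressed RGBA8 format are illustrative assumptions; real games use compressed formats that shrink the absolute numbers but scale the same way):

```python
def texture_vram_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Rough VRAM footprint of one uncompressed RGBA8 texture.

    A full mipmap chain adds about a third on top of the base
    level (1 + 1/4 + 1/16 + ... -> 4/3).
    """
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

# Doubling texture resolution roughly quadruples the memory cost:
k2 = texture_vram_bytes(2048, 2048)  # ~21 MiB
k4 = texture_vram_bytes(4096, 4096)  # ~85 MiB
print(round(k4 / k2, 2))  # ~4.0
```

Detail scales quadratically with texture resolution, which is why fixed 8 GB buffers bite harder every generation.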
 
Very impressive, if I was in the market for a new GPU I'd be seriously considering one.

Agree on the video encoding comments. Intel has the best AV1 encoder and is tied with Nvidia for H.265 and H.264 (unless they've improved it even further), with AMD behind everyone else; it's worth pointing out.
 
Starting to see a pattern here:

[screenshot attachment]

Well, when the sub-$400 market has been trash for the most part over the last four years, a GPU that performs okay and has enough VRAM is a breath of fresh air...
I miss the days when AIBs could decide how much extra VRAM to slap on. I always bought the version that came with double the VRAM, and those cards lasted me significantly longer.

The fact that we've had budget 8 GB GPUs since 2016 and haven't really budged from there since is pathetic, aside from that market shifting up from $200 to closer to $300. It has nothing to do with how much VRAM is or isn't needed; at this point it's just anti-consumer. More VRAM never hurts performance and, at a minimum, gives the consumer peace of mind.

Games have pretty terrible texture quality even today, and I'm guessing part of it is developers needing to target 8 GB GPUs and it being too much work to scale down from super-high-quality assets. Thankfully, a lot of games have mods that remedy this to some extent.
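The "scale down" part is mostly mip levels: shipping the same asset with its top mip level dropped quarters its footprint. A hypothetical sketch (the sizes and uncompressed 4-bytes-per-pixel format are assumptions for illustration):

```python
def mip_chain_bytes(width, height, bytes_per_pixel=4):
    """Total bytes for a texture plus its full mipmap chain."""
    total = 0
    while width >= 1 and height >= 1:
        total += width * height * bytes_per_pixel
        width //= 2
        height //= 2
    return total

authored = mip_chain_bytes(4096, 4096)  # "super high quality" source asset
shipped = mip_chain_bytes(2048, 2048)   # same asset, top mip level dropped
print(authored // shipped)  # 4: one dropped level quarters the footprint
```

This is also why texture-quality mods work: the higher-resolution data usually already exists, it just wasn't shipped.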

Kudos to Intel, though, for finally going beyond 8 GB in the $220–250 range at launch MSRP. It only took eight years...
 
Overall I like this card a lot and it does so many things well for the price, but the disappointing part is: where it fails, it fails hard.

[chart: average FPS per game, 1920×1080]


In those games where it doesn't reach 60fps:

It's competitive with 4060/7600 in: Wukong, Star Wars
It's slower than 4060/7600 in: Dragon Age, Silent Hill, Stalker, Starfield

So where it needs those fps the most, it fails to deliver vs. the competition 2/3 of the time. Hopefully the silver lining is that with driver improvements this can be fixed.

And for my GPU preferences, high idle power and high Vsync power draw are also bad. Some AMD models have been worse here and others very good, but Nvidia really has this angle of GPU power management completely solved.
 
Hah, what? Complaining that a non-Nvidia card doesn't run an Nvidia-exclusive tech is absolute nonsense.

Complaining about something Intel literally has no way of implementing is wild.
Just make something that's better than DLSS and ensure it's included in more games. If NVIDIA can do it, Intel can do it, too.
 
Something is wonky with the RT numbers. B580 is just under 3060Ti at 1080p, way over 3060Ti at 1440p, and way under 3060Ti at 4k.
 
Holy f'ing shit.

I think we underestimate what happened here. Battlemage is really good.

Better RT than AMD at top-class efficiency is nothing to sneeze at. If they can scale these numbers to 2x and 3x the net performance, they're fighting Nvidia on every front.

The fact is, Intel has just delivered a GPU that beats the 4060, and I dare say the 4060 Ti as well, just for having 12 GB of VRAM on top of its price point. And on top of that, XeSS is also exceeding FSR in most situations right now.

Damn! AMD better wake the F up.
 
Something is wonky with the RT numbers. B580 is just under 3060Ti at 1080p, way over 3060Ti at 1440p, and way under 3060Ti at 4k.
VRAM: it falls apart in Alan Wake and Ratchet & Clank at 4K.
 
Overall I like this card a lot and it does so many things well for the price, but the disappointing part is: where it fails, it fails hard.

[chart: average FPS per game, 1920×1080]


In those games where it doesn't reach 60fps:

It's competitive with 4060/7600 in: Wukong, Star Wars
It's slower than 4060/7600 in: Dragon Age, Silent Hill, Stalker, Starfield

So where it needs those fps the most, it fails to deliver vs. the competition 2/3 of the time. Hopefully the silver lining is that with driver improvements this can be fixed.

And for my GPU preferences, high idle power and high Vsync power draw are also bad. Some AMD models have been worse here and others very good, but Nvidia really has this angle of GPU power management completely solved.
Nothing optimization can't fix. It's clear the net performance is there. We can't fault Intel for not having every game at top performance, imho; that takes time and market share. The base performance is already really good. The reality is you're not playing either of the games you mentioned at 30 FPS; you'll tweak settings to get close to double that, which is perfectly doable on all of these GPUs at 1080p.
 
Good to see my 5700XT and 6900XT are still going strong. :)
 
Computational and memory capacity often have to scale together; otherwise you simply cannot make efficient use of either. Since 2013, computing power has gone up by roughly 16×, but memory capacity has grown nowhere near that much, and that's a problem.

This is just classic Nvidia-fanboy Stockholm syndrome: "uhm, actually it's good that GPUs don't have a lot of VRAM." Most games run fine with 8 GB of VRAM because that's what developers are forced to work with, genius. Ask any programmer whether they'd rather work with less memory or more. It's nearly impossible to find a computing problem where execution time, memory, and problem size don't all scale together; you're forced to work with a serious handicap when the problem size keeps growing but you're constrained by the same memory limit over and over. This isn't a game-development thing, it's a programming thing in general. Stop pretending you know anything about this subject.

People keep asking for more detailed and complex worlds in games, which among many things requires more resources, i.e. memory, but then you have people like this guy who think you can just magically "optimize" games ad infinitum.

Thank you.
 
The RT efficiency and scaling in Doom Eternal is just so nice to see. No more 55% drop in performance. Hopefully that translates to B770 and all ID Tech games.
 
Sold out everywhere and already at over $350
 
Ok folks, time to update the mantra:

I really want ~~AMD~~ Intel to bring better price-performance to the table and increase competition so I can get my Nvidia card for less money.
AMD forced Nvidia to lower prices? When was that, lmao? 15 years ago?
I like this card, but why is it €320 (for the cheapest model, too)? With currency conversion and VAT, it shouldn't be over €280.

Also, what's going on here?

[screenshot attachment]
 
I'm glad to see Intel bring some competition to the market at the entry level.
 
This means I'm heavily interested in the highest-tier Battlemage GPU, unless it comes with fewer than 48 Xe cores. I hope it also gets a 384-bit bus, but you never know with these greedy corpos.
 
Ok folks, time to update the mantra:

I really want ~~AMD~~ Intel to bring better price-performance to the table and increase competition so I can get my Nvidia card for less money.

That's what 90% of the market is hopeful for, lmao. The problem is that even when Nvidia doesn't take the bait and prices terribly from a price-to-performance perspective, people buy them anyway...

A trash 5060 with 8 GB will likely sell 20–30× as many units as this at $299–350.
 
Just because it's cheap doesn't mean it's 'excellent value.' A bad graphics card that performs almost as badly as Nvidia's lowest-end garbage card is still bad value, because it's still $250 wasted on a subpar gaming experience.

If they could build a card equivalent to a 4070 Super and charge $450 for it, that would truly be 'excellent value.'
 
@Ravenmaster
…it’s not “just because it’s cheap”. Did you miss the cost-per-frame graph? It is, by definition, good value: good value considering the market as a whole. I’m sure we’d all love to live in a perfect world where a 4070S is a 250 dollar 4060, and in that world the value proposition of a B580 would indeed be poor. But this is not that perfect world.
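For anyone who did miss it, "cost per frame" is just price divided by average FPS. A quick sketch (the card names, prices, and FPS figures below are hypothetical placeholders, not review data):

```python
def cost_per_frame(price_usd, avg_fps):
    """Dollars paid per average frame per second; lower is better value."""
    return price_usd / avg_fps

# Hypothetical price/FPS pairs for illustration only -- plug in the
# numbers from an actual review's charts instead.
cards = {"Card A": (249, 80), "Card B": (299, 78)}
ranked = sorted(cards, key=lambda name: cost_per_frame(*cards[name]))
for name in ranked:
    price, fps = cards[name]
    print(f"{name}: ${cost_per_frame(price, fps):.2f}/frame")
```

A cheaper card with nearly the same FPS wins this metric, which is the whole point of judging value against the market rather than in the abstract.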
I swear, people become unhinged in any GPU-related thread.
 