
Intel Arc B580

Tough sell, maybe good for people who want to waste energy: the 4060's chip is half the size and eats way less power in gaming. No DLSS, and yes, that's relevant. The only thing it kinda shines at is RT, but RT is largely irrelevant on mid-range cards. All in all not a great product, barely competing with the 7600/4060 despite a way bigger chip, and against the former with a way better node on top (5 nm vs. 6 nm). It's really just the RT cores here that leave a good impression; everything else, not really. Not even the price in the US.
The 4060 only has 8 GB of VRAM, which doesn't make it particularly future-proof.
 
Pretty dubious value at retail prices. Let's be generous and call it 10% faster at 1440p than the 4060, but it's also 10% more expensive, consumes 4070 levels of power, and obviously lacks Nvidia's advantages (namely DLSS and CUDA). Surprisingly strong RT, though. At MSRP it would be pretty interesting.
 
One thing TPU is missing is frame time testing. Gamers Nexus did some, and in 3 of the 5 games tested the Intel card had significantly better frame times despite not being that much faster than the 4060.

8GB is obsolete.
8 GB is obsolete when this happens in 30 games, not 3. The reviewer here has always maintained that 8 GB is still enough. Also, testing every game at Ultra settings on a mid-range card and then crying about lag is as weird as it gets: if I'm on a mid-range card, I won't be pushing Ultra in every game.
 
I must say - I'm impressed. This was not expected. The B580 does surprisingly well at 4K. Efficiency is much better than its predecessor's. I don't care about RT, but that improved a lot too. The card beats the A770, RX 7600 and RTX 4060 in performance, but it is definitely not as efficient as this review states. The review takes Cyberpunk ONLY to calculate efficiency.

When I take power draw during gaming vs. average fps and vs. rel. perf., I get this:
(attached chart: 1734082312782.png)

It is actually very clear when you look at relative performance vs. power draw during gaming. The RTX 4060 has 95% of the B580's performance but consumes 125 W; that's 32% less than the B580's 185 W.
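For anyone who wants to redo the math, here's a quick sketch using the figures quoted in this post (185 W / 125 W, 95% relative performance); treat the exact numbers as this post's assumptions, not fresh measurements:

```python
# Perf-per-watt comparison from the numbers quoted above (assumed, not measured here)
b580 = {"perf": 1.00, "watts": 185}
rtx4060 = {"perf": 0.95, "watts": 125}

# Performance per watt, scaled up for readability
for name, card in [("B580", b580), ("RTX 4060", rtx4060)]:
    print(name, round(card["perf"] / card["watts"] * 1000, 2))

# How much less power the 4060 draws relative to the B580
saving = 1 - rtx4060["watts"] / b580["watts"]
print(f"4060 draws {saving:.0%} less power")  # 32%
```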

Anyway, good job Intel. A B770 will surely eat more than 300 W, but if the performance jump is similar to the B580's, it would be good. The price of the B580 in the EU is really bad right now.
 
This is somewhat better than I expected. Because Intel specifically compared 1440p results, I thought it would be quite a bit worse at 1080p in comparison, but it seems alright and still beats the 4060. Sad to see that they still couldn't figure out low-usage power consumption.
The RT performance improvements are very good to see. This card has the lowest performance drop by percentage in most titles, even slightly better than Nvidia. It's definitely not important for this particular card, but it's a good indication for the upcoming higher-tier ones.
Also love to see the upgraded test system and 24H2. I was going to ask for some type of comparison, but I'm guessing your hands are full and will continue to be for some time after Christmas, so I'll just say 'Thank you for the review' for now. :D
 
Finally some small price/performance improvement in the lower-end GPU segment. The low end desperately needs better GPUs; hopefully the RX 8600 will be even better. Forget about Nvidia!
 
$30 difference with the 4060 when the 4060 power consumption is almost twice lower, the drivers have more longevity, DLSS is more implemented in games than others.

Intel will have to do much better if they want their GPU to be successful.
 
8 GB is obsolete when this happens in 30 games not 3 games. The reviewer here always maintained that 8 GB is still enough. Also testing all games in Ultra settings on a mid range card and then crying about lags is as weird as it gets: if I’m on a mid range card I won’t be pushing ultra in all games.
3 of the 5 games tested. You missed that part.

We've seen GPUs kneecapped by 8 GB of memory before (hello, 4060 Ti) where, yes, Ultra settings are tested, because the 4060 Ti can handle it but the 8 GB buffer cannot.
"Future-proofing" is a moot argument, as you can't predict the future. 8 GB has been "dead" for years and is still enough in 2024; that's the joke. For 1080p it will still be relevant for years to come, I would bet.
It's hilarious how mad people get defending 8 GB cards. The PlayStation and Xbox have had more than 8 gigs for years now. 8 GB has been on GPUs for a decade. It's OK to upgrade; you don't prove anything by sticking to 8 GB.

640k should be enough for anyone, right?
 
Finally some small price/performance improvement in the lower-end GPU segment. The low end desperately needs better GPUs; hopefully the RX 8600 will be even better. Forget about Nvidia.
This isn't even good in the US, and in Europe it's downright bad price-wise. In the US it's $50 less, but your electric bill goes up due to the inefficiency; in Europe the card is more expensive than a 4060, and the electric bill makes it even worse. So yeah, better hope the 8600 and 5060 will be real deals, because this really isn't one if you look a bit more at the details.
 
Well done, Intel; what a well-rounded card for the price. I'm sure drivers will boost some of these numbers even more in the future. A pity I'm obsessed with high-FPS 1440p or I would have snagged this beauty; heck, I still might for a streaming card or something.
 
$30 difference with the 4060 when the 4060 power consumption is almost twice lower,
That's not how math works. The Intel chip consumes 50% more power; you don't use a multiple to describe a decrease in consumption.
the drivers have more longevity, DLSS is more implemented in games than others.
Longevity hasn't been proven yet. Let's wait until Alchemist is discontinued before making those claims. You're right on DLSS.
Intel will have to do much better if they want their GPU to be successful.
The B580 is selling out online, so I think Intel's doing pretty well here. Rome wasn't built in a day, and Xe2 is clearly a major step up from Xe1.

This isn't even good in the US, and in Europe it's downright bad price-wise. In the US it's $50 less, but your electric bill goes up due to the inefficiency; in Europe the card is more expensive than a 4060, and the electric bill makes it even worse. So yeah, better hope the 8600 and 5060 will be real deals, because this really isn't one if you look a bit more at the details.
If you're worried about a few cents of electricity, you shouldn't be buying GPUs; those are luxury goods.
 
3 of the 5 games tested. You missed that part.
3 of 5? Nice cherry-picking then. Of 10,000 games tested it will have (alleged) "problems" in 5 AAA games; that's the reality. Your argument isn't part of it.
We've seen GPUs kneecapped by 8 GB of memory before (hello, 4060 Ti) where, yes, Ultra settings are tested, because the 4060 Ti can handle it but the 8 GB buffer cannot.
Yes, in some cherry-picked benchmarks. According to TPU's game tests, 8 GB is never an issue. In 1 game (one) it is, Ratchet & Clank, and that's it. And sorry if I don't trust clickbaiting drama queens over on YouTube.
It's hilarious how mad people get defending 8 GB cards. The PlayStation and Xbox have had more than 8 gigs for years now. 8 GB has been on GPUs for a decade. It's OK to upgrade; you don't prove anything by sticking to 8 GB.
I mean, maybe you are mad; don't project your problems on me. The PS5 and Series X have 10 GB of VRAM for the game, hardly more than 8 GB, while they are consoles targeting up to 4K. Your "argument" is really my argument, not yours, as it makes my point. This is a 1080p card competing with 1080p cards.
If you're worried about a few cents of electricity, you shouldn't be buying GPUs; those are luxury goods.
Too bad it's more than a few cents, and you're saying this to people who don't have much money anyway; who exactly buys mid-range cards? And why are you so mad and personal? Who said this is about me? Calm down, kid.
 
A positive surprise. This is a real 3rd competitor in the mid-low range.
 
Not bad. Multimonitor power consumption is a real bummer though.
 
3 of 5? Nice cherry-picking then. Of 10,000 games tested it will have problems in 5 AAA games; that's the reality. Your argument isn't part of it.
That would imply that 10,000 games had their frame times tested and only 3 showed a difference. The reality is that 5 games were tested. Your argument is ignorant.
Yes, in some cherry-picked benchmarks. According to TPU's game tests, 8 GB is never an issue. In 1 game (one) it is, Ratchet & Clank, and that's it. And sorry if I don't trust clickbaiting drama queens over on YouTube.
So anything that disagrees with you is cherry-picked? There's also Resident Evil VII, which flat out didn't work on 8 GB cards until a patch allowed it to run, like garbage. TLOU and Forspoken don't run right on 8 GB either. Other sites have tested games and shown reduced texture quality, reduced lighting, etc. to make the games run.
I mean, maybe you are mad; don't project your problems on me. The PS5 and Series X have 10 GB of VRAM for the game, hardly more than 8 GB, while they are consoles targeting up to 4K. Your "argument" is really my argument, not yours, as it makes my point.
LOL, what even is this statement? Consoles are not doing native 4K. The PS5 reserves 2.5 GB of memory, leaving 13.5 GB for games, not 10. The Series S is upscaling from 720p. What does that imply for your 8 GB GPU's future?
Too bad it's more than a few cents, and you're saying this to people who don't have much money anyway; who exactly buys mid-range cards? And why are you so mad and personal? Who said this is about me? Calm down, kid.
Take a chill pill, dude XD. It's literally a few cents for gaming. Yeah, even in Europe. It would take 20 hours of gaming at full throttle for the Intel chip to use about 1 kWh more electricity than a 4060. A whopping $0.35 in Germany. For 20 hours of flat-out max power usage.

If that is a game breaker, then you shouldn't be spending $250 on a GPU. Sorry, that's basic math.
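For what it's worth, the arithmetic behind that claim can be sketched out; the 60 W gap (185 W vs. 125 W) and the 0.35 per kWh German rate are taken from the posts above, not measured by me:

```python
# Extra electricity cost of the higher-draw card, using the thread's own numbers
delta_watts = 185 - 125      # assumed gap in gaming power draw (B580 vs. 4060)
price_per_kwh = 0.35         # assumed German household rate quoted above

def extra_cost(hours):
    """Extra cost of running the hungrier card flat out for `hours` hours."""
    kwh = delta_watts * hours / 1000
    return kwh * price_per_kwh

print(extra_cost(20))     # ~0.42 for 20 hours flat out (the post rounds to 1 kWh)
print(extra_cost(20000))  # ~420 over 20,000 hours
```

So at 20 hours the difference really is pocket change; it only becomes meaningful over thousands of hours of full-load gaming.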
 
That would imply that 10,000 games had their frame times tested and only 3 showed a difference. The reality is that 5 games were tested. Your argument is ignorant.
10,000 was me lowballing it; 99.*% of all games ever released will have no problems with 8 GB of VRAM today. You don't seem to get the point I'm making. Cherry-picking doesn't support your argument, especially not while the guy who writes the reviews here doesn't agree with you. But I bet the drama queens on YouTube will agree; they need the clickz.
So anything that disagrees with you is cherry-picked? There's also Resident Evil VII, which flat out didn't work on 8 GB cards until a patch allowed it to run, like garbage. TLOU and Forspoken don't run right on 8 GB either. Other sites have tested games and shown reduced texture quality, reduced lighting, etc. to make the games run.
While 99% of even AAA games run fine, I would say that's a "Resident Evil" problem then, not an 8 GB VRAM card problem; you seem to lack rooting in reality. Exceptions don't make the rule. The rule is more important than the few exceptions you're handpicking. Aside from that, I'm not even accepting your handpicked examples: no source, the TPU author doesn't agree with you, and 100% of it was conducted at Ultra settings, which are nonsense on mid-range cards. There's so much to counter your argument; it's quite easy.
LOL, what even is this statement? Consoles are not doing native 4K. The Series S is upscaling from 720p. What does that imply for your 8 GB GPU's future?
And? You can also use these cards and not do native 1080p; use DLSS Quality instead. Maybe think a bit more before pressing reply next time; you're so easily countered. And even then, 4K upscaled on just 10 GB of VRAM still makes 10 GB hardly a lot, so no, you don't have a point at all. 1080p without upscaling eats less VRAM than 4K with upscaling. More thinking, less posting.
Take a chill pill, dude XD. It's literally a few cents for gaming. Yeah, even in Europe. It would take 20 hours of gaming at full throttle for the Intel chip to use about 1 kWh more electricity than a 4060. A whopping $0.35 in Germany. For 20 hours of flat-out max power usage.
And how many hours does a gamer play? Not 20, more like 20,000. Your point = no sense in it. Take a chill pill, dude, and post slower; think a bit before pressing the blue button.

Plus, power consumption isn't just about money; it's also more heat and noise. Good luck with your moot argument.
 
Thanks for the great review. Intel has made great strides in power consumption and architectural efficiency. The A770, despite having more of everything, is slower. Ray tracing performance is great too though it doesn't really matter at this performance level. Still it's great to see someone else matching or beating Nvidia in ray traced games. The weak points, such as high multimonitor and video playback power consumption, are mitigated by the excellent price. At $250, this makes the 7600, 7600 XT, 4060 and 4060 Ti overpriced.
 
It's a good step up from Alchemist. It isn't a particularly exciting or disruptive card, sure, but it at least shows that Intel ARE improving and can, if the GPU division isn't shuttered by the board, become a respectable competitor in time. People saying "oh, it isn't definitively superior to a 4060" kind of forget that one is a card from a literal juggernaut of the industry that has been a leader for more than 20 years now, and the other is literally second-generation tech from a company that didn't make conventional discrete GPUs at all before.
 
Matches last-gen performance as we're about to hit 2025, with new generations from both AMD and Nvidia right around the corner. Yay, I guess...
 
10,000 was me lowballing it; 99.*% of all games ever released will have no problems with 8 GB of VRAM today. You don't seem to get the point I'm making. Cherry-picking doesn't support your argument, especially not while the guy who writes the reviews here doesn't agree with you. But I bet the drama queens on YouTube will agree; they need the clickz.
Oh, I see your argument here. Because Blue's Clues Adventures from 1999 runs on a potato, that means modern GPUs don't need more than 8 GB.

That's some insane red herring you've got going on. Old games != modern ones.
While 99% of even AAA games run fine, I would say that's a "Resident Evil" problem then, not an 8 GB VRAM card problem; you seem to lack rooting in reality. Exceptions don't make the rule. The rule is more important than the few exceptions you're handpicking. Aside from that, I'm not even accepting your handpicked examples: no source, the TPU author doesn't agree with you, and 100% of it was conducted at Ultra settings, which are nonsense on mid-range cards. There's so much to counter your argument; it's quite easy.
Yes, when you throw out everything you disagree with, it's easy to claim you're right. Sadly, you are mistaken.
And? You can also use these cards and not do native 1080p; use DLSS Quality instead. Maybe think a bit more before pressing reply next time; you're so easily countered. And even then, 4K upscaled on just 10 GB of VRAM still makes 10 GB hardly a lot, so no, you don't have a point at all. 1080p without upscaling eats less VRAM than 4K with upscaling. More thinking, less posting.
LOL, you totally miss the point. The current "8 GB" console can't hit 1080p native and has to go lower. If you have to resort to running DLSS at 1080p to make games run right, you have a problem. And DLSS doesn't reduce memory usage; that's a fact.
And how many hours does a gamer play? Not 20, more like 20,000. Your point = no sense in it. Take a chill pill, dude, and post slower; think a bit before pressing the blue button.

Plus, power consumption isn't just about money; it's also more heat and noise. Good luck with your moot argument.
You do realize that if you gamed 16 hours a day, every day, for an entire year, that is 5840 hours. Somehow I doubt most gamers are playing 16 hours a day, every day, for 3.42 years straight.

Seriously, are you OK? That is pretty basic math to be screwing up.
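The arithmetic in that reply does check out; a trivial sketch, using the 20,000-hour figure claimed earlier in the thread:

```python
# How long 20,000 hours of flat-out gaming actually takes at 16 h/day
hours_claimed = 20_000                    # figure from the earlier post
hours_per_year = 16 * 365                 # 16 hours a day, every day
years = hours_claimed / hours_per_year
print(hours_per_year)      # 5840
print(round(years, 2))     # 3.42 years of 16-hour days
```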
 
Damn, game devs suck now if they can't work with 8 gigs of RAM.
 
Remember how Wayne Gretzky said he doesn't skate to where the puck is, but to where it will be? Well, Intel seems to be skating to where the puck was.
 
Damn, game devs suck now if they can't work with 8 gigs of RAM.
"640k should be enough for anyone"

Technology moves forward. You didn't see people pearl-clutching when their 2 GB GPUs or their 512 MB GPUs became obsolete. But 8 GB? Man, people REALLY love the number 8 for some reason.
Matches last gen performance as we are about to hit 2025, and new generations from both AMD and Nvidia right around the corner. Yay, I guess...
Well, we've seen that Nvidia is likely going to keep the 4060/Ti around for a while. They're more interested in the 5090/80/70 cards and AI accelerators. AMD? Well, the 7600 was a total dud of a card, and I don't see RDNA 4 being the magic that AMD needs. RDNA 3, core for core, was a total failure, and AMD doesn't have the margins to lower prices further. So I doubt they would have a good answer to the B580.
 
"640k should be enough for anyone"

Technology moves forward. You didn't see people pearl-clutching when their 2 GB GPUs or their 512 MB GPUs became obsolete. But 8 GB? Man, people REALLY love the number 8 for some reason.

Well, we've seen that Nvidia is likely going to keep the 4060/Ti around for a while. They're more interested in the 5090/80/70 cards and AI accelerators. AMD? Well, the 7600 was a total dud of a card, and I don't see RDNA 4 being the magic that AMD needs. RDNA 3, core for core, was a total failure, and AMD doesn't have the margins to lower prices further. So I doubt they would have a good answer to the B580.
Mainly because we have games from 5+ years ago that look about the same and were running on 6 gigs or less. It feels more like incompetence than any real leap forward.
 