
Intel Arc B580

@cal5582
It’s less incompetence (though plenty of that to go around) and more the fact that it’s been well known for a while that each new step in increasing graphics fidelity is less noticeable than the previous one while requiring progressively more computational resources. And monetary expense too. I don’t think that many game developers are actually all too enthused about this race, but the masses demand their ever “prettier” slop, however minor the improvement. I’ve been saying for years that a graphical fidelity freeze would probably do wonders for the industry, but that’s an unpopular opinion, so here we are with what we have.
 
Yeah sure, CP2077 and all the other games released in the last 10 years where these cards have zero issues are also "old games that run on a potato". :laugh: Reading comprehension on your part = 0.
The argument was about general relevance; you refer to a single game and then personally attack the person.
Try arguing the next time you're pressed; the other guy might even take you seriously then. Or let's translate your comment: "bwwaahahhh, I got no arguments so I'm deflecting with gibberish."
You were challenged on changing the base argument, aka moving the goalposts, and then again attacked the person.
Reading comprehension, part 2: too bad I only said that to counter your moot "they don't use native, they use upscaling" argument. You really don't get it, do you? I never said 8 GB VRAM cards always need upscaling to work; that's the nonsense you're reading into my very different words.
Started with a personal attack and then lampshaded the semi-relevance of your own argument. Continued with a personal attack and a new argument created out of "I didn't say this."
Go tell that to the thousands of people playing competitive games whose accounts have clocked 10,000+ hours. You're oh so wrong. And what is your "argument", really? "Let's waste energy, it's fine, it doesn't matter!" :laugh: Pathetic. Simply accept that you've got no argument at all and move on.
This is not addressing the argument, as the example given was a precise period of time and you argue the entire play history of all gamers. Then you challenge the information, citing the personal importance you place on energy consumption, followed by another personal attack.
Pasted a dictionary definition that is a personal attack.
 
@cal5582
It’s less incompetence (though plenty of that to go around) and more the fact that it’s been well known for a while that each new step in increasing graphics fidelity is less noticeable than the previous one while requiring progressively more computational resources. And monetary expense too. I don’t think that many game developers are actually all too enthused about this race, but the masses demand their ever “prettier” slop, however minor the improvement. I’ve been saying for years that a graphical fidelity freeze would probably do wonders for the industry, but that’s an unpopular opinion, so here we are with what we have.
Honestly, I'd like to see them explore other ways of improving things. What happened to 3D audio and destructible environments?
 
The argument was about general relevance; you refer to a single game and then personally attack the person.
Nonsense; then read my arguments again, carefully and a bit slower (if I say "reading comprehension" again, he says "personal attack" again :laugh:).
You were challenged on changing the base argument, aka moving the goalposts, and then again attacked the person.
Too bad that he was the guy moving the goalposts and I called him out for it. What are you even doing here? This is a technical forum and you're just here to attack me. Weird.
Started with a personal attack and then lampshaded the semi-relevance of your own argument. Continued with a personal attack and a new argument created out of "I didn't say this."
Nonsense, the fact is he has no arguments, and I refuted his claims multiple times while he refrained from putting any work into refuting my counter-arguments, and I called him out for it. Are you his 2nd account, or why so defensive of this guy? Again, this is a technical forum, not a psychology discussion or whatever your personal vendetta against me is. You have no point.
This is not addressing the argument, as the example given was a precise period of time and you argue the entire play history of all gamers. Then you challenge the information, citing the personal importance you place on energy consumption, followed by another personal attack.
Too bad it is; go read my other post where I also address his claim, which makes zero sense whatsoever. His claim is "oh, gamers only play 20 hours, so energy waste doesn't matter!" and I refuted this by saying they play way more, so the energy cost is way higher than he thinks it is (see the rough cost sketch at the end of this post), aside from his claims being nonsense on the basis that there is no merit in wasting energy just "because you can", which is his entire point. He played down energy waste by saying "it doesn't matter, the costs are low!".
Pasted a dictionary definition that is a personal attack.
Too bad it isn't; describing a pattern of behavior is exactly that, a description - whereas what he said really was an attack. And a professional psychological term is never an attack, btw. Maybe stop playing hobby psychologist here; you're clearly not good at it.

Nice try (not really). But again, this is a technical forum and not a psychology forum for your personal problems with me (that make no sense whatsoever).
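For perspective, here's a rough back-of-envelope sketch (in Python) of what such a power-draw gap actually costs over heavy use; the wattage difference, yearly hours and electricity price are assumed figures for illustration, not numbers from the review:

# Assumed figures only: ~40 W draw difference between two cards,
# 1,000 hours of gaming per year, electricity at $0.30/kWh.
watt_gap = 40          # W, assumed difference in power draw
hours_per_year = 1000  # h, assumed for a heavy competitive player
price_per_kwh = 0.30   # USD per kWh, assumed

kwh = watt_gap * hours_per_year / 1000  # Wh -> kWh
print(f"{kwh:.0f} kWh/year -> ${kwh * price_per_kwh:.2f} per year")

With those numbers it comes out to roughly 40 kWh, or about $12 a year; scale the hours or the wattage gap up or down and you land on either side of the argument above.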
 
Edit: Missing from the pros are the video encode/decode capabilities, which are currently a lot better than the competition's.
Everyone overlooks this - Intel Deep Link and Hyper Encode are no joke.
 
It's time

:)
 
Thanks for the great review. Intel has made great strides in power consumption and architectural efficiency. The A770, despite having more of everything, is slower. Ray tracing performance is great too, though it doesn't really matter at this performance level. Still, it's great to see someone else matching or beating Nvidia in ray-traced games. The weak points, such as high multi-monitor and video playback power consumption, are mitigated by the excellent price. At $250, this makes the 7600, 7600 XT, 4060 and 4060 Ti overpriced.
Pretty much my thoughts; I did not expect the B580 to make Intel's previous gen obsolete (and at launch, without having to wait months for driver tweaks/optimization!).
Congrats Intel! Hope you can keep up with demand.
 
@W1zzard Is the HDMI 2.1 support built into the card, via an adapter with onboard support from the HDMI Forum, or is it driver based? Their last card was not driver based. I'm curious about Linux VRR support using Mesa.
 
Hopefully AMD sees this and drops RDNA4 prices even further...

 
@W1zzard Is the HDMI 2.1 support built into the card, via an adapter with onboard support from the HDMI Forum, or is it driver based? Their last card was not driver based. I'm curious about Linux VRR support using Mesa.
It seems built into the GPU; there is no additional HDMI chip like on the A-Series cards.
 
This would be more exciting if it had come out 6-12 months ago. As it stands, the sub-$300 market is so terrible that this still ends up being a solid option, finally...

I'm still digging the Intel reference design.
 
This is a great showing from Intel! What impressed me the most was how many times this card beat out the 2080 Ti. Then there's the ray tracing performance, which for this price point is excellent! I'm now really excited for Intel's higher-end cards. Also something to think about is the B570; I'm guessing it'll come in at 5-7% below this card's performance.
 
Wow, just wow. This is an absolutely awesome GPU for $250, matching up pretty well against competitors that cost 20-30% more. Plus I'm sure the drivers will continue to see larger than normal performance improvements.

Really interested to see what the B750 and B770 bring to the table now.
 
It's still not very impressive in the grand scheme of things; it's a relatively expensive chip to manufacture for its performance tier, and I suspect the margins are dismal.

Damn, game devs suck now if they can't work with 8 gigs of RAM.
We've had 8 GB cards on the market since 2013, over a decade ago. GPU manufacturers suck big time for under-provisioning their cards with VRAM, though it's really the green goblin that's holding the industry back.
 
Damn, it's refreshing to see a mid-range card with pricing like in the good ol' days. With some driver improvements, the 2nd-gen Intel cards could be a serious threat to AMD/NV.
 
@W1zzard

Good tests

Can you possibly add Indiana Jones and the Great Circle? That game has mandatory ray tracing.

:)
 
Can you possibly add Indiana Jones and the Great Circle? That game has mandatory ray tracing.
Yeah, this will be added in the next retest.
 
Nice, I guess. I wonder what the bigger chip is going to be like, if there is a bigger chip.
 
.... Intel has an advantage over AMD when it comes to video encode and decode capabilities.
Please provide empirical evidence to support that claim.
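If anyone wants to gather such evidence themselves, here's a minimal sketch of a throughput comparison, assuming an ffmpeg build with both the Intel (QSV) and AMD (AMF) H.264 encoders available, a machine that can actually run both, and a local test clip named input.mp4 - all assumptions on my part; it also measures speed only, not quality:

import subprocess, time

def encode(encoder, src="input.mp4", out="out.mp4"):
    # Run a hardware encode through ffmpeg and return the wall-clock time.
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", encoder, "-b:v", "8M", out],
        check=True, capture_output=True)
    return time.time() - start

for enc in ("h264_qsv", "h264_amf"):  # Intel Quick Sync vs AMD AMF
    print(enc, f"{encode(enc):.1f} s")

A proper comparison would also look at quality per bitrate (VMAF/SSIM) and at decode, which this sketch doesn't cover.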
 
We've had 8 GB cards on the market since 2013, more than a decade ago. GPU manufacturers suck big time for under-provisioning their cards with VRAM, the green goblin most of all.
"2013" sounds like a long time, but your argument isn't programming/developing related. 8 GB is still a lot for a vram buffer, and what was highend back then is now midrange, it just fits. The matter of fact is, just badly programmed games today struggle with 8 GB vram in 1080p (or even 1440p+), whereas well optimised games like CP2077 simply don't, most AAA games as a matter of fact don't. But it's popular to blame "8 GB" and "the green company" for this - it doesn't make much sense. It's still funny, Nvidia reacted to the people saying 4060 Ti has not enough vram, brought the 16 GB version of it, and it was a complete joke, 0% better, aside from a few very rare edge cases, and simply not worth the extra 100 bucks. Is more Vram nice to have? Yes. Do you need it? Most probably, no. You need it on a highend card for 1440p - 4K and RT on. Same on 7600 XT btw, the extra 8 GB only pay off when RT is on (on a card that is not good at RT... more than moot).

The question is, how big will the larger "Battlemage" chip be? 50% bigger? 100% bigger is highly unlikely; that would be a humongous chip with low or even negative margins, because it would be too slow to capitalise on the size. At 50% bigger it could compete with the 7800 XT and the like.
 
Yeah, this will be added in the next retest.

thanks

Maybe you can also add tests using DXVK and VKD3D, because on Arc A-series cards they give more performance in various titles.




:)
 
"2013" sounds like a long time, but your argument isn't programming/developing related. 8 GB is still a lot for a vram buffer, and what was highend back then is now midrange, it just fits. The matter of fact is, just badly programmed games today struggle with 8 GB vram in 1080p (or even 1440p+), whereas well optimised games like CP2077 simply don't, most AAA games as a matter of fact don't. But it's popular to blame "8 GB" and "the green company" for this - it doesn't make much sense.
Stop defending corporations trying to rip you off; it's not in your best interest. And yes, 2013 is ages ago. Memory modules are dirt cheap - 1 GB of GDDR6 seems to be around $3 at the moment, which is peanuts. The fact that 12/16 GB isn't bog standard even on lower-end cards is inexcusable.
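As a back-of-envelope sketch of that point (the ~$3/GB figure above is a rough spot-price estimate, not an actual BOM cost):

price_per_gb = 3.0          # USD per GB of GDDR6, rough estimate from above
for extra_gb in (4, 8):     # e.g. going from 8 GB to 12 GB or to 16 GB
    print(f"+{extra_gb} GB -> about ${price_per_gb * extra_gb:.0f} in memory chips")

So the raw memory-chip cost of an extra 8 GB would be on the order of $24 at that price, before board and margin overhead.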

Developers shouldn't have to be stuck developing games for 8GB buffers forever, this is getting ridiculous.
 
Well... what to say... competition is great, especially when the third player is trying to get better. Efficiency is, sadly, still lacking.
But I guess this is the legacy of Radeon that the RTG folks took to Intel - one of the things that transferred along with their past experience at AMD.

On the other hand... another good thing is that Intel didn't push the 12VHPWR connector, which was basically developed and certified by Intel themselves. This seems like yet more evidence that it was an artificial, proprietary thing of Nvidia's, and their own call after all! It was especially moot to push it for low- and mid-range cards. If an 8-pin PCIe connector is enough for the "more power hungry" Intel cards, anything below a 4070 would easily get along with this "proven" connector as well.
 
Stop defending corporations trying to rip you off; it's not in your best interest.
Try to bring a technical argument instead of playing with strawmen.
Developers shouldn't have to be stuck developing games for 8GB buffers forever, this is getting ridiculous.
You're not a developer, hence the non-technical non-argument seen here. You're just going: "OMG, we've had 8 GB of VRAM for 10 years! It must be outdated and old!" Tech doesn't care about feelings. 8 GB is 8 GB, and it's still enough 99% of the time.

Here's a fresh example from a brand-new open-world game review on TPU with the newest graphics, Unreal Engine 5:

"VRAM
Our testing shows that Stalker 2 is very well-behaved in terms of VRAM usage. Even at 4K you're barely hitting 10 GB; lowest settings runs at around 6 GB, so virtually all cards can handle the game without problems."
 