
Intel Arc B580 Battlemage Unboxing & Preview

According to TPU's review, 24% over the A750 would mean 8% faster than the 4060 (with October 2022 drivers for the A750). So their claim of beating the 4060 by 10% seems reasonable, and in theory the B580 could match or marginally surpass the 7600 XT. We'll know in a week.

I repeat, in theory.
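The percentage chaining above can be sketched as a quick sanity check. The numbers are the quoted claims (with the A750 normalized to 1.0 purely for illustration), not measured data:

```python
# Quick sanity check of the quoted relative-performance claims.
# Baseline: A750 normalized to 1.0 (an assumption for illustration).
a750 = 1.00
b580 = a750 * 1.24          # claim: B580 is 24% over the A750
rtx4060 = b580 / 1.08       # claim: that makes the B580 8% faster than a 4060

# The 4060 then works out to roughly 15% faster than the A750,
# so a 10% lead over the 4060 is in the same ballpark as the 8% above.
print(f"4060 vs A750: {rtx4060 / a750:.3f}x")   # ~1.148
print(f"B580 vs 4060: {b580 / rtx4060:.3f}x")   # 1.080
```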
 
If those claims hold and drivers are decent then looks like a decent product for the asking price.
I hope the drivers are good. Arc A-series card launched to terrible driver issues, even though the IGP variants and associated control panels were working well before that date. Intel took expectations of "similar to the Xe graphics we're familiar with" and broke everything, in multiple ways, all at once.

So yeah, just because the current drivers are passable and game issues are largely resolved, doesn't guarantee that Intel won't f*** this up again.

More compelling competition is great for us, so here's hoping both Battlemage and also AMD's 8000 series absolutely decimate the stagnation we've seen in terms of performance/$ since RTX 20-series launched at silly prices with modest performance gains. Bad driver problems and dishonest marketing promises are all it takes to completely undermine people's enthusiasm for Intel GPUs.
 
If this card is only marginally faster than the 4060 for $250, it's DOA. The savings compared to AMD are too low to make me willing to deal with driver inconsistencies, or to recommend it.
Chop off another $30 and now we're talking.

Frankly, this sounds a lot like Intel rushing this release to be able to sell some products, fearing it won't be able to sell any at a profit once the performance mills of AMD/Nvidia get going.
Having lived through both Intel and AMD GPU driver inconsistencies, I disagree.

This will make a nice backup for my A770 until the B770 arrives.
 
10% faster than the 4060 at 1440p? Interesting resolution to show when the 4060 is a 1080p card, not a 1440p one. On top of that, the 4060 is already over a year old and soon to be replaced.
 
10% faster than the 4060 at 1440p? Interesting resolution to show when the 4060 is a 1080p card, not a 1440p one. On top of that, the 4060 is already over a year old and soon to be replaced.
I have an RX 6600 combined with an i9-11900KF and 32 GB of RAM at 3200 MHz, and I play at 1440p. The 6600 is most definitely not a card for 1440p, and is worse than the 4060: it's about 21% slower than the 4060 on TPU's relative performance chart on the 4060's page.

[attached screenshot]


I usually play at mid to mostly high settings and achieve a satisfactory 70+ average FPS in most games, as you can see in my screenshot of the games currently on my PC.

[attached screenshot]


Do I play extremely intensive games? No, not really. But when I do play an intensive one, it's not like cards like the 6600 or 4060 are terrible at 1440p. I understand why people say card X is for 1440p or card X is a 1080p card, but I think most people would be fine with performance like mine, or just the 4060's performance, at 1440p. It's mostly tech-savvy people who care about the specific specs and whatnot.

I also understand people not wanting to compare the B580 to the 4060. While the specs on paper and the value proposition are really good, I think it's fair to doubt Intel's drivers. They have done a good job supporting their Alchemist GPUs, which have come a long way since launch, but at the end of the day this is only Intel's second (released) shot at the discrete GPU market, and their tech isn't really mature compared to AMD's and Nvidia's.

I won't be upgrading my GPU anytime soon, but I genuinely wish Intel's GPU team the best and success. They deserve it, and they have tried their damned hardest to get the Alchemist GPUs to the state they're in right now.

Other than that, thanks for reading my rambling, and just know I'm not trying to offend anyone.
 
Not a fan of Intel stuff these days, but I absolutely love the design of their reference cards.

Minimalistic beauty. Nothing screams "gamer" about it. This is a GPU I would mount vertically just for the clean aesthetics.

Asus offers a nice compromise with their ProArt line, but these Intel GPUs make even that look too busy.
 
Only for Arc cards; the rest of "Xe1" lost driver support in 2022. Even pre-Arc dedicated GPUs (Iris Xe MAX) never got the new D3D11 driver, or any fix for Vulkan, and run a seriously outdated D3D12 driver too. Intel really has to show it can consistently make a driver for the entire stack with Xe2, and not half-cook it just for the press.
There's also the issue with Alchemist drivers refusing to install at all and having to try 50 different things to get them working.

Happy to see a real memory amount for 250 USD; no more 8GB jokes. Now let's hope Intel didn't fire half their driver division before release (like AMD did).

Are you back on this again? You troll every single Intel driver post, bro; that's wild.

Meanwhile, the Intel Arc driver download page, as it has looked forever, including as of about a week ago:


[attached screenshot]
 
1440p tested and compared using an upscaler (which is already bad, as it's no longer 1440p) that uses different algorithms depending on the platform running it?
[attached image]


What has this world come to.

On a more serious note: it looks good, and I like that they are targeting 1440p with a $250 card. That is the goal we should be striving for, not 1080p; it's time to move on from it, though preferably at native resolution, none of this upscaling nonsense. That's a band-aid, not a target. It's also nice to see 10GB on the lower model, as that's what the Xbox Series X has, so it should be fine for the foreseeable future (provided we don't keep getting VRAM catastrophes).
 
Intel has invested billions into their dGPU attempts, losing money the whole way. If they weren't serious, there would be no Battlemage at all. They'd have cut this loose a long time ago if they were just in it for the short term. Even with all the cost cutting going on, the Arc program has so far survived.
Billions mismanaged. I would love to see a third party involved; they ain't it, not at this time.

No FP64 mentioned on the Architecture page, so pretty sure not.



Popping in to say that a bare "5090?" is actively detrimental to constructive conversation.
I didn't know we got emotional over lackluster products. I'll do better next time - I promise.
 
This is a strong statement. Even a 4090 is not ideal for 1440p anymore.
I'd rather word this along the lines of "making it great for casual gaming at 1440p."
Try the options menu sometimes; even a 4070 is good for 1440p.

Big-ego gamers can't lower settings?
 
Try the options menu sometimes; even a 4070 is good for 1440p.

Big-ego gamers can't lower settings?
If I'm trying to experience everything the dev made, I don't want to turn down settings; I want to experience it in all its glory. However, if I couldn't afford it, I'd play at 1080p.
 
Try the options menu sometimes; even a 4070 is good for 1440p.

Big-ego gamers can't lower settings?
Or uncaring, underpaid, or overworked developers can't optimize games. I'm leaning towards that last one.
 
If you put caveats like that on it (older games or DLSS), then even my old 1060 was a 4K beast, as tested with UT2004.
I didn't say decades-old games. ;) My argument was about games 10-15 years old at most, like Tomb Raider (2013). And with DLSS, well, no RT of course, DLSS Performance, FG probably on as well... and it works, if 60 FPS is your target; if it's higher, it's still too slow for 4K. So it has a lot of caveats, but it's possible. Also, of course, "smart settings" and not "ultra ultra ultra" nonsense.
 
The pricing is so stupid

I don’t understand the point of the 570 if the 580 is only $30 more
 
For a 4090 owner, ideal is 120+ FPS. There's a list of games you'll certainly struggle to run that smoothly.

A 4090 is CPU-limited at 1440p. Want to know how I know?
 
The pricing is so stupid

I don’t understand the point of the 570 if the 580 is only $30 more
My guess is these prices are not what we'll see at retail. The OEMs will put zany RGB stuff all over the 580 and overclock it, probably tacking another $30 on there. Or the 570 will go on sale pretty fast and sell for under MSRP.
 
My guess is these prices are not what we'll see at retail. The OEMs will put zany RGB stuff all over the 580 and overclock it, probably tacking another $30 on there. Or the 570 will go on sale pretty fast and sell for under MSRP.

It's not terrible from first party sellers. One just needs to look.

[attached screenshot]
 
Who fooled you into thinking the 4060 is a 1440p card? It's a 1080p card, just like the A770, just like the RX 7600. If you hate yourself and want to punish yourself with ~30 FPS, that's your problem.

Most games that I'm aware of have a settings menu where you can tweak things to get higher frame rates. You should check it out.

If I'm trying to experience everything the dev made, I don't want to turn down settings; I want to experience it in all its glory. However, if I couldn't afford it, I'd play at 1080p.

With a 6900 XT? You must play the latest games at 25 FPS if you don't turn down settings.
 
Are you back on this again? You troll every single Intel driver post, bro; that's wild.

Meanwhile, the Intel Arc driver download page, as it has looked forever, including as of about a week ago:


[attached screenshot]
https://downloadmirror.intel.com/839169/ReleaseNotes_101.6299.pdf Point me to where the Iris Xe and other Gen 12.1 iGPU fixes are in there. I welcome you to check older changelogs too. Or was suddenly losing support for Forza Horizon 5 an illusion?
Nothing is worse than the blind who refuse to see.

Intel cut support to focus only on the desktop cards, screwing every owner of integrated and DG1 cards before Xe2 was released. Somehow this is news to you? Such a great moderator we have here.
 
Most games that I'm aware of have a settings menu where you can tweak things to get higher frame rates. You should check it out.



With a 6900 XT? You must play the latest games at 25 FPS if you don't turn down settings.
I was running three 1440p monitors and just bought a 4K OLED monitor about 10 days ago. The 5090 is rumored to push respectable frame rates at 4K. I bought the MSI Carbon X870E and some G.Skill 8000 MT/s RAM, and I'm waiting on the 9950X3D. So yeah, my current setup is going to my son's computer; I bought him a 1440p AOC OLED. For now, I'm limited to Overwatch 2 and WoW until the upgrade is complete.
 
Must have Resizable BAR? Guess I can't consider one of these without a full upgrade; I'll just hope it helps drag prices down across the board.
At some point, you have to let old technology go. ReBAR support with no user intervention started with Intel 10th gen and AMD Zen+. If you're willing to spend a little time, the ReBarUEFI project will add ReBAR support all the way back to 4th-gen Intel.

I think I hurt some feelings... so, so sorry. However, I have zero faith in Intel GPUs and their drivers. Do y'all really think Intel is seriously invested in this space? I don't. It looks to me like an attempt at a quick cash grab that backfired.
I've been pretty pleased with my A770. I always find it interesting that the folks who don't actually own one of these cards make the most sweeping statements.

"Running fine" is very subjective; considering it has lower performance than a 3060 in the new games tested by TPU, and is often close to 3050 performance, it's not "running fine" for me.
I replaced my 3060 with my A770 and saw a performance increase; the A770 is more of a 1440p card than a 1080p card.

Unlike you, I actually own two Intel cards: my A770 is my daily driver, and my A310 is in my AV1 encoding box.
 
More stagnation in the GPU market: the same performance as previous gens (4060/7600/XT) for roughly the same price, give or take ($250-$300). At this rate I'll have to wait another 3+ years to buy a £400 GPU with a significant performance increase over my RX 6800, which I bought at an already inflated price over 12 months ago. How people call this progress is beyond me; it's been this way for the last 2-3 gens, notwithstanding the new gens about to drop. PC gaming is a joke ATM, specifically GPU performance per cost. But yes, COVID and "inflation" and so on. It used to be called getting shafted; now people willfully bend over to take their punishment while thanking the greedy corporations. At least we can render at 720p and add fake frames :rolleyes:
 
On Black Friday I bought an RX 6650 XT for a secondary rig. So glad I returned it a few days later, not only because the Sapphire Pulse cooling was crap, but because at the same price I can get a new Intel card to play around with and on.
What's so special about cooling a 6650? :rolleyes:
 