
Intel to Announce Arc Battlemage on December 3rd, With Availability and Reviews Expected on December 12th

GFreeman

News Editor
Staff member
Joined
Mar 6, 2023
Messages
1,901 (2.40/day)
According to the latest information from Videocardz, the Intel Arc Battlemage announcement and launch could be closer than expected. The official announcement of the first two SKUs, the Arc B580 and the Arc B570, is apparently scheduled for December 3rd, with availability and first reviews coming on December 12th.

Intel is expected to announce and launch two mid-range SKUs, the Arc B580 and the Arc B570, and has yet to give any information on the rest of the lineup, including higher-end as well as entry-level SKUs. The Arc B580 SKU has been leaked recently and is said to feature 20 Xe2-cores with a GPU clock of 2.8 GHz. The board comes with 12 GB of 19 Gbps GDDR6 memory on a 192-bit memory interface and needs two 8-pin power connectors. The price is said to be set at around $250, according to rumors and listings spotted earlier. The second SKU, the Arc B570, is rumored to pack 18 Xe2-cores, while the rest of the information is still unknown.
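If the rumored memory spec holds, the peak bandwidth is easy to sanity-check: per-pin data rate times bus width, divided by 8 bits per byte. A minimal sketch, using only the rumored figures from the article (19 Gbps GDDR6, 192-bit bus), not confirmed specs:

```python
# Theoretical peak memory bandwidth from rumored B580 figures.
# data_rate_gbps: per-pin data rate in Gbps; bus_width_bits: memory bus width.
def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8  # divide by 8 bits per byte

print(peak_bandwidth_gb_s(19, 192))  # 456.0 GB/s
```

That would put the rumored B580 at 456 GB/s of raw bandwidth, comfortably above typical 128-bit midrange boards.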



As detailed by Videocardz, availability is expected on December 12th, and Intel plans to split the review embargo: its own Limited Edition versions on that same date, with custom versions from its AIB partners following on December 13th.
Intel Arc Battlemage Embargo Dates
  • December 3rd: Announcement
  • December 12th: Launch, Reviews (Intel Limited Edition)
  • December 13th: Board Partner Card Reviews

View at TechPowerUp Main Site | Source
 
Well, hopefully this time the drivers will be on point Day 0. Intel already has an uphill battle to fight and are essentially a generation behind, no need to shoot themselves further in the foot.
 
Hope the performance is great enough, and the drivers too :toast:
 
Well, hopefully this time the drivers will be on point Day 0. Intel already has an uphill battle to fight and are essentially a generation behind, no need to shoot themselves further in the foot.
Alchemist has hardware flaws that made driver work really difficult, because it needed game-by-game optimization. Battlemage should reduce such cases a lot thanks to SIMD16 and support for instructions like ExecuteIndirect, among many other improvements. It won't be perfect, but Day 0 will be a lot better. Performance at lower settings and resolutions will also be better on Battlemage, so you won't always need performance-killing 1440p and 4K resolutions to perform well relative to the competition. This is one reason why Alchemist does so well in Time Spy but falls apart in games.

And if they fix ReBAR, performance and compatibility will improve further; there are cases where it only works partially, hurting performance even on the latest systems that support it.

The only problem with Battlemage is that the B580 will be the highest-end model until roughly Q2 of next year. This is likely why they have been so quiet about it: nothing to brag about.
 
This needs to offer the performance of a 4070 and would be a game changer if it does! The GPU market needs this so bad.
 
This needs to offer the performance of a 4070 and would be a game changer if it does! The GPU market needs this so bad.
These are the mid-end cards, the 700-series will arrive later. But otherwise I truly agree.
 
This is probably why NVIDIA wanted to start launching their cards in December 2024 but pushed it back; they realized it's only Intel cards launching. I actually hope Intel brings out such a beast that it actually hurts nGreedia for their arrogance; we have seen such things before. :)

Fun times, looking forward to seeing what they can do. :)
 
This needs to offer the performance of a 4070 and would be a game changer if it does! The GPU market needs this so bad.
A=Alchemist; B=Battlemage
Launched:
A770 - around 4060 performance
A750
A580
A350
A310

To be launched:
B770 - possibly 4070 performance or better
B750
B580 - Dec 12
B570 - Dec 12
B350
B310
 
A=Alchemist; B=Battlemage
Launched:
A770 - around 4060 performance
A750
A580
A350
A310

To be launched:
B770 - possibly 4070 performance or better
B750
B580 - Dec 12
B570 - Dec 12
B350
B310
Two days ago there was a review of the A770, and it for sure was not "around" 4060 performance but slower. It was around 3060 level.

B770 will be lucky if it reaches 3070 performance
 
We need good cards at or around $250 again. So please be good.
 
Intel should focus on driver quality for existing products. On the hardware quality and production quality especially for SSDs, processors, wlan, ethernet.

Just dumping out hardware so it's sold with poor quality is not a long term strategy.
 
I hope they deliver power-save modes, so I could use the smallest B310 as a pure accelerator card for AI or video encoding.
 
Two days ago there was a review of the A770, and it for sure was not "around" 4060 performance but slower. It was around 3060 level.

B770 will be lucky if it reaches 3070 performance
I replaced my 3060 with the a770 & I saw a pretty hefty improvement. I am running 1440p high in my games.

Currently, the a770 sits between the 3060ti & the 3070, according to the folks on YT that benchmark the a770 & the a750 every time there is a driver update.

Really looking forward to the Battlemage cards.
 
Some people see a difference between a 3060, 3060 Ti and 3070. I see them all as low-end, low-budget, aging entry gaming cards for 1080p. I think the 3070 was that VRAM-starved card which ran into issues at 1440p quite soon. A very nice 720p card for Counter-Strike; enough VRAM for that.

A few frames better or worse, depending on the Windows game, Windows version, processor, RAM, Windows patch level and driver version. Basically all in the same bracket: low-entry cards. Do not buy, because of low graphics memory or a weak graphics chip. If you are into retro gaming for games from around 2010-2017, go for those cards, especially with screen resolutions from 2010.

Yes, ray tracing may change some aspects for some consumers.

 
That 2x 8-pin requirement seems wild for a low-end/midranger. I'm really curious about its performance (and efficiency); hope it'll be comparable to similar-tier AMD cards at least.
 
I’m hopeful there will be some good here. Right now I’m on a 5600XT, which does a commendable job most of the time, but the 6GB is starting to show some age. A good mid-ranger at or around $250 would be a nice upgrade for me at this point. If nothing else, I hope it creates some price pressure on the 6700xt/7700XT and 6650XT/7600XT. Heck, I’m not above going with one of these BM cards, provided the drivers are sufficient and it’s not a power hog. I’m very unlikely to drop over $250 on a GPU.
 
From the same review, but at 1440p

at 4K the A770 is less than 10% slower than the 3060Ti and 6700XT.
That's because Alchemist requires high utilization to perform to its full potential. Battlemage will mitigate those issues significantly, so it'll perform better relative to Alchemist at 1080p and lower settings.
 
From the same review, but at 1440p
at 4K the A770 is less than 10% slower than the 3060Ti and 6700XT.
It's still 34% slower than the 3070, a GPU built on an inferior node and with a smaller die: the 3070 is 392 mm², the A770 406 mm².

No amount of "but in this situation with these settings while balancing a plate on your head" covers Intel here. It's sink or swim time. Intel either needs to make some rather extreme improvements with Battlemage and catch up, or they will perpetually sell so few that they'll waste money on every generation that comes out until it gets axed. TSMC space ain't cheap, and Intel can't make a profit on discounted third-rate chips.

They have some good tech with XeSS and their RT implementation; they just have to show that $18 billion R&D budget isn't for show and that they can, in fact, fix their designs.
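Taking that post's own numbers at face value (a 34% performance deficit at the quoted settings, 392 mm² vs. 406 mm² dies), a quick back-of-the-envelope calculation shows roughly how large the performance-per-area gap would be. This is an illustration of the argument, not a measurement:

```python
# Illustrative perf-per-die-area comparison using the figures quoted above:
# A770 at ~66% of 3070 performance; die sizes 392 mm^2 vs 406 mm^2.
a770_rel_perf = 1.0 - 0.34          # A770 relative performance vs. 3070
ppa_3070 = 1.0 / 392                # relative performance per mm^2, 3070
ppa_a770 = a770_rel_perf / 406      # relative performance per mm^2, A770
advantage = ppa_3070 / ppa_a770 - 1.0
print(f"3070 perf/mm^2 advantage: ~{advantage * 100:.0f}%")  # ~57%
```

In other words, on those numbers the 3070 extracts over half again as much performance from each square millimeter of (cheaper-node) silicon, which is the cost problem being described.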
 
From the same review, but at 1440p
at 4K the A770 is less than 10% slower than the 3060Ti and 6700XT.
Sadly there is no 16K; it would be a great slideshow. Next time you show 1440p, pick a card at least as fast as a 3060 Ti... oh wait, it's 2025. Pick a card at least as fast as a 3070.
 
We need these cards to do well, to have some sense of a respectable mid- and low-end product in users' hands.

I already got burned by a Gen12 GPU from Intel and their "we will never fix it" driver mentality; that has to change with this release.
 
Sadly there is no 16K; it would be a great slideshow. Next time you show 1440p, pick a card at least as fast as a 3060 Ti... oh wait, it's 2025. Pick a card at least as fast as a 3070.
Why? Current midrange should certainly be able to run 1440p. It's not the world's problem that Nvidia still wants to make 1080p cards for the midrange by handicapping their memory or bandwidth hard. The whole point was that the 4060 and 3070 are both incredibly poor cards, and Intel kinda already has a better offering there, especially when also weighing the current pricing. Rather than strangely trying to defend some of the worst cards Nvidia has released lately, you should be thinking about how Intel is already competitive(-ish) there and what that really says about Nvidia's offering.

In case you missed it, the TPU bench suite is full of older games, and new ones trickle in slowly, quite akin to the natural order of things. People don't play exclusively RT on ultra at 1440p, but they can easily play at 1440p regardless, and are unlikely to chase the latest triple-A.
 
Why? Current midrange should certainly be able to run 1440p. It's not the world's problem that Nvidia still wants to make 1080p cards for the midrange by handicapping their memory or bandwidth hard. The whole point was that the 4060 and 3070 are both incredibly poor cards, and Intel kinda already has a better offering there, especially when also weighing the current pricing. Rather than strangely trying to defend some of the worst cards Nvidia has released lately, you should be thinking about how Intel is already competitive(-ish) there and what that really says about Nvidia's offering.

In case you missed it, the TPU bench suite is full of older games, and new ones trickle in slowly, quite akin to the natural order of things. People don't play exclusively RT on ultra at 1440p, but they can easily play at 1440p regardless, and are unlikely to chase the latest triple-A.
Sure, when you get an Intel GPU you don't chase new games, because you end up with unplayable framerates. You can always play at low settings in 4K, because the card is soooo gooooood at that resolution. About the 4060 and 3070: you probably didn't realize that the 3070 is four years old, and the 4060 is an entry-level card that costs $300 and will soon be replaced. Imagine competing with Nvidia on price/perf; how terrible is your product? AMD has a better offering at every price level, but nobody notices this. What has Intel? Ultra trash: slower, power hungry, with worse price/perf, but but but third playerrrrrrrr
 
Sure, when you get an Intel GPU you don't chase new games, because you end up with unplayable framerates. You can always play at low settings in 4K, because the card is soooo gooooood at that resolution. About the 4060 and 3070: you probably didn't realize that the 3070 is four years old, and the 4060 is an entry-level card that costs $300 and will soon be replaced. Imagine competing with Nvidia on price/perf; how terrible is your product? AMD has a better offering at every price level, but nobody notices this. What has Intel? Ultra trash: slower, power hungry, with worse price/perf, but but but third playerrrrrrrr
Well, the point of that bench comparison right there, and the fact that the Intel card runs away from the 4060 at 1440p, tells you that the card IS indeed better suited and poised to not give you a 1080p midrange card anymore. It's not optimized around that level, while the 4060 apparently is... it falls apart above 1080p.

At the same time, this is the segment where AMD produces heavily neutered cards too. Intel, just by way of inefficiency and lower refinement of their products, throws more hardware at the problem at the same price point, and it shows, especially if you up the res. So yeah, if they bring us a 250-buck 12 GB card, that's something to write home about when the competition has an offering at 299 with 8 GB and abysmal bandwidth, which also shows in benchmarks. It's simply more GPU for your dollar, something Nvidia and AMD have been extremely crafty at cutting away from us. AMD has a better offering at every price level, you say, but I'm not so sure that's true; AMD's entry/midrange has been messy too.

And then there's the age aspect... surely you too have noticed how generations are not only spaced further apart, but also barely improve the net performance per tier or price point these days? It's a perfect environment for Intel to play catch-up.

It's still 34% slower than the 3070, a GPU built on an inferior node and with a smaller die: the 3070 is 392 mm², the A770 406 mm².

No amount of "but in this situation with these settings while balancing a plate on your head" covers Intel here. It's sink or swim time. Intel either needs to make some rather extreme improvements with Battlemage and catch up, or they will perpetually sell so few that they'll waste money on every generation that comes out until it gets axed. TSMC space ain't cheap, and Intel can't make a profit on discounted third-rate chips.

They have some good tech with XeSS and their RT implementation; they just have to show that $18 billion R&D budget isn't for show and that they can, in fact, fix their designs.
It's not just about what Intel does, though; a big aspect here is also what Nvidia does, and a much smaller aspect what AMD does.
 