
Intel Arc B570 "Battlemage" GPU Details Surface: 18 Xe2 Cores, 10 GB VRAM

AleksandarK

News Editor
Staff member
Intel's upcoming Arc "Battlemage" graphics card lineup has been exposed through a recent ASRock specification sheet leak, showcasing the company's latest products for the discrete GPU market. The leak details two models, the B580 and B570, with the B570 making its first appearance in the rumor mill. The B580, positioned as the flagship model we already covered, features 20 Xe2 cores and comes equipped with 12 GB of GDDR6 memory on a 192-bit interface, delivering up to 456 GB/s of memory bandwidth. Its slightly lower-spec sibling, the B570, sports 18 Xe2 cores and 10 GB of GDDR6 memory on a 160-bit interface, delivering 380 GB/s of bandwidth. In ASRock's case, both cards will be factory-overclocked, with the B580 reaching 2.8 GHz and the B570 hitting 2.6 GHz.
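For reference, the leaked bandwidth figures line up exactly with 19 Gbps GDDR6 on the stated bus widths; here is a quick back-of-the-envelope check (the 19 Gbps per-pin data rate is inferred from the numbers, not something Intel has confirmed):

```python
# Rough sanity check of the leaked memory bandwidth figures.
# Assumes GDDR6 at 19 Gbps per pin, inferred from the leak, not confirmed by Intel.

def gddr6_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

print(gddr6_bandwidth_gbps(192, 19))  # B580: 456.0 GB/s, matches the leak
print(gddr6_bandwidth_gbps(160, 19))  # B570: 380.0 GB/s, matches the leak
```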

The new graphics cards are designed to operate on a PCIe 4.0 x8 interface. Both models will support modern display standards, including DisplayPort 2.1 and HDMI 2.1. Intel has scheduled the official unveiling of the Battlemage series for December 3, with cards expected to hit shelves on December 12. While the B570's pricing remains under wraps, the B580's rumored $249 price tag suggests Intel is making a serious play for the mid-range market segment. This aggressive pricing strategy, combined with the card's promising specifications, indicates Intel's determination to establish itself as a legitimate mid-range competitor in the discrete GPU segment, which NVIDIA and AMD have long dominated.



View at TechPowerUp Main Site | Source
 
In my opinion the offer looks great, especially the media engine, which should be top-notch at this price tag. Of course, let's wait for independent reviews for the more meticulous details. That said, I'm happy to see more competition!
 
now B series:D:D:D:D:D:roll::roll::roll::roll:
 
So we should see reviews in 3 days? Or is it the B580?
 
160 bit memory interface, there's a configuration we haven't seen in a while.
 
Anything less than 12GB of VRAM is a mistake. PS5 Pro is now 13.5GB, so in the future 12GB will not be enough either.

160 bit memory interface, there's a configuration we haven't seen in a while.
Nor should we ever have seen it again... Hopefully
 
So we should see reviews in 3 days? Or is it the B580?
No, I believe that will be later, since the "launch" is technically not until the following week.
 
I was way more interested in their Pro series, but that was mostly a paper launch. The Intel Arc Pro A40/50/60 showed up only in pre-builts or so; they are rare cards in reality.
 
160 bit memory interface, there's a configuration we haven't seen in a while.
Only for a short while, actually. The RX 6700 non-XT and its rebranded RX 6750 GRE 10 GB variant are precisely 160-bit. Would've loved the same to happen to the RX 7600 and RTX 4060 series GPUs, because 128-bit is just not serious at this level of performance, especially with such slow VRAM.
 
Ah, the deal's already getting worse; one day ago it was rumored to be 12 GB at $250.

We lost 2 GB in just two days. I hope Intel releases tomorrow, or it'll be an 8GB card after all.
 
Ah, the deal's already getting worse; one day ago it was rumored to be 12 GB at $250.
B580 was rumoured to have 12 GB; however, 10 GB is rumoured to be in B570. Different model.
 
Only for a short while, actually. The RX 6700 non-XT and its rebranded RX 6750 GRE 10 GB variant are precisely 160-bit. Would've loved the same to happen to the RX 7600 and RTX 4060 series GPUs, because 128-bit is just not serious at this level of performance, especially with such slow VRAM.
Good comparison. Seeing that the A770 trades blows with the 7600 (which in turn trades blows with the 6700), it really makes me wonder where the B570/580 will "fall" in terms of performance (and performance/price).
 
Videocardz also has a leaked B580 spec sheet.

[Attached image: leaked B580 spec sheet]

What confuses me is the number of Intel XMX engines. There are 320 on the B580 but 144 on the B570. There should be the same number per Xe2 core. That would mean 40 Xe2 cores for the B580 if there are 8 XMX engines per core (144/18=8).

Either the 20 Xe2 core rumor on the B580 is wrong or there is a typo in the data sheet? Also the two data sheets are different. One states the number of cores and the other doesn’t.

I guess we will know the full specs tomorrow.
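For what it's worth, here is the arithmetic behind that mismatch, assuming the XMX-per-core count is constant across both SKUs (the 8-per-core figure is derived from the B570 numbers in the leak, not confirmed):

```python
# Consistency check of the leaked XMX counts, assuming a fixed number of XMX engines per Xe2 core.
xmx_b570, cores_b570 = 144, 18
xmx_b580 = 320

xmx_per_core = xmx_b570 / cores_b570          # 8.0 XMX engines per core, per the B570 figures
implied_b580_cores = xmx_b580 / xmx_per_core

print(xmx_per_core)        # 8.0
print(implied_b580_cores)  # 40.0 -- contradicts the 20-core rumor, so one figure is likely a typo
```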
 
Videocardz also has a leaked B580 spec sheet.

[Attached image: leaked B580 spec sheet]
What confuses me is the number of Intel XMX engines. There are 320 on the B580 but 144 on the B570. There should be the same number per Xe2 core. That would mean 40 Xe2 cores for the B580 if there are 8 XMX engines per core (144/18=8).

Either the 20 Xe2 core rumor on the B580 is wrong or there is a typo in the data sheet? Also the two data sheets are different. One states the number of cores and the other doesn’t.

I guess we will know the full specs tomorrow.
Only one more day to wait :)
 
10 GiB of VRAM will, in the long run, even with perfect driver quality and perfect Intel quality, make this a 720p card for retro gaming (part of that is sarcasm about the quality), or a 1080p card with low details and low quality settings.

I do not want to buy any device with HDMI ports anymore. HDMI has to go; the sooner it dies, the better.
 
160 bit memory interface, there's a configuration we haven't seen in a while.
RX 6700 isn't that old. Otherwise it's a pretty rare bus width.

 
Why do all GPUs seem to have 3 DP and 1 HDMI port? There is no advantage in terms of flexibility.
 
Why do all GPUs seem to have 3 DP and 1 HDMI port? There is no advantage in terms of flexibility.
Licensing costs? I think you pay per HDMI connector but I could be wrong.

The extra DP connections are for multiple monitors that either don't support daisy chaining, or whose resolution/refresh rate is too high for an extended desktop when daisy-chained.

Basically, the four connectors give you a guaranteed extended desktop across four high-resolution, high-refresh-rate monitors across most brands and SKUs without specialized hardware.
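As a rough illustration of the daisy-chaining limit (uncompressed pixel rates only, ignoring blanking overhead and DSC; the 1440p/165 Hz monitor is just an example):

```python
# Rough, uncompressed pixel data rate per monitor vs. the payload of a single DP 1.4 (HBR3) link.
# Ignores blanking overhead and DSC, so real-world requirements are somewhat higher.

def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

per_monitor = pixel_rate_gbps(2560, 1440, 165)   # ~14.6 Gbps for a 1440p/165 Hz display
hbr3_payload = 32.4 * 0.8                        # ~25.9 Gbps after 8b/10b encoding overhead

print(per_monitor, hbr3_payload)
print(2 * per_monitor > hbr3_payload)            # True: two such displays won't fit on one DP 1.4 port
```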
 
Licensing costs? I think you pay per HDMI connector but I could be wrong.

The extra DP connections are for multiple monitors that either don't support daisy chaining, or whose resolution/refresh rate is too high for an extended desktop when daisy-chained.

Basically, the four connectors give you a guaranteed extended desktop across four high-resolution, high-refresh-rate monitors across most brands and SKUs without specialized hardware.

Intel does it like the PlayStation 5, using a PCON IC (the RTL2173), which moves the licensing onto that specific product's maker. The cards have an on-board DP-to-HDMI adapter.

Not sure about the new one.
 
Intel does it like the PlayStation 5, using a PCON IC (the RTL2173), which moves the licensing onto that specific product's maker. The cards have an on-board DP-to-HDMI adapter.

Not sure about the new one.
Right. Because the maker pays out of their pocket, they do not pass the cost to Intel which passes it on to you ;)
 
Right. Because the maker pays out of their pocket, they do not pass the cost to Intel which passes it on to you ;)

I hate Hollywood with its royalties, DRM binary crap, and HDMI, where the H stands for that place. And we pay for it all, whether we consume their crap movies or not. That is a full-blown mafia cartel.
 
I hate Hollywood with its royalties, DRM binary crap, and HDMI, where the H stands for that place. And we pay for it all, whether we consume their crap movies or not. That is a full-blown mafia cartel.
Not disagreeing one bit.

And while a few of us can still spot all that on PCs, on mobiles and tablets these things are included without a second thought, and I'm not even sure why. While I personally don't consider movies worth pirating, I know several people who are basically walking IMDbs thanks to all the content that is available out there, regardless of DRM.
 
So the B570 has one 8-pin while the B580 has two 8-pins. That makes me suspect the B580 might have a bit of overkill on the power supply, perhaps because one 8-pin isn't enough and the 6-pin has basically disappeared from the market. That, or the B570 is really going to be a significant drop in performance.
 
So the B570 has one 8-pin while the B580 has two 8-pins. That makes me suspect the B580 might have a bit of overkill on the power supply, perhaps because one 8-pin isn't enough and the 6-pin has basically disappeared from the market. That, or the B570 is really going to be a significant drop in performance.
One never knows. The RX 6700 XT (or RX 6800 non-XT, for that matter) could've made do with a single 8-pin connector with a couple of "ifs," even without dropping TGP or whatnot. Basically, the official threshold is 225 W (despite it being physically possible for a single 8-pin to safely fuel a 350 W GPU with a heavy-duty PSU and AWG16 or thicker cords). It might be that the B570 is just a smidge below that and the B580 is a nickel beyond. Since they're 18 and 20 core SKUs with 160-bit and 192-bit buses respectively, there's nothing to suggest more than a 15% difference in TGP at the same clocks.
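For reference, that 225 W threshold follows from the per-connector limits in the PCIe spec (the actual TGP split between the two cards is still speculation):

```python
# Official PCIe power limits: 75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin connector.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_power_budget(six_pin=0, eight_pin=0):
    return SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

print(board_power_budget(eight_pin=1))  # 225 W ceiling for a single 8-pin card like the B570
print(board_power_budget(eight_pin=2))  # 375 W ceiling for the two 8-pin B580
```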
 