
Intel Rumored to Launch Arc Battlemage GPU With 24GB Memory in 2025

Nomad76

News Editor
Staff member
Intel could be working on a new Arc graphics card, according to Quantum Bits as quoted by VideoCardz. It is based on the Battlemage architecture and carries 24 GB of memory, twice as much as current models. This new card appears to be aimed at professionals rather than gamers. Intel's Battlemage lineup currently consists of the Arc B580 with 12 GB of GDDR6 memory on a 192-bit bus, plus the upcoming B570 with 10 GB on a 160-bit bus. The new 24 GB model will reportedly use the same BMG-G21 GPU as the B580; the extra VRAM could come from higher-capacity memory modules or a dual-sided (clamshell) module layout. No further technical details are available at this moment.

Intel appears to be aiming this 24 GB version at professional tasks such as AI workloads, including large language models (LLMs) and generative AI. The card would be useful in data centers, edge computing, education, and research, which makes sense for Intel, as the company does not yet have a high-memory GPU for the professional productivity market. Intel reportedly wants to launch this higher-memory Arc Battlemage card in 2025; it might be announced in late spring, or ahead of next year's Computex if there is no rush. In the meantime, Intel will keep producing its current gaming cards, as the latest Arc series was very well received, a big win for Intel after its recent struggles. The rumor suggests that Intel is expanding its GPU plans rather than letting them fade away, a scenario that looked plausible before the launch of Battlemage. Now it seems the company wants to compete in the professional and AI acceleration markets as well.



In 2025, Intel plans to launch a larger-memory version of the Battlemage series graphics cards, with capacity increased to 24 GB.
Going forward, the existing version will continue to serve consumer markets such as gaming, while the 24 GB larger-memory version will target the "productivity market."
The target users of the "productivity market" include data centers, edge computing rooms, education and scientific research, and individual developers.
— Quantum Bits

View at TechPowerUp Main Site | Source
 
Hm, for DC I would be surprised if it was "Arc". If it carries the Arc name, it would be the second-gen Arc Pros. Though they would undoubtedly use the BMG cores in Flex and Max DC parts eventually, as they did with Alchemist.
 
Has the potential to eat Nvidia's lunch in regard to professional and semi-professional workloads, particularly AI if priced right. VRAM is the primary bottleneck for AI right now on consumer cards.
 
And they could ask a much higher price (like Nvidia and AMD do) for a 'professional' GPU.
 
Has the potential to eat Nvidia's lunch in regard to professional and semi-professional workloads, particularly AI if priced right. VRAM is the primary bottleneck for AI right now on consumer cards.
AMD's as well. It could be the budget card for LLMs, assuming it's priced right.
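Back-of-the-envelope on why the 24 GB matters for local LLMs. A rough sketch in Python; the model sizes, quantisation widths, and the 1.2x overhead factor are illustrative assumptions, not measured figures:

Code:
# Rough VRAM estimate for LLM inference: weights plus a fudge factor
# for KV cache, activations and runtime overhead.
def estimate_vram_gb(params_billion, bytes_per_param, overhead=1.2):
    weights_gb = params_billion * 1e9 * bytes_per_param / 1024**3
    return weights_gb * overhead

for params in (7, 13, 34):
    for label, bpp in (("FP16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)):
        print(f"{params}B @ {label}: ~{estimate_vram_gb(params, bpp):.1f} GB")

By that rough math, a 13B model at 8-bit fits comfortably in 24 GB with room left for context, while it does not fit in 12 GB at all.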
 
a powerful GPU for a fair price that has "only" 16GB of VRAM would be cooler
It exists and it's called an aftermarket RX 6800 XT/6900 XT.

I'm fairly certain you can get much more from this 192-bit, 12-gigabyte VRAM buffer than Intel has gotten so far. GDDR7 (~30 GT/s) plus doubling the CU count, for starters. It would be reasonable to clock the GPU itself 200 MHz slower so it doesn't burn your house down.
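Quick bandwidth math on that bus; the 19 GT/s GDDR6 figure is my assumption for roughly where the B580 sits today:

Code:
# Peak memory bandwidth: (bus width in bits / 8) * data rate in GT/s = GB/s
def bandwidth_gb_s(bus_bits, data_rate_gtps):
    return bus_bits / 8 * data_rate_gtps

print(bandwidth_gb_s(192, 19))  # ~456 GB/s with 19 GT/s GDDR6
print(bandwidth_gb_s(192, 30))  # ~720 GB/s with ~30 GT/s GDDR7 on the same 192-bit bus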
 
It exists and it's called an aftermarket RX 6800 XT/6900 XT.
Believe it or not, there are problems with relying on second-hand hardware from unknown sellers with unknown histories. Shocking, I know, but many prefer to buy hardware that HASN'T been driven to within an inch of its life.
I'm fairly certain you can get much more from this 192-bit, 12-gigabyte VRAM buffer than Intel has gotten so far. GDDR7 (~30 GT/s) plus doubling the CU count, for starters. It would be reasonable to clock the GPU itself 200 MHz slower so it doesn't burn your house down.
Or you could just learn to plug your power connectors in correctly so they don't melt.
 
Makes sense, it's easy to do, and they can raise the price to the point where it becomes profitable, which gives them an incentive to increase production.

I hope they do it, and also do their best to support AI applications.
 
I have seen gaming cards refitted with more VRAM, which does help in some titles. I have used 16 GB cards, and monitoring VRAM usage is eye-opening.

The Arc Xe graphics in my laptop just use the DDR4 I have stuffed into the machine. I would think the Arc Xe as a discrete GPU with 4 GB or 8 GB of VRAM would have done much more for performance.
 
The new 24 GB model will reportedly use the same BMG-G21 GPU as the B580
I'm not sure the extra VRAM can be put to good use (for gamers).
 
I never got the "intel is dropping dGPU" rumors. Never made sense. Well, unless you're Lisa Su that is... :rolleyes: (scnr)
 
I have seen gaming cards refitted with more VRAM, which does help in some titles. I have used 16 GB cards, and monitoring VRAM usage is eye-opening.

The Arc Xe graphics in my laptop just use the DDR4 I have stuffed into the machine. I would think the Arc Xe as a discrete GPU with 4 GB or 8 GB of VRAM would have done much more for performance.
As a 20 GB 7900 XT owner, I always laugh at claims that 8-12 GB is "enough".

For years I've been seeing 16-18 GB in use when playing at 4K native plus AA and RT.
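For anyone on Linux who wants to see what their card actually reports, the amdgpu driver exposes VRAM counters in sysfs. A minimal sketch, assuming card0 is the discrete GPU (the index can differ on multi-GPU or iGPU systems):

Code:
# Read VRAM usage straight from the amdgpu sysfs counters (Linux, amdgpu driver).
from pathlib import Path

dev = Path("/sys/class/drm/card0/device")
used = int((dev / "mem_info_vram_used").read_text())
total = int((dev / "mem_info_vram_total").read_text())
print(f"VRAM: {used / 1024**3:.1f} / {total / 1024**3:.1f} GiB in use")

Keep in mind this counter reports what the driver has allocated, not necessarily what the game is actively touching.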
 
I never got the "intel is dropping dGPU" rumors. Never made sense. Well, unless you're Lisa Su that is... :rolleyes: (scnr)
Well, given how Intel has dropped or cancelled products, or sold off entire divisions such as storage, I don't have a lot of confidence in Intel continuing the effort, especially with the co-CEOs in charge, who previously sat on the board or in management.
Although I hope Intel continues, because the market really needs more competition, and maybe Intel can take away some of the data center and AI market that Nvidia holds.
Or you could just learn to plug your power connectors in correctly so they don't melt.
Or you could just not accept a flawed power connector; it got updated to be less crappy for a good reason. And funnily enough, so far Intel isn't using the new connector.
 
Well, given how Intel has dropped or cancelled products, or sold off entire divisions such as storage, I don't have a lot of confidence in Intel continuing the effort, especially with the co-CEOs in charge, who previously sat on the board or in management.
Although I hope Intel continues, because the market really needs more competition, and maybe Intel can take away some of the data center and AI market that Nvidia holds.
Don't think of it as what you want. It's a company; they will do what makes sense from a company POV. Intel did not do Battlemage to give consumers cheap GPUs. They're doing it because, mid- to long-term, they expect it to be good business. And that's why it does not make sense to drop the dGPU R&D, not because they want consumers to enjoy cheap hardware.
 
Well, given how Intel has dropped or cancelled products, or sold off entire divisions such as storage, I don't have a lot of confidence in Intel continuing the effort, especially with the co-CEOs in charge, who previously sat on the board or in management.
Although I hope Intel continues, because the market really needs more competition, and maybe Intel can take away some of the data center and AI market that Nvidia holds.

Or you could just not accept a flawed power connector; it got updated to be less crappy for a good reason. And funnily enough, so far Intel isn't using the new connector.

I assume Intel will keep GPUs around because they are another AI product in their portfolio that could allow them to crack that market. Even if they don't earn gobs of money from the gaming side in the short term, the long-term upsides are very appealing from a business perspective. They are probably the last thing you cut in order for the company to recover.

Intel cannot afford to miss another huge market like they did with mobile (tablets, phones, etc.).
 
For the most part, GPU prices were affected by Bitcoin etc., leaving gamers out in the cold
Yup, manufacturers forgot who their primary audience was when idiots bought skids full of them.

And game devs got lazy, so most suffer consolitis.
 
More interested in the higher-end Battlemage cards, but if they manage to get good support going for CAD/CAM software, that extra RAM will be helpful for large projects.
 
More interested in the higher-end Battlemage cards, but if they manage to get good support going for CAD/CAM software, that extra RAM will be helpful for large projects.

Man, yes. I hope they release a B7xx series. As for the ecosystem, I am hopeful it will catch up. Intel and AMD invested HEAVILY in open source; these things trickle down, not to mention they are pretty proactive with the likes of Topaz and Adobe. They have been reaching out to help get support implemented.

I think with CAD specifically, well, you know how it goes with industrial software. The one unfortunate aspect is that by the time they come out with a version that supports it, getting the companies that rely on it to switch might be a lot of trouble. Design firms maybe, but steel mills and industrial ops as a whole move at a snail's pace regardless of OEM backing, unfortunately.

The B series does run cool, though! So I imagine they would make great "get me a picture on the screen" cards like the A310/A380 for office use.
 
I don’t see the point of 24GB on B5. B7 sure, not on B5 though.
 
Intel and AMD invested HEAVILY in open source

In regard to open source:

The firmware for my Intel Wi-Fi chip and AMD graphics card is closed source. Why else would I need the package "sys-kernel/linux-firmware-20241210-r1"?
Just saying. No firmware - no working network connection for my Intel WLAN. No firmware - only basic VESA (VGA?) compatibility mode for my Radeon 7800 XT.
The same goes for the firmware for processors and mainboards, and for the poor compiler support.

I cannot set a fan curve in Gentoo GNU/Linux for my AMD Radeon 7800 XT graphics card; the last time I tried was in summer 2024. There is no visual tool from AMD - nothing. (A rough sysfs sketch is below.)
My Ryzen 5800X - sold - and my current 7600X have minimal GCC optimisation support. I would expect more from a company that supposedly supports open source heavily. From my point of view, AMD offers only a little open source support.
I want to see coreboot on AMD mainboards.
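For what it's worth, the amdgpu hwmon interface does accept a manual fan duty cycle on some cards. A very rough sketch, assuming root, card0 being the Radeon, and a card/kernel combination that permits manual pwm1 control at all (not every one does); a real fan curve would obviously be smarter than this linear ramp:

Code:
# Crude manual fan "curve" via the amdgpu hwmon interface (Linux, run as root).
import glob, time

hwmon = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")[0]

def write(name, value):
    with open(f"{hwmon}/{name}", "w") as f:
        f.write(value)

write("pwm1_enable", "1")  # 1 = manual fan control, 2 = automatic
while True:
    temp_c = int(open(f"{hwmon}/temp1_input").read()) / 1000  # reported in millidegrees
    pwm = min(255, max(64, int(temp_c * 3)))                  # linear ramp, clamped to roughly 25-100 %
    write("pwm1", str(pwm))                                   # duty cycle, 0-255
    time.sleep(2)

Writing "2" back to pwm1_enable hands control back to the automatic firmware curve.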

--

Back to topic: when the price is right and the software works, some will buy those cards.
 
In regard to open source:

The firmware for my Intel Wi-Fi chip and AMD graphics card is closed source. Why else would I need the package "sys-kernel/linux-firmware-20241210-r1"?
Just saying. No firmware - no working network connection for my Intel WLAN. No firmware - only basic VESA (VGA?) compatibility mode for my Radeon 7800 XT.
The same goes for the firmware for processors and mainboards.

I'm sorry to hear that; my AX210 is affected as well, but this thread is about Arc B-series cards. The driver teams are not the same, and the contributions to GPUs are easily tracked in the kernel mailing list and on sites like Phoronix, as I'm sure you know. While I appreciate the extra detail in most things, let's try to keep it on topic.
 
As a 20 GB 7900 XT owner, I always laugh at claims that 8-12 GB is "enough".

For years I've been seeing 16-18 GB in use when playing at 4K native plus AA and RT.
Allocation is not utilisation.
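The same distinction shows up outside games. PyTorch's caching allocator, for example, keeps VRAM reserved after tensors are freed, so monitoring tools report far more "in use" than the workload needs at that instant. A small illustration, assuming a CUDA or ROCm build of PyTorch and a GPU with a few spare GB:

Code:
import torch

# Allocate ~4 GiB of FP32, then free it and compare the two counters.
x = torch.empty(1024, 1024, 1024, device="cuda")
print("allocated:", torch.cuda.memory_allocated() / 1024**3, "GiB")  # live tensor data
print("reserved: ", torch.cuda.memory_reserved() / 1024**3, "GiB")   # held by the allocator
del x
print("reserved after free:", torch.cuda.memory_reserved() / 1024**3, "GiB")  # still claimed
torch.cuda.empty_cache()  # only now is the VRAM actually handed back to the driver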
 