
Intel Arc "Battlemage" Graphics Card with 12GB of 19 Gbps GDDR6 Memory Surfaces

btarunr

Editor & Senior Moderator
Staff member
A prototype discrete GPU based on the Intel Arc "Battlemage" graphics architecture was spotted in a public boot log by the Intel GFX Continuous Integration (CI) group, which is probably testing the prototype with a Linux driver. The OS loads the driver at boot, which prints a few messages to the boot log, including an explicit mention of "Battlemage" as "BMG." The log also lists a memory size of 12 GB, a memory speed of 19 Gbps, and a 192-bit memory bus width.

It is hence likely that this is a mid-tier GPU from the series, with the top-tier one probably featuring a 256-bit memory interface. This aligns with Intel's strategy of targeting the bulk of the gaming graphics market instead of gunning for the enthusiast class. The new "Battlemage" architecture is expected to make Intel competitive with rival architectures in the segment, such as NVIDIA "Ada" and AMD RDNA 3, although it remains to be seen whether it can square off against the next-generation NVIDIA "Blackwell" and AMD RDNA 4.
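For reference, the per-pin speed and bus width in the boot log pin down the card's peak memory bandwidth. A quick sketch of the standard calculation follows; the 192-bit figures come from the boot log, while the 256-bit case is just the speculated top-tier configuration, not a confirmed part:

```python
# Peak memory bandwidth (GB/s) from per-pin data rate and bus width:
# data rate (Gbit/s per pin) x bus width (bits) / 8 bits-per-byte.
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

# The 19 Gbps / 192-bit configuration from the boot log:
print(memory_bandwidth_gbs(19, 192))  # 456.0 GB/s

# A hypothetical 256-bit top-tier part at the same memory speed:
print(memory_bandwidth_gbs(19, 256))  # 608.0 GB/s
```

At 456 GB/s, the leaked configuration would sit squarely in current mid-range territory, consistent with the article's read of it as a mid-tier part.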



View at TechPowerUp Main Site | Source
 
I wish Intel were able to compete in the high-end segment so as to make that segment cheaper, but this is better than nothing.
 
12GB is a bit anemic for people who spend money on a dedicated graphics card.
I hope they have something above 24 GB.
 
12GB is a bit anemic for people who spend money on a dedicated graphics card.
I hope they have something above 24 GB.
"It is hence likely that this is a mid-tier GPU from the series..."
 
My GPUs are mainly for TFLOPS, as they mostly do BOINC calcs... why no mention of the speed of the new Intel Xe? :confused:
 
12GB is a bit anemic for people who spend money on a dedicated graphics card.
I hope they have something above 24 GB.

There's no point in packing wasted VRAM onto a GPU that won't be fast enough to allow settings that could make use of it anyway. The whole idea of a GPU in this class is to keep costs down so that it is affordable to the masses. This might even be comparable in GPU performance to the lower mid-range entries of Nvidia's and AMD's next generations. Just a guess, though.
 
There's no point in packing wasted VRAM onto a GPU that won't be fast enough to allow settings that could make use of it anyway. The whole idea of a GPU in this class is to keep costs down so that it is affordable to the masses. This might even be comparable in GPU performance to the lower mid-range entries of Nvidia's and AMD's next generations. Just a guess, though.
What would be the point of a graphics card that can't even use 12GB? Just get a used card or run the iGPU.
 
I wish Intel were able to compete in the high-end segment so as to make that segment cheaper, but this is better than nothing.
We do need more competition in the GPU market. I just wish it were from someone other than Intel. Given their latest track record, any move on Intel's part will be a wasted effort.
 
We do need more competition in the GPU market. I just wish it were from someone other than Intel. Given their latest track record, any move on Intel's part will be a wasted effort.
They had issues with one generation of chips (two in name), which should be solved now with the patch, and they ironed out an advanced graphics line in one generation in a way that only a behemoth like Intel could have done.

It's 2024, not 1990, when drawing an 8-bit sprite a millisecond faster than the competition gave you an edge in the market.
There is no one else who can enter the market in a significant manner.

Intel has done a load of things that are bad.
The stuff they are being crapped on for now is still BS, considering they offered a fix and restitution for those affected.
 
They had issues with one generation of chips (two in name), which should be solved now with the patch, and they ironed out an advanced graphics line in one generation in a way that only a behemoth like Intel could have done.

It's 2024, not 1990, when drawing an 8-bit sprite a millisecond faster than the competition gave you an edge in the market.
There is no one else who can enter the market in a significant manner.

Intel has done a load of things that are bad.
The stuff they are being crapped on for now is still BS, considering they offered a fix and restitution for those affected.
Yes, it's 2024. AMD is worth over 2.5x the value of Intel and has broader name recognition. Intel CPUs are unstable and failing. They are laying off thousands of workers. Intel product postponements and cancellations are the new norm, and innovation showcases are being cancelled. The company is hemorrhaging money left and right, profit margins are falling, and dividends have been cancelled. Intel no longer has one competitor (AMD) like in the past, but dozens (ARM, TSMC, Apple, AMD, Qualcomm, Nvidia, etc.).

Your description of Intel has no basis in reality.

Edit: Apple entered the market with a super-fast GPU solution. There is no reason why Samsung, Qualcomm, Broadcom, Google, Tesla, etc. couldn't do the same if they wanted to. Even ARM and Imagination Technologies could have a go. Most here see Nvidia, AMD, and Intel as the only players in chip tech. It's a shame, really, because so much cool stuff, including graphics, is coming from others.
 
Last edited:
Yes, it's 2024. AMD is worth over 2.5x the value of Intel and has broader name recognition. Intel CPUs are unstable and failing. They are laying off thousands of workers. Intel product postponements and cancellations are the new norm, and innovation showcases are being cancelled.

Your description of Intel has no basis in reality.
Market cap and actual 'value' have little to do with one another.
AMD is an extremely overvalued bubble (like many of the companies in that list, and we're in for a 2008-style correction).
Don't look at those numbers; they are only interesting if you are an investor, or the company itself wanting to borrow money for a project.

Intel is a massive ~120,000-person company doing a boatload of projects; they can fire 23,000 people (the size of AMD) and still be a massive company with engineers to spare.
And it's still doing OK compared to the state AMD was in after a flop.

But that wasn't what I was trying to say. Just AMD and Nvidia isn't serious competition, which should be obvious by now; an ideal situation would be 5+ competitors.
But there is no way we are ever going to see 5 companies competing for that market.
Having Intel as a third player is a good thing. It would be cool if, say, Samsung or Huawei were to buy Imagination Technologies and enter the discrete graphics market, but I can't see that happening.

And sure, it would be nice if AMD were to have some success with RDNA 4, but since it's AMD, they are going to price it over what Nvidia is asking, just like they have always demanded more than what Intel did whenever they happened to have something that performed better.
 
Market cap and actual 'value' have little to do with one another.
AMD is an extremely overvalued bubble (like many of the companies in that list, and we're in for a 2008-style correction).
Don't look at those numbers; they are only interesting if you are an investor, or the company itself wanting to borrow money for a project.

Intel is a massive ~120,000-person company doing a boatload of projects; they can fire 23,000 people (the size of AMD) and still be a massive company with engineers to spare.
And it's still doing OK compared to the state AMD was in after a flop.

But that wasn't what I was trying to say. Just AMD and Nvidia isn't serious competition, which should be obvious by now; an ideal situation would be 5+ competitors.
But there is no way we are ever going to see 5 companies competing for that market.
Having Intel as a third player is a good thing. It would be cool if, say, Samsung or Huawei were to buy Imagination Technologies and enter the discrete graphics market, but I can't see that happening.

And sure, it would be nice if AMD were to have some success with RDNA 4, but since it's AMD, they are going to price it over what Nvidia is asking, just like they have always demanded more than what Intel did whenever they happened to have something that performed better.
Business is business. It is very possible that Intel cancels its entire GPU effort as it cuts costs, which would mean no Battlemage. You cannot dismiss current events because they are inconvenient. Real people are behind these real business decisions. As they lose cash quarter over quarter, Intel will be faced with some hard decisions.
 
Yes, it's 2024. AMD is worth over 2.5x the value of Intel and has broader name recognition. Intel CPUs are unstable and failing. They are laying off thousands of workers. Intel product postponements and cancellations are the new norm, and innovation showcases are being cancelled. The company is hemorrhaging money left and right, profit margins are falling, and dividends have been cancelled. Intel no longer has one competitor (AMD) like in the past, but dozens (ARM, TSMC, Apple, AMD, Qualcomm, Nvidia, etc.).

Your description of Intel has no basis in reality.

Edit: Apple entered the market with a super-fast GPU solution. There is no reason why Samsung, Qualcomm, Broadcom, Google, Tesla, etc. couldn't do the same if they wanted to. Even ARM and Imagination Technologies could have a go. Most here see Nvidia, AMD, and Intel as the only players in chip tech. It's a shame, really, because so much cool stuff, including graphics, is coming from others.
None of the companies you mentioned would go at it the way you want, though. I would be very surprised not to see them push an SoC solution for a high-performance laptop over a dGPU. The dGPU market has been culled, and any newcomer would have a hard time gaining the trust of gamers. IIRC, you were among the people who were very suspicious of Qualcomm entering the mainstream laptop market with a good product, so it's a bit funny to see you flip your mindset for a dGPU coming from them :D.
If Windows on ARM doesn't take off, QC is just going to dip out. The dGPU market isn't a viable fallback.

Realistically, we shouldn't expect someone else to try to get into that market unless they have had a strong presence in the mainstream for at least a decade. Even when Intel was at its peak, they were very cautious about entering that market.
Apple's GPUs are fast for an SoC, but they still have a steep hill to climb before they can hang with Nvidia's and AMD's high end. And that's with their virtually infinite R&D money and status as TSMC's favorite customer.
 
AMD is an extremely overvalued bubble
If AMD is "extremely overvalued", I wonder what opinion you have of Nvidia, with over 10 times the market valuation.
 
There's no point in packing wasted VRAM onto a GPU that won't be fast enough to allow settings that could make use of it anyway. The whole idea of a GPU in this class is to keep costs down so that it is affordable to the masses. This might even be comparable in GPU performance to the lower mid-range entries of Nvidia's and AMD's next generations. Just a guess, though.

Well, that argument of no need for more RAM comes from another era, when the main limiting factors for performance were fill rate and texture-mapping rate.

These days, managing more textures isn't really an issue anymore. The performance impact of using more textures is minimal, and it's still one of the greatest visual upgrades you can get in any game. The fact that memory sizes have stayed relatively flat for the last few generations is one of the reasons why games today don't look that much different from a few years ago, when the first 8 GB+ GPUs were launched.

The top-end GPUs might have more, but they are probably still too rare for developers to spend a lot of budget on.

Anyway, a low-end GPU with more VRAM will be able to use higher texture settings and get better visuals. It might also be less subject to stutter, because most low-end GPUs have slower PCI-E links.
 
Cool! Looking forward to the second gen. What their teams did with Alchemist's performance was crazy. Those guys deserve all the pizza parties.
 
None of the companies you mentioned would go at it the way you want, though. I would be very surprised not to see them push an SoC solution for a high-performance laptop over a dGPU. The dGPU market has been culled, and any newcomer would have a hard time gaining the trust of gamers. IIRC, you were among the people who were very suspicious of Qualcomm entering the mainstream laptop market with a good product, so it's a bit funny to see you flip your mindset for a dGPU coming from them :D.
If Windows on ARM doesn't take off, QC is just going to dip out. The dGPU market isn't a viable fallback.

Realistically, we shouldn't expect someone else to try to get into that market unless they have had a strong presence in the mainstream for at least a decade. Even when Intel was at its peak, they were very cautious about entering that market.
Apple's GPUs are fast for an SoC, but they still have a steep hill to climb before they can hang with Nvidia's and AMD's high end. And that's with their virtually infinite R&D money and status as TSMC's favorite customer.
I do agree that companies like Qualcomm are less likely to enter the dGPU market, and sure, they have a huge uphill battle when it comes to the SoC laptop space. My argument is that Intel is now just as unlikely to succeed in this market as companies like Qualcomm, given the current state of their business. I am not holding my breath that Battlemage and its successors will be successful, so I'm hoping someone else will bring more competition to the dGPU space.
 
I'm curious: why pollute the topic with discussion of scandals and stock-market information? This thread is about a specific future product, which in my opinion would be interesting if, depending on its class (presumably mid-range), it has a price similar to the previous generation, and especially if it launches with good-enough drivers and keeps gaining performance throughout its time on the market.
 
They had issues with one generation of chips (two in name), which should be solved now with the patch, and they ironed out an advanced graphics line in one generation in a way that only a behemoth like Intel could have done.

It's 2024, not 1990, when drawing an 8-bit sprite a millisecond faster than the competition gave you an edge in the market.
There is no one else who can enter the market in a significant manner.

Intel has done a load of things that are bad.
The stuff they are being crapped on for now is still BS, considering they offered a fix and restitution for those affected.
I personally do not hate Intel, as I am using an Intel chip, but they did some shady stuff in the past, like paying companies not to sell AMD. I'm not saying AMD or Nvidia are clean, as they also did shady stuff, but Intel will do anything to stay ahead. Competition is good and I love it, as it gives you a choice, but do it in a fair manner. I have seen how the top companies insult each other, etc. Gawd, you're grownups. Act like it. They bicker like children trying to out-sell each other...

One thing that what's-his-name, Jetson or something, said: AMD has the best marketing strategy, as no matter how bad some releases are, they still sell.
 
There's no point in packing wasted VRAM onto a GPU that won't be fast enough to allow settings that could make use of it anyway. The whole idea of a GPU in this class is to keep costs down so that it is affordable to the masses. This might even be comparable in GPU performance to the lower mid-range entries of Nvidia's and AMD's next generations. Just a guess, though.

It's amazing how many people believe the myth that you need a certain level of performance to utilize additional VRAM. It's been debunked time and time again.

It might be true in some extremely rare examples, but HWUB proved that even low-end GPUs benefit from additional VRAM. The simple fact of the matter is, even with a small memory bus, having more data accessible in higher-bandwidth, lower-latency VRAM instead of in main system memory or on disk is going to be superior. You'd literally have to choke the VRAM off to the point of it being slower than accessing main system memory.
 
If AMD is "extremely overvalued", I wonder what opinion you have of Nvidia, with over 10 times the market valuation.
Nvidia doesn't have any serious competition at the moment; that's the thing they have going for them.
It doesn't change the fact that we are in an insane bubble: companies are being traded at rates they couldn't possibly earn back in decades. That doesn't last.

Edit: Apple entered the market with a super-fast GPU solution. There is no reason why Samsung, Qualcomm, Broadcom, Google, Tesla, etc. couldn't do the same if they wanted to. Even ARM and Imagination Technologies could have a go. Most here see Nvidia, AMD, and Intel as the only players in chip tech. It's a shame, really, because so much cool stuff, including graphics, is coming from others.

Apple has Jack ****; no one is playing games on their Apples even now that they have a "super fast GPU", because Apple doesn't bother writing drivers for games, especially older games.
Which is what Intel did and does. Google isn't going to do that either, nor is Qualcomm or Imagination Technologies (some Chinese company is failing hard at it)...
Unless some super mega new thing everyone wants and needs comes along, like the deep-dive VR from the anime 'Sword Art Online', to the point where no one cares about their old games anymore, forget about a new entrant to the GPU market.
 
Intel said they're cutting their "unprofitable" businesses, and Arc certainly qualifies. The chances of this actually shipping as a consumer dGPU are very low.
 
It's amazing how many people believe the myth that you need a certain level of performance to utilize additional VRAM. It's been debunked time and time again.

It might be true in some extremely rare examples, but HWUB proved that even low-end GPUs benefit from additional VRAM. The simple fact of the matter is, even with a small memory bus, having more data accessible in higher-bandwidth, lower-latency VRAM instead of in main system memory or on disk is going to be superior. You'd literally have to choke the VRAM off to the point of it being slower than accessing main system memory.

The Maxwell-era Tesla M40 had 24 GB of VRAM. It's slower than newer GPUs with similar VRAM, but that doesn't mean it can't leverage the VRAM it has available. There are limitations, however, on how well it can do so in particular use cases, but that's a very different topic.
 
Nvidia doesn't have any serious competition at the moment; that's the thing they have going for them.
It doesn't change the fact that we are in an insane bubble: companies are being traded at rates they couldn't possibly earn back in decades. That doesn't last.
I am just wondering: if you're using superlatives like "extremely overvalued" for a company like AMD, what's left for the rest of them? Ultra-giga-hyper overvalued?
 