
Intel Reportedly Abandoned Higher-end Arc Xe2 "Battlemage" dGPU Project Last Year

T0@st

News Editor
Intel GPU enthusiasts have been waiting patiently for news about higher-end models ever since the launch of the wallet-friendly Arc Xe2 "Battlemage" B580 and B570 graphics cards. As the cliché goes, the recent silence has been deafening; we last heard about a speculative expanded lineup of B-series SKUs around late January. At the time, three mysterious "Battlemage" PCI identifiers turned up online, courtesy of Tomasz Gawroński's detective work. Opinions were split about the exact nature of the leaked "BMG" IDs: one camp envisioned Team Blue having professional variants of its existing B580 in the pipeline, presumably with generously specced 24 GB pools of VRAM onboard. A more optimistic group posited that Intel's Arc Xe2 desktop gaming family would welcome more potent "B750," "B770," and (maybe) "B780" SKUs.

Yesterday, Tomasz Gawroński (aka GawroskiT) interacted with another notable source of inside-track information: Jaykihn (jaykihn0). Plenty of Team Blue-related "scoops" have emerged via Jaykihn's social media channel, mostly predictions regarding upcoming desktop, mobile, and enterprise CPUs. Their latest leak indicates that Intel allegedly abandoned a larger, higher-end "BMG-G31" GPU die in the third quarter of 2024. Insiders have long insisted that the Arc Xe2 "Battlemage" project navigated choppy waters during development, hence the endless theories about the whole caboodle being called off. Jaykihn clarified that, in their view, a retail launch of "BMG-G31" dGPUs will no longer take place. Many watchdogs will assume that the gap will be filled by forthcoming Arc Xe3 "Celestial" discrete GPUs; Jaykihn stated that they have no fresh insights into how that project is going.



 
Called it.
 
We high-end buyers just can't catch a break...
 
They might just be skipping the high-end on Battlemage in favor of focusing on Celestial, but it really feels like they're exiting the discrete GPU market altogether - a short-sighted decision if that's the case. Intel GPUs have tremendous growth potential, but they'll never realize that potential without undertaking the same journey AMD and Nvidia did.
 
This has been the rumor ever since the B580 came out. The high-end design missed its targets and would have had to be redone. Hardly worth the trouble unless Intel knew it could sell the chip at a good margin in the year it would take to get a redesign to store shelves, which was highly unlikely. Even then, why go to that trouble when the successor would probably arrive shortly after? The B580 is good for what it is and how it's priced, but even after all this time it's not easy to find in stock, which suggests either that chip isn't yielding well or it's not worth paying for the fab space.
 
Two out of three players are not pursuing xx90 monstrosities anymore. Maybe that's a good sign.

I mean, when a video card can't max out current titles and still costs more than a console, somebody's got to read the writing on the wall, eventually.
 
Anyone who has been paying attention to Intel's MO over the last decade or more knew they weren't in it for the long haul. Intel just isn't built like that; they're a purely MBA-driven, quarter-to-quarter company where anything that doesn't add value to the stock is quickly spun off or axed, which has resulted in Intel being good at one thing and one thing only: CPUs. And now they're not even good at CPUs anymore.

I think we're going to get Celestial, and then MAYYYYYBEEEE Druid out of Intel, in a very Battlemage-esque "see, we launched a few, now go away and leave us alone" fashion, before their dGPU stack gets the kibosh.
 
Two out of three players are not pursuing xx90 monstrosities anymore. Maybe that's a good sign.
So a third of the market is pursuing super-high-margin monster cards? That seems like a pretty lucrative market, and Nvidia has it all to itself...
I mean, when a video card can't max out current titles and still costs more than a console, somebody's got to read the writing on the wall, eventually.
Imagine if this was the attitude when Crysis released.
 
So a third of the market is pursuing super-high-margin monster cards? That seems like a pretty lucrative market, and Nvidia has it all to itself...
Lucrative is debatable (because of volume). The side effect being that any Nvidia card that doesn't cost at least $500 pretty much sucks.
Imagine if this was the attitude when Crysis released.
Back then, it was one game. Nowadays, there are plenty of games where you'll have to sacrifice detail if you're not paying $700 for your video card.
 
Imagine if this was the attitude when Crysis released.
Back then, it was one game. Nowadays, there are plenty of games where you'll have to sacrifice detail if you're not paying $700 for your video card.

Also, folks generally accepted that Crysis would run like poo when maxed out. It was unabashedly a tech demo as much as anything. It also (IIRC, correct me if I'm wrong) ran OK whilst still looking nice on just about anything with the right settings. Now every big release is a Crysis-level card-punishing spectacle and we seem to get angry when those games won't run at 4K120.
 
I really hope they stick to discrete GPUs with Xe³ or Xe³p. Would be sad to see them give up again, since BM isn't that bad and the drivers improved greatly since Alchemist.
 
Also, folks generally accepted that Crysis would run like poo when maxed out. It was unabashedly a tech demo as much as anything. It also (IIRC, correct me if I'm wrong) ran OK whilst still looking nice on just about anything with the right settings. Now every big release is a Crysis-level card-punishing spectacle and we seem to get angry when those games won't run at 4K120.
I don't mind so much that they don't run at 4K@120, but they don't run at 4K@60 either...
 
Also, folks generally accepted that Crysis would run like poo when maxed out. It was unabashedly a tech demo as much as anything. It also (IIRC, correct me if I'm wrong) ran OK whilst still looking nice on just about anything with the right settings. Now every big release is a Crysis-level card-punishing spectacle and we seem to get angry when those games won't run at 4K120.

- IMO the biggest thing was that Crysis was an objectively better looking game by a fair margin than anything else released at the time. Yeah it ran like crap, but it had the right to since nothing else came close in terms of visual fidelity.

Nowadays it's really hard to even tell many ~2025 games apart from ~2015 games in terms of visuals (if you squint, the lighting is a bit better and the textures are higher quality), but the games run like dog crap on a hot day. The graphics aren't getting better, the performance is getting worse, and the cards are getting stupid expensive.

A game from 2016 that runs ~200FPS on my 6800XT:
View attachment 392161


And here is Silent Hill 2 which runs at like 50FPS on the same system. WTF.

View attachment 392162
 
Increasingly powerful GPUs, and game engines that are "good enough" out of the box, have led to a loss in optimization effort. Back then, game makers had to optimize their games so that they could run extremely well on GPUs; now they just rely on increasing brute-force power and fake frames to cover the widening gap in performance quality. It sucks, and I wish more studios would start optimizing games for the mid-range again rather than just the high-end. They used to tune and optimize based on consoles (which used to represent the mid-range in general), but even modern games are struggling on consoles and are barely better on PC, even with better hardware.
 
Ermm, is this news? I thought we already knew Battlemage was not getting any new higher-end models...

Two out of three players are not pursuing xx90 monstrosities anymore. Maybe that's a good sign.

I mean, when a video card can't max out current titles and still costs more than a console, somebody's got to read the writing on the wall, eventually.

Oh, for the love of... look, let's please not confuse optimisation with graphics settings.

If you need a GPU more expensive than a console to get equivalent graphics, then sure, now we have a problem.
But settings are COMPLETELY arbitrary. What, are you happier with GTA Enhanced Edition, a game that adds ray tracing which looks good but is so low-end that just about any RT-capable card can max it?

Would you rather have that than the devs flicking a switch allowing for much higher settings that current GPUs just cannot do in real time, but future ones probably could?

Or do you need devs to go the Avatar route: have the settings in there but hidden, so you need to "hack" the files to expose them?

What is this silly FOMO/ego thing going on where you think you need to max a game? It's not about maxing, it's about the visuals given for the performance required...
 
Also, folks generally accepted that Crysis would run like poo when maxed out. It was unabashedly a tech demo as much as anything. It also (IIRC, correct me if I'm wrong) ran OK whilst still looking nice on just about anything with the right settings. Now every big release is a Crysis-level card-punishing spectacle and we seem to get angry when those games won't run at 4K120.
I think folks get angry because what they get for all these buzzwords and way overpriced cards and software is a pretty poor experience overall.

I can't say the newest games grab me much; rare exceptions only. Even graphically, it's just not really something that keeps me playing. We've already had so many engines doing so much more for us, mostly in terms of interactivity. Where is the innovation? All I see is complexity and features simplified, streamlined, or executed very poorly. There is far too much attention going to things gamers don't really care about.

I mean look at your average action game. Static as fuck. Combat is whacking the same button all the time, interaction is always 'press E here' and barely ever gets past that, and the marker on your map shows you where to go. But it looks better. Yeah, so what?

It's not about maxing, it's about the visuals given for the performance required...
Of course, but there is also an expectation if you have indeed bought a high end GPU.

- IMO the biggest thing was that Crysis was an objectively better looking game by a fair margin than anything else released at the time. Yeah it ran like crap, but it had the right to since nothing else came close in terms of visual fidelity.

Nowadays it's really hard to even tell many ~2025 games apart from ~2015 games in terms of visuals (if you squint, the lighting is a bit better and the textures are higher quality), but the games run like dog crap on a hot day. The graphics aren't getting better, the performance is getting worse, and the cards are getting stupid expensive.

A game from 2016 that runs ~200FPS on my 6800XT:
View attachment 392161

And here is Silent Hill 2 which runs at like 50FPS on the same system. WTF.

View attachment 392162
And yet, the latter is arguably the better game, imho.
It's just a shame that you have to put up with that taxing baseline, right? I mean, sure, if you look hard, there is more detail. But that's never what made this game bone-chilling to play. SH2 is not more immersive than any Silent Hill before it, which ran on systems that probably couldn't even render that puddle right now.

What we got is a more reflective, sharpened puddle at the cost of 4 generations of GPU and about doubled prices. Yay for progress.

I also think we need to reflect on what exactly Crysis added. The biggest thing was the volumetric elements, but also vibrant nature to walk through in first person; gone were the leaves that would just turn into a pixel mess as you closed in, and assets and geometry gained a massive upgrade. That, and the interactivity in the world, combined for something magical. Playing Crysis was like getting a new feel in your gaming, truly a whole new level of fidelity beyond what we had on every metric.
 
This was expected. They need to fix their driver overhead before releasing anything faster, as it would be even worse there; they just wouldn't really get any more performance. I just hope they don't abandon the whole discrete GPU thing.
 
I knew it lol. Nobody starts talking about how excited they are about their next architecture when their current one hasn't even hit its stride.

This is just Intel, being Intel. Bottling it once again.
 
- IMO the biggest thing was that Crysis was an objectively better looking game by a fair margin than anything else released at the time. Yeah it ran like crap, but it had the right to since nothing else came close in terms of visual fidelity.

Nowadays it's really hard to even tell many ~2025 games apart from ~2015 games in terms of visuals (if you squint, the lighting is a bit better and the textures are higher quality), but the games run like dog crap on a hot day. The graphics aren't getting better, the performance is getting worse, and the cards are getting stupid expensive.
There was that, too. It was more of a tech demo, like the Far Cry and Quake series before it.
 
- IMO the biggest thing was that Crysis was an objectively better looking game by a fair margin than anything else released at the time. Yeah it ran like crap, but it had the right to since nothing else came close in terms of visual fidelity.

Nowadays it's really hard to even tell many ~2025 games apart from ~2015 games in terms of visuals (if you squint, the lighting is a bit better and the textures are higher quality), but the games run like dog crap on a hot day. The graphics aren't getting better, the performance is getting worse, and the cards are getting stupid expensive.

A game from 2016 that runs ~200FPS on my 6800XT:
View attachment 392161

And here is Silent Hill 2 which runs at like 50FPS on the same system. WTF.

View attachment 392162
Imagine if you throw on Ray Tracing!
 
IF they're going with Xe3, then this is a very good move for Intel. Let's cross our collective fingers for that.
 
That's not good. If they keep cancelling, there is a realistic chance they will abandon dGPUs altogether again.
 
That's not good. If they keep cancelling, there is a realistic chance they will abandon dGPUs altogether again.
Good grief, I hope not. The Arc cards have been a breath of fresh air in a market that desperately needed one.

@ Intel
Come on guys, pull the rabbit out of the hat and let us enrich you!
 
So a third of the market is pursuing super-high-margin monster cards? That seems like a pretty lucrative market, and Nvidia has it all to itself...

Imagine if this was the attitude when Crysis released.
The situation with the 4090 was absolutely disgusting (and continues to be; check out those eBay prices). The situation with the 5090 is absolutely disgusting. I was disgusted, too, when I saw media sites that are supposed to know better treating the 5090's MSRP without even the slightest grain of salt, playing right along with Nvidia's playbook regardless of how harmful that is, with headlines like:

(attached screenshot of one such headline)


We're supposed to ascribe incompetence rather than malice to headlines like that, but I am more of a cynic.

That capitalism's best "competition" involves a "$2,000" product selling out within 24 hours at $3,700 with no end in sight, while its weaker predecessor sells for the same price on eBay, says a lot about how well that "competition" is set up to work.

Shenanigans have characterized high-end gaming GPUs since Turing. AMD has been out to lunch since Fury. Duopolies are terrible for consumers and monopolies are worse. That AMD isn't even bothering to feign competitiveness with the 4090, 5080, and 5090 should be a red flag. Even though AMD may not be able to compete with CUDA right now, it could sell high-end cards to the gamers who can't get any of the high-end Nvidia cards because of CUDA demand. Anyone who paid attention knew the 4090 sold very well, including long stretches when there was no stock because it kept selling out. All of the desperation involved in people trying to get the Founders Edition through Best Buy is something even someone who isn't paid to write about these things would know if they follow tech even fairly casually.

The key to understanding the 4090 and 5090 most fully is not to focus merely on gaming. Those cards are in hot demand for AI. Amateurs want to be able to run AI models like FLUX1dev, which, as I understand it, requires around 23 GB of VRAM for full quality. Those who want to do that have the following choices:

1) Wait forever for a sane market (even though we're in GPU-maker monopolist conditions coupled with everyone wanting TSMC high-end node capacity).
2) Pay insane prices for a used 3090 that may not work reliably.
3) Pay insane prices for a 4090 (hoping that it's actually a 4090) that may not work reliably or, if new, will cost as much as an inflated 5090.
4) Pay insane prices for a 5090 if one manages to ever find one in stock.

Then, there are likely businesses that want them.
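
To put the VRAM point in concrete terms, here is a minimal sketch of what running FLUX.1-dev locally typically looks like, assuming the Hugging Face diffusers FluxPipeline API and the black-forest-labs/FLUX.1-dev checkpoint; exact memory figures vary with precision and offloading settings, so treat the numbers as rough:

```python
import torch
from diffusers import FluxPipeline

# FLUX.1 [dev] has a ~12B-parameter transformer; in bfloat16 the weights alone
# land in the ~24 GB range, which is roughly the figure quoted above.
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    torch_dtype=torch.bfloat16,
)

# On a 24 GB card the whole pipeline can usually stay resident:
#   pipe.to("cuda")
# On smaller cards, offloading trades VRAM for generation speed:
pipe.enable_model_cpu_offload()

image = pipe(
    "a photo of a red fox in fresh snow",
    height=1024,
    width=1024,
    guidance_scale=3.5,
    num_inference_steps=28,
).images[0]
image.save("fox.png")
```

Quantized variants bring the requirement down, but at full quality the 24 GB tier is where the comfortable experience starts, which is exactly why the 3090/4090/5090 class keeps getting pulled into the AI demand story.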

I expected the 5090 to be a more intense version of the same game Nvidia played with the 4090. So far, it has been as expected. The missing ROPs thing was even icing on the tombstone.

What Intel needs to do is produce an affordable GPU that has 32 GB of VRAM, one that will perform well enough with AI models. It needs to get popular models like FLUX1dev working well on its GPUs. Doing that will be profitable, particularly since AMD is out to lunch as it has been for many years. Intel might even use chiplets and Samsung 8 nm. Amateur AI folk and gamers would likely be willing to hold their noses at the power consumption, simply because of the price difference versus Nvidia — especially if Intel were to go even higher than 32 GB for a card.
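
For what it's worth, part of the software side of that wish already exists: recent PyTorch builds expose Intel GPUs through an "xpu" device. Below is a minimal, hedged sketch of targeting an Arc card that way; it assumes a PyTorch build with the XPU backend, and the model is just a stand-in workload, not FLUX itself:

```python
import torch
import torch.nn as nn

# Pick an Intel GPU via PyTorch's XPU backend if one is available,
# otherwise fall back to the CPU so the sketch still runs anywhere.
device = torch.device(
    "xpu" if hasattr(torch, "xpu") and torch.xpu.is_available() else "cpu"
)

# Stand-in workload: a single large linear layer in bfloat16.
model = nn.Linear(4096, 4096, dtype=torch.bfloat16).to(device)
x = torch.randn(8, 4096, dtype=torch.bfloat16, device=device)

with torch.no_grad():
    y = model(x)

print(f"ran on {device}, output shape {tuple(y.shape)}")
```

Getting a heavyweight like FLUX to run well on that path is still largely a driver and ecosystem problem, which is the part Intel would actually have to invest in.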
 