
Intel Reportedly Abandoned Higher-end Arc Xe2 "Battlemage" dGPU Project Last Year

The situation with the 4090 was absolutely disgusting (and continues to be — check out those eBay prices). The situation with the 5090 is absolutely disgusting. I was disgusted, too, when I saw media sites that are supposed to know better treating the 5090's MSRP without even the slightest grain of salt — playing right into Nvidia's playbook regardless of how harmful that is, with headlines like:

[Attachment: screenshot of such headlines]

We're supposed to ascribe incompetence rather than malice to headlines like that, but I'm more of a cynic.

That capitalism's best "competition" involves a "$2,000" product selling out within 24 hours at $3,700, with no end in sight — with its weaker predecessor selling for the same price on eBay — says a lot about how well that "competition" is set up to work.

Shenanigans have characterized high-end gaming GPUs since Turing. AMD has been out to lunch since Fury. Duopolies are terrible for consumers and monopolies are worse. That AMD isn't even bothering to feign competitiveness with the 4090, 5080, and 5090 should be a red flag. Even though AMD may not be able to compete with CUDA right now, it could sell high-end cards to the gamers who can't get any of the high-end Nvidia cards because of CUDA demand. Anyone who paid attention knew the 4090 sold very well, including long stretches where there was no stock because it kept selling out. The desperation of people trying to snag FE editions through Best Buy is something even someone who isn't paid to write about these things would know if they follow tech even casually.

The key to understanding the 4090 and 5090 most fully is not to focus merely on gaming. Those cards are in hot demand for AI. Amateurs want to be able to run AI models like FLUX1dev, which, as I understand it, requires around 23 GB of VRAM for full quality. Those who want to do that have the following choices:

1) Wait forever for a sane market (even though we're in GPU-maker monopolist conditions coupled with everyone wanting TSMC high-end node capacity).
2) Pay insane prices for a used 3090 that may not work reliably.
3) Pay insane prices for a 4090 (hoping that it's actually a 4090) that may not work reliably or, if new, will cost as much as an inflated 5090.
4) Pay insane prices for a 5090 if one manages to ever find one in stock.

Then, there are likely businesses that want them.
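
Either way, the gating question for the amateur crowd is simply whether the card's VRAM fits the model. A minimal sketch of that check (PyTorch; the ~23 GB figure is the rough number quoted above, not an official requirement):

```python
# Minimal VRAM check for running a large model locally (sketch).
# The 23 GB threshold is the rough figure quoted above for FLUX1dev,
# not an official spec.
import torch

REQUIRED_GB = 23

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    total_gb = props.total_memory / (1024 ** 3)
    print(f"{props.name}: {total_gb:.1f} GB VRAM")
    if total_gb < REQUIRED_GB:
        print("Not enough VRAM for full quality; consider a quantized "
              "variant or CPU offload.")
else:
    print("No CUDA device found.")
```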

I expected the 5090 to be a more intense version of the same game Nvidia played with the 4090. So far, it has been as expected. The missing ROPs thing was even icing on the tombstone.

What Intel needs to do is produce an affordable GPU that has 32 GB of VRAM, one that will perform well enough with AI models. It needs to get popular models like FLUX1dev working well on its GPUs. Doing that will be profitable, particularly since AMD is out to lunch as it has been for many years. Intel might even use chiplets and Samsung 8 nm. Amateur AI folk and gamers would likely be willing to hold their noses at the power consumption, simply because of the price difference versus Nvidia — especially if Intel were to go even higher than 32 GB for a card.
That was a hell of a rant. How does it apply to Intel's Xe3 being discussed here?
 
Two out of three players are not pursuing xx90 monstrosities anymore. Maybe that's a good sign.

I mean, when a video card can't max out current titles and still costs more than a console, somebody's got to read the writing on the wall, eventually.
That's...such a goofy statement to make. If anyone buys a GPU thinking it'll be better bang for the buck than a console, they aren't very educated/informed. Also, I guarantee any GPU at $400+ will wipe the floor visually with any console, so I again don't get your comparison. EDIT: I stand corrected. An Xbox Series X is only $500 and is roughly an RTX 3060/Ti or RX 6700 XT. Damn!!! Well shittzle, if things were even close to MSRP my statement would make sense.
Also, folks generally accepted that Crysis would run like poo when maxed out. It was unabashedly a tech demo as much as anything. It also (IIRC, correct me if I'm wrong) ran OK whilst still looking nice on just about anything with the right settings. Now every big release is a Crysis-level card-punishing spectacle and we seem to get angry when those games won't run at 4K120.
Not to mention the standards are vastly different. Back then, 45+ FPS was considered damn good and comfortably playable, with 60+ FPS considered perfect. There was also something about either the engine or the coding, but 23-30 FPS in Crysis felt like 40-60 FPS compared to other games.
 
Until 2 nm TSMC GAAFET or Intel 18A, if ever, nothing will change.
So if you have a PC, sell it.
If you want to buy a PC, don't.
And for three years you are free of this bullshit:
in two years, premium GAAFET CPUs and GPUs will come out;
in three years, they may be affordable.
 
IF they're going with Xe3, then this is a very good move for Intel. Let's cross our collective fingers for that.
The same way we crossed our collective fingers for an expanded Battlemage lineup? Don't hold your breath.
 
If by interested you mean can't afford, I agree.
Yeah, ya think? That's why they're not interested. Contrary to what Jensen says/claims/thinks, the vast majority of gamers are spending less than $2,200 on a PC. Very few are spending anywhere near $10,000. Anyone who agrees with his idiotic statement is a moron (or clueless fanboy) completely out of touch with reality.

The same way we crossed our collective fingers for an expanded Battlemage lineup?
Let's be fair, they actually talked about them last year.
Don't hold your breath.
I never do. But Intel wants to get their game on, pun intended. If this is their plan, it's a good plan. I just hope they've engineered the ARC-C cards to perform well without the need for ReBAR.
 
I never do. But Intel wants to get their game on, pun intended. If this is their plan, it's a good plan. I just hope they've engineered the ARC-C cards to perform well without the need for ReBAR.

-Why? By the time ARC-C is out, non-ReBAR systems will be ancient even by entry-level PC standards.

No way they're going backwards on ReBAR support as time moves forward and non-ReBAR systems age out of the ecosystem.
 
-Why? By the time ARC-C is out, non-ReBAR systems will be ancient even by entry-level PC standards.
Because not everyone uses Secure Boot, and ReBAR requires Secure Boot to be enabled. This is true for both Windows AND Linux.
No way they're going backwards on ReBAR support as time moves forward and non-ReBAR systems age out of the ecosystem.
That is perfectly irrelevant.
 
-Why? By the time ARC-C is out, non-ReBAR systems will be ancient even by entry-level PC standards.

No way they're going backwards on ReBAR support as time moves forward and non-ReBAR systems age out of the ecosystem.
That is perfectly irrelevant.

It's absolutely relevant. Why would they handicap their architecture to serve a segment of the market that will continue to shrink to near-zero?
 
I'm so glad this guy is embarrassing some other company...

As for Windows 11 - WHERE THE F*CK ARE THE VERTICAL TASKBARS, YOU GODDAMN IDIOTS AT MICROSOFT???

Peculiarly, Edge supports that, and with tab grouping it's one of the most convenient browsers to use, only missing MRU tab switching (fuck you, Google) and an Opera-like dark page mode (no, plugins don't cut it).
 
Good. Kinda figured anyway; Celestial already started getting patched into the kernel weeks ago. That, plus Tom's confirmation that the HW engineers are on Druid and the driver teams have moved to Xe3, and it all comes together. Really, this is a better outcome, and it had to come sooner or later; the other two contenders are an entire generation ahead. It only made sense for them to cut the cycle early.

IMO, and that's just an opinion, they may even need to do it again. At best, Xe3 might come EOY, but that's still a solid six months trailing. It might be totally OK, but it might also mean they're sounding the drum in the lab to speed up Druid to re-sync. That's total conjecture, though.
 
Lucrative is debatable (because of volume). The side effect being that any Nvidia card that doesn't cost $500 pretty much sucks.

Back then, it was one game. Nowadays, there are plenty where you'll have to sacrifice detail if you're not paying $700 for your video card.
Because there is the option: we can lower settings, if our ego can handle it.
If not, then we go and buy a 5090.
If yes, we lower settings.

It's good that there are more details and settings in the options menu for those of us who use high-end GPUs.

Or do we want Ultra settings that are really Medium, so almost everyone can use max settings and there are no hurt feelings about needing to lower settings?
 
Because there is the option: we can lower settings, if our ego can handle it.
If not, then we go and buy a 5090.
If yes, we lower settings.

It's good that there are more details and settings in the options menu for those of us who use high-end GPUs.

Or do we want Ultra settings that are really Medium, so almost everyone can use max settings and there are no hurt feelings about needing to lower settings?
That much I can agree with. However, look at the previous post showing how, despite a hefty FPS plunge, there is virtually no IQ improvement in recent games compared with games five years old or more.

Also, ego or not, if I'm out a grand, I don't expect to have to lower settings.
 
Let's face it: the game industry as we know it is slowly heading for a dead end. nGreedia really took care of that when they started releasing video cards for rich people only. Yes, pricing a mid-range card at more than $1,000 (real street price) is a callous, shameless move. 50% of the world's countries have salaries of less than $1,000 (gross); do you think people and children out there do not play, or do not want to play, PC games? That's more than 2 billion people, nGreedia, whom you shamelessly cut off from any quality PC gaming.
 
Let's face it: the game industry as we know it is slowly heading for a dead end. nGreedia really took care of that when they started releasing video cards for rich people only. Yes, pricing a mid-range card at more than $1,000 (real street price) is a callous, shameless move. 50% of the world's countries have salaries of less than $1,000 (gross); do you think people and children out there do not play, or do not want to play, PC games? That's more than 2 billion people, nGreedia, whom you shamelessly cut off from any quality PC gaming.
It's not Nvidia's fault. Nvidia just noticed GPU shaders and compute have a lot in common. They built CUDA to capitalize on that, but then people used CUDA (and thus consumer video cards) to build an entire new industry (or two, if you count mining). Nvidia is simply a publicly traded company that has to go where the money is.

Sucks for PC gamers (says the guy clinging to his Pascal card), but that's the state of the market right now. Unless AI suddenly collapses or new fabs spring out of the ground, there's no reason to hope anything will change.
 
It's a real sad irony that an Intel Arc B590 would have done wonders for Intel's market share.
All it needed was a 224 or 256-bit bus and 16 GB of VRAM...
That still lets the GPU as a whole stay below a 225 W power demand (75 + 150).
The lack of a 16 GB model is IMHO one of the reasons the popularity isn't higher
(I've spoken to several users with the 8 and 12 GB variants, and they're quite happy).

It's also a shame that Intel didn't even try a 16-20 core Xe2 design instead of the 24-32 core monstrosity.
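
For what it's worth, here's the board math that post is gesturing at, as a rough sketch. The assumptions are mine, not the poster's: 2 GB GDDR6 chips on 32-bit channels, 75 W from the PCIe slot, and 150 W from a single 8-pin connector:

```python
# Rough sketch of the board math implied above. Assumptions (not from the
# post): 2 GB GDDR6 chips on 32-bit channels, 75 W from the PCIe slot,
# 150 W from a single 8-pin connector.
CHIP_BITS, CHIP_GB = 32, 2

for bus_bits in (192, 224, 256):
    chips = bus_bits // CHIP_BITS
    print(f"{bus_bits}-bit bus -> {chips} chips -> {chips * CHIP_GB} GB")
# Note: 224-bit lands at 14 GB with uniform 2 GB chips; 256-bit gives 16 GB.

SLOT_W, EIGHT_PIN_W = 75, 150
print(f"Power ceiling without a second connector: {SLOT_W + EIGHT_PIN_W} W")
```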
 
Because not everyone uses Secure Boot, and ReBAR requires Secure Boot to be enabled. This is true for both Windows AND Linux.
Do you have a source for that? I am thinking you confused ReBAR's UEFI requirement with Secure Boot. I have ReBAR on right now and do not have Secure Boot.
Not having Secure Boot is one thing, but not having UEFI is quite extraordinary. There's no reason not to have it on, and it's been in consumer devices since, I don't know, 15 years ago? I don't expect retrofitting 15-year-old computers to be a market that Intel's going to cater to.
 
Do you have a source for that? I am thinking you confused ReBAR's UEFI requirement with Secure Boot. I have ReBAR on right now and do not have Secure Boot.
Not having Secure Boot is one thing, but not having UEFI is quite extraordinary. There's no reason not to have it on, and it's been in consumer devices since, I don't know, 15 years ago? I don't expect retrofitting 15-year-old computers to be a market that Intel's going to cater to.
ReBAR is a PCIe feature; I doubt it could tell what the Secure Boot status is if it wanted to, much less require it to be enabled.
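
For anyone who wants to check for themselves, here's one way to eyeball ReBAR status on Linux without involving Secure Boot at all: a GPU BAR window larger than the legacy 256 MB is the usual tell. A rough sysfs sketch (Linux-only; the 256 MB heuristic is my assumption, not an official test):

```python
# Rough ReBAR heuristic on Linux: scan PCI display controllers and look
# for a BAR larger than the legacy 256 MB aperture.
from pathlib import Path

for dev in Path("/sys/bus/pci/devices").iterdir():
    cls = (dev / "class").read_text().strip()
    if not cls.startswith("0x03"):  # display controllers only
        continue
    sizes = []
    for line in (dev / "resource").read_text().splitlines():
        start, end, _flags = (int(x, 16) for x in line.split())
        if end > start:
            sizes.append(end - start + 1)
    largest_mb = max(sizes) / (1024 ** 2) if sizes else 0
    verdict = "likely ReBAR" if largest_mb > 256 else "standard 256 MB window"
    print(f"{dev.name}: largest BAR {largest_mb:.0f} MB ({verdict})")
```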
 
That much I can agree with. However, look at the previous post showing how, despite a hefty FPS plunge, there is virtually no IQ improvement in recent games compared with games five years old or more.

Also, ego or not, if I'm out a grand, I don't expect to have to lower settings.

That's an excellent illustration of the change in expectations. When G80 launched in 2006 at $700 (~$1,100 inflation-adjusted), it managed 60 fps or better in one out of five games in the (admittedly small compared to now) TPU test battery at 2560x1600. Two were at or around 30. Today, $700 (9070 street price) gets you a bit under that performance in the five most demanding games at 3840x2160, and $1,100 (5070 Ti street) provides more or less equal. They do this at twice the resolution and considerably higher visual quality.

This is not to imply that I'm fine with the state of the market; I'm absolutely not. Midrange and lower has been underserved for, what, five years now? I'd love to be able to buy something that doesn't kinda suck for under USD300. It's more meant to illustrate the push and pull of capability vs. demands. We talked about Crysis before; it and others like it allow us to push sliders beyond what current hardware can reasonably manage. I don't think that's a Bad Thing in and of itself, but it's on us to recognize that we are not entitled to a specific performance threshold with all boxes checked and sliders all the way to the right, regardless of how much we paid for our graphics card. We see it regularly: hardware catches up with software, then subsequent software responds by asking more of the hardware.
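
The arithmetic behind that comparison checks out, for what it's worth. A quick sketch (the CPI multiplier is my rough approximation, not from the post):

```python
# Quick check of the figures quoted above.
old_pixels = 2560 * 1600
new_pixels = 3840 * 2160
print(f"Pixel ratio: {new_pixels / old_pixels:.2f}x")  # ~2.03x, "twice the resolution"

# ~$1,100 is $700 adjusted for US CPI from 2006 to today (~1.55-1.6x, rough).
print(f"G80 inflation-adjusted: ~${700 * 1.58:,.0f}")
```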
 
Do you have a source for that? I am thinking you confused ReBAR's UEFI requirement with Secure Boot.
Other than it not working when I've tried it? No. But there is this:

ReBAR is a PCIe feature; I doubt it could tell what the Secure Boot status is if it wanted to, much less require it to be enabled.
You would be wrong, or so says Microsoft:
https://answers.microsoft.com/en-us...size-bar/9eb1f448-830f-4645-81e9-ba8ac2b635d8
I tested this. Enabling Secure Boot allows ReBAR to work. Disabling it disables both.
So yeah, deal breaker for me.

Thus my hope that future ARC cards perform well without ReBAR.
 