
Intel Continues to Develop GPUs Beyond Arc Battlemage

Anyone predicting the end of DIY sounds like the he-said/she-said doom talk from back in the Windows 8.x era. :(

Thinking positively, I can't see DIY literally being dead; in fact, there was backlash over Surface RT. Any such negativity reminds me of the early Surface era. But we did get very close to losing DIY already, before there was a lot of pushback against Surface RT and Windows RT.

On top of that, we aren't being forced to get a laptop, despite Microsoft's advertisements showing off laptops on Windows 10 users' screens.

I still have high hopes for Arc, especially if GeForces are going to be more expensive than they should be. I'd rather have an Arc A770 than a GeForce RTX 4080 anyway, given how expensive they are!

I feel like Nvidia is pushing it, even with the price of the RTX 4070. :(
I don't think the DIY market will literally die (sales are still increasing, even!), but rather shrink in the bigger picture and become more niche as time goes on.

Desktops alone are already a minority of shipments, and have a much lower growth rate compared to laptops (source).
Given how DIY is an even smaller fraction of the overall desktop market, I think it's fair to say that an off-the-shelf mini-PC that's powerful enough for most users will likely achieve a higher market share compared to DIY setups. Mac minis have already shown that this is doable for many people; maybe an equivalent with Strix Halo can prove the same for the gaming crowd.
Another thing is how a gaming setup has only become more and more expensive over time, making it harder for people to keep up with the hobby.

Of course that's just my opinion and I could be totally wrong. If I were able to predict the future with certainty I'd have won the lottery already :p

Really? I am talking about a possible future, you read it as if I am talking about what is happening today, and you don't see the difference?
OK.....
No, because I don't see how a 5090 would be different for ARM or x86.
I am talking about a full attack on the consumer market, not a few boards for developers.
How would any of that be an "attack"? If anything, having more options in the market would be a good thing.
I am talking about the top discrete GPUs, for example, an RTX 5090 performing better on the ARM platform, being released 3 months earlier and being 10% cheaper than the same model for the x86 platform. Players will start building Nvidia-powered, ARM-based systems for gaming instead of x86 ones with Intel or AMD CPUs.
That's the point I'm trying to understand. I don't see how a "5090 ARM-exclusive edition" would exist, unless you're talking about fully integrated devices, which would fall into my mini-PC point. If you can elaborate on how such a thing would exist in a discrete manner, I guess it'd be better for both of us.
Strix Halo will be a nice example of a device that has no proper equivalent desktop-wise, but it's a SoC, just like consoles.
I can't see mini PCs taking over, but I could see laptops, consoles and pre-built systems designed by Nvidia, with minimal upgradability, becoming the preferred options for the majority of gamers. Those pre-built systems could come not just as mini PCs, but also in typical desktop form factors, offering room for multiple storage options and probably a couple of PCIe slots for upgrades, also targeting semi-professionals.
Fair enough. Technically such a pre-built system wouldn't be much different from a mini-PC, but the somewhat increased size for some expansion (like SATA, extra M.2, and 1 or 2 PCIe slots) could still make it way smaller than your regular DIY setup. Maybe more similar to those ITX builds (without any of the upgradeability, apart from some peripherals).

I can also totally see Nvidia downscaling their DGX stations for a more "affordable" workstation with proper PCIe slots and whatnot, but that would be hella expensive still and outside of the range of your regular desktop buyer.

Gaming is what teenagers do; compute is what many professionals want. If x86 loses the advantage of being the de facto option for high-end gaming and high-performance software, it will start losing the battle against ARM in the PC market.
Compute has no hard requirements for x86. As I said, Nvidia on ARM is already a thing in the enterprise. Your only point seems to be gaming, for which the market, albeit significant, is a really small portion of the overall PC market share.
What is the biggest failure of the Qualcomm-based Windows on ARM laptops? They are terrible at gaming. It's one of the reasons they are not flying off the shelves.
I totally disagree with this point, and Apple products are a great counter argument to that.
Qualcomm products failed because software integration is still bad, performance was insignificant against the competition, and pricing made no sense compared to any other option.
People don't buy ultralight laptops for gaming anyway, at least I never even attempted to open a single game in my LG Gram.
 
I don't think the DIY market will literally die (sales are still increasing, even!), but rather shrink in the bigger picture and become more niche as time goes on.

Desktops alone are already a minority of shipments, and have a much lower growth rate compared to laptops (source).
Given how DIY is an even smaller fraction of the overall desktop market, I think it's fair to say that an off-the-shelf mini-PC that's powerful enough for most users will likely achieve a higher market share compared to DIY setups. Mac minis have already shown that this is doable for many people; maybe an equivalent with Strix Halo can prove the same for the gaming crowd.
As long as laptops aren't always designed on the assumption that we're all computer-illiterate, like the person who gets their bank account hacked every single year because they're still using a dictionary password in the 2020s!

Laptops that throttle abnormally for no stated reason = scamtops.

Even back in the "CPU malaise era" (2012-2016, mostly), such a thing would have been insane! IIRC, I ranted about that in 2015, after hearing claims about the death of the PC.
 
So what do we define as 'success' in the case of Battlemage I wonder. Some % of market share? Some number of sales? General quality of product being in order, including drivers and legacy gaming on it? Being able to hit the performance of the competition? All of the above?

I'm interested in what people think :) I don't know myself, honestly. I wonder if Intel knows.
For a first go in the GPU market, it's been a good effort and Intel isn't losing money on the project. If Battlemage can raise the performance bar and have competitive pricing, it'll do well.
 
For the purpose of this discussion it also matters that PCIe works over a cable. External GPUs may become more common in the long term, and they may take the shape of a mini PC, meant to be stacked together with one.
More boxes on the desk. Thanks, but no thanks. Despite the well-known advantages, there are also some drawbacks, and the price goes up further because of the extra box, a second power supply and the premium interconnect.
 
I think there is a large range between cancelling a project, continuing it with a clear vision and goals, or just continuing while spending minimal amounts. Intel could be splurging large amounts of cash on the project, or it could leave it on life support, update it just a bit, call it a new generation (kind of what AMD is doing right now), and wait for better times to compete in the discrete GPU market.
 
No, because I don't see how a 5090 would be different for ARM or x86.
Look at Intel's Arc line and how ReBAR changes performance. Now imagine, on a future Nvidia platform, having the option to enable something that offers 10-20% extra performance, with that tech being proprietary and available only on Nvidia's platform. Nvidia does this all the time; it's the way they've done business for the last 20+ years.

How would any of that be an "attack"? If anything, having more options in the market would be a good thing.
Really? You don't understand the way I use the word "attack" here?
That's the point I'm trying to understand. I don't see how a "5090 ARM-exclusive edition" would exist, unless you're talking about fully integrated devices, which would fall into my mini-PC point. If you can elaborate on how such a thing would exist in a discrete manner, I guess it'd be better for both of us.
Strix Halo will be a nice example of a device that has no proper equivalent desktop-wise, but it's a SoC, just like consoles.
I gave you an example from Intel. Nvidia is a company that "invents" proprietary techs all the time as a way to give its products an advantage while creating disadvantages for competing products, but you still can't imagine that this could happen. ASUS last year presented a way to deliver power to a graphics card through an extra connector on a special motherboard, avoiding the use of PCIe power cables. Future (10+ years away) high-end graphics cards from Nvidia could have a connector as an extension of the PCIe slot, like what ASUS is doing, that somehow accelerates the graphics card. Or Nvidia could limit its cards to PCIe x8 on x86 platforms and use the extra 8 lanes for something proprietary that promises extra performance on their ARM platform. So the same card model would be compatible with both platforms, but offer higher performance on the Nvidia platform only. Again, think ReBAR.
I can also totally see Nvidia downscaling their DGX stations for a more "affordable" workstation with proper PCIe slots and whatnot, but that would be hella expensive still and outside of the range of your regular desktop buyer.
Macs are outside the range of the regular buyer, but enjoy a healthy market share. Now imagine the best graphics/compute options being available, or performing better, on an Nvidia platform. I can see a significant market share moving to the Nvidia option.
Compute has no hard requirements for x86. As I said, Nvidia on ARM is already a thing in the enterprise. Your only point seems to be gaming, for which the market, albeit significant, is a really small portion of the overall PC market share.
So compute could turn to Nvidia more easily. As for gaming, it's not a small part. While gamers are not a majority, gaming is still a major function of any PC, even an office PC, and someone choosing a PC or laptop could also have gaming as an important parameter. AMD's APUs and the efforts Intel is making to improve its iGPUs prove that gaming is an important parameter even for ultraportables, even for laptops that mostly target office applications and internet browsing.
I totally disagree with this point, and Apple products are a great counter argument to that.
Qualcomm products failed because software integration is still bad, performance was insignificant against the competition, and pricing made no sense compared to any other option.
People don't buy ultralight laptops for gaming anyway, at least I never even attempted to open a single game in my LG Gram.
Apple products target a very specific group, offering a different ecosystem and services, not to mention a shiny logo that screams "premium" and "rich". Qualcomm is trying to compete with PCs at everything PCs do, and while it offers better battery life, it fails mostly in gaming. As I said above, AMD's APUs and the latest Intel CPUs with the new iGPUs prove that even ultralight laptops must be able to game today. You and I are in a different category, because we have gaming PCs to use. Why game on an LG Gram when your system specs list not one but two 3090s? Others buy laptops and they, or their kids, expect to also game on them. So a discrete GPU, or a strong enough iGPU, is a necessity.
 
It would be pretty ironic (for lack of a more appropriate word) if Intel ends up surviving because of its GPUs, while the management resets their whole CPU chain.

Who woulda thunk? :/
 
Most consumers don't want to take that risk, especially if $250 is a big deal for them
Missed this piece. I assure you that in some cases money isn't the issue. The reality is that these people don't value a GPU above their specific price point. For years, even my friends have been confused as to why I won't hesitate to spend $600 on a CPU, but will hesitate to spend more than $500 on a GPU - the answer is simple. I use my CPU for my "job" and a GPU is a "toy". This can also be seen in other areas of my life, where I don't buy games consoles or fancy handhelds or RGB, or Star Trek figurines (or other collectibles, insert your hobby here), because I personally don't value those things as highly as others do. Some people may not even need more than that 1080p card (or in my case, I don't need a card that can do more than 4K60, but I absolutely do care if my code takes 30 minutes to compile vs 10).
 
Look at Intel's Arc line and how ReBAR changes performance.
But ReBAR is a standard PCIe feature. Heck, it was AMD who came up with it, and miners/HPC folks had been using it way before it became a thing in games.
So far you've only used examples of standard features.
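Since ReBAR keeps coming up, it may help to pin down what it actually is: the Resizable BAR capability from the PCIe spec, which lets the CPU map the GPU's whole VRAM instead of the classic 256 MiB window. Here's a rough sketch of how you could check it yourself on Linux, purely as an illustration (it assumes sysfs is mounted at /sys, and the PCI address is a placeholder you'd swap for your GPU's address from `lspci`):

```python
# Rough sketch, assuming Linux with sysfs; prints the size of each populated
# resource/BAR of a PCI device. With Resizable BAR active, the VRAM aperture
# BAR typically shows the full VRAM size rather than the classic 256 MiB.
from pathlib import Path

def bar_sizes(pci_addr: str):
    """Yield (index, size_in_bytes) for each populated resource of a PCI device."""
    resource_file = Path(f"/sys/bus/pci/devices/{pci_addr}/resource")
    for idx, line in enumerate(resource_file.read_text().splitlines()):
        start, end, _flags = (int(field, 16) for field in line.split())
        if end > start:
            yield idx, end - start + 1

# "0000:01:00.0" is a placeholder address; substitute your GPU's from `lspci`.
for idx, size in bar_sizes("0000:01:00.0"):
    print(f"BAR{idx}: {size / 2**20:.0f} MiB")
```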
Now imagine, on a future Nvidia platform, having the option to enable something that offers 10-20% extra performance, with that tech being proprietary and available only on Nvidia's platform. Nvidia does this all the time; it's the way they've done business for the last 20+ years.
The only thing I could imagine would be them having some native NVLink to increase the throughput of CPU-GPU transfers, akin to what they had in some POWER workstations, but that would be irrelevant for games or the regular user.
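For what it's worth, the transfer path such a CPU-side link would widen is easy to eyeball yourself. A minimal sketch, assuming a GPU-enabled (CUDA or ROCm) PyTorch install; the 1 GiB buffer is an arbitrary choice for illustration, not a proper benchmark:

```python
# Times a single pinned host -> device copy, i.e. the CPU-GPU transfer path
# that a CPU-side NVLink (as on those POWER systems) would speed up.
import time
import torch

assert torch.cuda.is_available(), "needs a GPU-enabled PyTorch build"

host = torch.empty(1 << 30, dtype=torch.uint8, pin_memory=True)  # 1 GiB, pinned
dev = torch.empty_like(host, device="cuda")

torch.cuda.synchronize()
t0 = time.perf_counter()
dev.copy_(host, non_blocking=True)  # DMA over PCIe (or NVLink, where present)
torch.cuda.synchronize()
elapsed = time.perf_counter() - t0

print(f"host -> device: {1.0 / elapsed:.1f} GiB/s")
```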
Really? You don't understand the way I use the word "attack" here?
No, not really, otherwise I wouldn't be asking.
I gave you an example from Intel.
And that was a bad example and not something proprietary.
Future (10+ years away) high-end graphics cards from Nvidia could have a connector as an extension of the PCIe slot, like what ASUS is doing, that somehow accelerates the graphics card. Or Nvidia could limit its cards to PCIe x8 on x86 platforms and use the extra 8 lanes for something proprietary that promises extra performance on their ARM platform. So the same card model would be compatible with both platforms, but offer higher performance on the Nvidia platform only. Again, think ReBAR.
OK, now I can see your point (stop using ReBAR please lol).
Still, that's not specific to ARM by any means; they could do it on any platform of theirs, and the fact that it's ARM is just a minor detail. They could come up with a partnership with AMD and only offer it on a Ryzen 6969 series with an x999 chipset, and it would be the exact same thing.

Again, I don't see how this has any relation to the whole x86 vs ARM issue you brought up at first.
Now imagine the best graphics/compute options being available, or performing better, on an Nvidia platform. I can see a significant market share moving to the Nvidia option.
I mean, won't consumers move to what's best for a given price range anyway? I believe that would be natural if such a product came to be.
So compute could turn to Nvidia more easily.
GPGPU compute is already either Nvidia (with a near monopoly) or AMD; just see the whole drama about A100s/H100s/B200s going around, while AMD picks up the scraps, serving the customers Nvidia can't supply.
As for gaming, it's not a small part. While gamers are not a majority, gaming is still a major function of any PC, even an office PC, and someone choosing a PC or laptop could also have gaming as an important parameter. AMD's APUs and the efforts Intel is making to improve its iGPUs prove that gaming is an important parameter even for ultraportables, even for laptops that mostly target office applications and internet browsing.
In my opinion you are overestimating the value of gaming. It's not a majority, and it is not a major function. I concede that it's not insignificant by any means either, but I don't think it's worth continuing on this topic, since it comes down to our personal perceptions of how important it is.
Shall we agree to disagree on this one and continue discussing the other points?

Apple products target a very specific group, offering a different ecosystem and services, not to mention a shiny logo that screams "premium" and "rich". Qualcomm is trying to compete with PCs at everything PCs do, and while it offers better battery life, it fails mostly in gaming. As I said above, AMD's APUs and the latest Intel CPUs with the new iGPUs prove that even ultralight laptops must be able to game today. You and I are in a different category, because we have gaming PCs to use. Why game on an LG Gram when your system specs list not one but two 3090s? Others buy laptops and they, or their kids, expect to also game on them. So a discrete GPU, or a strong enough iGPU, is a necessity.
Apple products (at least the entry-level ones like Airs and base MBPs) compete directly with premium x86 laptops; think Dell XPS, Lenovo X1 Carbon, HP Spectre, etc. All of those sure have the shiny logo, but the idea of this class is a lightweight laptop with really long battery life that you can get work done on anywhere, period. No dGPUs (apart from the bigger models, but then you forgo both portability and battery life). $1k+ price tag, 0.9~1.6 kg, 7h+ battery life, that's the group we are talking about.
Qcom tried to compete in that same space, but failed since they had a higher price tag with less performance. The GPU is a minor detail, since those other laptops are not meant for gaming either. You can run your lightweight indie game, but you can do the same on MacBooks and even Snapdragon devices anyway, so I don't think this is relevant.

Qcom did not try to bring a desktop-class product with expansion, nor a gamer-like GPU.

FWIW, I don't play games; my 2x 3090s are for compute (who the hell would be using two to game when SLI is dead? :laugh:). To be 100% honest, the last time I played something on Steam was back in May, and I did play some Mario Kart at a friend's house a couple of months ago, but I don't think a third-party console is relevant to our discussion.
Most of my friends rarely ever play either. Don't forget this forum is a bit of a bubble focused mostly on gamers, but if you go to a random bar, the majority of people there likely won't have gaming as a regular habit.
 
So what do we define as 'success' in the case of Battlemage I wonder.
Anything that impacts the market in a positive way for consumers
 
Anything that impacts the market in a positive way for consumers

I mean, this is nice for consumers, but at the end of the day Intel needs to start seeing financial success in order to continue making GPUs.

Would be nice to see Intel and AMD team up for open-source or vendor agnostic solutions that crack Nvidia's software lock-in for various markets.
 
Would be nice to see Intel and AMD team up for open-source or vendor agnostic solutions that crack Nvidia's software lock-in for various markets.
Do you mean for compute?
AMD has ROCm, Intel has oneAPI, both are open source and make use of some underlying stuff that's also open source and often has contributions from both (some even from Nvidia, like some Vulkan extensions and SYCL stuff).

CUDA has really big traction, mostly due to inertia; it has been a thing for over a decade already and has really focused on developer experience and accessibility for everyone, whereas Intel and AMD are only now trying to catch up.
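To make the "vendor-agnostic" angle concrete, here's a minimal Python/PyTorch sketch of compute code that doesn't care whose stack is underneath. Assumptions: both the CUDA and ROCm builds of PyTorch expose GPUs through the "cuda" device type, recent builds (or intel_extension_for_pytorch) expose Intel GPUs as "xpu", and `pick_device` is just a made-up helper name:

```python
import torch

def pick_device() -> torch.device:
    # CUDA and ROCm builds of PyTorch both surface GPUs as "cuda",
    # so one check covers Nvidia and AMD.
    if torch.cuda.is_available():
        return torch.device("cuda")
    # Recent PyTorch builds (or intel_extension_for_pytorch) expose
    # Intel GPUs as "xpu"; guard with hasattr for older versions.
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return torch.device("xpu")
    return torch.device("cpu")

device = pick_device()
x = torch.randn(2048, 2048, device=device)
y = x @ x  # the same matmul code regardless of vendor
print(f"ran on {device}, result norm = {y.norm().item():.2f}")
```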
 
But ReBAR is a standard PCIe feature. Heck, it was AMD who came up with it, and miners/HPC folks had been using it way before it became a thing in games.
So far you've only used examples of standard features.
Look, I am not an Nvidia engineer. If you can't understand what the other person is trying to tell you, then I will just stop here. My time is precious. ReBAR was just an example of something that affects performance on Arc GPUs. It was just a simple example. What were you expecting from me? To invent a new tech, present it to you and tell you "I am Nvidia and I am locking it to my platform only"? Well, guess what: Nvidia does this all the time. Just use your imagination, try to understand what the other person is trying to tell you, and stop with the "don't use ReBAR, it's a standard". If it were an Nvidia tech, it would have been proprietary, not a standard. And that's all.

PS I did read the rest of your post.
 
Look, I am not an Nvidia engineer. If you can't understand what the other person is trying to tell you, then I will just stop here. My time is precious. ReBAR was just an example of something that affects performance on Arc GPUs. It was just a simple example. What were you expecting from me? To invent a new tech, present it to you and tell you "I am Nvidia and I am locking it to my platform only"? Well, guess what: Nvidia does this all the time. Just use your imagination, try to understand what the other person is trying to tell you, and stop with the "don't use ReBAR, it's a standard". If it were an Nvidia tech, it would have been proprietary, not a standard. And that's all.

PS I did read the rest of your post.
Point is that your idea just doesn't seem realistic at all from a hardware standpoint, so I was trying to understand if there was something I wasn't picking up. If Nvidia were to go as far as to make something different with the PCIe slots like you said, at that point they'd be better off just making a product not compatible with ATX standards at all (so, a mini-PC type of thing with everything soldered together).
Going halfway between full compliance with a standard and a full-blown integrated design doesn't make sense to me at all, tbh, so I was expecting you to provide an example where this would make some sense. Dreading an unlikely future just because "big corporation bad" sounds baseless, especially when it doesn't seem relevant to the ARM vs x86 idea that started the entire discussion to begin with.

Anyhow, you are not obliged to get your point across, you are free to just not reply and leave it at that :)
 