Thursday, August 11th 2022

Intel GPU Business in a $3.5 Billion Hole, Jon Peddie Recommends Sell or Kill

Jon Peddie Research (JPR) provides some of the most authoritative and informative market research on the PC graphics hardware industry. The firm just published a scathing editorial on the future of Intel AXG (Accelerated Computing Systems and Graphics), the business tasked with developing competitive discrete GPUs and HPC compute accelerators for Intel. Founded to much fanfare in 2016 and led by Raja Koduri since 2017, AXG has been in the news for the development of the Xe graphics and compute architecture, particularly the Xe-HPC "Ponte Vecchio" HPC accelerator and the Arc brand of consumer discrete graphics solutions. JPR reports that Intel has invested several billion dollars into AXG to little avail, with none of its product lines bringing in notable revenue for the company. Xe-LP based iGPUs do not count, as they're integrated with client processors and their revenues are folded into CCG (Client Computing Group).

Intel started reporting revenue from the AXG business in Q1 2021, around the time it began selling its first discrete GPU, the Iris Xe MAX (DG1), based on the same Xe-LP architecture that powers its iGPUs. The company's Xe-HPG architecture, designed for high-performance gaming, was marketed as its first definitive answer to NVIDIA GeForce and AMD Radeon. Since Q1 2021, Intel has lost $2.1 billion to AXG, with little to show for it. The JPR article suggests that Intel missed the bus on both time-to-market and scale.
A sizable launch of Arc "Alchemist" in 2021 or early 2022, in the thick of the GPU supply crisis, would have let Intel cash in on sales to whoever was in the market for a graphics card. With the supply crisis over in the wake of collapsing crypto-currency mining demand for dGPUs, Intel finds Arc "Alchemist" competing with GeForce and Radeon products purely on gaming performance, where its fastest "Alchemist" part only matches their mid-range products. Both NVIDIA and AMD are ready to ship their next generations, which is bound to widen the performance gap with Intel even further. Beyond graphics, NVIDIA and AMD are also ready with their next-generation compute accelerators, NVIDIA's Hopper and AMD's CDNA3, increasing the performance gap with "Ponte Vecchio."

With the recent axing of the Optane Memory business, which rode on the promise of the pioneering 3D XPoint memory technology Intel invented, it's open season on non-performing Intel businesses, especially as CEO Pat Gelsinger sees favorable outcomes in Washington, D.C. for legislation that improves Intel's business environment and increases government subsidies for the company. In light of AXG's losses, JPR recommends that Intel consider selling off the entire division and exiting this market.

The JPR editorial can be read from the source link below.
Source: Jon Peddie Research

112 Comments on Intel GPU Business in a $3.5 Billion Hole, Jon Peddie Recommends Sell or Kill

#26
Vayra86
Solaris17: Thanks, I respect this response. I disagree; I think all the companies mentioned, either today or historically, bet on something that wasn't meant to be or otherwise failed, and even with the crypto bubble bursting I think there is room for a third player. AMD traded at under $3 in the last decade, the entire company. I don't think Intel will die off, and if they are under stress, all the better. Hopefully they bring something to the table, technology-wise or performance-wise, that pressures the others. Those are my thoughts though, not facts, and they could very well fail. I hope they don't, just as I'd hope any other company trying to compete didn't.
I didn't say, and I don't think, they will die either, but they are definitely going to be forced to make choices and budget cuts going forward. The recent annual report wasn't rosy and I don't see any signs of it turning around.

You are right, all tech companies try things all the time, but this is a big one, and it's something that requires a long breath. If the result now is like this, it's a fair question whether they should continue. I mean, think from a shareholder's perspective: if they lost $3.5B already, and we all guesstimate they'll need a generation or two to reach parity (and even that is questionable, especially at the high end where they also have to compete), what is the ROI here? Years? Decades?
Posted on Reply
#27
lexluthermiester
Intel is starting fresh in the dGPU sector. Like NVIDIA and ATI/AMD, it will take time, research, refinement and advancement, but if they stay the course they will be an important player, one we should all encourage.
Posted on Reply
#28
TheoneandonlyMrK
MikeMurphy: Big datacenter is all moving to ARM64. It has nothing to do with AMD, but rather less expensive and more secure ARM64 custom silicon. Intel's x86 CPU business has a lot of headwinds at present, which is why Intel is now aggressively moving to develop other sources of revenue, which honestly should have started a long time ago, but it didn't.
I'm going to have to disagree there.

They make far more different stuff than people realise, plus they bought many companies, including an FPGA one. I think they'll be fine, personally.

I also don't agree with the Jon Peddie research either. Stick with it, but do better, Intel.
Posted on Reply
#29
Vayra86
r9: From the interviews I've seen, it looks to me like they've made peace with how this first generation of Arc is going to go down, which I think is the right way to go about it.
The only reason to cancel this would have been if they actually believed they would just blow these GPU companies out of the water, companies that have been at the cutting edge for decades, and that's just delusional.
So they are bad in DX11, which mostly means old games that are going to run well anyway, so it doesn't really matter, and their DX12 performance is solid.
They also claim that they'll price it based on DX11 performance, so we might be able to get the same performance for less if we accept that there will be bugs to deal with.
Another thing I think is positive: the transistor count of the Arc A770 is around that of the 6800 non-XT, and I think that's where they'll land on DX12 performance. If that's true, they are actually on par, and it's not like they are using twice the silicon to match its performance; that way they'd never be a success.
And lastly, the driver thing just takes time; it's plain physics, there's no way around it. So they can only get better with time.
DX11 games can be heavier than DX12 on the GPU. I wouldn't dismiss them anytime soon. AMD did; it cost them dearly, and the age of DX12 has only been upon us for a few years. Also, we have competitors with full compatibility and tremendous performance.

Are they going to sit that out? It's easily 2025 before you'll see more than half of what's getting released being DX12-only. If not later.

DX9? Sure. DX11? Suicide.
Posted on Reply
#30
Arkz
Bomby569: They have driver issues, serious driver issues, and to be honest this was to be expected; you had to know nothing of the market to assume anything else. I called it here way before all the problems.

But in terms of raw performance the cards don't seem that bad; for a first try the silicon shows immense progress. AMD had decades of this and at one point couldn't do better against NVIDIA than the RX 480. It's money lost if they feel they can't make any progress; it's money invested if they feel they can improve.

But there is clearly a leadership problem making things worse; this launch was absolutely ridiculous at all levels.
Eh? AMD and NV had been trading blows at the high end for years before NV got ahead for the last few generations. Certainly not decades, and AMD only bought ATI in 2006.
Posted on Reply
#31
GreiverBlade
Vayra86

This engineer has no business in executive-level jobs.

Overpromising and underdelivering is not where you want to be at this level; it's where you absolutely should not be. That is the reason for "Poor Vega", and now/soon "Poor Arc".
Hilarious, because my first thought when the A380 got benched was:
"Raja at AMD: uncompetitive and somewhat "bad" product"
"Raja goes to Intel: AMD's GPU division got better and better"
"Raja at Intel: overhyped and oversold promises, then a fiasco launch of a somewhat "bad" product"
Posted on Reply
#32
MikeMurphy
Vayra86: DX11 games can be heavier than DX12 on the GPU. I wouldn't dismiss them anytime soon. AMD did; it cost them dearly, and the age of DX12 has only been upon us for a few years. Also, we have competitors with full compatibility and tremendous performance.

Are they going to sit that out? It's easily 2025 before you'll see more than half of what's getting released being DX12-only. If not later.

DX9? Sure. DX11? Suicide.
I think titles just need a DX12 option to do well on Arc. All new major releases support DX12.

Intel indicated they need time to improve DX11 driver performance as it requires a lot of optimization work for each title.
Posted on Reply
#33
Bomby569
Arkz: Eh? AMD and NV had been trading blows at the high end for years before NV got ahead for the last few generations. Certainly not decades, and AMD only bought ATI in 2006.
What is the relevance in separating AMD from ATI when they bought the company, the IP, the engineers, the know-how, the buildings, the cleaning personnel? So AMD started the business from zero in 2006, is that what you are saying?
Posted on Reply
#34
ThrashZone
Hi,
Option three: do what Intel has already stated they are going to do.
Raise CPU prices.
Posted on Reply
#35
Punkenjoy
MikeMurphy: Big datacenter is all moving to ARM64. It has nothing to do with AMD, but rather less expensive and more secure ARM64 custom silicon. Intel's x86 CPU business has a lot of headwinds at present, which is why Intel is now aggressively moving to develop other sources of revenue, which honestly should have started a long time ago, but it didn't.
Many datacenters are going to have an ARM offering, but in the end it's more the move from VM/IaaS to PaaS that will make a big shift in the CPU landscape. People running something on PaaS do not care what it runs on; it's the cloud provider that has to figure that out.

This means less reason to commit to a specific architecture. It could be RISC-V, it could be ARM, it could be x64; a lot of that market will be up for grabs, and that is where the market will go. ARM has a better chance, but that does not mean it wins de facto. Some big cloud providers develop their own CPUs, but that does not mean those will be viable in the long term.

But back to the topic: I think Intel needs to keep suffering until it gets good at GPUs, or else it will just become a relic of the past. They need to be good at this because GPUs are only going to become more important in the future. That, or a merger with NVIDIA.
Posted on Reply
#36
trparky
GreiverBlade: Hilarious, because my first thought when the A380 got benched was:
"Raja at AMD: uncompetitive and somewhat "bad" product"
"Raja goes to Intel: AMD's GPU division got better and better"
"Raja at Intel: overhyped and oversold promises, then a fiasco launch of a somewhat "bad" product"
This.

There's a reason why Raja was let go by AMD, and now Intel is finding out why.
Posted on Reply
#37
defaultluser
Solaris17: I'm sorry, but what? It almost sounds like they have a vested interest. Intel, S3 or whoever entering the market would be a massive win both for the company in question and for consumers. It is confusing to me that they are concerned about Intel's $2.xB loss when anyone that good at finance knows that is peanuts to develop a product. Against Intel's overall capital, this is a drop in the pond.

Maybe I'm just that financially tone deaf, but fucking lol.
There's always a cutoff for something Intel has failed at. Be it contra-revenue to get wins in the new handheld Windows 8.1 tablet market (completely dead now that Intel has killed it, and even AMD abandoned Jaguar), or just spending way too much on other tech they failed horribly at (remember Intel's 5G group? Yeah, that's gone too!).

The drivers continue to have bad stability in a number of smaller-market games, so why would you bother buying this thing when the 6600 already matches the 3060 at a discount price?

When you already have two well-established competitors, it's surprisingly hard to make the value proposition for more than a handful of folks, so it's likely to take 5-10 more years before Intel turns a profit on this mess!

Anyone remember S3 Chrome? Yeah, having competitive entry-level tech did not make up for the highly variable performance! If you make very little money, you have little to spend on drivers.

www.extremetech.com/computing/76978-s3-chrome-s27-review

Intel has the money to make up the ground, but are they willing to fork out that amount of change? The problem folks have here is that you already know how bad the iGPU drivers are today, so why would you suddenly think they're "good enough" for you to whip out discrete-GPU money?
Posted on Reply
#38
TheinsanegamerN
"Resource intensive project to build an entire new GPU arch that just launched its first product is still in the red"

Wow. JPR analysts are PAID to write this garbage?
Posted on Reply
#39
FeelinFroggy
Maybe they ought to wait until they release their GPUs before deciding to kill it. I would think some sales of the new GPUs would start offsetting that 3.5 billion dollar investment.
Posted on Reply
#40
zlobby
MikeMurphy: Big datacenter is all moving to ARM64. It has nothing to do with AMD, but rather less expensive and more secure ARM64 custom silicon. Intel's x86 CPU business has a lot of headwinds at present, which is why Intel is now aggressively moving to develop other sources of revenue, which honestly should have started a long time ago, but it didn't.
I really lost it at 'more secure' :D
Posted on Reply
#41
maxfly
Wow, click-baity, easy-target, lazy writing. Yep, there are heaps of players out there with the capital to buy a division saddled with $3.5B in debt that has shown no profitability to date and will likely require millions to billions more invested before showing a modicum of profit. Yeah, that company exists outside of Intel or ngreedia, who has zero interest.
So who exactly is this customer?
Either Intel does this or they don't, end of story.

I could have saved them an entire day of editing and a week of typing.
Posted on Reply
#42
defaultluser
FeelinFroggy: Maybe they ought to wait until they release their GPUs before deciding to kill it. I would think some sales of the new GPUs would start offsetting that 3.5 billion dollar investment.
Not if they're so late it's left competing with the 4060. Until it has drivers stable enough to match AMD, it's still going to have to be value-priced to move any stock, and if they have to wait till next year for volume, it's going to end up in clearance dumpsters almost immediately (think i740).
Posted on Reply
#43
Testsubject01
Solaris17: I'm sorry, but what? It almost sounds like they have a vested interest. Intel, S3 or whoever entering the market would be a massive win both for the company in question and for consumers. It is confusing to me that they are concerned about Intel's $2.xB loss when anyone that good at finance knows that is peanuts to develop a product. Against Intel's overall capital, this is a drop in the pond.

Maybe I'm just that financially tone deaf, but fucking lol.
Seems they are hyperfocused on short-term profits. :kookoo: I'm no finance pro, but breaking into the high-end GPU market, especially in current market conditions, was always going to be a 99% venture-capital burn.
Sure, there is the possibility of winning the lottery with your initial launch, but more realistically it takes a few generations to get it right.
Posted on Reply
#44
trparky
Testsubject01: Seems they are hyperfocused on short-term profits.
The guy's a Wall Street hack. What do you want? Sane thinking? Good luck with that.
Posted on Reply
#45
outpt
The new CEO has not been there long enough. He seems to be a good executive. If AMD can pull their asses out of the fire, surely Intel can do better. I just don't see them tossing in the towel. 3.5 billion in losses is nothing to a corporation the size of Intel. There's too much instant gratification for the quick-buck bunch.
Say that three times fast
Posted on Reply
#46
medi01
The "OMG, new product that isn't even sold to customers is losing money" is a weird thing to say, isn't it?
Punkenjoy: People running something on PaaS do not care what it runs on; it's the cloud provider that has to figure that out.
You can run Go (shockingly good stuff) programs on PaaS, and they are compiled into native code.
Admittedly, it's not typical, and most of the other offerings run in some sort of interpreter/JIT.
Posted on Reply
#47
GreiverBlade
defaultluser: until it has drivers stable enough to match AMD,
Oh? That's a tall order, because right now for me (disclaimer: situational... I might just be lucky), AMD's drivers (Polaris series) are the most stable drivers I've had in the last 10 years (previously an R9 290, then a GTX 980 and lastly a GTX 1070; I'll let you guess which had the most horrible drivers that made me roll back three releases prior to the latest :laugh: ). Of course both sides have had sh!t drivers (not on Intel's level though :rolleyes: ), and I don't know why I have zero artifacts, bad performance or whatever other issues "lots" of users report... using the latest AMD driver :laugh:
medi01: The "OMG, new product that isn't even sold to customers is losing money" is a weird thing to say, isn't it?
Actually, if the AIBs do not buy them (in the news: www.techpowerup.com/297490/intel-arc-board-partners-are-reportedly-stopping-production-encountering-quality-issues ), then the product is not, indeed, selling, no? :laugh: (Well, does the A380 sell well in China? I wonder...)
outpt: If AMD can pull their asses out of the fire surely Intel can do better
Better than making an unthinkable comeback and rising from the dead (Ryzen from the dead?), I doubt... since Intel is not in that state; what AMD (well, Lisa...) achieved is epic level! (Epyc level?)


Intel, pretty please: stop PR showing numbers ("disclaimer: will not do that in a real-life scenario; keep in mind our tests are in a totally selective and controlled environment") of your GPU being on the level of some competing model (and later being completely bogged down by driver issues), offer Raja a good spot (in the janitor's room maybe? Coffee courier? Well, anything but GPU-related), and "same player, try again" (for the 3rd time since Larrabee, or 4th since the i740; but I liked my Real3D Starfighter i740 :laugh: )
Posted on Reply
#48
defaultluser
GreiverBlade: Oh? That's a tall order, because right now for me (disclaimer: situational... I might just be lucky), AMD's drivers (Polaris series) are the most stable drivers I've had in the last 10 years (previously an R9 290, then a GTX 980 and lastly a GTX 1070; I'll let you guess which had the most horrible drivers that made me roll back three releases prior to the latest :laugh: ). Of course both sides have had sh!t drivers (not on Intel's level though :rolleyes: ), and I don't know why I have zero artifacts, bad performance or whatever other issues "lots" of users report... using the latest AMD driver :laugh:

Actually, if the AIBs do not buy them (in the news: www.techpowerup.com/297490/intel-arc-board-partners-are-reportedly-stopping-production-encountering-quality-issues ), then the product is not, indeed, selling, no? :laugh: (Well, does the A380 sell well in China? I wonder...)

Better than making an unthinkable comeback and rising from the dead (Ryzen from the dead?), I doubt... since Intel is not in that state; what AMD (well, Lisa...) achieved is epic level! (Epyc level?)

Intel, pretty please: stop PR showing numbers ("disclaimer: will not do that in a real-life scenario; keep in mind our tests are in a totally selective and controlled environment") of your GPU being on the level of some competing model (and later being completely bogged down by driver issues), offer Raja a good spot (in the janitor's room maybe? Coffee courier? Well, anything but GPU-related), and "same player, try again" (for the 3rd time since Larrabee, or 4th since the i740; but I liked my Real3D Starfighter i740 :laugh: )
I'm only saying AMD is the target to reach before you're "stable", even though they have a long history of taking 6-12 months to get stable with each new arch (Polaris, the card you're using, shat the bed for its first 12 months; RDNA 1, same time period. RDNA 2 was the first new release with surprisingly stable drivers in the entire history since GCN; if they can keep that up for multiple releases, then I'll put them on the same stability level as NVIDIA).

Intel is still trying to get Xe stable in mainstream games two years later, which is quite a long period.
Posted on Reply
#49
Garrus
It basically took AMD 7 years, from the Radeon HD 7000 series to RDNA 2 (not coincidentally their PS4 and PS5 architectures, respectively), to fix their graphics division. Intel needs to be committed to 7 years of products too.

It took more than 7 years from Bulldozer to Ryzen. Intel needs to be in it for the long haul. I don't think the fact that their first product is bad is that surprising.
Posted on Reply
#50
eidairaman1
The Exiled Airman
MikeMurphy: When a company the size of Intel decides to enter the dGPU market, it does so. Market timing and all the rest doesn't undermine the business case for entering the dGPU market. Microsoft didn't abandon gaming when the Xbox 360 RROD debacle came about, but rather fixed it at great cost.

Big picture: high-performance parallel computing is a big deal, and Intel will enter the market. Product pricing can be adjusted to remain competitive.

First gens can be rocky, but Intel has the design chops to solve these problems, even the mediocre DX11 driver issues.

It's coming.
Wishful thinking; e.g. the i740 and the iGPUs are examples of trouble.
Garrus: It basically took AMD 7 years, from the Radeon HD 7000 series to RDNA 2 (not coincidentally their PS4 and PS5 architectures, respectively), to fix their graphics division. Intel needs to be committed to 7 years of products too.

It took more than 7 years from Bulldozer to Ryzen. Intel needs to be in it for the long haul. I don't think the fact that their first product is bad is that surprising.
I had no issues with the R9 series GPUs; it wasn't till HBM that there was a problem.
Posted on Reply