
Intel GPU Business in a $3.5 Billion Hole, Jon Peddie Recommends Sell or Kill

It basically took AMD 7 years between the Radeon 7000 series and RDNA2 (not coincidentally their PS4 and PS5 architecture) to fix their graphics division. Intel needs to be committed to 7 years of products also.

It took more than 7 years from Bulldozer to Ryzen. Intel needs to be in it for the long haul. I don't think the fact their first product is bad is that surprising.
 
When a company the size of Intel decides to enter the dGPU market, it commits to it. Market timing and all the rest doesn't undermine the business case for entering the dGPU market. Microsoft didn't abandon gaming when the Xbox 360 RROD debacle came about, but rather fixed it at great cost.

Big picture: high-performance parallel computing is a big deal, and Intel will enter the market. Product pricing can be adjusted to remain competitive.

First gens can be rocky, but Intel has the design chops to solve these problems, even the mediocre DX11 driver issues.

It's coming.
Wishful thinking; e.g. the i740 and Intel's iGPUs are examples of past trouble.

It basically took AMD 7 years between the Radeon 7000 series and RDNA2 (not coincidentally their PS4 and PS5 architecture) to fix their graphics division. Intel needs to be committed to 7 years of products also.

It took more than 7 years from Bulldozer to Ryzen. Intel needs to be in it for the long haul. I don't think the fact their first product is bad is that surprising.
I had no issues with the R9 series GPUs; it wasn't until HBM that there was a problem.
 
It took more than 7 years from Bulldozer to Ryzen. Intel needs to be in it for the long haul. I don't think the fact their first product is bad is that surprising.
Yeah well, I don't think Intel is used to losing. They're used to having a great product straight out of the gate. Internal company morale must have taken a major hit with this.
 
Not surprising, as a lot of stuff that Intel touches turns to shit if it's not a traditional x86 design.
And if Battlemage turns out to be the same disaster as Alchemist, I'd say they will kill off their gaming GPU endeavours and refocus purely on HPC accelerators...
 
What is the relevance in separating AMD from ATI when they bought the company, the IP, the engineers, the know-how, the buildings, the cleaning personnel? So AMD started the business from zero in 2006, is that what you are saying?
No, and well done on ignoring the rest.
 
I agree; better to sell the IP to someone else (maybe AMD, to help them with streaming and encoding). Intel has been doing GPUs for ages; there's no excuse for their failure with Arc. Maybe they could have focused on the low end, launching something better than a 1030 for cheap, and gone up from there; targeting the 3070 was very ambitious.
With ATI/AMD and nVidia we had a lot of competition, with graphics cards launched nearly every 6 months. It's only the last few generations that AMD struggled a bit, but since RDNA, AMD is back (nVidia fanboys, especially among reviewers, notwithstanding), and we will have competition again.
 
no excuse for their failure with Arc.
I blame Raja for it. He was a failure at AMD, and now he's a failure at Intel. It's obvious he doesn't belong in an executive position. He couldn't organize a kegger in a brewery or an orgy in a whorehouse let alone a graphics card division.
 
You cannot look at all that surrounded the Arc launch and say there is even a glimpse of leadership, and I'm not talking about performance issues.
Maybe, though, it's more a case of Raja screwing up royally and Intel upper management giving him too much freedom and not enough oversight. I think he will be in deep shit after this all comes to a head. The rest of the Intel departments are apparently fuming at what's going on in the GPU group. Arc desktop may be cancelled to focus on Battlemage, but I doubt Raja would be leading that.

Now, is this a failing of Gelsinger, or did he put too much trust in Raja and has learned his lesson?

I'm no Intel fan, but we need them to be competitive, and a third GPU player is much needed. I hope they don't bail on discrete GPUs, but Arc has been an unmitigated disaster for them.
 
A 3rd player launching a dGPU onto global markets for the first time in many years, amid the "post"-pandemic world & its subsequent supply chain issues... what could go wrong?? :rolleyes:
Combine that with rising geopolitical tensions in East Asia, where most of the manufacturing is done...
I'm sure Intel had a crystal ball years ago & could EASILY see this mess! :laugh:
 
What is the relevance in separating AMD from ATI when they bought the company, the IP, the engineers, the know-how, the buildings, the cleaning personnel? So AMD started the business from zero in 2006, is that what you are saying?
So Intel started their GPU business today, is that what you are saying? You forget decades of iGPUs from Intel, and, most importantly, Xe is already 3 years old; they had 3 years to fix their drivers and learn how to do it properly, but they didn't. Do you seriously believe that an iGPU is something different from a GPU in terms of drivers and hardware?
 
AMD took a giant risk in buying ATI, and everyone claimed AMD would go bankrupt. Here we are 16 years later: AMD has high-end GPUs, and everyone is forgetting that AMD had to settle for making mid-range graphics cards for a very long time. In the long run, Intel had better be OK with more than a $3.5 billion drop, because for AMD it was $5.4 billion to lose, plus the other losses.
 
Damn, I just want to believe Intel could finally enter the dGPU market. Yes, it's not going to be smooth sailing, especially when your competitors have been in it for, I don't know, 20+ years?

I was expecting their 1st product to have flaws here and there, and I could reason with that. But killing this before it even started? Damn.

Just my thoughts, hey who am I anyway? I'm just a random guy over the internet after all.
 
Who in their right mind would listen to "Jon Peddie Research"? I remember that at the height of the crypto craze, when all the cards, even the cheapest ones, were selling for at least twice their MSRP, they published that "25% of cards were sold to crypto miners".

They're just throwing out guesses that the rest of us can make in forums.

Yes, Intel would perhaps be better off selling or axing their discrete GPU branch, like they did with Optane. At least in the short run. But I don't think they will. Arc, as late, underpowered and underdeveloped as it is, at least shows that they can enter the discrete GPU business, competitively maybe with the next generation. And even if they fudge that one too and it takes too long to compete with the 2022 Nvidia and AMD generations, I think they are in it for the long run, to learn how to do GPU computing, not just gaming.
 
Intel needs to take a risk here. Intel already gave up on Optane and cell-phone chips. Intel also killed Xeon Phi and that entire ecosystem for this path. It seems a bit late to get cold feet over this. Intel must build a high-performance compute product just to remain competitive at the supercomputer level. GPUs / SIMD compute are the future, and there's no other way to reach exascale unless they go SIMD/GPGPU.

The gaming portion is basically there to help fund the research into the high-end stuff. Much like how NVidia uses lower-end 3060 GPUs to fund the research behind $10,000 A100 server GPUs, Intel needs to get the video-game market hooked on some product that supports the stack. That's the only way this will be economically feasible.

Now maybe the PC market is too difficult to break into. Then Intel should instead try the console market first, or something else along those lines. Or maybe Intel makes a compute-only chip and finds a new market of buyers (unlikely, though; video gamers have shown they want good chips and will fund this kind of SIMD-compute research). The only other niche is maybe the AI / deep learning fellows. But casting a wide net and getting as many customers as possible is just the most obvious strategy here.
 
Also, "$3.5 billion hole" is the estimate of the whole project's cost since 2016, over 6 years. Intel's yearly revenue is $80 billion!

They are in free fall right now; last quarter was down 22% compared to last year, but so is the rest of the market. We are in a recession, whatever the definitions say.

I'm sure many shareholders would welcome axing such an expensive project for momentary gain, no matter the potential. And that's what might actually happen.
 
This engineer has no business in executive-level jobs.

Overpromising and underdelivering is not where you want to be at this level. It's where you absolutely should not be. That is the reason for 'Poor Vega' and now/soon Poor Arc.

This guy was also responsible for the HDR gaming that we have today. Anyway, that's not the point; this was his first attempt at Intel, and at AMD they were budget-strapped, so he has always been taking on the job against all odds.
 
A bad suggestion, imo. Even if it's unprofitable, selling something is still better than nothing.
 
The need for an SoC to remain competitive is real; AMD is eating their lunch, and a GPU is all that's missing.
 
if they can keep that up for multiple releases, then I'll put them on the same stability level as Nvidia
I understand your point of view, but for me it's the opposite: Nvidia has been unstable for me, and with ATI/AMD I have never had a single issue since 1999 with up-to-date drivers :oops:
ATI/AMD: from a Mach64 LT, a few Rages, a X1650 Pro, passing by a X1950 Pro, a HD3650, 3670, 4870, a 5XXX I don't remember :laugh: 7870, R9 270, 290 and then RX 6700 XT (and probably a few more I forgot)
Nvidia: from TNT1, GeForce 256, GeForce 2 GTS, :oops: MX400 :oops: 460, 480, 560, 580 (even a 580 SLI of Matrix Platinum), 670, 770, 980 and lastly 1070.

I spent enough time with them since then :laugh: Nonetheless, it's down to personal point of view and luck, in my case, I guess :ohwell:

edit: as I said, I really loved the Real3D StarFighter i740, even though it was sh!t performance-wise in 2D and utterly weak in 3D (I had an 8 MB version), although drivers were fine with that one :oops:
and I will not talk about the S3, Permedia, PowerVR and 3dfx cards I owned.
 
Actually, if the AIBs do not buy them (in the news: https://www.techpowerup.com/297490/...opping-production-encountering-quality-issues ), then the product is not, indeed, selling, no? :laugh: (well, in China, does the A380 sell well? I wonder...)
My point is it is not about a single product, it's about Intel carving out a discrete GPU market. It is a journey, not a step.

As for that link, it looks more like a crypto hangover than anything else.

Now, how bad is being 13% behind RDNA1? Not great, to put it mildly, but not an unrecoverable catastrophe either. Especially given Mr. Koduri's involvement...

 
This engineer has no business in executive-level jobs.

Overpromising and underdelivering is not where you want to be at this level. It's where you absolutely should not be. That is the reason for 'Poor Vega' and now/soon Poor Arc.
Agreed with your point

I think we need to look at the big picture, long term. For an introductory product, Arc's performance is honestly not that bad, as every new product (with a new architecture) needs time to flourish and become stable. I think Intel can make an impact in the dGPU market in the coming years.

Slowly it will get there!
 
It basically took AMD 7 years between the Radeon 7000 series and RDNA2 (not coincidentally their PS4 and PS5 architecture) to fix their graphics division. Intel needs to be committed to 7 years of products also.

It took more than 7 years from Bulldozer to Ryzen. Intel needs to be in it for the long haul. I don't think the fact their first product is bad is that surprising.
Context matters.

AMD was a financially starving company with shrinking market share.

At some point (and it certainly wasn't 7 years) they bet on Zen architecture and then it delivered.

Once the GPU division got proper funding, IN A MERE YEAR AMD was able to hit back with RDNA1, which was a major leap forward, essentially closing the perf/mm² and perf/watt gaps vs team green. One more year, and RDNA2 disrupted the NV lineup, forcing Huang to drop a tier (that is why we got a 10 GB 3080 and 12 GB on lower-end cards). The only reason we haven't seen that havoc in NV's financials is the crypto bubble.

So, bottom line is, it doesn't take 7 years for a competent GPU manufacturer to roll out serious competition.

Intel, of course, rather lacks the respective experience, but... perhaps 13% worse than RDNA1 for a first product is not that bad.
 
My point is it is not about a single product, it's about Intel carving out a discrete GPU market. It is a journey, not a step.

As for that link, it looks more like a crypto hangover than anything else.

Now, how bad is being 13% behind RDNA1? Not great, to put it mildly, but not an unrecoverable catastrophe either. Especially given Mr. Koduri's involvement...

Oh, do not misread me either; I want Intel to succeed, but because it's Intel, I find this recent news laughable, to say the least... it's not their first graphics card rodeo, yet they are making beginner mistakes :cry:

Once the GPU division got proper funding, IN A MERE YEAR AMD was able to hit back with RDNA1, which was a major leap forward, essentially closing the perf/mm² and perf/watt gaps vs team green. One more year, and RDNA2 disrupted the NV lineup, forcing Huang to drop a tier (that is why we got a 10 GB 3080 and 12 GB on lower-end cards). The only reason we haven't seen that havoc in NV's financials is the crypto bubble.

So, bottom line is, it doesn't take 7 years for a competent GPU manufacturer to roll out serious competition.

Intel, of course, rather lacks the respective experience, but... perhaps 13% worse than RDNA1 for a first product is not that bad.

AMD's GPU division got better after Raja left... that's also context... (and also why I did not go Vega, although after he left I did not get a GPU, since I was stuck with my 1070 due to the crypto craze :laugh: ). Yeah, Arc is not good, certainly not up to what Raja or anyone at Intel said (remember the "Intel is promising to deliver loads of GPUs to gamers"?), and they are just behind the current gen (or the previous gen in some cases) while AMD and Nvidia are literally readying next gens with a sizable jump in perf from the current gen (well, I am happy with my RX 6700 XT; its performance will not go away with the RTX 40XX and RX 7X00).

In short, what happened to both enterprises? Raja happened :laugh: (kinda sad, though...)
 