
Intel's AI PC Chips Underperform, "Raptor Lake" Demand Sparks Shortages

AleksandarK

News Editor
Intel's latest AI-focused laptop processors, "Lunar Lake" and "Meteor Lake," have encountered slower-than-anticipated uptake, leading device manufacturers to increase orders for the previous-generation "Raptor Lake" chips. As a result, Intel 7 manufacturing lines, originally intended to scale up production of its newest AI-ready CPUs and transition to newer nodes, are now running at full capacity on "Raptor Lake" output, limiting the availability of both the new and legacy models. In its first-quarter 2025 financial report, Intel recorded revenue of $12.7 billion, essentially flat year-over-year, and a net loss of $821 million. The results fell short of the industry's expectations, and the company's stock declined by more than 5% in after-hours trading.

Management attributed the shortfall to cautious buying patterns among OEMs, who seek to manage inventory in light of ongoing US-China tariff discussions, and to consumer hesitancy to pay higher prices for AI-enabled features that are still emerging in mainstream applications. CEO Lip-Bu Tan outlined plans to reduce operating expenses by $500 million and lower capital expenditures by approximately $2 billion to address these challenges in 2025. He also confirmed that workforce reductions are planned for the second quarter, though specific figures were not disclosed. Looking ahead, Intel intends to focus on strengthening its data-center business, where demand for Xeon processors remains robust, and to prepare for the late-2025 introduction of its Panther Lake platform. The company will also continue efforts to encourage software development that leverages on-device AI, aiming to support wider adoption of its AI-capable hardware.



View at TechPowerUp Main Site | Source
 
Of course. Overpriced and pointless chips. We can use a browser for AI.
 
That's good and all.

But it doesn't take advantage of all those loose LGA 1700 wallets that would buy a 10 or 12 P-core-only chip in a heartbeat. Even without Hyper-Threading.

Did AMD not make millions in selling new AM4 chips for old hardware?
 
"and to consumer hesitancy to pay higher prices for AI-enabled features that are still emerging in mainstream applications."

That's an understatement. Despite all the marketing, there are basically no locally running AI applications worth mentioning.
 
That's good and all.

But it doesn't take advantage of all those loose LGA 1700 wallets that would buy a 10 or 12 P-core-only chip in a heartbeat. Even without Hyper-Threading.

Did AMD not make millions in selling new AM4 chips for old hardware?
Same boat here, still waiting for that supposed Bartlett Lake. It's either that or I finally jump ship to AM5; my 12th Gen CPU is getting old fast.
 
PCGH also wrote about the workforce reduction. Intel is most likely living off government money from several different countries by now.
 
Wait, you're telling me that people would rather buy CPUs that are focused on tasks they actually do than AI marketing garbage they can't even use? I'm shocked /s
 
If they had AI-related hardware worth using, users, or at the very least enthusiasts, would flock to them. I know I would, if they offered something with at least 128 GB of RAM, 500 GB/s of bandwidth, and 200 TOPS or more in FP16, at affordable prices.
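For a sense of why that 500 GB/s figure is the number that matters: batch-1 LLM decoding is memory-bandwidth-bound, so the upper bound on tokens per second is roughly bandwidth divided by model size. A quick back-of-envelope sketch in Python (the model sizes are illustrative assumptions, not measurements):

```python
# Rough upper bound for local LLM decoding: the whole model is streamed
# from RAM for every token, so tokens/s <= bandwidth / model size.
# Model sizes below are illustrative assumptions.

BANDWIDTH_GB_S = 500  # the wished-for memory bandwidth

models = {
    "8B @ FP16 (~16 GB)": 16,
    "70B @ 4-bit (~40 GB)": 40,
    "70B @ FP16 (~140 GB)": 140,  # wouldn't even fit in 128 GB of RAM
}

for name, size_gb in models.items():
    print(f"{name}: <= {BANDWIDTH_GB_S / size_gb:.1f} tokens/s")
```

Even at 500 GB/s, a big FP16 model decodes no faster than it can be read from memory, which is why local-AI enthusiasts chase GPU-class bandwidth rather than TOPS.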
 
Of course. Overpriced and pointless chips. We can use a browser for AI.
Imo, it's not that it's pointless, but mismarketed. The simple fact that you mentioned the cloud is proof that the marketing failed. Those NPUs are kind of weak in the grand scheme of things, and not meant to run stuff that would require cloud power. Their strength lies in low latency (for stuff like real-time video/audio processing) and energy efficiency. It was never about raw power. It's complementary to the cloud, not a replacement.

The browser can't show me all the pictures stored locally that contain a specific item or word just by searching a query. The browser will ask me for an additional subscription for denoising a picture. The browser isn't going to conceal what the place I'm in looks like by replacing the background when I'm doing a video call. (The last few health specialists I've consulted over video call used that feature. Very useful when things get hectic and your place looks like a mess.) And yes, you can run that on the GPU, but then it's going to drain your battery.

Their issue is thinking that it's worth a premium, and the CPUs with that functionality also happen to be slower than RPL. Local AI should just be a given. That's how phones operate, and that's why local AI on phones is much more developed than on the desktop. Samsung, Google, and Apple never tried to sell an "AI edition" of their phones for a premium; it's just something you get when you buy any of their recent phones. When Apple (yes, them of all people) started to sell their M-series laptops with the NPU, they were cheaper than the NPU-less laptops they replaced. Meanwhile, the PC industry tried to do the opposite.
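Those low-latency, always-on tasks are exactly what the vendor runtimes expose the NPU for. As a minimal sketch (not a full app), this is roughly what offloading a model to the NPU looks like with Intel's OpenVINO Python API; "segmentation.xml" is a hypothetical placeholder for a converted background-segmentation model, and the code falls back to the CPU when no NPU is present:

```python
# Minimal sketch: run a (hypothetical) background-segmentation model on the
# NPU via OpenVINO, falling back to the CPU if no NPU device is available.
import numpy as np
import openvino as ov

core = ov.Core()
device = "NPU" if "NPU" in core.available_devices else "CPU"

# "segmentation.xml" is a placeholder for any model converted to OpenVINO IR.
compiled = core.compile_model("segmentation.xml", device_name=device)

frame = np.zeros((1, 3, 256, 256), dtype=np.float32)  # stand-in camera frame
mask = compiled(frame)[compiled.output(0)]            # person/background mask
print(f"Ran on {device}; mask shape: {mask.shape}")
```

The point of the NPU path is that a loop like this can run on every camera frame for hours without spinning up the GPU or draining the battery.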
 
That's good and all.

But it doesn't take advantage of all those loose LGA 1700 wallets that would buy a 10 or 12 P-core-only chip in a heartbeat. Even without Hyper-Threading.

Did AMD not make millions in selling new AM4 chips for old hardware?
LGA 1700 is way more upgradable than LGA 1851, and that's kinda funny.
 
NPUs have been called "dark silicon" for a reason. Microsoft was even quick to add an NPU graph to Task Manager, yet that graph never moves, because of the lack of apps that even use it.

At the end of the day, it's all big marketing hogwash that bets on the ignorance of consumers.
 
Same boat here, still waiting for that supposed Bartlett Lake. It's either that or I finally jump ship to AM5; my 12th Gen CPU is getting old fast.

It's actually not getting old fast... at all. 13th and 14th Gen are essentially the same chips with a bit of an overclock, and the 200 series is just crap and often inferior in gaming, sooo yeah.
 
"and to consumer hesitancy to pay higher prices for AI-enabled features that are still emerging in mainstream applications."

That's an understatement. Despite all the marketing, there are basically no locally running AI applications worth mentioning.
There are plenty of people dabbling in local AI for image generation or custom chatbots, but those people need way more power than these NPUs provide, and instead are going with discrete cards (or large iGPUs, like on higher-end Macs or the upcoming Strix Halo) to run their stuff. I think the NPUs going into current CPUs are in a really awkward middle place. There are basically two kinds of AI tasks a person would want to do: small tricks like photo filters or webcam background blurring that only need a small amount of processing power, and local image/text generation that can use every bit of processing power you can toss at it. The NPUs Intel (and AMD) are putting into their current CPUs are overkill for the first set of tasks, but simply not good enough for the second. The second set also runs just fine on GPU hardware, which of course also does 3D graphics. We would be better off with much smaller NPUs on the processor and the saved die space spent on a bigger iGPU, since the iGPU can do both games and the heavier AI stuff.
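That "awkward middle" is easy to put rough numbers on. A back-of-envelope sketch with assumed, order-of-magnitude workload figures (roughly 10 GFLOPs per frame for webcam segmentation, a few TFLOPs per diffusion step for image generation):

```python
# Back-of-envelope: the two task classes vs. a ~45 TOPS laptop NPU.
# All workload figures are assumed orders of magnitude, not measurements.

NPU_OPS = 45e12  # ~45 TOPS, Copilot+-class NPU

# Small trick: webcam background blur, ~10 GFLOPs/frame at 30 fps.
blur_ops = 10e9 * 30
print(f"Background blur: {blur_ops / NPU_OPS:.1%} of the NPU")

# Heavy task: one image, ~3 TFLOPs/step x 30 diffusion steps.
gen_ops = 3e12 * 30
print(f"Image generation: ~{gen_ops / NPU_OPS:.0f} s/image at peak "
      "(real utilization is far lower)")
```

One task class barely registers on the hardware; the other needs seconds per image even at a peak throughput the NPU never sustains, which is exactly the mismatch described above.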
 
Meteor Lake and Arrow Lake are failed products. They failed because Intel was trying to do too much at the same time.

Ditch the NPU tile and the hybrid cores on the compute tile, and get that configuration running at levels comparable to the previous generation. And when that's done, try something new, like some accelerator for who knows what.
 
Ditch the NPU tile

They can't; Microsoft is all in on the AI bullshit train. The NPU needs to be there so the computer can be a Copilot+ PC, as meaningless as that is, so any CPU going forward will need to have this wasted die space, even if there are better solutions (i.e. Strix Halo, with its massive iGPU, still has the silly NPU).
 