
Intel to Move Select Chipset Fabrication Back to 22nm in Wake of 14 nm Silicon Constraints

To be honest, I don't know why they moved to 14nm in the first place. All pre-Z370 chipsets were 22nm or older (yes, including X299 and C422) and it's not like they're low on features or anything.
Since they have to, it's that simple. They were forced to do so, by law. That's the result of the California Energy Commission's 2019 regulations. So they literally have to move those to 14nm – since their chipsets ain't energy efficient enough when being fabbed on 22nm.

Thing is, speaking of happily shooting yourself in the foot:
To my knowledge, Intel happily lobbied for that very law, the one demanding that computer peripheral components be energy-efficient, and was actively involved in shaping those regulations. That's why they proudly announced it in the first place. So their own actions are now coming back to bite them.

Did I mention that I love boomerangs?? Their ways are so friggin' predictable!


The year is two thousand and eighteen, and we're witnessing Intel shoot itself in the face …
 
The Intel soap just keeps on giving.

Hilarious
 
It's H310, who gives a crap?
 
Since they have to, it's that simple. They were forced to do so, by law. That's the result of the California Energy Commission's 2019 regulations. So they literally have to move those to 14nm – since their chipsets ain't energy efficient enough when being fabbed on 22nm.
So, if their 22nm chipsets aren't efficient enough to be sold in the US, then please tell me how Z370 or X299 do just fine on 22nm? Or why didn't this energy commission, or the EPA, or Greenpeace, or antarctic penguins sweep up all of those uber-inefficient Ryzen motherboards with 55nm chipsets? Maybe they'll go after 65+nm SuperIO ICs next time, because they are so-o-o inefficient, regardless of their sub-1W package... You are talking absolute nonsense.

The only logical explanation is that Intel has decided to try out 14nm lithography for chipsets on the lowest-of-the-low H310, so if something goes wrong, replacing a $50-60 motherboard will not create as much fuss or outrage as a high number of faulty/defective high-end Z370 boards, or B360-based enterprise PCs. In most cases it may even go unnoticed, because "cheap stuff breaks".
 
I’m guessing they moved chipsets to 14nm on regular schedule, with the assumption that 10nm would be ready by now.
Possibly. Originally, they planned to be doing most CPUs on 10nm at this point. And they would have unused 14nm potential for everything less complex.
Well... they didn't, so we end up with this strange shift in production. It doesn't look good, but it's fairly neutral.
On the other hand, the reasons why this is happening are very positive (apart from 10nm delay, obviously). Intel is selling a lot of stuff, they are getting new contracts.
Reaction on gaming sites/forums is just hilarious.
This is the longest Intel node delay that I can recall, and I’ve been following more intently since the Pentium III era.
It's pretty obvious by now that everyone has issues with 7/10nm. One of the most advanced semiconductor makers has just surrendered.
Everyone's working on a solution to make this idea profitable. Most likely everyone will announce readiness at more or less the same time.
Intel's delay will be the longest simply because they were the most advanced in the early development period, i.e. were first.

And I'm known to be an Intel fanboy, so I'll also say that Intel will be the first to launch a 10nm x86 CPU. Of course, apart from the existing mobile i3 available since May. :p
 
This isn't necessarily A Bad Thing. I personally feel that chip manufacturers move on to the next die shrink a little too quickly instead of refining their parts on the current process; this applies to both Intel and AMD at this rate.
 
To add to @londiste's reply, Intel makes a lot of other chips. They have (and correct me if I'm wrong):
  • CPUs
  • Chipsets
  • Wired network controllers
  • Wireless network controllers
  • SSDs
  • Thunderbolt controllers
  • Probably even more
I'm sure at least one of those types of chips is currently overstocked and built on 22nm, so they have the tooling set up and demand low enough to switch manufacturing over for a while to cope.
SSDs don't feature in this debate, they're not even on 14nm; neither do current-gen CPUs, which they cannot possibly afford to move back to 22nm. Everything else is probably fair game.
I’m guessing they moved chipsets to 14nm on regular schedule, with the assumption that 10nm would be ready by now. It’s possible that Ryzen forced their hand into adding more cores, but without a node shrink, which hampers volume. This is the longest Intel node delay that I can recall, and I’ve been following more intently since the Pentium III era. These product lines are so long in development, there’s no way Intel could have planned for this much trouble. Also factor in that some lines have to be retooled to 10nm by now, but they aren’t producing anything profitable yet. It’s a double-whammy at this point. Bigger chips, less manufacturing capacity.
Tbh I don't remember any of the latest gen chipsets being mentioned as 14nm, is there any documentation for this?
Yup, I'd be surprised if any of the chipsets are 14nm, even Z390 :rolleyes:
 
I wouldn't worry too much about Intel. Look at MS with the Windows 8 fiasco; they simply have way too much cash to burn. They'll all be alright.
 
Possibly. Originally, they planned to be doing most CPUs on 10nm at this point. And they would have unused 14nm potential for everything less complex.
Well... they didn't, so we end up with this strange shift in production. It doesn't look good, but it's fairly neutral.
On the other hand, the reasons why this is happening are very positive (apart from 10nm delay, obviously). Intel is selling a lot of stuff, they are getting new contracts.
Reaction on gaming sites/forums is just hilarious.

It's pretty obvious by now that everyone has issues with 7/10nm. One of the most advanced semiconductor makers has just surrendered.
Everyone's working on a solution to make this idea profitable. Most likely everyone will announce readiness at more or less the same time.
Intel's delay will be the longest simply because they were the most advanced in the early development period, i.e. were first.

And I'm known to be an Intel fanboy, so I'll also say that Intel will be the first to launch a 10nm x86 CPU. Of course, apart from the existing mobile i3 available since May. :p
There is at least one 7nm product out there, in volume. Apple’s A12 appears to have no trouble performing if the 4K to 1080p conversion result I saw was any guide.

Intel was really able to save face and increase demand by making the most significant CPU performance improvements I can recall in a long time. The i3 is now a quad core, the i5 is 6-core, and the i7 is 6C/12T. The whole stack got 2 more cores. Before that, the refreshes were pretty anemic, making upgrades largely unnecessary.
 
Tbh I don't remember any of the latest gen chipsets being mentioned as 14nm, is there any documentation for this?
Yup, I'd be surprised if any of the chipsets are 14nm, even Z390 :rolleyes:
All 300-series chipsets apart from Z370 are 14nm (i.e. all released in 2018).
 
So, if their 22nm chipsets aren't efficient enough to be sold in the US, then please tell me how Z370 or X299 do just fine on 22nm? Or why didn't this energy commission, or the EPA, or Greenpeace, or antarctic penguins sweep up all of those uber-inefficient Ryzen motherboards with 55nm chipsets? Maybe they'll go after 65+nm SuperIO ICs next time, because they are so-o-o inefficient, regardless of their sub-1W package... You are talking absolute nonsense.

The only logical explanation is that Intel has decided to try out 14nm lithography for chipsets on the lowest-of-the-low H310, so if something goes wrong, replacing a $50-60 motherboard will not create as much fuss or outrage as a high number of faulty/defective high-end Z370 boards, or B360-based enterprise PCs. In most cases it may even go unnoticed, because "cheap stuff breaks".

Doesn't matter what you say, we're talking about Intel here, and Intel is bad, nothing will change that.



/s
 
To be honest, I don't know why they moved to 14nm in the first place. All pre-Z370 chipsets were 22nm or older (yes, including X299 and C422) and it's not like they're low on features or anything.

Profit. It is cheaper to manufacture H310 on 14nm than on 22nm; you can fit more dies per wafer.
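For anyone curious about the dies-per-wafer point, here's a rough back-of-the-envelope sketch in Python using the common gross-dies-per-wafer approximation. The die areas are made-up illustrative numbers (the real H310 die sizes aren't public), so treat it as a sketch of the argument, not Intel's actual math.

import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    # Gross dies per wafer: wafer area / die area, minus an edge-loss term.
    # Ignores scribe lines and yield entirely.
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# Hypothetical chipset die: ~47 mm^2 on 22nm vs ~25 mm^2 after a 14nm shrink.
for node, area in (("22nm", 47.0), ("14nm", 25.0)):
    print(node, "->", dies_per_wafer(area), "gross dies per 300 mm wafer")

With those guessed areas you get roughly 1,400 vs roughly 2,700 candidate dies per wafer, which is the whole "more dies per wafer" argument in a nutshell.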

Since they have to, it's that simple. They were forced to do so, by law. That's the result of the California Energy Commission's 2019 regulations. So they literally have to move those to 14nm – since their chipsets ain't energy efficient enough when being fabbed on 22nm.

Those standards have nothing to do with moving H310 from 22nm to 14nm. The difference in power consumption, especially in idle/sleep/off, is next to nothing, and that is literally all that standard cares about.

The 14nm H310 is only 6 W; the 22nm version might be 10 W. Not enough to make any real difference, and certainly not enough to throw off power regulations.
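And to put that 6 W vs 10 W gap in perspective, a quick worst-case sanity check (assuming the difference were a constant 4 W around the clock, which no chipset ever comes close to):

# Worst-case yearly energy gap between a hypothetical 10 W (22nm) and 6 W (14nm)
# chipset at constant full load, 24/7. Real-world idle draw is far lower.
HOURS_PER_YEAR = 24 * 365
delta_kwh = (10 - 6) * HOURS_PER_YEAR / 1000
print(f"~{delta_kwh:.0f} kWh per year at constant full load")  # ~35 kWh

Call it about 35 kWh a year at the absolute extreme, and far less in practice, which is why a node change like this barely registers against any efficiency regulation.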
 
Profit. It is cheaper to manufacture H310 on 14nm than on 22nm; you can fit more dies per wafer.
But switching has its own costs.
It's obvious that they planned to move CPUs to 10nm and chipsets to 14nm. But the fact that they did the latter even when the former was delayed is somewhat odd.
On the other hand, it's still only H310, so neither the best-selling nor the most important chipset. They haven't done the same with the best-selling H/Q370 yet (both desktop and mobile), so IMO the transition to 10nm is still going relatively well (compared to what we read online ;-)).
 
But switching has its own costs.
It's obvious that they planned to move CPUs to 10nm and chipsets to 14nm. But the fact that they did the latter even when the former was delayed is somewhat odd.
On the other hand, it's still only H310, so neither the best-selling nor the most important chipset. They haven't done the same with the best-selling H/Q370 yet (both desktop and mobile), so IMO the transition to 10nm is still going relatively well (compared to what we read online ;-)).

Well, being the smallest chipset, it is the easiest to transition to 14nm from 22nm.
Plus, I bet they actually sell more H310 chipsets than any others. Those cheap pre-built computers that fly off the shelves because they are cheap almost always use the lowest-end chipset.
 
Huge LOL on the article and most of the comments. Intel is so bad, moving back to old 22nm fabs!
[…]
Remember to always mention Intel's CPU security problems! :-D
They actually are pretty serious tho.

What happens in real life:
1) Intel is moving chosen plants to 10nm,
2) They have a huge order portfolio, so they had to shuffle production sites a bit,
3) 22nm fabs will make H310C - the cheapest and simplest chipset in the lineup.
Your order books could be full and new orders being written all over the place.
Worth exactly nothing if your tools are broken!


The last couple of months really feel somewhat awkward. It's like with every new headline 'bout Intel you're thinking "Guys, stop it, he's already dead!" … Then you have to read the next one, stating that "things get worse for Intel …" and you're just asking yourself "Well, how much worse can it even get?!"

It feels really, really strange these days …
Did karma actually buy an Itanium CPU back then? A Larrabee card by any chance? Or some Optane SSDs? What a bitch!
 
But switching has its own costs.
It's obvious that they planned to move CPUs to 10nm and chipsets to 14nm. But the fact that they did the latter even when the former was delayed is somewhat odd.
On the other hand, it's still only H310, so neither the best-selling nor the most important chipset. They haven't done the same with the best-selling H/Q370 yet (both desktop and mobile), so IMO the transition to 10nm is still going relatively well (compared to what we read online ;-)).

Intel is a giant company with big roadmaps. The plan to move a chipset to a smaller node was probably put into motion years ago, with design decisions made and silicon taped out and tested. The chipset group is also very likely on its own timetable and siloed from the CPU group. I’m sure a reasonable amount of node-transition trouble is anticipated, but not 14nm+++ kind of trouble!
 
Agree. Time to close all the factories and call it a day. Intel can't even make a profit anymore. They're done.
That is easily one of the most misguided, factless and plain silly comments I've read in a while. Intel has certainly had a few challenges, but only a clueless fool would think it's time for them to pack it in and close up shop. Rubbish thinking at its best.
 
That is easily one of the most misguided, factless and plain silly comments I've read in a while. Intel has certainly had a few challenges, but only a clueless fool would think it's time for them to pack it in and close up shop. Rubbish thinking at its best.

Wow, seeing something like that from him confuses the heck out of me.
 
Right... though I still can't see how it's feasible to move to an older node, I doubt it means the end of the world for Intel. While dealing with security issues (like everyone else) and scurrying about to produce a product that competes with AMD's offerings, they're still topping the charts.

So... how does one move to an older node, then? Did they just pack up the 22nm machine and put it away? I would imagine 14nm, 22nm etc "tools" are merely part of a larger machine/process that produces these things, so it wouldn't make much sense to swap out 14nm for 22nm, unless I'm grossly misunderstanding something, which is very well possible... I'm no fab expert, nor have I ever even been near one.
 
So... how does one move to an older node, then? Did they just pack up the 22nm machine and put it away? I would imagine 14nm, 22nm etc "tools" are merely part of a larger machine/process that produces these things, so it wouldn't make much sense to swap out 14nm for 22nm, unless I'm grossly misunderstanding something, which is very well possible... I'm no fab expert, nor have I ever even been near one.
I have seen those systems (calling them systems because each is a process that involves a whole series of machines), and the process is actually surprisingly modular, though it's still an involved task to move from one process node to another.
 