Monday, November 26th 2018

14nm 6th Time Over: Intel Readies 10-core "Comet Lake" Die to Preempt "Zen 2" AM4

Had Intel's now-defunct "tick-tock" product development cadence held its ground, the 14 nm silicon fabrication node would have hosted just two micro-architectures: "Broadwell," an incrementally improved optical shrink of 22 nm "Haswell," and "Skylake," a newer micro-architecture built on a by-then more mature 14 nm node. Intel's silicon fabrication node advancement went off the rails in 2015-16, and 14 nm would go on to be the basis for three more "generations": the 7th generation "Kaby Lake," the 8th generation "Coffee Lake," and the 9th generation "Coffee Lake Refresh." The latter two saw Intel increase core-counts after AMD broke its slumber. It turns out that Intel won't let the 8-core "Coffee Lake Refresh" die pull the weight of Intel's competitiveness and prestige through 2019, and is planning yet another stopgap, codenamed "Comet Lake."

Intel's next silicon fabrication node, 10 nm, only takes off toward the end of 2019, and AMD is expected to launch its 7 nm "Zen 2" architecture much sooner than that (it debuts in December 2018). Intel probably fears AMD could launch client-segment "Zen 2" processors before Intel's first 10 nm client-segment products, to cash in on its competitive edge, and is looking to blunt that with "Comet Lake." Designed for the LGA115x mainstream-desktop platform, "Comet Lake" is a 10-core processor die built on 14 nm, and could be the foundation of the 10th generation Core processor family. It's unlikely that the underlying core design has changed from "Skylake" (circa 2016); it could retain the same cache hierarchy, with 256 KB of L2 cache per core and 20 MB of shared L3 cache. All is not rosy in the AMD camp, either. The first AMD 7 nm processors will target the enterprise segment rather than clients, and CEO Lisa Su has been evasive on her quarterly earnings calls about when the first 7 nm client-segment products could come out. There was some chatter in September of a "Zen+" based 10-core socket AM4 product leading up to them.
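If the rumored configuration pans out (all figures here come from the leak above, not confirmed specs), the cache arithmetic is easy to sanity-check:

```python
# Sanity check of the rumored "Comet Lake" cache hierarchy (leaked, unconfirmed):
# 10 cores, 256 KB of private L2 per core, 20 MB of shared L3.
cores = 10
l2_per_core_kb = 256
l3_total_mb = 20

total_l2_kb = cores * l2_per_core_kb   # total private L2 across the die
l3_per_core_mb = l3_total_mb / cores   # L3 slice per core, Skylake-style

print(f"Total L2: {total_l2_kb} KB")         # 2560 KB
print(f"L3 per core: {l3_per_core_mb} MB")   # 2.0 MB per core
```

The 2 MB-of-L3-per-core ratio matches existing Skylake-derived client parts, which is consistent with the guess that the underlying core design is unchanged.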
Source: HotHardware

123 Comments on 14nm 6th Time Over: Intel Readies 10-core "Comet Lake" Die to Preempt "Zen 2" AM4

#26
Basard
Manu_PT said:
I'm on this hardware stuff for almost 2 decades and I've never seen high-end PC components being so premium.
I'll admit that GPUs have never cost so much. But CPUs regularly cost over $500 when they are released. FFS the FIRST AMD "FX" chips (back in the Athlon 64 days) cost over $900.
Posted on Reply
#27
lanlagger
Base clock will be 3.2 GHz to hit that 95 W TDP?
Posted on Reply
#29
Vayra86
m4dn355 said:
How come Intel's lakes never dry up?!


Their water chiller's leaking into it.

(Man, it's so easy these days :D)
Posted on Reply
#30
bug
lanlagger said:
Base clock will be 3.2 GHz to hit that 95 W TDP?
Lower base clock in exchange for more cores is nothing new. However, all-core turbo is where it's at.
That said, 10 cores within 95 W would be quite a feat; I expect these to need a little more than that.
Posted on Reply
#31
Vayra86
bug said:
Lower base clock in exchange for more cores is nothing new. However, all-core turbo is where it's at.
That said, 10 cores within 95 W would be quite a feat; I expect these to need a little more than that.
Even their six-core parts royally boost beyond the stated TDPs, so yeah, 'just a little' :P
Posted on Reply
#32
Joss
Intel Readies 10-core "Comet Lake" Die to Preempt "Zen 2" AM4
Intel, as things are, cannot preempt AMD; they're playing defense.
Posted on Reply
#33
bajs11
Considering the 8-core/16-thread i9-9900K costs over 600 bucks, I don't want to know how much they'll charge for that refresh-refresh-refresh 14 nm 10-core CPU.
Posted on Reply
#34
bug
Vayra86 said:
Even their six-core parts royally boost beyond the stated TDPs, so yeah, 'just a little' :p
TDP is specified at base clocks; let's not have this discussion again. We're already off topic.
Posted on Reply
#35
windwhirl
Comet Lake? Really? Well, I hope it doesn't crash right on top of them...

Basard said:
I'll admit that GPUs have never cost so much. But CPUs regularly cost over $500 when they are released. FFS the FIRST AMD "FX" chips (back in the Athlon 64 days) cost over $900.
Official launch prices for the Core i7-x7xx-K models had mostly been around $350, at least... Although before AMD woke up, if you wanted 6 cores or more, you had to be prepared to pay quite a bit more, usually $999 for the CPU...

bug said:
The problem I see here is these don't work on current motherboards. And if you buy one of these, you know Ice Lake is (eventually) coming and will require yet another motherboard. The CPUs themselves are fine. Everything around them is not, however.
How long has it been since Ice Lake appeared in roadmaps and stuff? It feels like it should have been released like years ago...
Posted on Reply
#36
bug
windwhirl said:
How long has it been since Ice Lake appeared in roadmaps and stuff? It feels like it should have been released like years ago...
2-3 years. I lost count.
Posted on Reply
#37
Basard
windwhirl said:
Although before AMD woke up, if you wanted 6 cores or more, you had to be prepared to pay quite a bit more, usually $999 for the CPU...
Right, and AMD did the same when they had superior CPUs in the past. Even Intel had a model between $350 and $999 during i7's glory days.
Posted on Reply
#38
bug
Basard said:
Right, and AMD did the same when they had superior CPUs in the past. Even Intel had a model between $350 and $999 during i7's glory days.
I'm still not sure why this gets brought up so often.
You lead by a sizeable margin, you get to charge a premium. That's how business works. End of story.
Posted on Reply
#39
kings
windwhirl said:
Although before AMD woke up, if you wanted 6 cores or more, you had to be prepared to pay quite a bit more, usually $999 for the CPU...
It's not entirely true; my six-core i7-5820K cost me €350 (roughly $350) in 2014.
Posted on Reply
#40
Manu_PT
M2B said:
Oh look, oh look, this guy can't even look back to 144 Hz but can enjoy games at sub-30 FPS on consoles. Bad liar.
Say what? <3
Vayra86 said:
News flash, that 500 dollar CPU was never a requirement to play games. Not even for your magical 240 Hz - that is just a surrealistic FPS target consisting of 40% marketing and 60% placebo that you've fallen for. Evidence in the fact 'you can't look back to 144 Hz'... it's sad and funny all at the same time. You have no idea. Just because a number's higher doesn't mean it means anything. It's the same as with 4K monitors and how half the gamers play on them at a downscaled 1080p because 'woopsie, games do really get tougher on my GPU over time'. There is always going to be something to blow insane amounts of money on, with questionable benefits. Wake up... you can still easily game on a reasonable budget.
You remind me of those guys who once said "the human eye can't see more than 24 fps". And also those who said 120 Hz was useless because no one would ever need more than 60 Hz, and so on... I still have my 144 Hz LG and I do a lot of tests between it and my Dell AW2518, and the difference is HUGE, to the point that I can now see the mouse pointer lagging on the Windows desktop. You can live in denial if you want. 240 Hz is objectively superior to 144 Hz and everyone can notice it. And if you use 240 Hz for a long time, you will not accept 144 Hz. Also, you don't need 240 fps to take advantage of 240 Hz. Research on the Blur Busters forum and website and read the technical analysis on the subject by Chief.
Posted on Reply
#41
Vayra86
Manu_PT said:
Say what? :love:
You remind me of those guys who once said "the human eye can't see more than 24 fps". And also those who said 120 Hz was useless because no one would ever need more than 60 Hz, and so on... I still have my 144 Hz LG and I do a lot of tests between it and my Dell AW2518, and the difference is HUGE, to the point that I can now see the mouse pointer lagging on the Windows desktop. You can live in denial if you want. 240 Hz is objectively superior to 144 Hz and everyone can notice it. And if you use 240 Hz for a long time, you will not accept 144 Hz. Also, you don't need 240 fps to take advantage of 240 Hz. Research on the Blur Busters forum and website and read the technical analysis on the subject by Chief.
Cool story, but your point was that gaming is unaffordable these days, and then you bring in surreal demands in terms of FPS targets that less than 1% of the audience cares about. The vast majority of games cannot even keep half that framerate on the fastest CPU. You're comparing ultra-high-end demands to mid-range price points and then saying 'it's not fair everything costs so much' and/or 'why can't Intel make a faster CPU'.

So yeah, it's quite hilarious to see you didn't get that memo right here. So great, you can now notice your mouse pointer lagging on the desktop at 144 Hz. I don't. Whose experience is really better now? :) Diminishing returns is a thing, look it up. 24 fps or 60 or 144 or 240 - it's quite a stretch. There will always be a next best thing; that doesn't mean it's something you'd need.

You know I'm on a 120 Hz panel too. I'm sure that a higher refresh rate will still be smoother, but the investment versus the payoff simply isn't worth it, and consistency suffers. It's better to have a slightly lower but fixed FPS/refresh than 'as high as possible' while only hitting it rarely. Why? So I can use my strobing backlight, which helps motion clarity far more than even 480 Hz would. I don't spend my days counting pixels or pushing my nose into my panel so I can spot the smoothness of my 240 Hz mouse pointer. I play games :)
Posted on Reply
#42
GlacierNine
bug said:
TDP is specified at base clocks; let's not have this discussion again. We're already off topic.
I love the way you agree completely with his statement but are so determined to defend Intel that you managed to make it combative and stupid.

Intel's shit runs hot these days. Way hotter than their marketing implies or even outright states it actually should. Quite why you're so married to the idea of minimising that is beyond me, but here we are again - another piece of news that makes Intel look terrible, and you're on deck with damage control.

Basard said:
Right, and AMD did the same when they had superior CPUs in the past. Even Intel had a model between $350 and $999 during i7's glory days.
Your point ignores that "back in the day" Intel did not split their products into "Mainstream" and "HEDT" segments.

The first chips Intel formally denoted as separate from the "Mainstream" platform were the Sandy Bridge-E chips, the 3960X specifically, on LGA2011 with an RRP of $1059. The 2700K was $339. Before that, the high-end chips existed on the same platform as the lower-end chips: the 980X was also $1059, but it sat in exactly the same motherboards as the i7-920, which was $305 and was a highly recommended budget/overclocker's chip for what we now refer to as "mainstream" users.

Since the inception of the i7 branding, RRPs for mainstream CPUs have gone from $305 to $499 (a 63.61% increase over 7 years), and "HEDT" processors have gone from $1059 to $1999 (an 88.76% increase over 7 years).

In the same 7-year period, the dollar has only inflated from $100 to $112.42, meaning that in all segments, Intel's price increases have outstripped inflation.
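The percentages quoted above can be double-checked in a few lines (the launch prices and inflation figure are the ones cited in this post, not independently sourced):

```python
# Verify the quoted price increases against the quoted inflation figure.
def pct_increase(old, new):
    """Percentage increase from old to new."""
    return (new - old) / old * 100

mainstream = pct_increase(305, 499)     # i7-920 RRP -> current mainstream flagship
hedt = pct_increase(1059, 1999)         # 980X/3960X RRP -> current HEDT flagship
inflation = pct_increase(100, 112.42)   # cumulative USD inflation over the period

print(f"Mainstream: +{mainstream:.2f}%")   # +63.61%
print(f"HEDT:       +{hedt:.2f}%")         # +88.76%
print(f"Inflation:  +{inflation:.2f}%")    # +12.42%
```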

On top of that, the end of Moore's law and the stagnation of core counts and clocks mean that while prices have been rising more quickly, performance has been rising more slowly.

In other words, for the last 7 years, Intel has been fucking us all and even Intel fanboys should be glad that AMD are finally stepping up to put pressure on them to be more competitive in all areas.
Posted on Reply
#43
Manu_PT
Vayra86 said:
Cool story, but your point was that gaming is unaffordable these days, and then you bring in surreal demands in terms of FPS targets that less than 1% of the audience cares about. The vast majority of games cannot even keep half that framerate on the fastest CPU. You're comparing ultra-high-end demands to mid-range price points and then saying 'it's not fair everything costs so much' and/or 'why can't Intel make a faster CPU'.

So yeah, it's quite hilarious to see you didn't get that memo right here. So great, you can now notice your mouse pointer lagging on the desktop at 144 Hz. I don't. Whose experience is really better now? :) Diminishing returns is a thing, look it up. 24 fps or 60 or 144 or 240 - it's quite a stretch. There will always be a next best thing; that doesn't mean it's something you'd need.

You know I'm on a 120 Hz panel too. I'm sure that a higher refresh rate will still be smoother, but the investment versus the payoff simply isn't worth it, and consistency suffers. It's better to have a slightly lower but fixed FPS/refresh than 'as high as possible' while only hitting it rarely. Why? So I can use my strobing backlight, which helps motion clarity far more than even 480 Hz would. I don't spend my days counting pixels or pushing my nose into my panel so I can spot the smoothness of my 240 Hz mouse pointer. I play games :)
Strobing is an awful technology that not only introduces flickering, which is bad for some eyes (including mine), but also adds double the input lag. I have strobing on my 240 Hz panel and I never use it. You don't get my point. It's not about having 240 fps! Having a 240 Hz panel and a framerate that fluctuates from 120 to 200 is already smoother than any 120 Hz/144 Hz panel, even with G-Sync/FreeSync, because the monitor can actually display every frame without needing to lock the framerate. The motion clarity is amazing, the response time is near 0.5 ms (as close as it can get to OLED or CRTs), and the input lag is half that of 120 Hz! Even if you run 60 fps at 240 Hz, it will have half the input lag compared to 120 Hz (just an example; I always try to run high framerates).
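For what it's worth, the scan-out arithmetic behind the "half the input lag" claim is straightforward (a sketch; this covers only the display-refresh portion of the latency chain, not pixel response or the game pipeline):

```python
# Refresh (scan-out) interval for common refresh rates: the display-side
# wait for the next refresh shrinks as the rate goes up.
def frame_interval_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz -> {frame_interval_ms(hz):.2f} ms per refresh")
# 240 Hz refreshes every ~4.17 ms versus ~8.33 ms at 120 Hz, so the
# display-refresh part of the latency halves even below 240 FPS.
```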

Having a 120-200 fps interval in games like Overwatch, Quake Champions, Dirty Bomb, Rainbow Six, Destiny 2 or Warframe, at medium/high settings at 1080p, doesn't require an unrealistic amount of juice! It requires a very good CPU for gaming, yes! Something like the 8600K at 4.8 GHz will do! Something Ryzen can't deliver, otherwise I would have one! Simple as that. You need to understand everyone has different needs. I love my 240 Hz setup and I don't want a CPU that will keep my minimum fps at 85 or 90, as you can see, for example, in today's BF V multiplayer analysis by Steve, with Ryzen vs Intel highlighted.

Now you can argue that Intel CPUs are so expensive right now that it is madness to buy them instead of a Ryzen, and I agree! 100%! But 1 year ago that wasn't the case.

One thing I can guarantee you: 240 Hz is not placebo, and it really improves your experience in every game that is not a 2D platformer or maybe an RTS! The persistence benefit is valid even if you can't get close to 240 fps; you don't need to. When you move your mouse relatively fast, even in a third-person RPG, 240 Hz at 1080p will provide better clarity and image quality in motion even compared to 4K 60 Hz, because there will be no distortion/smearing/ghosting. You should try it one day and then you'll agree with me.

And remember, 240 Hz monitors go for as low as €260 nowadays on European Amazon. That's not a premium price.
Posted on Reply
#44
TheinsanegamerN
Upgrayedd said:
Now just make a K version with no iGPU.. *keeps dreaming*
As always, what you want already exists; it's called HEDT. Now, why you would want to spend more money to lack an iGPU, when the Core i5s have proven the iGPU does NOT impede overclocks, is beyond me, but hey, the platform exists for people like you.

Stop whining and go buy one.
Posted on Reply
#45
eidairaman1
The Exiled Airman
Object55 said:
So if the 9900K wasn't hot enough, may I recommend "9590" for the 10-core model. Seems suitable for a chip made hot by squeezing every last bit of performance out of the current generation.

You can even reuse AMD's marketing with a few minor changes.


Considering my 8350 is 5.0 across all 8 cores...
Posted on Reply
#46
phanbuey
TheinsanegamerN said:
As always, what you want already exists; it's called HEDT. Now, why you would want to spend more money to lack an iGPU, when the Core i5s have proven the iGPU does NOT impede overclocks, is beyond me, but hey, the platform exists for people like you.

Stop whining and go buy one.
They're not releasing this on HEDT, and HEDT doesn't have the ring bus that games love.
Posted on Reply
#47
M2B
Manu_PT said:
Say what? :love:
I didn't mean you don't have the hardware; I meant someone who can't even look back to 144 Hz would not be able to enjoy games at 30 FPS on consoles. But it seems like you're in love with consoles & console games, based on your previous posts.
Posted on Reply
#48
R-T-B
M2B said:
I didn't mean you don't have the hardware; I meant someone who can't even look back to 144 Hz would not be able to enjoy games at 30 FPS on consoles. But it seems like you're in love with consoles & console games, based on your previous posts.
OT, but I'll bite... maybe he plays different games on consoles that don't require the same refresh rate to enjoy?

What he's saying isn't impossible.
Posted on Reply
#49
oxidized
Manu_PT said:
Say what? :love:

You remind me of those guys who once said "the human eye can't see more than 24 fps". And also those who said 120 Hz was useless because no one would ever need more than 60 Hz, and so on... I still have my 144 Hz LG and I do a lot of tests between it and my Dell AW2518, and the difference is HUGE, to the point that I can now see the mouse pointer lagging on the Windows desktop. You can live in denial if you want. 240 Hz is objectively superior to 144 Hz and everyone can notice it. And if you use 240 Hz for a long time, you will not accept 144 Hz. Also, you don't need 240 fps to take advantage of 240 Hz. Research on the Blur Busters forum and website and read the technical analysis on the subject by Chief.
You make zero sense, and besides, you got Vayra's post totally wrong.
Posted on Reply
#50
bug
TheinsanegamerN said:
As always, what you want already exists; it's called HEDT. Now, why you would want to spend more money to lack an iGPU, when the Core i5s have proven the iGPU does NOT impede overclocks, is beyond me, but hey, the platform exists for people like you.

Stop whining and go buy one.
The thing I have against IGPs is the die size they take. If it were a tiny thing that offered a fallback in case your real GPU craps out, I'd be fine with it. As it is, IGPs are monsters that take 33-50% of the die space, depending on configuration. And that translates into $$$ (just look at Turing).
Posted on Reply