Monday, November 26th 2018

14nm 6th Time Over: Intel Readies 10-core "Comet Lake" Die to Preempt "Zen 2" AM4

If Intel's now-defunct "tick-tock" product development cadence held its ground, the 14 nm silicon fabrication node should have seen just two micro-architectures, "Broadwell" and "Skylake," with "Broadwell" being an incrementally improved optical shrink of 22 nm "Haswell," and "Skylake" being a newer micro-architecture built on a then more matured 14 nm node. Intel's silicon fabrication node advancement went off the rails in 2015-16, and 14 nm would go on to be the base for three more "generations," including the 7th generation "Kaby Lake," the 8th generation "Coffee Lake," and 9th generation "Coffee Lake Refresh." The latter two saw Intel increase core-counts after AMD broke its slumber. It turns out that Intel won't let the 8-core "Coffee Lake Refresh" die pull the weight of Intel's competitiveness and prestige through 2019, and is planning yet another stopgap, codenamed "Comet Lake."

Intel's next silicon fabrication node, 10 nm, takes off only toward the end of 2019, and AMD is expected to launch its 7 nm "Zen 2" architecture much sooner than that (it debuts in December 2018). Intel probably fears AMD could launch client-segment "Zen 2" processors before Intel's first 10 nm client-segment products, to cash in on its competitive edge. Intel is looking to blunt that with "Comet Lake." Designed for the LGA115x mainstream-desktop platform, "Comet Lake" is a 10-core processor die built on 14 nm, and could be the foundation of the 10th generation Core processor family. It's unlikely that the underlying core design has changed from "Skylake" (circa 2016). It could retain the same cache hierarchy, with 256 KB of L2 cache per core and 20 MB of shared L3 cache. All is not rosy in the AMD camp, either. The first AMD 7 nm processors will target the enterprise segment and not the client segment, and CEO Lisa Su has been evasive in her quarterly earnings calls about when the first 7 nm client-segment products could come out. There was some chatter in September of a "Zen+" based 10-core socket AM4 product in the lead-up to them.
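The rumored cache figures are easy to sanity-check: Skylake-derived cores carry 256 KB of private L2 each, and if the shared L3 is built from the usual 2 MB-per-core slices (an assumption based on how Skylake-family ring dies are organized, not a confirmed spec), ten cores land exactly on the reported 20 MB:

```python
# Back-of-the-envelope cache totals for the rumored 10-core die,
# assuming the Skylake-style hierarchy described above:
# 256 KB private L2 per core, 2 MB shared L3 slice per core.
CORES = 10
L2_PER_CORE_KB = 256
L3_SLICE_PER_CORE_MB = 2

total_l2_mb = CORES * L2_PER_CORE_KB / 1024
total_l3_mb = CORES * L3_SLICE_PER_CORE_MB

print(total_l2_mb)  # 2.5 MB of L2 in total
print(total_l3_mb)  # 20 MB of shared L3, matching the rumored figure
```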
Source: HotHardware

123 Comments on 14nm 6th Time Over: Intel Readies 10-core "Comet Lake" Die to Preempt "Zen 2" AM4

#51
Vayra86
Manu_PT said:
Strobing is an awful technology that not only introduces flickering, which is bad for some eyes (including mine), but also adds double the input lag. I have strobing on my 240Hz panel and I never use it. You don't get my point. It's not about having 240fps! Having a 240Hz panel and a framerate that fluctuates from 120 to 200 is already smoother than any 120Hz/144Hz panel even with G-Sync/FreeSync, because the monitor can actually display every frame without needing to lock the framerate. The motion clarity is amazing, response time near 0.5 ms (as close as it gets to OLED or CRTs), and the input lag is half that of 120Hz! Even if you run 60fps at 240Hz it will have half the input lag compared to 120Hz (just an example, I always try to run high framerates).

Having a 120fps-200fps interval in games like Overwatch, Quake Champions, Dirty Bomb, Rainbow 6, Destiny 2 or Warframe, at medium/high settings at 1080p, doesn't require an unrealistic amount of juice! It requires a very good gaming CPU, yes! Something like the 8600K at 4.8GHz will do! Something Ryzen can't deliver, otherwise I would have one! Simple as that. You need to understand everyone has different needs. I love my 240Hz setup and I don't want a CPU that will keep my minimum fps at 85 or 90, as you can see, for example, in Steve's BF V multiplayer analysis today with Ryzen vs Intel highlighted.

Now you can argue that Intel CPUs are so expensive right now that it is madness to buy them instead of a Ryzen, and I agree! 100%! But 1 year ago that wasn't the case.

One thing I can guarantee you: 240Hz is not placebo and it really improves your experience in every game that is not a 2D platformer or maybe an RTS! The persistence benefit is valid even if you can't reach close to 240fps; you don't need to. When you move your mouse relatively fast, even in a 3rd-person RPG, 240Hz at 1080p will provide better clarity and image quality in motion, even compared to 4K 60Hz! Because there will be no distortion/smearing/ghosting. You should try it one day and then you'll agree with me.

And remember, 240Hz monitors go as low as 260€ nowadays on European Amazon. It's not a premium price.
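For what it's worth, the input-lag half of the quoted claim is plain refresh-interval arithmetic; here is a simplified sketch (this model deliberately ignores strobing, pixel response, and render latency, so it only captures the refresh-related component):

```python
# A display refreshes every 1/Hz seconds; on average, a finished frame
# waits about half a refresh interval before it starts scanning out.
def refresh_interval_ms(refresh_hz: float) -> float:
    """Time between display refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

def avg_scanout_delay_ms(refresh_hz: float) -> float:
    """Average wait for the next refresh: roughly half an interval."""
    return refresh_interval_ms(refresh_hz) / 2.0

print(refresh_interval_ms(240))  # ~4.17 ms between refreshes
print(refresh_interval_ms(120))  # ~8.33 ms between refreshes
# Doubling the refresh rate halves this component of input lag,
# regardless of the game's framerate.
```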
You still didn't get it. Re-read your first post and try again... It's fine that you want to be in the 240Hz niche. No problem at all, and if you feel it works wonders for you, all the better.

So you can score a cheap 260 EUR TN somewhere. Great... it has no relation to your price argument in terms of CPU prices and how all of a sudden gaming is so expensive. You're now saying it's not? Do you even logic?

I see this trend a lot. Kids whine about price and then proceed to invest and force themselves into the most expensive niches they can find: be it high refresh, 4K, or silly demands in terms of peripherals... Even you are saying you'd back down to medium settings in a lightweight game to get decent FPS to suit your monitor choice. It's... well. To each his own, but my mind is blown.

bug said:
The thing I have against IGPs is the die size they take. If it was a tiny thing that offers a fallback in case your real GPU craps out, I'd be fine with it. As it is, IGPs are monsters that take 33-50% of the die space, depending on configuration. And that translates into $$$ (just look at Turing).
Hear, hear
#52
Chloe Price
Manu_PT said:
I've been into this hardware stuff for almost 2 decades and I've never seen high-end PC components being so premium. I have the money to buy those products, but mentally I would never accept paying that kind of cash. Makes no sense.
Almost 2 decades and you don't remember the prices of the fastest Pentium 4s..?
#53
mstenholm
kings said:
It's not entirely true; my six-core i7 5820K cost me 350€ (~$350) in 2014.
W3670 (i7-970 hex) was $999 when I bought mine.
#54
M2B
R-T-B said:
OT but I'll bite... maybe he plays different games on consoles that don't require the same refresh to enjoy?

What he's saying isn't impossible.
It might be the case (somewhat), but there's a MASSIVE difference between 240Hz gaming and 30Hz gaming, no matter what.
#55
GlacierNine
Chloe Price said:
Almost 2 decades and you don't remember the prices of the fastest Pentium 4s..?
As I mentioned earlier, when the Pentium 4 was the thing, the fastest Pentium 4s were essentially the HEDT CPUs of their day, and they were as expensive as they were because Intel didn't have a separate HEDT platform with different chips on it.

If you compare the prices of what most gamers were actually buying back then to the prices of the top end, and then compare the modern equivalents of those, you'll see that "Mainstream" has jumped in price by ~63% in the last 7 years and "HEDT" has jumped by ~88%, while the dollar itself has only inflated by ~12%.
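Those percentages check out as simple percent-increase arithmetic against the launch RRPs cited elsewhere in the thread (i7 920 at $305 vs. a $499 mainstream flagship, and $1059 vs. $1999 for HEDT; the specific SKUs and prices are taken from the discussion, not official data):

```python
def pct_increase(old: float, new: float) -> float:
    """Percent increase from an old price to a new price."""
    return (new - old) / old * 100

mainstream = pct_increase(305, 499)    # ~63.6%
hedt = pct_increase(1059, 1999)        # ~88.8%
inflation = pct_increase(100, 112.42)  # ~12.4% dollar inflation

print(round(mainstream, 1), round(hedt, 1), round(inflation, 2))
```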
#56
bajs11
Chloe Price said:
Almost 2 decades and you don't remember the prices of the fastest Pentium 4s..?
Going back even further, there were the Pentiums and Pentium Pros of the mid-'90s:
https://www.cnet.com/news/intel-releases-new-prices-on-pentiums/
But the thing is, back then the CPU was the main component of a PC. Well, it still is, but nowadays the GPU has taken over quite a lot of the workload the CPU used to perform,
and the prices of computers were still very high in the '90s.
I remember my parents paid approximately 3000 USD for a Dell PC with a 120 MHz Pentium and 8 MB of RAM back in 1996;
the graphics processing unit was integrated onto the motherboard and had 2 MB of memory...
#57
GlacierNine
bajs11 said:
Going back even further, there were the Pentiums and Pentium Pros of the mid-'90s:
https://www.cnet.com/news/intel-releases-new-prices-on-pentiums/
But the thing is, back then the CPU was the main component of a PC. Well, it still is, but nowadays the GPU has taken over quite a lot of the workload the CPU used to perform,
and the prices of computers were still very high in the '90s.
I remember my parents paid approximately 3000 USD for a Dell PC with a 120 MHz Pentium and 8 MB of RAM back in 1996;
the graphics processing unit was integrated onto the motherboard and had 2 MB of memory...
This is also a valid point. There was very little market in the Pentium Pro, Pentium II, Pentium III days for using a GPU outside of gaming. Async Compute didn't exist. Little, if any, software leveraged the GPU. Desktops weren't 3D-accelerated. 3dfx was a big company.

The further back you go in time, not only does pricing become more different, but the PURPOSE of the hardware also changes; in many cases entire market segments simply didn't exist, so the usefulness of a comparison becomes very limited.
#58
Manu_PT
M2B said:
I didn't mean you don't have the hardware; I meant someone who can't even look back to 144Hz would not be able to enjoy games at 30FPS on consoles. But it seems like you're in love with consoles & console games, based on your previous posts.
I don't think you need 240Hz or high framerates when you are playing from the couch with a controller in your hand, a slow-paced game, with Game Motion Plus enabled on the TV + HDR. PC doesn't support Game Motion Plus though; you're stuck with refresh rates or VRR technologies. If you ask me to play a shooter on console? No way, I won't, of course. But a game like God of War? Oh yes I will, zero problem. I don't want to play that kind of game sitting in a chair at a desk anyway.

R-T-B said:
OT but I'll bite... maybe he plays different games on consoles that don't require the same refresh to enjoy?

What he's saying isn't impossible.
Plus my only chance to play single-player masterpieces like Horizon Zero Dawn, God of War, Bloodborne, Uncharted or Spider-Man is on the PS4... not PC. I wouldn't play Black Ops 4 or Battlefield on PS4 for sure.

Chloe Price said:
Almost 2 decades and you don't remember the prices of the fastest Pentium 4s..?
You mean those that were beaten by AMD at 1/3 of the price? Yeah, I do remember. Back in the day, if you knew how to overclock (which wasn't as easy as it is nowadays), you could get the same CPUs as the Extreme versions, 3 or 4 times cheaper, and get almost the same performance in the end. You can't get a 150€ CPU that comes close to a 600€ 9900K nowadays.

My former HD 5870, the fastest GPU in the world at the time, cost me 350€. Nowadays the fastest GPU costs 1200€ in my country. Enough said.
#59
Vayra86
Manu_PT said:
I don't think you need 240Hz or high framerates when you are playing from the couch with a controller in your hand, a slow-paced game, with Game Motion Plus enabled on the TV + HDR. PC doesn't support Game Motion Plus though; you're stuck with refresh rates or VRR technologies. If you ask me to play a shooter on console? No way, I won't, of course. But a game like God of War? Oh yes I will, zero problem. I don't want to play that kind of game sitting in a chair at a desk anyway.



Plus my only chance to play single-player masterpieces like Horizon Zero Dawn, God of War, Bloodborne, Uncharted or Spider-Man is on the PS4... not PC. I wouldn't play Black Ops 4 or Battlefield on PS4 for sure.



You mean those that were beaten by AMD at 1/3 of the price? Yeah, I do remember. Back in the day, if you knew how to overclock (which wasn't as easy as it is nowadays), you could get the same CPUs as the Extreme versions, 3 or 4 times cheaper, and get almost the same performance in the end. You can't get a 150€ CPU that comes close to a 600€ 9900K nowadays.

My former HD 5870, the fastest GPU in the world at the time, cost me 350€. Nowadays the fastest GPU costs 1200€ in my country. Enough said.
Now thát I can only agree on - there is a great tool for each purpose, and content is what matters most. But I think it's good to realize that 'normal' gaming really isn't expensive. It still isn't, and in fact it used to be fár more expensive when the hardware was less capable across the board, which is what most of these 'it used to be better' price comparisons seem to omit. The only way you will make gaming expensive is by thinking you need to have the top 10% performance bracket or it can't be good. That simply isn't true, and the higher your demands, the more costly they will be.
#60
bajs11
GlacierNine said:
This is also a valid point. There was very little market in the Pentium Pro, Pentium II, Pentium III days for using a GPU outside of gaming. Async Compute didn't exist. Little, if any, software leveraged the GPU. Desktops weren't 3D-accelerated. 3dfx was a big company.

The further back you go in time, not only does pricing become more different, but the PURPOSE of the hardware also changes; in many cases entire market segments simply didn't exist, so the usefulness of a comparison becomes very limited.
Well, I believe most games back in the '90s were running at sub-30 fps;
at least that's how it felt when I was playing Duke Nukem 3D and Quake.
The strange thing is there were more players in the CPU market, with Cyrix being one of the big names (I'm sure there were also a few others around besides Intel), and yet the cost of a computer was just outrageously high, at least compared to the prices today.

BUT that doesn't mean Nvidia and Intel are not greedy for charging 1300 bucks for a consumer GPU and 600+ for a refresh-refresh-refresh CPU.
#61
Basard
kings said:
It's not entirely true; my six-core i7 5820K cost me 350€ (~$350) in 2014.
Well, that's not a bad deal... For everybody else it was like $400, and the 5930K was even more.
GlacierNine said:
I love the way you agree completely with his statement but are so determined to defend Intel that you managed to make it combative and stupid.

Intel's shit runs hot these days. Way hotter than their marketing implies or even outright states it actually should. Quite why you're so married to the idea of minimising that is beyond me, but here we are again - another piece of news that makes Intel look terrible, and you're on deck with damage control.

Your point ignores that "back in the day" Intel did not segment their products into "Mainstream" and "HEDT" segments.

The first chips Intel formally denoted as separate from the "Mainstream" platform were the Sandy Bridge-E chips, specifically the 3960X, on LGA2011 with an RRP of $1059. The 2700K was $339. Before that, the high-end chips existed on the same platform as the lower-end chips - the 980X was also $1059, but it sat in exactly the same motherboards as the i7 920, which was $305 and was a highly recommended budget/overclocker's chip for what we now refer to as "mainstream" users.

Since the inception of i7 branding, RRPs for mainstream CPUs have gone from $305 to $499, (63.61% increase over 7 years), and "HEDT" processors have gone from $1059 to $1999 (88.76% increase over 7 years).

In the same 7 year period, the dollar has only inflated from $100 to $112.42, meaning that in all segments, Intel's price increases have outstripped inflation.

On top of that, the end of Moore's law and the stagnation of core counts and clocks mean that while prices have been rising more quickly, performance has been rising more slowly.

In other words, for the last 7 years, Intel has been fucking us all and even Intel fanboys should be glad that AMD are finally stepping up to put pressure on them to be more competitive in all areas.
I agree with all you wrote. The separation of sockets is pretty much needed nowadays though - for people that actually use their computers for work... the extra threads and lanes are a huge help.
Can't wait to see what the next step is though, now that Moore's law is coming to an end.
#62
cucker tarlson
no one mentioned it's rumored to be two pieces of silicon on a double ring bus.
#63
oxidized
Manu_PT said:

Plus my only chance to play SIngle PLayer masterpieces like Horizon ZD, God of War, Bloodborne, Uncharted or Spider-Man is on the PS4... not PC. I wouldn´t play Black Ops 4 or Battlefield on PS4 for sure.
MASTERPIECES. LMAO, now I understand who we're dealing with.
#64
E-curbi
Can't wait to overclock a new Intel mainstream-enthusiast 10-core on a ROG Strix board. lol :roll:

No wait, how about a TUF board 10-cores 20-threads. :roll:

The ROG forum should be fun to watch.

"Waddaya mean it's only a 2-phase VRM?" :D
#65
Slizzo
Object55 said:
Just make it fit Z390 and I won't even be mad that I got a 9700K for the time being and skipped this whole 9900K bubble.
Hopefully most Z390 boards' VRMs will be up to the task of supporting 2 extra cores.
#66
TheLostSwede
I'm just going to throw this one in here, even though it's technically unrelated, as I think some of you are going to get quite excited about it.
Intel is going to launch some new SKUs without an IGP soon-ish. Same socket as now, at least...
I'm afraid that's all I can share right now, but I'm sure more details will leak soon.
#67
phanbuey
Slizzo said:
Hopefully most Z390 boards' VRMs will be up to the task of supporting 2 extra cores.
They are really pushing it at this point.

The last 10-core they had (the 7900X) was popping mobos at launch. The 9900K OC'd has that ability too - this 10-core is going to be a house burner.
#68
bug
TheLostSwede said:
I'm just going to throw this one in here, even though it's technically unrelated, as I think some of you are going to get quite excited about it.
Intel is going to launch some new SKUs without an IGP soon-ish. Same socket as now, at least...
I'm afraid that's all I can share right now, but I'm sure more details will leak soon.
I wouldn't be too surprised. As I was posting earlier, those IGPs seriously eat into the die space. When you need to put more cores onto one die, somebody's bound to put two and two together sooner or later.
#69
M2B
Gigabyte's higher end Z390 motherboards can easily handle 10 core processors.
The VRM won't be an issue if you choose the right board.
#70
Captain_Tom
cucker tarlson said:
no one mentioned it's rumored to be two pieces of silicon on a double ring bus.
That's the only thing they can do to compete with the upcoming 7nm 16-cores from AMD. My guess is they will use 6-core dies with the least efficient core on each die disabled. This should allow them to make a 100W 10-core, and hopefully sell it for $399.

For those worrying about dual ring-bus performance in games, I would guess they can write drivers that tell a game to utilize only 1 of the 2 ring buses. A 5.2GHz 5-core will game just as well as a 5GHz 8-core, and then they can attempt to continue to compete in multithreading with midrange Zen 2.

This socket would also pave the way for the inevitable 14nm++++ 12, 14, and 16-core dual-dies Intel will be forced to launch in 2020....
#71
E-curbi
TheLostSwede said:
I'm just going to throw this one in here, even though it's technically unrelated, as I think some of you are going to get quite excited about it.
Intel is going to launch some new SKUs without an IGP soon-ish. Same socket as now, at least...
I'm afraid that's all I can share right now, but I'm sure more details will leak soon.
Yep, ya got me excited, I'm as giddy as a schoolboy. :p Same Z370/Z390 compatibility? Go on, don't hold back, tell us more. :D
#72
GlacierNine
M2B said:
Gigabyte's higher end Z390 motherboards can easily handle 10 core processors.
The VRM won't be an issue if you choose the right board.
That's not really the issue though, is it? Sure, if you buy an overkill board, it'll be able to handle overkill processors.

The problem is that Z390 is already a high-end chipset, and it should be expected that any Z390 board can run the full range of Intel's lineup without breaking a sweat.

If we were talking about B360 boards, then sure, budget board, super high end CPU, that's a mismatch, plain and simple. But Z390 is a high end chipset. The boards are expensive. One of the main draws to buying that chipset is overclocking.

So why do products even exist with that chipset, that have no hope of maintaining even a mild overclock without making us have to side-eye the VRMs on each one individually?

The simple answer is, that's happening because Intel are wringing the scrawny fucking neck of 14nm too long, too hard, and hoping the consumer doesn't notice how close to the ragged edge these chips are actually running.
#73
TheLostSwede
E-curbi said:
Yep, ya got me excited, I'm as giddy as a schoolboy. :p Same Z370/Z390 compatibility? Go on, don't hold back, tell us more. :D
Yes, same motherboards. Even same model names from what I understand, just another letter added to the suffix.
I'm afraid I can't share much more, as this information hasn't leaked yet so...
I guess it would be on current Intel roadmaps, as it's apparently not that far off.
#74
Manu_PT
oxidized said:
MASTERPIECES. LMAO, now I understand who we're dealing with.
Well, that's always subjective, I agree, but to me they are masterpieces. God of War might be the best single-player game I've seen this generation. Personal opinion. There's a reason those games I mentioned are GOTY contenders year after year... and I forgot to mention RDR2 as well. If you have any PC-exclusive games similar to those, feel free to recommend them and I will give them a try for sure. Go ahead.

And that "who we're dealing with" is just ridiculous. People in 2018 still have problems accepting gamers who play on both PC and consoles? Damn, you got stuck in the past, bud. On the previous page I had to take a picture to show my 240Hz monitor and explain how I love to game on it. That doesn't stop me from enjoying consoles as well. Is there any contract we sign which says that if we play on PC we can't touch consoles? Please... move along.

Maybe if PC kept delivering amazing exclusives pushing boundaries I wouldn't have to buy consoles. Think about it. But when was the last one? Crysis 1 in 2008? And the Nintendo Switch is another target for me; I didn't buy one yet because it's still too expensive for what it offers. But it has amazing games as well and I will for sure play it in the future. Open your mind.
#75
oxidized
Manu_PT said:
Well, that's always subjective, I agree, but to me they are masterpieces. God of War might be the best single-player game I've seen this generation. Personal opinion. There's a reason those games I mentioned are GOTY contenders year after year... and I forgot to mention RDR2 as well. If you have any PC-exclusive games similar to those, feel free to recommend them and I will give them a try for sure. Go ahead.
That's simply called low standards; there hasn't been a game worthy of the "masterpiece" tag for years, only a few that came close. Anyway, the fact that they're GOTY contenders means nothing. I'm not sure you realize the low quality we've reached with videogames; well, you probably don't, after all you're calling mediocre games for casual gamers masterpieces, so yeah.

Manu_PT said:

And that "who we're dealing with" is just ridiculous. People in 2018 still have problems accepting gamers who play on both PC and consoles? Damn, you got stuck in the past, bud. On the previous page I had to take a picture to show my 240Hz monitor and explain how I love to game on it. That doesn't stop me from enjoying consoles as well. Is there any contract we sign which says that if we play on PC we can't touch consoles? Please... move along.

Maybe if PC kept delivering amazing exclusives pushing boundaries I wouldn't have to buy consoles. Think about it. But when was the last one? Crysis 1 in 2008? And the Nintendo Switch is another target for me; I didn't buy one yet because it's still too expensive for what it offers. But it has amazing games as well and I will for sure play it in the future. Open your mind.
And you sound like one too! Good job!