Monday, May 25th 2020

Intel Core i9-10900K der8auer De-Lidding Reveals Accurate Die-Size Measurements

Professional overclocker and extreme-cooling products developer der8auer de-lidded a Core i9-10900K 10-core processor to study its behavior with various custom cooling setups. He discovered that the 10-core "Comet Lake" die measures 206.1 mm². It is 9.2 mm wide, like its "Coffee Lake" 8-core, 6-core, and 4-core predecessors, but 22.4 mm long, with the outer edges of the die coming within a couple of millimeters of the adhesion point of the integrated heatspreader (IHS). Given what we know about how much each pair of cores adds to these dies, we predict that Intel cannot elongate this die to 12 cores without removing the iGPU. der8auer also found that using liquid-metal TIMs and running the processor de-lidded shaves up to 7 °C off temperatures. Find more technical commentary in der8auer's video presentation.
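The reported dimensions check out; here is a minimal sketch of the arithmetic (the ~2.9 mm-per-core-pair figure is an assumption drawn from community die-shot estimates, not a measurement from this article):

```python
# Sanity-check of der8auer's measurements: a 9.2 mm x 22.4 mm die
# works out to almost exactly the reported 206.1 mm².
width_mm = 9.2
length_mm = 22.4
area_mm2 = width_mm * length_mm  # ~206.08 mm²

# Rough extrapolation: community die shots put a pair of Skylake-family
# cores at roughly 2.9 mm of die length (an assumption, not a figure from
# the article). Stretching to 12 cores would add about one more pair:
core_pair_mm = 2.9
length_12c_mm = length_mm + core_pair_mm  # ~25.3 mm, longer than the IHS allows
```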
Source: via Andreas Schilling (Twitter)

55 Comments on Intel Core i9-10900K der8auer De-Lidding Reveals Accurate Die-Size Measurements

#1
Upgrayedd
Really wish Intel kinda took the AMD approach with their lineup.
Leave the iGPU for low-power and mobile chips, while the desktop processors are designed without an iGPU taking up precious die space.
Posted on Reply
#2
INSTG8R
Vanguard Beta Tester
UpgrayeddReally wish Intel kinda took the AMD approach with their lineup.
Leave the iGPU for low-power and mobile chips, while the desktop processors are designed without an iGPU taking up precious die space.
Yep I’ve never once used the Intel iGPU just like I’ve never taken the stock cooler out of the box either...
Posted on Reply
#3
ppn
The GPU measures 5.4 mm; a 2-core slice measures 2.9 mm.
So yes, about 4 cores' worth of space goes to the GPU.

You could make a 16-core by adding 6 cores to one side, removing the GPU, and re-centering the die. But imagine cooling that 5.0 GHz, 500-watt CPU.
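Taking the commenter's measurements at face value, a quick sketch of that arithmetic (all lengths here are the commenter's estimates, not verified figures):

```python
# How many cores fit in the iGPU's slice of the die, per the numbers above?
igpu_len_mm = 5.4       # iGPU section of the die length
core_pair_mm = 2.9      # length of a 2-core slice
cores_in_igpu = igpu_len_mm / core_pair_mm * 2  # ~3.7, i.e. roughly 4 cores

# Hypothetical 16-core: start from the 10-core die, drop the iGPU (-5.4 mm)
# and add three more 2-core slices (+8.7 mm) to the 22.4 mm length:
die_len_mm = 22.4
die_16c_mm = die_len_mm - igpu_len_mm + 3 * core_pair_mm  # ~25.7 mm
```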
Posted on Reply
#4
Chrispy_
If intel stopped putting graphics in everything they sold, they'd have more room for cores.
The 8700K is about 40% IGP by area.
Posted on Reply
#5
E-curbi
Reveals 9th gen was a thermal efficiency joke Intel played on us.

May have to wait until 2022 and 7nm Meteor Lake for the amazing thermal efficiency of the 8th gen to return once again. Oh well. :ohwell:

One 13900K 13th gen 8-core please.
Posted on Reply
#6
Tatty_One
Senior Moder@tor
Well, there is already the option of an Intel Core i9-10900KF without the iGPU, so perhaps 2 more cores could be doable.
Posted on Reply
#7
Chrispy_
Tatty_OneWell, there is already the option of an Intel Core i9-10900KF without the iGPU, so perhaps 2 more cores could be doable.
The KF versions aren't different silicon - the die still has all that wasted space taken up by a broken IGP - they're just cheaper because the GPUs are defective.
Posted on Reply
#8
Upgrayedd
E-curbiReveals 9th gen was a thermal efficiency joke Intel played on us.

May have to wait until 2022 and 7nm Meteor Lake for the amazing thermal efficiency of the 8th gen to return once again. Oh well. :ohwell:

One 13900K 13th gen 8-core please.
I really hope their 13900K has more than 8 cores, considering the 10900K has 10 cores.
Posted on Reply
#9
TheinsanegamerN
UpgrayeddReally wish Intel kinda took the AMD approach with their lineup.
Leave the iGPU for low-power and mobile chips, while the desktop processors are designed without an iGPU taking up precious die space.
They do, they're called HDET: CPU dies made without an iGPU for those who want more CPU power instead. You are free to buy them.

A certain number of enthusiasts have been whining about that IGP for a decade now. It's been proven more than once that the lack of said iGPU won't magically allow higher clock speeds, lower power usage, or lower temperatures. It changes nothing. From Intel's standpoint, that 10-core die used on the desktop is the same die used across their entire desktop portfolio, from 15 watt (if a 15-watt 10-core is sold) up to their 125-watt K series. If the 10-core is used on laptops, it's the same die, from the same factory. The 8-cores are the same die with 2 cores lasered off, the 6-core the same die with 4 cores lasered, etc. None of these different chips have their own dies; they are all the same die, with cheaper parts binned from faulty larger parts, a practice as old as the computer industry itself.

Creating a special chunk of silicon without this iGPU attached for a niche market (enthusiasts) of a niche market (DIY) of a small market (desktop processors) in Intel's processor portfolio would be a very expensive proposal, as these chips would only be useful in satiating that small number of loud enthusiasts (the vast majority of the enthusiast market for the 9900K/10900K are fine with the harmless iGPU); anywhere else they are used would simply eat into the market for existing chips and gain Intel nothing but the headache of managing another very similar production line. Given that said enthusiasts would expect lower prices or more cores for the same price, Intel stands to gain nothing monetarily.

AMD manages this because their desktop processors are made with the same dies used to make their server processors. It is economically advantageous for them to do this; it would be economically poor for Intel to do the same with their monolithic design. AMD has their own separate SKUs for these purposes, while Intel can take care of both markets with one set of SKUs.
Posted on Reply
#10
Upgrayedd
TheinsanegamerNThey do, they're called HDET: CPU dies made without an iGPU for those who want more CPU power instead. You are free to buy them and stop whining.

If you don't want to pay the high price for a niche product (iGPU-less CPU) for a niche market (DIY), then buy the mainstream 10-core CPU with the iGPU, or pay $5 less for the KF version and stop whining. Or buy AMD, whose CPUs use dies that don't cut the mustard for server chips. You have options. Just because a tiny % of the DIY market wants HDET processors relabeled as mainstream processors for less money doesn't mean Intel will dedicate a production line to them.
Lol whatever man. I know why Intel does it and I don't blame them.
Intel runs very well against AMD's chiplet approach while still maintaining an iGPU on the die, and they probably could've maintained the lead on the competition if the die were completely devoid of it. But with the fab delays it's probably out of the question for business efficiency currently.

And it's HEDT
Posted on Reply
#11
TheinsanegamerN
UpgrayeddLol whatever man. I know why Intel does it and I don't blame them.
Intel runs very well against AMD's chiplet approach while still maintaining an iGPU on the die, and they probably could've maintained the lead on the competition if the die were completely devoid of it. But with the fab delays it's probably out of the question for business efficiency currently.

And it's HEDT
So if, by your own admission, Intel can maintain their lead with and without the iGPU, why would they create a separate SKU without it? What would the benefit be, both to Intel and to enthusiasts?

This is why I labeled this as "whining" initially. There is no proposition for why Intel should make separate SKUs, just that they should do it.
Posted on Reply
#12
hat
Enthusiast
It makes sense to have SKUs without integrated graphics, even if they fully intend to include it on every chip. The manufacturing process is not perfect, and if they can sell chips without integrated graphics because that part of the die is bad, they will. It's common industry-wide.
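The binning economics described above can be sketched with a textbook Poisson yield model; the defect density and iGPU area share below are illustrative assumptions, not Intel data:

```python
import math

# Toy defect model: the chance a region of silicon is defect-free falls
# off exponentially with its area (classic Poisson yield, Y = e^(-D*A)).
die_area_cm2 = 206.1 / 100   # 10-core die, from the article
igpu_fraction = 0.25         # assumed share of die area taken by the iGPU
defects_per_cm2 = 0.2        # assumed defect density

# Probability a die is defect-free everywhere (fully working SKU):
yield_full = math.exp(-defects_per_cm2 * die_area_cm2)

# Probability the CPU portion is clean (iGPU may or may not be):
yield_cpu_ok = math.exp(-defects_per_cm2 * die_area_cm2 * (1 - igpu_fraction))

# Dies with a working CPU but a dead iGPU: scrap without an F SKU,
# sellable with one. This gap is the harvesting incentive.
recoverable = yield_cpu_ok - yield_full
```

Under these assumed numbers, a few percent of all dies land in that recoverable bucket, which is why selling them as graphics-disabled parts is standard practice industry-wide.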
Posted on Reply
#13
Upgrayedd
TheinsanegamerNSo if, by your own admission, Intel can maintain their lead with and without the iGPU, why would they create a separate SKU without it? What would the benefit be, both to Intel and to enthusiasts?

This is why I labeled this as "whining" initially. There is no proposition for why Intel should make separate SKUs, just that they should do it.
To widen the lead and increase market share? More market, more profit? Take market from the competition? Justify pricing? But I doubt, with all the fab problems, they have the ability to do it right now.

TPU's very own i9-10900K review mentions next to nothing about the iGPU other than that it is there, and it's listed as a pro and not a con. Labeled as "The World's Fastest Gaming Processor", yet its iGPU is near pointless for gaming, lol. It's great for business efficiency though, which is the reason for its existence.
Posted on Reply
#14
Midland Dog
the moron was running 1.5v system agent and 1.408 vccio and then the flog complains about temperatures lol
Posted on Reply
#15
R-T-B
Midland Dogthe moron was running 1.5v system agent and 1.408 vccio and then the flog complains about temperatures lol
I mean he has points of reference to compare to. He's an extreme OCer by trade.
Posted on Reply
#16
lZKoce
Nice short article. I have an iGPU on my i5 but have never used it. There is this option in the MB BIOS: iGPU Multi-Monitor, but it is disabled. I can guess what it does, but haven't tried it. I suppose I could tie my second monitor to it so that it doesn't take resources from the main GPU, but how that would affect CPU load I don't know.
Posted on Reply
#17
INSTG8R
Vanguard Beta Tester
lZKoceNice short article. I have an iGPU on my i5 but have never used it. There is this option in the MB BIOS: iGPU Multi-Monitor, but it is disabled. I can guess what it does, but haven't tried it. I suppose I could tie my second monitor to it so that it doesn't take resources from the main GPU, but how that would affect CPU load I don't know.
Yeah, that's its only real "party trick" practical usage. It also has something called Quick Sync that could be useful to specific users; apparently it's pretty good at what it does.
en.wikipedia.org/wiki/Intel_Quick_Sync_Video
Posted on Reply
#18
londiste
Why the iGPU hate?
The iGPU opens up more market niches for Intel. Work computers that need no GPU beyond internet stuff and putting a picture on the monitor are the obvious example.
I love being able to live without a GPU in my computer for a while. Or being able to update the BIOS without one (unlike trying to get my AM4 APU machine up and running).

On the other side, there is no power penalty when it is not in use.
Die area might be significant, but I'm pretty sure it does not directly affect cost or yield (especially now with F-models).

Given the flak Intel gets about power consumption, they don't really care about fitting more cores onto this platform, and for good reason.
Posted on Reply
#19
INSTG8R
Vanguard Beta Tester
londisteWhy the iGPU hate?
The iGPU opens up more market niches for Intel. Work computers that need no GPU beyond internet stuff and putting a picture on the monitor are the obvious example.
I love being able to live without a GPU in my computer for a while. Or being able to update the BIOS without one (unlike trying to get my AM4 APU machine up and running).

On the other side, there is no power penalty when it is not in use.
Die area might be significant, but I'm pretty sure it does not directly affect cost or yield (especially now with F-models).

Given the flak Intel gets about power consumption, they don't really care about fitting more cores onto this platform, and for good reason.
Meh, it's like buying a car with another engine in the back that you'll just never use ¯\_(ツ)_/¯ maybe in an emergency (bad BIOS flash). I mean, it has its uses, as in the examples you've given, but Grandma isn't surfing the web with a K processor either. Granted, I've read this is an improved version of Iris Plus, so it's not "terrible" and is powerful enough for some actual light gaming.
Posted on Reply
#20
londiste
INSTG8Rbut Grandma isn't surfing the web with a K processor either.
Exactly, and this is a large part of why the iGPU is a thing. Even though Intel has several different dies, they do not have that many of them. For Grandma, office, or OEM computers to have an iGPU, the dies have to have it, and when it is already there, why not enable it for K-models as well? Not to mention the lower end of the range, which has even more use for an iGPU because those parts go into mobile applications.
Posted on Reply
#21
SetsunaFZero
Chrispy_If intel stopped putting graphics in everything they sold, they'd have more room for cores.
The 8700K is about 40% IGP by area.
Removing the iGPU would greatly reduce production cost; that's ~40% wasted wafer space.
Posted on Reply
#22
londiste
SetsunaFZeroRemoving the iGPU would greatly reduce production cost, ~40% wasted wafer space.
Are you sure that is Intel's problem? The 10900K die with 10 cores and the iGPU is 206.1 mm². That is a small enough and cheap enough die to produce, especially at 14 nm. For comparison, AMD's 12/14 nm Zeppelin dies are a little larger at 212.97 mm².
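For a rough sense of scale, here is the standard dies-per-wafer approximation applied to the two die sizes quoted above (gross die candidates on a 300 mm wafer, ignoring defect yield and scribe lines):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic gross-dies-per-wafer approximation:
    wafer area / die area, minus an edge-loss correction term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

comet_lake_10c = dies_per_wafer(206.1)   # 10-core "Comet Lake" with iGPU
zeppelin = dies_per_wafer(212.97)        # AMD "Zeppelin"
```

Both dies land within a few percent of each other in candidates per wafer, which supports the point that the 206.1 mm² die is not unusually expensive silicon for its class.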
Posted on Reply
#23
qcmadness
londisteAre you sure that is Intel's problem? The 10900K die with 10 cores and the iGPU is 206.1 mm². That is a small enough and cheap enough die to produce, especially at 14 nm. For comparison, AMD's 12/14 nm Zeppelin dies are a little larger at 212.97 mm².
Have you considered that Zeppelin has 4 IFOP and 2 IFIS links for server use?
Also, that out of 32 available PCIe lanes, only 24 (x16 for graphics, x4 for the I/O hub, and x4 for an SSD) are exposed on Socket AM4?

en.wikichip.org/wiki/File:amd_zen_octa-core_die_shot_(annotated).png
Posted on Reply
#24
ARF
TheinsanegamerNThey do, they're called HDET: CPU dies made without an iGPU for those who want more CPU power instead. You are free to buy them.

A certain number of enthusiasts have been whining about that IGP for a decade now. It's been proven more than once that the lack of said iGPU won't magically allow higher clock speeds, lower power usage, or lower temperatures. It changes nothing. From Intel's standpoint, that 10-core die used on the desktop is the same die used across their entire desktop portfolio, from 15 watt (if a 15-watt 10-core is sold) up to their 125-watt K series. If the 10-core is used on laptops, it's the same die, from the same factory. The 8-cores are the same die with 2 cores lasered off, the 6-core the same die with 4 cores lasered, etc. None of these different chips have their own dies; they are all the same die, with cheaper parts binned from faulty larger parts, a practice as old as the computer industry itself.

Creating a special chunk of silicon without this iGPU attached for a niche market (enthusiasts) of a niche market (DIY) of a small market (desktop processors) in Intel's processor portfolio would be a very expensive proposal, as these chips would only be useful in satiating that small number of loud enthusiasts (the vast majority of the enthusiast market for the 9900K/10900K are fine with the harmless iGPU); anywhere else they are used would simply eat into the market for existing chips and gain Intel nothing but the headache of managing another very similar production line. Given that said enthusiasts would expect lower prices or more cores for the same price, Intel stands to gain nothing monetarily.

AMD manages this because their desktop processors are made with the same dies used to make their server processors. It is economically advantageous for them to do this; it would be economically poor for Intel to do the same with their monolithic design. AMD has their own separate SKUs for these purposes, while Intel can take care of both markets with one set of SKUs.
This is not correct for one main reason: the systems (OEM, notebooks included) these CPUs go into come with discrete graphics cards, either Radeon or GeForce.
So yes, the proper action for the corp is to strip the iGPU altogether, because no one, literally no one, needs it or wants it, and it consumes a very large part of the die which could otherwise be used for more proper CPU cores.
Posted on Reply
#25
ThrashZone
Hi,
No DP or HDMI on the ASUS Formula; I guess that's what the Thunderbolt card is for, but I really haven't looked into it, and of course that's an additional cost for graphics.

I can see why onboard graphics is a good option on some boards, seeing that GPU prices are steep and people often build a computer in steps rather than all at once; they can get by if the board has DP or HDMI.
Posted on Reply