Sunday, April 3rd 2016

Rumored Intel Kaby Lake-G Series: Modular, Multi-Die, HBM 2, AMD Graphics IP?

Rumors have been making the rounds about an as-yet unannounced product from Intel: a Kaby Lake-G series that would mark the company's return to a multi-chip module in a single package. Intel has played with such a design before with its Clarkdale family of processors, which married a 32 nm CPU with a 45 nm GPU and memory controller in a single package. Kaby Lake-G would reportedly do away with Clarkdale's simple, low-data-rate interconnect between the two parts, instead riding on the shoulders of Intel's EMIB (Embedded Multi-die Interconnect Bridge), which the company claims is a "more elegant interconnect for a more civilized age."

Instead of the large silicon interposer typically found in other 2.5D approaches (as AMD used to marry its Fiji dies with HBM memory), EMIB uses a very small bridge die with multiple routing layers, which provides dense data paths between the interconnected, heterogeneous dies at a fraction of the cost. This saves on the costly TSVs (Through-Silicon Vias) that dot the interposer approach.
For now, rumors peg these Kaby Lake-G chips as special BGA processors based on Kaby Lake, with an additional discrete GPU on the package. The TDP of these processors (at 65 W and 100 W) is well above Kaby Lake-H's known 45 W, which raises the question: what exactly is under the hood? An AMD graphics chip could account for that extra TDP headroom: a discrete-level GPU integrated on the package, with EMIB's routing layers handling the data exchange between GPU and processor, would fit the modular approach to chip design for which Intel developed EMIB in the first place. This is also where HBM 2 memory integration would naturally come in, as a way to keep a considerable amount of high-speed memory inside the package, accessible by the silicon slices that need it. Nothing in the leaked information points toward HBM 2 integration, however.
Also lending credence to the "AMD Radeon IP integration" story (besides TDP) is package size: the two chips in the Kaby Lake-G series will reportedly measure 58.5 x 31 mm, bigger than a desktop Kaby Lake-S (37.5 x 37.5 mm) and the Kaby Lake-H series chips (42 x 28 mm). The extra space would accommodate the increased footprint of the GPU die, though for now the leaked information points only to Intel's own GT2 graphics solution, even as BenchLife puts much stock in the AMD side of the equation.
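For a sense of scale, the leaked dimensions can be turned into a quick back-of-the-envelope area comparison (a sketch using only the package sizes quoted above; nothing here is confirmed silicon data):

```python
# Rough footprint comparison of the leaked/known package sizes (mm).
packages = {
    "Kaby Lake-G (rumored)": (58.5, 31.0),
    "Kaby Lake-S (desktop)": (37.5, 37.5),
    "Kaby Lake-H (mobile)":  (42.0, 28.0),
}

for name, (w, h) in packages.items():
    print(f"{name}: {w} x {h} mm = {w * h:.1f} mm^2")

# How much extra substrate area Kaby Lake-G would offer over Kaby Lake-H,
# i.e. the room a GPU die and (speculatively) HBM 2 stacks could occupy.
g_area = 58.5 * 31.0
h_area = 42.0 * 28.0
print(f"Extra area over Kaby Lake-H: {g_area - h_area:.1f} mm^2 "
      f"({(g_area / h_area - 1) * 100:.0f}% larger)")
```

That works out to roughly 637 mm² of additional package area over Kaby Lake-H, which is ample room for a discrete-class GPU die, if the rumor holds.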
A heterogeneous, modular approach to CPU development would benefit Intel in several ways: it could integrate external graphics solutions produced in other factories entirely and then fitted onto the package; it could reserve die space on its 10 nm dies for actual cores, increasing yields from the 10 nm process; and it could recycle older processes for other logic inside the CPU package, letting the company distribute production load across different process nodes and extract more value from its not-so-state-of-the-art fabs.

If Intel advances with this modular approach, we stand to see some really interesting designs, with multiple manufacturing processes working in tandem inside a single package, giving Intel more flexibility in developing and implementing its fabrication processes. What do you think about this take on CPU development? Sources: BenchLife, Computerbase.de

32 Comments on Rumored Intel Kaby Lake-G Series: Modular, Multi-Die, HBM 2, AMD Graphics IP?

#1
ensabrenoir
.....So was there some backroom trade where AMD got a decent CPU with "hyper threading" and Intel got decent graphics cores?
#2
r9
ensabrenoir said:
.....So was there some backroom trade where AMD got a decent CPU with "hyper threading" and Intel got decent graphics cores?
I think Intel is still married to Nvidia GPUs.
I don't think Intel gonna want to feed AMD with Ryzen attacking Intel on all fronts by the end of the year.
$264 Millions annually for the GPU licencing is what Intel is paying Nvidia.
Putting those extra money into AMD pocket would definitely hurt Intel in the long run.
#3
iO
Quite a stretch to conclude a AMD GPU just from the mentioning of a 100W TDP part...
Could also be some 6 core thing for mobile workstations. Or a cheaper way for more L4 cache.
#4
bug
Iris Pro w/ HBM maybe?
#5
ensabrenoir
r9 said:
I think Intel is still married to Nvidia GPUs.
I don't think Intel gonna want to feed AMD with Ryzen attacking Intel on all fronts by the end of the year.
$264 Millions annually for the GPU licencing is what Intel is paying Nvidia.
Putting those extra money into AMD pocket would definitely hurt Intel in the long run.
WARNING: TINFOIL HATS REQUIRED!!!!!!!!!!!
...true but we both know no one could afford for Amd to go out of business for obvious reason with out a suitable replacement. So intel does this jogging in place thing for a couple of years, throws them a life line or two and we have todays current situation. Now intel drops a ridiculous new line they've been sitting on for a stupid price because there is now competition and some excitement in the cpu world. Badda Bing.... Badda Boom...
#6
simlariver
r9 said:
I think Intel is still married to Nvidia GPUs.
I don't think Intel gonna want to feed AMD with Ryzen attacking Intel on all fronts by the end of the year.
$264 Millions annually for the GPU licencing is what Intel is paying Nvidia.
Putting those extra money into AMD pocket would definitely hurt Intel in the long run.
That deal with nvidia expired last year and they signed with AMD now.
#7
Imsochobo
r9 said:
I think Intel is still married to Nvidia GPUs.
I don't think Intel gonna want to feed AMD with Ryzen attacking Intel on all fronts by the end of the year.
$264 Millions annually for the GPU licencing is what Intel is paying Nvidia.
Putting those extra money into AMD pocket would definitely hurt Intel in the long run.
Intel hates Nvidia.
Nvidia hates Intel.

How hard is it for people to learn it?....
#8
Imsochobo
simlariver said:
That deal with nvidia expired last year and they signed with AMD now.
"deal", more as a lawsuit that resulted in forced license of IP which Intel didn't want.
#9
bug
Imsochobo said:
Intel hates Nvidia.
Nvidia hates Intel.

How hard is it for people to learn it?....
Companies in an oligopoly do not hate each other.

How hard is it for people to learn it?....
#10
theGryphon
Imsochobo said:
Intel hates Nvidia.
Nvidia hates Intel.

How hard is it for people to learn it?....
Yeah, hard to believe but Intel & AMD are closer buddies than Nvidia & either one of the other two.

That should say something about Nvidia, lol...
#11
Vayra86
Basically, this is Intel telling us that because they can't keep up their 'tick-tock-toe' schedule, they'll now ghetto-mod random parts together to produce something that does better every year.

And they even get to hide their super expensive but still shitty IGP's while doing it. :D
#12
RejZoR
theGryphon said:
Yeah, hard to believe but Intel & AMD are closer buddies than Nvidia & either one of the other two.

That should say something about Nvidia, lol...
AMD is direct competition to Intel (CPU to CPU). NVIDIA isn't (CPU to GPU). Talking discrete type, not the integrated crap. I'd say it's easier to keep good relations with non-competitor than with direct competitor...
#13
theGryphon
Vayra86 said:
Basically, this is Intel telling us that because they can't keep up their 'tick-tock-toe' schedule, they'll now ghetto-mod random parts together to produce something that does better every year.
Well, it's not a terrible way to hide non-innovation.
Along with maximizing revenues from older nodes, the design lends itself to use a variety of graphics chips (possibly from both Intel and AMD) to arrive at a line of products.

I'm not sure how well this would play for AMD though... Just when they will be trying to make a comeback to notebook market, feeding their biggest rival with their own graphics IP, which has been their most valuable asset in that market? Maybe they will only share the IP for their inferior parts? As in for example Polaris, but not Vega yet?
#14
Vayra86
theGryphon said:
Well, it's not a terrible way to hide non-innovation.
Along with maximizing revenues from older nodes, the design lends itself to use a variety of graphics chips (possibly from both Intel and AMD) to arrive at a line of products.

I'm not sure how well this would play for AMD though... Just when they will be trying to make a comeback to notebook market, feeding their biggest rival with their own graphics IP, which has been their most valuable asset in that market? Maybe they will only share the IP for their inferior parts? As in for example Polaris, but not Vega yet?
I think that seeing AMD graphics in Intel CPUs is going to be a major stretch at best, but still, licensed technology is a really good cash cow, you just sell information once, get cash annually. One way or another, they make money off it and also: when you can't beat them, join them right? In addition, AMD can still make something new that trumps everything before it, which puts them back in control, while it becomes highly unlikely that the competitor will put R&D into making something new. Pretty good strategy.
#15
bug
RejZoR said:
AMD is direct competition to Intel (CPU to CPU). NVIDIA isn't (CPU to GPU). Talking discrete type, not the integrated crap. I'd say it's easier to keep good relations with non-competitor than with direct competitor...
Nvidia isn't a direct competitor on the desktop. But they are in the automotive segment, which is rising fast. If anything Intel is more concerned about Nvidia than it is about AMD.
But again, in this market no one wants the other to go belly up, lest they end up a monopoly and have to deal will all sorts of additional regulations.
#16
TheGuruStud
bug said:
Nvidia isn't a direct competitor on the desktop. But they are in the automotive segment, which is rising fast. If anything Intel is more concerned about Nvidia than it is about AMD.
But again, in this market no one wants the other to go belly up, lest they end up a monopoly and have to deal will all sorts of additional regulations.
Regulations? This is Amurrica! That's bypassed by simple donations to campaigns and overseas bank accounts (see every single large corp getting away with murder...sometimes literally).
#17
RejZoR
Intel can't even compete in the automotive industry. You need a GPU based solution to address that (huge amounts of data to process in parallel) and they have none. Even the Larrabee they had went belly up. And even though Intel has tons of resources, they just don't have the know-how to compete with AMD or NVIDIA in this segment. Otherwise they'd already be in this business.
#18
bug
TheGuruStud said:
Regulations? This is Amurrica! That's bypassed by simple donations to campaigns and overseas bank accounts (see every single large corp getting away with murder...sometimes literally).
It's still less messy to have a competitor around. Even a minor one will do.
RejZoR said:
Intel can't even compete in the automotive industry. You need a GPU based solution to address that (huge amounts of data to process in parallel) and they have none. Even the Larrabee they had went belly up. And even though Intel has tons of resources, they just don't have the know-how to compete with AMD or NVIDIA in this segment. Otherwise they'd already be in this business.
I think you're confusing automotive with number crunching. I'm talking about car infotainment systems.
#19
iO
RejZoR said:
Intel can't even compete in the automotive industry. You need a GPU based solution to address that (huge amounts of data to process in parallel) and they have none. Even the Larrabee they had went belly up. And even though Intel has tons of resources, they just don't have the know-how to compete with AMD or NVIDIA in this segment. Otherwise they'd already be in this business.
Intel got more than enough processing power when they bought Altera and their FPGAs.
Just a matter of time till they announce some self driving platform thing.
#20
thesmokingman
r9 said:
I think Intel is still married to Nvidia GPUs.
I don't think Intel gonna want to feed AMD with Ryzen attacking Intel on all fronts by the end of the year.
$264 Millions annually for the GPU licencing is what Intel is paying Nvidia.
Putting those extra money into AMD pocket would definitely hurt Intel in the long run.
Intel's bigger threat is Nvidia albeit in different segments. Also, Nvidia GPU IP has gotten them hardly anywhere in the GPU space. And the last thing they want to do is keep paying their biggest and better rival in AI, auto, and other professional fields. Intel actually needs AMD, and vice versa. They have an ironic symbiotic relationship, especially considering the worlds code base is based on AMD's iteration of X64. They need each other unlike Intel needing Nvidia.
#22
Blueberries
Embedded HBM and an integrated GPU is the way it should be, in a perfect world the CPU wouldn't have any bandwidth or latency constraints. For the typical PC user this is going to be the ideal model.

As with anything hardware related, I'll hold off until I can see independent real-world studies, but I have high hopes for this product.
#23
Camm
I put absolutely no stock in this being an AMD gpu as I can't see AMD giving up its one advantage in the APU space (at this current time).

All the AMD deal was for licensing IP that Nvidia forced it to license from itself. Any increase in tdp or die space is likely coming from a larger Iris GPU.
#24
qubit
Overclocked quantum bit
I can just see this technology making for killer consoles and that's a good thing.
#25
renz496
simlariver said:
That deal with nvidia expired last year and they signed with AMD now.
any official proof that intel already signed the deal with AMD?