
Rumored Intel Kaby Lake-G Series: Modular, Multi-Die, HBM 2, AMD Graphics IP?

Raevenlord

News Editor
Rumors have been making the rounds about an as-yet unannounced product from Intel: a Kaby Lake-G series, which would mark Intel's return to multi-chip modules in a single package. The company has played with such a design before with its Clarkdale family of processors - which married a 32 nm CPU with a 45 nm GPU and memory controller in a single package. Kaby Lake-G will reportedly do away with that simple, low-data-rate implementation and communication between the two parts, instead carrying itself on the shoulders of Intel's EMIB (Embedded Multi-die Interconnect Bridge), which the company claims is a "more elegant interconnect for a more civilized age."

Instead of using the large silicon interposer typically found in other 2.5D approaches (as AMD did when marrying its Fiji dies with HBM memory), EMIB uses a very small bridge die with multiple routing layers, which provides a good balance of cost and data paths for the interconnected, heterogeneous architecture. This saves on the costly TSVs (Through-Silicon Vias) that dot the interposer approach.
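
As a very rough illustration of the cost argument, here is a toy comparison of the silicon area each approach consumes. All die and bridge dimensions below are invented for illustration, not figures from the leak:

```python
# Toy model: silicon spent on a full 2.5D interposer vs. small EMIB
# bridge dies. All dimensions below are illustrative assumptions.

cpu_die = (15.0, 10.0)  # mm (w, h), assumed
gpu_die = (12.0, 12.0)  # mm, assumed
hbm_die = (8.0, 11.0)   # mm, assumed

def area(dims):
    w, h = dims
    return w * h

# An interposer must span every die, plus some routing margin,
# and needs TSVs drilled through its whole area.
interposer_area = 1.2 * (area(cpu_die) + area(gpu_die) + area(hbm_die))

# EMIB only needs a small bridge die embedded under each
# die-to-die edge, with no TSVs at all.
bridge_area = 2 * (2.0 * 8.0)  # two ~2 mm x 8 mm bridges, assumed

print(f"interposer silicon:  ~{interposer_area:.0f} mm^2")  # ~458 mm^2
print(f"EMIB bridge silicon: ~{bridge_area:.0f} mm^2")      # ~32 mm^2
```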

For now, rumors peg Kaby Lake-G as a series of special BGA processors based on Kaby Lake, with an additional discrete GPU on the package. The TDP of these processors (at 65 W and 100 W) is well above Kaby Lake-H's known 45 W, which raises the question: what exactly is under the hood? An AMD graphics chip could probably account for that extra thermal headroom: a discrete-level GPU integrated on the package, with EMIB's routing layers (the very technology Intel developed for this modular approach to chip design) handling the data exchange between GPU and processor. This is also where HBM 2 memory integration would naturally come in - a way to keep a considerable amount of high-speed memory inside the package, accessible by the silicon slices that need it. Nothing in the leaked information seems to point towards this HBM 2 integration, however.
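
Back-of-the-envelope, the TDP gap alone hints at how much power budget such a GPU (and HBM 2) could claim. Only the 65 W/100 W and 45 W figures come from the leak; the subtraction below is an illustration that assumes the CPU portion keeps its full Kaby Lake-H budget:

```python
# Rough power-budget split for the rumored parts. The 65 W and 100 W
# package TDPs and the 45 W Kaby Lake-H figure are from the story;
# assuming the CPU keeps its 45 W, the remainder is GPU + HBM 2 headroom.

kbl_h_cpu_tdp = 45        # W, known Kaby Lake-H budget
rumored_tdps = [65, 100]  # W, rumored Kaby Lake-G package TDPs

for package_tdp in rumored_tdps:
    leftover = package_tdp - kbl_h_cpu_tdp
    print(f"{package_tdp} W package -> ~{leftover} W left for GPU + HBM 2")
```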

Also helping the "AMD Radeon IP integration" story (besides TDP) is that the two chips in the Kaby Lake-G series will reportedly feature a package size of 58.5 x 31 mm - bigger than a desktop Kaby Lake-S (37.5 x 37.5 mm) and the Kaby Lake-H series chips (42 x 28 mm). The extra space would accommodate the increased footprint of a GPU die - though for now, leaked information again points only to Intel's own GT2 graphics solution, even as Benchlife seems to put much stock in the AMD side of the equation.
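
For scale, multiplying out the quoted dimensions shows just how much more area the rumored package offers:

```python
# Package areas computed from the dimensions quoted above (mm x mm).
packages = {
    "Kaby Lake-G (rumored)": (58.5, 31.0),
    "Kaby Lake-S (desktop)": (37.5, 37.5),
    "Kaby Lake-H (mobile)":  (42.0, 28.0),
}

for name, (w, h) in packages.items():
    print(f"{name}: {w * h:.1f} mm^2")

# Kaby Lake-G (rumored): 1813.5 mm^2
# Kaby Lake-S (desktop): 1406.2 mm^2
# Kaby Lake-H (mobile):  1176.0 mm^2
```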

The heterogeneous, modular approach to CPU development would benefit Intel in several ways: it would allow the company to integrate external graphics solutions that could be produced in entirely different factories and then fitted onto the package; it would let Intel reserve space on its 10 nm dies for actual cores, increasing yields from the 10 nm process; and it would allow Intel to recycle older processes with new logic inside the CPU package, letting the company distribute production load across different processes and extract more value from its not-so-state-of-the-art nodes.
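
The yield point is easy to see with the standard Poisson die-yield model; the die areas and defect density below are assumptions purely for illustration:

```python
import math

# Poisson yield model: Y = exp(-A * D), with die area A (cm^2) and
# defect density D (defects/cm^2). D = 0.2 is an assumed value.
def die_yield(area_cm2, defect_density=0.2):
    return math.exp(-area_cm2 * defect_density)

monolithic_die = 3.0  # cm^2: cores + big GPU on one die (assumed)
cpu_only_die   = 1.5  # cm^2: GPU moved off-die via EMIB (assumed)

print(f"monolithic yield: {die_yield(monolithic_die):.0%}")  # ~55%
print(f"CPU-only yield:   {die_yield(cpu_only_die):.0%}")    # ~74%
```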

If Intel advances with this modular approach, we stand to see some really interesting designs, with multiple manufacturing processes working in tandem inside a single package, giving Intel more flexibility in developing and implementing its fabrication processes. What do you think about this take on CPU development?

View at TechPowerUp Main Site
 
.....So was there some backroom trade where AMD got a decent CPU with "hyper threading" and Intel got decent graphics cores?
 
.....So was there some backroom trade where AMD got a decent CPU with "hyper threading" and Intel got decent graphics cores?

I think Intel is still married to Nvidia GPUs.
I don't think Intel is gonna want to feed AMD, with Ryzen attacking Intel on all fronts by the end of the year.
$264 million annually is what Intel is paying Nvidia for GPU licensing.
Putting that extra money into AMD's pocket would definitely hurt Intel in the long run.
 
Quite a stretch to conclude an AMD GPU just from the mention of a 100 W TDP part...
Could also be some six-core thing for mobile workstations. Or a cheaper way to get more L4 cache.
 
Iris Pro w/ HBM maybe?
 
I think Intel is still married to Nvidia GPUs.
I don't think Intel is gonna want to feed AMD, with Ryzen attacking Intel on all fronts by the end of the year.
$264 million annually is what Intel is paying Nvidia for GPU licensing.
Putting that extra money into AMD's pocket would definitely hurt Intel in the long run.

WARNING: TINFOIL HATS REQUIRED!!!!!!!!!!!
...True, but we both know no one could afford for AMD to go out of business, for obvious reasons, without a suitable replacement. So Intel does this jogging-in-place thing for a couple of years, throws them a lifeline or two, and we have today's situation. Now Intel drops a ridiculous new line they've been sitting on, for a stupid price, because there is now competition and some excitement in the CPU world. Badda bing... badda boom...
 
I think Intel is still married to Nvidia GPUs.
I don't think Intel is gonna want to feed AMD, with Ryzen attacking Intel on all fronts by the end of the year.
$264 million annually is what Intel is paying Nvidia for GPU licensing.
Putting that extra money into AMD's pocket would definitely hurt Intel in the long run.

That deal with Nvidia expired last year, and they've signed with AMD now.
 
I think Intel is still married to Nvidia GPUs.
I don't think Intel is gonna want to feed AMD, with Ryzen attacking Intel on all fronts by the end of the year.
$264 million annually is what Intel is paying Nvidia for GPU licensing.
Putting that extra money into AMD's pocket would definitely hurt Intel in the long run.

Intel hates Nvidia.
Nvidia hates Intel.

How hard is it for people to learn it?....
 
That deal with nvidia expired last year and they signed with AMD now.

"deal", more as a lawsuit that resulted in forced license of IP which Intel didn't want.
 
Intel hates Nvidia.
Nvidia hates Intel.

How hard is it for people to learn it?....
Companies in an oligopoly do not hate each other.

How hard is it for people to learn it?....
 
Intel hates Nvidia.
Nvidia hates Intel.

How hard is it for people to learn it?....

Yeah, hard to believe but Intel & AMD are closer buddies than Nvidia & either one of the other two.

That should say something about Nvidia, lol...
 
Basically, this is Intel telling us that because they can't keep up their 'tick-tock-toe' schedule, they'll now ghetto-mod random parts together to produce something that does better every year.

And they even get to hide their super-expensive but still shitty IGPs while doing it. :D
 
Yeah, hard to believe but Intel & AMD are closer buddies than Nvidia & either one of the other two.

That should say something about Nvidia, lol...

AMD is direct competition to Intel (CPU to CPU). NVIDIA isn't (CPU to GPU). Talking discrete type, not the integrated crap. I'd say it's easier to keep good relations with non-competitor than with direct competitor...
 
Basically, this is Intel telling us that because they can't keep up their 'tick-tock-toe' schedule, they'll now ghetto-mod random parts together to produce something that does better every year.

Well, it's not a terrible way to hide non-innovation.
Along with maximizing revenue from older nodes, the design lends itself to using a variety of graphics chips (possibly from both Intel and AMD) to arrive at a line of products.

I'm not sure how well this would play for AMD, though... Just when they're trying to make a comeback in the notebook market, would they feed their biggest rival their own graphics IP, which has been their most valuable asset in that market? Maybe they will only share the IP for their inferior parts? As in Polaris, for example, but not Vega yet?
 
Well, it's not a terrible way to hide non-innovation.
Along with maximizing revenue from older nodes, the design lends itself to using a variety of graphics chips (possibly from both Intel and AMD) to arrive at a line of products.

I'm not sure how well this would play for AMD, though... Just when they're trying to make a comeback in the notebook market, would they feed their biggest rival their own graphics IP, which has been their most valuable asset in that market? Maybe they will only share the IP for their inferior parts? As in Polaris, for example, but not Vega yet?

I think that seeing AMD graphics in Intel CPUs is going to be a major stretch at best, but still, licensed technology is a really good cash cow: you sell the information once and collect cash annually. One way or another, they make money off it. And besides, if you can't beat them, join them, right? AMD can still make something new that trumps everything before it, which puts them back in control, while it becomes highly unlikely that the competitor will put R&D into making something new of their own. Pretty good strategy.
 
AMD is direct competition to Intel (CPU to CPU). NVIDIA isn't (CPU to GPU). Talking discrete type, not the integrated crap. I'd say it's easier to keep good relations with non-competitor than with direct competitor...
Nvidia isn't a direct competitor on the desktop. But they are in the automotive segment, which is rising fast. If anything, Intel is more concerned about Nvidia than it is about AMD.
But again, in this market no one wants the other to go belly up, lest they end up a monopoly and have to deal with all sorts of additional regulations.
 

Nvidia isn't a direct competitor on the desktop. But they are in the automotive segment, which is rising fast. If anything, Intel is more concerned about Nvidia than it is about AMD.
But again, in this market no one wants the other to go belly up, lest they end up a monopoly and have to deal with all sorts of additional regulations.

Regulations? This is Amurrica! That's bypassed by simple donations to campaigns and overseas bank accounts (see every single large corp getting away with murder...sometimes literally).
 
Intel can't even compete in the automotive industry. You need a GPU-based solution to address it (huge amounts of data to process in parallel), and they have none. Even the Larrabee they had went belly up. And even though Intel has tons of resources, they just don't have the know-how to compete with AMD or NVIDIA in this segment. Otherwise they'd already be in this business.
 
Regulations? This is Amurrica! That's bypassed by simple donations to campaigns and overseas bank accounts (see every single large corp getting away with murder...sometimes literally).
It's still less messy to have a competitor around. Even a minor one will do.
Intel can't even compete in the automotive industry. You need a GPU-based solution to address it (huge amounts of data to process in parallel), and they have none. Even the Larrabee they had went belly up. And even though Intel has tons of resources, they just don't have the know-how to compete with AMD or NVIDIA in this segment. Otherwise they'd already be in this business.
I think you're confusing automotive with number crunching. I'm talking about car infotainment systems.
 
Intel can't even compete in the automotive industry. You need a GPU-based solution to address it (huge amounts of data to process in parallel), and they have none. Even the Larrabee they had went belly up. And even though Intel has tons of resources, they just don't have the know-how to compete with AMD or NVIDIA in this segment. Otherwise they'd already be in this business.

Intel got more than enough processing power when they bought Altera and their FPGAs.
Just a matter of time till they announce some self-driving platform thing.
 
I think Intel is still married to Nvidia GPUs.
I don't think Intel is gonna want to feed AMD, with Ryzen attacking Intel on all fronts by the end of the year.
$264 million annually is what Intel is paying Nvidia for GPU licensing.
Putting that extra money into AMD's pocket would definitely hurt Intel in the long run.

Intel's bigger threat is Nvidia, albeit in different segments. Also, Nvidia's GPU IP has gotten Intel hardly anywhere in the GPU space. And the last thing they want to do is keep paying their biggest and better rival in AI, automotive, and other professional fields. Intel actually needs AMD, and vice versa. They have an ironic symbiotic relationship, especially considering the world's code base is built on AMD's iteration of x86-64. They need each other in a way Intel never needed Nvidia.
 
Embedded HBM and an integrated GPU is the way it should be; in a perfect world, the CPU wouldn't have any bandwidth or latency constraints. For the typical PC user this is going to be the ideal model.

As with anything hardware-related, I'll hold off until I can see independent real-world testing, but I have high hopes for this product.
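
For what it's worth, the bandwidth case for on-package HBM 2 is easy to sketch. The bus widths and per-pin rates below are the standard published figures; the particular configuration compared is my assumption:

```python
# Peak theoretical bandwidth: one HBM 2 stack vs. dual-channel DDR4-2400.
hbm2_one_stack = 1024 * 2.0 / 8    # 1024-bit bus * 2.0 Gb/s per pin -> GB/s
ddr4_dual_chan = 2 * 64 * 2.4 / 8  # 2 channels * 64-bit * 2.4 Gb/s per pin

print(f"HBM 2 (one stack): {hbm2_one_stack:.0f} GB/s")  # 256 GB/s
print(f"DDR4-2400 (2ch):   {ddr4_dual_chan:.1f} GB/s")  # 38.4 GB/s
```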
 
I put absolutely no stock in this being an AMD GPU, as I can't see AMD giving up its one advantage in the APU space (at this current time).

The AMD deal was only for licensing the same sort of IP that Nvidia had forced Intel to license from it. Any increase in TDP or die space is likely coming from a larger Iris GPU.
 
I can just see this technology making for killer consoles and that's a good thing.
 