Tuesday, October 8th 2019

The End of a Collaboration: Intel Announces Discontinuation of Kaby Lake-G with AMD Radeon Vega Graphics

The marriage of Intel and AMD IP in the form of the Kaby Lake-G processors was met with both surprise and a sense of bewilderment at what could come next. Well, we now know what came next: Intel hiring several high-level AMD employees in the graphics space and putting together its own motley crew of discrete GPU developers, who should be putting out Intel's next-gen high-performance graphics accelerators sometime next year.

The Kaby Lake-G processors, however, showed promise, pairing Intel's (at the time) IPC dominance with AMD's graphics IP performance and expertise by placing the two components on the same package substrate and connecting them via a PCIe link. A new and succinct Intel notice on the Kaby Lake-G page sets a last-order timeline (January 31, 2020, as the last date for orders, and July 31, 2020, as the date of last shipments) and explains that product market shifts have moved demand from Kaby Lake-G products "to other Intel products". Uptake was always slow on this particular collaboration - most of that, we'd guess, because of the chips' unusual footprint, which required embedded system designs to be created from scratch. And with Intel investing in its own high-performance graphics, it seems clear that there is simply no need to flaunt previous collaborations with other companies in this field. Farewell, Intel-AMD Kaby Lake-G. We barely knew you.
Source: Tom's Hardware

14 Comments on The End of a Collaboration: Intel Announces Discontinuation of Kaby Lake-G with AMD Radeon Vega Graphics

#1
yakk
Intel certainly got what they wanted out of that association.
Posted on Reply
#2
Cheeseball
If the i7-8809G debuted at $800.00, I would've gotten one. But for $1,450? Not worth it.
Posted on Reply
#3
Vayra86
This is just the 5775C all over again. Intel got scared it might actually have made a truly nice product for the first time in years, and quickly axed it.

o_O
Posted on Reply
#4
Franzen4Real
Vayra86 said:
This is just the 5775C all over again. Intel got scared it might actually have made a truly nice product for the first time in years, and quickly axed it.

o_O
I always thought it was a bit hard to figure out who Intel was going after with the 5775C, other than someone wanting a NUC-style build. With i7s typically going into workstations or gaming towers, they are mostly paired with GPUs anyway. From the little I had read on gaming workloads, the huge cache didn't add much performance, if any, when running with discrete cards over something like Devil's Canyon. I'm only guessing that there are more professional software cases where it was able to exploit that L4 with good results. As nice a CPU as it was, it seems it didn't have any particular market to target, which perhaps led to Iris Pro's quick demise.
Posted on Reply
#5
dj-electric
As someone who uses a Hades Canyon NUC HVK machine - this thing is absolutely incredible. But this sort of collab ending makes sense.
This is the bridge that caused all of those RTG people to move to Intel. This is the stamp that proves these kinds of technologies can make something amazing.
This is pretty much the trigger that started Intel's graphics division.
Posted on Reply
#6
Vayra86
dj-electric said:
As someone who uses a Hades Canyon NUC HVK machine - this thing is absolutely incredible. But this sort of collab ending makes sense.
This is the bridge that caused all of those RTG people to move to Intel. This is the stamp that proves these kinds of technologies can make something amazing.
This is pretty much the trigger that started Intel's graphics division.
Heh, that is an interesting insight, and you're probably right too.
Posted on Reply
#7
AnarchoPrimitiv
Being reminded of this just makes me think about how badly I want AMD to come out with a powerhouse APU with at least 8 Zen 2 CPU cores, at least 4 GB of embedded HBM2(E), and enough CUs to game at 1080p ultra @ 60 fps (though 1440p would be amazing). I wouldn't care if it required a TR4 socket and 250 watts, because the ability to completely wash my hands of the dGPU market would be sheer joy.

If AMD could somehow make a 4-6 CPU core APU with enough GPU horsepower to do 1080p ultra at 60 fps (again, 4 GB of embedded HBM2e would be essential) and keep the TDP down to approximately 140 watts or less (i7-9750H at 45 W + RTX 2060 at 90 W = 135 watts), it'd be a game changer.
Posted on Reply
#8
Vayra86
AnarchoPrimitiv said:
If AMD could somehow make a 4-6 CPU core APU with enough GPU horsepower to do 1080p ultra at 60 fps (again, 4 GB of embedded HBM2e would be essential) and keep the TDP down to approximately 140 watts or less (i7-9750H at 45 W + RTX 2060 at 90 W = 135 watts), it'd be a game changer.
The question remains... a game changer for whom? The price would be the real indicator here, because lots of people won't mind a dGPU at all if the combo can be had cheaper. That, and they also keep the ability to upgrade each part separately, which, for gaming, should not be understated, especially with the CPU performance curve being nearly flat.

Coming back to my 5775C remark... people replied: where's the market? They're probably right. I mean, the A10 wasn't killing anything either. AMD would need some sort of campaign of 'Ultrabook'-like proportions to get people accustomed to a new norm, form factor, and function combo.
Posted on Reply
#9
dj-electric
Combining a CPU with a powerful enough GPU + video memory is not easy ("enough" meaning jumping from movies and indie games to full-on 1080p gaming). The Hades Canyon NUC is truly one of a kind in terms of raw power per MCM size. That HBM2 package does wonders.

To "why not a stronger 5775C?" Intel answers with "eDRAM is too expensive", and I'll add that they can't make it larger than 128 MB at the moment without it being gigantic, to the point where you might as well just use HBM2.

For NUC-like solutions, HBM2e has to be the ultimate solution for power-to-size ratio, but it's painfully expensive to use. That's one of the reasons this NUC is so expensive.

You do get the performance of something like a GTX 1650, which is very nice. The question is - what is your target audience? Will you be aiming at budget-oriented gamers? No; that's why you have CPUs like the 9100F and various Ryzen 3s, and GPUs like the Radeon RX 5500.

That narrows it to a very niche audience that wants power in a very small form factor, smaller than even what Mini-ITX cases can offer. I'm not sure AMD or Intel are truly after this market, and I'm pretty sure the Hades Canyon NUC has not been making billions for either of them so far.

The natural curve of APUs will have to keep bending to the available DDR speeds of standard platforms, as slow as they may be (and they are, cripplingly slow).
Posted on Reply
#11
eidairaman1
The Exiled Airman
Vayra86 said:
This is just the 5775C all over again. Intel got scared it might actually have made a truly nice product for the first time in years, and quickly axed it.

o_O
Because it is an AMD iGPU, and AMD is a tough competitor now.
Posted on Reply
#12
BorgOvermind
Proper naming: next-gen high-performance graphics emulators.
Posted on Reply
#13
kapone32
eidairaman1 said:
Because it is an AMD iGPU, and AMD is a tough competitor now.
I agree with that. Especially as we purportedly get closer to Intel's launch of their own GPU(s).
Posted on Reply
#14
king of swag187
eidairaman1 said:
Because it is an AMD iGPU, and AMD is a tough competitor now.
The iGPU is still HD/UHD 630; it's just that the dGPU has moved onto the same package as the CPU.
Posted on Reply