
Intel Core i7-8809G "Kaby Lake + Vega" MCM Specs Leaked Again, Indicate Dual IGP

btarunr

Editor & Senior Moderator
Intel revealed specifications of its upcoming "Kaby Lake + AMD Vega" multi-chip module, the Core i7-8809G, on its website. A number of these specs were already sniffed out by Futuremark SystemInfo, but the website sheds light on a key feature - dual integrated graphics. The specs sheet confirms that the chip combines a 4-core/8-thread "Kaby Lake" CPU die with an AMD Radeon RX Vega M GH graphics die. The CPU is clocked at 3.10 GHz, and SystemInfo (from the older story) confirmed that its Turbo Boost frequency is up to 3.90 GHz. The L3 cache is maxed out at 8 MB. The reference memory clock is set at dual-channel DDR4-2400. What's more, the CPU component features an unlocked base-clock multiplier.

Things get interesting with the way Intel describes the integrated graphics solution. It mentions both the star attraction, the AMD Radeon RX Vega M GH, and the Intel HD Graphics 630 located on the "Kaby Lake" CPU die. This indicates that Intel could deploy a mixed multi-GPU solution that's transparent to software, balancing graphics loads between the HD 630 and RX Vega M GH depending on the load and thermal conditions. Speaking of which, Intel has rated the TDP of the MCM at 100W, with a rider stating "target package TDP," since there's no scientifically correct way of measuring TDP on a multi-chip module. Intel could build performance-segment NUCs with this chip, in addition to selling it to mini-PC manufacturers.



Specifications of the RX Vega M GH continue to elude us. All we know is that it has its own 4 GB HBM2 memory stack over a 1024-bit wide memory interface, ticking at 800 MHz (204.8 GB/s memory bandwidth), and a GPU engine clock of 1.19 GHz. Even if this chip offers performance in the neighborhood of the discrete Radeon RX 570 4 GB, it should make for a killer entry-level gaming solution. Motherboards based on it could quickly capture the gaming iCafe, entry-gaming PC, and performance AIO markets.
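For readers who want to check the math, the 204.8 GB/s figure follows directly from the bus width and memory clock. A minimal sketch, assuming standard double-data-rate signalling for both memory types:

```cpp
// Back-of-the-envelope bandwidth check for the figures quoted above.
#include <cstdio>

int main() {
    // HBM2: 1024-bit bus at 800 MHz, two transfers per clock (DDR).
    double hbm2 = (1024.0 / 8.0) * 800e6 * 2.0;   // bytes per second
    // System memory: dual-channel DDR4-2400, 64 bits per channel, 2400 MT/s.
    double ddr4 = 2.0 * (64.0 / 8.0) * 2400e6;    // bytes per second
    printf("HBM2: %.1f GB/s\n", hbm2 / 1e9);      // prints 204.8 GB/s
    printf("DDR4: %.1f GB/s\n", ddr4 / 1e9);      // prints 38.4 GB/s
    return 0;
}
```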

 
Intel's and AMD's APU + discrete GPU combos never worked that well, so I find it hard to believe that Intel has developed the drivers to let its iGPU run in tandem with the Vega iGPU. I'd say it's more likely they will turn off one of the two solutions when the PC is in a low-power mode.
 
If it gets anywhere near the 150W+ RX 570, count me as "impressed af". I would have to gamble it lands somewhere around the RX 570 minus 30-35%.
 
Intel's and AMD's APU + discrete GPU combos never worked that well, so I find it hard to believe that Intel has developed the drivers to let its iGPU run in tandem with the Vega iGPU. I'd say it's more likely they will turn off one of the two solutions when the PC is in a low-power mode.
Define "run in tandem"?
These GPUs will not work together on rendering the video output - that would most likely be a mess.
The system will switch between them depending on the rendering load (like it would on a laptop).
And it's no surprise, since AMD GPUs are way too power hungry in idle. This would not be acceptable in a NUC / ultrabook.

The more interesting scenario is when this MCM uses both GPUs at the same time: Intel IGP for video and Vega for software acceleration - I'm rather excited. It's the prince that was promised :-)
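To make the switching idea concrete: on newer Windows 10 builds, DXGI exposes an adapter-preference API that lets software pick a GPU by power profile, which is roughly the mechanism a hybrid setup like this would lean on. A minimal sketch (link against dxgi.lib; the assumption that both GPUs show up as separate DXGI adapters on this MCM is mine, not confirmed):

```cpp
// Sketch: asking DXGI for the lowest-power vs. highest-performance adapter,
// the kind of switching a hybrid HD 630 + Vega M GH setup would rely on.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

// Returns the adapter the OS prefers for the given power profile.
ComPtr<IDXGIAdapter4> PickAdapter(DXGI_GPU_PREFERENCE pref) {
    ComPtr<IDXGIFactory6> factory;
    ComPtr<IDXGIAdapter4> adapter;
    if (SUCCEEDED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory))))
        factory->EnumAdapterByGpuPreference(0, pref, IID_PPV_ARGS(&adapter));
    return adapter;
}

int main() {
    const DXGI_GPU_PREFERENCE prefs[] = {
        DXGI_GPU_PREFERENCE_MINIMUM_POWER,      // desktop, video playback
        DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE    // games
    };
    for (DXGI_GPU_PREFERENCE pref : prefs) {
        ComPtr<IDXGIAdapter4> adapter = PickAdapter(pref);
        if (!adapter.Get()) continue;
        DXGI_ADAPTER_DESC3 desc{};
        adapter->GetDesc3(&desc);
        wprintf(L"%ls\n", desc.Description);
    }
    return 0;
}
```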
 
DX12 does allow running different GPUs together, although I could never understand how that can work when the GPUs are so vastly different...
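For context, DX12's "explicit multi-adapter" doesn't hide the differences: the application enumerates each GPU separately and decides itself how to split the work, which is why vastly different GPUs can still cooperate on suitable workloads. A minimal enumeration sketch, assuming the Windows SDK (link against dxgi.lib):

```cpp
// Sketch: listing every GPU DXGI exposes - step one of DX12 explicit
// multi-adapter. On this MCM, the HD 630 and the Vega M GH would appear
// as two independent adapters.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory2(0, IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        wprintf(L"Adapter %u: %ls (%llu MB VRAM)\n", i, desc.Description,
                static_cast<unsigned long long>(desc.DedicatedVideoMemory >> 20));
        // A DX12 title would call D3D12CreateDevice() on each adapter it
        // wants to use; the runtime does no automatic load balancing.
    }
    return 0;
}
```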
 
RX 570 territory? Yeah, right.
Most of the leaks/speculation so far put it in the range of 500 shaders, which puts it around the RX 540/550. The RX 550 has a TDP of 50W, so that matches as well.
 
I'm kinda disappointed AMD let this happen, honestly. I was really looking forward to building an ITX AMD Ryzen APU system this year, but I'm not so sure now, if Intel brings out anything in that form factor.
 
Most of the leaks/speculation so far put it in the range of 500 shaders.

I haven't seen anything that suggests that.

There have been some photos of this thing around, and judging just from the size of the die, it should have at least 1536 shaders.
 
Define "run in tandem"?
These GPUs will not work together on rendering the video output - that would most likely be a mess.
The system will switch between them depending on the rendering load (like it would on a laptop).
And it's no surprise, since AMD GPUs are way too power hungry in idle. This would not be acceptable in a NUC / ultrabook.

The more interesting scenario is when this MCM uses both GPUs at the same time: Intel IGP for video and Vega for software acceleration - I'm rather excited. It's the prince that was promised :)
Continually chatting rubbish. AMD Vega uses little power at idle, as does Polaris, so please stop misinforming. Its video playback power use is higher than Intel's, but these are unlikely to be used in tandem, as one GPU's power draw is less than two, and high-power uses like gaming won't receive the dev support to get two GPUs working in DX12.
 
Continually chatting rubbish. AMD Vega uses little power at idle, as does Polaris, so please stop misinforming. Its video playback power use is higher than Intel's, but these are unlikely to be used in tandem, as one GPU's power draw is less than two, and high-power uses like gaming won't receive the dev support to get two GPUs working in DX12.
According to TPU test results, Vega used 14W at idle. You don't agree?
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_56/29.html
 
Color me skeptical. I'm looking forward to reviews, IF Intel actually markets this. I always use discrete GPUs, so I have no interest anyway, but I know many others who do, and I'm always open to new ways of doing things.
 
Hopefully $100-$200 with decent performance.
 
So how much? Half? 7W? Still way too much.

Of course. How is this even relevant?
Did it not adequately address your concerns regarding Vega's idle power draw? Equivalent equipment and all, same-ish idle power?

In general, they'll have to be on the ball to hit their performance targets within the allowed power budget; I think we agree on that.
 
Did it not adequately address your concerns regarding Vega's idle power draw? Equivalent equipment and all, same-ish idle power?
I don't have any concerns regarding its power draw at idle. I just said it's too high for the task at hand - that's why a dual-GPU config makes sense.
Man... you just went berserk about it! I didn't say Vega draws more than Pascal. I didn't mention NVIDIA at all. You're the one who started comparing to "equivalent equipment". :-D

A few days ago someone told you that you cope very badly with any criticism of AMD and that it makes discussion difficult, but these few posts were on another level! :roll:
 
I don't have any concerns regarding its power draw at idle. I just said it's too high for the task at hand - that's why a dual-GPU config makes sense.
Man... you just went berserk about it! I didn't say Vega draws more than Pascal. I didn't mention NVIDIA at all. You're the one who started comparing to "equivalent equipment". :-D

A few days ago someone told you that you cope very badly with any criticism of AMD and that it makes discussion difficult, but these few posts were on another level! :roll:
Not at all, I'm used to being labelled; however, I'm actually still unbiased and reasonable. You're reading too much sentiment into a few sentences.

I just see the trends you see in others, just differently: always spouting negative stuff about particular parties is all a bit dull to me.

You see, I don't waste much time bad-mouthing what I don't like, but I'm happy to discuss what I perceive as good in technology and for its future.
And I'm not keen on Google's great flock, if I'm honest.
 
Anyone remember Radeon Xpress 200? This was an ATI motherboard chipset with integrated graphics for Intel processors (Pentium D mostly). Shortly after release, AMD bought ATI and stopped all further development, including drivers. It was the worst chipset I ever had to deal with.

Let bygones be bygones?
 