
Intel iGPU+dGPU Multi-Adapter Tech Shows Promise Thanks to Its Realistic Goals

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
38,863 (8.42/day)
Location
Hyderabad, India
Processor AMD Ryzen 7 2700X
Motherboard ASUS ROG Strix B450-E Gaming
Cooling AMD Wraith Prism
Memory 2x 16GB Corsair Vengeance LPX DDR4-3000
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) Creative Sound Blaster Recon3D PCIe
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Microsoft Sidewinder X4
Software Windows 10 Pro
Intel is revisiting the concept of asymmetric multi-GPU introduced with DirectX 12. The company posted an elaborate technical slide-deck it originally planned to present to game developers at the now-cancelled GDC 2020. The technology shows promise because the company isn't insulting developers' intelligence by proposing that the otherwise-dormant iGPU shoulder the game's entire rendering pipeline for a single-digit-percentage performance boost. Rather, it has come up with innovative augmentations to the rendering path such that only certain lightweight compute aspects of the game's rendering are passed on to the iGPU's execution units, giving it a more meaningful contribution to overall performance. To that effect, Intel is working on an SDK that can be integrated with existing game engines.

Microsoft DirectX 12 introduced the holy grail of multi-GPU technology with its Explicit Multi-Adapter specification, which allows game engines to send rendering traffic to any combination or make of GPUs that support the API, to achieve a performance uplift over a single GPU. It was met with a lukewarm reception from AMD and NVIDIA, and far too few DirectX 12 games actually support it. Intel proposes a specialization of the explicit multi-adapter approach, in which the iGPU's execution units process various low-bandwidth elements during both the rendering and post-processing stages, such as occlusion culling, AI, game physics, etc. Intel's method leverages cross-adapter shared resources sitting in system memory (main memory), and D3D12 asynchronous compute, which creates separate processing queues for rendering and compute.
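For context, here is a minimal sketch of the D3D12/DXGI plumbing this specialization builds on: enumerate both adapters, create a device on each, and open a dedicated compute queue on the integrated GPU for the lightweight async work. This is an illustrative example written against the public DirectX 12 API, not code from Intel's SDK; the helper name and the assumption that the discrete GPU enumerates first by performance preference are ours.

```cpp
// Illustrative sketch (not Intel's SDK): set up explicit multi-adapter by creating
// a D3D12 device on both the discrete and the integrated GPU, plus an async compute
// queue on the iGPU for lightweight workloads.
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

HRESULT CreateMultiAdapterDevices(ComPtr<ID3D12Device>& dgpuDevice,
                                  ComPtr<ID3D12Device>& igpuDevice,
                                  ComPtr<ID3D12CommandQueue>& igpuComputeQueue)
{
    ComPtr<IDXGIFactory6> factory;
    HRESULT hr = CreateDXGIFactory2(0, IID_PPV_ARGS(&factory));
    if (FAILED(hr)) return hr;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapterByGpuPreference(i, DXGI_GPU_PREFERENCE_HIGH_PERFORMANCE,
                                             IID_PPV_ARGS(&adapter)) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            continue;

        // Assumption: the first capable adapter by performance preference is the dGPU,
        // the next one is the iGPU.
        if (!dgpuDevice)      dgpuDevice = device;
        else if (!igpuDevice) { igpuDevice = device; break; }
    }
    if (!dgpuDevice || !igpuDevice) return E_FAIL;

    // Separate compute queue on the iGPU, so its work runs asynchronously
    // alongside the dGPU's render queue.
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    return igpuDevice->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&igpuComputeQueue));
}
```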



Intel developed straightforward sample code for game engine developers to integrate the new tech, covering the creation of cross-adapter shared heaps and resources. The presentation also includes examples of how to leverage async compute and get the lightweight rendering and compute paths to work with as little latency as possible. Intel also developed code for cross-adapter synchronization, called Intel Command Queue Throttle. This piece of code ensures consistent performance and low frame-times when the load is unevenly balanced between the iGPU and dGPU.
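The cross-adapter resource and heap creation Intel's samples cover maps onto standard D3D12 calls. Below is a hedged sketch written from the public D3D12 multi-adapter documentation rather than Intel's actual sample code: a shareable heap is created on the dGPU device, opened on the iGPU device through an NT handle, and a cross-adapter shared fence is set up so the two queues can synchronize around it. Function and parameter names here are illustrative.

```cpp
// Illustrative sketch of D3D12 cross-adapter sharing (not Intel's sample code):
// a shared heap in system memory visible to both devices, plus a shared fence
// for cross-adapter synchronization.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

HRESULT CreateCrossAdapterSharing(ID3D12Device* dgpu, ID3D12Device* igpu, UINT64 heapSize,
                                  ComPtr<ID3D12Heap>& heapOnDgpu, ComPtr<ID3D12Heap>& heapOnIgpu,
                                  ComPtr<ID3D12Fence>& fenceOnDgpu, ComPtr<ID3D12Fence>& fenceOnIgpu)
{
    // Cross-adapter heaps live in system memory and must be created as shareable.
    D3D12_HEAP_DESC heapDesc = {};
    heapDesc.SizeInBytes = heapSize;
    heapDesc.Properties.Type = D3D12_HEAP_TYPE_DEFAULT;
    heapDesc.Flags = D3D12_HEAP_FLAG_SHARED | D3D12_HEAP_FLAG_SHARED_CROSS_ADAPTER;
    HRESULT hr = dgpu->CreateHeap(&heapDesc, IID_PPV_ARGS(&heapOnDgpu));
    if (FAILED(hr)) return hr;

    // Share the heap via an NT handle and open it on the iGPU device.
    HANDLE heapHandle = nullptr;
    hr = dgpu->CreateSharedHandle(heapOnDgpu.Get(), nullptr, GENERIC_ALL, nullptr, &heapHandle);
    if (FAILED(hr)) return hr;
    hr = igpu->OpenSharedHandle(heapHandle, IID_PPV_ARGS(&heapOnIgpu));
    CloseHandle(heapHandle);
    if (FAILED(hr)) return hr;

    // A shared fence lets one adapter's queue wait on values the other adapter signals.
    hr = dgpu->CreateFence(0, D3D12_FENCE_FLAG_SHARED | D3D12_FENCE_FLAG_SHARED_CROSS_ADAPTER,
                           IID_PPV_ARGS(&fenceOnDgpu));
    if (FAILED(hr)) return hr;
    HANDLE fenceHandle = nullptr;
    hr = dgpu->CreateSharedHandle(fenceOnDgpu.Get(), nullptr, GENERIC_ALL, nullptr, &fenceHandle);
    if (FAILED(hr)) return hr;
    hr = igpu->OpenSharedHandle(fenceHandle, IID_PPV_ARGS(&fenceOnIgpu));
    CloseHandle(fenceHandle);
    return hr;
}
```

Resources placed in such a heap are created with the D3D12_RESOURCE_FLAG_ALLOW_CROSS_ADAPTER flag, which is what lets both devices address the same allocation.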



All current Intel Graphics drivers include support for the extension, and Intel has started distributing headers for it through its developer support channels. Intel notes that its method can be used for various kinds of async compute tasks such as shadows, AI, mesh deformation, and physics. Load on the system's PCIe and memory bandwidth is minimized because the iGPU isn't made to handle heavyweight tasks such as texture filtering.
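Per frame, the hand-off boils down to the iGPU queue producing its lightweight result (shadows, AI, mesh deformation, physics) into the shared resource and the dGPU's render queue waiting on the shared fence before consuming it. The sketch below shows that pattern with plain queue Signal/Wait calls; it is a generic illustration of the synchronization pattern, not Intel's Command Queue Throttle.

```cpp
// Illustrative per-frame submission (generic pattern, not Intel's Command Queue Throttle):
// the iGPU computes into the shared resource and signals, the dGPU waits on the GPU
// timeline before rendering with the result, so the CPU never blocks in between.
#include <d3d12.h>

void SubmitFrame(UINT64 frameIndex,
                 ID3D12CommandQueue* igpuComputeQueue, ID3D12CommandList* igpuComputeList,
                 ID3D12CommandQueue* dgpuRenderQueue,  ID3D12CommandList* dgpuRenderList,
                 ID3D12Fence* fenceOnIgpu, ID3D12Fence* fenceOnDgpu)
{
    // iGPU: run the lightweight async compute work, then signal its copy of the shared fence.
    ID3D12CommandList* computeLists[] = { igpuComputeList };
    igpuComputeQueue->ExecuteCommandLists(1, computeLists);
    igpuComputeQueue->Signal(fenceOnIgpu, frameIndex);

    // dGPU: wait on the shared fence (on the GPU timeline, not the CPU) until the
    // iGPU's result for this frame is ready, then render the frame that consumes it.
    dgpuRenderQueue->Wait(fenceOnDgpu, frameIndex);
    ID3D12CommandList* renderLists[] = { dgpuRenderList };
    dgpuRenderQueue->ExecuteCommandLists(1, renderLists);
}
```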



Intel iGPUs are approaching the 1 TFLOP/s compute-power barrier with Gen11 and with the upcoming Xe-based iGPU debuting in "Tiger Lake." That's a lot of compute power to leave untapped. Intel's tech could prove particularly useful in notebooks with entry-level through mid-range discrete GPUs, as all Intel mobile processors pack iGPUs and implement dynamic switching between the iGPU and dGPU.



The complete Intel presentation follows.


 
Joined
Nov 15, 2016
Messages
360 (0.28/day)
System Name Sillicon Nightmares
Processor Intel i7 9700KF 5ghz (no avx offset), 4.7ghz ring 1.374 vcore
Motherboard Asus Z390 Strix F
Cooling DEEPCOOL Gamer Storm CAPTAIN 360
Memory 2x8GB G.Skill Trident Z RGB 3733 16,18,18,36 2T
Video Card(s) ASUS GTX 1060 Strix 6GB OC, Core: 2202/2215, Vcore: 1.075v, Mem: 4909mhz (Sillicon Lottery Jackpot)
Storage Samsung 840 EVO 1TB SSD, WD Blue 1TB, Seagate 3TB, DoA PoS Nvme that makes me cry
Display(s) BenQ XL2430 1080p 144HZ + (2) Samsung SyncMaster 913v 1280x1024 75HZ + A Shitty TV For Movies
Case Deepcool Genome ROG Edition
Audio Device(s) Bunta Sniff Speakers From The Tip Edition
Power Supply Corsair AX860i/Cable Mod Cables
Mouse Logitech G602
Keyboard Shitty Dell Office Keyboard
Software Windows 10 x64
Benchmark Scores ~`13500 Firestrike (1st for my hardware, *De Ja Fool can s\/ck me softly)
i asked for this years ago, anisotropic filtering and post processing can be easily done on some PoS gigaflop level igp
 
Joined
Jul 5, 2013
Messages
9,455 (3.75/day)
System Name GPD-Q9
Processor Rockchip RK-3288 1.8ghz quad core
Motherboard GPD Q9_V6_150528
Cooling Passive
Memory 2GB DDR3
Video Card(s) Mali T764
Storage 16GB Samsung NAND
Display(s) IPS 1024x600
It should be noted that this is not exclusive to DX12 and can be done in OpenGL and Vulkan as well.
 
Joined
Sep 6, 2013
Messages
1,558 (0.63/day)
Location
Athens, Greece
System Name 3 systems: Gaming / Internet / HTPC
Processor Intel i5 4460 / Thuban 1455T(Unlocked 645) @ 3.7GHz @ 1.30V / A6 7400K
Motherboard ASRock Z97 Extreme6 / Gigabyte GA-990XA-UD3 / ASUS FM2+
Cooling Modified AM2 Asetek MicroChill / Noctua U12S / CoolerMaster TX2
Memory 16GB Kingston KHX1866C10D3 / 16GB Adata 2133MHz / 8GB Kingston 2400MHz (DDR3)
Video Card(s) XFX RX 580 8GB + GT 620 (PhysX)/ GT 710 / A6 7400K iGPU
Storage Intel NVMe 500GB, Samsung NVMe 250GB + more / Kingston 240GB + more / Samsung SSD 120GB
Display(s) Samsung LE32D550 32'' TV(2 systems connected) / LG 42''
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Sharkoon 650W / Chieftec 560W
Mouse CoolerMaster / Rapoo / Logitech
Keyboard CoolerMaster / Microsoft / Logitech
Software Windows
AMD could have pushed this years ago, but probably the advantages weren't enough to make up for lost sales of discrete GPUs. Nvidia could also push this, but they were winning in the GPU arena, so why give you the option to postpone an immediate upgrade? With Intel being the underdog in this market, it was probably the easiest bet to expect this kind of tech to be utilized by them first. And in this case, if they create an SDK that favors Intel CPUs and GPUs, it will be AMD's fault, not Intel's marketing/bribe money or whatever we say (I say) from time to time (and probably in many cases being correct).
 
Joined
Apr 12, 2013
Messages
3,197 (1.23/day)
Oh cool another set of slides by Intel, I wonder if this won't die the death of their last dozen or so innovations :laugh:
 
Joined
Nov 24, 2017
Messages
727 (0.79/day)
Location
Asia
Processor Intel Core i5 4590
Motherboard Gigabyte Z97x Gaming 3
Cooling Intel Stock Cooler
Memory 8GB(2x4GB) DDR3-800MHz [1600MT/s]
Video Card(s) XFX RX 560 4GB
Storage Transcend SSD370S 128GB; Toshiba DT01ACA100 1TB
Display(s) Samsung S20D300 20" 768p TN
Case Delux DLC-MV888
Audio Device(s) Realtek ALC1150
Power Supply Corsair VS450
Mouse A4Tech N-70FX
Software Windows 10 Pro
Benchmark Scores BaseMark GPU : 250 Point
AMD could have pushed this years ago, but probably the advantages weren't enough to make up for lost sales of discrete GPUs. Nvidia could also push this, but they were winning in the GPU arena, so why give you the option to postpone an immediate upgrade? With Intel being the underdog in this market, it was probably the easiest bet to expect this kind of tech to be utilized by them first. And in this case, if they create an SDK that favors Intel CPUs and GPUs, it will be AMD's fault, not Intel's marketing/bribe money or whatever we say (I say) from time to time (and probably in many cases being correct).
AMD could have pushed it, but software developers would have had to implement it. How many applications properly access AMD's encoder/decoder hardware on Windows?
Besides MS's Edge, not a single browser uses AMD's hardware decoder. But sadly, it will be replaced by the inefficient Chrome engine.
 

ARF

Joined
Jan 28, 2020
Messages
883 (7.18/day)
System Name ARF System 1 (retro build) | Portable 1 (energy efficient and portable)
Processor AMD Athlon 64 4400+ X2 | AMD Ryzen 5 2500U
Motherboard ASRock 939A790GMH 790GX SATA2 |
Cooling Arctic Freezer 13 | Dual-fan, dual heat-pipe Acer inbuilt
Memory 4 x 1GB DDR-400 | 2 x 8GB DDR4-2400
Video Card(s) Radeon ASUS EAH4670/DI/512MD3 | Radeon RX 560X 4G & Vega 8
Storage ADATA XPG SX900 128GB SATA3@SATA2 SSD | Western Digital Blue 3D NAND M.2 SSD 500GB
Display(s) | LG 24UD58-B & Panasonic TX-50CX670E
Case Cooler Master HAF 912 Plus | 15-inch notebook chassis
Audio Device(s) Superlux HD681 EVO
Mouse Genius NetScroll 100X | Genius NetScroll 100X
Keyboard | Logitech Wave
Software Windows 7U SP1| Windows 10 Pro 2004
Benchmark Scores CPU-Z 17.01.64 - ST: 387.3, MT: 2060.6 CPU-Z 15.01.64 - ST: 2044, MT: 7766
AMD could have pushed this years ago, but probably the advantages weren't enough to make up for lost sales of discrete GPUs. Nvidia could also push this, but they were winning in the GPU arena, so why give you the option to postpone an immediate upgrade? With Intel being the underdog in this market, it was probably the easiest bet to expect this kind of tech to be utilized by them first. And in this case, if they create an SDK that favors Intel CPUs and GPUs, it will be AMD's fault, not Intel's marketing/bribe money or whatever we say (I say) from time to time (and probably in many cases being correct).
AMD has had Radeon Dual Graphics and Hybrid technologies for a decade.



The ATI Hybrid Graphics technology was announced on January 23, 2008 with Radeon HD 2400 series and Radeon HD 3400 series video cards supporting hybrid graphics functionality. Originally, ATI announced this feature would only be supported in Vista, but in August 2008 they included support in their Windows XP drivers as well.[1] The architecture has been patented by ATI.[2]
 
Joined
Jan 8, 2017
Messages
5,028 (4.06/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Scythe Katana 4 - 3x 120mm case fans
Memory 16GB - Corsair Vengeance LPX
Video Card(s) OEM Dell GTX 1080
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Zalman R1
Power Supply 500W
AMD could have pushed this years ago, but probably the advantages weren't enough to make up for lost sales of discrete GPUs.
It's not that, this stuff simply doesn't work well in practice.

The biggest problem by far is that under a lot of circumstances the distribution of compute is counterproductive; the gap between the two GPUs' performance levels makes stalls inevitable. The closer the gap, the better the performance, which basically makes the iGPU+dGPU combination the worst candidate for this sort of stuff.

The same problem existed with CPU+GPU hybrid compute; nothing really works that way, and most things are either fully GPU-accelerated or run on just the CPU.
 
Joined
Feb 19, 2009
Messages
1,047 (0.25/day)
Location
I live in Norway
System Name 3 sys spec seperated by "|"
Processor R9 3900x| R7 1700 @3.75 | 4800H
Motherboard Asrock X570M | AB350M Pro 4 | Asus Tuf A15
Cooling Air | Air | duh laptop
Memory 64gb G.skill SniperX @3600 CL16 | 64gb | 16gb
Video Card(s) EK-FC - RX Vega 64 | server | RTX2060M
Storage MP510 2TB, 660P 2TB, 2x860 evo 1tb | 960 500gb Intel 660P 1tb PM871 4x256gb ++| 1TB 660+ 1tb A1000
Display(s) AOC 28" 4K something + 1440p AOC 144hz something.
Case Phanteks EvolvX M-Atx
Power Supply Corsair RM850
Mouse g502 Lightspeed
Keyboard TT Meka G1
Joined
Jul 7, 2019
Messages
97 (0.30/day)
I wonder if AMD's Infinity Architecture is capable of allowing this sort of hybrid crossing. The IA is already planned to allow sharing of memory resources to reduce bottlenecks, and some potential work-splitting (making multi-GPUs work more like one giant GPU), but I wonder if it could also leverage a Ryzen APU's iGPU too, in a similar manner.

The main concern I have about the concept of using the iGPU (whether it's on AMD or Intel) in a hybrid manner is how much of this will adversely affect performance of the CPU. Intel already throttles when it gets too toasty, and that's mostly just pure CPU work. AMD APUs also have bottleneck limitations, but assuming the next-gen APU is more like the PS5's via Infinity Arch, what about heat limitations for it?

And for that matter, will gaming companies really be interested in pursuing and optimizing games for such a niche distributed-GPU method when they haven't done so since the tech first became available? I mean sure, a few might just because Intel sponsored a game, but it seems like an overall waste of time if they have to build the feature in themselves instead of letting the system distribute it intelligently (i.e., leaving it up to the CPU/APU + GPU to intelligently distribute the resources necessary).
 
Joined
Sep 6, 2013
Messages
1,558 (0.63/day)
Location
Athens, Greece
System Name 3 systems: Gaming / Internet / HTPC
Processor Intel i5 4460 / Thuban 1455T(Unlocked 645) @ 3.7GHz @ 1.30V / A6 7400K
Motherboard ASRock Z97 Extreme6 / Gigabyte GA-990XA-UD3 / ASUS FM2+
Cooling Modified AM2 Asetek MicroChill / Noctua U12S / CoolerMaster TX2
Memory 16GB Kingston KHX1866C10D3 / 16GB Adata 2133MHz / 8GB Kingston 2400MHz (DDR3)
Video Card(s) XFX RX 580 8GB + GT 620 (PhysX)/ GT 710 / A6 7400K iGPU
Storage Intel NVMe 500GB, Samsung NVMe 250GB + more / Kingston 240GB + more / Samsung SSD 120GB
Display(s) Samsung LE32D550 32'' TV(2 systems connected) / LG 42''
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Sharkoon 650W / Chieftec 560W
Mouse CoolerMaster / Rapoo / Logitech
Keyboard CoolerMaster / Microsoft / Logitech
Software Windows
AMD has had Radeon Dual Graphics and Hybrid technologies for a decade.
Can you please post the Dual Graphics compatibility table with Ryzen APUs? How many years to update/support it?
Also, this is something different compared to Hybrid CrossFire.

It's not that, this stuff simply doesn't work well in practice.

The biggest problem by far is that under a lot of circumstances the distribution of compute is counterproductive; the gap between the two GPUs' performance levels makes stalls inevitable. The closer the gap, the better the performance, which basically makes the iGPU+dGPU combination the worst candidate for this sort of stuff.

The same problem existed with CPU+GPU hybrid compute; nothing really works that way, and most things are either fully GPU-accelerated or run on just the CPU.
Under DX12 we could see something much better than what Hybrid Graphics was giving us. If Intel's approach works, and presumably also works in combination with AMD and Nvidia discrete cards, it will be proof that AMD and Nvidia simply decided to ignore it.
 
Joined
Feb 23, 2018
Messages
172 (0.21/day)
Ya, multi gpu is a very good thing. I'd like to see game developers take advantage of this tech for folks who have dGPU + iGPU and for folks using two heavyweight dGPUs.
 
Joined
Feb 18, 2005
Messages
2,616 (0.47/day)
Location
United Kingdom
Essentially it's offloading tasks that would normally be run on the CPU... to the iGPU. Which... is part of the CPU.

So I guess you can either choose between your CPU boosting to max and iGPU doing nothing, or CPU and iGPU both running but at lower speeds? Not seeing the value proposition TBH.
 
Joined
Aug 22, 2010
Messages
320 (0.09/day)
Location
Germany
System Name https://goo.gl/FDgehs
Not much progress since 2015 :(
 

ARF

Joined
Jan 28, 2020
Messages
883 (7.18/day)
System Name ARF System 1 (retro build) | Portable 1 (energy efficient and portable)
Processor AMD Athlon 64 4400+ X2 | AMD Ryzen 5 2500U
Motherboard ASRock 939A790GMH 790GX SATA2 |
Cooling Arctic Freezer 13 | Dual-fan, dual heat-pipe Acer inbuilt
Memory 4 x 1GB DDR-400 | 2 x 8GB DDR4-2400
Video Card(s) Radeon ASUS EAH4670/DI/512MD3 | Radeon RX 560X 4G & Vega 8
Storage ADATA XPG SX900 128GB SATA3@SATA2 SSD | Western Digital Blue 3D NAND M.2 SSD 500GB
Display(s) | LG 24UD58-B & Panasonic TX-50CX670E
Case Cooler Master HAF 912 Plus | 15-inch notebook chassis
Audio Device(s) Superlux HD681 EVO
Mouse Genius NetScroll 100X | Genius NetScroll 100X
Keyboard | Logitech Wave
Software Windows 7U SP1| Windows 10 Pro 2004
Benchmark Scores CPU-Z 17.01.64 - ST: 387.3, MT: 2060.6 CPU-Z 15.01.64 - ST: 2044, MT: 7766
Not much progress since 2015 :(
:( Nope, around 60% performance improvement between the Radeon R9 Fury X and the Radeon VII.


 
Joined
Mar 14, 2009
Messages
4,553 (1.11/day)
Location
Ohio
System Name Rainbow puke/ Orange Poop
Processor AMD Ryzen 3600/ AMD Ryzen 2600x
Motherboard ASRock X570 Gaming 4s/ ASRock B450- pro 4 atx
Cooling Cooler master Master Air/ Corsair H110i
Memory 16GB G.Skill TridentZ 3200MHZ
Video Card(s) Zotac 2080 Super AMP
Storage Corsair 512gb PCI-E 4.0/ 960 EVO 500gb/256gb Inland Premium 2280 M2
Display(s) ACER 144hz 27"
Case Thermaltake Commander C33/ Radimax Gama (LoL)
Power Supply Seasonic
Mouse Red Dragon RGB 602 Griffin
Keyboard Razer Deatstalker
Software Windows 10Pro x64
I remember this sounding fantastic 5 years ago...Still sounds fantastic.
 
Joined
Apr 8, 2010
Messages
515 (0.14/day)
Processor Intel Core i5 8400
Motherboard Gigabyte Z370N-Wifi
Cooling Silverstone AR05
Memory Micron Crucial 16GB DDR4-2400
Video Card(s) Gigabyte GTX1080 G1 Gaming 8G
Storage Micron Crucial MX300 275GB
Display(s) Dell U2415
Case Silverstone RVZ02B
Power Supply Silverstone SSR-SX550
Keyboard Ducky One Red Switch
Software Windows 10 Pro 1909
From experience, just the management of dGPU and iGPU transferring data from one another, alone, wastes more time per frame than the time saved by offloading some work to the iGPU. Not to mention the extra work the CPU has to do to keep them in sync.
 

ARF

Joined
Jan 28, 2020
Messages
883 (7.18/day)
System Name ARF System 1 (retro build) | Portable 1 (energy efficient and portable)
Processor AMD Athlon 64 4400+ X2 | AMD Ryzen 5 2500U
Motherboard ASRock 939A790GMH 790GX SATA2 |
Cooling Arctic Freezer 13 | Dual-fan, dual heat-pipe Acer inbuilt
Memory 4 x 1GB DDR-400 | 2 x 8GB DDR4-2400
Video Card(s) Radeon ASUS EAH4670/DI/512MD3 | Radeon RX 560X 4G & Vega 8
Storage ADATA XPG SX900 128GB SATA3@SATA2 SSD | Western Digital Blue 3D NAND M.2 SSD 500GB
Display(s) | LG 24UD58-B & Panasonic TX-50CX670E
Case Cooler Master HAF 912 Plus | 15-inch notebook chassis
Audio Device(s) Superlux HD681 EVO
Mouse Genius NetScroll 100X | Genius NetScroll 100X
Keyboard | Logitech Wave
Software Windows 7U SP1| Windows 10 Pro 2004
Benchmark Scores CPU-Z 17.01.64 - ST: 387.3, MT: 2060.6 CPU-Z 15.01.64 - ST: 2044, MT: 7766
From experience, just the management of dGPU and iGPU transferring data from one another, alone, wastes more time per frame than the time saved by offloading some work to the iGPU. Not to mention the extra work the CPU has to do to keep them in sync.
Nowadays CPUs don't do a lot of work. At 1080p 40% load, at 4K 15-20% load.
They must do something and be kept busy up to 80-90%, rather than sitting lazy.
 
Joined
Apr 8, 2010
Messages
515 (0.14/day)
Processor Intel Core i5 8400
Motherboard Gigabyte Z370N-Wifi
Cooling Silverstone AR05
Memory Micron Crucial 16GB DDR4-2400
Video Card(s) Gigabyte GTX1080 G1 Gaming 8G
Storage Micron Crucial MX300 275GB
Display(s) Dell U2415
Case Silverstone RVZ02B
Power Supply Silverstone SSR-SX550
Keyboard Ducky One Red Switch
Software Windows 10 Pro 1909
Nowadays CPUs don't do a lot of work. At 1080p 40% load, at 4K 15-20% load.
They must do something and be kept busy up to 80-90%, rather than sitting lazy.
You don't keep your CPU spinning for no performance gains, which is the case with adding iGPU along side your dGPU. Not to mention the amount of extra work for developers.
 
Joined
Sep 17, 2014
Messages
12,145 (5.83/day)
Location
Mars
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
If Intel's approach works
That is the million dollar question and so far all evidence has historically pointed at 'no'

Besides, what are you winning here. Just buy a decent GPU. We all know that any realtime critical task needs to be done as close to the hardware as possible, yet here we are making a full round trip back to the CPU to win what, 5-10% fps?

This is already dead imo, but maybe Intel can pull out rabbits if they combine this with new Xe sauce. End result: you will need an Intel CPU (lol, in 2020-2021?!) and an Intel Xe GPU (which?) to make a dent.

Nowadays CPUs don't do a lot of work. At 1080p 40% load, at 4K 15-20% load.
They must do something and be kept busy up to 80-90%, rather than sitting lazy.
That's not how it works, obviously. It's the IGP doing the work, not the CPU cycles you have sitting idle while gaming.
 
Joined
Sep 6, 2013
Messages
1,558 (0.63/day)
Location
Athens, Greece
System Name 3 systems: Gaming / Internet / HTPC
Processor Intel i5 4460 / Thuban 1455T(Unlocked 645) @ 3.7GHz @ 1.30V / A6 7400K
Motherboard ASRock Z97 Extreme6 / Gigabyte GA-990XA-UD3 / ASUS FM2+
Cooling Modified AM2 Asetek MicroChill / Noctua U12S / CoolerMaster TX2
Memory 16GB Kingston KHX1866C10D3 / 16GB Adata 2133MHz / 8GB Kingston 2400MHz (DDR3)
Video Card(s) XFX RX 580 8GB + GT 620 (PhysX)/ GT 710 / A6 7400K iGPU
Storage Intel NVMe 500GB, Samsung NVMe 250GB + more / Kingston 240GB + more / Samsung SSD 120GB
Display(s) Samsung LE32D550 32'' TV(2 systems connected) / LG 42''
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Sharkoon 650W / Chieftec 560W
Mouse CoolerMaster / Rapoo / Logitech
Keyboard CoolerMaster / Microsoft / Logitech
Software Windows
That is the million dollar question and so far all evidence has historically pointed at 'no'

Besides, what are you winning here. Just buy a decent GPU. We all know that any realtime critical task needs to be done as close to the hardware as possible, yet here we are making a full round trip back to the CPU to win what, 5-10% fps?

This is already dead imo, but maybe Intel can pull out rabbits if they combine this with new Xe sauce. End result: you will need an Intel CPU (lol, in 2020-2021?!) and an Intel Xe GPU (which?) to make a dent.



That's not how it works, obviously. It's the IGP doing the work, not the CPU cycles you have sitting idle while gaming.
Well, in an admittedly over-optimistic scenario, if this works out, the possibilities for much more meaningful usage than just combining an iGPU with a high-end discrete GPU are there. We just never had the chance to see what this technology could offer.
 
Joined
Nov 21, 2010
Messages
1,466 (0.42/day)
Location
Right where I want to be
System Name Miami
Processor Ryzen 3800X
Motherboard Asus Crosshair VII Formula
Cooling Ek Velocity/ 2x 280mm Radiators/ Alphacool fullcover
Memory F4-3600C16Q-32GTZNC
Video Card(s) SAPPHIRE NITRO+ Radeon RX 5700 XT
Storage 1TB WD M.2 SSD/ 2TB WD SN750/ 4TB WD Black HDD
Display(s) DELL AW3420DW / HP ZR24w
Case Lian Li O11 Dynamic XL
Audio Device(s) EVGA Nu Audio
Power Supply Seasonic Prime Gold 1000W
Mouse Corsair Scimitar/Glorious Model O-
Keyboard Corsair K95 Platinum
Software Windows 10 Pro
But sadly, it will be replaced by the inefficient Chrome engine.
It already has, and it's terrible. Whatever is driving audio cannot keep up at all, leading to DPC latency errors that cause playback to snap, crackle, and pop; the only options are to lower sound quality, deal with it, or switch to another browser. I'm now using Firefox, which isn't much better, trading one set of issues for another. Not happy that Edge is now hot garbage.
 
Joined
Sep 17, 2014
Messages
12,145 (5.83/day)
Location
Mars
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
Well, in an admittedly over-optimistic scenario, if this works out, the possibilities for much more meaningful usage than just combining an iGPU with a high-end discrete GPU are there. We just never had the chance to see what this technology could offer.
Hmhm. But this is Intel and this is/was timed to coincide with GDC. This is marketing. And probably just some wild fantasy. On the link to the Intel page the thing has 0 collaborators and all we get is a few lines of text and some weird purple blurb that is supposed to be a demo. So yay they can run a furry with the IGP taking some of the load... 1999 called.
 
Joined
Sep 6, 2013
Messages
1,558 (0.63/day)
Location
Athens, Greece
System Name 3 systems: Gaming / Internet / HTPC
Processor Intel i5 4460 / Thuban 1455T(Unlocked 645) @ 3.7GHz @ 1.30V / A6 7400K
Motherboard ASRock Z97 Extreme6 / Gigabyte GA-990XA-UD3 / ASUS FM2+
Cooling Modified AM2 Asetek MicroChill / Noctua U12S / CoolerMaster TX2
Memory 16GB Kingston KHX1866C10D3 / 16GB Adata 2133MHz / 8GB Kingston 2400MHz (DDR3)
Video Card(s) XFX RX 580 8GB + GT 620 (PhysX)/ GT 710 / A6 7400K iGPU
Storage Intel NVMe 500GB, Samsung NVMe 250GB + more / Kingston 240GB + more / Samsung SSD 120GB
Display(s) Samsung LE32D550 32'' TV(2 systems connected) / LG 42''
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Sharkoon 650W / Chieftec 560W
Mouse CoolerMaster / Rapoo / Logitech
Keyboard CoolerMaster / Microsoft / Logitech
Software Windows
Hmhm. But this is Intel and this is/was timed to coincide with GDC. This is marketing. And probably just some wild fantasy. On the link to the Intel page the thing has 0 collaborators and all we get is a few lines of text and some weird purple blurb that is supposed to be a demo. So yay they can run a furry with the IGP taking some of the load... 1999 called.
I don't think it's just wild fantasy. Koduri does have experience and knowledge of Hybrid and regular CrossFire from his time at AMD. As shown above in another post, DirectX 12 has supported the tech since 2015, so people working with GPUs have probably been working with that idea for years. For reasons unknown to us, they didn't move forward to make it available to us. But those were, in my opinion, decisions made by marketing and financial divisions, not technical problems. If we also consider that much of DirectX 12 is Mantle, maybe multi-adapter was part of Mantle, or it was supposed to be a feature for Mantle 2, 3 or something, if there were ever more versions of Mantle. Maybe a replacement for the original CrossFire back when CrossFire and SLI were still important. But then Nvidia decided to slowly kill SLI, and AMD probably had no problem following Nvidia in that decision.
Intel is now a different story: it starts with zero market share, which will definitely jump thanks to the major OEMs getting nice deals on the GPUs if they buy them as a nice little package with the CPUs. Intel's offerings will also have lower performance than AMD's and Nvidia's, not to mention a probably shorter list of features. So Intel is in a position where they will try every way possible to extend that feature list and close the gap in performance with their competitors. And multi-adapter support can do both: offer something the competition doesn't, and also improve performance when combining an Intel iGPU with a discrete Intel GPU, compared to combining an Intel iGPU (which will be doing nothing in 3D gaming) with an AMD or Nvidia discrete GPU.

P.S. "Combining two Intel GPUs for the ultimate performance in games"
That's a nice marketing line on the box of a laptop, don't you think? And many consumers will never think to ask what kind of GPUs. 2 GPUs vs 1 GPU does sound better.
 