
Intel iGPU+dGPU Multi-Adapter Tech Shows Promise Thanks to its Realistic Goals

Joined
Apr 8, 2010
Messages
515 (0.14/day)
Processor Intel Core i5 8400
Motherboard Gigabyte Z370N-Wifi
Cooling Silverstone AR05
Memory Micron Crucial 16GB DDR4-2400
Video Card(s) Gigabyte GTX1080 G1 Gaming 8G
Storage Micron Crucial MX300 275GB
Display(s) Dell U2415
Case Silverstone RVZ02B
Power Supply Silverstone SSR-SX550
Keyboard Ducky One Red Switch
Software Windows 10 Pro 1909
I don't think it's just wild fantasy. Koduri does have experience and knowledge of Hybrid and regular CrossFire from his time at AMD. As shown above, in another post, DirectX 12 has supported the tech since 2015, so people working with GPUs have probably been working with that idea for years. For reasons unknown to us, they didn't move forward to make it available to us. But in my opinion those are decisions made by marketing and financial divisions, not technical problems. If we also consider that much of DirectX 12 is Mantle, maybe multi-adapter was part of Mantle, or it was supposed to be a feature for Mantle 2, 3 or something, if there were more versions of Mantle. Maybe a replacement for the original CrossFire back when CrossFire and SLI were still important. But then Nvidia decided to slowly kill SLI and AMD probably had no problem following Nvidia in that decision.
DirectX 12 gives developers more access to the underlying hardware of GPUs, so it is possible to, for example, explicitly do task 1 on GPU1, and once that's done, copy the result to GPU2 and use it to do task 2. So yes, you could say all that was made possible since Mantle. A lot of that control wasn't there in DirectX 11 and before; SLI/CF just merged multiple GPUs into one logical GPU. The problem is that making it work, and especially making it work well (significantly better than original CF and SLI), becomes a lot of work for developers. Just too much work for too little gain.
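The workflow described above can be sketched as a toy model. This is not the real D3D12 API; the function names (igpu_pass, copy_across, dgpu_pass) are hypothetical stand-ins for the explicit per-adapter command lists and cross-adapter copies that DX12 exposes, with threads simulating frames in flight:

```python
# Toy model of DX12-style explicit multi-adapter (hypothetical names,
# NOT the real D3D12 API): the application explicitly runs pass 1 on
# one adapter, copies the intermediate result across, then runs pass 2
# on the other adapter. Overlapping frames hides part of the copy cost.
from concurrent.futures import ThreadPoolExecutor

def igpu_pass(frame):        # e.g. a pre-pass scheduled on the iGPU
    return frame * 2

def copy_across(result):     # explicit cross-adapter copy (over PCIe);
    return result            # the app, not the driver, schedules this

def dgpu_pass(intermediate): # main pass scheduled on the discrete GPU
    return intermediate + 1

def render(frame):
    # The developer wires every step by hand; nothing is merged
    # into "one logical GPU" the way SLI/CF did it.
    return dgpu_pass(copy_across(igpu_pass(frame)))

# Two workers stand in for two adapters working on frames in flight.
with ThreadPoolExecutor(max_workers=2) as pool:
    frames = list(pool.map(render, range(4)))

print(frames)  # [1, 3, 5, 7]
```

The point of the sketch is the burden it makes visible: every arrow in the pipeline is the developer's responsibility, which is exactly the "too much work for too little gain" problem.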
 
Joined
Sep 17, 2014
Messages
12,145 (5.83/day)
Location
Mars
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
I don't think it's just wild fantasy. Koduri does have experience and knowledge of Hybrid and regular CrossFire from his time at AMD. As shown above, in another post, DirectX 12 has supported the tech since 2015, so people working with GPUs have probably been working with that idea for years. For reasons unknown to us, they didn't move forward to make it available to us. But in my opinion those are decisions made by marketing and financial divisions, not technical problems. If we also consider that much of DirectX 12 is Mantle, maybe multi-adapter was part of Mantle
This, I think, hits the nail on the head, but you fail to draw the right conclusion. If it's not a technical problem, it's an economic one, and there is absolutely zero reason right now to make it viable all of a sudden. DX12's mGPU bit is up to devs, and it still is even if Intel offloads parts of it to its IGP. It won't happen because devs will need to implement it; the perfect proof of that is the current demise of SLI and, to a somewhat lesser degree, Crossfire. And the few percent you can win on an IGP in absolute performance is a complete waste of time, especially because discrete GPUs get stronger much faster than IGPs. By the time you're done implementing it, you're already looking at a cheap discrete GPU that can dwarf its performance for 30 bucks. Those product stacks jump up in perf by 15-30% on a yearly or biyearly basis. IGPs are still stuck struggling with medium 1080p graphics, hopefully at something above 15 fps.
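The arithmetic behind that argument is simple enough to write down. The growth rates here are the assumed figures from the paragraph above (a ~25% generational dGPU gain, a hypothetical ~15% uplift from IGP offloading), not measured data:

```python
# Rough arithmetic behind the argument (assumed rates, not benchmarks):
# if a discrete GPU gains ~25% per generation and offloading to the IGP
# adds a fixed ~15%, one generational jump already overtakes the
# combined iGPU+dGPU setup of today.
dgpu = 100.0        # baseline dGPU performance, arbitrary units
igpu_boost = 0.15   # hypothetical uplift from IGP offloading
growth = 0.25       # assumed per-generation dGPU improvement

this_gen_combined = dgpu * (1 + igpu_boost)  # ~115
next_gen_alone = dgpu * (1 + growth)         # ~125

print(next_gen_alone > this_gen_combined)  # True
```

And that comparison ignores the development time spent implementing the offload in the first place, which only widens the gap.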

I also remember this one

Didn't fly, and it's now dead and buried. The theory is great. In practice it just won't work.
 
Joined
Sep 6, 2013
Messages
1,558 (0.63/day)
Location
Athens, Greece
System Name 3 systems: Gaming / Internet / HTPC
Processor Intel i5 4460 / Thuban 1455T(Unlocked 645) @ 3.7GHz @ 1.30V / A6 7400K
Motherboard ASRock Z97 Extreme6 / Gigabyte GA-990XA-UD3 / ASUS FM2+
Cooling Modified AM2 Asetek MicroChill / Noctua U12S / CoolerMaster TX2
Memory 16GB Kingston KHX1866C10D3 / 16GB Adata 2133MHz / 8GB Kingston 2400MHz (DDR3)
Video Card(s) XFX RX 580 8GB + GT 620 (PhysX)/ GT 710 / A6 7400K iGPU
Storage Intel NVMe 500GB, Samsung NVMe 250GB + more / Kingston 240GB + more / Samsung SSD 120GB
Display(s) Samsung LE32D550 32'' TV(2 systems connected) / LG 42''
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Sharkoon 650W / Chieftec 560W
Mouse CoolerMaster / Rapoo / Logitech
Keyboard CoolerMaster / Microsoft / Logitech
Software Windows
This, I think, hits the nail on the head, but you fail to draw the right conclusion. If it's not a technical problem, it's an economic one, and there is absolutely zero reason right now to make it viable all of a sudden. DX12's mGPU bit is up to devs, and it still is even if Intel offloads parts of it to its IGP. It won't happen because devs will need to implement it; the perfect proof of that is the current demise of SLI and, to a somewhat lesser degree, Crossfire. And the few percent you can win on an IGP in absolute performance is a complete waste of time, especially because discrete GPUs get stronger much faster than IGPs. By the time you're done implementing it, you're already looking at a cheap discrete GPU that can dwarf its performance for 30 bucks. Those product stacks jump up in perf by 15-30% on a yearly or biyearly basis. IGPs are still stuck struggling with medium 1080p graphics, hopefully at something above 15 fps.

I also remember this one

Didn't fly, and it's now dead and buried. The theory is great. In practice it just won't work.
The economic problems I have in mind are more like: "If we keep (for example) SLI, we will have to support it in drivers and we will end up selling two GTX 2070s instead of one 2080 Ti." Well, Intel doesn't have this dilemma. This could end up as their competing option against SLI and Crossfire, and Nvidia and AMD killing those two doesn't mean that Intel will follow. At least not yet, not as long as it is, logically, (far) behind. So, what starts as a combination of an iGPU and a discrete GPU could evolve into combining two discrete GPUs. And the fact that someone would be able to combine two Intel GPUs from different market segments, a low-end and a mid-range one for example, or even different-generation discrete GPUs - that was the whole idea in the first place anyway - could offer Intel an advantage.
 
Joined
Sep 17, 2014
Messages
12,145 (5.83/day)
Location
Mars
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
The economic problems I have in mind are more like: "If we keep (for example) SLI, we will have to support it in drivers and we will end up selling two GTX 2070s instead of one 2080 Ti." Well, Intel doesn't have this dilemma. This could end up as their competing option against SLI and Crossfire, and Nvidia and AMD killing those two doesn't mean that Intel will follow. At least not yet, not as long as it is, logically, (far) behind. So, what starts as a combination of an iGPU and a discrete GPU could evolve into combining two discrete GPUs. And the fact that someone would be able to combine two Intel GPUs from different market segments, a low-end and a mid-range one for example, or even different-generation discrete GPUs - that was the whole idea in the first place anyway - could offer Intel an advantage.
We've seen that before too, when Raja told us we could put two RX 480s in Crossfire to match Nvidia's performance at some point. Didn't quite work out. But you have a point regardless; people do already have the IGP in many cases. This hinges entirely on Intel's ability to keep broad support intact, and I'm not holding my breath.
 