
Intel Unveils Discrete GPU Prototype Development

btarunr

Editor & Senior Moderator
Intel is making progress in its development of a new discrete GPU architecture, after its failed attempt with "Larrabee," which ended up as an HPC accelerator, and older efforts such as the i740. This comes in the wake of the company's high-profile hiring of Raja Koduri, AMD's former Radeon Technologies Group (RTG) head. At the IEEE International Solid-State Circuits Conference (ISSCC) in San Francisco, the company unveiled slides pointing to the direction in which its GPU development is headed. That direction is essentially scaling up its existing iGPU architecture and bolstering it with mechanisms to better sustain high clock speeds.

The company's first 14 nm dGPU prototype, shown as a test-chip at the ISSCC, is a 2-chip solution. The first chip contains two key components, the GPU itself and a system agent; the second chip is an FPGA that interfaces with the system bus. The GPU component, as it stands now, is based on Intel's Gen 9 architecture, and features three execution unit (EU) clusters. Don't derive numbers from this yet, as Intel is only trying to demonstrate a proof of concept. The three clusters are wired to a sophisticated power/clock management mechanism that efficiently manages the power and clock speed of each individual EU. There's also a double-clock mechanism that doubles clock speeds (of the boost state) beyond what today's Gen 9 EUs can handle on Intel iGPUs. Once a suitable level of energy efficiency is achieved, Intel will use newer generations of EUs, and scale up EU counts taking advantage of newer fab processes, to develop bigger discrete GPUs.
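Intel has not published implementation details of the per-EU power/clock management or the double-clock boost, so the following is a purely illustrative toy model of the general idea: each execution unit tracks its own clock and power state, and a shared manager grants a 2x boost per-EU only while the total stays under a power budget. All class names, clocks, and wattages here are hypothetical, not Intel's.

```python
# Toy model of per-unit clock/power management, loosely inspired by the
# mechanism described above. All names and numbers are hypothetical.

class ExecutionUnit:
    def __init__(self, base_clock_mhz=1150, idle_power_w=0.2, active_power_w=1.0):
        self.base_clock_mhz = base_clock_mhz
        self.idle_power_w = idle_power_w
        self.active_power_w = active_power_w
        self.busy = False
        self.boosted = False

    @property
    def clock_mhz(self):
        # "Double-clock" boost state: run at 2x the base clock.
        return self.base_clock_mhz * (2 if self.boosted else 1)

    @property
    def power_w(self):
        if not self.busy:
            return self.idle_power_w
        # Crude assumption: active power scales linearly with clock.
        return self.active_power_w * (2 if self.boosted else 1)


class PowerManager:
    """Grants the 2x boost per-EU while staying under a shared power budget."""

    def __init__(self, eus, budget_w):
        self.eus = eus
        self.budget_w = budget_w

    def rebalance(self):
        # Reset all boosts, then re-grant them greedily to busy EUs,
        # revoking any grant that would exceed the budget.
        for eu in self.eus:
            eu.boosted = False
        for eu in self.eus:
            if eu.busy:
                eu.boosted = True
                if sum(e.power_w for e in self.eus) > self.budget_w:
                    eu.boosted = False  # would blow the budget; revoke


eus = [ExecutionUnit() for _ in range(3)]
eus[0].busy = eus[1].busy = True
mgr = PowerManager(eus, budget_w=3.5)
mgr.rebalance()
print([eu.clock_mhz for eu in eus])  # -> [2300, 1150, 1150]
```

With a 3.5 W budget, only the first busy EU gets the double clock; granting it to the second would push the total over budget, so the manager revokes it. The real hardware's policy is certainly far more sophisticated, but this captures why per-EU control beats a single global clock domain.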


More slides follow.



 
It would be cool if we got a third GPU player in the graphics market.
 
Better buy me some extra-nice thermal paste when they come out.

Kappa
 
It would be cool if we got a third GPU player in the graphics market.

There already are a lot more GPU makers than Nvidia and AMD, but none of them are for PCs.

ARM, Imagination Technologies, Qualcomm and Vivante all make GPUs, as well as, technically, S3/VIA...
I most likely missed some companies as well, but the problem is, none of them can keep up with Nvidia due to the amount of money Nvidia throws at R&D of new graphics architectures.

Even Intel isn't going to be able to catch up any time soon. At best I'd expect something mid-level for the first 2-3 years, as it's expensive, resource-intensive and time-consuming to make GPUs.
 
There already are a lot more GPU makers than Nvidia and AMD, but none of them are for PCs.

ARM, Imagination Technologies, Qualcomm and Vivante all make GPUs, as well as, technically, S3/VIA...
I most likely missed some companies as well, but the problem is, none of them can keep up with Nvidia due to the amount of money Nvidia throws at R&D of new graphics architectures.

Even Intel isn't going to be able to catch up any time soon. At best I'd expect something mid-level for the first 2-3 years, as it's expensive, resource-intensive and time-consuming to make GPUs.


Yep, I myself am waiting for GTX 2080 this summer. Oh sweet mama, the power! I need the power even if it costs me my soul!!!! /howls
 
Inferior shaders on an inferior dual chip solution.

Good luck with that!
 
Yep, I myself am waiting for GTX 2080 this summer. Oh sweet mama, the power! I need the power even if it costs me my soul!!!! /howls

GTX1180*
 
If this is actually a GPU aimed at gaming and if performance is sufficient and if Linux drivers are Open Source, then I might consider it as an option :)
 
The i740 wasn't impressive at all if you looked at the numbers. Yet it was highly sought after, since it gave us acceptable performance for a good price.
If Intel manages to do just that once again, customers win.
 
The i740 wasn't impressive at all if you looked at the numbers. Yet it was highly sought after, since it gave us acceptable performance for a good price.
If Intel manages to do just that once again, customers win.

Different times... today an IGP is no longer something special and AMD's Ryzen APUs are really quite far ahead here, but also in the realm of 'performance where you might need it at some point'... hardly something people are eagerly waiting for.
 
Different times... today an IGP is no longer something special and AMD's Ryzen APUs are really quite far ahead here, but also in the realm of 'performance where you might need it at some point'... hardly something people are eagerly waiting for.
The i740 was quite a capable mid-range GPU. Today's IGPs are not in the same league, so I'd like Intel to heat things up a bit in that segment. Of course, there's always the odd chance that Intel's strategy is not entirely based on my wishes, but we'll see.
 
I hope AMD is competitive with Nvidia in the high-end GPU market before Intel starts pushing out some good GPUs. Assuming Intel can throw enough money at the problem and come out with good GPUs, of course. We need AMD as a highly competitive company more than we need a good Intel GPU.

PS: With Intel having bought Altera in the past, I wonder if their first GPUs will be targeting gamers, or miners.
 
...Intel is a mystery to me right now. I really think they torpedoed their CPU division intentionally for some reason. No one goes from Sherlock Holmes to Inspector Clouseau that quickly... there has to be a long game plan here somewhere.
 
Intel would be a curious player if the drivers function well.

Even Intel isn't going to be able to catch up any time soon. At best I'd expect something mid-level for the first 2-3 years, as it's expensive, resource-intensive and time-consuming to make GPUs.

I bet they take the lead for certain types of mining right off the bat.
 
...Intel is a mystery to me right now. I really think they torpedoed their CPU division intentionally for some reason. No one goes from Sherlock Holmes to Inspector Clouseau that quickly... there has to be a long game plan here somewhere.

Competition! AMD is bringing out APUs that are faster than Intel's. So to maximize profits and keep shareholders happy, they set out to create their own GPUs, as in the past. Remember that Intel sold the most VGA chipsets back in the day... it was due to complete motherboard solutions with Intel's IGP on top of that.

In business, small brands and corporations, general computers don't need an external GPU to function. An IGP is more than enough for 2D work and some video acceleration. These days, more and more tasks are being shifted to the GPU.
 
Intel should start to develop proper drivers for their IGPs before doing a dGPU; their Windows OpenGL and Direct3D drivers are a joke.
 
If Intel could come to market with even a mainstream discrete gaming offering (GTX 1060/470 class) by 2020, I'd be surprised and welcoming, but they have a lot of work ahead. If Raja Koduri can build a clear focus and keep engineers on task, just getting to the 1050/560 level with drivers to back it would be a good first showing.
 
The i740 was quite a capable mid-range GPU. Today's IGPs are not in the same league, so I'd like Intel to heat things up a bit in that segment. Of course, there's always the odd chance that Intel's strategy is not entirely based on my wishes, but we'll see.

True. If my memory serves me well, the i740 was half the speed of the 3dfx Voodoo2. And if you take the current top dog, the GTX 1080 Ti, as the Voodoo2, then the i740 would be no less than an RX 580. I would gladly take a third discrete GPU manufacturer if it could offer that kind of speed versus the competition today.
 
I'm sure they understand the huge challenges ahead just from the Larrabee debacle. I think I remember reading something when Larrabee was in development: an engineer was quoted as saying that making a GPU was harder than they had anticipated.
 
Inferior shaders on an inferior dual chip solution.

Good luck with that!
Play fair, dude. This is likely Raja's very first step towards something, i.e. Intel's present best iGPU with hybrid shaders, plus an FPGA added to make up for everything that's missing if they don't fit CPU cores and, more importantly, the rest of the supporting circuitry for specific purposes such as power control. But the FPGA mainly acts as an interface between Intel's proprietary inter-chip interconnect and PCI Express; since the iGPU was clearly designed without PCIe in mind, it makes sense to test what they can expect before scaling up the design. They won't actually make many of these, even for themselves; it's a stepping-stone chip, clear as day.

The shop-bought ones are four or five years out, IMHO, and certainly won't be a dual-chip solution; maybe MCM, though.
Because that FPGA could possibly be a game changer if used to pump the GPU's core performance intelligently, and/or to directly accelerate a new API for mainstream use cases. Intel already does FPGAs in servers, so they know it's got legs.
 
Competition! AMD is bringing out APUs that are faster than Intel's. So to maximize profits and keep shareholders happy, they set out to create their own GPUs, as in the past. Remember that Intel sold the most VGA chipsets back in the day... it was due to complete motherboard solutions with Intel's IGP on top of that.

In business, small brands and corporations, general computers don't need an external GPU to function. An IGP is more than enough for 2D work and some video acceleration. These days, more and more tasks are being shifted to the GPU.


That is, if you want to count a chip that's integrated into a CPU, which people had no choice but to get with the CPU. No one would willingly buy the GPU if it was in a PCIe slot.
 
That is, if you want to count a chip that's integrated into a CPU, which people had no choice but to get with the CPU. No one would willingly buy the GPU if it was in a PCIe slot.

There was a time when IGPs were simply integrated into the motherboard's chipset. Even on AMD's platform (nForce, anyone?) it was very common to add an IGP along with the motherboard's chipset. It was good sale value: simply buy CPU / RAM / mobo and you're good. But the performance was never really 'good', unsuitable for any gaming at all, and there was a lack of good driver support.

They made attempts on AMD boards, for example, to put a separate memory module on the board itself, which served as the memory for the IGP. This was a lot faster than the shared-RAM solution.

But as long as we stick to relatively slow DDR on PCs, any IGP will lack bandwidth and thus never offer serious performance compared to a dedicated card.
 
Play fair, dude. This is likely Raja's very first step towards something, i.e. Intel's present best iGPU with hybrid shaders, plus an FPGA added to make up for everything that's missing if they don't fit CPU cores and, more importantly, the rest of the supporting circuitry for specific purposes, and to act as an interface between Intel's proprietary inter-chip interconnect and PCI Express. They won't actually make many of these, even for themselves; it's a stepping-stone chip, clear as day.

The shop-bought ones are four or five years out, IMHO, and certainly won't be a dual-chip solution; maybe MCM, though.
Because that FPGA could possibly be a game changer if used to pump the GPU's core performance intelligently, and/or to directly accelerate a new API for mainstream use cases. Intel already does FPGAs in servers, so they know it's got legs.

This isn't Raja's first step. He only joined the company last month; there's no way he did these designs in a month. Raja will have a hand in their next-gen architecture and current/future drivers.
 