Sunday, February 19th 2017

Intel Unveils Discrete GPU Prototype Development

Intel is making progress in its development of a new discrete GPU architecture, following its failed "Larrabee" project, which ended up as an HPC accelerator, and far older attempts such as the i740. This comes in the wake of the company's high-profile hiring of Raja Koduri, AMD's former Radeon Technologies Group (RTG) head. At the IEEE International Solid-State Circuits Conference (ISSCC) in San Francisco, the company unveiled slides pointing to the direction its GPU development is headed: essentially scaling up its existing iGPU architecture and bolstering it with mechanisms to better sustain high clock speeds.

The company's first 14 nm dGPU prototype, shown as a test chip at the ISSCC, is a two-chip solution. The first chip contains the two key components, the GPU itself and a system agent; the second chip is an FPGA that interfaces with the system bus. The GPU component, as it stands now, is based on Intel's Gen 9 architecture and features three execution unit (EU) clusters. Don't derive performance numbers from this yet, as Intel is only trying to demonstrate a proof of concept. The three clusters are wired to a sophisticated power/clock management mechanism that efficiently manages the power and clock speed of each individual EU. There's also a double-clock mechanism that doubles the boost-state clock speed beyond what today's Gen 9 EUs can handle on Intel iGPUs. Once a suitable level of energy efficiency is achieved, Intel will use newer generations of EUs and scale up EU counts, taking advantage of newer fab processes to develop bigger discrete GPUs.
More slides follow.
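As a rough illustration of the per-EU power/clock management described above, here is a toy model in Python. The class name, thresholds, and clock values are invented for illustration only; the slides do not disclose how Intel's mechanism actually chooses between power-gating, base clock, and the doubled boost clock.

```python
# Toy model of per-cluster clock/power management on the prototype dGPU.
# All names and thresholds here are illustrative assumptions, not Intel's design.
from dataclasses import dataclass

@dataclass
class EUCluster:
    base_clock_mhz: int = 1150      # roughly a Gen 9 iGPU boost clock
    utilization: float = 0.0        # 0.0 = idle, 1.0 = fully busy
    power_gated: bool = False

    def effective_clock(self) -> int:
        """Pick a clock state for this cluster, independent of its siblings."""
        if self.power_gated or self.utilization == 0.0:
            return 0                          # cluster fully power-gated
        if self.utilization > 0.9:
            return self.base_clock_mhz * 2    # "double-clock" boost state
        if self.utilization < 0.3:
            return self.base_clock_mhz // 2   # low-power half-clock state
        return self.base_clock_mhz            # ordinary base clock

# Three clusters under different loads each get their own clock state.
clusters = [EUCluster(utilization=u) for u in (0.95, 0.5, 0.0)]
for i, c in enumerate(clusters):
    print(f"cluster {i}: {c.effective_clock()} MHz")
```

The point of the sketch is simply that each cluster is managed individually, which is what allows idle clusters to be gated off while a busy one runs at the doubled boost clock.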

Source: PC Watch

65 Comments on Intel Unveils Discrete GPU Prototype Development

#1
RejZoR
It would be cool if we got a 3rd GPU player on the graphics market.
#2
lynx29
better buy me some extra nice thermal paste when they come out

Kappa
#3
TheLostSwede
RejZoR said:
It would be cool if we got a 3rd GPU player on the graphics market.
There already are a lot more GPU makers than Nvidia and AMD, but none of them are for PCs.

ARM, Imagination Technologies, Qualcomm and Vivante all make GPUs, as well as, technically, S3/VIA...
I most likely missed some companies as well, but the problem is, none of them can keep up with Nvidia, due to the amount of money Nvidia is throwing at R&D of new graphics architectures.

Even Intel isn't going to be able to catch up any time soon. At best I'd expect something mid-level for the first 2-3 years, as it's expensive, resource-intensive and time-consuming to make GPUs.
#4
lynx29
TheLostSwede said:
There already are a lot more GPU makers than Nvidia and AMD, but none of them are for PCs.

ARM, Imagination Technologies, Qualcomm and Vivante all make GPUs, as well as, technically, S3/VIA...
I most likely missed some companies as well, but the problem is, none of them can keep up with Nvidia, due to the amount of money Nvidia is throwing at R&D of new graphics architectures.

Even Intel isn't going to be able to catch up any time soon. At best I'd expect something mid-level for the first 2-3 years, as it's expensive, resource-intensive and time-consuming to make GPUs.
Yep, I myself am waiting for GTX 2080 this summer. Oh sweet mama, the power! I need the power even if it costs me my soul!!!! /howls
#5
Vayra86
Inferior shaders on an inferior dual chip solution.

Good luck with that!
#6
ZoneDymo
lynx29 said:
Yep, I myself am waiting for GTX 2080 this summer. Oh sweet mama, the power! I need the power even if it costs me my soul!!!! /howls
GTX1180*
#7
kruk
If this is actually a GPU aimed at gaming and if performance is sufficient and if Linux drivers are Open Source, then I might consider it as an option :)
#8
bug
i740 wasn't impressive at all if you looked at the numbers. Yet it was highly sought after, since it gave us acceptable performance for a good price.
If Intel manages to do just that once again, customers win.
#9
Vayra86
bug said:
i740 wasn't impressive at all if you looked at the numbers. Yet it was highly sought after, since it gave us acceptable performance for a good price.
If Intel manages to do just that once again, customers win.
Different times... today an IGP is no longer something special and AMD's Ryzen APUs are really quite far ahead here, but also in the realm of 'performance where you might need it at some point'... hardly something people are eagerly waiting for.
#10
bug
Vayra86 said:
Different times... today an IGP is no longer something special and AMD's Ryzen APUs are really quite far ahead here, but also in the realm of 'performance where you might need it at some point'... hardly something people are eagerly waiting for.
i740 was quite a capable mid-range GPU. Today's IGPs are not in the same league, so I'd like Intel to heat things up a bit in that segment. Of course, there's always the odd chance that Intel's strategy is not entirely based on my wishes, but we'll see.
#11
john_
I hope AMD is competitive with Nvidia in the high-end GPU market before Intel starts pushing out some good GPUs. Assuming they can throw enough money at the problem and come out with good GPUs, of course. We need AMD as a highly competitive company more than we need a good Intel GPU.

PS With Intel having bought Altera in the past, I wonder if their first GPUs will be targeting gamers, or miners.
#12
ensabrenoir
...Intel is a mystery to me right now. I really think they torpedoed their CPU division intentionally for some reason. No one goes from Sherlock Holmes to Inspector Clouseau that quickly.... there has to be a long game plan here somewhere
#13
cdawall
Intel would be a curious player if the drivers function well.

TheLostSwede said:
Even Intel isn't going to be able to catch up any time soon. At best I'd expect something mid-level for the first 2-3 years, as it's expensive, it's resource intensive and time consuming to make GPU's.
I bet they take the lead for certain types of mining right off of the bat.
#14
Jism
ensabrenoir said:
...Intel is a mystery to me right now. I really think they torpedoed their CPU division intentionally for some reason. No one goes from Sherlock Holmes to Inspector Clouseau that quickly.... there has to be a long game plan here somewhere
Competition! AMD is bringing out APUs that are faster than Intel's. So to maximize profits and keep shareholders happy, they set out to create their own GPUs, as in the past. Remember that Intel has sold the most VGA chipsets back in the day... it was due to complete motherboard solutions with Intel's IGP on top of that.

In business, small brands and corporations, general computers don't need an external GPU to function. An IGP is more than enough to provide 2D work and some video acceleration. These days, more and more tasks are being shifted to the GPU.
#15
GoldenX
Intel should start to develop its own drivers for its IGPs before doing a dGPU; the Windows OpenGL and Direct3D drivers are a joke.
#16
Casecutter
If Intel could come to market with even a mainstream discrete gaming (GTX 1060/470-class) offering by 2020 I'd be surprised and welcoming, but they've got a bunch of work ahead. If Raja Koduri can build a clear focus and keep engineers on task, just getting to the 1050/560 level with drivers to back it would be a good first showing.
#17
jabbadap
bug said:
i740 was quite a capable mid-range GPU. Today's IGPs are not in the same league, so I'd like Intel to heat things up a bit in that segment. Of course, there's always the odd chance that Intel's strategy is not entirely based on my wishes, but we'll see.
True. If my memory serves me well, the i740 was half the speed of a 3dfx Voodoo2. And if you take the current top dog, the GTX 1080 Ti, as the Voodoo2, then the i740 equivalent would be no less than an RX 580. I would gladly take a third discrete GPU manufacturer if it could offer that kind of speed vs. the competition today.
#18
v12dock
I'm sure they understand the huge challenges ahead just from the Larrabee debacle. I think I remember reading something from when Larrabee was in development, where an engineer was quoted saying something like making a GPU was harder than they anticipated.
#19
CrAsHnBuRnXp
ZoneDymo said:
GTX1180*
I've seen 2080 mentioned in more than one place. They might be skipping the 11-19 naming scheme for some reason.
#20
theoneandonlymrk
Vayra86 said:
Inferior shaders on an inferior dual chip solution.

Good luck with that!
Play fair dude, this is likely Raja's very first step towards something: Intel's present best iGPU with hybrid shaders, plus an FPGA added to make up for everything that's missing since they didn't fit CPU cores and, more importantly, the rest of the supporting circuitry for specific purposes, i.e. power control. The FPGA mainly acts as an interface between Intel's proprietary inter-chip interconnect and PCIe; since the iGPU was clearly designed without PCIe in mind, it makes sense to test what they can expect before scaling up the design. They won't actually make many of these, even for themselves; it's a stepping-stone chip, clear as day.

The shop-bought ones are four or five years out IMHO, and certainly won't be a dual-chip solution; maybe MCM, though.
That FPGA could possibly be a game changer if it's used to intelligently pump up the GPU core's performance, and/or to directly support a new API for mainstream acceleration use cases. Intel already does FPGAs in servers, so they know the idea has legs.
#21
jahramika
Jism said:
Competition! AMD is bringing out APUs that are faster than Intel's. So to maximize profits and keep shareholders happy, they set out to create their own GPUs, as in the past. Remember that Intel has sold the most VGA chipsets back in the day... it was due to complete motherboard solutions with Intel's IGP on top of that.

In business, small brands and corporations, general computers don't need an external GPU to function. An IGP is more than enough to provide 2D work and some video acceleration. These days, more and more tasks are being shifted to the GPU.
That is if you want to count a chip that is integrated into a CPU, which people have had no choice but to get with the CPU. No one would willingly buy the GPU if it were on a PCIe card.
#22
Jism
jahramika said:
That is if you want to count a chip that is integrated into a CPU, which people have had no choice but to get with the CPU. No one would willingly buy the GPU if it were on a PCIe card.
There was a time when IGPs were simply integrated into the motherboard's chipset. Even on AMD's platform (nForce, anyone?) it was very common to add an IGP along with the motherboard chipset. It was good sales value: simply buy CPU / RAM / mobo and you're good. But the performance was never really 'good', improper for any gaming at all, and especially lacking in good driver support.

They made attempts on AMD boards, for example, to put a separate memory module on the board itself as memory for the IGP. This was a lot faster than the shared-RAM solution.

But as long as we stick to relatively slow DDR on PCs, any IGP will lack bandwidth and thus never offer serious performance compared to a dedicated card.
#23
Reeves81x
Vayra86 said:
Inferior shaders on an inferior dual chip solution.

Good luck with that!
It's a proof-of-concept design.
#24
evernessince
theoneandonlymrk said:
Play fair dude, this is likely Raja's very first step towards something: Intel's present best iGPU with hybrid shaders, plus an FPGA added to make up for everything that's missing since they didn't fit CPU cores and, more importantly, the rest of the supporting circuitry for specific purposes, and to act as an interface between Intel's proprietary inter-chip interconnect and PCIe. They won't actually make many of these, even for themselves; it's a stepping-stone chip, clear as day.

The shop-bought ones are four or five years out IMHO, and certainly won't be a dual-chip solution; maybe MCM, though.
That FPGA could possibly be a game changer if it's used to intelligently pump up the GPU core's performance, and/or to directly support a new API for mainstream acceleration use cases. Intel already does FPGAs in servers, so they know the idea has legs.
This isn't Raja's first step. He only joined the company last month; there's no way he did these designs in a month. Raja will have a hand in their next-gen architecture and current/future drivers.
#25
Captain_Tom
I could see Intel really carving out a decent niche in the lower-mid to lower-enthusiast set of products.

1) Not quite as strong as Nvidia/AMD's halo products
2) Not quite as good price/perf as AMD's midrange
3) But industry-leading perf/watt.

For the time, the 10-25 W Ivy Bridge through Broadwell parts had absolutely incredible perf/watt. But they just couldn't scale them up efficiently past even ~50 W...