Friday, October 18th 2019

Intel Could Unveil First Discrete 10 nm GPUs in mid-2020

According to sources close to DigiTimes, Intel will unveil its first discrete 10 nm graphics cards, named "Xe", very soon, with the first wave of Xe GPUs expected to arrive some time in 2020. Said to launch mid-year, around July or August, the initial Xe GPU models of the long-awaited product will go on sale to consumers, as Intel hopes to gain a share of the massive market that uses GPUs to accelerate all kinds of tasks.

Perhaps one of the most interesting notes in the DigiTimes report is that "... Intel's GPUs have already received support from the upstream supply chain and have already been integrated into Intel's CPUs to be used in the datacenter and AI fields", meaning that AIB partners already have access to the first 10 nm graphics chips, which are ready for system integration. The first generation of Xe graphics cards will cover almost the whole GPU market, including PC, datacenter, and AI applications, where NVIDIA currently holds the top spot.
Source: DigiTimes

35 Comments on Intel Could Unveil First Discrete 10 nm GPUs in mid-2020

#1
notb
We keep seeing these LEDy, huge Xe renders... we already have this kind of GPU. Intel would join a market segment already full of very good cards (and bad ones as well - depending on what they're up to...).

Whereas AFAIK the single Xe render that made the most buzz was this - short, dark, elegant, utilitarian.
What happened?

Posted on Reply
#2
my_name_is_earl
[yawn] Disappointed so many times. Won't fall for it this time.
Posted on Reply
#3
Vayra86
notb
We keep seeing these LEDy, huge Xe renders... we already have this kind of GPU. Intel would join a market segment already full of very good cards (and bad ones as well - depending on what they're up to...).

Whereas AFAIK the single Xe render that made the most buzz was this - short, dark, elegant, utilitarian.
What happened?


Raja apparently dropped the only prototype and ate the blueprints with his morning cornflakes. He couldn't stand the regular 5775C breakfast option.
Posted on Reply
#4
silentbogo
Strangely reminds me of Larrabee prototypes, only now with bling )))
But as with 10nm CPUs, I'll believe it when I see it.

Vayra86
Raja apparently dropped the only prototype and ate the blueprints with his morning cornflakes. He couldn't stand the regular 5775C breakfast option.
Oh, I 'memba that...[IMG width="40px"]https://66.media.tumblr.com/avatar_04d36e0b2bd4_128.pnj[/IMG]


Posted on Reply
#5
ZoneDymo
Aha, so that's where those elusive 10 nm Intel parts are going to. Crafty.

Joking aside, I actually hope they will make a capable gaming GPU in the future, bring big green down a peg, and in general triple competition > dual competition.
Posted on Reply
#6
Totally
Ooooooh! Finally, I'll be able to build an AMD CPU / Intel GPU based system. Ooooooh
Posted on Reply
#7
Mouth of Sauron
I trust them completely. This time, for sure, it will be something special. And on the 10 nm node, no less. Larrabee? Iris Pro? Yes, but *this* time... And what about *that time long ago*, when Intel said they were making a graphics card (GPU was a term unheard of) that would be *really good*, unlike their previous ones? With the competition consisting NOT of an all-consuming duopoly of patents, experience, architecture accepted by software creators and graphics engines, and years of experience in production, BUT of companies like Tseng Labs, Trident, S3 and such...

Was it all the technical know-how and the leading team of engineers that produced a CPU near-monopoly? Was it reasonable to expect they would catch and overtake the opposition *then*? Or to start thinking about it when GPUs, this time with the term in existence, started to become an important, sometimes dominant, component of the supercomputers Intel was so interested in?

With this in mind, I just ignore any Intel-makes-a-GPU-which-rules rumours. Especially over the "whole range" - yeah, maybe by performance, but with a great likelihood of the price being 2x higher than a similarly performing GPU.

In the comical, or sad - depending on which side you're looking at it from - history of Intel-made GPUs, one man only, named Raja, will change everything!

Right! It was exclusively caused by the people who made all the progress in graphics cards and later GPUs over the last 30 years, ALL of whom harbored an inherent burning hatred towards both Intel and piles of treasure alike, which also applied to their associates and 'students', the relatively young engineers with knowledge and hands-on experience - dare even call it a conspiracy - so, for that reason, Intel was grossly wronged by the world, but now their hour is nigh!

The whole conspiracy is suddenly off, and a man called Raja and a few other snatched engineers will lead Intel into the lands of light (represented by nothing less than ray-traced light, so I've heard), justice and fair competition! And beyond(TM)!

Because things work like that.
Posted on Reply
#8
Vya Domus
Here we go again, the millionth piece of news telling us that their GPUs will exist sometime in the distant future, great. And mind you, there is still exactly zero concrete information about anything specific.

And people blame AMD for hyping shit, they got nothing on Intel this time.
Posted on Reply
#9
Splinterdog
Knowing Intel, you'll probably have to buy your own heatsink, though.
Posted on Reply
#10
64K
I know very little about GPUs made for work purposes, so I don't know if the drivers for those need to be regularly updated, but for the gaming market Intel is going to have to prove that they will support their drivers to gain the trust that Nvidia and AMD have with gamers. It does no good to have the fastest gaming GPUs at a decent price if they run games like crap because of buggy drivers. No one wants that scenario.
Posted on Reply
#11
mak1skav
I really hope that they will compete with NVIDIA and AMD, but I can't hide the fact that every time I see Intel trying to get into the graphics card market, it reminds me of Microsoft's attempts to enter the smartphone market.
Posted on Reply
#12
silentbogo
Oh, speaking of hype and Koduri, did anyone watch the interview with Russian Pro HiTech?
For non-native speakers, there are Eng subtitles.
Posted on Reply
#13
Anymal
Unveil soon and launch in mid 2020? Or unveil in mid 2020 as the title says?
Posted on Reply
#14
kings
No one knows anything at the moment. This isn't even Intel teasing something, it's just another rumor.
Posted on Reply
#16
Franzen4Real
Vya Domus
Here we go again, the millionth piece of news telling us that their GPUs will exist sometime in the distant future, great. And mind you, there is still exactly zero concrete information about anything specific.

And people blame AMD for hyping shit, they got nothing on Intel this time.
Unfortunately this is the day and age we live in, Vya, where # of clicks is FAR > quality of content. I think the hype train is pretty much brand agnostic really - just fueled by pure metrics, where you get this ebb and flow between the big 3, depending on who's clicking what and what all of the other outlets are "reporting" on. For instance, this article merely says "according to a source close to DigiTimes"; it's not a straight-up Intel quote/press release driving the train. To make matters worse, you and I both just added +2 to the trending metrics for Intel GPU "news", making it a sliver of a percent more likely that another one will get posted.... ;)
Posted on Reply
#17
Jism
Vya Domus
Here we go again, the millionth piece of news telling us that their GPUs will exist sometime in the distant future, great. And mind you, there is still exactly zero concrete information about anything specific.

And people blame AMD for hyping shit, they got nothing on Intel this time.
A GPU usually takes a few years to reach tapeout if you take the design time into account. So it's not so strange that they are done with the design and working on the tapeout.

Intel was once dominant (still is, I think) in the "VGA" market because it delivered plenty of motherboards with an IGP. There was no need for an additional graphics card if the only purpose was web/word/excel/printing.
Posted on Reply
#18
Vya Domus
Jism
A GPU usually takes a few years to reach tapeout if you take the design time into account. So it's not so strange that they are done with the design and working on the tapeout.
A tapeout is supposed to come after the final design is done, but tapeouts don't take a year. Usually no more than 1 or 2 revisions of the silicon are needed to finalize a product with the level of CAD that is used nowadays; in other words, once you have the design, the silicon follows soon after. I highly doubt they have anything remotely close to final right now, because if they did, it would only be a matter of a couple of months to bring it to market.
Posted on Reply
#19
Pinktulips7
Intel will kick AMD in GPUs along with CPUs. AMD needs to run away...
Posted on Reply
#20
eidairaman1
The Exiled Airman
Pinktulips7
Intel will kick AMD in GPUs along with CPUs. AMD needs to run away...
Keep on dreaming
Posted on Reply
#21
notb
Vya Domus
Here we go again, the millionth piece of news telling us that their GPUs will exist sometime in the distant future, great. And mind you, there is still exactly zero concrete information about anything specific.

And people blame AMD for hyping shit, they got nothing on Intel this time.
To be honest, it's not that different from how AMD hyped Zen.
I don't know what you'd expect.
"Concrete information" like what? Performance? They won't give you that. No serious company would.
Posted on Reply
#22
Vayra86
Vya Domus
Here we go again, the millionth piece of news telling us that their GPUs will exist sometime in the distant future, great. And mind you, there is still exactly zero concrete information about anything specific.

And people blame AMD for hyping shit, they got nothing on Intel this time.
Poor Xe...

Can I just add that, as a marketing name, Xe is utterly shit? How do you even say it? ('Kzeh')?
Posted on Reply
#23
holyprof
As I see it so far, Intel is aiming at the HPC and accelerator market (from supercomputers to accelerating desktops/laptops with AI, which will inevitably enter our lives). The ability to play games with Xe is just an added bonus. Don't expect DirectX / OpenGL / Vulkan drivers to be as good or as up-to-date as Nvidia's or AMD's, even if Intel throws millions at game developers and implements their version of "The Way It's Meant to be P(l)ayed".
I hope they succeed, because more players mean better-served customers and more options to choose from. Maybe I could add an Xe card as a compute card beside my GTX 1080 and have the ultimate Frankenstein PC with an AMD Ryzen CPU, an Nvidia GTX GPU and an Intel Xe compute/AI accelerator :cool:
Posted on Reply
#24
notb
holyprof
As I see it so far, Intel is aiming at the HPC and accelerator market (from supercomputers to accelerating desktops/laptops with AI, which will inevitably enter our lives). The ability to play games with Xe is just an added bonus. Don't expect DirectX / OpenGL / Vulkan drivers to be as good or as up-to-date as Nvidia's or AMD's, even if Intel throws millions at game developers and implements their version of "The Way It's Meant to be P(l)ayed".
By all means: no one expects Intel's drivers and libraries to be as good as Nvidia's. It'll take them years.
AMD on the other hand...

It's almost impossible to think this could be any worse than Polaris.

Intel is new in large GPUs and they'll make rookie mistakes.
I hope they succeed, because more players mean better-served customers and more options to choose from. Maybe I could add an Xe card as a compute card beside my GTX 1080 and have the ultimate Frankenstein PC with an AMD Ryzen CPU, an Nvidia GTX GPU and an Intel Xe compute/AI accelerator :cool:
Well, I hope they'll succeed simply because GPGPU acceleration today is no longer a rarely used novelty. And I do hope Intel Xe (or whatever results from this project) becomes a mainstream standard, scaling from tiny IGPs to large HPC chips.
Posted on Reply
#25
goodeedidid
Why does it take so long? Next year I might be dead in a car crash or something, geez, people don't live forever...
Posted on Reply