Wednesday, December 12th 2018

Intel Xe Kicks the Door Open to Challenge the GeForce-Radeon Duopoly

Intel's discrete graphics card for PC enthusiasts is real. Intel won't just address the pro-graphics and accelerated-compute markets, but also consumer graphics, challenging the duopoly of NVIDIA GeForce and AMD Radeon. Scheduled for 2020, the new Intel Xᵉ is a family of discrete GPUs targeting the client segment (consumer graphics) as well as the enterprise segment (pro-graphics and compute).

As for performance, we speculate that the first Xᵉ products could span a vast lineup of ASICs, starting in the single-digit TFLOP/s range for the client-segment GPU, judging purely by a nondescript performance-over-time graph presented by Intel. The graph depicts performance doubling roughly linearly over time up to Gen9, then rising to what Intel states is 1 TFLOP/s for the Gen11 iGPU in 2019 (a full four years after Gen9). A spectrum of GPUs extends from the entry-level client segment all the way up to the mid-range and enthusiast segments (Intel finally used the E-word).
For the enterprise-segment GPU, however, the graph shows a sharp rise in performance over Gen11, which could well land in the double- to triple-digit TFLOP/s range. Intel is targeting the compute data-center market dominated by NVIDIA Tesla and AMD Radeon Instinct. The graph neatly explains why Intel wants a discrete GPU now: it is after the lucrative compute data-center segment, but wants to pass the dividends of that R&D on to gamers, too. Source: AnandTech
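For context on where a figure like "1 TFLOP/s" comes from: a GPU's theoretical peak is simply execution units × FLOPs per unit per clock × clock speed. A minimal sketch of that arithmetic (the Gen11 EU count, per-EU throughput, and clock below are illustrative assumptions, not figures from Intel's graph):

```python
def peak_tflops(units: int, flops_per_unit_per_clock: int, clock_ghz: float) -> float:
    """Theoretical peak throughput: units x FLOPs/unit/clock x clock.

    With the clock in GHz the product is in GFLOP/s, so divide by 1000
    to express the result in TFLOP/s.
    """
    return units * flops_per_unit_per_clock * clock_ghz / 1000.0

# Illustrative Gen11 GT2-class figures (assumed here for the example):
# 64 EUs, 16 FP32 FLOPs per EU per clock, ~1.0 GHz.
print(round(peak_tflops(64, 16, 1.0), 2))  # -> 1.02
```

Peak numbers like this say little about delivered gaming performance, of course: two architectures with identical theoretical TFLOP/s can post very different frame rates.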

72 Comments on Intel Xe Kicks the Door Open to Challenge the GeForce-Radeon Duopoly

#1
StrayKAT
At first I thought it was pro-oriented from earlier comments of theirs, but recently they announced they'll be supporting Linux gaming as well... which led me to think it's for gaming in general.

More power to them! We need more competitors. I might just fall into their camp myself.

Since I'm already eyeing a NUC, it'd be great to see it in one of those eventually.
Posted on Reply
#2
Vayra86
They will have to go into a LOT more detail than stating teraflops for performance if they want to make an entrance.

We all know how well Tflops translate to performance between different architectures... (NOT). This tells us just about nothing. Still, nice to see the confirmation that they're pushing discrete GPU now. Never thought it'd come to this honestly, and a lot needs to happen before I trust Intel to provide a solid high end GPU...
Posted on Reply
#3
dj-electric
I just hope RTG 2.0 gets the funds RTG 1.0 never had, so we get some nice competition.
Posted on Reply
#4
BakerMan1971
I am mildly excited about this announcement; at least we get a three-legged race.
Posted on Reply
#5
natr0n
K version lets you overclock I've heard.
Posted on Reply
#6
opteron
natr0n said:
K version lets you overclock I've heard.
It's the X version for oc, K is for CPU.
Posted on Reply
#7
ArbitraryAffection
I won't be buying their graphics cards, but if Intel can compete it should mean NVIDIA lowers their pricing (sorry, but 20-series pricing is stupid).

Or... Intel being Intel, they follow NVIDIA's lead and set the pricing stupidly high, too (they do with their CPUs, so why not GPUs?). Shrug.

Either way, I don't think I'm going to replace my Vega 64 any time soon.
Posted on Reply
#8
theoneandonlymrk
Vayra86 said:
They will have to go into a LOT more detail than stating teraflops for performance if they want to make an entrance.

We all know how well Tflops translate to performance between different architectures... (NOT). This tells us just about nothing. Still, nice to see the confirmation that they're pushing discrete GPU now. Never thought it'd come to this honestly, and a lot needs to happen before I trust Intel to provide a solid high end GPU...
True, and people seem to be ignoring the fact that only the enterprise-segment GPU potentially has more than a few TFLOPs. So: total shit for gaming but brilliant for mining, or data mining, is my prediction for Gen11.
Posted on Reply
#10
Fleurious
Another player in the discrete GPU market will be great.
Posted on Reply
#11
dj-electric
craigo said:
Why would anything good ever happen?
You're right. Intel should have basically gotten some of the best people in the GPU industry and then just given up on the whole idea. That will surely get us closer to fair GPU prices.
Posted on Reply
#12
craigo
dj-electric said:
You're right. Intel should have basically gotten some of the best people in the GPU industry and then just given up on the whole idea. That will surely get us closer to fair GPU prices.
I suppose, but I have no illusions that Intel cares for consumer markets; Larrabee turned into Knights Landing and had nothing to do with the consumer space.
This will no doubt turn into the backend for your subscription cloud-based gaming service, so you can play the latest console ports on your NUC (there you go, consumer) with your thin-client OS.
Posted on Reply
#13
R0H1T
There's a small possibility that Intel might throw a lot of silicon behind this ambitious move & cross-subsidize their entry into the dGPU space, à la Atom. If so, they could match some of the mid-range Nvidia GPUs & upper-mid-range from AMD. Unlike CPUs, you can throw a lot of "cores" into a GPU & they'll work just fine, assuming there isn't a major bottleneck somewhere.
Posted on Reply
#14
Bones
Will they have crappy TIM on these too...? :fear:
Posted on Reply
#15
yakk
They got PowerPoint slides down perfectly.

The rest... :confused:
Posted on Reply
#16
Vayra86
craigo said:
I suppose, but I have no illusions that Intel cares for consumer markets; Larrabee turned into Knights Landing and had nothing to do with the consumer space.
This will no doubt turn into the backend for your subscription cloud-based gaming service, so you can play the latest console ports on your NUC (there you go, consumer) with your thin-client OS.
Painful as it is, that will certainly be part of their business plan.
Posted on Reply
#17
R0H1T
Not unless they're buying a stake in Comcast/AT&T/Verizon or they're looking at an exclusive streaming service, only for Intel users. Why waste money on something like that when you can "upsell" GPUs probably at a much higher adoption rate as well?
Posted on Reply
#18
craigo
R0H1T said:
Not unless they're buying a stake in Comcast/AT&T/Verizon or they're looking at an exclusive streaming service, only for Intel users. Why waste money on something like that when you can "upsell" GPUs probably at a much higher adoption rate as well?
What hardware do you suppose the providers you listed use to provide their services, and who do you think they buy it from?
Posted on Reply
#19
R0H1T
craigo said:
What hardware do you suppose the providers you listed use to provide their services and who do you you think they buy it from?
You said something about NUCs, right, for (game) streaming? How big is the NUC market vs. the dGPU one, or for that matter streaming services vs. playing from a local disk? Besides, there are other hurdles like data plans, broadband availability & so much more, before gaming via "Netflix" becomes affordable &/or popular.

I'm not sure what you're saying, please elaborate?
Posted on Reply
#20
EarthDog
BakerMan1971 said:
I am mildly excited about this announcement; at least we get a three-legged race.
Even if one leg is as long as a kickstand... :p
Posted on Reply
#21
mtcn77
As much as I hate to admit it: there is literally '0' brand distinction in the marketplace when developers design to be as brand-agnostic as possible in order to consolidate their userbase. Intel's aim may be aimed specifically at this 'mainstream'. They did try to break pace with PixelSync and whatnot, but the key point is always agnosticism and the sheer fps gallop of the GPU in question, which they haven't quite established yet.
Posted on Reply
#23
bug
Hardware is nothing if they don't nail driver support.
And let's not kid ourselves, a three player market is no better than a two player one. We just get to be a bit more picky.
Posted on Reply
#24
neatfeatguy
I can make some graphic charts where the line keeps going up, but it doesn't actually represent any true data points!

Quick! Intel! Put me on your marketing team. I'm going to lead you down the path that rocks!
Posted on Reply
#25
Nkd
Performance graph without numbers! Why do companies keep pulling that, lol. On top of that, the chart makes it look like midrange is barely faster than integrated, and the same goes for enthusiast vs. mid-range, lol. Not expecting much from Intel the first go around.
Posted on Reply