Thursday, January 9th 2020

Intel Unveils Xe DG1-SDV Graphics Card, Demonstrates Intent to Seriously Compete in the Gaming Space

At a media event on Wednesday, Intel invited us to check out its first working modern discrete graphics card, the Xe DG1 Software Development Vehicle (a developer edition). Leading the event was our host Ari Rauch, Intel Vice President and General Manager for Graphics Technology Engineering and dGPU Business. Much like the rough developer editions of game consoles released to developers several quarters ahead of market launch, the DG1-SDV lets software developers discover and learn the Xe graphics architecture, and develop optimization processes for their current and future software within their organizations. We walked into the event expecting a big, ugly PCB with a bare fan-heatsink, a contraption that sort of looks like a graphics card, but were pleasantly surprised with what we saw: a rather professional product design.

What we didn't get at the event, though, was a juicy technical breakdown of the Xe graphics architecture and the various components that add up to the GPU. We still left pleasantly surprised by what we were shown: it works! The DG1-SDV is able to play games at 1080p, even if the titles are technically lightweight, like "Warframe," and settings aren't maxed out. The SDV is a 15.2 cm-long graphics card that relies entirely on the PCI-Express slot for power (and hence pulls less than 75 W).
We already know from Intel's Xe slides from 2019 that the Xe architecture is designed to be extremely scalable, with a single ISA spanning everything from iGPUs and tiny discrete GPUs like the DG1-SDV all the way up to double-digit TFLOP-scale compute processors for HPC. Along the way, Intel intends to compete in the gaming and client-graphics space by developing products at just the right scale of the Xe architecture, with just the right performance per Watt, to compete with specific products from the NVIDIA-AMD duopoly. Forget the high end for a moment. If Intel is able to match even the GTX 1650 and RX 5500 (or their future $150 successors) in performance and power, it taps into a double-digit percentage of the client-graphics TAM, and that spells trouble for NVIDIA and AMD.
The Xe DG1 is backed by Intel's robust software stack, which has seen breakneck development and feature additions in recent times, such as a modern Control Center app and support for modern technologies like variable-rate shading and integer scaling. Intel has, for over a decade, held a foothold in the client media-acceleration space with its Quick Sync video encoders, and Xe dials that up a notch. The DG1 features Intel's entire media-acceleration and display-controller feature-set. Intel is also designing Xe to be extremely configurable by OEMs, letting them finely match the GPU to their products' thermal and power targets. Responding to a specific question by us, Intel didn't rule out the possibility of discrete Xe graphics cards working in tandem with Intel iGPUs (possibly starting with "Tiger Lake").
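As an aside on what driver-level support for something like variable-rate shading looks like from an application's point of view, here is a minimal sketch, ours and not Intel's, assuming a Windows machine with the Direct3D 12 SDK, of how a game can query which VRS tier the installed driver exposes:

// Minimal sketch (assumption: Windows with the D3D12 SDK; illustrative only,
// not Intel's code) that queries the variable-rate shading tier a driver exposes.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Use the default adapter; a real application would enumerate adapters via DXGI.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                              &options6, sizeof(options6)))) {
        switch (options6.VariableShadingRateTier) {
        case D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED:
            std::printf("VRS: not supported\n"); break;
        case D3D12_VARIABLE_SHADING_RATE_TIER_1:
            std::printf("VRS: Tier 1 (per-draw shading rate)\n"); break;
        case D3D12_VARIABLE_SHADING_RATE_TIER_2:
            std::printf("VRS: Tier 2 (per-draw, per-primitive, screen-space image)\n"); break;
        }
    }
    return 0;
}
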
Xe DG1-SDV in action (video)
Here are a couple of brief videos we took of the DG1-SDV alive and kicking.

Here's the card itself:

And here's the money-shot of Intel's presentation, a "Warframe" gaming session. There were no performance numbers put out, but the game is being rendered at 1080p, and appears playable.

Xe DG1-SDV Physical Design
The Xe DG1-SDV (software development vehicle) is a contraption that's more evolved than a working prototype, but stops short of being a production product. It is designed to be stable and durable enough for its target audience: ISVs, individual software developers, and systems engineers evaluating the thing for major hardware OEMs. The card has the exact same physical dimensions as the Radeon R9 Nano, and fits into any machine that has two full-height expansion slots and a PCI-Express x16 interface; no additional power cables are needed. A single fan cools an aluminium fin-stack heatsink underneath. Throughout the demo, the cooler was more than audible and in need of acoustic optimization. The cooler shroud and back-plate bear a futuristic silvery design; there's also a row of LEDs near the I/O shield that put out light into the grooves of the shroud.
"Tiger Lake" gate-crashes the party
Intel is still trying to break its habit of being a CPU maker first and foremost. "Tiger Lake" is a significant inflection point: it debuts Intel's next-gen "Willow Cove" CPU core and is the first implementation of Xe as an iGPU solution. Given the volumes of CPUs with iGPUs Intel pushes, Xe is expected to hit critical mass in the client segment, with "Tiger Lake" proving the launchpad. There are some interesting tidbits in the "Tiger Lake" slide:
  • There's a "massive leap" in graphics performance, compared to current Gen11 architecture, thanks to Xe
  • Unless we're badly mistaken, Intel just put out CPU IPC guidance for "Willow Cove" as being "double digit" (we assume in comparison to current Ice Lake / Sunny Cove)
  • A "massive" AI performance improvement from DLBoost and support for more AVX-512 instructions
Complete Intel Slide-Deck
Official Intel Board Shots

44 Comments on Intel Unveils Xe DG1-SDV Graphics Card, Demonstrates Intent to Seriously Compete in the Gaming Space

#26
Vya Domus
Jism: It's a compute card, being set up as a general-purpose graphics card.
This may be somewhat of a spoiler, but all GPU architectures nowadays are first and foremost compute accelerators with some fixed-function graphics blocks bolted on, even the ones for mobile SoCs. Compute has priority over graphics; you just have to look at some block diagrams describing their architecture and you'll immediately realize this.
Posted on Reply
#27
Totally
notb: The fact that you'd like GPUs to be cheaper doesn't make them overpriced. That's just a price. On a free market. GPU makers can set it however they deem fit.

For a long time we had higher prices from Nvidia and lower ones from AMD. Nvidia was making money, AMD wasn't.
Now AMD has raised prices to Nvidia's level, which is just a confirmation of who was right all along.

Intel isn't exactly known for selling at break-even. Don't expect them to be the cheapest option. :)

Seriously, PC gaming is a cheap hobby anyway. Don't be such a miser...
Not only that, some people were turning up their noses at AMD cards because they were 'cheap', and some went as far as to put down others who couldn't afford Nvidia cards. Lost count of the 'ooh, look at me flex, I have two Titan Xs in SLI and all I ever turn this PC on to do is browse the web and YouTube, oooh.'
Posted on Reply
#28
Mescalamba
So, a low-end start. I'm actually still curious about performance per watt and such, 'cause if it can be scaled up...
Posted on Reply
#29
QUANTUMPHYSICS
All new GPUs will be measured against the 1080 Ti and the 2080 Ti.

All Intel really has to do is come out swinging with a GPU that can ray trace better than the 2080 Ti at 4K 60 fps and costs less than $1,000.
Posted on Reply
#30
Houd.ini
Vya Domus: To all the "bring competition to AMD/Nvidia" prophets, buckle up for some serious disappointment. I have a feeling the purpose of this demo is sort of to bring the expectations down a bit; there have been some wild hopes and dreams put in Intel's discreet GPUs.
I think this GPU is too discreet for the demo to tell us anything.
Posted on Reply
#31
Mouth of Sauron
All as-yet non-existent GPUs, like NVIDIA's "+50%", AMD's 5600 XT, "Big Navi" and... this:

Price/performance in a real-world test, and general availability. Or it didn't happen.

My 0.02 on the game chosen: Intel wouldn't put a random game on show. They knew exactly what they were showing, they had an at least decently optimized driver for it, and I wouldn't exclude the possibility that the game itself was quietly patched for this occasion. Yes, a shadow of a doubt, except it's much more than a shadow. Not a random game someone installed: this was pretty much the best-performing game, with the best and custom driver optimization, on a (granted) engineering sample, but no doubt hand-picked...

We can't say something like 'we witnessed games gain 20%+ from driver optimization' (as in the attached video) in this case... Games did that, but that was something else entirely. You would never see the first appearance of a new architecture with a game that natively performs badly on that GPU architecture and doesn't have drivers optimized for it.

But: a real-world price/performance test and general availability, and then it happened...
Posted on Reply
#32
swirl09
QUANTUMPHYSICS: All new GPUs will be measured against the 1080 Ti and the 2080 Ti.

All Intel really has to do is come out swinging with a GPU that can ray trace better than the 2080 Ti at 4K 60 fps and costs less than $1,000.
I don't think anyone is that pushed on RT yet to make it something people need to chase, and I don't think 4K60 at a grand is a great bar either. Those GPUs will likely be three and a half and two years old respectively by the time something is launched, so I don't want to know about "competes with", I want to know about "destroys" - for a grand! I seriously doubt Nvidia are going to be idle either.

Really there is nothing to see here. In fact, for what was demonstrated, I would want an eyewitness to confirm the dude didn't plug the HDMI into the mobo ><!

It is 2020 right? The year they enter the dGPU market? And we are impressed with a shroud? Okay...
Posted on Reply
#33
mastrdrver
Let's ask the obvious question: if Intel is struggling to keep up with demand for their CPUs, since all of their fabs are running at full capacity, then where is there room to make a GPU?
Posted on Reply
#34
ShurikN
mastrdrver: Let's ask the obvious question: if Intel is struggling to keep up with demand for their CPUs, since all of their fabs are running at full capacity, then where is there room to make a GPU?
They could take it to a different fab, Samsung for example.
Posted on Reply
#35
notb
ShurikN: They could take it to a different fab, Samsung for example.
Or TSMC to push out AMD. :)
But I believe the Samsung partnership has already been leaked (if not officially announced).
Divide Overflow: Completely underwhelming.
Because you were expecting what? A 2080 Ti competitor? Even your beloved AMD can't make that.

Some people here thought (hoped?) it would take Intel 1-2 more years to come up with a working POC. But it's already here. :)
ZoneDymo: The fact that you think "overpriced" can be anything but an opinion, and trying to fight that opinion as if it's a fact, is just... silly to say the least.
It can't be anything other than an opinion, because price perception is always subjective.
Something can be called "overpriced" when it's more expensive than the majority of alternative products. It's about comparing, about statistics.
If you have a single product of a kind - with no competition - it just can't be overpriced.
Posted on Reply
#36
Casecutter
Oh hey we "Intel" are able to make a slick prototype that runs a game! We're still here....

How quaint!
Okay we see you.
Posted on Reply
#37
NC37
dicktracy: Since AMD also decided to join Nvidia in overcharging for their low- and mid-range GPUs, this looks like a good time to break up this duopoly. AMD fans sure miss Raja Koduri's breathtaking prices.
Indeed. AMD just isn't cutting it in competition. Everything is going to RT and AMD still releases non-RT parts. Then charges a premium for them. Which is ridiculous. Those GPUs were already discontinued the moment they launched. They should have cut the prices to make up for the devaluation.

The day the 104-series chips are back at the high-mid price range is the day I return to Nvidia. At this point AMD can't do that. If Intel can, all the best to them.
Posted on Reply
#38
Zubasa
NC37: Everything is going to RT and AMD still releases non-RT parts. Then charges a premium for them. Which is ridiculous. Those GPUs were already discontinued the moment they launched. They should have cut the prices to make up for the devaluation.
As if the GTX 16 series that Nvidia released has RT or Tensor cores in it.
Apparently Nvidia believes that it is a waste of die space for current-gen mainstream GPUs.
Posted on Reply
#39
londiste
Vya Domus: Even for an early prototype it's just too underwhelming; it's also obvious this is still nowhere near a real product that will be out there in the hands of people anytime soon.
All the coverage says Intel reps kept saying exactly that. It is a development vehicle - to get developers started on something actually Xe before any real product appears.

While Intel has iGPUs and drivers/software for them, what they are aiming for now is far higher. AMD and Nvidia have been doing that for decades and have a lot of built-up software support that Intel needs to somehow match. This will take a considerable amount of time. Intel is taking every chance to try and start that process before an actual product is out.
Posted on Reply
#40
BakerMan1971
It's new and exciting.
I love the idea that something is at least being attempted.
Roll on Computex; hopefully more juicy details and demonstrations.
Posted on Reply
#41
mastrdrver
ShurikN: They could take it to a different fab, Samsung for example.
Except the GPU was made in Intel's fabs, not someone else's. It would be very time-consuming to port it to another fab.
Posted on Reply
#42
TranceHead
iO: Only 8 PCIe lanes on the PCB and the complete lack of any details still isn't promising, but at least they're showing actual hardware now instead of just renders.
I seriously doubt it's saturating 8 lanes, so why rig it up for 16 lanes when it won't use them?
Posted on Reply
#43
Vayra86
This is as 'new' for Intel as RDNA is new for AMD. It's just an iteration with fancy packaging. This will end up as a faster iGPU.

It does not compete with dGPUs at all, currently and from what we've seen. Can this scale? Sure. Will it scale as well as AMD's or Nvidia's tech? Not by a long shot. The fact it can run some ancient game at low settings is... well... even Intel iGPUs have done that for years. Being capable of 30-60 fps at 1080p low instead of 720p medium is not an achievement in 2020.

What they've shown is a new shroud, really, and lots of plans and powerpoint slides.
Posted on Reply
#44
Zizo007
This thing is so noisy. I hope the final product is quieter.
Posted on Reply