Thursday, January 9th 2020
Intel Unveils Xe DG1-SDV Graphics Card, Demonstrates Intent to Seriously Compete in the Gaming Space
At a media event on Wednesday, Intel invited us to check out its first working modern discrete graphics card, the Xe DG1 Software Development Vehicle (developer edition). Leading the event was our host Ari Rauch, Intel Vice President and General Manager for Graphics Technology Engineering and dGPU Business. Much like the rough developer editions of game consoles released to developers several quarters ahead of market launch, the DG1-SDV lets software developers discover and learn the Xe graphics architecture, and develop optimization processes for their current and future software within their organizations. We walked into the event expecting a big ugly PCB with a bare fan-heatsink, a contraption that sort-of looks like a graphics card; instead, we were pleasantly surprised by what we saw: a rather professional product design.
What we didn't get at the event, though, was a juicy technical breakdown of the Xe graphics architecture and the various components that add up to the GPU. We still left pleasantly surprised by what we were shown: it works! The DG1-SDV is able to play games at 1080p, even if they are technically lightweight titles like "Warframe," and aren't running maxed-out settings. The SDV is a 15.2 cm-long graphics card that relies entirely on the PCI-Express slot for power (and hence pulls less than 75 W).

We already know from Intel's Xe slides from 2019 that the Xe architecture is designed to be extremely scalable, with a single ISA scaling all the way from iGPUs and tiny discrete GPUs like the DG1-SDV up to double-digit TFLOP-scale compute processors for HPCs. Along the way, though, Intel intends to compete in the gaming and client-graphics space by developing products at just the right scale of the Xe architecture, with just the right performance/Watt, to compete with specific products from the NVIDIA-AMD duopoly. Forget the high-end for a moment. If Intel is able to match even the GTX 1650 and RX 5500 (or their future $150 successors) in performance and power, it ends up tapping into a double-digit percentage of the client-graphics TAM, and that spells trouble for Santa Clara and Markham.

The Xe DG1 is backed by Intel's robust software stack, which has seen breakneck development and feature additions in recent times, such as a modern Control Center app and support for modern technologies like variable-rate shading and integer scaling. Intel has, for over a decade, held a foothold in the client media-acceleration space with its Quick Sync video encoders, and Xe dials that up a notch: the DG1 features Intel's entire media-acceleration and display-controller feature-set. Intel is also designing Xe to be extremely configurable by OEMs, so the GPU can finely match their products' thermal and power targets.
Responding to a specific question from us, Intel didn't rule out the possibility of discrete Xe graphics cards working in tandem with Intel iGPUs (possibly starting with "Tiger Lake").

Xe DG1-SDV in action (video)
Here are a couple of brief videos we took of the DG1-SDV alive and kicking.
Here's the card itself:
And here's the money-shot of Intel's presentation, a "Warframe" gaming session. There were no performance numbers put out, but the game is being rendered at 1080p, and appears playable.
Xe DG1-SDV Physical Design
The Xe DG1-SDV (software development vehicle) is a contraption that's more evolved than a working prototype, yet stops short of being a production product. It is designed to be stable and durable enough for its target audience: ISVs, individual software developers, and systems engineers evaluating it for major hardware OEMs. The card has the exact same physical dimensions as the Radeon R9 Nano, and fits into any machine with two full-height expansion slots and a PCI-Express x16 interface; no additional power cables are needed. A single fan cools an aluminium fin-stack heatsink underneath. Throughout the demo, the cooler was more than audible and in need of acoustic optimization. The cooler shroud and back-plate bear a futuristic silvery design, and a row of LEDs near the I/O shield puts out light into the grooves of the shroud.

"Tiger Lake" gate-crashes the party
Intel is still trying to break the habit of being a CPU maker first. The "Tiger Lake" microarchitecture marks a significant inflection point, debuting Intel's next-gen "Willow Cove" CPU cores and the first implementation of Xe as an iGPU solution. Given the volumes of CPUs with iGPUs Intel pushes, Xe is expected to hit critical mass in the client segment, with "Tiger Lake" providing the launchpad. There are some interesting tidbits in the "Tiger Lake" slide:
- There's a "massive leap" in graphics performance, compared to current Gen11 architecture, thanks to Xe
- Unless we're badly mistaken, Intel just put out CPU IPC guidance for "Willow Cove" as "double digit" (we assume in comparison to the current Ice Lake / Sunny Cove)
- A "massive" AI performance improvement from DLBoost and support for more AVX-512 instructions
44 Comments on Intel Unveils Xe DG1-SDV Graphics Card, Demonstrates Intent to Seriously Compete in the Gaming Space
All Intel really has to do is come out swinging with a GPU that can ray trace better than the 2080 Ti at 4K 60 fps and costs less than $1000.
Price/performance real-world test and general availability. Or it didn't happen.
My $0.02 on the game chosen - Intel wouldn't put a random game on show. They knew exactly what they were showing, they had an at least decently optimized driver for it, and I wouldn't exclude the possibility that the game itself was patched, undocumented, for this occasion. Yes, a shadow of a doubt, except it's much more than a shadow. Not a random game someone installed - this was pretty much the best-performing game, with the best and custom driver optimization, on (granted) an engineering sample, but no doubt hand-picked...
We can't say something like 'we witnessed games gain 20%+ from driver optimization' (as in the attached video) in this case... Games did that, but that was something else entirely. You would never see the first appearance of a new architecture on a game that natively performs badly on the GPU architecture and, furthermore, doesn't have its own driver optimizations yet.
But, price/performance real world test and general availability, and then it happened...
Really, there is nothing to see here. In fact, for what was demonstrated, I would want an eyewitness to confirm the dude didn't plug the HDMI into the mobo ><!
It is 2020 right? The year they enter the dGPU market? And we are impressed with a shroud? Okay...
But I believe the Samsung partnership has already been leaked (if not officially announced). Because you were expecting what? 2080Ti competitor? Even your beloved AMD can't make that.
Some people here thought (hoped?) it would take Intel 1-2 more years to come up with a working POC. But it's already here. :) It can't be anything other than an opinion, because price perception is always subjective.
Something can be called "overpriced" when it's more expensive than the majority of alternative products. It's about comparing, about statistics.
If you have a single product of a kind - with no competition - it just can't be overpriced.
How quaint!
Okay we see you.
The day the 104 series chips are back at the high mid price range is the day I return to nVidia. At this point AMD can't do that. If Intel can, all the best to them.
Apparently nVidia believes that it is a waste of die space for current gen mainstream gpus.
While Intel has iGPUs and drivers/software for them, what they are aiming for now is far higher. AMD and Nvidia have been doing that for decades and have a lot of built-up software support that Intel needs to somehow match. This will take a considerable amount of time. Intel is taking every chance to start that process before an actual product is out.
I love the idea that something is at least being attempted
Roll on Computex, hopefully more juicy details and demonstrations
It does not compete with dGPU at all, currently and from what we've seen. Can this scale? Sure. Will it scale as well as AMD or Nvidia's tech? Not by a long shot. The fact it can run some ancient game at low is... well.. even Intel IGPs have done that for years. Being capable of 30~60 fps at 1080p low instead of 720p medium is not an achievement in 2020.
What they've shown is a new shroud, really, and lots of plans and powerpoint slides.