Tuesday, January 5th 2021

Intel DG2 Xe-HPG Features 512 Execution Units, 8 GB GDDR6

Intel's return to discrete gaming GPUs may have had a modest beginning with the Iris Xe MAX, but the company is looking to take a real stab at the gaming market. Code in the latest 100.9126 graphics driver, along with OEM data-sheets pieced together by VideoCardz, reveals that its next attempt will be substantially bigger. Called "DG2," and based on the Xe-HPG graphics architecture, a derivative of Xe targeting gaming graphics, the new GPU allegedly features 512 Xe execution units. To put this number into perspective, the Iris Xe MAX features 96, as does the Iris Xe iGPU found in Intel's "Tiger Lake" mobile processors. The upcoming 11th Gen Core "Rocket Lake-S" is rumored to have an Xe-based iGPU with 48. Assuming comparable clock speeds, this alone amounts to a roughly 5x compute power uplift over DG1 and a 10x uplift over the "Rocket Lake-S" iGPU. 512 EUs work out to 4,096 programmable shaders.
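To see where those multiples come from, here is a minimal back-of-envelope sketch in Python; the 8 FP32 ALUs per EU, the FMA-based 2 FLOPs per ALU per clock, and the shared 1.5 GHz clock are simplifying assumptions for a like-for-like comparison, not confirmed DG2 specs:

```python
# Back-of-envelope FP32 throughput comparison from EU counts alone.
# Assumptions (not from the article): each Xe EU packs 8 FP32 ALUs,
# an FMA counts as 2 FLOPs per ALU per clock, and every part runs
# at a purely illustrative 1.5 GHz.
ALUS_PER_EU = 8
FLOPS_PER_ALU_CLOCK = 2   # fused multiply-add
CLOCK_GHZ = 1.5           # hypothetical, for comparison only

parts = {
    "DG2 (rumored)": 512,
    "Iris Xe MAX (DG1)": 96,
    "Rocket Lake-S iGPU (rumored)": 48,
}

for name, eus in parts.items():
    shaders = eus * ALUS_PER_EU
    tflops = shaders * FLOPS_PER_ALU_CLOCK * CLOCK_GHZ / 1000
    print(f"{name}: {eus} EUs = {shaders} shaders, ~{tflops:.1f} TFLOPS")
```

Under these assumptions, DG2 lands around 12.3 TFLOPS versus roughly 2.3 for DG1 and 1.2 for the "Rocket Lake-S" iGPU, matching the 5x and 10x figures above.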

A leaked OEM data-sheet referencing the DG2 also mentions a rather contemporary video memory setup, with 8 GB of GDDR6 memory. While the Iris Xe MAX is built on Intel's homebrew 10 nm SuperFin node, Intel announced that its Xe-HPG chips will use third-party foundries. With these specs, Intel potentially has a GPU to target competitive e-sports gaming (where the money is). Sponsorship of major e-sports clans could help with the popularity of Intel Graphics. With enough money coming in, Intel could finally invest in scaling the architecture up to even higher client graphics market segments. As for availability, VideoCardz predicts a launch roughly coinciding with that of Intel's "Tiger Lake-H" mobile processor series, possibly slated for mid-2021.
Source: VideoCardz

34 Comments on Intel DG2 Xe-HPG Features 512 Execution Units, 8 GB GDDR6

#1
Vayra86
So about five MX350's then, now. Cool, 720p gaming is now within reach! In mid 2021... in a handful of select models with low-power CPUs.

Wake me up when they get serious k
Posted on Reply
#2
londiste
Vayra86So about five MX350's then, now. Cool, 720p gaming is now within reach! In mid 2021... in a handful of select models with low-power CPUs.
What? Looking at the 512EU claim, this is 4096 SPs.
First, 4096SP is about 6.4 MX350's.
Second, RX6800 is 6 MX350's.
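For reference, the unstated baseline in that math is the MX350's 640 CUDA cores and the RX 6800's 3,840 stream processors; a minimal sketch:

```python
# The implicit constants: GeForce MX350 = 640 CUDA cores; RX 6800 = 3840 SPs.
MX350_SP = 640
print(4096 / MX350_SP)   # DG2's 4096 SPs -> 6.4 "MX350s"
print(3840 / MX350_SP)   # RX 6800        -> 6.0 "MX350s"
```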
Posted on Reply
#3
Fouquin
londisteWhat? Looking at the 512EU claim, this is 4096 SPs.
First, 4096SP is about 6.4 MX350's.
Second, RX6800 is 6 MX350's.
Also, 14 Gbps GDDR6 should be about 6.5x LPDDR4X's bandwidth if they scale up the internal PHYs. That will undoubtedly increase overall performance.

So to build off the conservative "5x" remark from @Vayra86, we'd choose the GTX 1050 rather than the MX350, as its memory subsystem is more in line with the DG2's dedicated GDDR. 5x of GTX 1050 performance is the RTX 3070/RTX 2080 Ti. That's pretty serious if it's priced well, and judging by only 8GB of VRAM, it probably will be.
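A minimal sketch of how that ~6.5x bandwidth figure can be reached; the 256-bit bus for DG2 and the 128-bit LPDDR4X-4266 configuration for the Iris Xe MAX are assumptions, not confirmed specs:

```python
# Peak-bandwidth sketch behind the "~6.5x" figure above.
# Assumed (not confirmed): DG2 pairs 14 Gbps GDDR6 with a 256-bit bus;
# Iris Xe MAX (DG1) uses LPDDR4X-4266 on a 128-bit bus.
def peak_bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    # GB/s = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte
    return gbps_per_pin * bus_width_bits / 8

dg2_bw = peak_bandwidth_gbs(14.0, 256)    # -> 448 GB/s
dg1_bw = peak_bandwidth_gbs(4.266, 128)   # -> ~68 GB/s
print(f"DG2 ~{dg2_bw:.0f} GB/s vs DG1 ~{dg1_bw:.0f} GB/s: {dg2_bw / dg1_bw:.1f}x")
```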
Posted on Reply
#4
TechLurker
I don't really see this denting any major GPU numbers unless Intel can somehow force it into prebuilt "gaming" PCs, which might be possible via an "All Intel" initiative, especially in countries where they have more of a stranglehold. That said, with Radeon currently trading blows with NVIDIA, both of them aggressively supporting OEMs to get their GPUs into prebuilts, and the fact that AMD could push an "All AMD" initiative of its own, it's going to be a long battle, but one worth watching. Especially since it's been a long time since we last had a three-way GPU battle, when every maker was trying to throw in new and innovative features to push their GPUs' value further.
Posted on Reply
#5
Vayra86
londisteWhat? Looking at the 512EU claim, this is 4096 SPs.
First, 4096SP is about 6.4 MX350's.
Second, RX6800 is 6 MX350's.
Seeing is believing... AMD couldn't scale up that easily over a decade of GCN.

Can Intel feed those SPs, and are they efficient enough? Is the software support in good order? All of this can easily put them far below 6 MX350's, and I reckon it will. And then there's the power budget... die size... lots of things that really matter when gauging whether there's any future in this at all.

Anyway, I hope they make a big dent with their first serious release. I just question if this is the one, as we've not really seen much of Xe in gaming yet. It's easy to feed a low-end iGPU next to a CPU, and it's what Intel has done for ages, and the results are convincingly unimpressive. I still don't quite understand why they axed Broadwell as they did, when you see console APUs now using similar cache and on-chip memory systems. But maybe that was the old, pre-panic-mode, laid-back, quad-core Intel at work. Today they have Raja, but somehow that doesn't inspire confidence.
Posted on Reply
#6
lexluthermiester
bluetriangleclockOnly ~11% of Steam users use AMD GPUs. That number is going to drop when Intel starts shipping their graphics cards. R.I.P., AMD.
Given your comment history the question needs to be asked: Do you enjoy making pointless comments? I mean really, no one in their right mind would believe that Intel is going to sweep AMD aside like you suggest. Just not gonna happen.
Posted on Reply
#7
Vayra86
lexluthermiesterGiven your comment history the question needs to be asked: Do you enjoy making pointless comments? I mean really, no one in their right mind would believe that Intel is going to sweep AMD aside like you suggest. Just not gonna happen.
It's probably his paper launch. Five posts to an ignore button is quite an achievement, I gotta say.
Posted on Reply
#8
AnarchoPrimitiv
Third-party foundry, huh? Where is Intel going to find any available capacity at TSMC?
Posted on Reply
#9
Vayra86
AnarchoPrimitivThird-party foundry, huh? Where is Intel going to find any available capacity at TSMC?
Who said TSMC ;)

This is a wild assumption and another reason I'm not entirely convinced this is the big one for Intel. Last I checked they wanted to leverage their 10 nm SuperFin, but apparently it doesn't handle high power draw well.
Posted on Reply
#10
RandallFlagg
"The manufacturer did not confirm which foundry would receive orders on DG2 wafers, but it is rumored that the DG2 GPU might be manufactured by TSMC in a 6nm process technology."


videocardz.com/newz/intel-confirms-dg2-gpu-xe-hpg-features-up-to-512-execution-units


Edit: Haven't heard much about TSMC 6nm; apparently it was revealed in late 2019. According to this, it went into mass production in August 2020. Couldn't find what they were mass producing, though. Perhaps it is DG2.

Edit 2: According to this article, 6nm is an enhanced EUV 7nm with +20% density. For reference, this would make it about the same density as Samsung's new 5nm node.

www.gizchina.com/2020/08/21/tsmc-6nm-process-is-currently-in-mass-production/
Posted on Reply
#11
dyonoctis
Intel will need a decent campaign (and results) to make the average consumer buy their gaming GPU. For years and years, Intel iGPUs have been associated with the worst gaming experience you could get on a low-cost laptop. Going from a gma945 to a laptop with a Radeon sticker was a liberating experience :D

"Intel gaming GPU" doesn't roll off the tongue just yet, and knowing their history, it's no wonder most people are cautious. For a "newcomer" to the market this close to launch, I don't know whether not giving any numbers on expected performance is a sign of wisdom or a sign that there's nothing to hype about.
Posted on Reply
#12
Chrispy_
Tiger Lake's Xe iGPU wasn't a disaster, and that was a 25 W part comparable to Vega 8.

If we take a Vega 8 and multiply by 5.3 to emulate the shift from 96 to 512 EUs, we get a "Vega42". That's not exactly bleeding edge, given that a Vega 64 is starting to show its age, but two-thirds of a Vega 64 would probably put it in the ballpark of a GTX 1060 6GB or RX 570.
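A quick sketch of that estimate, assuming (generously) that performance scales linearly with EU count:

```python
# Naive linear scaling from Tiger Lake's 96-EU Xe iGPU (roughly a
# Vega 8 peer at 25 W, per the comment above) to DG2's rumored 512 EUs.
# Real scaling is sub-linear: bandwidth, power, and clocks all cap it.
VEGA8_CUS = 8
eu_ratio = 512 / 96                          # ~5.33x more EUs
print(f"~Vega {VEGA8_CUS * eu_ratio:.0f}")   # -> ~Vega 43, near the "Vega42" ballpark
```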
Posted on Reply
#13
Patr!ck
This is just the beginning. I can't wait for a full Intel build with CPU + GPU + Optane SSD.
Posted on Reply
#14
sepheronx
Patr!ckThis is just the beginning. I can't wait for a full Intel build with CPU + GPU + Optane SSD.
Why?
Posted on Reply
#15
londiste
dyonoctisGoing from a gma945
GMA945? The 945 chipsets? Aren't these well over 10 years old, like 13 years or so? :D
Posted on Reply
#16
swaaye
dyonoctisIntel will need a decent campaign (and results) to make the average consumer buy their gaming GPU. For years and years, Intel iGPUs have been associated with the worst gaming experience you could get on a low-cost laptop. Going from a gma945 to a laptop with a Radeon sticker was a liberating experience :D

"Intel gaming GPU" doesn't roll off the tongue just yet, and knowing their history, it's no wonder most people are cautious. For a "newcomer" to the market this close to launch, I don't know whether not giving any numbers on expected performance is a sign of wisdom or a sign that there's nothing to hype about.
It comes down to performance and gaining some mind share, of course. It's not hard to get the news out these days. If the performance is there in the games that are popular at the time, and the price is right, it will go fine.

If Intel are now willing to invest the resources to compete in gaming GPUs (and surely AI/compute as well), it will happen. It is a gigantic investment at this point, but they aren't a strapped-for-cash startup. I'm all for seeing NVIDIA getting slapped around a bit. Heh.
Posted on Reply
#17
ZoneDymo
Vya DomusI always wonder what kind of person wastes their time to make random accounts just to thread crap random forums.
I think it's just an alt of someone already on this forum tbh
sepheronxWhy?
comment history paints a pretty clear picture of the reason for such an odd statement.
Posted on Reply
#18
InVasMani
Vayra86Who said TSMC ;)

This is a wild assumption and another reason I'm not entirely convinced this is the big one for Intel. Last I checked they wanted to leverage their 10 nm SuperFin, but apparently it doesn't handle high power draw well.
I'd presume Intel would try to leverage Samsung, because TSMC is the emerging fabrication industry leader these days. To me it makes more sense for Intel to counterbalance that while they sort out their own fabrication issues; some insight from Samsung and cross-licensing could help in that regard and put increased pressure back on TSMC the other way. Samsung also makes some critical GPU components, so they might even work out a discount. On the other hand, Intel might provide some tech IP and discounts the other way in areas where they aren't directly competing. Intel might even be able to shift some other production to Samsung and alleviate its chipset shortage problem. I feel Intel is long overdue for a chipset node shrink anyway.
Posted on Reply
#19
mechtech
If it works and they are in stock, that would be an advantage over the competition ;)
Posted on Reply
#20
Steevo
londisteWhat? Looking at the 512EU claim, this is 4096 SPs.
First, 4096SP is about 6.4 MX350's.
Second, RX6800 is 6 MX350's.
I will have to wait and see actual reviews before I would even think about buying one. The claim that it will match the RX6800 based purely on shader count is far-fetched; clock speed, memory bandwidth, and so much more all have to be aligned for it to even reach last-gen performance.
Posted on Reply
#21
RandallFlagg
mechtechIf it works and they are in stock, that would be an advantage over the competition ;)
^^^^ That's what makes this the ideal time for Intel to get into dGPUs.

Sucking up some more of TSMC's capacity will surely affect other parts of the market, though. It's a zero-sum game; you can't get blood from a turnip.
Posted on Reply
#22
dyonoctis
londisteGMA945? The 945 chipsets? Aren't these well over 10 years old, like 13 years or so? :D
Yeah, that was on my second laptop, a little HP with a Core 2 Duo T5200 @ 1.60 GHz and a 1280 x 800 screen, for 600€. The CPU was a huge step up from what I had 3 years before (a Toshiba Satellite with a single-core Intel Celeron @ 1.47 GHz, but with an ATI Radeon Xpress 200M, for 800€). Back then, computers were getting old really fast :D
swaayeIt comes down to performance and gaining some mind share, of course. It's not hard to get the news out these days. If the performance is there in the games that are popular at the time, and the price is right, it will go fine.

If Intel are now willing to invest the resources to compete in gaming GPUs (and surely AI/compute as well), it will happen. It is a gigantic investment at this point, but they aren't a strapped-for-cash startup. I'm all for seeing NVIDIA getting slapped around a bit. Heh.
I'm especially interested in the compute support. Intel already showed that they won't let NVIDIA get their way in video editing, but 3D rendering is the battle I'm looking forward to. NVIDIA went out of their way to get a monopoly in that domain; AMD went as far as developing their own render engine (but it's not that great).
Posted on Reply
#23
DeathtoGnomes
Called "DG2," and bas
That has to be a typo, my name abbreviated is D2G, hope they fix it. :D :twitch:
Posted on Reply
#24
laszlo
In this case the real question is: can it run Crysis? Or only Tetris... :D
Posted on Reply
#25
1d10t
Oh yes please, give us a stronger iGPU so AMD will make an RDNA-based APU :D
Posted on Reply