
Intel DG2 Xe-HPG Features 512 Execution Units, 8 GB GDDR6

btarunr

Editor & Senior Moderator
Intel's return to discrete gaming GPUs may have had a modest beginning with the Iris Xe MAX, but the company is looking to take a real stab at the gaming market. Driver code from the latest 100.9126 graphics driver, and OEM data-sheets pieced together by VideoCardz, reveal that its next attempt will be substantially bigger. Called "DG2," and based on the Xe-HPG graphics architecture, a derivative of Xe targeting gaming graphics, the new GPU allegedly features 512 Xe execution units. To put this number into perspective, the Iris Xe MAX features 96, as does the Iris Xe iGPU found in Intel's "Tiger Lake" mobile processors. The upcoming 11th Gen Core "Rocket Lake-S" is rumored to have an Xe-based iGPU with 48 EUs. Assuming comparable clock speeds, this alone amounts to a roughly 5x compute-power uplift over DG1, and more than 10x over the "Rocket Lake-S" iGPU. At 8 shaders per EU, 512 EUs work out to 4,096 programmable shaders.
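Purely as a sanity check on those ratios, here is a quick sketch; the 8 shader ALUs per EU and the clock-parity assumption are just that, assumptions:

```python
# Back-of-the-envelope EU comparison. The 8 ALUs-per-EU figure and
# "comparable clock speeds" are assumptions, not confirmed specs.
ALUS_PER_EU = 8
DG2_EUS = 512

others = {
    "Iris Xe MAX (DG1)": 96,
    "Tiger Lake Iris Xe iGPU": 96,
    "Rocket Lake-S iGPU (rumored)": 48,
}

print(f"DG2: {DG2_EUS} EUs -> {DG2_EUS * ALUS_PER_EU} shaders")
for name, eus in others.items():
    print(f"{name}: {eus} EUs -> {eus * ALUS_PER_EU} shaders, "
          f"DG2 is {DG2_EUS / eus:.1f}x wider")
```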

A leaked OEM data-sheet referencing the DG2 also mentions a rather contemporary video memory setup, with 8 GB of GDDR6 memory. While the Iris Xe MAX is built on Intel's homebrew 10 nm SuperFin node, Intel announced that its Xe-HPG chips will use third-party foundries. With these specs, Intel potentially has a GPU to target competitive e-sports gaming (where the money is). Sponsorship of major e-sports clans could help with the popularity of Intel Graphics. With enough wins on the board, Intel could finally invest in scaling up the architecture to even higher client graphics market segments. As for availability, VideoCardz predicts a launch roughly coinciding with that of Intel's "Tiger Lake-H" mobile processor series, possibly slated for mid-2021.



View at TechPowerUp Main Site
 
So about five MX350s, then. Cool, 720p gaming is now within reach! In mid-2021... in a handful of select models with low-power CPUs.

Wake me up when they get serious k
 
So about five MX350s, then. Cool, 720p gaming is now within reach! In mid-2021... in a handful of select models with low-power CPUs.
What? Looking at the 512 EU claim, this is 4,096 SPs.
First, 4,096 SPs is about 6.4 MX350s.
Second, the RX 6800 is 6 MX350s.
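For reference, the shader counts behind those ratios, taking the commonly cited 640 CUDA cores for the MX350 and 3,840 stream processors for the RX 6800, and ignoring clocks and architecture entirely:

```python
# Shader-count-only comparison; says nothing about clocks, IPC or bandwidth.
DG2_SHADERS = 512 * 8     # 512 EUs x 8 ALUs per EU
MX350_CORES = 640         # GeForce MX350 CUDA cores
RX6800_SHADERS = 3840     # Radeon RX 6800 stream processors

print(f"DG2 / MX350     = {DG2_SHADERS / MX350_CORES:.1f}")      # ~6.4
print(f"RX 6800 / MX350 = {RX6800_SHADERS / MX350_CORES:.1f}")   # 6.0
```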
 
What? Looking at the 512 EU claim, this is 4,096 SPs.
First, 4,096 SPs is about 6.4 MX350s.
Second, the RX 6800 is 6 MX350s.

Also, 14 Gbps GDDR6 should be about 6.5x LPDDR4X's bandwidth if they scale up the internal PHYs. That will undoubtedly increase overall performance.

So to build off the conservative "5x" remark from @Vayra86, we'd choose the GTX 1050 rather than the MX350, as its memory subsystem is more in line with the DG2's dedicated GDDR. 5x the GTX 1050's performance lands around the RTX 3070/RTX 2080 Ti. That's pretty serious if it's priced well, and judging by only 8 GB of VRAM, it probably will be.
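A rough sketch of where the ~6.5x bandwidth figure above comes from, assuming DG1's LPDDR4X-4266 on a 128-bit bus and a hypothetical 256-bit, 14 Gbps GDDR6 configuration for DG2 (the leak does not state the bus width):

```python
# Peak memory bandwidth in GB/s = (per-pin data rate in Gbps) * (bus width in bits) / 8
dg1_bw = 4.266 * 128 / 8    # Iris Xe MAX, LPDDR4X-4266, 128-bit -> ~68 GB/s
dg2_bw = 14.0 * 256 / 8     # assumed 14 Gbps GDDR6 on a 256-bit bus -> 448 GB/s

print(f"DG1: {dg1_bw:.0f} GB/s, DG2: {dg2_bw:.0f} GB/s, ratio ~{dg2_bw / dg1_bw:.1f}x")
```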
 
I don't really see this denting any major GPU numbers unless Intel can somehow force it into prebuilt "gaming" PCs, which might be possible via an "All Intel" initiative, especially in countries where they have more of a stranglehold. That said, with Radeon currently trading blows with NVIDIA, both of them aggressively supporting OEMs to get their GPUs into prebuilts, and the fact that AMD could push an "All AMD" initiative too, it's going to be a long battle, but one worth watching. Especially since it's been a long time since we last had a three-way GPU battle, with every maker trying to throw in new and innovative features to further push their GPUs' value.
 
What? Looking at the 512 EU claim, this is 4,096 SPs.
First, 4,096 SPs is about 6.4 MX350s.
Second, the RX 6800 is 6 MX350s.
Seeing is believing... So far, AMD couldn't scale up that easily over a decade of GCN.

Can Intel feed those SPs, and are they efficient enough? Is the driver support in good order? All of this can easily put them far below 6 MX350s, and I reckon it will. And then there is power budget... die size... lots of things that really matter in gauging whether there is any future in it at all.

Anyway, I hope they make a big dent with their first serious release. I just question whether this is the one, as we've not really seen much of Xe in gaming yet. It's easy to feed a low-end iGPU next to a CPU, that's what Intel has done for ages, and the results are convincingly unimpressive. I still don't quite understand why they axed Broadwell as they did, when you see console APUs now using similar cache and on-chip memory systems. But maybe that was the old, pre-panic-mode, laid-back quad-core Intel at work. Today they have Raja, but somehow that doesn't inspire confidence.
 
Only ~11% of Steam users use AMD GPUs. That number is going to drop when Intel starts shipping their graphics cards. R.I.P., AMD.
Given your comment history the question needs to be asked: Do you enjoy making pointless comments? I mean really, no one in their right mind would believe that Intel is going to sweep AMD aside like you suggest. Just not gonna happen.
 
Given your comment history the question needs to be asked: Do you enjoy making pointless comments? I mean really, no one in their right mind would believe that Intel is going to sweep AMD aside like you suggest. Just not gonna happen.

It's probably his paper launch. Five posts to an ignore button is quite an achievement, I gotta say.
 
Third-party foundry, huh? Where is Intel going to find any available capacity at TSMC?
 
Third-party foundry, huh? Where is Intel going to find any available capacity at TSMC?
Who said TSMC ;)

This is a wild assumption and another reason I'm not entirely convinced this is the big one for Intel. Last I checked they wanted to leverage their 10 nm SuperFin, but apparently that node doesn't handle high power draw well.
 
"The manufacturer did not confirm which foundry would receive orders on DG2 wafers, but it is rumored that the DG2 GPU might be manufactured by TSMC in a 6nm process technology."




Edit: Haven't heard much about TSMC 6nm, apparently it was revealed late 2019. According to this, they went into mass production in August 2020. Couldn't find what they were mass producing though. Perhaps it is DG2.

Edit 2: According to this article, 6nm is an enhanced EUV 7nm with +20% density. For reference, this would make it about the same density as Samsung's new 5nm node.

 
Intel will need a decent campaign (and results) to make the average consumer buy their gaming GPU. For years and years, Intel iGPUs have been associated with the worst gaming experience you could get on a low-cost laptop. Going from a GMA 945 to a laptop with a Radeon sticker was a liberating experience :D

"Intel gaming GPU" doesn't roll off the tongue just yet; knowing their history, it's no wonder that most people are cautious. For "newcomers" to the market this close to launch, I don't know whether not giving any numbers on expected performance is a sign of wisdom, or just that there's nothing to hype about.
 
Tiger Lake's Xe iGPU wasn't a disaster, and at 25 W it was comparable to a Vega 8.

If we take a Vega 8 and multiply by ~5.3 to emulate the shift from 96 to 512 EUs, we get a "Vega 42". That's not exactly bleeding edge, given that a Vega 64 is starting to show its age, but two-thirds of a Vega 64 would probably put it in the ballpark of a GTX 1060 6 GB or an RX 570.
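The arithmetic behind that estimate, as a sketch only; it assumes performance scales linearly with EU/CU count, which it rarely does in practice:

```python
# Naive linear scaling of Tiger Lake's 96-EU Iris Xe (roughly Vega 8 class)
# up to DG2's rumored 512 EUs.
TIGER_LAKE_EUS = 96
DG2_EUS = 512
VEGA8_CUS = 8

scale = DG2_EUS / TIGER_LAKE_EUS      # ~5.33x
equivalent_cus = VEGA8_CUS * scale    # ~42.7 -> "Vega 42"

print(f"scale factor: {scale:.2f}x -> roughly a 'Vega {equivalent_cus:.0f}'")
print(f"fraction of a Vega 64: {equivalent_cus / 64:.2f}")   # ~0.67
```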
 
This is just the beginning. I can't wait for a full Intel build with CPU + GPU + Optane SSD.
 
Intel will need a decent campaign (and results) to make the average consumer buy their gaming GPU. For years and years, Intel iGPUs have been associated with the worst gaming experience you could get on a low-cost laptop. Going from a GMA 945 to a laptop with a Radeon sticker was a liberating experience :D

"Intel gaming GPU" doesn't roll off the tongue just yet; knowing their history, it's no wonder that most people are cautious. For "newcomers" to the market this close to launch, I don't know whether not giving any numbers on expected performance is a sign of wisdom, or just that there's nothing to hype about.

It comes down to performance and gaining some mindshare, of course. It's not hard to get the news out these days. If the performance is there in the games that are popular at the time, and the price is right, it will do fine.

If Intel is now willing to invest the resources to compete in gaming GPUs (and surely AI/compute as well), it will happen. It is a gigantic investment at this point, but they aren't a cash-strapped startup. I'm all for seeing NVIDIA getting slapped around a bit. Heh.
 
I always wonder what kind of person wastes their time making random accounts just to thread-crap random forums.

I think it's just an alt of someone already on this forum, tbh.

Comment history paints a pretty clear picture of the reason for such an odd statement.
 
Who said TSMC ;)

This is a wild assumption and another reason I'm not entirely convinced this is the big one for Intel. Last I checked they wanted to leverage their 10 nm SuperFin, but apparently that node doesn't handle high power draw well.
I'd presume that Intel would try to leverage Samsung, because TSMC is the emerging fabrication industry leader these days. To me it makes more sense for Intel to counterbalance that while they sort out their own fabrication issues; some of the insight from Samsung and cross-licensing could help in that regard and put increased pressure back on TSMC. Samsung also makes some critical GPU components, so they might even work out a discount. On the other hand, Intel might provide some tech IP and discounts the other way in areas where they aren't directly competing against them. Intel might even be able to shift some other resources to Samsung and alleviate its chipset shortage problem. I feel Intel is long overdue for a chipset node shrink anyway.
 
If it works and they are in stock, that would be an advantage over the competition ;)
 
What? Looking at the 512 EU claim, this is 4,096 SPs.
First, 4,096 SPs is about 6.4 MX350s.
Second, the RX 6800 is 6 MX350s.


I will have to wait and see the actual reviews before I would even think about buying one. The claim that it will equal an RX 6800 based on pure shader count is far-fetched; clock speed, memory bandwidth, and so much more all have to be aligned for it to reach last-gen performance.
 
If it works and they are in stock, that would be an advantage over the competition ;)

^^^^ That's what makes this the ideal time for Intel to get into dGPUs.

Sucking up some more of TSMC's capacity will surely affect other parts of the market, though. It's a zero-sum game; you can't get blood from a turnip.
 
GMA945? The 945 chipsets? Aren't these well over 10 years old, like 13 years or so? :D
Yeah, that was on my second laptop, a little HP with a Core 2 Duo T5200 @ 1.60 GHz and a 1280 x 800 screen, for 600€. The CPU was a huge step up from what I had 3 years before (a Toshiba Satellite with a single-core Intel Celeron @ 1.47 GHz, but with an ATI Radeon Xpress 200M, for 800€). Back then computers were getting old really fast :D

It comes down to performance and gaining some mindshare, of course. It's not hard to get the news out these days. If the performance is there in the games that are popular at the time, and the price is right, it will do fine.

If Intel is now willing to invest the resources to compete in gaming GPUs (and surely AI/compute as well), it will happen. It is a gigantic investment at this point, but they aren't a cash-strapped startup. I'm all for seeing NVIDIA getting slapped around a bit. Heh.
I'm especially interested in the compute support. Intel has already shown that they won't let Nvidia get their way in video editing, but 3D rendering is the battle I'm looking forward to. Nvidia went out of their way to get a monopoly in that domain. AMD went as far as to develop their own render engine (but it's not that great).
 
Called "DG2," and bas

That has to be a typo, my name abbreviated is D2G, hope they fix it. :D :twitch:
 