Tuesday, September 15th 2020

Intel DG2 Discrete Xe Graphics Block Diagram Surfaces

New details have leaked on Intel's upcoming DG2 graphics accelerator, which could help shed some light on what exactly can be expected from Intel's foray into the discrete graphics department. For one, a product listing shows an Intel DG2 graphics accelerator being paired with 8 GB of GDDR6 memory and a Tiger Lake-H CPU (the 45 W version with 8 cores, which should carry only 32 EUs in its integrated graphics hardware). The 8 GB of GDDR6 detail is interesting, as it points towards a 256-bit memory bus - one that is expected to be paired with the 512 EU version of Intel's DG2 (remember that a 384 EU version is also expected, but that one carries only 6 GB of GDDR6, which most likely means a 192-bit bus).

Videocardz says they have received an image of the block diagram of Intel's DG2, pointing towards a 189 mm² die area for the 384 EU version. Looking at component density, it seems that this particular diagram may refer to an MXM design, commonly employed as a discrete notebook solution. A total of six GDDR6 chips are seen in the diagram, thus pointing towards a memory capacity of 6 GB and the aforementioned 192-bit bus. Other specs that have turned up in the meantime point towards a USB-C interface being available on the DG2, which could indicate either a Thunderbolt 4-supporting design or something like VirtualLink.
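
For reference, the bus-width math above follows directly from how GDDR6 is wired: each chip exposes a 32-bit interface, so total bus width and capacity scale with chip count. Below is a minimal sketch of that arithmetic, assuming standard 8 Gbit (1 GB) chips as implied by the leaked configurations:

```python
# Sanity-check the leaked DG2 memory configurations.
# Assumption: standard GDDR6 chips with a 32-bit interface and 8 Gbit (1 GB) density.
CHIP_BUS_WIDTH_BITS = 32  # per-chip interface width for GDDR6
CHIP_CAPACITY_GB = 1      # assumed 8 Gbit (1 GB) per chip

def memory_config(chip_count: int) -> tuple[int, int]:
    """Return (total capacity in GB, total bus width in bits) for a given chip count."""
    return chip_count * CHIP_CAPACITY_GB, chip_count * CHIP_BUS_WIDTH_BITS

for chips in (6, 8):
    capacity_gb, bus_bits = memory_config(chips)
    print(f"{chips} chips -> {capacity_gb} GB on a {bus_bits}-bit bus")
# 6 chips -> 6 GB on a 192-bit bus  (the 384 EU part in the diagram)
# 8 chips -> 8 GB on a 256-bit bus  (the 512 EU part in the product listing)
```
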
Sources: Uniko's Hardware @ Twitter, Komachi @ Twitter, via Videocardz

31 Comments on Intel DG2 Discrete Xe Graphics Block Diagram Surfaces

#1
fynxer
I for one am very excited that Intel is entering the discrete GPU market with RTG at the helm.

Maybe the first generation will have some minor drawbacks compared to Nvidia, BUT I believe they will be fixed by the second generation.

Most important is that there will be hard competition in the low and mid-range GPU segments right off the bat, and in the high-end segment down the road.

I believe Intel will price aggressively for at least 2 or 3 generations to get a good market share and secure their future GPU business.

ALSO, Thunderbolt 4 built in could be a game changer, since this would make it really simple to make cheap GFX upgrades for all laptops with Thunderbolt 4.

The external GFX upgrade market for laptops is pretty much untapped; current solutions are way too expensive. Intel has a BIG chance to dominate this market before the competition has time to react.

RTG = Raja Technology Group
Posted on Reply
#2
Verpal
I would be pleasantly surprised if Intel can offer an Xe GPU with RTX 3070-level performance; the 256-bit bus is a bit worrying, however.
Posted on Reply
#3
R0H1T
VerpalI would be pleasantly surprised if Intel can offer an Xe GPU with RTX 3070-level performance; the 256-bit bus is a bit worrying, however.
Say what :wtf:

AMD's having a hard time matching them with an inferior node & you're saying the potato GPU maker will get to basically 2080Ti levels right from the get go o_O
Posted on Reply
#4
laszlo
R0H1TSay what :wtf:

AMD's having a hard time matching them with an inferior node & you're saying the potato GPU maker will get to basically 2080Ti levels right from the get go o_O
oh man you destroyed his joke :laugh:
Posted on Reply
#5
AnarchoPrimitiv
R0H1TSay what :wtf:

AMD's having a hard time matching them with an inferior node & you're saying the potato GPU maker will get to basically 2080Ti levels right from the get go o_O
Do we know for a fact that AMD CAN'T make a 2080 Ti competitor (and leaks already point to RDNA2 having a 3080 competitor, so that's easily surpassing the 2080 Ti), or that they WON'T? All market data shows that 95%+ of all dGPU sales are sub-$500 (with the majority of that being $350 and below). For a company with much, much lower resources than Nvidia, you have to pick your battles and fight where the largest volume of sales is, so in terms of smart business moves, focusing on the segment which commands the bulk of the TAM is wise.

I think people seem to forget that AMD is many times smaller than Nvidia, and yet everyone expects them to compete on every front. Maybe I'm an optimist, but I think the fact that AMD is so much smaller, with a much smaller R&D budget, yet still able to compete with Nvidia in the most important and profitable segments AND straight up embarrass Intel is seriously impressive.... We're talking about a single, small company (compared to Intel and Nvidia) beating one and hanging with the other... I think that's impressive, and I am certainly thankful for it; otherwise I'd probably have had to pay $1000 for 8 cores from Intel instead of $290 for my 2700X almost two years ago.

While it's nice to see another competitor entering the dGPU market, it would have been much better for consumers if it was a wholly new entity and not Intel.... We need NEW competitors in old markets, not OLD competitors in new markets.
Posted on Reply
#6
P4-630
R0H1TSay what :wtf:

AMD's having a hard time matching them with an inferior node & you're saying the potato GPU maker will get to basically 2080Ti levels right from the get go o_O
Posted on Reply
#7
Pinktulips7
R0H1TSay what :wtf:

AMD's having a hard time matching them with an inferior node & you're saying the potato GPU maker will get to basically 2080Ti levels right from the get go o_O
Boy :banghead: you have no idea???? AMD is about to get crushed by Intel's GPUs!!!!!!!!!!!!!! Intel will bring 3080-level performance by 2021; you must be new and lack knowledge..........:eek:
Posted on Reply
#8
Kohl Baas
AnarchoPrimitiv[...]
The other thing people forget is that Intel is at least 10 times bigger than AMD and nVidia combined. There should be no illusions about Intel's demise. They just got hammered by a multitude of bad things: the comfort of lacking competition for at least half a decade, some bad decisions, problems with the 10 nm node, vulnerability issues, the fact that the competition came back from oblivion with Ryzen... And all of that combined is just huge! Yet there they are, pushing R&D into GPUs like never before. Larrabee was a dead end; the basic concept of many x86 cores used for graphics/HPC was doomed from the get-go. This concept now is different, much more like the competition's, and the people hired to work on it are also from the competition. I think the question is not whether they will succeed or not. The question is when.
Posted on Reply
#10
Steevo
Pinktulips7They will by 2021
Sure, and when in 2021 they reach 2080 Ti levels of performance, Nvidia and AMD will have 4080 Ti and 7500 XT levels of performance.

Raja is smoke and mirrors, which is why he got the boot from AMD.
Posted on Reply
#11
ZoneDymo
fynxerI for one am very excited that Intel is entering the discrete GPU market with RTG at the helm.
Maybe the first generation will have some minor drawbacks compared to Nvidia, BUT I believe they will be fixed by the second generation.
Most important is that there will be hard competition in the low and mid-range GPU segments right off the bat, and in the high-end segment down the road.
I believe Intel will price aggressively for at least 2 or 3 generations to get a good market share and secure their future GPU business.
ALSO, Thunderbolt 4 built in could be a game changer, since this would make it really simple to make cheap GFX upgrades for all laptops with Thunderbolt 4.
The external GFX upgrade market for laptops is pretty much untapped; current solutions are way too expensive. Intel has a BIG chance to dominate this market before the competition has time to react.
RTG = Raja Technology Group
I doubt they will price aggressively; Intel is like Apple, their products are meant to be the "luxury" item - that is how they position themselves.
To suddenly go (consumer-friendly) cheap with their first jump into GPUs… I doubt it.
Posted on Reply
#12
kapone32
Kohl BaasThe other thing people forget is that Intel is at least 10 times bigger than AMD and nVidia combined. There should be no illusions about Intel's demise. They just got hammered by a multitude of bad things: the comfort of lacking competition for at least half a decade, some bad decisions, problems with the 10 nm node, vulnerability issues, the fact that the competition came back from oblivion with Ryzen... And all of that combined is just huge! Yet there they are, pushing R&D into GPUs like never before. Larrabee was a dead end; the basic concept of many x86 cores used for graphics/HPC was doomed from the get-go. This concept now is different, much more like the competition's, and the people hired to work on it are also from the competition. I think the question is not whether they will succeed or not. The question is when.
They also cannot stop making money, and they have some serious resources and manufacturing capability compared to the competition.
Posted on Reply
#13
Pinktulips7
This year Nvidia, then AMD, then Intel - but next year, just reverse it................
Posted on Reply
#14
Zareek
I pray Intel can make something at least competitive in the mid-range. Competitive in price, performance and stability... The market segment really needs competition.
Posted on Reply
#15
Jism
VerpalI would be pleasantly surprised if Intel can offer an Xe GPU with RTX 3070-level performance; the 256-bit bus is a bit worrying, however.
Worrying? If they manage to offer RX 580-level (i.e. solid 1080p) performance, that's already good.
Posted on Reply
#16
HugsNotDrugs
ZoneDymoI doubt they will price aggressively; Intel is like Apple, their products are meant to be the "luxury" item - that is how they position themselves.
To suddenly go (consumer-friendly) cheap with their first jump into GPUs… I doubt it.
To ensure Intel is relevant to developers producing AAA titles it needs meaningful market share in the dGPU space. To gain market share from nothing they need to position their products aggressively.

I suspect Intel's big push will be pairing Intel laptops with an Intel dGPU and offering cost incentives to OEMs to do so. Second to that I suspect those dGPUs will land in the desktop add-in market at a price competitive in the entry-level to medium performance segment.

Intel is no stranger to graphics and I suspect their products will be a great option.
Posted on Reply
#17
bug
AnarchoPrimitivDo we know for a fact that AMD CAN'T make a 2080 Ti competitor (and leaks already point to RDNA2 having a 3080 competitor, so that's easily surpassing the 2080 Ti), or that they WON'T? All market data shows that 95%+ of all dGPU sales are sub-$500 (with the majority of that being $350 and below). For a company with much, much lower resources than Nvidia, you have to pick your battles and fight where the largest volume of sales is, so in terms of smart business moves, focusing on the segment which commands the bulk of the TAM is wise.

I think people seem to forget that AMD is many times smaller than Nvidia, and yet everyone expects them to compete on every front. Maybe I'm an optimist, but I think the fact that AMD is so much smaller, with a much smaller R&D budget, yet still able to compete with Nvidia in the most important and profitable segments AND straight up embarrass Intel is seriously impressive.... We're talking about a single, small company (compared to Intel and Nvidia) beating one and hanging with the other... I think that's impressive, and I am certainly thankful for it; otherwise I'd probably have had to pay $1000 for 8 cores from Intel instead of $290 for my 2700X almost two years ago.

While it's nice to see another competitor entering the dGPU market, it would have been much better for consumers if it was a wholly new entity and not Intel.... We need NEW competitors in old markets, not OLD competitors in new markets.
This is definitely not about picking your battles. High-end chips don't sell in volume; you build them from higher-binned dies, so it's not as if skipping them frees up a lot to divert elsewhere.
The sad truth is that the higher-binned chips from AMD just haven't measured up for a few years now. Navi was a pleasant surprise; hopefully Navi 2 can build on top of that.
Posted on Reply
#18
mastrdrver
Anyone old enough to remember that Intel's last foray into graphics was a joke?

I expect nothing different this time (if it actually materializes, of which I'm very doubtful).
Posted on Reply
#19
chstamos
JismWorrying? If they manage to offer RX 580-level (i.e. solid 1080p) performance, that's already good.
Every leak so far points to even RX 580-level performance being a tall order for Xe; meanwhile, it's already been pushed back at least 7-8 months (it was initially supposed to come out this summer) while Nvidia and AMD keep moving forward. Intel needs to get its act together, because right now they are not looking good.

And of course they're a huge company; they're not going to go bankrupt and shut down or anything. Whoever suggested that? IBM is a huge, profitable company too, but it has been completely irrelevant to the home market for several decades now. They used to make home PCs, you know. If not only AMD but Nvidia too held an x86 license, I honestly think even Intel going bankrupt might have been a possibility (however slight), but that's another parallel universe.
Posted on Reply
#20
bug
mastrdrverAnyone old enough to remember that Intel's last foray into graphics was a joke?

I expect nothing different this time (if it actually materializes, of which I'm very doubtful).
Well, since I'm not old enough for memory loss to kick in yet, I still remember the i740 being a rather capable and cheap card, despite not being the fastest.
Posted on Reply
#22
user112
The key to victory for Intel on both the CPU and GPU side is having good performance per watt, making it better to buy a new Intel GPU than an old Nvidia or AMD GPU. After that, it's just a matter of making cards at a good price that can consistently hit target settings like 4K 120 Hz + HDR, rather than actually beating AMD or Nvidia in raw performance.
Posted on Reply
#23
Apocalypsee
bugWell, since I'm not old enough for memory loss to kick in yet, I still remember the i740 being a rather capable and cheap card, despite not being the fastest.
I used to play on Intel i810 graphics (the integrated version for Socket 370) with 4 MB of dedicated VRAM. The only thing I remember about it is how slow it was, and the driver was simply awful. RtCW ran with white textures everywhere, but if I loaded one map and followed one specific path, the textures suddenly appeared for the rest of the game, so each time I had to load that map first before playing.
Posted on Reply
#24
bug
ApocalypseeI used to play on Intel i810 graphics (the integrated version for Socket 370) with 4 MB of dedicated VRAM. The only thing I remember about it is how slow it was, and the driver was simply awful. RtCW ran with white textures everywhere, but if I loaded one map and followed one specific path, the textures suddenly appeared for the rest of the game, so each time I had to load that map first before playing.
Well, yes, their IGPs weren't so great (mostly because those were the early days of IGPs). But that's not what we're talking about here, is it?
Posted on Reply
#25
Apocalypsee
bugWell, yes, their IGPs weren't so great (mostly because those were the early days of IGPs). But that's not what we're talking about here, is it?
I'm talking about the drivers. They need to step up their driver department; their IGP drivers are not up to par compared to Nvidia/AMD. Good hardware also needs good software to complement it.
Posted on Reply