Monday, August 2nd 2021

Intel DG2 Discrete Graphics Cards Expected To Launch at CES 2022

Intel has reportedly decided to launch its DG2 Xe-HPG discrete graphics cards at CES 2022, according to Hardware Academy on Weibo. Intel has already begun sampling these new GPUs to partners, so this launch date appears feasible. The DG2 Xe-HPG discrete graphics cards are designed specifically for gaming and will offer up to 512 Execution Units paired with 16 GB of GDDR6 memory. Intel appears to be planning five separate DG2 Xe-HPG SKUs covering a wide range of market segments, with the top product competing with the NVIDIA GeForce RTX 3080 and AMD Radeon RX 6800. While these products won't be released until CES 2022, Intel may announce gaming processors with integrated Xe-HPG graphics sometime earlier.
Source: Hardware Academy (Weibo)

24 Comments on Intel DG2 Discrete Graphics Cards Expected To Launch at CES 2022

#1
ZoneDymo
Honestly, I would suddenly respect and even love Intel if they launched these with competitive performance but at... ya know... NORMAL prices, making them the automatic choice.
And it would not be a bad move either, because sure, they won't be making as much money as they PERHAPS could, but it would be a good reason for people to give their cards a go where otherwise you'd stick with what you know works. So good marketing, in a way.
#2
Vya Domus
Man, if these don't undercut the competition by a massive amount, it's not going to be good at all. They're just too late.
#3
Prima.Vera
Hopefully they are cheaper and faster than the competition. But by the time those are not just paper-launched but actually available in retail stores, nGreedia and AMD will already have their next gen ready to launch, so these will be competing with the mid-range cards from them.
#4
ixi
Intel has been trying to launch these cards for how long now, two years? By 2022 it will be almost three.

Hoping for good prices, as I doubt they will be competitive with AMD and Nvidia.
#5
zo0lykas
ixi: Intel has been trying to launch these cards for how long now, two years? By 2022 it will be almost three.

Hoping for good prices, as I doubt they will be competitive with AMD and Nvidia.
You just read my mind :)

I wanted to write the same post. At least maybe one day we'll actually see Intel GPUs ;)
#6
Vayra86
Vya Domus: Man, if these don't undercut the competition by a massive amount, it's not going to be good at all. They're just too late.
Yep... as predicted... but when I said 'even if they match current offerings, it's not enough' I was told I was wrong by about three quarters of the readers.

'I don't care if it takes 300W'
'It can be priced 700 bucks just fine, look at what Nv is doing'
'It doesn't have to perform exactly as good as (insert any)'

Yeah, yeah... guys. They need a product on par with the current competition, or what will happen is what AMD has experienced for the last ten years. If you're not playing in the top half of the market, you're not playing, and you'd better compete on all metrics: heat, power, noise, performance, support. And if you want to start leading, you also need to pile software and hardware design wins on top of that.

So far Raja ticks the following boxes:
- Noise/Power/Heat: depends on perf per shader unit, which so far isn't looking better than the competition's
- Die size: relates to perf per shader unit; how much FPS per square mm. The dies we've seen are effin huge, and they're not that wide because Intel wants to run lower clocks or TDPs. They are because they need to be, or there's nothing to write home about.
- Price: huge dies translate to high price points and little wiggle room for price cuts.
- Overall performance: I suppose it can run Crysis, but who knows.
- Time to market: 2 little 2 late
- Software side: yeah. Intel has a great IGP driver after twenty years. It shows an image.

But let's keep dreaming; maybe one day we'll wake up with the sun shining and Raja six feet under. It's when he's gone that design wins happen; look at AMD right now... And a smaller node is not going to save anyone; look at the past. A node advantage is only a very short-lived one, and it still can't mask a shitty architecture.

What I'm missing in the whole Xe story is a strong departure from what Intel has already done. They're still just tying EUs together.
#7
Crackong
No one believes in Intel hype news anymore.
#8
Vya Domus
Vayra86: Yep... as predicted... but when I said 'even if they match current offerings, it's not enough' I was told I was wrong by about three quarters of the readers.
The thing is, you can be late if you're cheap, or if you're faster. But Intel is going to be slower, late, and probably not cheap.
#9
Vayra86
Vya Domus: The thing is, you can be late if you're cheap, or if you're faster. But Intel is going to be slower, late, and probably not cheap.
And definitely not as well endowed on the software / added value / driver side. Even if I personally don't care a whole lot for two dozen extra features, for a lot of people those are probably a major item. They'll probably do the whole streaming thing just fine. But the rest?
#10
R0H1T
Crackong: No one believes in Intel hype news anymore.
Except Intel?
#11
ixi
Crackong: No one believes in Intel hype news anymore.
This is not hype. This is shitnel shite.
#12
Vayra86
R0H1T: Except Intel?
No, they don't believe it either :D

It's a bit like a rain dance. It's not really a belief, but if you try often enough, maybe one day it'll start raining.
#13
TheLostSwede
Vayra86: Yep... as predicted... but when I said 'even if they match current offerings, it's not enough' I was told I was wrong by about three quarters of the readers.

'I don't care if it takes 300W'
'It can be priced 700 bucks just fine, look at what Nv is doing'
'It doesn't have to perform exactly as good as (insert any)'

Yeah, yeah... guys. They need a product on par with the current competition, or what will happen is what AMD has experienced for the last ten years. If you're not playing in the top half of the market, you're not playing, and you'd better compete on all metrics: heat, power, noise, performance, support. And if you want to start leading, you also need to pile software and hardware design wins on top of that.

So far Raja ticks the following boxes:
- Noise/Power/Heat: depends on perf per shader unit, which so far isn't looking better than the competition's
- Die size: relates to perf per shader unit; how much FPS per square mm. The dies we've seen are effin huge, and they're not that wide because Intel wants to run lower clocks or TDPs. They are because they need to be, or there's nothing to write home about.
- Price: huge dies translate to high price points and little wiggle room for price cuts.
- Overall performance: I suppose it can run Crysis, but who knows.
- Time to market: 2 little 2 late
- Software side: yeah. Intel has a great IGP driver after twenty years. It shows an image.

But let's keep dreaming; maybe one day we'll wake up with the sun shining and Raja six feet under. It's when he's gone that design wins happen; look at AMD right now... And a smaller node is not going to save anyone; look at the past. A node advantage is only a very short-lived one, and it still can't mask a shitty architecture.

What I'm missing in the whole Xe story is a strong departure from what Intel has already done. They're still just tying EUs together.
Meh, Intel will just re-badge it as something corporate computers or servers need and sell it regardless :p
#14
CheapMeat
What will the Intel GPUs do uniquely or differently? Or is it just more of the same?
#15
rainxh11
After the RX 6600 XT pricing, if Intel would just be smart, undercut AMD & Nvidia, and price their cards at A NORMAL PRICE, I'll buy their first GPU without a second thought.
#16
mahoney
rainxh11: After the RX 6600 XT pricing, if Intel would just be smart, undercut AMD & Nvidia, and price their cards at A NORMAL PRICE, I'll buy their first GPU without a second thought.
It's got nothing to do with a normal price: they can set their price at $400 and they'll still sell for $700-800. GPU prices won't drop until the world normalizes, and that will take some time, sadly.
#17
RedelZaVedno
I don't get all the pessimism regarding DG2. All Intel has to do is match the 6600 XT/3060, have stable, well-supported drivers at release and a 250 W TDP, and most importantly cost sub-$250, and these will sell like hot cakes.
AMD and Nvidia's greed has made it extremely easy for Intel to enter the dGPU market. If Intel can't seize the moment, then we're F...ed, as DIY gaming PC building will become a playground only the rich and nerds can afford. The mainstream would migrate to consoles and cloud gaming. That would eventually cause AAA game publishers to shy away from porting console games to PC altogether, and that would be the end of PC gaming.
#18
ixi
mahoney: It's got nothing to do with a normal price: they can set their price at $400 and they'll still sell for $700-800. GPU prices won't drop until the world normalizes, and that will take some time, sadly.
It will not normalize. Did it normalize after Nvidia introduced the 10xx series with their shitty shite Founders garbage hole? No.
#20
defaultluser
mahoney: Some of you have short memories
Prices normalized in 2014, after 200-series Radeon GPUs were selling at a 150% markup, and in 2018, when the RX 570/580 were going for 200-300% markups.
When all this corona shit is well behind us and the world gets going with enough stock, prices will come down too.
Or even if that doesn't happen anytime soon, I still expect a decent GPU selloff whenever Ethereum transitions to Proof of Stake.

There is currently no more profitable currency to mine than Ethereum (and it will take at least a year to sort out the next big boy), so I anticipate some smaller miners liquidating their hardware and putting the proceeds into ETH.
#21
HisDivineOrder
I wouldn't think Intel would blow the whole market up the first go-round, but I think this first launch will be about establishing that they're in it, that they're serious, and that they'll hammer out the drivers. I wonder if there just aren't a lot of people left paying attention to the market who were here back in the day when Nvidia and ATI were struggling against 3dfx. Or hell, who even remember S3 and 3DLabs. This is how it goes when you break into a new market.

You don't usually come in and WIN right away. You have to make a name for yourself, establish your presence, and then refine until you're up there with the others. That's what I expect from this line of cards. Not the best performance, not the best drivers. Probably not even incredible pricing. Just good pricing. This is Intel, after all. Unfortunately, if it were not for COVID, tariffs, and OEMs seeing their opportunity to finally make bank on cards, maybe we'd have seen insane pricing on them. As it is, it'll probably be less than the OEM markups, bringing it back down to around (give or take) the MSRPs of Nvidia and AMD.

I expect Intel to pop out a few generations faster than Nvidia and AMD are used to anymore. Those two aren't as hungry anymore. They aren't used to having to fight. They're used to being Coke and Pepsi, holding secret meetings to decide what price point each performance level will have. I sincerely hope Intel bursts into their meeting room like the Kool-Aid Man.
#22
eidairaman1
The Exiled Airman
ixi: Intel has been trying to launch these cards for how long now, two years? By 2022 it will be almost three.

Hoping for good prices, as I doubt they will be competitive with AMD and Nvidia.
DG2 will be a budget line by the time they launch.

I'd like to see PowerVR come back in.
#23
Timelessest
I need DG2 to be good so that I can buy a 7800 XT/RTX 4080 cheaper!
#24
Vayra86
ixi: It will not normalize. Did it normalize after Nvidia introduced the 10xx series with their shitty shite Founders garbage hole? No.
Sure it did: Vega, Polaris, the 1070 Ti, the 1070, and the 1060 3GB were good perf/$. And even the top-end 1080 Ti was highly competitive, even today, at its price of 650-700. It was absolutely no different from its predecessor, the equally great 980 Ti.

The moment it turned tits up was when Turing was left without an answer, even though AMD could have formulated one. The lack of an answer to a new gen that moved NOTHING to lower price points or higher perf/$ is when that became normalized in people's heads. Mining isn't even adding onto that; you can safely blame stagnation for it.

And go figure: Pascal to Turing was also a lengthy gap... AMD took its sweet time to finally arrive at RDNA2. But now the market is FUBAR because of scarce production & mining.