
Intel Xe Graphics to Feature MCM-like Configurations, up to 512 EU on 500 W TDP

fixed.

Nvidia is the goal post here, not AMD. The fight for 2nd place is moot because the real winner will be consumers.
I don't think they can really dethrone Nvidia tbh, at least in the market we/I am interested in.

At worst Intel produces a mediocre product but AMD falls behind because Intel does driver support very well. At best Intel GPUs handily beat AMD and slowly Radeon becomes irrelevant.
 
Then it will be a 300-watt card against a hypothetical GTX 2660 Super on 7nm+ that would be very close to 100 watts, with 2048 CUDA cores and a 128-bit bus.
There won't be GTX _660 SKUs this time; RTX will be coming to the 3050, I expect.
 
At worst Intel produces a mediocre product but AMD falls behind because Intel does driver support very well.
Or there's a performance penalty with every driver due to new architectural vulnerabilities.

But on a serious note, Nvidia has been laughing at how easy they've had it in the last couple of years.

Even if Intel is slower, merely producing new cards would invigorate the market.

AMD has no cards above the TU106-equivalent RX 5700 XT. That's how sad it is.
 
Wow, Intel are in the shiz, a 500-watt 2070? My old Vega 64 would piss on it, and yet it still gets called shit by you lot for its massive power draw (total BS in the hands of a tuner, btw).

This is going to need at least two respins before consumer time. See you in 2021, Intel GPU. Shame.
 
CUDA and OpenCL are severely limited for decision-based algorithms, are not great at AI, and have appalling latency for certain tasks. They are good at processing vast quantities of data with rudimentary transformations.

It's not the software, it's the hardware that is limited in performance or capability for things such as branching and synchronization across threads. If you make it do those things well, it becomes worse at everything else. Trying to address a broader compute spectrum is a bad idea; we already have high-core-count CPUs meant for that.
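To put the branching point in concrete terms, here is a minimal CUDA sketch (the kernels and the even/odd split are my own illustration, not anything from the article): threads within the same 32-wide warp that take different branches execute both paths one after the other, which is why decision-heavy code maps poorly onto GPUs, while a warp-aligned split avoids the serialization.

```cuda
#include <cstdio>

// Divergent version: even/odd split inside every 32-thread warp, so each
// warp ends up executing both branches serially.
__global__ void scale_divergent(float* out, const float* in, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    if (i % 2 == 0)
        out[i] = in[i] * 2.0f;
    else
        out[i] = in[i] + 1.0f;
}

// Warp-uniform version: the same split, but aligned to whole warps, so all
// 32 threads of a warp take the same path and nothing is serialized.
__global__ void scale_uniform(float* out, const float* in, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    if ((i / 32) % 2 == 0)
        out[i] = in[i] * 2.0f;
    else
        out[i] = in[i] + 1.0f;
}

int main()
{
    const int n = 1 << 20;
    float *in = nullptr, *out = nullptr;
    cudaMallocManaged(&in, n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; ++i) in[i] = 1.0f;

    scale_divergent<<<(n + 255) / 256, 256>>>(out, in, n);
    cudaDeviceSynchronize();
    scale_uniform<<<(n + 255) / 256, 256>>>(out, in, n);
    cudaDeviceSynchronize();

    printf("out[0] = %.1f, out[1] = %.1f\n", out[0], out[1]);
    cudaFree(in);
    cudaFree(out);
    return 0;
}
```

Both kernels produce the same results; the divergent one simply wastes execution slots, which is the trade-off being described: hardware that handled this gracefully would give up throughput elsewhere.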
 
AMD has no cards above the TU106-equivalent RX 5700 XT. That's how sad it is.
The biggest slam dunk in my memory is GK104 being so powerful it became the GTX 680, when its predecessor GF104/114 was a mere GTX 460/560 Ti. The **104 has been Nvidia's high-end chip since. It's kind of like watching Mercedes dominate F1 since 2014.
Wow, Intel are in the shiz, a 500-watt 2070? My old Vega 64 would piss on it, and yet it still gets called shit by you lot for its massive power draw (total BS in the hands of a tuner, btw).

This is going to need at least two respins before consumer time. See you in 2021, Intel GPU. Shame.
At least Intel has the money to burn and improve even if the rumor is true. And a few months ago there were rumors that both Nvidia and AMD are working on MCM GPUs.

The last high-end AMD card that actually delivered was the R9 290X. Let that sink in. The R9 390X was a bloody rebadge, and later the R9 Fury, Vega, and Radeon VII were late and duds.
 
The biggest slam dunk in my memory is GK104 being so powerful it became the GTX 680, when its predecessor GF104/114 was a mere GTX 460/560 Ti. The **104 has been Nvidia's high-end chip since. It's kind of like watching Mercedes dominate F1 since 2014.

At least Intel has the money to burn and improve even if the rumor is true. And a few months ago there were rumors that both Nvidia and AMD are working on MCM GPUs.

The last high-end AMD card that actually delivered was the R9 290X. Let that sink in. The R9 390X was a bloody rebadge, and later the R9 Fury, Vega, and Radeon VII were late and duds.
You can talk whatever shite you want, mate, but do try to keep on topic; we're talking Intel and a 500-watt GPU here.

How about you comment on that? AMD and Nvidia mean naught to my point, so stick your point back up your nose and stop trying to start flame wars.
 
I for one hope that Intel does have a competitive product, if for nothing else than to drive down prices.
 
Wow, Intel are in the shiz, a 500-watt 2070? My old Vega 64 would piss on it, and yet it still gets called shit by you lot for its massive power draw (total BS in the hands of a tuner, btw).

This is going to need at least two respins before consumer time. See you in 2021, Intel GPU. Shame.

Have you read the news piece? The 400-500 W figure is for a data center solution; the consumer cards will top out at 300 W for the high-end one.

Where do you come up with the 500W card for Vega 64 performance?
 
Those are looking like some pretty grandiose plans for a company that hasn't launched a GPU in 22 years.

I'm hoping for competition as much as the next guy, but let's see if they can get the baby steps right and make a viable dGPU that people might want to buy first.

After all, if it's not a success, Intel will just can it and all of these roadmap ideas will be archived like Larrabee was.
They've released plenty of GPUs. They didn't release dGPUs, but that doesn't mean they've been out of the loop for 20+ years.
They forgot the fine print: 75% of power consumption is the interconnect. Efficiency is trash, we won't really produce this, but we have to market something.
Infinity Fabric accounts for a lot of the power draw in Zen designs, too. That doesn't mean AMD can't produce Zen.
 
Have you read the news piece? The 400-500 W figure is for a data center solution; the consumer cards will top out at 300 W for the high-end one.

Where do you come up with the 500W card for Vega 64 performance?

To me it looks suspiciously like the guy is trying to justify a disappointing purchase motivated by hype. I'm of course speculating, but I've personally met a few people just like that, not only in the PC hardware space.

As for the Intel GPUs, I fail to see any point in speculating based on an unconfirmed snippet of information. On the one hand, Intel can throw money at the problem, while on the other, it's easy to ruin a perfectly good product with a small oversight.
 
To me it looks suspiciously like the guy is trying to justify a disappointing purchase motivated by hype. I'm of course speculating, but I've personally met a few people just like that, not only in the PC hardware space.

As for the Intel GPUs, I fail to see any point in speculating based on an unconfirmed snippet of information. On the one hand, Intel can throw money at the problem, while on the other, it's easy to ruin a perfectly good product with a small oversight.
@kings 400-500 watts is for the data center only, and these are all rumors; other rumors say 1-4 tiles will make up DG2, the consumer high end. But back to 400-500 watts: is that going to compare well to Arcturus or Nvidia's Hopper? I doubt it. Rumors also indicate that the original use case for Intel's GPU (streaming) has already left the waiting room, and the GPGPU facilities of this chip cost efficiency and die space. We will see, but the rumors are not sounding good to me.

@TheUn4seen I owned an FX-8350 and I owned Vega; neither used the wattage in practice that people perceived, but they got slammed (rightly so on efficiency). The Vega 64 has done millions of Folding@home work units all day, every day since purchase at 110-150 watts, and does 4K gaming on every game I have played at 180-250 watts. If it works, it works. Had I bought and stuck in a 1080 Ti, I'd get 10-15% better FPS, but it would have cost me 10-15% more. I'm fine with my overhyped (true, but mostly by others than AMD) GPU.

So then, why is Intel different? Why shouldn't they get stick for guzzling wattage? Answer that one question.


I am certainly not hyping this, and I certainly don't want Intel to fail, but my disappointment in what's been shown (DG1 and this) is what it is. As I said, I now think Intel is a non-starter in the GPU space until 2021 at the earliest (I think I'm being optimistic here, btw), unfortunately.

AND other than the Foveros bit of this tech, it sounds less interesting as a commercial purchase than any of its competition, BY FAR.


AND EVERYONE keeps saying Intel has the money. True, they certainly do; let's see what the shareholders have to say in 2021, shall we? They love throwing their dividend at the wall hoping it sticks? Hence the massive reorganization lately; it was all going to plan, eh.

IT takes years to bring change into a chip architecture, not a year, not two; years, MANY years, like 3-5. If they f up, like Larrabee, they can't change tack that easily and, despite all, will have to crack on. So by now, positive news regarding their GPUs is what we need to be seeing, not this.
 
Wow 500W that is way too much.
 
MCM and two chips. I'm interested in how this chip will work. Will it be recognized as one chip or two separate ones, like SLI? I wonder how the link would work.
AMD is trying (maybe was) to get 2 chips connected via IF. Is Intel trying to beat AMD to the punch?
 
Will it be recognized as one chip or two separate ones, like SLI? I wonder how the link would work.
Multiple chips.

Having multiple chips presented as one GPU has benefits for gaming and other real-time rendering. For GPGPU/HPC/AI stuff it generally does not matter, and presenting them as they physically are is probably more beneficial for control and efficiency.

If Intel has figured out how to efficiently combine multiple chips into one GPU, they have a pretty good jump on both AMD and Nvidia, who have been trying to get there for over a decade.
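On the point about presenting the dies as they physically are: a minimal CUDA sketch of how compute code already copes with that today (kernel name and sizes are invented for illustration). Work is partitioned across devices explicitly, so several exposed dies are routine for GPGPU, whereas real-time rendering really wants a single logical GPU.

```cuda
#include <cstdio>

// Trivial kernel: each device processes its own slice of the problem.
__global__ void scale(float* data, int n, float factor)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main()
{
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);

    const int chunk = 1 << 20;  // per-device slice
    for (int dev = 0; dev < deviceCount; ++dev) {
        cudaSetDevice(dev);                 // target one physical GPU/die
        float* d = nullptr;
        cudaMalloc(&d, chunk * sizeof(float));
        cudaMemset(d, 0, chunk * sizeof(float));
        scale<<<(chunk + 255) / 256, 256>>>(d, chunk, 2.0f);
        cudaDeviceSynchronize();            // wait for this device's slice
        cudaFree(d);
    }
    printf("dispatched work to %d device(s)\n", deviceCount);
    return 0;
}
```

For a game, the driver would instead have to hide all of this behind one device and keep the tiles coherent every frame, which is the hard part nobody has shipped yet.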
 
If Intel has figured out how to efficiently combine multiple chips into one GPU, they have a pretty good jump on both AMD and Nvidia, who have been trying to get there for over a decade.
This is what I care about. I know it hasn't been done so far, and I'm curious. It's not a matter of gaming or workstation stuff, but I want to know if they have managed to do it and if it works OK.
 
This is what I care about. I know it hasn't been done so far, and I'm curious. It's not a matter of gaming or workstation stuff, but I want to know if they have managed to do it and if it works OK.
I am willing to bet they have not.

AMD and Nvidia have put more time and effort into this than anyone, and they do not have a viable solution for combining multiple dies into a single GPU. Even if the solution were something exotic and expensive, considering what workstation cards go for, either one of them would have deployed it.
 
I am willing to bet they have not.

AMD and Nvidia have put more time and effort into this than anyone, and they do not have a viable solution for combining multiple dies into a single GPU. Even if the solution were something exotic and expensive, considering what workstation cards go for, either one of them would have deployed it.
Probably. I was curious how they've managed to get the chips connected and how it looks. Maybe it's too early yet.
 
Probably. I was curious how they've managed to get the chips connected and how it looks. Maybe it's too early yet.
Directly via Foveros, EMIB interconnects, chip-edge-type connections which likely incorporate through-silicon vias at the edge of the silicon. Any tile-based rendering would be per tile, and there does not seem to be a separate managing (for want of a better word) chip, so the control (to regulate frame presentation) must be built into the chips or the driver.
 
Directly via Foveros, EMIB interconnects, chip-edge-type connections which likely incorporate through-silicon vias at the edge of the silicon. Any tile-based rendering would be per tile, and there does not seem to be a separate managing (for want of a better word) chip, so the control (to regulate frame presentation) must be built into the chips or the driver.
So, the way it used to be. Well, in that case, nothing special.
 

Quoted from WCCFTech:
"
Here are the actual EU counts of Intel's various MCM-based Xe HP GPUs along with estimated core counts and TFLOPs:
  • Intel Xe HP (12.5) 1-Tile GPU: 512 EU [Est: 4096 Cores, 12.2 TFLOPs assuming 1.5GHz, 150W]
  • Intel Xe HP (12.5) 2-Tile GPU: 1024 EUs [Est: 8192 Cores, 20.48 TFLOPs assuming 1.25 GHz, 300W]
  • Intel Xe HP (12.5) 4-Tile GPU: 2048 EUs [Est: 16,384 Cores, 36 TFLOPs assuming 1.1 GHz, 400W/500W]"
I take back my mocking if this bit is true; they may have something in 2021 that might compete. It still needs some die shrinking and power optimization, but this sounds much more promising as a development step.
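For what it's worth, those estimates appear to follow the usual Gen EU arithmetic (my assumption, not something WCCFTech spells out): 8 FP32 ALUs per EU and 2 FLOPs per clock with FMA, so TFLOPs ≈ EUs × 8 × 2 × clock. That gives 512 × 8 × 2 × 1.5 GHz ≈ 12.3 TFLOPs, 1024 × 8 × 2 × 1.25 GHz = 20.48 TFLOPs and 2048 × 8 × 2 × 1.1 GHz ≈ 36 TFLOPs, so the quoted numbers at least add up internally.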
 
Up to 512 EUs x 4 for 500 W. 500 W is the 4-chiplet variant and each chiplet is 512 EUs.
There is no way in this world for a 96 EU iGPU to fit into a 15 W TDP and a 512 EU part to require 500 W.
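Rough back-of-envelope, assuming the rumoured figures are real: 500 W / 2048 EUs ≈ 0.24 W per EU for the 4-tile part versus 15 W / 96 EUs ≈ 0.16 W per EU for the iGPU, so the difference would come down to higher clocks, memory and the interconnect rather than some order-of-magnitude efficiency regression.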
 
Up to 512 EUs x 4 for 500 W. 500 W is the 4-chiplet variant and each chiplet is 512 EUs.
There is no way in this world for a 96 EU iGPU to fit into a 15 W TDP and a 512 EU part to require 500 W.
It's rumoured the consumer version will top out at two tiles, 1024 EUs, and a 300-watt TDP.
 

Quoted from WCCFTech:
"
Here are the actual EU counts of Intel's various MCM-based Xe HP GPUs along with estimated core counts and TFLOPs:
  • Intel Xe HP (12.5) 1-Tile GPU: 512 EU [Est: 4096 Cores, 12.2 TFLOPs assuming 1.5GHz, 150W]
  • Intel Xe HP (12.5) 2-Tile GPU: 1024 EUs [Est: 8192 Cores, 20.48 TFLOPs assuming 1.25 GHz, 300W]
  • Intel Xe HP (12.5) 4-Tile GPU: 2048 EUs [Est: 16,384 Cores, 36 TFLOPs assuming 1.1 GHz, 400W/500W]"
I take back my mocking if this bit is true; they may have something in 2021 that might compete. It still needs some die shrinking and power optimization, but this sounds much more promising as a development step.

But this is like 4 times what is stated in the article on TPU, and the source is WCCFTech.

So now we suddenly think Intel can cram this into a 500W TDP? And how would this be positioned now?! 512 EU as entry level? 1024 EU midrange and 2048 EU... datacenter?! Where is the enthusiast tier in this picture? So many questions.

Bags of salt required
 
But this is like 4 times what is stated in the article on TPU, and the source is WCCFTech.

So now we suddenly think Intel can cram this into a 500W TDP? And how would this be positioned now?! 512 EU as entry level? 1024 EU midrange and 2048 EU... datacenter?! Where is the enthusiast tier in this picture? So many questions.

Bags of salt required
No way man, those are rock-solid facts, defo. :P

The 500-watt TDP would be the 4-tile data center part; the highest-end consumer one is 2 tiles at 300 watts. I think I'll pass personally, but perhaps they might surprise us. Fingers crossed.
 