
Intel Storms into 1080p Gaming and Creator Markets with Iris Xe MAX Mobile GPUs

Storming in with a small squad of very expensive 10nm unicorn soldiers :toast:
 
HAHA, a dGPU doing 30 FPS @ 1080p cannot be called gaming.

To be able to call it gaming, it should be at least 60 FPS @ 1080p.

30 FPS @ 1080p is gaming on life support at best; you get a computer that can play older games.

BUT what happens when the next-gen games come out in a month and you get 20 FPS and it's unplayable? Then you realize you wasted your money on this Intel gaming dGPU.
 
We'll need DDR5 for better iGPU performance. I believe an APU with RDNA2 and Zen4 will be an awesome budget 1080p gaming option IF DDR5 really launches at 6400 MT/s (it'll probably be only 4800 at the beginning). I can see an RDNA2+DDR5 iGPU reaching 1080p/60 fps on high in most titles once SDRAM passes the 6000 MT/s mark.
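For a sense of why the transfer rate matters, here's a back-of-the-envelope peak-bandwidth estimate (my own sketch; it assumes a standard dual-channel, 128-bit DDR5 setup):

```python
# Rough peak-bandwidth estimate for dual-channel DDR5 (my own sketch).
# Assumes a 128-bit total bus (2 channels x 64 bits = 16 bytes/transfer).
def peak_bandwidth_gbs(mega_transfers_per_sec, bus_bytes=16):
    """Theoretical peak memory bandwidth in GB/s."""
    return mega_transfers_per_sec * 1e6 * bus_bytes / 1e9

print(peak_bandwidth_gbs(4800))  # DDR5-4800: ~76.8 GB/s
print(peak_bandwidth_gbs(6400))  # DDR5-6400: ~102.4 GB/s
```

Somewhere past 100 GB/s you're getting into entry-level discrete card territory, which is why an iGPU only starts looking like a real 1080p option once DDR5 gets that fast.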
 
hahahahahaha, SURELY THIS IS A HALLOWEEN JOKE, RIGHT?

I'm laughing so hard, I need to part my hair just to take a dump... :roll: :eek: :D :kookoo: :clap: :peace:
 
That was never the point of ROCm. It facilitates CUDA code translation.

I think most people are too focused on the gaming aspects of this chip. Yes, you can game on it, and its gaming performance is comparable to an MX350/GTX 1050, but that's not the main selling point of DG1. You don't buy a laptop with an MX350 for gaming either, do you? You mainly want it to improve content creation. Case in point: the three laptops it launches in are not meant for gaming whatsoever.
The combined power of the Iris Xe and Iris Xe Max is nothing to be scoffed at considering the power envelope: FP16 => 8 TFLOPS. That's GTX 1650 Mobile level (rough math sketched below).
I don't know if anybody here uses their computer for work, but Gigapixel AI acceleration, video encoding, ... these things really matter for content creators.
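Back-of-the-envelope math on that figure (my own sketch using Intel's published EU counts and boost clocks; sustained clocks will be lower, which likely explains the more conservative 8 TFLOPS claim):

```python
# Rough FP16 throughput for Xe-LP parts (my own sketch, not Intel's numbers).
# Each Xe-LP EU has 8 FP32 lanes; an FMA counts as 2 ops; FP16 runs at 2x rate.
def fp16_tflops(eus, clock_ghz, lanes=8, ops_per_fma=2, fp16_rate=2):
    return eus * lanes * ops_per_fma * fp16_rate * clock_ghz / 1000

igpu = fp16_tflops(96, 1.35)  # Tiger Lake Iris Xe iGPU: ~4.1 TFLOPS
dgpu = fp16_tflops(96, 1.65)  # Iris Xe MAX dGPU:        ~5.1 TFLOPS
print(round(igpu + dgpu, 1))  # combined: ~9.2 TFLOPS at boost clocks
```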

[Attached image: Intel's Xe MAX use-cases slide]


In this multi-stream video encoding test, it beat an RTX 2080 & i9-10980HK combination. Can you imagine a puny, lightweight Acer Swift 3 beating a >€2000 heavy gaming laptop at any task? I would say mission accomplished.
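If you want to try that kind of multi-stream offload yourself, something along these lines works with any ffmpeg build that includes Quick Sync support (a sketch; the input file names are made up):

```python
# Sketch: run several Quick Sync (QSV) encodes in parallel to mimic a
# multi-stream workload. Requires an ffmpeg build with QSV enabled;
# the input file names here are hypothetical.
import subprocess

sources = ["clip1.mp4", "clip2.mp4", "clip3.mp4"]

procs = [
    subprocess.Popen([
        "ffmpeg", "-y",
        "-hwaccel", "qsv",       # decode on the GPU's media engine
        "-i", src,
        "-c:v", "h264_qsv",      # encode with the Quick Sync H.264 encoder
        "-b:v", "8M",
        src.replace(".mp4", "_qsv.mp4"),
    ])
    for src in sources
]
for p in procs:
    p.wait()  # all streams transcode concurrently on fixed-function hardware
```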

We're going to have to wait another year before Intel launches its actual gaming product, DG2.

I don't think anyone read your post. They're just laughing because Intel's first GPU can't do 1080p/60 fps.
 
I guess it's a good thing they stormed into gaming then. If they had taken things more slowly, they'd be crushing the Adreno 400 now instead of the MX450.
 
That was never the point of ROCm. It facilitates CUDA code translation.
My bad, but it's still a shame that it's only for HPC. As things stand now, it looks like you'll get three brands for three types of users: AMD for pure gamers, Nvidia for 3D and VFX, Intel for video editors.
 
The article is incorrect. Anandtech has a detailed writeup, and basically there's no multi-GPU support for games at all, nor any plan for it in the future. Only certain productivity workloads can utilize both together.

It's basically the same speed as the Tiger Lake iGPU, but it gives added flexibility and speed for those content-creation workloads.

"Storms into the market" LMAO
Well that's real disappointing. And here I thought Intel might have actually beaten AMD to a modern multi-GPU implementation.
 
well ... reading that title ... i read "storm in a glass of water" (tempest in a teapot, basically). oh well, good enough for a basic ultraportable dGPU but nothing exceptional; on the encoding side it's not bad tho...
also, if that's their "Max", i wonder what their "Min" is, given the "Max" only competes with an MX450

classic Intel tho ... they need to revise their PR strategy :laugh: some slides give off a very childish vibe ...
 
Well that's real disappointing. And here I thought Intel might have actually beaten AMD to a modern multi-GPU implementation.

AMD already did iGPU+dGPU CrossFire years ago in systems pairing their APUs with dGPUs. As far as MCMs go, AMD is already on that: they have a few recent patents for something called "GPU masking", where multiple GPUs are seen as a single logical device by the API/OS. I'm guessing this will be a main part of their upcoming MCM dGPUs...
 
well ... reading that title ... i read "storm in a glass of water" (tempest in a teapot, basically). oh well, good enough for a basic ultraportable dGPU but nothing exceptional; on the encoding side it's not bad tho...
also, if that's their "Max", i wonder what their "Min" is, given the "Max" only competes with an MX450

classic Intel tho ... they need to revise their PR strategy :laugh: some slides give off a very childish vibe ...
The thing is, a dGPU is about the worst thing for an ultraportable...
 
The thing is, a dGPU is about the worst thing for an ultraportable...
yeah, i just realized that after xD, but since Intel designed it that way, then it's good enough for the worst thing to be in an ultraportable...
[joke]typical intel[/joke] "we are the best at that because no one did better than us at doing the best for the worst part of something!" :laugh:
 
yeah, i just realized that after xD, but since Intel designed it that way, then it's good enough for the worst thing to be in an ultraportable...
[joke]typical intel[/joke] "we are the best at that because no one did better than us at doing the best for the worst part of something!" :laugh:
I don't think that's what they meant to do. It's just that what they ended up with isn't good for anything else.
 
I don't think that's what they meant to do. It's just that what they ended up with isn't good for anything else.
oh no, i am sure it's what they aimed to do ... their recent PR ("we are king of yaddah yaddah real-life perf yaddah yaddah gaming crown") is kinda backfiring, thus they needed a new "we are the best for [insert new product]"

take it as a joke ;)
 
The GeForce MX450 is based on TU117 with a 64-bit bus.
 
The GeForce MX450 is based on TU117 with a 64-bit bus.
I wouldn't know. I have never considered those for anything. For me, it's a mid-range dGPU (or better), or just stick with the IGP. I have no use for parts so low-end that you need a different nomenclature to describe them.
 
More like Intel stormed into mediocrity. Yeah, yeah, it's a start, but surely they should have set the bar higher than the crapulent MX350, given that it will probably be replaced within 3-6 months and the next gen will be vastly better.
 
Well that's real disappointing. And here I thought Intel might have actually beaten AMD to a modern multi-GPU implementation.

What was old is new again? Lmao... Intel, smh.
 
So, how much of this ridicule is due to what Intel has actually said, and how much is from a clickbait headline?
Intel's slides are about mobile content creation and integration with the iGPU; gaming is mentioned less prominently.
 
HAHA, a dGPU doing 30 FPS @ 1080p cannot be called gaming.

To be able to call it gaming, it should be at least 60 FPS @ 1080p.

30 FPS @ 1080p is gaming on life support at best; you get a computer that can play older games.

BUT what happens when the next-gen games come out in a month and you get 20 FPS and it's unplayable? Then you realize you wasted your money on this Intel gaming dGPU.

I don't think it's even that. It's slower than a GeForce 1050, and the 1050 can't even run high settings at 1080p/30 fps.

The negative reactions are probably because Nvidia and AMD are neglecting the low-to-mid end, and people (myself included) are anxious for a competitive low-end card. Intel overpromised "enthusiast"-level discrete cards, so we were hoping that no matter how much they underdelivered, we'd at least get a respectable GPU in the $100-120 or $200 price range.

Maybe we will, next year. It doesn't seem promising so far, I have to say.

So far we've gotten a look at a crap discrete GPU back in February and an iterative iGPU step with Xe branding.

I'm still hoping Intel gets it right eventually, because GPU prices are ridiculous right now.
 
There are some Fire Strike scores out; it lands between the 1050 Mobile and the MX450.
AMD has nothing in this class, only the RX 550, and that crap is slower than a GT 1030 :laugh:

Yeah, sure, AMD has the APUs, but the Ryzen 3 3400G is still slower than a GT 1030 (GDDR5) in games :slap:

HAHA, a dGPU doing 30 FPS @ 1080p cannot be called gaming.
Yeah, gaming is exclusive to the great $1000-card buyers. I play atm on a massively OC'd GT 710 GDDR5, and my games like WoW and FFXIV run very well on it.
 
I have very mixed feelings about this.

1) This essentially looks a lot like a GPU coprocessor (thanks to Anandtech for the metaphor), which is good, of course. There's the question of cost, and of software that actually uses it (which I do expect to grow).
2) One thing I passionately hate Intel for is the active part they took in HSA's downfall, and *now* they are implementing the same idea themselves. Hypocrisy of the highest possible level (oh, and the people at Anandtech missed it completely). And their solution is proprietary, as I understand it. Disgusting.

On the other hand, I highly doubt that AMD and NVIDIA will have trouble making competitive products (on-die or separate), and an open standard would be nice (one already exists, *coughs*).
 
AMD already did iGPU+dGPU CrossFire years ago in systems pairing their APUs with dGPUs. As far as MCMs go, AMD is already on that: they have a few recent patents for something called "GPU masking", where multiple GPUs are seen as a single logical device by the API/OS. I'm guessing this will be a main part of their upcoming MCM dGPUs...

Between that and AMD's 4/8-way specialized communication over Infinity Architecture, which lets various Instinct accelerators talk directly among themselves as well as with EPYC, it does feel like AMD could nail MCM GPUs and some sort of heterogeneous fusion of CPU+GPU appearing as a single giant APU within the next year or two (probably along with an homage to AMD Fusion).

I'm familiar with the asymmetric CrossFire capability, but it never quite lived up to its potential at the time. I thought Intel had beaten AMD to a modern implementation of the same concept with this announcement, but it turns out that was misleading: the two GPUs can only be combined for content creation, not for games.
 