
Intel Storms into 1080p Gaming and Creator Markets with Iris Xe MAX Mobile GPUs

The only novel thing here is their ability to have an iGPU and dGPU link up and work together, which isn't something AMD is really doing currently. Although I recall AMD had a more rudimentary version of it with CrossFire, and was revisiting the idea for future MCM GPUs and heterogeneous computing via Infinity Architecture in general.

Still, it is a small edge (feature-wise) Intel has for now, and if they're able to go further with a SAM-like equivalent, they could potentially squeeze out a bit more performance that way. That said, it'll be a while until they can sufficiently catch up in actual performance, unless AMD or NVIDIA trips up hard.
 
What I find funny, and what most don't pick up on, is this: unlike gaming laptops on battery power, where the GPU gets castrated and performs worse than an iGPU (I've literally tested it and complained about it in reviews), if you game on the go, i.e. on battery power, and Intel puts these GPUs in power-efficient laptops and lets them stretch their legs, it's honestly not a bad showing.
 
If I'm rocking any form of laptop, I'm probably using it somewhere I can plug into the wall. I don't get gaming on the go without a charger; who even does that? You're ultimately emptying the battery on games, and if you then need to do something practical with the laptop, such as checking bus or plane times, good luck when it's dead. Sure, you have your phone, but it'll be a lot more annoying. If you're working with a limited battery, you normally want to keep as much charge as you can so you can squeeze out enough to finish your hotel day or whatever. If you have nothing better to do than open the laptop to play games in public, then I wouldn't know what to say to that. More battery is great, on the other hand, especially if you go around doing things like diagnosing cars with your laptop; that can take time, and you can't let the laptop die while you're checking for errors in the ECU.
 
What's the point of allowing a laptop CPU to suck down 100 watts while the GPU is castrated and limited to just 30 watts? Games don't max out the CPU, nor do most workloads. Yet on battery I can hammer the CPU all day, while the GPU will never actually be used to any meaningful extent. Say the CPU is 100 watts under turbo and 60 watts typical; the GPU is TDP-capped at 80 watts but knocked down to 30 watts on battery. In today's world, where many tasks are GPU-accelerated, the fact is a dedicated GPU performs worse than an iGPU in those situations.

What's the point of a 94 Wh battery if you're not going to use it?

Fact is, a mobile CPU + decent iGPU under load with a good battery can last 3-4 hours playing games when properly configured. I can get nearly 60 FPS out of a 25-watt Ryzen 4800U in GTA V at 720p, yet in a similar test with, say, a 1660 Ti you get about 15-20. That said, maybe the use case doesn't apply to you. But it does apply to me, and at this point, battery performance gets worse with each new generation.
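For a rough sanity check on those runtimes, here's some back-of-the-envelope math; the wattages are my own illustrative assumptions, not measurements:

```python
# Idealized battery-life math: runtime = capacity / system draw.
# Assumed figures for illustration only: a 94 Wh pack, a 25 W APU
# (CPU + iGPU on one die) plus ~10 W for display/RAM/SSD, versus a
# CPU + dGPU combo allowed to pull ~90 W total under load.

def runtime_hours(battery_wh: float, draw_w: float) -> float:
    """Ideal runtime; ignores conversion losses and battery wear."""
    return battery_wh / draw_w

BATTERY_WH = 94.0

for label, draw in [("25 W APU + 10 W rest-of-system", 35.0),
                    ("unthrottled CPU + dGPU under load", 90.0)]:
    print(f"{label}: ~{runtime_hours(BATTERY_WH, draw):.1f} h")

# ~2.7 h vs ~1.0 h: capping total draw is the only way a big pack
# translates into the 3-4 h gaming runtimes mentioned above.
```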

So if the Intel GPU is competitive and low-power enough, it's possible to get a good on-the-go gaming experience on battery power while still getting 4-5 hours of use when paired with a larger battery.

So it really comes down to how much power this thing will use. If it's below 30 watts, it avoids the throttling issue, and that's a win for me. Hopefully future RDNA-based iGPUs from AMD are coming soon.
 
What a pointless announcement.

It's for thin-and-lights, so the single most important metric is performance per watt.
I skimmed all those slides and didn't see a single mention of it.

Let's, just for one second, assume it has competitive performance per watt (and I very much doubt that, because if it did, Intel would be shouting about their efficiency from the rooftops): how much does the damn thing cost?
 
There needs to be a more viable alternative to CUDA that isn't a proprietary API and that actually gains traction
There's already OpenCL and Vulkan Compute. The issue is not "proprietary vs. open source", but rather fundamental differences in GPU architectures. You cannot make a platform-independent API that outperforms a platform-optimized one. There's always some kind of tradeoff.
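To illustrate the portability side of that tradeoff, here's a minimal vector-add sketch using pyopencl (assuming pyopencl and an OpenCL runtime are installed). The same kernel source runs unmodified on Intel, AMD, or NVIDIA GPUs, but it can't exploit vendor-specific hardware paths the way CUDA can on NVIDIA silicon:

```python
import numpy as np
import pyopencl as cl

# Build a context/queue on whatever OpenCL device is available
# (Intel iGPU, AMD, NVIDIA -- the point is the code doesn't care).
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

a = np.random.rand(1024).astype(np.float32)
b = np.random.rand(1024).astype(np.float32)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The kernel is plain OpenCL C -- portable, but also generic:
# no tensor cores, no warp-level intrinsics, no vendor extensions.
prg = cl.Program(ctx, """
    __kernel void vadd(__global const float *a,
                       __global const float *b,
                       __global float *out) {
        int gid = get_global_id(0);
        out[gid] = a[gid] + b[gid];
    }
""").build()

prg.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, a + b)
```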
 

This seems to be too general a post. You're not even saying what laptop, or what kind of laptop, is the topic, and I already mentioned that most people are going to plug the laptop in. CPU power does matter; you don't get anywhere near the power of a desktop CPU in most laptops.

Fact is, you're describing a configuration I don't really care about, and that should tell you something about the end user. I'm not going to run some easy-to-run game like GTA V on battery at 720p.

I do use the 94 Wh battery, actually; it depends on what kind of laptop it is. As long as the machine is not overly heavy, it comes in handy. Though it's more of a thing with gaming and workstation laptops that have 240 W chargers.

Furthermore, if you look at the machines in the slides, you'll see convertible laptops and a Swift. Those are not gaming laptops, and when you want to game on them, you'll probably want to get the performance out properly by plugging into the wall... or you'll not only get worse performance but also run out of battery, because your thin convertible doesn't come with a 94 Wh battery.
 
This press release reads like an infomercial on TV.
I was waiting for the bit at the end that said, "But wait, there's more!
We'll even throw in six FREE steak knives." :laugh:
 
Practically DOA. What I find particularly baffling is that this uses LPDDR4X. If you go as far as having a dedicated die with its own DRAM, why not use modern memory optimized for GPUs? The MX350 is practically the closest thing NVIDIA can do to an integrated chip in terms of cost and performance, and this barely outperforms it? Damn.

I can't believe I'm saying this, but Intel, of all companies, can't seem to comprehend that they need mindshare. You simply can't enter this space with the lowest imaginable tier of performance; this will seem like a joke to people, and by the time they scrape together something with higher performance, it may as well be over for them.
 
It's a start, I guess, but MX350-beating performance is hardly something to brag about.
To be impressive, this would have to beat the mobile GTX 1650.
I can't find any mention of which codecs are supported for encoding, beyond H.265.

This is very dependent on price and TDP. MX350 laptops are >€700; bring this performance to a lower price point and I'm in.
 
Hopefully Intel fixed their horrible frame-time problem for their iGPU/dGPU.
 
This begs a question: where are the leaks for AMD's Big Navi laptop GPUs? Since AMD is touting better efficiency than NVIDIA this generation, shouldn't AMD's laptop GPU division get a boost as well? They only went up to the RX 5600M last year, if I remember correctly, which was a little slower than the RTX 2060 Max-Q, the slower version of the mobile RTX 2060. AMD did announce the RX 5700M, but I think they quietly pulled the plug, since no laptop was ever available with it.
 
At least this has gone further than Larrabee, right? Right?
If you call in the next 30 minutes, you will get another GPU absolutely free!!!
 

The article is incorrect. AnandTech has a detailed writeup, and basically there's no multi-GPU support for games at all, nor any plan for it in the future. Only certain productivity workloads can use the two together.

It's basically the same speed as the Tiger Lake iGPU, but adds flexibility and speed for those content-creation workloads.

"Storms into the market" LMAO
 
PCIe Gen 4? Do Intel CPUs even have that?
 
Intel is great in "slides", but slow in motion... Sad times for Intel...
 
Dude, chill, Big Navi isn't even out yet. Laptop versions usually come out much later.
 
I came here to say the same; "storming" is hardly the word to use when you're beating a lowly MX350.

If anyone is storming, it's AMD.
 
"We reckon that most e-sports titles should be playable at over 45 FPS at 1080p."

And I reckon that players of such titles won't be satisfied with that performance. I really don't think 96 EUs at 1.65 GHz will be enough to "storm into" 1080p gaming, especially if the chips are only sold to laptop manufacturers and OEMs. Don't get me wrong, I'd be happy if AMD and NVIDIA had some competition in the gaming GPU market, but I can't see it happening just yet.
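For what it's worth, the raw math on those 96 EUs, assuming Xe-LP's 8-wide FP32 EUs and counting an FMA as two ops (a paper estimate, not a benchmark):

```python
# Theoretical shader throughput for Iris Xe MAX (DG1).
eus = 96                 # execution units
fp32_lanes_per_eu = 8    # Xe-LP EU width (assumed)
ops_per_fma = 2          # multiply + add per clock
clock_ghz = 1.65         # rated boost clock

fp32_tflops = eus * fp32_lanes_per_eu * ops_per_fma * clock_ghz / 1000
print(f"FP32: ~{fp32_tflops:.2f} TFLOPS")  # ~2.53

# On paper that's more FLOPS than an MX350, but FLOPS don't compare
# cleanly across architectures -- drivers and memory bandwidth
# (LPDDR4X here) decide whether "45 FPS e-sports" actually holds.
```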
 
"Storms into... 1080p market", savage.

Bad news for NVIDIA's MX series, though.
 
To a certain someone who said that "1080p is so yesterday": then why is the majority of users still on that resolution to this day?
 
That's interesting, but I wonder how long it will take to become mainstream. AMD's ROCm was also supposed to let Radeons run CUDA... since 2016, and there's still no news of AMD support in mainstream CUDA apps.

That was never the point of ROCm. It facilitates translating CUDA code (via HIP); it doesn't run CUDA apps directly.

I think most here are too focused on the gaming aspects of this chip. Yes, you can game on it, and its gaming performance is comparable to an MX350/GTX 1050, but that's not the main selling point of DG1. You don't buy a laptop with an MX350 for gaming either, do you? You mainly want it to speed up content creation. Case in point: the three laptops it launches in are not meant for gaming whatsoever.
The combined power of the Iris Xe and Iris Xe Max is nothing to be scoffed at considering the power envelope: FP16 => 8 TFLOPS. That's mobile GTX 1650 level.
I don't know if anybody here uses their computer for work, but Gigapixel AI acceleration, video encoding... these things really matter for content creators.
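A quick arithmetic check on that combined figure, under the same assumptions as before (8 FP32 lanes per EU, FMA = 2 ops, FP16 at double rate); the iGPU clock is my estimate for an i7-1165G7:

```python
# Rough check of the combined Iris Xe + Iris Xe MAX FP16 claim.
def fp16_tflops(eus: int, clock_ghz: float) -> float:
    lanes, fma_ops, fp16_rate = 8, 2, 2   # assumed Xe-LP figures
    return eus * lanes * fma_ops * fp16_rate * clock_ghz / 1000

igpu = fp16_tflops(96, 1.30)  # Tiger Lake iGPU, ~1.3 GHz (estimate)
dgpu = fp16_tflops(96, 1.65)  # Iris Xe MAX at rated boost

print(f"combined: ~{igpu + dgpu:.1f} TFLOPS FP16")  # ~9.1

# Peak paper math lands a bit above the quoted 8 TFLOPS, which
# presumably reflects more conservative sustained clocks -- either
# way, it's the same ballpark as Intel's quoted figure.
```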

[Image: Xe_MAX_Use-Cases.jpg]


In this multi-stream video encoding test, it beat an RTX 2080 + i9-10980HK combination. Can you imagine a puny lightweight Acer Swift 3 beating a >€2000 heavy gaming laptop at any task? I'd say mission accomplished.

We're going to have to wait another year before Intel launches their actual gaming product, DG2.
 