
Intel Arc A730M Tested in Games, Gaming Performance Differs from Synthetic

Actually, after reading the article, I can't really tell if the result is good or bad. In the first place, we are looking at the A730M, which is not the flagship A770M and seems to roughly match the RTX 3060 Mobile in terms of specs. The conclusion seems to imply that the A730M performs at an RTX 3060 Mobile level, so it doesn't sound bad. I can't wrap my head around why the comparison is made with the RTX 3050 and 3060 desktop versions in the first place, since it's clear that desktop components have more power headroom and almost no thermal restrictions compared to a laptop GPU. Also, the faster desktop CPUs will most likely contribute to better performance results at, say, 1080p, and to a lesser extent, 1440p.
It's hard to make a direct comparison because this chip is so heavily cut down, but rumors were saying the full desktop version would compete with the 3070 Ti. And it turns out the cut-down part is GA106 tier. The spec difference between this variant and the full chip is SMALLER than the difference between the 3060 and the 3060 Ti. So we can reasonably infer from this that first-gen Xe will offer roughly 3060 Ti performance on desktop, falling short of expectations. So yes, the result is bad, quite bad.
 
I don't think anyone is shocked that Intel would be better in synthetics than in real games


Hopefully they can work some magic in the drivers, but game engines were simply designed for Nvidia or AMD; nothing's optimised for their graphics cards
 
If you check the difference in performance-level placement between 3DMark and games relative to the competition, it's around -30% to -35% (in other words, Arc achieves roughly 1.4x-1.5x its games-level performance in synthetics).
So if the source behind the original 3070 Ti story told to the leakers was based on 3DMark numbers, then in games the top Arc would land at 3070 Ti minus 30-35%, which gives RX 6650 XT / RX 6600 XT level performance (rough arithmetic sketched below).
Now imagine Intel matching the RX 6650 XT at $399 for the 8GB version, since it offers the same performance plus matrix A.I. cores plus a better media engine, and then +$10/GB for the 16GB version at $479; after all, memory always sells products, right?
And at a later date, when AMD launches RDNA3, Intel lowers the prices (less embarrassing imo than the RX 5700/5700 XT $449-$399-$349 saga).
Now that would be something!
Just joking!!!
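A minimal back-of-the-envelope sketch of the scaling speculated above (Python). The 30-35% synthetics-to-games gap, the "3070 Ti-class in 3DMark" starting point and the RX 6600 XT / 6650 XT mapping are all this post's guesses, not benchmark data:

# Rough sketch of the speculated synthetics-to-games scaling; all inputs are assumptions from this post.
SYNTHETIC_TO_GAMES_GAP = (0.30, 0.35)  # assumed placement drop going from 3DMark to games

def estimated_game_level(synthetic_relative):
    # Scale a relative synthetic score down by the assumed gap range.
    low, high = SYNTHETIC_TO_GAMES_GAP
    return synthetic_relative * (1 - high), synthetic_relative * (1 - low)

# Treat the RTX 3070 Ti as the 100% reference point implied by the early leaks:
print(estimated_game_level(100.0))  # roughly 65-70% of a 3070 Ti -> RX 6600 XT / 6650 XT territory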
 

So we'd still pay too much for shite drivers and get no competition at the price point. Even if this isn't a joke, it's a joke :D

I mean really, $399 for a bottom mid-tier GPU... nah. Hard. Pass. Even with the full feature set Nvidia has now, I wouldn't even think of it.
 
But if you went to buy a new laptop and the only way to get the latest Intel CPU was bundled with one of these GPUs, would you consider it?

Pretty sure they know they can't compete, so they're going to push the OEMs to get the sales and profits anyway (and of course, over time, work on drivers and faster models)
 
It's probably what @Mussels said. If the achieved performance is that low, Intel is of course the first to know about it, so to maximize their profit the strategy is exactly what they are doing: launch OEM & China first, get a feel for what the channel will tolerate based on the negotiations that occurred and on what they actually managed to sell (and at what price) through the OEM channel, and then delay the international DIY launch/reviews as long as possible to try to fix driver/software issues, so that very bad reviews don't adversely affect future OEM sales (it's all about OEM).
I hate being a smartass, but it seems the work happening in the software department just isn't enough. Either they must hire more people (if the delay in reaching an acceptable driver/software level is because they need more time), or someone must go (if the problem is about execution and leadership).
Regarding reviews, the task, although very difficult, is pretty clear: identify the major publications and the games they test and focus there, first eliminating the major problems around stability/visual artifacts etc., and on a second level extracting as much performance as possible in these specific titles.
It can't be more than 50 games really. Between Q1 2022, when they shipped final silicon, and the end of Q3, when the international DIY launch with review samples may occur (end of Q2, I suspect, is all about China and OEMs, like the mobile launch?), there are 25+ weeks. That's half a week's worth of working hours for the whole Intel software team per title, or a full week per title if they focus on the 25 major games that most publications use, or the 25 games causing them the most trouble (rough numbers sketched below).
And to tell you the truth, it's more like half a month's worth of working hours for the whole Intel software team per title, since I imagine they started a lot earlier than the end of Q1 when they shipped final silicon to OEMs.
Maybe I'm being too harsh, especially since I don't know the internals, but something must change, that's for sure; they can't be happy with the results they're achieving!
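A minimal sketch of the weeks-per-title arithmetic above (Python). The 25-week and 50-week windows and the 25/50 game counts are this post's assumptions, not Intel's actual schedule or team size:

# Back-of-the-envelope calendar-time arithmetic from the post above; all inputs are the poster's guesses.
def weeks_per_title(window_weeks, num_titles):
    # Calendar weeks the whole driver team could spend per game, spread evenly.
    return window_weeks / num_titles

print(weeks_per_title(25, 50))  # ~0.5 weeks per title across 50 games
print(weeks_per_title(25, 25))  # ~1 week per title focusing on the 25 most-tested games
print(weeks_per_title(50, 25))  # ~2 weeks (about half a month) if work started well before Q1 2022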
 
so they're going to push the OEMs to get the sales and profits anyway
I believe they don't have the kind of pull they had two years back, probably not even half of what they had a decade back! Unless they're paying copious amounts of kickback, contra-revenue kind of stuff, it will be hard for them to sell too many of their substandard GPUs ~ which is assuming they perform horribly.
 
Remember the rumours that the GPUs only worked with specific Intel CPUs and motherboards?
Intel's first Xe graphics card only works with Intel CPUs and mobos | PC Gamer

If they do this, it makes it super easy to sell all-Intel parts in certain market segments
 
No, never heard of this one, but yeah, if Intel really wants to, they can definitely force the OEMs to do their bidding, that's for sure!
 
But if you went to buy a new laptop and the only way to get the latest Intel CPU was bundled with one of these GPUs, would you consider it?
Honestly, if the price is right, I would. Once you have your laptop, and fire up some games, you don't feel that it is 10% behind the other laptop that you didn't buy. You only feel your own gaming experience.
 
When is Intel ARC releasing for desktop?
 
I don't think anyone is shocked that Intel would be better in synthetics than in real games


Hopefully they can work some magic in the drivers, but game engines were simply designed for Nvidia or AMD; nothing's optimised for their graphics cards
Well, barely anyone can optimize for Intel atm. And the ones that do (partners and such) probably don't put a lot of resources into that anyway.
 
Sorry, I thought the next step was implied - FUTURE games will be, but current and past ones can't be and will rely on driver tweaks
 
It's the same for AMD and Nvidia: games released before Ampere will not be optimized for Ampere, but will rely on proper driver profiles instead. Intel doesn't have the best track record here, but let's stay positive about it.
 
Also, just because games aren't optimised for a certain architecture it doesn't mean they can't run well on it.
 