Tuesday, June 7th 2022

Intel Arc A730M Tested in Games, Gaming Performance Differs from Synthetic

The Intel Arc A730M "Alchemist" discrete GPU made headlines yesterday, when a notebook featuring it achieved a 3DMark Time Spy score of 10,138 points, which would put its performance in the same league as the NVIDIA GeForce RTX 3070 Laptop GPU. The same source has since taken the time to play some games, and come up with performance numbers that would put the A730M a category below the RTX 3070 Laptop GPU.

The set of games tested is rather small: F1 2020, Metro Exodus, and Assassin's Creed Odyssey. That said, the three are fairly "mature" games (they have been around for a while). The A730M scores 70 FPS at 1080p and 55 FPS at 1440p in Metro Exodus. In F1 2020, we're shown 123 FPS (average) at 1080p and 95 FPS average at 1440p. In Assassin's Creed Odyssey, the A730M yields 38 FPS at 1080p and 32 FPS at 1440p. These numbers roughly translate to the A730M being slightly faster than the desktop GeForce RTX 3050 and slower than the desktop RTX 3060, or in the league of the RTX 3060 Laptop GPU. Intel is already handing out stable versions of its Arc "Alchemist" graphics drivers, and the three games are fairly old, so this is unlikely to be a simple case of poor optimization.
Sources: Golden Pig Update (Weibo), VideoCardz

66 Comments on Intel Arc A730M Tested in Games, Gaming Performance Differs from Synthetic

#1
ZoneDymo
If the price is right, this seems like a fine product....but I highly doubt the price will be right
Posted on Reply
#2
Daven
ZoneDymoIf the price is right, this seems like a fine product....but I highly doubt the price will be right
There’s a good chance no one will be able to buy the first generation as an AIB. These GPUs might only be available in complete OEM systems and released in a limited number of countries.
Posted on Reply
#3
Arkz
lol what a surprise. Watch it launch in a few months for the price of a 3070. As the 4000 series releases.
Posted on Reply
#4
KarymidoN
those 9 FPS minimums are bout to be annoying af
Posted on Reply
#5
Chrispy_
EXACTLY as I called it here yesterday:
Chrispy_I'm also wary of 3DMark scores. Intel Iris Xe laptop GPUs were excellent in 3DMark - outperforming the Vega 8 IGPs in the Renoir Ryzen 7 4800HS and Ryzen 9 4900HS - but sorely under-delivered in actual games and rendering applications, barely matching the low-power Vega 5 and Vega 6 in the TDP-restricted Ryzen 3 4300U and Ryzen 5 4500U respectively.
Posted on Reply
#6
aQi
Pretty impressive for a debut GPU, considering it's positioned as an entry-level tier part. I like that these games are not optimised for any GPU other than NVIDIA and AMD, yet Intel still managed to keep it competitive. It's a kick-start, but price should be the main concern here.
Posted on Reply
#7
Durvelle27
You guys are complaining about bad performance, but I see this as progress, considering Intel has never been competitive in the GPU space. If this tier does this well, the next two larger tiers should easily offer around RTX 3070 and RTX 3080 performance levels, which is great for consumers. More offerings create more competitive pricing. Let's also not forget this is a laptop GPU and not a desktop variant, so it looks even more promising for the desktop variant.
Posted on Reply
#8
Chrispy_
aQiPretty impressive for a debut GPU, considering it's positioned as an entry-level tier part. I like that these games are not optimised for any GPU other than NVIDIA and AMD, yet Intel still managed to keep it competitive. It's a kick-start, but price should be the main concern here.
That's not true - the Xe architecture has been running in laptops for a few years now and Intel absolutely has been working with game developers to optimise drivers, game profiles, and fix bugs. Forza is one of the engines that Xe drivers initially struggled with at launch of Tiger Lake about 2 years ago.
Posted on Reply
#9
Sithaer
KarymidoNthose 9 FPS minimums are bout to be annoying af
That's more of a fault in the benchmarks.
Some built-in benchmarks can have very low minimums because the game freezes for a split second when the benchmark starts up, effectively ruining the minimum frame; I've seen this happen in various game benchmarks on my PC before.
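The point above can be sketched with a toy example (hypothetical frame-time data, not from the article's benchmarks): a single startup hitch dominates the absolute minimum FPS, while a percentile-based "1% low" figure is robust to it.

```python
# Hypothetical frame-time trace: one ~9 FPS hitch as the benchmark starts,
# then a steady ~125 FPS run (8 ms per frame).
frame_times_ms = [111.0] + [8.0] * 999

# Convert each frame time to an instantaneous FPS value.
fps = [1000.0 / t for t in frame_times_ms]

absolute_min = min(fps)                          # ruined by the single hitch
one_percent_low = sorted(fps)[len(fps) // 100]   # ~1st-percentile frame

print(f"absolute minimum: {absolute_min:.0f} FPS")   # ~9 FPS
print(f"1% low:           {one_percent_low:.0f} FPS")  # ~125 FPS
```

This is why many reviewers report 1% (or 0.1%) lows rather than the raw minimum a built-in benchmark overlay shows.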
Posted on Reply
#10
bug
It's still not bad, considering this isn't Intel's fastest mobile GPU. But I always take mobile benchmarks with a grain of salt: the mere build of the laptop can lead to different scores from the same GPU, and comparing different GPUs is that much harder.
Posted on Reply
#11
Chrispy_
Durvelle27You guys are complaining about bad performance, but I see this as progress, considering Intel has never been competitive in the GPU space. If this tier does this well, the next two larger tiers should easily offer around RTX 3070 and RTX 3080 performance levels, which is great for consumers. More offerings create more competitive pricing. Let's also not forget this is a laptop GPU and not a desktop variant, so it looks even more promising for the desktop variant.
The Xe IGPs have proved that the architecture is sound and the drivers are fine. No, they're not at AMD/Nvidia levels yet but like you say - this is their first stab at it and it's not looking like a total disaster at least. They just have to compete and if that means adjusting price and performance expectations, so be it. I don't think anyone at Intel expected it to be easy, or a guaranteed success and they'll likely learn a lot that's already going into development of 2nd generation Xe architecture.
Posted on Reply
#12
Unregistered
By the time they launch this, AMD will be offering an APU with faster graphics.
#13
bug
Chrispy_The Xe IGPs have proved that the architecture is sound and the drivers are fine. No, they're not at AMD/Nvidia levels yet but like you say - this is their first stab at it and it's not looking like a total disaster at least. They just have to compete and if that means adjusting price and performance expectations, so be it. I don't think anyone at Intel expected it to be easy, or a guaranteed success and they'll likely learn a lot that's already going into development of 2nd generation Xe architecture.
First stab at what? Intel, while not selling discrete GPUs, is the largest GPU maker in the world by the number of units (IGPs) sold. They have also had the i740 (which was decent) and Larrabee (which sucked balls). The know-how is there; they just didn't feel like selling discrete GPUs for a while.

That said, I'm not expecting (or caring about) a 3090 killer; I just want another alternative in the $200-300 range. I'm still hoping Intel is on track to do that.
Posted on Reply
#14
john_
Intel programmers building drivers for the iGPUs probably had a very easy time in the past. X game wasn't working? Who cares? Y setting in Z game produces artifacts? Just disable it.

Now they have to REALLY work: fix all those bugs and check every possible setting in every game that runs on the new Arc series. It was expected that they had a huge task at hand, but failing to start a game like Shadow of the Tomb Raider in DirectX 12 mode is something I wasn't expecting to read. I think people who respect themselves, even the biggest Intel fans, will avoid Arc. Except if they only care about the media engine, which is probably really good.
Posted on Reply
#15
laszlo
Intel's modern version of alchemy... selling trash at high prices... seems Raja discovered how to make gold without using lead :laugh:
Posted on Reply
#16
GunShot
I see this Arc line not as a direct competitor to anything NVIDIA in gaming (that'd be suicide for Intel right now), but more of a competitor to Apple's M2 line... well, I guess that's even double suicide. :nutkick:
Posted on Reply
#17
r9
I hope they sell enough cards to be encouraged to keep working on this and bring an even more competitive product. Intel hater or fanboy, either way this can't be a bad thing.
Posted on Reply
#18
R0H1T
I'm shocked, truly shocked o_O
Posted on Reply
#19
john_
r9I hope they sell enough cards to be encouraged to keep working on this and bring an even more competitive product. Intel hater or fanboy, either way this can't be a bad thing.
Intel got back into the GPU game out of necessity. GPUs have become extremely powerful co-processors, and in the future GPUs could be the clear majority in big supercomputers, with CPUs being fewer in number and reserved for simpler tasks. Also, with AMD having high-end GPUs AND CPUs, and the possibility (until recently) of NVIDIA buying Arm and building a competing gaming platform based on the Arm architecture, the Intel platform could have been left with NO high-end graphics cards. AMD and NVIDIA could just stop making graphics cards compatible with x86 Intel CPUs, or just keep the best GPU models for their own platforms.
Posted on Reply
#20
Chrispy_
bugFirst stab at what? Intel, while not selling discrete GPUs, is the largest GPU maker in the world by the number of units (IGPs) sold. They have also had the i740 (which was decent) and Larrabee (which sucked balls). The know-how is there; they just didn't feel like selling discrete GPUs for a while.

That said, I'm not expecting (or caring about) a 3090 killer; I just want another alternative in the $200-300 range. I'm still hoping Intel is on track to do that.
First stab at a larger, discrete GPU. In other words, scaling their product up to larger EU counts, and designing a dedicated bus/interconnect and GDDR6 controller that allows scaling up beyond the IGPs - which just parasitically use the existing CPU interconnect and memory access and don't require dedicated architecture for that by themselves. The basic architecture and drivers of these Arc dGPUs isn't new, but the actual product is more than just the architecture and driver.

I argued that the Xe architecture and drivers have already had two years to mature in an earlier post so I'm well aware that Intel aren't making their first GPU ever.
Posted on Reply
#21
defaultluser
john_Intel got back into the GPU game out of necessity. GPUs have become extremely powerful co-processors, and in the future GPUs could be the clear majority in big supercomputers, with CPUs being fewer in number and reserved for simpler tasks. Also, with AMD having high-end GPUs AND CPUs, and the possibility (until recently) of NVIDIA buying Arm and building a competing gaming platform based on the Arm architecture, the Intel platform could have been left with NO high-end graphics cards. AMD and NVIDIA could just stop making graphics cards compatible with x86 Intel CPUs, or just keep the best GPU models for their own platforms.
But when you are joining this late, you will have to spend a decade losing money before you can catch up. AMD's painful transition to GCN compute cards already happened, so now there are two established brands to compete with!

Also, prior to Ethereum, discrete GPU sales were falling every year. And given how cyclical crypto can be, I wouldn't expect these cards to be all that desirable to consumers (they will pick up discounted 3050s after Ada releases).

There's not enough sustained growth from crypto for a third player that's so far behind!
Posted on Reply
#22
bug
defaultluserBut when you are joining this late, you will have to spend a decade losing money before you can catch up. AMD's painful transition to GCN compute cards already happened, so now there are two established brands to compete with!

Also, prior to Ethereum, discrete GPU sales were falling every year. And given how cyclical crypto can be, I wouldn't expect these cards to be all that desirable to consumers (they will pick up discounted 3050s after Ada releases).

There's not enough sustained growth from crypto for a third player that's so far behind!
I'm sure those numbers are unknown to Intel and they simply jumped in headfirst :wtf:
Posted on Reply
#23
64K
defaultluserBut when you are joining this late, you will have to spend a decade losing money before you can catch up. AMD's painful transition to GCN compute cards already happened, so now there are two established brands to compete with!

Also, prior to Ethereum, discrete GPU sales were falling every year. And given how cyclical crypto can be, I wouldn't expect these cards to be all that desirable to consumers (they will pick up discounted 3050s after Ada releases).

There's not enough sustained growth from crypto for a third player that's so far behind!
If Intel is determined to claim a large discrete GPU market share, then it won't matter if they lose some money along the way. Their 2021 financial statement shows:

Revenue 79 billion USD
Profit 19.9 billion USD
Cash on Hand and Equivalents around 30 billion USD

It remains to be seen how serious they are about their commitment but they certainly have the means to bring some financial pain to Nvidia and AMD.
Posted on Reply
#24
ThrashZone
john_Intel got back into the GPU game out of necessity. GPUs have become extremely powerful co-processors, and in the future GPUs could be the clear majority in big supercomputers, with CPUs being fewer in number and reserved for simpler tasks. Also, with AMD having high-end GPUs AND CPUs, and the possibility (until recently) of NVIDIA buying Arm and building a competing gaming platform based on the Arm architecture, the Intel platform could have been left with NO high-end graphics cards. AMD and NVIDIA could just stop making graphics cards compatible with x86 Intel CPUs, or just keep the best GPU models for their own platforms.
Hi,
Think it was more that Intel was missing out on all the easy miner money than anything else.
Posted on Reply
#25
bug
ThrashZoneHi,
Think it was more that Intel was missing out on all the easy miner money than anything else.
They've already built mining accelerators for that, so that's not it.
Posted on Reply