
Intel Arc A730M Tested in Games, Gaming Performance Differs from Synthetic

btarunr

Editor & Senior Moderator
The Intel Arc A730M "Alchemist" discrete GPU made headlines yesterday, when a notebook featuring it achieved a 3DMark TimeSpy score of 10,138 points, which would put its performance in the same league as the NVIDIA GeForce RTX 3070 Laptop GPU. The same source has since played some games, and the resulting performance numbers put the A730M a category below the RTX 3070 Laptop GPU.

The set of games tested is rather small (F1 2020, Metro Exodus, and Assassin's Creed Odyssey), but all three are fairly "mature" games that have been around for a while. The A730M scores 70 FPS at 1080p and 55 FPS at 1440p in Metro Exodus. In F1 2020, we're shown 123 FPS average at 1080p, and 95 FPS average at 1440p. In Assassin's Creed Odyssey, the A730M yields 38 FPS at 1080p, and 32 FPS at 1440p. These numbers roughly translate to the A730M being slightly faster than the desktop GeForce RTX 3050 and slower than the desktop RTX 3060, or in the league of the RTX 3060 Laptop GPU. Intel is already handing out stable versions of its Arc Alchemist graphics drivers, and the three are fairly old games, so this might not be a case of bad optimization.
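
As an aside on how a handful of per-game figures gets condensed into a claim like "slightly faster than the desktop RTX 3050": one common way is to take per-game FPS ratios against a reference card and average them with a geometric mean. Here is a minimal Python sketch using the 1080p numbers above; the reference-card figures are made-up placeholders for illustration, not measured data.

from math import prod

# A730M 1080p averages as reported above.
a730m_fps = {"Metro Exodus": 70, "F1 2020": 123, "AC Odyssey": 38}

# Hypothetical reference-card numbers, purely illustrative.
reference_fps = {"Metro Exodus": 75, "F1 2020": 130, "AC Odyssey": 42}

# Per-game ratio, then geometric mean so no single title dominates the result.
ratios = [a730m_fps[g] / reference_fps[g] for g in a730m_fps]
relative = prod(ratios) ** (1 / len(ratios))
print(f"A730M at roughly {relative:.0%} of the reference card")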



View at TechPowerUp Main Site | Source
 
If the price is right, this seems like a fine product... but I highly doubt the price will be right.
 
lol what a surprise. Watch it launch in a few months for the price of a 3070. As the 4000 series releases.
 
those 9 fps minimums bout to be annoying af
 
EXACTLY as I called it here yesterday:

I'm also wary of 3DMark scores. Intel Iris Xe laptop GPUs were excellent in 3DMark, outperforming the Vega 8 IGPs in the Renoir Ryzen 7 4800HS and Ryzen 9 4900HS, but sorely under-delivered in actual games and rendering applications, barely matching the low-power Vega 5 and Vega 6 in the TDP-restricted Ryzen 3 4300U and Ryzen 5 4500U respectively.
 
Pretty impressive for a debut GPU, considering it's supposed to be an entry-level tier part. I also like that these games aren't optimised for any GPU other than NVIDIA's and AMD's, yet Intel still managed to keep performance respectable. It's a kick start, but price should be the main concern here.
 
You guys are complaining about bad performance, but I see this as progress, considering Intel has never been competitive in the GPU space. If this tier does this well, the next two larger tiers should easily offer around RTX 3070 and RTX 3080 performance levels, which is great for consumers. More offerings create more competitive pricing. Let's also not forget this is a laptop GPU and not a desktop variant, so it looks even more promising for the desktop variant.
 
Pretty impressive for a debut GPU, considering it's supposed to be an entry-level tier part. I also like that these games aren't optimised for any GPU other than NVIDIA's and AMD's, yet Intel still managed to keep performance respectable. It's a kick start, but price should be the main concern here.
That's not true - the Xe architecture has been running in laptops for a few years now, and Intel absolutely has been working with game developers to optimise drivers, build game profiles, and fix bugs. Forza is one of the engines that Xe drivers initially struggled with at the launch of Tiger Lake about two years ago.
 
those 9 fps minimums bout to be annoying af

That's more of a fault in the benchmarks.
Some built-in benches can have very low minimums because the game freezes for a split second when the benchmark starts up, effectively ruining the minimum frame; I've seen this happen in various game benches on my PC before.
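
To illustrate (not data from this test, just made-up frame times): one long frame during start-up is enough to drag the reported minimum into single digits, and discarding a short warm-up window recovers a sane figure. A rough Python sketch:

# One 250 ms start-up hitch followed by an otherwise smooth ~120 fps run.
frame_times_ms = [250.0] + [8.1, 8.3, 8.0, 8.4, 8.2] * 200

def min_fps(frames_ms, warmup_ms=0.0):
    # Minimum FPS after skipping the first `warmup_ms` of the run.
    elapsed = 0.0
    kept = []
    for ft in frames_ms:
        if elapsed >= warmup_ms:
            kept.append(ft)
        elapsed += ft
    return 1000.0 / max(kept)

print(min_fps(frame_times_ms))                   # ~4 fps, dominated by the hitch
print(min_fps(frame_times_ms, warmup_ms=1000))   # ~119 fps with the hitch excluded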
 
It's still not bad, considering this isn't Intel's fastest mobile GPU. But I always take mobile benchmarks with a grain of salt: the mere build of the laptop can lead to different scores from the same GPU, and comparing different GPUs is that much harder.
 
You guys are complaining about bad performance, but I see this as progress, considering Intel has never been competitive in the GPU space. If this tier does this well, the next two larger tiers should easily offer around RTX 3070 and RTX 3080 performance levels, which is great for consumers. More offerings create more competitive pricing. Let's also not forget this is a laptop GPU and not a desktop variant, so it looks even more promising for the desktop variant.
The Xe IGPs have proved that the architecture is sound and the drivers are fine. No, they're not at AMD/Nvidia levels yet but like you say - this is their first stab at it and it's not looking like a total disaster at least. They just have to compete and if that means adjusting price and performance expectations, so be it. I don't think anyone at Intel expected it to be easy, or a guaranteed success and they'll likely learn a lot that's already going into development of 2nd generation Xe architecture.
 
The Xe IGPs have proved that the architecture is sound and the drivers are fine. No, they're not at AMD/Nvidia levels yet but like you say - this is their first stab at it and it's not looking like a total disaster at least. They just have to compete and if that means adjusting price and performance expectations, so be it. I don't think anyone at Intel expected it to be easy, or a guaranteed success and they'll likely learn a lot that's already going into development of 2nd generation Xe architecture.
First stab at what? Intel, while not selling discrete GPUs, is the largest GPU maker in the world by the number of units (IGPs) sold. They have also had the i740, which was decent, and Larrabee (which sucked balls). The know-how is there; they just didn't feel like selling discrete GPUs for a while.

That said, I'm not expecting (or caring about) a 3090 killer, I just want another alternative in the $200-300 range. I'm still hoping Intel is on track to do that.
 
Intel programmers building drivers for the iGPUs were probably having a very easy time in the past. X game wasn't working? Who cares? Y setting in Z game produces artifacts? Just disable it.

Now they have to REALLY work, fix all those bugs, and check every possible setting in every game that works with the new Arc series. It was expected that they had a huge task at hand, but failing to start a game like Shadow of the Tomb Raider in DirectX 12 mode is something I wasn't expecting to read. I think people who respect themselves, even the biggest Intel fans, will avoid Arc - except if they only care about the media engine, which is probably really good.
 
Intel's modern version of alchemy... selling trash at high prices... seems Raja discovered how to make gold without using lead :laugh:
 
I see this Arc line not as a direct competitor to anything NVIDIA in gaming (that'd be suicide for Intel right now), but more of a competitor to Apple's M2 line... well, I guess that's even double suicide. :nutkick:
 
I hope they sell enough cards to be encouraged to keep working on this and bring an even more competitive product. Intel hater or fanboy, either way this can't be a bad thing.
 
I'm shocked, truly shocked o_O
 
I hope they sell enough cards to be encouraged to keep working on this and bring an even more competitive product. Intel hater or fanboy, either way this can't be a bad thing.
Intel got back into the GPU game out of necessity. GPUs have become extremely powerful co-processors, and in the future GPUs could be the clear majority in big supercomputers, with CPUs being fewer in number and reserved for simpler tasks. Also, with AMD having high-end GPUs AND CPUs, and with the possibility (until recently) of NVIDIA buying Arm and building a competing gaming platform on the Arm architecture, the Intel platform could have been left with NO high-end graphics cards. AMD and NVIDIA could just stop making graphics cards compatible with x86 Intel CPUs, or keep the best GPU models for their own platforms.
 
First stab at what? Intel, while not selling discrete GPUs, is the largest GPU maker in the world by the number of units (IGPs) sold. They have also had the i740, which was decent, and Larrabee (which sucked balls). The know-how is there; they just didn't feel like selling discrete GPUs for a while.

That said, I'm not expecting (or caring about) a 3090 killer, I just want another alternative in the $200-300 range. I'm still hoping Intel is on track to do that.
First stab at a larger, discrete GPU. In other words, scaling their product up to larger EU counts, and designing a dedicated bus/interconnect and GDDR6 controller that allow scaling beyond the IGPs - which just parasitically use the existing CPU interconnect and memory access, and don't require dedicated architecture for that by themselves. The basic architecture and drivers of these Arc dGPUs aren't new, but the actual product is more than just the architecture and driver.

I argued that the Xe architecture and drivers have already had two years to mature in an earlier post so I'm well aware that Intel aren't making their first GPU ever.
 
Intel got back into the GPU game out of necessity. GPUs have become extremely powerful co-processors, and in the future GPUs could be the clear majority in big supercomputers, with CPUs being fewer in number and reserved for simpler tasks. Also, with AMD having high-end GPUs AND CPUs, and with the possibility (until recently) of NVIDIA buying Arm and building a competing gaming platform on the Arm architecture, the Intel platform could have been left with NO high-end graphics cards. AMD and NVIDIA could just stop making graphics cards compatible with x86 Intel CPUs, or keep the best GPU models for their own platforms.


But when you are joining this late, you will have to spend a decade losing money before you can catch up. AMD's painful transition to GCN compute cards has already happened, so now there are two established brands to compete with!

Also, prior to Ethereum, discrete GPU sales were falling every year. And given how cyclical crypto can be, I wouldn't expect these cards to be all that desirable to consumers (they will pick up discounted 3050s after Ada releases).

(Chart: JPR historical desktop GPU shipments - jpr_desktop_gpu_historical.png)

There's not enough sustained growth from crypto for a third player that's so far behind!
 
But when you are joining this late, you will have to spend a decade losing money before you can catch up. AMD's painful transition to GCN compute cards has already happened, so now there are two established brands to compete with!

Also, prior to Ethereum, discrete GPU sales were falling every year. And given how cyclical crypto can be, I wouldn't expect these cards to be all that desirable to consumers (they will pick up discounted 3050s after Ada releases).

(Chart: JPR historical desktop GPU shipments - jpr_desktop_gpu_historical.png)

There's not enough sustained growth from crypto for a third player that's so far behind!
I'm sure those numbers are unknown to Intel and they simply jumped in headfirst :wtf:
 
But when you are joining this late, you will have to spend a decade losing money before you can catch up. AMD's painful transition to GCN compute cards has already happened, so now there are two established brands to compete with!

Also, prior to Ethereum, discrete GPU sales were falling every year. And given how cyclical crypto can be, I wouldn't expect these cards to be all that desirable to consumers (they will pick up discounted 3050s after Ada releases).

(Chart: JPR historical desktop GPU shipments - jpr_desktop_gpu_historical.png)

There's not enough sustained growth from crypto for a third player that's so far behind!

If Intel is determined to claim a large share of the discrete GPU market, then it won't matter if they lose some money along the way. Their 2021 financial statement shows:

Revenue: 79 billion USD
Profit: 19.9 billion USD
Cash on hand and equivalents: around 30 billion USD

It remains to be seen how serious they are about their commitment, but they certainly have the means to bring some financial pain to NVIDIA and AMD.
 
Intel got back into the GPU game out of necessity. GPUs have become extremely powerful co-processors, and in the future GPUs could be the clear majority in big supercomputers, with CPUs being fewer in number and reserved for simpler tasks. Also, with AMD having high-end GPUs AND CPUs, and with the possibility (until recently) of NVIDIA buying Arm and building a competing gaming platform on the Arm architecture, the Intel platform could have been left with NO high-end graphics cards. AMD and NVIDIA could just stop making graphics cards compatible with x86 Intel CPUs, or keep the best GPU models for their own platforms.
Hi,
Think it was more that Intel was missing out on all the easy miner money than anything else.
 