Thursday, August 11th 2022

Intel Arc A750 Trades Blows with GeForce RTX 3060 in 50 Games

Intel earlier this week released its own performance numbers for as many as 50 benchmarks spanning the DirectX 12 and Vulkan APIs. From our testing, the Arc A380 performs below par against its rivals in games based on the DirectX 11 API. Intel tested the A750 at 1080p and 1440p, and compared its numbers against the NVIDIA GeForce RTX 3060. Broadly, the testing shows the A750 to be about 3% faster than the RTX 3060 in DirectX 12 titles at 1080p, about 5% faster at 1440p, about 4% faster in Vulkan titles at 1080p, and about 5% faster at 1440p.

All testing was done without ray tracing, and performance enhancements such as XeSS or DLSS weren't used. The small set of six Vulkan titles shows a more consistent performance lead for the A750 over the RTX 3060, whereas in the DirectX 12 titles the two trade blows, with results varying widely among game engines. In "Dolmen," for example, the RTX 3060 scores 347 FPS compared to the Arc's 263. In "Resident Evil VIII," the Arc scores 160 FPS to the GeForce's 133 FPS. Such variations among the titles pull the average up in favor of the Intel card. Intel stated that the A750 is on course to launch "later this year," without being any more specific than that. The individual test results can be seen below.
The testing notes and configuration follow.
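For context on how 50 individual results get rolled up into a single "about 3-5% faster" figure, here is a minimal Python sketch. It assumes a geometric mean of per-title FPS ratios, which Intel has not confirmed as its aggregation method, and it uses only the two FPS pairs quoted above; the remaining titles from Intel's slides would be added the same way.

```python
from math import prod

# A minimal sketch (not Intel's disclosed methodology) of turning per-title FPS
# into a single relative index. The geometric mean of per-game ratios is assumed
# here purely for illustration; only the two FPS pairs quoted above are used.
results = {
    # title: (Arc A750 FPS, GeForce RTX 3060 FPS)
    "Dolmen": (263, 347),
    "Resident Evil VIII": (160, 133),
}

ratios = [a750 / rtx for a750, rtx in results.values()]
index = prod(ratios) ** (1 / len(ratios))  # geometric mean of per-title ratios
print(f"A750 relative to RTX 3060: {index:.3f} ({(index - 1) * 100:+.1f}%)")
```

A geometric mean is used in the sketch so that one lopsided title (such as the Dolmen result) cannot single-handedly swing the average.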

Source: Intel Graphics

85 Comments on Intel Arc A750 Trades Blows with GeForce RTX 3060 in 50 Games

#1
ratirt
It would seem this one has a performance level a little below the 6600 XT.
Posted on Reply
#2
john_
No matter what our opinion of the ARC series is, it's good to see Intel trying hard to convince us of ARC's value.
We need Intel, and as long as we see them trying, we can at least safely assume they are not throwing in the towel.
Posted on Reply
#3
AusWolf
We've already seen numbers, I want the product now.
Posted on Reply
#4
ratirt
AusWolfWe've already seen numbers, I want the product now.
You really want to buy it? I wonder for what price.
Posted on Reply
#5
AusWolf
ratirtYou really want to buy it? I wonder for what price.
Yes. Though not because it's a good idea, but because I'm curious. :p
Posted on Reply
#6
ZoneDymo
AusWolfWe've already seen numbers, I want the product now.
Be sure to take us along for the ride; I find this product interesting, but I have enough issues to figure out as is :P
Also, as of yet DX11 performance is apparently bad, and seeing as one of the games I play the most is DX11, this card does not seem like it's for me atm.
Posted on Reply
#7
Bwaze
"A750 is on-course to launch "later this year"...

Nothing regarding Intel ARC is "on-course". It has been postponed for almost two years now, and it looks like it will directly compete with the late 2022 / early 2023 generation from Nvidia and AMD.

And it doesn't seem to be a price-lowering force either. We'll get the price / performance equivalent of 2020 Nvidia and AMD cards, but with all the driver problems, non-working features, bugs... You'll pay more for the privilege of helping beta test the Intel product!
Posted on Reply
#8
Crackong
These PR things don't work anymore.
Posted on Reply
#9
AusWolf
ZoneDymoBe sure to take us along for your ride, I find this product interesting but I have enough issues to figure out as is :p
Also as of yet DX11 performance is apparently bad and seeing as one of the games I play the most is DX11, this card does not seem like its for me atm.
Fair enough. :) I think even if DX11 performance is behind the competition, it's still gonna be enough for my needs. I had a 5700 XT (with the weak cooler mount from Asus), and now I have both a 6500 XT and a 6400, so it can't be that bad. :D

If I actually manage to buy one at a fairly OK price, I'll post my experiences, for sure. :)
Posted on Reply
#10
watzupken
Frankly, Intel should stop releasing paper GPU results. It would have been exciting to see this sort of result at the start of the year, or even at the end of last year. Given that we are in Q3 and all we get is more product teasers and results, with no products, the product will be DOA. Not that nobody will buy it, but sales will be terrible, and/or they will have to sell it very cheap.
Posted on Reply
#11
ixi
How it sounds to me...

CrackongThese PR things doesn't work anymore.
Don't speak like that, or Intel will get upset and leave us.
Posted on Reply
#12
john_
Bwazeand it looks like it will directly compete with late 2022 / early 2023 generation of Nvidia, AMD.
Intel is probably lucky that AMD and Nvidia will probably not offer a cheap next-gen product early. Nvidia will start with the 4080 and up. Even if they throw a 4070 into the market, it's not going to be cheap. They lowered their profit margin from about 65% to 45% and started offering GPUs to their partners at much lower prices. Nvidia needs money to go forward, and with the strongest brand by far, 80% of the market, the knowledge that Intel is far behind, and AMD needing to keep focusing on EPYC because of capacity restrictions, they will not price low. I am expecting Nvidia to start with products costing over $800 in the first 4-6 months (just a random prediction to give an idea), and AMD to also start at higher-than-expected prices, like $600 for their cheapest RX 7000 option. So Intel will have plenty of time and room to play under $500. Fun fact: Raja is reliving his Polaris times. He just doesn't have a Vega equivalent yet.
Posted on Reply
#13
ixi
ZoneDymoBe sure to take us along for your ride, I find this product interesting but I have enough issues to figure out as is :p
Also as of yet DX11 performance is apparently bad and seeing as one of the games I play the most is DX11, this card does not seem like its for me atm.
What do you PLAY, my fellow forum mate?
john_Intel is probably lucky that AMD and Nvidia will probably not offer a cheap next gen product early. Nvidia will start with 4080 and up. Even if they throw 4070 in the market, it's not going to be cheap. They lowered their profit margin from about 65% to 45% and started offering GPUs to their partners at much lower prices. Nvidia needs money to go forward and having the strongest by far brand and 80% of the market, also knowing that Intel is far behind and AMD's need to keep focusing on EPYC because of capacity restrictions, they will not price low. I am expecting Nvidia to start with products costing over $800 in the 4-6 first months (just a random prediction to give an idea) and AMD to also start at higher than expected prices, like $600 for their cheaper RX 7000 option. So Intel will have plenty of time and room to play under $500. Fun fact, Raja is reliving his Polaris times. He just doesn't have a Vega equivalent yet.
If AMD and Nvidia cards cost as much as you say, they're DOA. Of course they'll work and people will still buy them, but that will be a big WASTE of ya money.

Not to mention that in reality, 90% or an even higher percentage of people are still using full HD monitors. For full HD resolution, the RTX 20xx series is already overkill, not to mention the RTX 30xx and AMD RX 6xxx.

What I don't like is this: we get a newer and stronger GPU 1 or 2 years after the previous release, which is awesome of course, but there is a big but... nowadays game makers spit on optimization and just hope that raw GPU power will be enough to handle the game.
Posted on Reply
#14
john_
ixiIf AMD and Nvidia prices will cost as much as you say. DOA, of course they gonna work and people still gonna buy, but that will be a big WASTE of ya money.
We are expecting significant performance upgrades from the next gen. Nvidia will be offering twice the current speeds? Something like that? AMD might gain even more in RT in an effort to close the gap with Nvidia. So, consider a 3090 Ti at $800 and a 4070 at $900. The 3090 Ti will have the VRAM capacity advantage and will be cheaper, probably with better performance in some titles at 4K; the 4070 will be the new card with advantages in about everything else. Both will sell. Higher performance than a 3090 Ti at $900? It will sell. Easily. And it will not be DOA, because Nvidia also enjoys positive coverage from the press. So the press will be writing conclusions like "Best buy. Just Buy It." I know people are hoping for a $500-$600 RTX 4070, but I believe that is wishful thinking at its maximum. Nvidia needs money and also needs profit margins. They can't start selling cheap GPUs now. To go where? From 80% market share to 85%? It doesn't make sense. They will sell expensive new RTX 4000 cards to give their partners more time to sell RTX 2000 and RTX 3000 cards, and also try to recover their profit margins, even if that means losing 5-10% market share.
Posted on Reply
#15
Bwaze
john_I am expecting Nvidia to start with products costing over $800 in the 4-6 first months (just a random prediction to give an idea) and AMD to also start at higher than expected prices, like $600 for their cheaper RX 7000 option. So Intel will have plenty of time and room to play under $500.
Yeah, that's a given.

But even if we just get the RTX 4070 and 4080 from Nvidia, it will still cause a price adjustment for most of the range. Even with a very expensive RTX 4070 - if it's really the equivalent of the RTX 3090 Ti, it could be offered at $1,100 (with no price / performance increase; we have seen that with the RTX 20x0 launch). But it will probably be well below that - at least theoretically; they can still claim various difficulties later and raise the price.
Posted on Reply
#16
ExcuseMeWtf
It will trade blows harder with its drivers :roll:
Posted on Reply
#17
john_
ixiNot to mention that in reality - 90% or even higher percentage of people are still using full hd monitors. For full HD resolution RTX 20xx is already overkill not to mention rtx 30xx and AMD rx 6xxx.

What I don't like is this. we get newer and stronger gpu in 1 of 2 years after previous release which is awesome of course, but there is big but...... nowadays game makers spits on optimization and just hope that GPU raw power will be alright to handle the game.
I don't think resolution is so much of a parameter today. If it was, anything over $700 MSRP would be staying on shelves collecting dust.
Optimization was always a problem; it just seems bigger today because games are huge, developers try to bring them to market as fast as possible, and frankly a 10+ core CPU and modern GPUs are huge carpets to hide any performance problem under. Also, an unoptimized game will sell more CPUs and GPUs than an optimized one, meaning not only can you market it faster, you can also get nice sponsor money from Nvidia, AMD and Intel by partially optimizing for their architecture instead of for everyone's.
BwazeYeah, that's a given.

But even if we just get RTX 4070 and 4080 from Nvidia, it will still cause a price adjustment for most of the range. Even with very expensive RTX 4070 - if it's really the equivalent of RTX 3090 Ti, it could be offered at $1100 (with no price / performance increase, we have seen that in RTX 20x0 launch). But it will be probably well below that - at least theoretically, they can still claim various difficulties later and raise the price.
I was reading all over the internet about the 4070 being a $500-$600 card. So for some people it's not a given. Probably they're just trying to justify waiting 2 years for a brand new GPU. Don't know.
But I don't expect it to be over $1,000. The RTX 2000's pricing was a result of the lack of competition and Ray Tracing marketing. The RTX 4070 is not the top model and there is competition. But who knows. Someday we will definitely get a x070 for over $1,000 anyway. It might be now.
Posted on Reply
#18
timta2
ixiHow it sounds to me...

Don't speak like that, or Intel will get upset and leave us.
That's gross and I would feel embarrassed if I had posted that.
Posted on Reply
#19
natr0n
I hope they don't try that shit where you have to buy a K/unlocked version down the road to overclock your GPU.
Posted on Reply
#20
Bomby569
Are they in the GPU-selling business or the leaking-news business? If it's the first, I see no GPUs; if it's the second, their stock is going to skyrocket.
Posted on Reply
#21
Bwaze
natr0nI hope they dont try that shit were you have to buy a k/unlocked version down the road to overclock your gpu.
Or the newest thing: a monthly fee for overclocking! If BMW can do it...
Posted on Reply
#22
Rahmat Sofyan
"limited edition"..

never seen one yet in store, but already limited .. wth
Posted on Reply
#23
Courier 6
Curious if Steve gets his hands on one, just to see what numbers and other stuff come out.
Posted on Reply
#24
Jism
Intel's whole GPU division has an estimated loss of $3.5 billion. There is a chance that the whole GPU division gets sold off.

www.tomshardware.com/news/intel-gpu-division-losses-estimated-at-3-5-billion-usd

Intel can put benchmarks out, or in context, all they want. Generally they perform worse, at notably higher power consumption, compared to Nvidia or AMD. And both camps are about to release their next gen, which would make Intel look like low-low-end.
Posted on Reply
#25
chstamos
What I don't get is how Apple could make a decent-performing GPU - integrated, at that - seemingly out of nowhere, while Intel has been developing this debacle since 2017 (check it out - that's when Xe discrete graphics was first announced, half a decade ago) and still ended up with... this clusterf__k.

I'm not trolling for Apple, honestly. I'd like some kind of explanation for that. Is it that the Apple iGPU is not required to support as many games, for example, considering the relative scarcity of gaming on Mac? What is it? I do get how difficult it is to develop a brand new architecture, so how did Apple do it?
Posted on Reply