
Intel Arc A750 Trades Blows with GeForce RTX 3060 in 50 Games

btarunr

Editor & Senior Moderator
Intel earlier this week released its own performance numbers for as many as 50 benchmarks spanning the DirectX 12 and Vulkan APIs. From our testing, the Arc A380 performs below par against its rivals in games based on the DirectX 11 API. Intel tested the A750 at 1080p and 1440p, and compared its performance numbers with the NVIDIA GeForce RTX 3060. Broadly, the testing reveals the A750 to be 3% faster than the RTX 3060 in DirectX 12 titles at 1080p; about 5% faster at 1440p; about 4% faster in Vulkan titles at 1080p; and about 5% faster at 1440p.

All testing was done without ray tracing, and performance enhancements such as XeSS or DLSS weren't used. The small set of six Vulkan API titles shows a more consistent performance lead for the A750 over the RTX 3060, whereas the DirectX 12 API titles see the two trade blows, with results varying widely among game engines. In "Dolmen," for example, the RTX 3060 scores 347 FPS compared to the Arc's 263. In "Resident Evil VIII," the Arc scores 160 FPS compared to the GeForce's 133 FPS. Such variations among the titles pull the average in favor of the Intel card. Intel stated that the A750 is on course to launch "later this year," without being any more specific than that. The individual test results can be seen below.
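As a quick aside on how headline figures like "3% faster on average" come out of per-game results: you average the per-title FPS *ratios*, not the raw FPS, so high-framerate titles don't dominate. A minimal sketch (my own illustration, not Intel's published methodology), using only the two FPS pairs quoted above:

```python
# Sketch of per-game relative-performance averaging (illustrative only).
from statistics import geometric_mean

# (A750 FPS, RTX 3060 FPS) for the two titles quoted in the article;
# the remaining results from the 50-game set are omitted here.
results = {
    "Dolmen": (263, 347),              # RTX 3060 ahead
    "Resident Evil VIII": (160, 133),  # Arc A750 ahead
}

# Per-title performance ratio; values above 1.0 favor the A750.
ratios = [a750 / rtx3060 for a750, rtx3060 in results.values()]

# Geometric mean is the standard way to average performance ratios.
relative = geometric_mean(ratios)

print(f"A750 relative to RTX 3060 (these two titles only): {relative:.3f}x")
```

Applied over the full 50-game set, this kind of ratio averaging is what produces the 3-5% leads Intel reports, even though individual titles swing much further in either direction.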



The testing notes and configuration follow.


View at TechPowerUp Main Site | Source
 
It would seem this one has a performance level a little below the RX 6600 XT.
 
No matter what our opinion of the Arc series is, it's good to see Intel trying hard to convince us of Arc's value.
We need Intel, and as long as we see them trying, we can at least safely assume they are not throwing in the towel.
 
We've already seen numbers, I want the product now.
 
You really want to buy it? I wonder at what price.
Yes. Though not because it's a good idea, but because I'm curious. :p
 
We've already seen numbers, I want the product now.

Be sure to take us along for your ride; I find this product interesting, but I have enough issues to figure out as is :P
Also, as of yet, DX11 performance is apparently bad, and seeing as one of the games I play the most is DX11, this card does not seem like it's for me atm.
 
"A750 is on-course to launch "later this year"...

Nothing regarding Intel Arc is "on course." It has been postponed for almost two years now, and it looks like it will directly compete with the late 2022 / early 2023 generation from Nvidia and AMD.

And it doesn't seem to be a price-lowering force either. We'll get the price/performance equivalent of 2020 Nvidia and AMD cards, but with all the driver problems, non-working features, bugs... You'll pay more for the privilege of helping beta test the Intel product!
 
Be sure to take us along for your ride; I find this product interesting, but I have enough issues to figure out as is :P
Also, as of yet, DX11 performance is apparently bad, and seeing as one of the games I play the most is DX11, this card does not seem like it's for me atm.
Fair enough. :) I think even if DX11 performance is behind the competition, it's still gonna be enough for my needs. I had a 5700 XT (with the weak cooler mount from Asus), and now I have both a 6500 XT and a 6400, so it can't be that bad. :D

If I actually manage to buy one at a fairly OK price, I'll post my experiences, for sure. :)
 
Frankly, Intel should stop releasing paper GPU results. It would have been exciting to see this sort of result at the start of the year, or even the end of last year. Given that we are in Q3 and all we get is more product teasers and results, with no products, the product will be DOA. Not that nobody will buy it, but sales will be terrible, and/or they will have to sell it very cheap.
 
How it sounds to me...



These PR stunts don't work anymore.
Don't speak like that. Otherwise Intel will get upset and leave us.
 
and it looks like it will directly compete with the late 2022 / early 2023 generation from Nvidia and AMD.
Intel is probably lucky that AMD and Nvidia will probably not offer a cheap next-gen product early. Nvidia will start with the 4080 and up. Even if they throw a 4070 into the market, it's not going to be cheap. They lowered their profit margin from about 65% to 45% and started offering GPUs to their partners at much lower prices. Nvidia needs money to go forward, and with by far the strongest brand and 80% of the market, knowing that Intel is far behind and that AMD needs to keep focusing on EPYC because of capacity restrictions, they will not price low. I am expecting Nvidia to start with products costing over $800 in the first 4-6 months (just a random prediction to give an idea) and AMD to also start at higher-than-expected prices, like $600 for their cheapest RX 7000 option. So Intel will have plenty of time and room to play under $500. Fun fact: Raja is reliving his Polaris times. He just doesn't have a Vega equivalent yet.
 
Be sure to take us along for your ride; I find this product interesting, but I have enough issues to figure out as is :P
Also, as of yet, DX11 performance is apparently bad, and seeing as one of the games I play the most is DX11, this card does not seem like it's for me atm.
What do you PLAY, my fellow forum mate?

Intel is probably lucky that AMD and Nvidia will probably not offer a cheap next-gen product early. Nvidia will start with the 4080 and up. Even if they throw a 4070 into the market, it's not going to be cheap. They lowered their profit margin from about 65% to 45% and started offering GPUs to their partners at much lower prices. Nvidia needs money to go forward, and with by far the strongest brand and 80% of the market, knowing that Intel is far behind and that AMD needs to keep focusing on EPYC because of capacity restrictions, they will not price low. I am expecting Nvidia to start with products costing over $800 in the first 4-6 months (just a random prediction to give an idea) and AMD to also start at higher-than-expected prices, like $600 for their cheapest RX 7000 option. So Intel will have plenty of time and room to play under $500. Fun fact: Raja is reliving his Polaris times. He just doesn't have a Vega equivalent yet.
If AMD and Nvidia cards cost as much as you say, they're DOA. Of course they'll work and people will still buy them, but that will be a big WASTE of your money.

Not to mention that in reality, 90% or an even higher percentage of people are still using full HD monitors. For full HD resolution, an RTX 20xx is already overkill, not to mention the RTX 30xx and AMD RX 6xxx.

What I don't like is this: we get newer and stronger GPUs every 1 or 2 years after the previous release, which is awesome of course, but there's a big but... nowadays game makers spit on optimization and just hope that raw GPU power will be enough to handle the game.
 
If AMD and Nvidia cards cost as much as you say, they're DOA. Of course they'll work and people will still buy them, but that will be a big WASTE of your money.
We are expecting significant performance upgrades from the next gen. Nvidia will be offering twice the current speeds? Something like that? AMD might gain even more in RT in an effort to close the gap with Nvidia. So, consider a 3090 Ti at $800 and a 4070 at $900. The 3090 Ti will have the VRAM capacity advantage and will be cheaper, with probably better performance in some titles at 4K; the 4070 will be the new card with advantages in about everything else. Both will sell. Higher performance than a 3090 Ti at $900? It will sell. Easily. And it will not be DOA, because Nvidia also enjoys positive coverage from the press. So the press will end its conclusions with lines like "Best buy. Just buy it." I know people are hoping for a $500-$600 RTX 4070, but I believe that is wishful thinking at its maximum. Nvidia needs money and also needs profit margins. They can't start selling cheap GPUs now. To go where? From 80% market share to 85%? It doesn't make sense. They will sell expensive new RTX 4000 cards to give their partners more time to sell RTX 2000 and RTX 3000 cards, and also try to recover their profit margins, even if that means losing 5-10% market share.
 
I am expecting Nvidia to start with products costing over $800 in the first 4-6 months (just a random prediction to give an idea) and AMD to also start at higher-than-expected prices, like $600 for their cheapest RX 7000 option. So Intel will have plenty of time and room to play under $500.


Yeah, that's a given.

But even if we just get the RTX 4070 and 4080 from Nvidia, it will still cause a price adjustment for most of the range. Even with a very expensive RTX 4070 - if it's really the equivalent of the RTX 3090 Ti, it could be offered at $1,100 (with no price/performance increase; we saw that at the RTX 20x0 launch). But it will probably be well below that - at least theoretically; they can still claim various difficulties later and raise the price.
 
Not to mention that in reality, 90% or an even higher percentage of people are still using full HD monitors. For full HD resolution, an RTX 20xx is already overkill, not to mention the RTX 30xx and AMD RX 6xxx.

What I don't like is this: we get newer and stronger GPUs every 1 or 2 years after the previous release, which is awesome of course, but there's a big but... nowadays game makers spit on optimization and just hope that raw GPU power will be enough to handle the game.
I don't think resolution is so much of a parameter today. If it were, anything over $700 MSRP would be sitting on shelves collecting dust.
Optimization was always a problem; it just seems bigger today because games are huge, developers try to ship them as fast as possible, and frankly a 10+ core CPU and modern GPUs are a huge carpet to hide any performance problem under. Also, an unoptimized game will sell more CPUs and GPUs than an optimized one, meaning not only can you ship it faster, you can also get nice sponsor money from Nvidia, AMD and Intel by partially optimizing for their architecture instead of everyone's.
Yeah, that's a given.

But even if we just get the RTX 4070 and 4080 from Nvidia, it will still cause a price adjustment for most of the range. Even with a very expensive RTX 4070 - if it's really the equivalent of the RTX 3090 Ti, it could be offered at $1,100 (with no price/performance increase; we saw that at the RTX 20x0 launch). But it will probably be well below that - at least theoretically; they can still claim various difficulties later and raise the price.
I was reading all over the internet about the 4070 being a $500-$600 card, so for some people it's not a given. They're probably just trying to justify waiting two years for a brand-new GPU. Don't know.
But I don't expect it to be over $1,000. The RTX 2000's pricing was a result of the lack of competition and ray tracing marketing. The RTX 4070 is not the top model, and there is competition. But who knows. Someday we will definitely get an x070 for over $1,000 anyway. It might be now.
 
I hope they don't try that shit where you have to buy a K/unlocked version down the road to overclock your GPU.
 
Are they in the GPU-selling business or the leaked-news business? If it's the first, I see no GPUs; if it's the second, their stock is going to skyrocket.
 
"limited edition"..

never seen one yet in store, but already limited .. wth
 
Curious if Steve gets his hands on one, just to see what numbers and other stuff come out.
 
The whole GPU division of Intel has an estimated loss of $3.5 billion. There is a chance that the whole GPU division will be sold off.


Intel can put benchmarks out of context all they want. Generally, they perform worse at considerably higher power consumption compared to Nvidia or AMD. And both camps are about to release their next gen, which would make Intel look like low-low-end.
 