Thursday, August 11th 2022

Intel Arc A750 Trades Blows with GeForce RTX 3060 in 50 Games

Intel earlier this week released its own performance numbers for as many as 50 benchmarks spanning the DirectX 12 and Vulkan APIs. From our own testing, the Arc A380 performs below par against its rivals in games based on the DirectX 11 API. Intel tested the A750 at 1080p and 1440p, and compared its performance numbers with the NVIDIA GeForce RTX 3060. Broadly, the testing reveals the A750 to be about 3% faster than the RTX 3060 in DirectX 12 titles at 1080p; about 5% faster at 1440p; about 4% faster in Vulkan titles at 1080p; and about 5% faster at 1440p.

All testing was done without ray tracing, and performance enhancements such as XeSS or DLSS weren't used. The small set of six Vulkan API titles shows a more consistent performance lead for the A750 over the RTX 3060, whereas the DirectX 12 titles see the two trade blows, with results varying widely among game engines. In "Dolmen," for example, the RTX 3060 scores 347 FPS compared to the Arc's 263. In "Resident Evil VIII," the Arc scores 160 FPS compared to the GeForce's 133 FPS. Such variations among the titles pull the overall average in favor of the Intel card. Intel stated that the A750 is on course to launch "later this year," without being any more specific than that. The individual test results can be seen below.
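Intel's slides list only per-game FPS figures and the headline percentages; the company doesn't spell out how the per-title results are averaged. As a rough, hypothetical illustration of how such an aggregate figure can be derived (not Intel's disclosed methodology), the sketch below aggregates per-game FPS ratios with a geometric mean, which keeps a single lopsided title such as "Dolmen" from skewing the result the way a plain arithmetic mean of ratios can:

```python
from math import prod

# Hypothetical per-game results (FPS). Only the Dolmen and Resident Evil VIII
# figures appear in Intel's slides; the other 48 titles would be filled in
# from the full result set.
results = {
    "Dolmen":             {"a750": 263, "rtx3060": 347},
    "Resident Evil VIII": {"a750": 160, "rtx3060": 133},
    # ... remaining titles from the 50-game test set
}

# Relative performance of the A750 versus the RTX 3060 in each title.
ratios = [fps["a750"] / fps["rtx3060"] for fps in results.values()]

# The geometric mean is less sensitive to one lopsided title than the
# arithmetic mean of the same ratios.
geo_mean = prod(ratios) ** (1 / len(ratios))
arith_mean = sum(ratios) / len(ratios)

print(f"geometric mean:  {geo_mean:.3f}x")   # ~0.955x with just these two games
print(f"arithmetic mean: {arith_mean:.3f}x")  # ~0.980x with just these two games
```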
The testing notes and configuration follow.

Source: Intel Graphics

85 Comments on Intel Arc A750 Trades Blows with GeForce RTX 3060 in 50 Games

#76
HenrySomeone
john_: Intel is probably lucky that AMD and Nvidia will probably not offer a cheap next-gen product early. Nvidia will start with the 4080 and up. Even if they throw a 4070 into the market, it's not going to be cheap. They lowered their profit margin from about 65% to 45% and started offering GPUs to their partners at much lower prices. Nvidia needs money to move forward, and with by far the strongest brand, 80% of the market, and the knowledge that Intel is far behind and that AMD needs to keep focusing on EPYC because of capacity restrictions, they will not price low. I am expecting Nvidia to start with products costing over $800 in the first 4-6 months (just a random prediction to give an idea) and AMD to also start at higher than expected prices, like $600 for their cheapest RX 7000 option. So Intel will have plenty of time and room to play under $500. Fun fact: Raja is reliving his Polaris times. He just doesn't have a Vega equivalent yet.
Came here to say exactly this in reply to the same comment, and then saw yours. While it is true that the Arc release time frame is far from optimal, it still isn't a lost cause, precisely for the reasons you've stated: since mid-range GPUs probably still won't see replacements for well over half a year, Intel won't be too late as long as they manage to get theirs out in time for the holiday season.
#77
chstamos
Insane scenario time, just shooting the breeze here. Maybe intel should swap a precious x86 CPU license with nvidia for a license to all of nvidia's current graphics patents and technologies, and move from there.

I mean, talking about intel competency in graphics seems to be a wild alternate history fantastical scenario already, so why not imagine wild shit up.

I know it won't ever happen, in a million years. But were it to be proffered, would nvidia bite?
#78
john_
chstamos: Insane scenario time, just shooting the breeze here. Maybe intel should swap a precious x86 CPU license with nvidia for a license to all of nvidia's current graphics patents and technologies, and move from there.

I mean, talking about intel competency in graphics seems to be a wild alternate history fantastical scenario already, so why not imagine wild shit up.

I know it won't ever happen, in a million years. But were it to be proffered, would nvidia bite?
Intel enjoys strong and deep relationships with the big OEMs. To put it differently, if Intel stopped producing x86 CPUs tomorrow, people at Dell, HP and other companies would be panicking more than if they were living through a magnitude 7 earthquake. So Intel isn't in a rush to sell Arc GPUs. Well, they are, but it's not that they can't sell them. If they manage to fix their drivers and applications, and there are no critical errors in the Arc hardware, they can sell the 4 million units they were rumored to be building to those OEMs. Dell, for example, could probably buy a million Arc GPUs in a single order, even at below cost, as part of a package for (let's say) 5 million Intel CPUs and chipsets.

As for Nvidia getting an x86 license, they tried about 15 years ago I think, and Intel knew better than to give them one. Intel could be facing another, more dangerous AMD today if they had given Nvidia that license. And Nvidia probably doesn't want it anyway. ARM has closed the gap with x86, and today GPUs are the heavy lifters in compute, so there isn't really a huge need for an ultra-fast CPU. Intel getting patents from Nvidia? Nvidia wouldn't be selling anyway. Nvidia has been trying for over a decade to create proprietary standards; sharing is not exactly part of their character as a company. Not to mention that Intel is the reason Nvidia had to close its chipset division 10+ years ago.
In the end, sharing patents is a risk for both of them. A better x86 CPU from Nvidia could mean huge trouble for Intel. A good-enough GPU series from Intel could mean the end of Nvidia as the huge company we know today.
#79
Totally
chstamos: Insane scenario time, just shooting the breeze here. Maybe intel should swap a precious x86 CPU license with nvidia for a license to all of nvidia's current graphics patents and technologies, and move from there.

I mean, talking about intel competency in graphics seems to be a wild alternate history fantastical scenario already, so why not imagine wild shit up.

I know it won't ever happen, in a million years. But were it to be proffered, would nvidia bite?
Not a chance. Whatever it is Intel wants (and cash ain't it) is something they didn't want to pay or give up, otherwise they wouldn't have attempted so many roundabout ways to acquire an x86 license several years ago.
#80
r9
From the transistor count and die size I was hoping for Arc 750 = 3060 Ti and Arc 770 = 6800 non-XT, but it looks like the 770 will be at 3060 Ti performance.
So it will be hard to undercut the competition when it costs more to make the card.
However, none of that is really surprising; hopefully they'll stick with it and become a third player in the GPU space, as NVIDIA and AMD weren't price-competing even before the GPU shortages.
#81
80251
john_: Intel enjoys strong and deep relationships with the big OEMs. To put it differently, if Intel stopped producing x86 CPUs tomorrow, people at Dell, HP and other companies would be panicking more than if they were living through a magnitude 7 earthquake. So Intel isn't in a rush to sell Arc GPUs. Well, they are, but it's not that they can't sell them. If they manage to fix their drivers and applications, and there are no critical errors in the Arc hardware, they can sell the 4 million units they were rumored to be building to those OEMs. Dell, for example, could probably buy a million Arc GPUs in a single order, even at below cost, as part of a package for (let's say) 5 million Intel CPUs and chipsets.
I thought OEMs didn't really buy discrete GPUs in quantity, but rather systems with iGPUs?
#82
r9
80251: I thought OEMs didn't really buy discrete GPUs in quantity, but rather systems with iGPUs?
Can't buy something that doesn't exist yet ;)
#83
gasolina
I think this Arc A750 is going to be at least $500, since the A380 is around $200-250 on Taobao and other Chinese sites, and rumours put this A750 at $700. It's a great product only if we feel like doing charity for corporations. I believe they should just let the Intel GPU division die with Raja, since this dude can only cook curry, not make even a below-average GPU in the next 20 years.
#84
r9
chstamos: Insane scenario time, just shooting the breeze here. Maybe intel should swap a precious x86 CPU license with nvidia for a license to all of nvidia's current graphics patents and technologies, and move from there.

I mean, talking about intel competency in graphics seems to be a wild alternate history fantastical scenario already, so why not imagine wild shit up.

I know it won't ever happen, in a million years. But were it to be proffered, would nvidia bite?
I don't see why they would allow any other player into the x86 space when they can make GPUs without any help from anyone else.
Also, we need to move away from archaic x86 CPUs anyway; it's like a damn anchor for handhelds and laptops.
#85
RandallFlagg
Well, it seems we now know why all the delays with Arc occurred.

One was due to a factory shutdown from COVID in 2021. The second big delay, in 2022, came about because a big chunk of their engineering team was in... Russia.
That holiday release was delayed 4-6 weeks because the factory making the boards was hit by Covid and things obviously slowed down or stopped. SemiAccurate has confirmed this issue. If you are going to launch for holiday sales and you get delayed, it is probably a better idea to time it with the next obvious sales uplift than launch it between, oh say Christmas and New Years Day. So that pushed DG2/AA into mid/late Q1. Fair enough.
...
So why the delay and the mess? Intel is usually pretty good at drivers but this time around things are quite uncharacteristic. Intel offered a few reasons for this on their Q2/22 analyst call which boiled down to, ‘this is harder than we thought’ but that isn’t actually the reason. If that was it, the SemiAccurate blamethrower would have been used and refueled several times already so what really caused this mess?

The short version is to look where the drivers are being developed. In this case Intel is literally developing the DG2 drivers all over the world as they do for many things, hardware and software. The problem this time is that key parts of the drivers for this GPU, specifically the shader compiler and related key performance pieces, were being done by the team in Russia. On February 24, Russia invaded Ukraine and the west put some rather stiff sanctions on the aggressor and essentially cut off the ability to do business in the country.
www.semiaccurate.com/2022/09/02/why-is-intels-gpu-program-having-problems/