Saturday, April 15th 2023

Intel Compares Arc A750 with RTX 3060 With Latest Driver Update

Intel has released a couple of new performance slides for the Arc A750, claiming better performance per dollar than the RTX 3060 with the latest driver update. Intel has been pushing hard to improve its Arc GPU drivers, fixing issues, adding Game On support, and delivering performance improvements. The latest Arc GPU Graphics Drivers 101.4311 beta update brought several Game On optimizations as well as performance uplifts, mostly focused on the Arc A750 and DirectX 12, ranging from 4 percent up to 63 percent depending on the game and the resolution.

Although the Arc A750's 8 GB of GDDR6 is less memory than the 12 GB of GDDR6 on the RTX 3060, bear in mind that the Arc A750 has a 256-bit memory interface compared to the 192-bit one on the RTX 3060 12 GB, leaving it with a higher maximum memory bandwidth of 512 GB/s. The Intel Arc A750 is also less expensive, retailing at $249, compared to the RTX 3060 12 GB, which sells at around $350.
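The bandwidth gap follows directly from bus width and memory speed. A minimal sketch of the arithmetic (the 16 Gbps and 15 Gbps per-pin GDDR6 data rates are the commonly published figures for these cards, assumed here rather than stated in the slides):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Arc A750: 256-bit bus, 16 Gbps GDDR6 (assumed data rate)
a750 = peak_bandwidth_gbs(256, 16.0)
# RTX 3060 12 GB: 192-bit bus, 15 Gbps GDDR6 (assumed data rate)
rtx_3060 = peak_bandwidth_gbs(192, 15.0)

print(a750, rtx_3060)  # 512.0 360.0
```

So despite the smaller frame buffer, the A750's wider bus gives it roughly 40 percent more raw bandwidth.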
According to Intel's own slides, the latest driver update brought decent performance improvements in several games, but more importantly, it is enough to give the Intel Arc A750 an edge, pushing it ahead of the RTX 3060 12 GB graphics card, at least in the Dead Space remake. Although Intel has listed a somewhat higher price for the RTX 3060 12 GB, the Arc A750 still offers higher performance per dollar, at least in some games. Of course, NVIDIA always has the RT and DLSS aces up its sleeve.
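Performance per dollar is simple division: average frame rate over street price. A sketch with hypothetical FPS numbers (only the rough prices, $249 versus ~$350, come from the article; the frame rates below are made up purely for illustration):

```python
def perf_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Frames per second delivered per dollar of purchase price."""
    return avg_fps / price_usd

# Hypothetical frame rates for illustration only; prices from the article.
a750 = perf_per_dollar(88.0, 249.0)
rtx_3060 = perf_per_dollar(82.0, 349.0)

# At these prices, the A750 can trail slightly in FPS and still win on value.
print(f"A750: {a750:.3f} fps/$, RTX 3060: {rtx_3060:.3f} fps/$")
```

This is why the price gap matters as much as the driver uplifts: the A750 doesn't need to outrun the RTX 3060 outright to lead the value metric.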

The Intel Arc A750 ranked pretty high on performance per dollar in our review back in October last year, even when it was priced at $290, and with the recent driver updates it is an even better choice.
Source: Intel

67 Comments on Intel Compares Arc A750 with RTX 3060 With Latest Driver Update

#51
Minus Infinity
Entry level is dead in discrete GPUs, folks. APUs will ultimately kill off crap like the 6500/6400 from AMD and Nvidia's 3050-class stuff. Phoenix won't do it, but Strix Point surely will be putting a lot of nails in those coffins with Zen 5, RDNA3+ and possibly V-Cache. When we get RDNA4 in iGPUs it's game over totally for that class of discrete cards.
#52
enb141
Chrispy_I don't have any first-hand experience with the RX 6400, but my understanding is that it and the 6500 would never have existed outside of the pandemic/ETH-mining scalpocalypse.

They were never designed to be dGPUs. They are laptop chips designed to complement APUs and are lacking a lot of things we'd normally expect from a graphics card's display engine, because all of those things would be redundant duplicates alongside an APU.

@evelynmarie probably picked a bad example there because the 6400 and 6500 are possibly the two most cut-down, incomplete, actually half-baked cards AMD have released to market in a very long time. The 6600 was supposed to be the bottom of the product stack, and that's a fantastic example of a cheap, do-everything GPU that works amazingly. I dumped one in my HTPC briefly in 2022 and it was unbelievably capable for its price and power draw, the drivers were absolutely flawless while I was testing it, and the damn thing was both tiny and quiet.
For HTPC it sucks, because there's no VRR, no 10-bit, and now you can't use hardware acceleration in Kodi.
evelynmarieThe 6400 literally supports VRR according to the specifications. So yes, you CAN enable VRR. You just have to do it from within Windows, not Adrenalin Software. You have to do it via Windows Settings -> System -> Display -> Graphics Settings -> Variable Refresh Rate, and toggle it to on.
There's no such VRR option or anything there.
evelynmarieAlso AMD is not crappy. I literally said this already. I had a GTX 1080 before I got the 6700 XT. My 1080 served me well, and my 6700 XT has served me well so far since I got it. I don't appreciate having my hardware insulted for no reason.
I'm insulting the 6400 because its drivers suck; for me it's a crap video card that doesn't do what it's supposed to do, which is VRR, 10-bit, and hardware acceleration in Kodi.
evelynmarieI didn't pick a bad example. AMD just relies on Windows native controls to toggle stuff on like Variable Refresh Rate, which the 6400 / 6500 both support. NVIDIA however likes to unnecessarily duplicate functionality into their outdated-looking control panel.
That option is not available anywhere; with Nvidia you can enable it in their control panel.
Chrispy_The 6500 and 6400 aren't awful cards, just bad examples of a typical AMD driver/HDMI/DP feature experience.
  • PCIe x4 instead of x16 causes problems on older machines, and the silicon was designed specifically for PCIe 4.0 in laptops.
  • Lacking encoder means parts of the Adrenalin suite are just missing entirely. That could be relevant IMO, as with things like FreeSync that causes issues: I've found AMD drivers expose some FreeSync options in one tab whilst being greyed out in another tab when using a non-FreeSync display. I've managed to "enable" FreeSync in the driver via game profiles on a non-FreeSync display, which breaks things and causes odd behaviour until a reboot.
  • Display engine limitations that are different to all previous-gen cards, and the rest of the entire 60-series line-up. If anything was going to trip up the driver dev team or slip through the QC net, that's a prime candidate.
Do you actually have a 6400 working flawlessly? If you do, I'll shut up - but I have had issues with HDMI compatibility with 5700-series cards that AMD couldn't solve and they recommended RMAing my card. The fact I was talking about 30 cards and had tested with a few monitors and two TVs made me realise it was a silicon/driver fault that wasn't going away. A quick search seems to show lots of issues with HDMI and particularly problems with HDMI audio. I'm completely guessing here but the fact there are reported issues with the 6400 and 6500 regarding VRR with TVs, HDMI audio, and colour depth makes me think that shoving a compute chip designed for laptops into a desktop dGPU in a hurry has left some things under-tested, unpatched, and possibly in a less-than-desirable state.

I mean, all that I've just said sounds like big problems with Radeon drivers, but I've had equally stupid issues with Nvidia connecting to TVs in the past. For years, their control panel screwed up refresh rates and colour range, frequently resetting to limited-HDMI (16-235 instead of 0-255) for both a home TV and a TV in the lobby at work.
Nvidia also has issues, but in comparison with AMD they are way, way smaller.
Minus InfinityEntry level is dead in discrete GPUs, folks. APUs will ultimately kill off crap like the 6500/6400 from AMD and Nvidia's 3050-class stuff. Phoenix won't do it, but Strix Point surely will be putting a lot of nails in those coffins with Zen 5, RDNA3+ and possibly V-Cache. When we get RDNA4 in iGPUs it's game over totally for that class of discrete cards.
The reason why entry level exists is Intel; their crappy integrated graphics suck, especially the old ones. Newer Intel integrated graphics don't suck like they did in 10th gen and below.

If Intel had developed good integrated graphics, then add-on video cards would be needed only for mid- to high-end gaming.
#53
MrMilli
enb141For HTPC it sucks, because there's no VRR, no 10-bit, and now you can't use hardware acceleration in Kodi.

There's no such VRR option or anything there.

I'm insulting the 6400 because its drivers suck; for me it's a crap video card that doesn't do what it's supposed to do, which is VRR, 10-bit, and hardware acceleration in Kodi.

That option is not available anywhere; with Nvidia you can enable it in their control panel.

Nvidia also has issues, but in comparison with AMD they are way, way smaller.

The reason why entry level exists is Intel; their crappy integrated graphics suck, especially the old ones. Newer Intel integrated graphics don't suck like they did in 10th gen and below.

If Intel had developed good integrated graphics, then add-on video cards would be needed only for mid- to high-end gaming.
VRR is in Adrenalin. See screenshot.
But as many have stated, the RX 6400/6500 were never meant to be stand-alone cards. You can't take those as representative of other Radeons. IMO the RX 6400/6500 should just be avoided.
Also, about Kodi: I've been using it for a good 5 years, and sometimes things just break in some updates, like the horrible v20 release. You can't blame Kodi problems on anyone but the Kodi devs.
#54
enb141
MrMilliVRR is in Adrenalin. See screenshot.
But as many have stated, the RX 6400/6500 were never meant to be stand-alone cards. You can't take those as representative of other Radeons. IMO the RX 6400/6500 should just be avoided.
Also, about Kodi: I've been using it for a good 5 years, and sometimes things just break in some updates, like the horrible v20 release. You can't blame Kodi problems on anyone but the Kodi devs.
VRR is in Adrenalin but doesn't work with the 6400, as you said; actually all AMD cards should be avoided, at least by me.

Kodi didn't break anything; it was working pretty well, and one day AMD updated their drivers and, boom, no more hardware acceleration in Kodi (just some videos work; 90% of my videos don't).
#55
Gica
Vayra865800X3D plus board puts you at 400,-; that's pretty competitive for a top-end gaming CPU. And it'll max perf on a simple air cooler too. Though in general, you're absolutely correct, AMD 'past gen' definitely sees bigger price cuts.
The launch prices ($300 for the 5600X and $450 for the 5800X3D, minimum) and the total lack of offers under $300 until the appearance of Alder Lake, with its very competitive prices and performance. And it is not the first time AMD has demonstrated that it does not think about profit when it has the opportunity. You can now buy a Ryzen 5 for ~$200 plus a game only because they don't have room against Raptor.
The same scenario applies to video cards. They sell cheaper because their video cards can compete with NVIDIA only in rasterization. For content creation, offers are almost non-existent.
Chrispy_While I mostly agree with you, I wouldn't call a $600 GPU "mainstream" yet. Market data still shows that a very small percentage of people spend more than $400 on a GPU and the Steam hardware survey seems to indicate that the $149 reigning champion of the mainstream (GTX 1650) has only recently been dethroned.

There's also mounting evidence from manufacturing dates on 4070 GPUs that Nvidia has had these cards ready for a very, very long time and has been intentionally holding them back until the inventory of old stock gets sold. Laptops with the 4070 have been reviewed and on sale since January, and it takes much longer to get a laptop to market than a dGPU, so that correlates with the rumours that desktop 4070 cards have been held back from a 2022 launch to screw over the mainstream market and shift unsold, overpriced 30-series inventory first.

Hmm, $600 is clearly too much for the mainstream:
This guy cites a ton of sites, channels, and sources: the 4070 is a serious sales flop, with nobody willing to spend the asking price.
The classification is done according to performance. You don't have an enthusiast video card if you buy an RX 6500 XT at the price of an RX 6950 XT. :confused:
Processors receive their classification from the manufacturer: i/R 3, 5, 7 and 9.
For video cards, in the same series, we have: lowest/entry/middle/mainstream/high/enthusiast.

Prices may vary from one country to another.
For example, for me, I think I would be a complete idiot to choose the 6700 XT over the 4070. The RTX 4070 compares to the 6800/6900 in rasterization and surpasses the entire 6000 series in everything else (ray tracing, CUDA, enc/dec support, etc.).
We chose the lowest prices.

#56
Vayra86
enb141Arc is bad for old games because it only supports the latest DirectX natively; older DirectX versions are emulated.

Yes, AMD is trash: random reboots on my computer when I turn my Smart TV or my receiver off/on, limited to 8-bit (my old 1050 Ti had 10-bit), no VRR on my Smart TV.

Actually they get crappier with new releases; hardware acceleration in Kodi was working until about 3 or 4 drivers ago, then they broke it and haven't fixed it since.


Adrenalin is simplified; the UI is nice, but I prefer Nvidia because you can fine-tune settings per game or application.

Yes, Nvidia is not perfect, but the reason why, even with their higher prices, they are still top of the line is that they offer the best drivers of any GPU manufacturer.
You can't do per-game settings on AMD? I'm gonna check that out!
#57
enb141
Vayra86You can't do per-game settings on AMD? I'm gonna check that out!
Per game, yes, but not per application; with Nvidia you can do both.
#58
Vayra86
enb141Per game, yes, but not per application; with Nvidia you can do both.
That's what a game is too, isn't it? Again, I'll check that out. The argument is already running thin as we speak ;)
#59
enb141
Vayra86That's what a game is too, isn't it? Again, I'll check that out. The argument is already running thin as we speak ;)
I don't see a list of anything in there, maybe you have to manually add them.
#60
Kyan
enb141VRR is in Adrenalin but doesn't work with the 6400, as you said; actually all AMD cards should be avoided, at least by me.
I understand that you got a terrible experience with the 6400, and I'm not really surprised, but from the RX 6600 and higher, everything works flawlessly. It's just that AMD fucked up during the pandemic by trying to (badly) make lower-end cards to fill the 200-300€ range. But all the other cards are great and the software works properly. If you have a specific usage like HTPC or professional apps (and even standard usage IMO), always check (forums, YT) before buying anything.

As for the Intel Arc A750, it's really great to see Intel continuing to improve their drivers. We would have heard that they'd stopped making dGPUs by now if the drivers were not being updated, and I hope I can consider three GPU manufacturers when it's time to upgrade.
#61
Chrispy_
enb141Per game, yes, but not per application; with Nvidia you can do both.
I definitely can do "per application" on AMD drivers here at work, unless Rhino7 and Autodesk CAD suites count as "games" :D
#62
enb141
KyanI understand that you got a terrible experience with the 6400, and I'm not really surprised, but from the RX 6600 and higher, everything works flawlessly. It's just that AMD fucked up during the pandemic by trying to (badly) make lower-end cards to fill the 200-300€ range. But all the other cards are great and the software works properly. If you have a specific usage like HTPC or professional apps (and even standard usage IMO), always check (forums, YT) before buying anything.

As for the Intel Arc A750, it's really great to see Intel continuing to improve their drivers. We would have heard that they'd stopped making dGPUs by now if the drivers were not being updated, and I hope I can consider three GPU manufacturers when it's time to upgrade.
That's the problem: AMD's forums are dead, so it's hard to find information; with Nvidia you can do the research pretty easily because they have good forums and Reddit.

Unfortunately I won't go back to AMD because their support is useless; the simple features I'm looking for, AMD doesn't have and doesn't care to fix.
Chrispy_I definitely can do "per application" on AMD drivers here at work, unless Rhino7 and Autodesk CAD suites count as "games" :D
I don't care anymore, I'm back to Nvidia.
#63
Patriot
enb141I got that card because I was looking for a low-profile video card for my HTPC. Guess what, it sucks. Why:

- Doesn't support VRR on my Smart TV
- Limited to 8-bit; 10-bit and 12-bit don't work on my Smart TV (my old 1050 Ti did)
- Hardware acceleration in Kodi has been broken for about 3 or 4 driver versions now
- Random reboots when I turn my receiver off/on

Frankly, I don't believe you either.
It's not our fault you can't read specs and picked a card without hardware encoding. AMD designed that as a laptop chip for flex graphics, to use the IGP for all those functions, then tried to stuff it down desktop users' throats. It's a trash chip. As for HDMI VRR not working, the cable could have been failing and dropping out of spec.
#64
enb141
PatriotIt's not our fault you can't read specs and picked a card without hardware encoding. AMD designed that as a laptop chip for flex graphics, to use the IGP for all those functions, then tried to stuff it down desktop users' throats. It's a trash chip.
I don't need hardware encoding, but it's a joke that they broke decoding when it was working.

The performance is not the problem; it's faster than my 1050 Ti. The drivers are the problem.
#65
chowow
The drivers are getting better; Intel is getting serious.
#66
Patriot
enb141I don't need hardware encoding, but it's a joke that they broke decoding when it was working.

The performance is not the problem; it's faster than my 1050 Ti. The drivers are the problem.
Shrugs. I use a $120 T640 thin client for everything as a media center; Ryzen Embedded with Vega 3 graphics works great.
#67
Bayfront Benny
Xaser04I have both an A750 LE and an A770 BiFrost (and have owned an A770 Phantom Gaming, so basically all of them), and I can say I have not experienced any of this. The system outside of gaming felt exactly the same as with either an AMD or Nvidia card (and either an Intel or AMD CPU). Something else was going on to cause that issue.

The "3rd party cards" all run perfectly fine with the Intel drivers, either beta or WHQL releases. No idea where the idea that they wouldn't came from, as it's not something I have experienced or read about anywhere.
What CPU did you use with the BiFrost? Did it have integrated graphics? Mine is the AMD 7950X, and it could be that that's creating some confusion for the OS.