
Intel's Arc A750 Graphics Card Makes an Appearance

Realtek and MediaTek sell millions of networking devices, and a lot of the time they still have absolutely awful drivers. That's just one example. Just because Nvidia has exceptionally good drivers doesn't mean it's that simple for everyone. AMD is arguably still behind them after decades of actually manufacturing gaming cards.
Nvidia has bad drivers as well; that is the reason why I update only when necessary. But at least I have stable drivers, unlike Intel, where something like that doesn't exist. Your example is worth nothing, you're just confirming what I said: Intel has the worst drivers, and if you want a trash GPU, Intel is your manufacturer.
 
Nvidia has bad drivers as well; that is the reason why I update only when necessary. But at least I have stable drivers, unlike Intel, where something like that doesn't exist. Your example is worth nothing, you're just confirming what I said: Intel has the worst drivers, and if you want a trash GPU, Intel is your manufacturer.
Nvidia has objectively the best drivers on the market; just because some new games don't run well at release doesn't make the whole thing bad. Intel's drivers, on the other hand, are bad, because their theoretical performance in benchmarks doesn't in any way match what happens in games, especially in older titles.
 
Nvidia has objectively the best drivers on the market; just because some new games don't run well at release doesn't make the whole thing bad. Intel's drivers, on the other hand, are bad, because their theoretical performance in benchmarks doesn't in any way match what happens in games, especially in older titles.
Who told you I had a problem with new games? I'm talking about game and application crashes caused by a combination of drivers and features enabled in the Nvidia Control Panel, and about a monitor that worked overclocked with the old driver while the new one left me without a picture, forcing me to boot into safe mode to delete the new driver. That is why I update only when necessary, not because I'll get a shitty 10% more FPS in 2-3 games. The only thing that matters to me is stability.

Regarding Intel's biased "theoretical" performance: 3DMark is so manipulated and useless that there are no words to describe it; according to it, the A380 is faster than the RX 6500 XT.
3DMark is the same bullshit as Geekbench for Apple, and the sad thing is that many people like you are fooled into believing that this is theoretical or even actual performance. Of course that performance never becomes real. This is how you win over deluded customers: not with technology, but with lies.
 
I still hold some level of confidence (perhaps hope?) that these things won't be a complete wash.

Intel is sticking them in their NUC enthusiast and laptop products. Those products have always been costly (though you've always gotten your money's worth for what they are, and in the case of the enthusiast line, they killed everything else in their form factor), and they have never been crap or performed poorly. Speaking as someone who owned Skull Canyon, Hades Canyon, and Phantom Canyon, along with some of their laptop kits, as Linux machines.

IMHO, this is going to end up a little under 3060 performance.
 
I bet the A770 will be 5% slower than the 1080 Ti. Let Raja suffer :laugh:

The E-sports truck is remarkable.

But an in-depth review will probably devalue these cards significantly, not just for lacking performance but for roughly 50% more power consumption than the competition.

They have cheated with drivers. They released unofficial scores suggesting the cards would be equal to or better than counterparts from Nvidia or AMD, but the truth is these things are roughly 15% behind pretty much across the board, and the power requirement is quite a bit higher.

The reason these are so delayed is obvious too: the first batch just didn't perform, and they had to respin it with performance fixes. These are just compute-based tiles, or GPUs that didn't meet quality standards, shipped as "gamer" GPUs.

It's the same thing we saw with Vega, Fiji, and Polaris. They all do great in computational work but lack horsepower in games. If @W1zzard gets his hands on one, I'd like to see the leaked rumors vs. actual scores and performance.
 
And hardware doesn't seem to be the problem. Drivers for games are the problem, something that Intel has very little experience with.
And what is the logic behind that deduction?
Writing a driver for a small GPU and for a big one is not any different. Nvidia and AMD have no problem scaling theirs from tiny low-end (or even integrated) GPUs to massive high-end GPUs, and they do so because the scaling happens in the GPU, not in the driver.

Scaling hardware, however, is a challenge, and for Intel the hardware is the problem. They perform well in synthetic benchmarks because those tend to create different workloads compared to games, since synthetics are typically designed to generate a high load rather than to render efficiently. If the A750 has a resource balance similar to the A380's, then we can expect similar performance characteristics. While driver updates can fix the reported bugs, they will not change the performance characteristics of these products. The underlying problem is the effectiveness of the hardware scheduling, which would require new generations of hardware to solve.
 
Little experience again? Did you see the photo? Intel has sold hundreds of millions of iGPUs and still cannot make a good driver for them. It's not about experience, it's just their incompetence. They never released a stable driver for the HD 4600, the most popular GPU of 2013-2016, "supported" until 2021, when they released the last driver for it.
Optimizing an iGPU for video decoding is light years away from working with game developers to build an ecosystem that supports the shiny APIs that make everything work.
 
Is that the same as FreeSync? Or an even newer, better open VRR standard?

Ah no, it's dedicated video encoding and decoding hardware from their iGPUs; truly hardware-accelerated. It's seriously the best thing about their iGPUs in my opinion. It can often do a better, more efficient, and faster job than the majority of dGPUs or CPUs when video editing. It's super powerful for what it is. The newer ones can handle H.265/HEVC.

I just checked their wiki and it looks like it will do it for sure. It says: "Version 9 (Intel Arc Alchemist, Meteor Lake, Arrow Lake): Intel Arc Alchemist GPUs add 8K 10-bit AV1 hardware encoding."
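For anyone wanting to try it once the cards ship, here's a minimal sketch of driving the hardware encoder from Python through ffmpeg's Quick Sync path. It assumes an ffmpeg build with QSV (oneVPL) support that exposes the av1_qsv encoder; the filenames and bitrate are just placeholders.

```python
import shutil
import subprocess

def encode_av1_qsv(src: str, dst: str, bitrate: str = "8M") -> None:
    """Transcode src to AV1 using Intel's hardware encoder via ffmpeg.

    Assumes an ffmpeg build with Quick Sync (oneVPL) support; on other
    hardware the 'av1_qsv' encoder simply won't be available.
    """
    if shutil.which("ffmpeg") is None:
        raise RuntimeError("ffmpeg not found on PATH")
    subprocess.run(
        [
            "ffmpeg",
            "-hwaccel", "qsv",    # hardware-accelerated decode
            "-i", src,
            "-c:v", "av1_qsv",    # Arc/Alchemist AV1 hardware encoder
            "-b:v", bitrate,
            "-c:a", "copy",       # leave the audio stream untouched
            dst,
        ],
        check=True,
    )

# Placeholder filenames, purely for illustration.
encode_av1_qsv("input.mp4", "output_av1.mp4")
```

The nice part of going through ffmpeg is that the same script falls back cleanly: if the encoder isn't present, ffmpeg errors out instead of silently doing a slow software encode.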
 
It's seriously the best thing about their iGPUs in my opinion.

It really is; I actually understand the preference for Quick Sync over a dGPU.
 
I think video editors will jump at Intel dGPUs! I hope they market it that way!

I just saw another article that writes: "Intel's ACM-G11 has world-class media processing capabilities (hardware decoding/encoding of 4K H.264/H.265/AV1 streams)."
 
Optimizing an iGPU for video decoding is light years away from working with game developers to build an ecosystem that supports the shiny APIs that make everything work.
Overall, Intel has done better in supporting these shiny APIs than AMD. Their OpenGL support has been much better than AMD's for years, and they even beat AMD in offering Vulkan support.

As for the ecosystem, beyond having a good profiling tool, the ecosystem is the API of choice along with its tools and accompanying documentation, which for PCs today usually means DirectX, Vulkan, or OpenGL.
Intel's responsibility is to ensure their driver works according to the API specs. PC games are not written for specific hardware.
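To make that concrete, the sketch below shows how code written against the API stays hardware-agnostic: the same OpenGL calls run unchanged on an Intel, AMD, or Nvidia driver, and only the vendor/renderer strings the driver reports back differ. It assumes the glfw and PyOpenGL Python packages are installed; the hidden 64x64 window is just a way to get a current context.

```python
import glfw
from OpenGL.GL import GL_RENDERER, GL_VENDOR, GL_VERSION, glGetString

# Create a hidden window just to get a current OpenGL context.
if not glfw.init():
    raise RuntimeError("GLFW failed to initialize")
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)
window = glfw.create_window(64, 64, "probe", None, None)
if window is None:
    raise RuntimeError("could not create an OpenGL context")
glfw.make_context_current(window)

# These exact API calls work on any conformant driver;
# only what the driver reports back differs per vendor.
print("Vendor:  ", glGetString(GL_VENDOR).decode())
print("Renderer:", glGetString(GL_RENDERER).decode())
print("Version: ", glGetString(GL_VERSION).decode())

glfw.terminate()
```

Run it on machines with different GPUs and only the printed strings change; the program itself never mentions any vendor, which is exactly why conformance to the API spec is the driver vendor's whole job.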
 