
Intel Arc A380 Desktop GPU Does Worse in Actual Gaming than Synthetic Benchmarks

All this nonsense about drivers. Intel has almost inexhaustible resources to write proper drivers. They've had years to fix it. Koduri and his team should be ashamed of this card.
This makes the RX 6500 XT look like a screamingly fast bargain.


Since when was the 6500 XT slow?

Tbh I was hoping for pressure on the 6700/3060Ti segment. This level of performance isn't doing anybody any good.

A review would be nice. One that looks at die size and such, to put things in the proper perspective.


It is doing good, because a third player has entered the market. It's no longer just AMD or crapgreedia. If Intel doesn't have anything to put up against the strongest AMD or crapgreedia cards, they will be forced to price lower than the others; otherwise they're going to lose income and customers.
 
Ignore them, they don't have the card and don't know what they are talking about. The simple fact is the RX 6500 XT has no real competition in its price bracket. It's a solid performer. AMD released a good product at a good price.

I have a seriously hard time calling the RX 6500 XT a solid performer and a good product, especially given it's a 4 GB, 64-bit GPU without video encoding hardware that doesn't exactly beat even its own predecessor (the RX 5500 XT). In a market where MSRP has no real value, you'll still find the 6500 XT anywhere from $180 to $350 in most parts of the world. It's not a very good deal for what you get; in most cases you're far better served by simply buying an RX 6600 instead. Or the RTX 3050.
 
I really don't get this. I mean, how could it possibly be a driver issue... Intel employs so many gawd damn people, and you can get a team of... idk, 10 or so to just optimise drivers for a single (popular) game so that it runs as well as it can, and then move on to the next?

how freaking hard can it be... come on... jeez
Their CEO gave himself $100 million for continuously running the company into the ground, and they bought a GPU company that hasn't produced anything but vaporware for 25 years. With this in mind, are you really surprised by the GPU outcome?
Basically you need smart people in the key roles that can recognize and hire other smart people.
 
*Pretends to be shocked*
 
You don't get optimized iGPU drivers, you get drivers. It's a new game; it's not an excuse, it's reality. How long has AMD been doing GPUs? They still mess up their drivers.
CPUs are another story; we all know the previous management was a disgrace. They were the first to get the new EUV machines from ASML. It doesn't make them better at what they do, but they are doing things differently now.

There is the occasional mess-up in an otherwise pretty refined driver regime aimed at gaming (and it sure as hell is optimized; I'm sure you remember NvInspector, where you can tweak a million things per game), and then there is a driver regime that was never aimed at gaming and is to this date incapable of delivering the performance expected from the hardware available. That bit is key. Efficiency is king. The company that extracts the highest FPS per square mm of die space is going to win the race, and if it does so at a decent or lower power budget, it wins doubly so.
 
I honestly wonder what Qualcomm would deliver with a "discrete Adreno" of some sort, if it allocated Intel's discrete GPU budget to it. I bet it would be better.
 
If it has enough VRAM to actually do raytracing (unlike the 6400 / 6500 XT), that's something... I guess?
 
3DMark performance shows possible potential. Games show current reality.
For clarity, if you think synthetic benchmarks illustrate the performance of an optimal driver, then the answer is absolutely no. Synthetic benchmarks are only relevant if games have very similar characteristics.

I honestly see no usefulness for consumers in looking at synthetic gaming benchmarks at all, except perhaps to get a sneak peek at new technologies.

Immature drivers might help a little, but it seems just as likely the drivers have been "optimized" for synthetic benchmarks and it won't ever perform even at GTX 1650 level.
Most gamers have many misconceptions about how optimization works.
Drivers are not (normally) optimized for specific applications/games/benchmarks etc., except the few times they are cheating*.

If we see games scale in one way and synthetics scale in another, then it's mostly due to hardware resources. We should not expect games to scale badly if they use the same graphics APIs. And if there are exceptions, those would probably boil down to certain parameters to functions causing extra work, and such should be identified quickly and fixed. But if we see general performance differences, then it's not due to "immature drivers".

*)
Drivers have been known to cheat in certain synthetic benchmarks and sometimes even games, for example by replacing shaders to gain performance with "minor" degradation in visual quality. Overriding API calls is also possible, but that causes driver overhead, so it is super rare. One very old documented example of cheating is this.
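To make the shader-replacement idea concrete, here is a minimal, purely illustrative sketch of how a driver could fingerprint incoming shaders and swap in hand-tuned versions. None of the names or structures come from any real driver; it only shows the mechanism.

```cpp
#include <cstdint>
#include <unordered_map>
#include <vector>

// Purely illustrative sketch: real drivers are far more involved.
using Bytecode = std::vector<uint8_t>;

// FNV-1a hash used as a cheap fingerprint of the shader bytecode.
static uint64_t Fingerprint(const Bytecode& code) {
    uint64_t h = 1469598103934665603ull;
    for (uint8_t b : code) { h ^= b; h *= 1099511628211ull; }
    return h;
}

// Hypothetical table of known benchmark/game shaders and their
// hand-tuned (possibly lower-quality) replacements, shipped in the driver.
static std::unordered_map<uint64_t, Bytecode> g_replacements;

// The application asks to compile its shader; the driver may silently
// hand back a substitute instead. The app never finds out.
Bytecode CompileShader(const Bytecode& appShader) {
    auto it = g_replacements.find(Fingerprint(appShader));
    if (it != g_replacements.end())
        return it->second;   // cheat path: substitute the tuned shader
    return appShader;        // normal path: compile what the app provided
}
```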

Well, this time they have sent a card to market that's 100% better than last time :D
More like infinitely better! :P

Poor performance from the Intel GPU 'cause game devs have already optimised game code for either AMD or Nvidia in those game titles mentioned by the OP.
Games are not optimized for specific hardware, at least not the way you think.
Doing so would require either 1) exploiting very specific low-level hardware differences unique to one hardware maker, or 2) maintaining separate code paths and low-level APIs targeting each specific GPU family.
(1) is "never" done intentionally, and (2) is only used in very specific cases.
Practically all games these days are built on very bloated, abstracted engines. Most games today contain little to no low-level code at all, and quite often use an abstracted (often third-party) game engine.
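For what it's worth, the closest most engines get to "per-vendor optimization" in case (2) is branching on the vendor ID the graphics API reports and picking a code path. A minimal Vulkan sketch is below; the PCI vendor IDs are real, but what each branch would actually do is made up for the example.

```cpp
#include <vulkan/vulkan.h>
#include <cstdio>

// Minimal sketch: select a render path based on the GPU vendor reported by Vulkan.
// The PCI vendor IDs are real; the per-vendor actions are placeholders.
void ChooseRenderPath(VkPhysicalDevice gpu) {
    VkPhysicalDeviceProperties props{};
    vkGetPhysicalDeviceProperties(gpu, &props);

    switch (props.vendorID) {
        case 0x10DE: std::puts("NVIDIA path (e.g. different thread-group sizing)"); break;
        case 0x1002: std::puts("AMD path (e.g. wave64-friendly shader variants)");  break;
        case 0x8086: std::puts("Intel path (often just the generic fallback)");     break;
        default:     std::puts("Generic path");                                     break;
    }
}
```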
 
For clarity, if you think synthetic benchmarks illustrate the performance of an optimal driver, then the answer is absolutely no. Synthetic benchmarks are only relevant if games have very similar characteristics.

I honestly see no usefulness for consumers in looking at synthetic gaming benchmarks at all, except perhaps to get a sneak peek at new technologies.
Exactly. Games are designed to use your hardware as a whole. Synthetics are designed to test and stress one very specific aspect of your hardware (e.g. the GPU cores only). That's why my CPU usage in Time Spy graphics tests is near zero. Games and synthetics will never be the same.
 
It's been funny to read this thread and see 20-year-old shenanigans from both GPU companies brought up, and how they remain amazingly relevant today :D

It is also equally unfortunate to see that clubism seems to be alive and well, to the extent that the entry of a third player is being seen as a hostility and an act to be disregarded, when we as consumers should be cheering on a new product from a new manufacturer. I wish Intel well and Raja Koduri's team success in developing Arc into a fine series of dedicated graphics cards, and if I can, I will definitely be purchasing one, even if just a basic model.
 
Poor performance from the Intel GPU 'cause game devs have already optimised game code for either AMD or Nvidia in those game titles mentioned by the OP.
To be fair, this is probably the best reply. Games very much are optimised for one, maybe two architectures, and few are optimised for Intel. There was a Dirt game (Codemasters' Dirt Rally, I think) made for and in collaboration with Intel; it ran very well on Intel, tbf. It would be nice to see how that one runs here.
 
Why? I see that the price for the 6500 XT is around 155-190 USD, while the RTX 3050 is over 300 USD. The GTX 1650 starts at about the same or 10% above the 6500 XT, with most models over 200 USD. And as can be seen in the screenshots above in this thread, the 6500 XT is the winner in the performance/USD category (among roughly same-priced cards).

Because it offers absolutely the same raster performance as the 2019 RX 5500 XT, minus the hardware acceleration for media content.
It is a sidegrade or downgrade which no one needs or has ever asked for.
 
The simple fact is the RX 6500 XT has no real competition in its price bracket. It's a solid performer. AMD released a good product at a good price.



Self-conviction is all that's left when a product is such a hard scam on various levels, like:

- price: $200 US (what trash)

- 4 GB of 64-bit VRAM (for $200 US, what trash again)

- seriously cut-down PCIe lanes (meanwhile, other products in the price range have 16 lanes, and the company's previous product, the RX 5500 XT, comes with 8 lanes; what trash again)

- cut-down encode and decode capabilities (meanwhile, other products in the price range have encode and decode capabilities, and the company's previous product, the RX 5500 XT, has them too; what trash again)

Back to the topic of A380 gaming performance: I think Intel needs to launch Arc now to try to improve the drivers with user feedback, and with a proper price according to actual game performance. If the price stays between $120 and $130 US max, it will be a very interesting product, because it offers 6 GB of 96-bit VRAM, 8 PCIe lanes, and complete decode and encode capabilities, including the AV1 format.

:)
 
The A380 doesn't compete in that segment; Intel has higher-core-count parts for that (though those may be as underwhelming as this low-end one).

Intel A380 - 1024 cores
AMD 6700 - 2560 cores
Nvidia 3060Ti - 4864 cores

This competes in core count with the GTX 1650/1650 Super and RX 6400/6500 XT. And loses.
Are there higher-specced cards? I thought the A380 was their top offering.
 
Ignore them, they don't have the card and don't know what they are talking about. The simple fact is the RX 6500 XT has no real competition in its price bracket. It's a solid performer. AMD released a good product at a good price.
Not at $200 for a 64-bit card.

The 6600 is already barely enough at 128 bits, because a very small die means noticeably less Infinity Cache, and there is a minimum cache size required for that near-doubling of performance per memory tick (see the RX 6900 XT vs. the 5700 XT)!

The existing 128-bit RDNA cards are faster than the 6500 XT! If the cache actually worked on a card this small, the performance would be in the range of the 3050, not 10% slower than a 5500 XT!
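As a back-of-the-envelope illustration of the bandwidth side of this argument (the memory specs below are the commonly quoted ones and should be treated as assumptions):

```cpp
#include <cstdio>

// Raw memory bandwidth in GB/s: bus width in bits / 8, times data rate in Gbps.
static double BandwidthGBs(int busBits, double gbps) {
    return busBits / 8.0 * gbps;
}

int main() {
    // Commonly quoted specs, treated here as assumptions.
    std::printf("RX 5500 XT, 128-bit @ 14 Gbps: %.0f GB/s\n", BandwidthGBs(128, 14.0)); // 224 GB/s
    std::printf("RX 6500 XT,  64-bit @ 18 Gbps: %.0f GB/s\n", BandwidthGBs(64, 18.0));  // 144 GB/s
    // The 6500 XT's small Infinity Cache has to hide that ~35% raw deficit;
    // once the working set spills out of the cache, the narrow bus shows.
    return 0;
}
```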
 
Are there higher-specced cards? I thought the A380 was their top offering.
Like ARF said, this is their *lowest*-specced offering. Now, I still don't anticipate the Arc 770 or 780 matching the 3070, but matching any of those upper-tier GPUs is not the point for a freshman effort. Instead, offering a decent midrange GPU at a decent price should be the target. We'll see if Intel is up to that challenge.
 
Intel's naming scheme is quite interesting. I guess the next generation will have B380, B580 and B780 cards. Of course, B790, B550 and B310 products are also likely in order to widen the product stack.
 
Like ARF said, this is their *lowest*-specced offering. Now, I still don't anticipate the Arc 770 or 780 matching the 3070, but matching any of those upper-tier GPUs is not the point for a freshman effort. Instead, offering a decent midrange GPU at a decent price should be the target. We'll see if Intel is up to that challenge.
Ah, ok, then there's still hope.

And I am really curious about the actual performance, because there's no reason the performance gap between synthetics and games should be so big. Either Intel is somehow cheating in tests (not proven so far; cheating usually affects image quality), or the potential is actually there but for some reason isn't being realized in games so far.
 


Self-conviction is all that's left when a product is such a hard scam on various levels, like:

- price: $200 US (what trash)

- 4 GB of 64-bit VRAM (for $200 US, what trash again)

- seriously cut-down PCIe lanes (meanwhile, other products in the price range have 16 lanes, and the company's previous product, the RX 5500 XT, comes with 8 lanes; what trash again)

- cut-down encode and decode capabilities (meanwhile, other products in the price range have encode and decode capabilities, and the company's previous product, the RX 5500 XT, has them too; what trash again)

Back to the topic of A380 gaming performance: I think Intel needs to launch Arc now to try to improve the drivers with user feedback, and with a proper price according to actual game performance. If the price stays between $120 and $130 US max, it will be a very interesting product, because it offers 6 GB of 96-bit VRAM, 8 PCIe lanes, and complete decode and encode capabilities, including the AV1 format.

:)
There is one thing the 6500 XT does right: my Asus TUF is so quiet, even when overclocked! It literally competes with passive cards in noise. It's such a delight to game on a completely silent PC! :cool:
 
There is one thing the 6500 XT does right: my Asus TUF is so quiet, even when overclocked! It literally competes with passive cards in noise. It's such a delight to game on a completely silent PC! :cool:

I would like to know more about this... I guess you can have a quiet PC with a higher end graphics card like the Radeon RX 6800 XT in the correct PC case. Be Quiet.
 
I would like to know more about this... I guess you can have a quiet PC with a higher end graphics card like the Radeon RX 6800 XT in the correct PC case. Be Quiet.
In a large case with lots of airflow, maybe. It's a shame I don't like large cases (m-ATX is my top preference).
 
It is so funny that some of the ppl treat Intel as a 'newcomer' in the graphics segment despite the fact that Intel has 20 years of iGPU experience.

They just need to work hard.
We need more actual products, not fancy hype train PPTs.
 
I am not an Intel fan at all (in fact, I got rid of all my Intel CPU-powered computers after Meltdown etc.), but I really do not get the people who are hating on this dGPU effort of theirs. People should be overjoyed that a new player is entering the market when consumers are getting shafted by ridiculous stunts such as the RX 6500 "XT" with x4 PCIe and no hardware encoding (when even my low-end Polaris card that I bought for $90 open box on eBay has that). Who cares if the initial gaming performance is a bit underwhelming? AMD has had massive driver issues even though they (well, technically ATI) have been making dGPUs for literally decades. Even if it really does end up underperforming, Intel will simply cut the price, because they can afford to do so just to annoy AMD and Nvidia, and then the cards *will* sell, just in a different performance tier. It is practically a dream come true to see competition return to the low-end segment. Not everyone wants or can afford a mid or high end card, especially with the current ridiculous power consumption (combined with a global energy crisis). And as a Linux user I know that even if the Windows drivers end up not being the best, it will have very good Linux support (including OpenCL).
 