
Intel Arc A380

I'm not sure the public looking for low-end GPUs really uses RT. Btw, that pricing is extremely high.
 
Great point, while technically it is easy to force Gen 3 on a modern board and get test results, actual platforms with Gen 3 limitation do not have the BIOS updates for ReBAR.

Intel 10th gen has rebar with PCIe 3.0.
 
Instead of writing down unclear graphics settings, why not just post screenshots of the graphics settings menu?
 
Great point, while technically it is easy to force Gen 3 on a modern board and get test results, actual platforms with Gen 3 limitation do not have the BIOS updates for ReBAR.
I have a fairly shitty Gen 3 motherboard, Prime B450 Plus, but it did receive BIOS updates for ReBAR. I'm sure that there are plenty of B450/X470 motherboards out there that can use ReBar on PCIe Gen 3.
 
What happened to the reports that by increasing the power limit A380 can gain up to 50% boost in fps ?
 
What happened to the reports that by increasing the power limit A380 can gain up to 50% boost in fps ?
That happens when you let random internet people review your product

Intel 10th gen has rebar with PCIe 3.0.
I have a fairly shitty Gen 3 motherboard, Prime B450 Plus, but it did receive BIOS updates for ReBAR. I'm sure that there are plenty of B450/X470 motherboards out there that can use ReBar on PCIe Gen 3.
Right .. thanks for clearing that up
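For anyone unsure whether ReBAR actually took effect after a BIOS update: on Linux, `sudo lspci -vvv` lists a "Physical Resizable BAR" capability with the current BAR size, and a BAR sized to (near) the full VRAM means ReBAR is active. A minimal sketch of parsing that output; the sample excerpt below is hypothetical and the formatting assumes recent pciutils:

```python
import re

def rebar_sizes(lspci_text: str) -> dict:
    """Collect {"BAR n": current size} from any Physical Resizable BAR
    capability block in `sudo lspci -vvv` output."""
    sizes = {}
    in_rebar = False
    for line in lspci_text.splitlines():
        if "Physical Resizable BAR" in line:
            in_rebar = True
            continue
        m = re.search(r"BAR (\d+): current size: ([^,\s]+)", line)
        if in_rebar and m:
            sizes["BAR " + m.group(1)] = m.group(2)
        elif in_rebar and line and not line[0].isspace():
            in_rebar = False  # left the indented capability block

    return sizes

# Hypothetical excerpt of what lspci prints for a card with ReBAR active
sample = (
    "03:00.0 VGA compatible controller: Intel DG2 [Arc A380]\n"
    "\tCapabilities: [200 v1] Physical Resizable BAR\n"
    "\t\tBAR 2: current size: 8GB, supported: 256MB 512MB 1GB 2GB 4GB 8GB\n"
)
print(rebar_sizes(sample))  # a multi-GB current size => ReBAR is on
```

On Windows, the same information shows up as "Resizable BAR: Enabled" in GPU-Z or the vendor control panel, so no parsing is needed there.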
 
What happened to the reports that by increasing the power limit A380 can gain up to 50% boost in fps ?

Perhaps that's game-specific? It would be useful to see whether OC performance is better in modern (DX12/Vulkan) games.

EDIT: I tried pulling up the video where that claim was made, and it seems that the A380 without OC ran slower than the GTX1650, whereas it was about on par with it or faster with OC. On the other hand, in the TPU review stock performance is already similar to the GTX1650. So it could be their model had (driver-related?) issues with OC disabled.

Actually, it would work without any connector, limited to a max power draw of 75 W, given its V-Sync 60 Hz consumption if kept at 1080p (my PSU offers 6-pin and 6+2-pin though, but I usually use sleeved extensions)

Maybe, but probably not. The sense pins on PCIe power connectors are there for a reason: to tell the card whether it has aux power or not. If the card has a power connector, at least the 6 pins should be plugged in
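For the curious, the sense mechanism is simple: the plug grounds one or two sense pins, and the card reads those to size its power budget (75 W from the x16 slot, plus 75 W for a 6-pin or 150 W for an 8-pin per the PCIe CEM spec). A toy sketch of that decision, purely illustrative and not any vendor's actual firmware:

```python
SLOT_POWER_W = 75  # what an x16 CEM slot is allowed to supply

def board_power_budget(sense0_grounded: bool, sense1_grounded: bool) -> int:
    """Toy model of how a card could size its power budget from the
    aux connector's sense pins (PCIe CEM: 6-pin = 75 W, 8-pin = 150 W).
    Hypothetical logic for illustration only."""
    if sense0_grounded and sense1_grounded:
        aux = 150  # both sense pins grounded: 8-pin plug detected
    elif sense0_grounded:
        aux = 75   # only Sense0 grounded: 6-pin plug detected
    else:
        aux = 0    # nothing plugged in: slot power only
    return SLOT_POWER_W + aux

# A card that expects a plug but reads no grounded sense pins will
# typically refuse to run at full tilt rather than assume slot power
# alone is enough.
print(board_power_budget(False, False))  # 75
```

Which is why "it would fit in 75 W" isn't the same as "it will run without the plug": the firmware has to be designed for slot-only operation, as on dedicated LP models.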

Although if they had a 75 W limited LP model at a lower price than the 6400, I would take one to give "the new player on the market" a go (and if ReBAR weren't "mandatory" :laugh: )

From your bio you're using a 3600 and B550; don't you have ReBAR on that? It should be a BIOS update away
 
The what now? :twitch:
All of the negative hype around the 6500 XT had its town criers, and now with this card they'll be calling a 6 GB frame buffer the minimum. We have to consider that this A380 is mainly held back by drivers in competing with the rest of the low-end cards. Yes, it is in the range of the 1650 Super/6400, but those are both super mature in terms of drivers.
 
Perhaps that's game-specific? It would be useful to see whether OC performance is better in modern (DX12/Vulkan) games.
Umm, that's like a 45% perf jump with the OC..... my 'X' key is glowing and calling to me.


The state of things is so messed up rn. 200 quid or so for something that's roughly a quarter slower than a GTX 1060. I don't like this timeline.
 
That RT performance is honestly pretty impressive, looking forward to seeing what the A770 will be like
I agree, impressive as a first showing, very keen to see how that plays out through the rest of the stack.
You mean the performance that's only marginally better than an XP slideshow with RT on? :wtf:
I think W1zzard summed it up pretty well, obviously the outright performance of this card makes RT on unplayable, but the relative performance against competitors of different architectures is an indication that their architecture may handle RT reasonably well.

From the conclusion;
It's still an interesting data point with promising results that bodes well for the architecture though, because Intel will be releasing higher-end graphics cards very soon.
 
Maybe, but probably not. The sense pins on PCIe power connectors are there for a reason: to tell the card whether it has aux power or not. If the card has a power connector, at least the 6 pins should be plugged in
I meant on an LP board without a connector at all. (I am not that idiotic, no worries ...)
From your bio you're using a 3600 and B550; don't you have ReBAR on that? It should be a BIOS update away
From my syspecs; why would I get an A380 to replace a Powercolor Red Devil RX 6700 XT 12 GB :laugh: (I suspect not even an A770 would ... :laugh: )
No, it would be in LP only for the "SFFHTPCARGH!" listed just after, which uses an i7-3770 and a GT 730 2 GB ... PCIe 2.0 :p no ReBAR
Thus, the RX 6400 4 GB is the better option in the end, as it consumes less and has performance in the same ballpark
 
"into the unknow"
What an apt blurb for Intel's debut GPU :roll:
 
Slower performance than the 6400 (excluding RT, which isn't realistic on either card), a higher price and twice the power; if Intel thinks this will end up anywhere except the i740 discount bin, they've got another thing coming!
 
I'm not sure the public looking for low end GPUs really use RT.

Nope, but as it's built into the architecture, Intel had two choices: disable it and deal with complaints about locked-out features, or leave it in and deal with complaints that it doesn't perform. Personally, I'm glad they chose the latter. If nothing else, it provides a window into the potential of higher-end chips.

Btw that pricing is extremely high.

MSRP: Pretty high, by 20-30 USD. Street pricing? Oh yeah; straight bananas.
 
HTPC card I guess... would rather get a 5600G for that actually
never imagined I would read HTPC card in 2022, what is this? 2010 ?
 
Intel's big mistakes are advertising this as a "gaming card" and the pricing, when it's really a low-end graphics card which is fine for most workloads other than gaming.

But we need some people to "beta test" it, right?

If everyone held out with buying an Arc GPU until 2025, then there would be no Arc GPU in 2025. It's not economically feasible to spend money on developing a product line that no one buys.

If you personally prefer a well-tested product and don't want to risk having issues with something new, that's fine. But you can't expect everyone to share that opinion, unless you want Arc to fail miserably, which I don't see why you would, as you said yourself that it's good to have a third player in the game.

So yes, "beta testing" is worth it. ;)
If I were one of the three GPU makers, I would create a "beta GPU" program, where the public could get their hands on a "locked down" and downclocked QS GPU to get more widespread game testing, either for free or a symbolic fee, and supply these a couple of months ahead of a new GPU architecture release. With hundreds of games (thousands if you count indie games), there is no way any of these three can cover enough test cases.

never imagined I would read HTPC card in 2022, what is this? 2010 ?
So no one is building media center PCs any more?
 
That's also why they're sending out massive staff to YouTubers to give them a cozy time ;)
 
Quite honestly, Intel needs to subsidize first gen in order to create a large (beta) user base, iron out the issues, and go on to move to competitive successors. I don't really get the "buy it even if it's substandard to help competition" argument.

Intel is not some company struggling to keep afloat and keep developing before closing shop. Intel is not 3dfx before rampage.

It's not going bankrupt anytime soon, it's got billions of dollars, cash, at hand.

If they want to succeed, they need to price accordingly. The current pricing on this card is laughable, while the card itself could be an enticing proposition at 110-120 dollars. That's what it should cost, at most, in my opinion.
 
From the conclusion:
"I then tried Intel's newest 101.3220 Beta driver and it magically solved all serious issues. The card would now work and be 100% stable in all games, including Witcher, DoS 2 and RDR2—looks like the driver team is actually fixing bugs, good job! Just to emphasize, as soon as I switched to the beta drivers there was not a single crash during all my testing—a huge improvement."

So not so bad and obviously getting better quite fast.

also from the cons:
  • "Immature drivers and software"
Immature is not so bad, it is just expected at this point in time.
So if we see improvement on the (driver) front on a monthly basis, I'd call it better than expected.
Bad drivers don't just mean instability, bugs, or a software suite not working as intended. Bad drivers also mean unoptimized performance, heavy overhead that drags down fps, and stuttering in old graphics APIs. Especially with ReBAR off, the GPU is a bad experience in many games, with excessive stutters.
 
Thanks for another great review! Seems to mostly support GN's findings from a few weeks ago - roughly RX 6400 performance (though quite variable, and must have ReBAR) at 50% higher power consumption.

One question, @W1zzard: shouldn't the "average fps" and "relative performance" tables be identical, just with different numerical representations? 'Cause at least for 1080p the ordering is different, which seems strange to me.
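On that last question: the two tables need not order identically. A plausible explanation (a sketch of a common review methodology, not necessarily TPU's exact math): "average fps" is an arithmetic mean of raw framerates, which lets high-fps games dominate, while "relative performance" is typically a geometric mean of per-game ratios, which weights every game equally. The two can disagree, as this hypothetical three-game example shows:

```python
from statistics import geometric_mean

# Hypothetical per-game fps for two cards across three games
fps = {
    "Card A": [240.0, 40.0, 50.0],
    "Card B": [200.0, 48.0, 55.0],
}
baseline = fps["Card B"]  # normalize per-game ratios against Card B

# Arithmetic mean of raw fps: the 240 fps game dominates
avg_fps = {card: sum(v) / len(v) for card, v in fps.items()}

# Geometric mean of per-game ratios: every game counts equally
rel_perf = {card: geometric_mean([a / b for a, b in zip(v, baseline)])
            for card, v in fps.items()}

# Card A wins the raw average yet loses the relative score,
# so the two tables can legitimately rank cards differently.
print(avg_fps)
print(rel_perf)
```

Here Card A averages 110 fps against Card B's 101, but its geometric-mean ratio lands below 1.0 because it loses two of the three games.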
 