
Intel Arc A770

Well, I'm not in the market for a mid-range card right now, but it's good to have more competition. Now all we need is for NVIDIA to release a desktop CPU and we'd have three-way competition for CPUs as well.

All this talk of graphics brings up something I wish they would branch into, and that's APUs. AMD once had awesome (for the time) APUs and it seems they're bringing them back (to an extent). I know there was talk with DX12 that you could one day combine the power of integrated graphics with discrete graphics cards, but I haven't heard much about that in years. It would be cool if Intel brought their iGPUs up to match AMD's, and there was then a way to combine the power of both integrated and discrete.
 
Lots of transistors and power
 
The A770 actually came out better than I had expected. It's a solid 1440p60 card with good RT performance. The 16GB of VRAM will make it relevant going forward, and AV1 encode is a boon.

That said, it has way too many disadvantages for me. As a person for whom efficiency is far more important than raw performance, I find the power figures completely unacceptable: 44 W idle, 50 W during video playback and 211 W at 60 Hz V-synced is simply atrocious. My GPU spends most of its time either idling or playing back videos, and it uses 4 W and 10 W respectively while doing so. The fact that Arc requires ReBAR and Windows 10 makes it useless as an upgrade option. The final straw is the lack of native DX9 support and subpar performance in DX11, further limiting its appeal.
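(For anyone unsure whether their own system even has ReBAR enabled: on Linux you can read the GPU's PCI BAR sizes from sysfs, and below is a rough Python sketch of that - the PCI address 0000:03:00.0 is just a placeholder, find yours with lspci. On Windows, GPU-Z simply reports the Resizable BAR status.)

```
# Rough sketch: check whether Resizable BAR appears to be active on Linux
# by reading the GPU's PCI BAR sizes from sysfs.
# The PCI address below is a placeholder - find yours with `lspci | grep VGA`.
from pathlib import Path

PCI_ADDR = "0000:03:00.0"  # example address, adjust for your system

def bar_sizes(pci_addr: str) -> list[int]:
    """Parse /sys/bus/pci/devices/<addr>/resource into BAR sizes in bytes."""
    sizes = []
    resource = Path(f"/sys/bus/pci/devices/{pci_addr}/resource")
    for line in resource.read_text().splitlines():
        start, end, _flags = (int(field, 16) for field in line.split())
        if end > start:
            sizes.append(end - start + 1)
    return sizes

if __name__ == "__main__":
    for size in bar_sizes(PCI_ADDR):
        print(f"BAR: {size / 2**30:.2f} GiB")
    # With ReBAR active, one BAR should roughly match the card's full VRAM
    # (e.g. ~16 GiB on an A770); without it you typically see only 256 MiB.
```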

And then there's the price. For what it offers, $250 would be fair. Here in Europe the RX6600 starts at $215, and the RX6600XT at $280 sans VAT. I'm glad Intel entered the dGPU market, but I see no way it can influence it with its current pricing.
 
I'd like to see a top end Arc card which I hope shows up in time.
 
What happened here? An anomaly, or did those cards really choke completely on this game/load?
VRAM

Frametimes look very good, I had expected a massacre there, nothing of the sort
Yeah nothing noteworthy there. Actually they look a tiny bit better sometimes than competitors, but again, not enough to mention imo
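For anyone curious how frametime plots get boiled down into the usual numbers, here's a minimal sketch that turns a frametime log into average FPS and 1% lows. It assumes a PresentMon-style CSV with an MsBetweenPresents column - adjust the column name for whatever capture tool you use, and note that sites define "1% low" slightly differently.

```
# Minimal sketch: average FPS and "1% low" FPS from a frametime capture.
# Assumes a PresentMon-style CSV with an "MsBetweenPresents" column;
# adjust the column name for your capture tool of choice.
import csv
import statistics

def fps_stats(csv_path: str) -> tuple[float, float]:
    with open(csv_path, newline="") as f:
        frametimes_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    # One common definition of "1% low": the FPS equivalent of the
    # 99th-percentile (i.e. slowest 1% of) frametimes.
    slow_frame = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99)]
    return avg_fps, 1000.0 / slow_frame

if __name__ == "__main__":
    avg, low = fps_stats("capture.csv")
    print(f"Average: {avg:.1f} FPS, 1% low: {low:.1f} FPS")
```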
 
I do believe the Arc A770 page (https://www.techpowerup.com/gpu-specs/arc-a770.c3914) in the GPU Database needs to be updated. Now that TechPowerUp's review has been released, it's clear the estimated relative performance positioning was indeed too hopeful :laugh:


[Attachment: ARC A770.PNG]
 
Colour me impressed with the RT performance. Intel hit the ground running there.

Still, there seems to be a lot of variance in performance between games. The Borderlands and Divinity results are kind of weird: the same performance at 1080p and 1440p?
 
@W1zzard

Was the card tested with DX 11 in Borderlands 3 instead of 12? I'm watching the review from JayzTwoCents now, and he got completely different numbers with DX 12:
[Attachment: BL3Arc.jpg]


I know it was once mentioned in a TPU BL 3 test that DX 11 is preferred because of early DX 12 build issues, but as someone who has played this game on and off since launch day, I've tried it a few times with different cards.
DX 12 used to be in beta, but after a certain update it's no longer labelled as such. It depends on the card, but for the most part DX 12 is the better one to use imo.
 
Was the card tested with DX 11 instead of 12? I'm watching the review from JayzTwoCents now, and he got completely different numbers with DX 12:
As indicated, I'm using DX11 because DX12 was pretty much unusable the last time I checked due to constant shader recompiles on startup (that was a year or so after release).
Is this fixed now, and is DX12 a first-class citizen in BL3? Which API does the majority of players use?
 
As indicated, I'm using DX11 because DX12 was pretty much unusable the last time I checked due to constant shader recompiles on startup (that was a year or so after release).
Is this fixed now, and is DX12 a first-class citizen in BL3? Which API does the majority of players use?

I can't speak for the player base, but it was indeed patched some time ago and it's out of that early stage. From what I've heard, it even defaults to DX 12 with AMD cards now, and Wonderlands, which uses the same engine, also defaults to 12.

On my system DX 12 is much better than 11 with the current game version, so I switched to that.
It does use up a lot more VRAM/memory though, but that shouldn't be an issue here.
 
That's why I'm testing Days Gone. Same engine, proper DX12 implementation

Yeah, to be honest BL 3 was never a well-optimized title; I was just wondering what happened there with those numbers, and yes, Days Gone is well made considering it's UE 4.
I guess if you have nothing better to do you could give it a quick check. That long shader-compile load on startup is gone now, at least I didn't see it happen when I switched to 12 recently.
 
Thanks for the review. :)

Now I'm sure: if AIB cards come out with idle fan-stop, I'll buy one. I'm specifically hunting for an Asus TUF to match the rest of my build. I'm just not sure if I want the 8 or 16 GB version. I'm on 1080p, so I don't need the extra VRAM now, but I might later.
 
So how can we buy one of the Intel FE cards? Is this going through traditional retailers?
 
I currently have very little faith in Intel's driver team; everything points to problems (maybe even shady ones) in that area.
 
Other sites have done some OpenCL and Blender benchmarks, and it seems like the A770 is pretty good in compute tests.

I'm cautiously optimistic here.
 
Ray tracing, yes, but where is the shader, TMU and ROP performance? It looks like 256 TMUs and 128 ROPs aren't helping much; I wonder how good the occlusion culling is on these cards.
 
What the hell is happening with idle power consumption? It looks like a bug similar to one I had with my 5700 XT - you could try a different monitor in the review, or check whether the VRAM clocks were elevated, because that looks like the same symptom: increased power consumption.
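(A rough way to check that yourself on Linux with an AMD card like the 5700 XT: amdgpu exposes the memory clock states in sysfs, so a small sketch like the one below shows whether the VRAM ever drops to its low-power state at idle - the card0 index is an assumption, adjust it for your system. On Windows, GPU-Z's sensors tab shows the memory clock directly.)

```
# Rough sketch: watch the VRAM clock state at idle on Linux with an AMD card.
# amdgpu exposes pp_dpm_mclk in sysfs; the line ending in "*" is the active
# memory clock state. "card0" is an assumption - adjust for your system.
import time
from pathlib import Path

MCLK_FILE = Path("/sys/class/drm/card0/device/pp_dpm_mclk")  # amdgpu-specific

def current_mclk() -> str:
    for line in MCLK_FILE.read_text().splitlines():
        if line.strip().endswith("*"):
            return line.strip()
    return "unknown"

if __name__ == "__main__":
    # Leave the desktop idle and watch whether the memory clock ever drops
    # to its lowest state; VRAM stuck at full clock usually means high idle power.
    for _ in range(10):
        print(current_mclk())
        time.sleep(5)
```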
 
I wonder why there aren't any screams from users about nobody using any of Intel/AMD's industry game-changers?! :laugh:
You mean like how Mantle became Vulkan, and heavily influenced D3D12?

Turn down the fanboyism, please. Every company has a good idea once in a while.
 
So a Vega 64 but slightly better lol (well with AI cores)
 
So a Vega 64 but slightly better lol (well with AI cores)
Or to look at it another way: if you only played Metro Exodus, it's RTX 3070 Ti performance for a little more than the price of an RTX 3050. More seriously, that game probably gives an idea of what these GPUs are capable of, which Intel may or may not manage to achieve in other titles too.
 
As a less tech-know-how-heavy performance enthusiast... I only look at 1080p/1440p game performance, power consumption, thermals and noise levels (and a few features which I won't bore you with). IMO, Intel's done a great job of getting their feet wet (again) in the GPU dome of riches, where two gladiators have stood the test of time for too long and are now turning on consumers, wreaking havoc to fill their pockets.

ARC: Looking at the performance charts, yep, a little pricey for my liking too, and the ReBAR limitation kinda sucks (for system oldies). But I'm sure the price isn't set in stone, and seeing as it's Intel's first attempt, forthcoming updates/optimizations should rally up some much-deserved added performance (maybe even lifting the non-ReBAR woes to a finer degree of acceptance). Bottom line, these cards are competing at the highest level if we account for the majority of GPU sales, which sit in the ~$350-ish plenty-of-perf-fuelled-visual-candy/mid-tier space. If Intel can somehow tighten up that wider performance hit on non-ReBAR systems and offer better value going forward (slashing prices), we've definitely got something well established already without having to pin our hopes on next gen.

I would love to see these cards reviewed again at a later time, assuming Intel is already working at full pace to roll out updates.

The funny thing is, I'm not even in the market for a mid-tier card... my current one competes at the 3070 Ti/RX 6800 level. Whether it's Intel or Bob in his backyard, a new kid on the block only makes things that much more interesting... [hopefully] even a tiny share of the market for now is enough to eventually put holes in NVIDIA's fine timber boat, which is costing the consumer a submarine plus annual crew wages. Yeah, that's right, I'm all in for the DISRUPTION of waves and would like to see (one day) the big boys squeal in ripples - fat chance, but there you have it! (...meanwhile I'll end up grabbing a 40-series or RDNA3 at silly prices... so don't mind my non-conformist eruption and weak-ass hypnotised fattened-up-beyond-reasonable-enthusiasm consumerist hypocrisy)
 
Driver quality is what's killing it.
The hardware is mostly OK; the software needs a lot of improvement.
If they don't drastically improve driver efficiency and also make it scale well across CPU threads, the problem (the percentage of performance they lose to driver inefficiency) will be even bigger with Battlemage.
Raja didn't even seem that optimistic to me!
This is the first time I've watched him in a video interview; it seems he gave mostly honest answers and didn't sugarcoat the whole situation!
[Attachment: INTEL-ARC-ALCHEMIST-3.jpg]


 
What the hell is happening with idle power consumption? It looks like a bug similar to one I had with my 5700 XT - you could try a different monitor in the review, or check whether the VRAM clocks were elevated, because that looks like the same symptom: increased power consumption.
I'd like to know that too. I'm wondering if we can expect some improvement with newer drivers.
 