
Intel Arc A770

Honestly, if this came out 12 months ago it might have made an impression. The A770 is slower than a 2080! And we'll soon be comparing it to RDNA3 and Lovelace. Competition is good, but it'll only make a dent at the lower end. I hope drivers can eke out another 10-20% performance and that power consumption can be lowered. I still wonder if they will go all in on developing Battlemage, though, as that needs to be a massive upgrade, at least double the performance, because it'll be facing Blackwell and RDNA4.
 
This card would have been a great buy when the pandemic was happening; post-pandemic,

not so much. Still a nice try, though.
 
Honestly, if this came out 12 months ago it might have made an impression. The A770 is slower than a 2080! And we'll soon be comparing it to RDNA3 and Lovelace. Competition is good, but it'll only make a dent at the lower end. I hope drivers can eke out another 10-20% performance and that power consumption can be lowered. I still wonder if they will go all in on developing Battlemage, though, as that needs to be a massive upgrade, at least double the performance, because it'll be facing Blackwell and RDNA4.
For me, this kind of performance is absolutely fine. The only thing I don't like so far is the idle power consumption. I hope it'll be fixed with driver updates.
 
Interesting...

At 3440x1440 (UWQHD, which I run at):

Riftbreaker:

[attached benchmark chart]


This site runs 1080p/1440p/3440x1440/4K across 20 games and averages the results to get a ranking. Using that method, the A770 is 14.6% faster than the 3060 and ~3.3% faster than the 6650 XT:

[attached ranking chart]
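For what it's worth, when sites average relative performance across many games, a geometric mean is the usual choice, since one outlier title can't skew it the way an arithmetic mean can. A minimal sketch of that kind of averaging, with made-up per-game ratios rather than the site's actual data:

```python
from math import prod

# Relative performance of the A770 vs. a baseline card in each game
# (made-up numbers, purely illustrative).
relative_perf = [1.22, 0.95, 1.10, 1.31, 1.04]

# Geometric mean: the n-th root of the product of the per-game ratios.
geo_mean = prod(relative_perf) ** (1 / len(relative_perf))
print(f"A770 averages {geo_mean:.1%} of the baseline card")  # ~111.7%
```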
 
Too little, too late, imo.

It's really going to show when the next-gen GPUs from Nvidia and AMD come out, probably in a couple of months.
 
Too little, too late, imo.

It's really going to show when the next-gen GPUs from Nvidia and AMD come out, probably in a couple of months.
I am pretty sure Intel is counting on both AMD and Nvidia not releasing any mainstream GPUs this year, to keep prices high.
Otherwise they might have just shoved it under the rug and forgotten it ever existed. I suspect the 40-series announcement has something to do with that.
 
Any info on using the A770 as a GPU aid for Adobe Premiere or After Effects?
Utilizing the AV1 encoder during editing?
Does it have CUDA-like hardware, so I can shift from NV to Intel sometime?
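For what it's worth: Arc has no CUDA; its compute stack is oneAPI/OpenCL, so apps have to support it explicitly rather than through a CUDA shim. For trying the hardware AV1 encoder outside the Adobe apps, ffmpeg exposes it through Quick Sync as av1_qsv (in builds with QSV/oneVPL support). A rough sketch, with placeholder filenames and bitrate:

```python
import subprocess

# Transcode a clip with Intel Arc's hardware AV1 encoder via ffmpeg's
# Quick Sync path. Assumes an ffmpeg build with QSV/oneVPL support on PATH;
# the input/output names and bitrate are placeholders, not recommendations.
cmd = [
    "ffmpeg",
    "-hwaccel", "qsv",       # hardware decode as well, where the codec allows
    "-i", "input.mp4",
    "-c:v", "av1_qsv",       # Arc's AV1 hardware encoder
    "-b:v", "8M",
    "output_av1.mkv",
]
subprocess.run(cmd, check=True)
```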
 
People are yet again proving that they don't want a third player if it doesn't instantly provide them the fastest GPU for the cheapest price. It's probably fascinating from a psychology viewpoint.
Come on...
 
People are yet again proving that they don't want a third player if it doesn't instantly provide them the fastest GPU for the cheapest price. It's probably fascinating from a psychology viewpoint.
Come on...

It's not just that Intel can't provide a faster GPU. They can't even get the drivers to function properly.
 
It's not just that Intel can't provide a faster GPU. They can't even get the drivers to function properly.
TBH, Intel taught many people a good lesson: software isn't simply a matter of throwing money at the screen until drivers come out.
 
It's a bit sad the drivers are in the state they are, but it's not just a driver issue, I guess; the whole architecture is very compute-oriented. As if it was made for mining first. ;-)

And just days ago people were convinced that Intel wouldn't release a card that doesn't work without Resizable BAR...
 
They need to do something about the driver overhead in both D3D9On12 and DX11/12 scenarios.
The performance issues at lower resolutions SCREAM driver overhead.
Thankfully, this should be fixable.

Not sure what they are smoking with that idle power consumption, though.
I'd buy one just to play with it, but they have to fix their driver overhead.

They need to write their own implementation of D3D9On12 and DX11 and figure out how to accelerate the draw calls.
AMD had similar issues with draw-call overhead tanking performance in DX11 due to a poor scheduler implementation.

Thankfully, Intel's architecture looks like it might be flexible enough for a driver-level fix.
D3D9On12 is relatively new.
 
They need to do something about the driver overhead in both D3D9On12 and DX11/12 scenarios.
The performance issues at lower resolutions SCREAM driver overhead.
Thankfully, this should be fixable.

Not sure what they are smoking with that idle power consumption, though.
I'd buy one just to play with it, but they have to fix their driver overhead.

They need to write their own implementation of D3D9On12 and DX11 and figure out how to accelerate the draw calls.
AMD had similar issues with draw-call overhead tanking performance in DX11 due to a poor scheduler implementation.

Thankfully, Intel's architecture looks like it might be flexible enough for a driver-level fix.
D3D9On12 is relatively new.
The issue for AMD was more than just drivers; it was inherent to the GCN architecture.
It is the same reason AMD never made a GCN gaming card with more than 4096 cores and was not able to compete with Pascal.
One of RDNA's main goals was to address that bottleneck, so I wouldn't be so sure that it is just drivers for Arc.
Software overhead and GPU architecture go hand in hand; it could also be that Arc being so compute-oriented makes it harder to optimize.
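To illustrate why poor low-resolution results point at CPU-side overhead: the driver pays a roughly fixed cost per draw call regardless of resolution, so at 1080p, where the GPU finishes its work quickly, that fixed cost can become the bottleneck, while at 4K the GPU work dominates. A toy model with made-up costs, not a real D3D measurement:

```python
# Toy model: frame time is bounded by whichever of CPU submission or GPU
# raster work is slower (the two run concurrently). All numbers are made up.
PER_CALL_US = 15.0   # assumed fixed driver cost per draw call (CPU side)
PER_PIXEL_NS = 8.0   # assumed GPU shading cost per pixel

def frame_time_ms(draw_calls: int, pixels: int) -> float:
    cpu_us = draw_calls * PER_CALL_US
    gpu_us = pixels * PER_PIXEL_NS / 1000
    return max(cpu_us, gpu_us) / 1000

for name, pixels in [("1080p", 1920 * 1080), ("4K", 3840 * 2160)]:
    print(name, f"{frame_time_ms(3000, pixels):.1f} ms")
# With these numbers, 1080p is CPU-bound (45 ms of draw-call overhead vs.
# ~17 ms of GPU work), while 4K is GPU-bound (~66 ms), so cutting driver
# overhead would only show up at the lower resolution.
```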
 
You mean like how Mantle became Vulkan and heavily influenced D3D12?

Turn down the fanboy please. Every company has a good idea once in a while.
Mantle?! Wtf! Mantle is ~9 years old and still only supports ~180 titles out of tens of thousands. How is that even comparable to NVIDIA's feature prowess, which supports vastly more on top of that?! :kookoo:

People like you need to understand that, like for others here, *today's* OBSERVATION / PERSONAL EXPERIENCE (not going back to the stone age :laugh:) has nothing to do with fanaticism, etc.

Find another term, because that doesn't apply here. :roll:
 
It's not just that Intel can't provide a faster GPU. They can't even get the drivers to function properly.
I still say they should massively scale up Gen with its stable and mature drivers.
 
People are yet again proving that they don't want a third player if it doesn't instantly provide them the fastest GPU for the cheapest price. It's probably fascinating from a psychology viewpoint.
Come on...
It's a shame. They want Nvidia and AMD to give them the best they can for a good price, even though there's proof that it's not gonna happen. Then a third player comes up with something, and it's suddenly not good enough. It's like everybody needed 4K 120 FPS all of a sudden.

When Nvidia came up with the (massively crippled) 3050, people rejoiced, even though it was too expensive for what it is. Now we have a card that can approximate the 2080 in some games at a 2060 price, and it's not good enough.

I don't understand people, really.
 
Intel and power consumption: nothing better than this love story.

Awesome review :) Kudos to TPU.
 
I am very pleased with the ray tracing performance compared to its competitors; it looks great, to be honest. If XeSS gets implemented by devs in current games, we might just see even better results. I am very excited to see where things go post RTX 4000 / RX 7000 and Arc A770.

Nvidia killed my enthusiasm for the GPU space, but Intel kindled something, mmm.
 
First of all, nice review. Even though most people shit on the A770, I kinda like it. It's rough, and there are much better options out there, but it's an OK first step for Intel; I really hope they continue.
Second, the page here showing the A770 (Intel Arc A770 Specs | TechPowerUp GPU Database) is kind of misleading and wrong; it shows the A770 being better than the 3070 Ti???
 
Idle consumption can't really be lower with that 600 MHz GPU clock. It should be at 300 MHz or lower to reduce power...
I'm still hoping that they'll fix it in newer drivers. The latest driver is still in beta. Their iGPUs run at 350 MHz with basically zero power consumed when idle. I see no reason why the A770 can't do the same.
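The reason the idle clock matters so much is the textbook first-order model for CMOS switching power, P ≈ C·V²·f: a lower clock usually also permits a lower voltage, so the savings compound. A back-of-the-envelope sketch with made-up voltages, not Arc's real voltage/frequency curve:

```python
# First-order dynamic power model: P is proportional to C * V^2 * f.
# The voltages below are invented for illustration, not measured Arc values.
def relative_power(f_new: float, f_old: float, v_new: float, v_old: float) -> float:
    return (f_new / f_old) * (v_new / v_old) ** 2

# Hypothetical: idling at 300 MHz / 0.70 V instead of 600 MHz / 0.85 V.
print(f"{relative_power(300, 600, 0.70, 0.85):.0%} of the original dynamic power")
# -> roughly a third, before counting static leakage or the always-on VRAM.
```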
 
Idle consumption can't really be lower with that 600 MHz GPU clock. It should be at 300 MHz or lower to reduce power...

The VRAM is running at 2 GHz no matter the load. That's most probably where the high idle power comes from.
Many other GPUs, mainly from AMD, have a similar problem, but only when using multiple display outputs or high refresh rate displays.
 
The VRAM is running at 2 GHz no matter the load. That's most probably where the high idle power comes from.
Many other GPUs, mainly from AMD, have a similar problem, but only when using multiple display outputs or high refresh rate displays.
The review was done with this testing method:
Idle: Windows 10 sitting at the desktop (2560x1440) with all windows closed and drivers installed.
I'm wondering whether we would see the same with a 1080p desktop. I know it shouldn't happen at 1440p either; I'm just curious.
 