
Intel Arc A770

For where they're priced, it's not bad from a customer standpoint; from Intel's standpoint it's a complete fail.
[Attached screenshot: 1664987118296.png]

Needing twice the memory bus width and transistor count to achieve the same performance is absolutely terrible.
I would like nothing more than for them to stick around in the GPU space, but if they don't make huuuuge improvements with the next gen (if we ever get another one), this is unsustainable.
 
Someone please help me.

I play GTA V at 1920x1080 and MSFS 2020 at 3840x1080. Those are really the only two games I care about; if the rest work at least OK, that's good enough for me. My monitors are only 60 Hz and I already play GTA just fine with the settings maxed out, but I can't manage 60 FPS in MSFS even at medium with my RX 6600.

I don’t see benchmarks for these two games.
Can someone please do a comparison of the A770 16GB to the RX 6600 8GB?

I want to know if it’s worth upgrading. Thank you.
 
That is your opinion. One could also say that Nvidia focused on and influenced the market into thinking that "rasterization" was no longer enough by pumping ray tracing, but then needed to develop DLSS to counteract the performance hit. Yes, AMD had to respond, but FSR is much more vendor-friendly than DLSS. XeSS seems to be at least as good as both. Are you telling me that combining those technologies would not benefit the user?
...and that's your "opinion" too. :roll:

See how that works on the other side of that fence? The facts have also shown that everyone who screams "this needs to be standard for everyone" about DLSS, NVIDIA Broadcast, or just ANY industry-leading NVIDIA tech is usually a user with a competitor's hardware or a relic NVIDIA GPU. I wonder why there aren't the same screams about not using any of Intel/AMD's industry game-changers?! :laugh:

Interesting! /s
 
I don’t see benchmarks for these two games.
Can someone please do a comparison of the A770 16GB to the RX 6600 8GB?

I want to know if it’s worth upgrading. Thank you.
GTA V is a DX11 game, so it's going to run much worse on the A770. Also, what kind of an upgrade is this? Same generation, same price tier, and (mostly) the same performance tier. Wait for RDNA3 and get a next-gen graphics card in mid-2023.
 
Nice review! Expected results, and somehow this reminded me of first-gen Ryzen CPUs at launch, offering reasonable price and reasonable performance.

When Ryzen first launched, AMD was offering an 8-core for the same price as Intel's 4-core. Suffice it to say, it won big time in multi-threaded benchmarks.

I don't feel like this GPU is providing the same level of value.
 
GTA V is a DX11 game, it's going to be much worse on A770. Also what kind of an upgrade is this, same generation, same price tier and performance tier (mostly). Wait for RDNA3 and get a next-gen graphics card in the middle of 2023.
What do you think about MSFS at dual 1080P? I need an improvement there which would be my main goal.
 
I missed this initially, it should explain why idle power consumption is so high.

View attachment 264306
I wonder if the GPU is actually designed to handle variable memory clock/voltage and therefore if just driver updates would be able to eventually bring lower idle power consumption. What a bummer.
It seems they designed the GPU with a fixed memory clock in mind, which is why there's no memory OC either. On all hardware that I'm aware of this is purely a firmware/software limitation. The biggest issue is changing the frequency without things breaking during the switch, so it's not trivial. I confirmed with them that they will look into memory OC for next-gen.

I disagree with W1zzard's conclusion though: $350 for a card whose performance is tied with a 3060 Ti sounds like a bargain. I feel like the overall result showing the 3060 Ti 10% faster is probably down to drivers, because in other benchmarks you can clearly see the A770 matching it or even coming out ahead. But even though I disagree with the conclusion, thank you W1zzard for the review!
Good. Read my reviews, look at the data, come to your own conclusions; don't parrot the opinion someone else feeds you.

I agree with the other users here requesting that TPU review this card again after some time, once better drivers come out; with these fresh first-release drivers it's pretty certain we're not seeing the A770's full performance potential.
The A770 will be part of my test group for the foreseeable future, and I update that with new drivers and new games every few months. Whether I'll make a separate article depends on what they can achieve in terms of product improvements.
 
What do you think about MSFS at dual 1080P? I need an improvement there which would be my main goal.
If you need an improvement now, then the only option is to step up to a higher tier. If you want gaming performance, Intel Arc is the LAST place you would ever want to look. The driver issues pretty much make these cards unbuyable, especially if you're unhappy now.
 
It took one or two respins, which caused the 18-month delay.

I wouldn't say it's a bad performer; it sits nicely in the middle. But that idle power consumption, and the beta feel of the drivers... I guess if you need a "quick" esports-focused 1080p machine, this is an OK card. But next-gen AMD and Nvidia will make Arc look stupid.
 
What do you think about MSFS at dual 1080P? I need an improvement there which would be my main goal.
MSFS 2020 does support DX12 and (imo) will likely get XeSS at some point, way down the line. It could be better, but unless it is running out of VRAM (unlikely), it will probably not show a large improvement.
 
Not bad for a first attempt; it just needs lots more work on drivers. And the timing... well, both NV and AMD have a new gen coming out in weeks.
 
Intel is losing a lot on each card.
Yeah, it's going to be an acceptable loss (hopefully) to gain marketshare and experience in the GPU business. In terms of die size and transistor count, it's not a lot lower than Navi 21, and it's significantly higher than Navi 22.

You can't compare die size with Ampere as TSMC6 is very different to Samsung 8nm, but in terms of transistor count, VRM complexity, and bus width (which affects PCB costs) - it's definitely costing Intel more than a 3070Ti is costing Nvidia. Don't forget that TSMC6 is a more expensive node than Samsung 8nm too, and I've not even accounted for that.
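To put very rough numbers on the die-cost point, here's a back-of-the-envelope Python sketch. The die sizes are the published figures for each chip; the per-wafer prices are purely hypothetical round numbers (real foundry pricing is under NDA), so treat the output as illustrative only.

```python
import math

# First-order dies-per-wafer estimate (ignores defect yield and scribe lines).
WAFER_MM = 300  # standard wafer diameter

def dies_per_wafer(die_area_mm2: float) -> int:
    """Classic approximation: wafer area / die area, minus edge losses."""
    r = WAFER_MM / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * WAFER_MM / math.sqrt(2 * die_area_mm2))

chips = {
    # (die area mm^2 [public figure], assumed $/wafer [HYPOTHETICAL])
    "ACM-G10 (A770, TSMC N6)":     (406, 10000),
    "GA104 (3070 Ti, Samsung 8N)": (392, 6000),
}
for name, (area, wafer_cost) in chips.items():
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} dies/wafer, ~${wafer_cost / n:.0f}/die")
```

Even with made-up wafer prices, the shape of the argument holds: a similar-size die on a pricier node costs meaningfully more per chip.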
 
They're not bad, but they're not that good, either.

Some ups and downs with performance consistency, but as others have mentioned, the power consumption is a bit of a glaring issue and hopefully they can get that resolved.

Here's hoping they can fix issues with future drivers and future builds. I'll be patiently waiting and hoping we keep a third party in this GPU race.
 
Looks like there's very good potential for competitive performance/power ratio when we look at properly optimized games like Metro Exodus with raytracing.
A year from now, we could be looking at the A770 competing with the 3070 and 3070 Ti all around, which is pretty great considering Intel is a newcomer.


For where they're priced, it's not bad from a customer standpoint; from Intel's standpoint it's a complete fail.
View attachment 264313
Needing twice the memory bus width and transistor count to achieve the same performance is absolutely terrible.
I would like nothing more than for them to stick around in the GPU space, but if they don't make huuuuge improvements with the next gen (if we ever get another one), this is unsustainable.
Why would you compare transistor counts between a fully enabled Navi 23 and a cut-down G10? The A750 isn't using a bunch of those transistors anyway.
A better comparison would be GA104 with its 17B transistors, because that way we're comparing GPUs with similar hardware capabilities (dedicated tensor cores, larger RT units with more accelerated stages, etc.). In this case Intel spends a bunch more transistors on the big 16MB L2 cache inside the chip, whereas Nvidia depends on more expensive GDDR6X outside the chip.
The only obvious loss on Intel's side is die size (transistor density), but that just goes to show how much room Intel has to grow.
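As a quick sanity check on the perf-per-transistor angle, here's a tiny Python sketch. The transistor counts are the published figures for the fully enabled chips; the relative-performance numbers are my rough reading of the review's summary charts (A770 normalized to 100) and are assumptions, not exact data.

```python
# Approximate perf per billion transistors, using published transistor counts
# and ROUGH relative-performance numbers (assumptions, A770 = 100).
transistors_b = {"ACM-G10": 21.7, "GA104": 17.4, "Navi 23": 11.1}
rel_perf      = {"ACM-G10": 100,  "GA104": 110,  "Navi 23": 95}

for chip in transistors_b:
    eff = rel_perf[chip] / transistors_b[chip]
    print(f"{chip}: {eff:.1f} perf points per billion transistors")
```

Under these assumptions Navi 23 still comes out well ahead on perf per transistor, though the gap to GA104 is smaller than the gap to Navi 23, which is the point about comparing like-for-like feature sets.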
 
I asked in the other thread, and apparently the GPU is very good at playing the 234 or so DX12 games out there, but severely gimped in the many thousands of DX11-or-older games... I'm in the market for a low-end GPU upgrade, but this seems far too variable vs the RX 6600 / RTX 3060 for playing a mix of old and new games...
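If you want a rough idea of which D3D runtimes a given game binary might use, one crude trick is to scan the executable for the Direct3D DLL name strings. This is only an illustrative sketch: many engines load the runtime indirectly or pick the API via a launcher or config file, so a miss here proves nothing.

```python
# Crude heuristic: look for Direct3D DLL name strings inside a game binary.
# PE import names are stored as ASCII, but casing varies, so compare lowercased.
from pathlib import Path

def d3d_versions(exe_path: str) -> set[str]:
    """Return the set of D3D runtime DLL names found in the binary."""
    data = Path(exe_path).read_bytes().lower()
    return {name.decode()
            for name in (b"d3d9.dll", b"d3d11.dll", b"d3d12.dll")
            if name in data}
```

For example, running this on GTA V's executable should turn up `d3d11.dll`, which is the case that matters for Arc's DX11 weakness.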

Overall the card itself is better than I expected for a first real try, plus I really like the design; RT is also surprisingly decent.

But since I'm also a variety gamer who plays both new and older games, not knowing what it's gonna do when I start up whatever game would bother me a lot. (That, and I do have 1k+ hours in BL3...)
I bought a second-hand RTX 3060 Ti about 3 weeks ago and so far I'm very satisfied with it. I did consider waiting for Intel's offerings, but honestly I got tired of waiting, and prices are always questionable in my country anyway.

How these cards will age vs the 3060 Ti, only time will tell, though I kinda expect the 770 16GB model to end up the better card in the long run. (I intend to keep this 3060 Ti for a long time unless it dies on me.)
 
If you need an improvement now, then the only option is to step up to a higher tier. If you want gaming performance, Intel Arc is the LAST place you would ever want to look. The driver issues pretty much make these cards unbuyable, especially if you're unhappy now.

More than anything I'm concerned I'll have the same issue as I did with the ASRock Arc A380 where it didn't output to 3/4 monitors I tried it on across 3 different builds/PCs. And the one I finally got it to post/boot on, after installing the driver and rebooting, stopped working as well. So I sent it back for RMA. My RX6600 isn't flawless, there are some driver issues (GTA crashes if I load into Online directly, for it to work I have to load into Story Mode then switch to Online) and also I can't play some games full screen on monitor one with YouTube playing on the second monitor because it'll lock up occasionally. Not hardware issues, definitely AMD driver issues, and I've tried everything to resolve it. I would upgrade to an RX6700XT but I'm leery of having the same issues. That's why the A770 might be a worthy gamble.

Edit: also, Eyefinity is broken, there is no way to compensate for bezels like on NVIDIA Surround. There is a way to get to the old "Eyefinity Pro" control panel to try to do it but while it opens it's totally broken. Does Intel Arc have this?
 
The lesson to be learned from this is that brute force is not going to cut it.
I like that Intel was very humble the whole way and didn't exaggerate expectations; with proper drivers the A770 can equal the RTX 3070 / RX 6700 XT, maybe in a year or two.
 
I won't buy the card, but I hope people will. It would be nice to have another player in the market driving competition.
 
What do you think about MSFS at dual 1080P? I need an improvement there which would be my main goal.
MSFS is Microsoft Flight Simulator, right?
NVIDIA showed this game as an example of the 40-series cards' better performance with DLSS 3. If you love the game, why on earth would you pick a different brand, and something without their technology? Just wait a few months, see whether the 4060 ends up with the recently leaked specs, and then read the reviews to make a good decision.
Intel just joined the desktop GPU market and their drivers are a mess. Prices are also still high. Wait!
 
More than anything I'm concerned I'll have the same issue as I did with the ASRock Arc A380 where it didn't output to 3/4 monitors I tried it on across 3 different builds/PCs. And the one I finally got it to post/boot on, after installing the driver and rebooting, stopped working as well. So I sent it back for RMA. My RX6600 isn't flawless, there are some driver issues (GTA crashes if I load into Online directly, for it to work I have to load into Story Mode then switch to Online) and also I can't play some games full screen on monitor one with YouTube playing on the second monitor because it'll lock up occasionally. Not hardware issues, definitely AMD driver issues, and I've tried everything to resolve it. I would upgrade to an RX6700XT but I'm leery of having the same issues. That's why the A770 might be a worthy gamble.

Edit: also, Eyefinity is broken, there is no way to compensate for bezels like on NVIDIA Surround. There is a way to get to the old "Eyefinity Pro" control panel to try to do it but while it opens it's totally broken. Does Intel Arc have this?
So if you keep having problems, why aren't you just buying something like an RTX 3070? Seems weird to go back to the same platform and expect different outcomes.
 
It seems they designed the GPU with a fixed memory clock in mind, which is why there's no memory OC either. On all hardware that I'm aware of this is purely a firmware/software limitation. The biggest issue is changing the frequency without things breaking during the switch, so it's not trivial. I confirmed with them that they will look into memory OC for next-gen.

With that, did they also imply that automatic idle memory clock adjustment, just for the purpose of lowering idle power consumption, is not expected to be implemented for the current gen? I imagine that real-time memory frequency switching upon user intervention may pose problems, but perhaps things could be simpler with only two frequencies (high/low) engaged in a predictable manner by the driver/firmware.

A potential workaround for the high idle power consumption is using an iGPU as a primary adapter in a hybrid graphics configuration, but whether this actually works depends on how well the drivers play with this function and if the discrete GPU (i.e. the A770) supports getting completely switched off and turned on seamlessly. Furthermore, it introduces additional inefficiencies that lower GPU performance and can bring software compatibility issues, at least on Windows.
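Just to illustrate the kind of two-point policy I mean, here's a toy Python model: the driver drops to a low memory clock only after a sustained idle window and snaps back to the high clock as soon as work shows up. The clock values and the idle threshold are made up for the example, and the genuinely hard part (retraining GDDR6 mid-operation without visual glitches) isn't modeled at all.

```python
# Toy two-point memory clock policy with idle hysteresis.
IDLE_MHZ, LOAD_MHZ = 200, 2187   # hypothetical low/high memory clocks
IDLE_FRAMES = 60                  # frames with no GPU work before downclocking

def memory_clock(load_history: list[bool]) -> int:
    """Return the memory clock after observing a sequence of per-frame
    busy flags (True = GPU had work that frame)."""
    idle_run = 0
    clock = LOAD_MHZ
    for busy in load_history:
        idle_run = 0 if busy else idle_run + 1
        # Downclock only after a full idle window; upclock immediately on load.
        clock = IDLE_MHZ if idle_run >= IDLE_FRAMES else LOAD_MHZ
    return clock
```

The hysteresis window is what makes the switching "predictable": the driver only ever retrains the memory at a known-quiet moment, never in the middle of a burst of work.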
 
[Attached screenshot: 1664995393229.png]

@W1zzard what happened here? An anomaly, or did those cards really choke completely on this game/load?

Is this that famous memory wall they run into? They're all 8GB cards, and apparently Intel can do a whole lot more with 8GB. That idea is supported by the fact that the 8GB RTX 3070 Ti is also at rock bottom there, while the 12GB 3060 is fighting Arc. This is a very interesting result if so.

Overall, great review as usual, and I have a much better feeling about Arc altogether now. Frametimes look very good, I had expected a massacre there, nothing of the sort, efficiency on par with Turing top end is also impressive. Looks like Raja did quite a few things right, too, damn.
 
Yeah, it's going to be an acceptable loss (hopefully) to gain marketshare and experience in the GPU business. In terms of die size and transistor count, it's not a lot lower than Navi 21, and it's significantly higher than Navi 22.

It was certainly designed to compete at the 3070 level, like the old rumours said. It just... couldn't really
 
I like that Intel was very humble the whole way
That's definitely a saving grace, but everybody knows first impressions matter a lot. The pessimist could conclude they're buying goodwill for the future :)

Not bad for a first attempt; it just needs lots more work on drivers. And the timing... well, both NV and AMD have a new gen coming out in weeks.
Well, Nvidia for sure isn't competing on Intel's level for a while to come... not with Ada, at least.
 