
Intel Arc A770

Meanwhile, ole lying / hateful conspiracy theorists, TweakTown, etc., are all mad and extremely salty right now that Intel refused to send them a review sample.

:roll:
Is the conspiracy in the room with you right now?

The most important thing in this launch is to see that the drivers improved, a lot.
Keep up improvements like this, and by the time the second-gen Arc arrives, Intel will have a really worthy product to offer (subject to cost, of course).
If they can get around their ReBAR requirement and can make a low-profile card that is faster than a 1650 Super, I'd jump on it. The complete and utter lack of an upgrade path is annoying AF.
 
What this proves is that we definitely need a standard for Ray Tracing and FSR, DLSS, XeSS as soon as possible.
 
Better than I was expecting. Hopefully this is the footing they need to push into the discrete GPU market. Looking forward to seeing their next line of GPUs.
 
About what I expected... If only they had more Xe cores (like 36-40) along with a less "max power" approach.
 
They'd sell as many of these as they could make if they could have released 6 months ago. But with current pricing I think most people would rather spend a little less for a 6600/6650, or spend a little more for a 6700, which you can find for $380-$390 now.
 
I missed this initially, it should explain why idle power consumption is so high.

[attached screenshot]

I wonder if the GPU is actually designed to handle variable memory clock/voltage and therefore if just driver updates would be able to eventually bring lower idle power consumption. What a bummer.
 
To all those bashing the drivers: try the RX 5700 XT launch drivers, that was FUN.
 
I missed this initially, it should explain why idle power consumption is so high.


I wonder if the GPU is actually designed to handle variable memory clock/voltage and therefore if just driver updates would be able to eventually bring lower idle power consumption. What a bummer.
Looks like a bandaid fix to some issue they had considering the state of the drivers. Not that it helped much.
 
What this proves is that we definitely need a standard for Ray Tracing and FSR, DLSS, XeSS as soon as possible.
Won't happen, and it doesn't need to happen, because it would severely limit innovation.

We have XeSS, FSR, accelerated RT cores, etc. now ONLY because NVIDIA is LEADING the market (not jealous of others' innovations), forcing its rivals to do better, catch up, and stop being lazy and/or inferior.
 
  • Decent midrange performance
  • Reasonable pricing
  • Support for DirectX 12 and hardware-accelerated ray tracing
  • Better RT performance than AMD, slightly worse than NVIDIA
  • Beautiful design
  • 16 GB VRAM
  • Backplate included
  • XeSS upscaling technology
  • Support for HDMI 2.1 & DisplayPort 2.0
  • Support for AV1 hardware encode and decode
  • 6 nanometer production process

Might we add “Dual-slot card” as a Pro? Especially in light of the absolute behemoths that are the 40-series.

B2T:
Nice improvements to the drivers since the introduction of the Arc A380, and very promising performance (raster and ray tracing) for a first generation, even though it was delayed quite a bit.
The price could be better too; $275-$300 would have been excellent, $350 is too much for what it has to offer at the moment.

Battlemage might actually take a real stab at the enthusiast level, and we will have 3 options throughout the line-up. Exciting prospects ahead!
 
What this proves is that we definitely need a standard for Ray Tracing and FSR, DLSS, XeSS as soon as possible.

Intel has said they are working with Microsoft to create an API for that, where you can plug in any implementation that works on your GPU and choose it at runtime. I sure hope it comes sooner rather than later, because game developers are now wasting a ton of resources implementing all three of them.

When it's done, you just go to the game settings, choose DLSS/FSR/XeSS and the amount of sharpening, and you're all set.
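Purely as a sketch of the idea (all names below are hypothetical, not the actual Intel/Microsoft API, which hasn't been published): the engine codes against a single upscaler interface, and the vendor back-end gets picked at runtime from whatever the user selects in the settings menu.

```cpp
// Hypothetical sketch of a vendor-agnostic upscaler interface; none of these
// names come from the actual Intel/Microsoft API.
#include <iostream>
#include <memory>
#include <stdexcept>
#include <string>
#include <utility>

struct UpscaleSettings {
    int   renderWidth  = 1706;   // internal render resolution
    int   renderHeight = 960;
    int   outputWidth  = 2560;   // display (output) resolution
    int   outputHeight = 1440;
    float sharpening   = 0.5f;   // the single slider exposed to the user
};

// The one interface the game engine codes against.
class IUpscaler {
public:
    virtual ~IUpscaler() = default;
    virtual std::string name() const = 0;
    // A real API would take GPU resources (color, depth, motion vectors);
    // this stub just logs what it would do.
    virtual void dispatch(const UpscaleSettings& s) = 0;
};

// Stub back-end standing in for a vendor SDK (DLSS, FSR, or XeSS).
class StubBackend : public IUpscaler {
public:
    explicit StubBackend(std::string vendorName) : vendor_(std::move(vendorName)) {}
    std::string name() const override { return vendor_; }
    void dispatch(const UpscaleSettings& s) override {
        std::cout << vendor_ << ": " << s.renderWidth << "x" << s.renderHeight
                  << " -> " << s.outputWidth << "x" << s.outputHeight
                  << ", sharpening " << s.sharpening << "\n";
    }
private:
    std::string vendor_;
};

// Runtime selection driven by the in-game settings menu.
std::unique_ptr<IUpscaler> createUpscaler(const std::string& choice) {
    if (choice == "DLSS" || choice == "FSR" || choice == "XeSS")
        return std::make_unique<StubBackend>(choice);
    throw std::runtime_error("Unsupported upscaler: " + choice);
}

int main() {
    UpscaleSettings settings;      // filled from the game's options screen
    auto upscaler = createUpscaler("XeSS");
    upscaler->dispatch(settings);  // the engine doesn't care which vendor it got
}
```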
 
I was not expecting the A770 to run faster than a 3060 Ti in several games and benchmarks, and this on first-release drivers. In other games, though, the performance is concerning because it's way lower. What would make sense is that the drivers are holding back the real performance in these games, and hopefully it will be on par with the other benchmarks after some updates.

I disagree with the conclusion from Wizzard, though: $350 for a card whose performance is tied with a 3060 Ti sounds like a bargain. I feel like the overall result showing the 3060 Ti 10% faster is probably due to the drivers, because in other benchmarks you clearly see the A770 match it or even do better. But even if I disagree with the conclusion, thank you Wizzard for the review!

I agree with the other users here requesting that TPU do another review of this card some time from now, when better drivers come out, because with fresh first-release drivers we are pretty surely not seeing the real performance potential of the A770.

I must say I'm pretty tempted to buy one, even knowing that early adopter = beta tester. But if the promises of the card come true and you get 3060 Ti performance with 16 GB of VRAM for $350, then hell yeah!

Who cares if the card is not as power efficient as the others? At this TDP it's not like it's heating up like crazy anyway, and I very much prefer a higher-TDP card with better performance for $350 over the same performance with a better TDP at $450.
 
RTRT performance is actually really good
Yup, pleasantly surprised by that.

They've done a better job than AMD did with RDNA2, though arguably two years after AMD and with a die that is larger and has more transistors than even Navi 22 found in the much faster 6750XT.

Still, it doesn't suck - and that's what we need for healthy competition in DXR titles.
 
Won't happen, and it doesn't need to happen, because it would severely limit innovation.

We have XeSS, FSR, accelerated RT cores, etc. now ONLY because NVIDIA is LEADING the market (not jealous of others' innovations), forcing its rivals to do better, catch up, and stop being lazy and/or inferior.
That is your opinion. One could also say that Nvidia influenced the market into thinking that rasterization was no longer enough by pushing ray tracing, but then needed to develop DLSS to counteract the performance hit. Yes, AMD had to respond, but FSR is much more vendor-agnostic than DLSS. XeSS seems to be at least as good as both. Are you telling me that combining those technologies would not benefit the user?
 
What’s with the abysmal Borderlands performance?!?!

Conclusion, buy an RX 6650 XT. Unless you want a piece of history instead of a graphics card.
But who wouldn’t want a piece of history??? :)
 
That is your opinion. One could also say that Nvidia influenced the market into thinking that rasterization was no longer enough by pushing ray tracing, but then needed to develop DLSS to counteract the performance hit. Yes, AMD had to respond, but FSR is much more vendor-agnostic than DLSS. XeSS seems to be at least as good as both. Are you telling me that combining those technologies would not benefit the user?
My personal opinion is that the Arc team should have worked with the Radeon team to come up with an industry standard for AI (tensor) cores and upscaling, and a revised OpenCL to actually compete with CUDA.

With Nvidia dominating the market like they do, splintering up the alternatives will only hurt both of them.
 
Beautiful paperweight.
More seriously, it is impressive in RT, showing that maybe NVIDIA's RT cores aren't all that impressive.
The card isn't very good: it is similar to a 3070 in size, yet it is miles behind, and Intel is losing a lot on each card.
 
What’s with the abysmal Borderlands performance?!?!
I asked on the other thread, and apparently the GPU is very good at playing the ~234 DX12 games out there, but severely gimped in the many thousands of DX11-or-older games... I'm in the market for a low-end GPU upgrade, but this seems far too variable vs the RX 6600 / RTX 3060 for playing a mix of old and new games...
 
Better than I thought. For their first generation of discrete cards in 25 years, this looks to be a capable card, sitting at around 1080 Ti/2080 levels of performance.
 
A very big negative was forgotten: no DirectX 9 support. So gamers have to rely on emulation for half of their game collection.

This is much better than I expected, well done Intel, I am looking forward to what you will bring next.
So what was your expectation, then?
It's not that long ago that the A770 was supposed to be an RTX 3060 Ti competitor; now, in reality, it's more of an RTX 3060 competitor.

The A770 scales better than AMD and NV as resolution increases. That's a fact.
So what? The cards are barely suitable for 1440p, so how are you going to enjoy that "scaling"? ;)

I was not expecting the A770 to run faster than a 3060 Ti in several games and benchmarks, and this on first-release drivers. In other games, though, the performance is concerning because it's way lower. What would make sense is that the drivers are holding back the real performance in these games, and hopefully it will be on par with the other benchmarks after some updates.
People need to stop making excuses for bad performance. Bugs will be fixed, but the overall performance characteristics are unlikely to change. For there to be a significant change in performance, they would have to make major changes to one of their graphics API implementations, and all the games using that API would be affected accordingly.

I feel like the overall result showing the 3060 Ti 10% faster is probably due to the drivers, because in other benchmarks you clearly see the A770 match it or even do better.
What's the logic here?
The general performance is the average of all the games. There is no reason to conclude the driver is holding it back based on this. It seems like you are looking at outliers and expecting everything to scale the same. This is kind of the point of averaging a large sample set; it eliminates the cherry-picked cases.
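As a toy illustration (made-up numbers, not TPU's actual data or methodology), here is why a couple of strong outliers barely move an average taken over a large game list:

```cpp
// Toy illustration with made-up numbers (not TPU's data or methodology):
// averaging per-game relative performance over many titles dampens outliers.
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    // Relative performance vs a reference card, one ratio per game:
    // mostly ~0.90, with two favorable outliers at 1.20.
    std::vector<double> perGame(25, 0.90);
    perGame[3]  = 1.20;
    perGame[17] = 1.20;

    double mean = std::accumulate(perGame.begin(), perGame.end(), 0.0)
                / static_cast<double>(perGame.size());

    // Two strong outliers out of 25 games lift the average only from 0.90
    // to about 0.92, so the summary still reflects the typical case rather
    // than the best case.
    std::cout << "Average relative performance: " << mean << "\n";
}
```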

I agree with the other users here requesting that TPU do another review of this card some time from now, when better drivers come out, because with fresh first-release drivers we are pretty surely not seeing the real performance potential of the A770.
This happens periodically anyway.
They've had this card in testing for ~6 months; there is no reason why there should be major untapped performance potential here.

I must say I'm pretty tempted to buy one, even knowing that early adopter = beta tester. But if the promises of the card come true and you get 3060 Ti performance with 16 GB of VRAM for $350, then hell yeah!
At this price I would rather have an RTX 3060.
 
I asked on the other thread, and apparently the GPU is very good at playing the ~234 DX12 games out there, but severely gimped in the many thousands of DX11-or-older games... I'm in the market for a low-end GPU upgrade, but this seems far too variable vs the RX 6600 / RTX 3060 for playing a mix of old and new games...
No doubt. Wow. I think my library is probably 90% not DX12.
 
I missed this initially, it should explain why idle power consumption is so high.

[attached screenshot]
I wonder if the GPU is actually designed to handle variable memory clock/voltage and therefore if just driver updates would be able to eventually bring lower idle power consumption. What a bummer.
This is my main concern with the card: 44 W idle power is huge nowadays. I hope it's something they can fix in software. Since you can't OC the memory clock, I'm wondering if there's something preventing stability at different clocks for the VRAM.

I seem to remember some previous-gen AMD or Nvidia cards having problems a while ago with memory clocks not being stable below a certain frequency.
 
Nice review! Expected results, and somehow this reminded me of the first-gen Ryzen CPUs when they launched, offering a reasonable price and reasonable performance.
 
Could you please add a D3D9 performance-per-dollar chart? I'd love to see it be 0 FPS/$.
 