Friday, September 30th 2022

ICYMI: 16GB Arc A770 Priced Just $20 Higher than 8GB; A770 Starts Just $40 Higher Than A750

The Intel Arc 7-series performance-segment graphics cards announced earlier this week are all priced within $60 of each other. The series begins with the Arc A750 at $289. $40 more gets you the Arc A770 Limited Edition 8 GB, at $329. The top-of-the-line Arc A770 Limited Edition 16 GB is priced just $20 higher, at $349. This puts the Intel flagship at least $30 below the cheapest NVIDIA GeForce RTX 3060 available on the market right now, which can be had for $380. The dark horse here is the AMD Radeon RX 6650 XT, which is going for as low as $320.

Intel extensively compared the A770 to the RTX 3060 in its marketing materials, focusing on how its ray tracing performance is superior even to that of NVIDIA RTX in this segment, and on how the Intel XeSS performance enhancement is technologically on par with 2nd-generation super-sampling techs such as FSR 2.0 and DLSS 2. If Intel's performance claims hold, the A770 has the potential to beat both the RTX 3060 and RX 6650 XT in its segment. The Arc A750, A770 8 GB, and A770 16 GB go on sale from October 12. Stay tuned for our reviews.

41 Comments on ICYMI: 16GB Arc A770 Priced Just $20 Higher than 8GB; A770 Starts Just $40 Higher Than A750

#26
AusWolf
Interesting choice. Will anybody buy the A750 and A770 8 GB when you can have the 16 GB model for basically the same price?
#27
Vayra86
AnarchoPrimitiv: Based on history, I have a feeling that Intel will only be cutting into AMD's marketshare and not Nvidia's, and if this occurs, Intel's presence in the GPU market will do absolutely nothing to improve conditions for consumers.

In the past (as far back as 2008), even when AMD has offered GPUs that perform better, and even at a lower price, everyone still buys Nvidia. Say what you want about AdoredTV, but a couple of years ago he did a great three- or four-part video that used market research, sales figures, etc., and demonstrated this very phenomenon, describing it as Nvidia's "mentia", or mindshare. Basically it goes like this: the vast majority of PC hardware consumers are not enthusiasts who compare specs and benchmarks for hours and days prior to making a purchase; just like consumers in any other market, they base their purchasing decisions on a lot of non-empirical, irrational factors. These consumers will therefore be more influenced by the fact that they notice more people own Nvidia than AMD, and that Nvidia has more fans who are more vocal online. And despite it not being true since the 290X, the fans constantly repeat that AMD runs hot and has bad drivers (despite there being no real comprehensive empirical data to back up such a claim), and because they're not the type to do research, they're just going to accept the accusations as fact. Now, in their mind, they've associated Nvidia with the winning side and therefore want to associate themselves with the winning side as well.

These consumers will not look at the benchmarks, see that Intel is performing better in their price tier, and buy an Intel videocard. They will only consider Intel when they see a bunch of other people willing to buy Intel, or when, psychologically speaking, they've come to associate Intel videocards with the "winning side". The same irrational considerations that prevent them from buying an AMD videocard will also prevent them from buying an Intel videocard, and therefore Intel's sales will predominantly come at AMD's expense, because the people most willing to take a chance on an Intel videocard are the ones willing to buy an AMD videocard... most likely because their willingness to do the research and look at benchmarks is probably what brought them to buy an AMD videocard, and would therefore make them also open to the idea of buying an Intel videocard.

Your diehard types who only buy Nvidia and will not even consider AMD (whether by habit or active choice), who I feel make up a large portion of Nvidia's marketshare and the consumer GPU market as a whole, are probably not going to consider Intel either, and are only hoping that Intel's entry into the market will allow them to continue to buy Nvidia, but at a lower price. If Intel stays in the market, then years down the road they may change this, but for the first six months or year, or even Intel's first couple of generations, they'll predominantly take marketshare from AMD, and this will do absolutely nothing to improve the conditions of the GPU market for consumers. As long as Nvidia holds on to 80% marketshare (or probably anything over 50%), they're going to have the monopolistic power to keep their prices high and even maintain the trend of constantly increasing prices every generation (like how the 4080 12GB is basically a 4070, so now xx70-tier cards are priced around $800; and because AMD's shareholders will expect the same profits and profit margin, AMD will follow suit to some degree).

Bottom line, everyone hoping that Intel will correct the course of the GPU market is going to be disappointed because if it happens at all, it won't be happening any time in the immediate future.
Somebody's got blinders on, holy crap... seeing this in 2022 is mind-blowing.

AdoredTV is a clown, and you only confirmed it; the reason he exists is the cult following feeding his ad revenue ;)

AMD still hasn't got feature parity with Nvidia. You can twist that however you like, but 'cheaper and faster' is not the criterion that confirms anything; it's just one of many. AMD had an inferior product that they sold at a lower price. And inferior it has been for a long time, and on the feature front it really still is. Historically we also know that stuff needs to 'just work' for the vast majority; the niche that likes to tweak is just that, a niche, and Nvidia caters to the former group best. Still does.

Ryzen is the practical proof of what I'm saying. It took several generations of 'beating' the competition to catch Intel and actually gain market share. The platform needed to mature, the arch needed refinements. Yes, brand recognition exists and it's only logical: Rome isn't built in a day, and neither is trust that stuff just works. And how is AMD doing on that front today, wrt pricing? Saving us? It once again confirms that everything is more of the same: AMD prices its product to the maximum of what the market can/might bear. So why did they price lower on GPUs back in the day? Let's see... 1+1= ? You're saying that isn't true because AMD 'was better and cheaper' at times. That is some real mental gymnastics!

Now don't get me wrong, I'm absolutely NOT pro Nvidia. I've hated their guts since Turing and haven't liked a single thing they've released since then, except DLSS, but even so I prefer AMD's agnostic FSR implementation to a much greater degree. So I'm 'with AMD' on their approach of less proprietary bullshit. I never once considered spending even a dime on G-Sync, but have a FreeSync monitor on an Nvidia card. But we have to keep calling things what they are. Misplaced favoritism is just that, misplaced. RDNA2 is technically impressive, power efficient, a good architecture. But: it misses features the competition does have. If they add those in RDNA3 and have reasonable price/perf to compete, it's an insta-buy. But at the same time, I stand by what I've said up here about Nvidia vs AMD and the battle they fight. AMD has been cheapskating it and it shows; the moment they got serious, they had competitive product again. Go figure...

As for Intel cutting into AMD's share, you might be right about that for all the reasons mentioned here. Intel similarly has an offering that isn't 'just working' everywhere, anytime, no matter what game you play. Ironically, DX11 performance takes a back seat just like it did for AMD. Feature parity isn't there because half the features don't even work. Trust is zero, or sub-zero. But let's take a long look at Nvidia too; that's definitely not all sunshine and rainbows either lately, most notably on power usage - but that is actually new, and not the usual MO for them. AMD has a real opportunity here to capture share, if RDNA3~5 are great.
#28
r9
I think they'll sell fine when (if) they get released.
I'm not looking to buy a GPU at the moment, but if I were, I'd be really tempted, just to satisfy my curiosity.
The Quim Reaper: I really don't see the point of having both an 8GB & 16GB 770 card with only a $20 price difference... very odd.
They'll probably perform exactly the same, and with the highest price being $350, there is not much room to price them right.
Most people will gladly pay those extra $20 for the 16GB version, with or without an effect on performance, so once those get sold out, people will buy the 8GB version and still not feel bad, because it performs the same in real life. So actually it's priced perfectly. :D
#29
agatong55
Waiting for reviews... want to get an upgrade for my wife's GTX 1650.
#30
r9
Vayra86: Somebody's got blinders on, holy crap... seeing this in 2022 is mind-blowing.

AdoredTV is a clown, and you only confirmed it; the reason he exists is the cult following feeding his ad revenue ;)

AMD still hasn't got feature parity with Nvidia. You can twist that however you like, but 'cheaper and faster' is not the criterion that confirms anything; it's just one of many. AMD had an inferior product that they sold at a lower price. And inferior it has been for a long time, and on the feature front it really still is. Historically we also know that stuff needs to 'just work' for the vast majority; the niche that likes to tweak is just that, a niche, and Nvidia caters to the former group best. Still does.

Ryzen is the practical proof of what I'm saying. It took several generations of 'beating' the competition to catch Intel and actually gain market share. The platform needed to mature, the arch needed refinements. Yes, brand recognition exists and it's only logical: Rome isn't built in a day, and neither is trust that stuff just works. And how is AMD doing on that front today, wrt pricing? Saving us? It once again confirms that everything is more of the same: AMD prices its product to the maximum of what the market can/might bear. So why did they price lower on GPUs back in the day? Let's see... 1+1= ? You're saying that isn't true because AMD 'was better and cheaper' at times. That is some real mental gymnastics!

Now don't get me wrong, I'm absolutely NOT pro Nvidia. I've hated their guts since Turing and haven't liked a single thing they've released since then, except DLSS, but even so I prefer AMD's agnostic FSR implementation to a much greater degree. So I'm 'with AMD' on their approach of less proprietary bullshit. I never once considered spending even a dime on G-Sync, but have a FreeSync monitor on an Nvidia card. But we have to keep calling things what they are. Misplaced favoritism is just that, misplaced. RDNA2 is technically impressive, power efficient, a good architecture. But: it misses features the competition does have. If they add those in RDNA3 and have reasonable price/perf to compete, it's an insta-buy. But at the same time, I stand by what I've said up here about Nvidia vs AMD and the battle they fight. AMD has been cheapskating it and it shows; the moment they got serious, they had competitive product again. Go figure...

As for Intel cutting into AMD's share, you might be right about that for all the reasons mentioned here. Intel similarly has an offering that isn't 'just working' everywhere, anytime, no matter what game you play. Ironically, DX11 performance takes a back seat just like it did for AMD. Feature parity isn't there because half the features don't even work. Trust is zero, or sub-zero. But let's take a long look at Nvidia too; that's definitely not all sunshine and rainbows either lately, most notably on power usage - but that is actually new, and not the usual MO for them. AMD has a real opportunity here to capture share, if RDNA3~5 are great.
Lucky for AMD, ray tracing comes at a huge performance cost, and most of the time people can't decide if it makes things look better or worse.
It's definitely a good idea, but game devs have become so good at faking lighting that ray tracing doesn't have the same effect as other technologies in the past, like DX9 with the realistic water in Far Cry and HL.
Plus it's not supported in all games. Same thing with DLSS: it works better, but it's not supported in all games.
All companies are the same; they are here to maximize their profit from any product.
Whoever has the fastest CPU/GPU has the right to charge more for the whole lineup. Not something I agree with, but the biggest part of the customer base will just read that AMD released the fastest CPU, which of course they can't afford, but in their mind that means that at any price range any AMD will be better than Intel. Same for the GPUs.
agatong55: Waiting for reviews... want to get an upgrade for my wife's GTX 1650.
I see what you did there. :D
Recently I bought my wife a woofer as a surprise, and the other day she surprised me (without her knowledge) with a Steam Deck for my upcoming birthday.
The secret is knowing how to put the right spin on it. :D
#31
droopyRO
GunShot: ~2 to 3 years from now? Nah, those titles are already here... today, e.g. Godfall, FC6, etc.

Maybe with some extra-mega-super-giga texture packs; other than that, on ultra or high textures you are fine with 6-8 GB in 2022. Of the two games I play ATM, neither uses above 8 GB of VRAM. So yes, it will be another 10 years before 8 GB of VRAM is a minimum system requirement.
#32
FeelinFroggy
If Intel GPUs do match the 3060 in performance and price, then I think it will be a success. While I won't be buying these GPUs, and probably won't for several generations, releasing their first GPUs and hitting the mainstream segment is pretty good. If it were easy to get out and seriously compete with Nvidia and AMD right away, then Intel or another company would have done it already.

Hopefully they will be successful and we can get more competition.
#33
evernessince
ZetZet: 16GB is probably barely better, if any. Considering RTX 3070 is fine with 8GB of VRAM, I don't see how a card in 3060 performance tier would need more. It's not a 4K card anyway.
#34
FeelinFroggy
GunShot: ~2 to 3 years from now? Nah, those titles are already here... today, e.g. Godfall, FC6, etc.
Additional VRAM does nothing for you but make your GPU more expensive; any unused VRAM just sits idle and does nothing.

Look at 1080p, which has been around for a while now. 4 GB of VRAM was enough for games made in 2010, and it's still enough for games made in 2022.

Obviously, higher resolutions will have a higher floor for VRAM use, and 4 GB is not enough for higher resolutions.

The floor won't change, but peak VRAM use never moved past 4 GB for 1080p, so why would it go higher for 1440p or 4K?
#35
ODOGG26
Vayra86: Somebody's got blinders on, holy crap... seeing this in 2022 is mind-blowing.

AdoredTV is a clown, and you only confirmed it; the reason he exists is the cult following feeding his ad revenue ;)

AMD still hasn't got feature parity with Nvidia. You can twist that however you like, but 'cheaper and faster' is not the criterion that confirms anything; it's just one of many. AMD had an inferior product that they sold at a lower price. And inferior it has been for a long time, and on the feature front it really still is. Historically we also know that stuff needs to 'just work' for the vast majority; the niche that likes to tweak is just that, a niche, and Nvidia caters to the former group best. Still does.

Ryzen is the practical proof of what I'm saying. It took several generations of 'beating' the competition to catch Intel and actually gain market share. The platform needed to mature, the arch needed refinements. Yes, brand recognition exists and it's only logical: Rome isn't built in a day, and neither is trust that stuff just works. And how is AMD doing on that front today, wrt pricing? Saving us? It once again confirms that everything is more of the same: AMD prices its product to the maximum of what the market can/might bear. So why did they price lower on GPUs back in the day? Let's see... 1+1= ? You're saying that isn't true because AMD 'was better and cheaper' at times. That is some real mental gymnastics!

Now don't get me wrong, I'm absolutely NOT pro Nvidia. I've hated their guts since Turing and haven't liked a single thing they've released since then, except DLSS, but even so I prefer AMD's agnostic FSR implementation to a much greater degree. So I'm 'with AMD' on their approach of less proprietary bullshit. I never once considered spending even a dime on G-Sync, but have a FreeSync monitor on an Nvidia card. But we have to keep calling things what they are. Misplaced favoritism is just that, misplaced. RDNA2 is technically impressive, power efficient, a good architecture. But: it misses features the competition does have. If they add those in RDNA3 and have reasonable price/perf to compete, it's an insta-buy. But at the same time, I stand by what I've said up here about Nvidia vs AMD and the battle they fight. AMD has been cheapskating it and it shows; the moment they got serious, they had competitive product again. Go figure...

As for Intel cutting into AMD's share, you might be right about that for all the reasons mentioned here. Intel similarly has an offering that isn't 'just working' everywhere, anytime, no matter what game you play. Ironically, DX11 performance takes a back seat just like it did for AMD. Feature parity isn't there because half the features don't even work. Trust is zero, or sub-zero. But let's take a long look at Nvidia too; that's definitely not all sunshine and rainbows either lately, most notably on power usage - but that is actually new, and not the usual MO for them. AMD has a real opportunity here to capture share, if RDNA3~5 are great.
You make some good points, but what are all these features that gamers need to know about or have that Nvidia has and AMD doesn't? I honestly don't know. People mention features, but never say which features are so important that one would be missing out by purchasing an AMD card.
#36
awesomesauce
I bet they will sell like cupcakes.

What a missed opportunity by Nvidia, not releasing something new and competitive in this segment (under $400).

$1500+ cards are not gonna sell well, plus crypto is not good ATM.
#37
TheinsanegamerN
ODOGG26: You make some good points, but what are all these features that gamers need to know about or have that Nvidia has and AMD doesn't? I honestly don't know. People mention features, but never say which features are so important that one would be missing out by purchasing an AMD card.
Stability, for one thing. People seem to forget AMD's long history of garbage drivers and their (still present) habit of letting problems annoy users until the media gets involved (Ryzen 5000 compatibility, Ryzen 3000 compatibility, RDNA 1 downclocking issues, GCN black screens, etc.).

Performance is another issue. Scream fine wine all you want; people want the performance they paid for now, not 5 years from now.

You could also get into features like ShadowPlay, which took AMD years to copy, or NVENC encoding, which AMD still has no answer for.

Then all those who talk of "ngreedia mindshare" conveniently forget that Evergreen sold real good. The HD 5000 series hit 49% market share. Then AMD left their product to rot while they dumped money into the failure that was Bulldozer, and was caught completely off guard when Nvidia actually fixed Fermi instead of re-releasing it. AMD would do the same thing with Hawaii (to waste cash on SeaMicro).

I could go on. (Hey, remember how AMD launched Ryzen mobile then abandoned driver development to OEMs?)

AMD's biggest opponent is not Nvidia. It's AMD. Their rocky launches and the copium their community produces do not translate to sales. Strong launches with solid drivers and strong product software make sales.
#38
GunShot
droopyRO: Maybe with some extra-mega-super-giga texture packs; other than that, on ultra or high textures you are fine with 6-8 GB in 2022. Of the two games I play ATM, neither uses above 8 GB of VRAM. So yes, it will be another 10 years before 8 GB of VRAM is a minimum system requirement.
Clearly, you haven't played enough games... again... with *eye-candy*, as I've stated above. :shadedshu:
FeelinFroggy: Additional VRAM does nothing for you but make your GPU more expensive; any unused VRAM just sits idle and does nothing.

Look at 1080p, which has been around for a while now. 4 GB of VRAM was enough for games made in 2010, and it's still enough for games made in 2022.

Obviously, higher resolutions will have a higher floor for VRAM use, and 4 GB is not enough for higher resolutions.

The floor won't change, but peak VRAM use never moved past 4 GB for 1080p, so why would it go higher for 1440p or 4K?
Same for you. You need to get out more and play past ancient 1080p. :shadedshu:
#39
Vayra86
ODOGG26: You make some good points, but what are all these features that gamers need to know about or have that Nvidia has and AMD doesn't? I honestly don't know. People mention features, but never say which features are so important that one would be missing out by purchasing an AMD card.
Eventually, AMD copied what Nvidia brought in terms of feature set. Every time.

- Shadowplay
- DLSS
- Hairworks (lol! more of a joke than anything else, but hey, AMD had to follow with TressFX)
- TXAA / temporal AA
- FXAA
- RT
- Gsync
- etc.

The gist of it is, Nvidia was exercising thought leadership and innovation in the gaming segment, and AMD was not. They are now building on that; I hope they keep the momentum. But Nvidia hasn't stopped trying to lead. And these are no small things - many pushes Nvidia initiated have been pretty neat ones that brought gaming graphics further.
#40
Bruno_O
TheinsanegamerN: Stability, for one thing. People seem to forget AMD's long history of garbage drivers and their (still present) habit of letting problems annoy users until the media gets involved (Ryzen 5000 compatibility, Ryzen 3000 compatibility, RDNA 1 downclocking issues, GCN black screens, etc.).

Performance is another issue. Scream fine wine all you want; people want the performance they paid for now, not 5 years from now.

You could also get into features like ShadowPlay, which took AMD years to copy, or NVENC encoding, which AMD still has no answer for.

Then all those who talk of "ngreedia mindshare" conveniently forget that Evergreen sold real good. The HD 5000 series hit 49% market share. Then AMD left their product to rot while they dumped money into the failure that was Bulldozer, and was caught completely off guard when Nvidia actually fixed Fermi instead of re-releasing it. AMD would do the same thing with Hawaii (to waste cash on SeaMicro).

I could go on. (Hey, remember how AMD launched Ryzen mobile then abandoned driver development to OEMs?)

AMD's biggest opponent is not Nvidia. It's AMD. Their rocky launches and the copium their community produces do not translate to sales. Strong launches with solid drivers and strong product software make sales.
As someone who replaces their GPU yearly, multiple times on the same build (recently had a 6800, 6600 XT, 3060 and 3080), and who has been building PCs since 2004, it's clear to me that you are an Nvidia shill.

"Stability" my ass; over the past 2 years I've had more issues with GeForces than AMDs. Nvidia's HDMI 2.1 implementation on TVs is a joke.

And as someone who has had pretty much one Radeon of every gen (plus GeForces) since the X000 series (before the HD 2000 series), I haven't had issues gaming with Radeons since AMD bought them. And yes, fine wine does exist, while I've experienced the contrary with Nvidia (old cards getting lower performance with newer drivers after some years).

All the features you mentioned only affect streamers, not vanilla gamers.

"I could go on" - you can't; there's no killer feature that makes GeForces better for gamers. There was one, DLSS, which is great, but AMD and open-source FSR 2.0 just killed it, like G-Sync was killed by VRR.

The guy's post about AdoredTV is correct. I have lots of friends who won't even consider an AMD card for their builds; they won't even check benchmarks. And they are IT guys, not stupid 12-year-olds.

Ryzen took 2-3 generations to start getting mindshare, and was helped by the (justified) hatred people have for Intel over their shady practices and forever-4-core CPUs. Nvidia doesn't sit on their hands like that, so there's no vacuum for AMD or Intel to fill.

All in all, to me this is actually advantageous. People flock to buy Nvidia, so I can usually get AMD cards for a lower price. But these two are neck and neck on performance and driver quality, and any drastic comment on that is being a fanboy.
#41
AusWolf
TheinsanegamerN: Stability, for one thing. People seem to forget AMD's long history of garbage drivers and their (still present) habit of letting problems annoy users until the media gets involved (Ryzen 5000 compatibility, Ryzen 3000 compatibility, RDNA 1 downclocking issues, GCN black screens, etc.).

Performance is another issue. Scream fine wine all you want; people want the performance they paid for now, not 5 years from now.

You could also get into features like ShadowPlay, which took AMD years to copy, or NVENC encoding, which AMD still has no answer for.

Then all those who talk of "ngreedia mindshare" conveniently forget that Evergreen sold real good. The HD 5000 series hit 49% market share. Then AMD left their product to rot while they dumped money into the failure that was Bulldozer, and was caught completely off guard when Nvidia actually fixed Fermi instead of re-releasing it. AMD would do the same thing with Hawaii (to waste cash on SeaMicro).

I could go on. (Hey, remember how AMD launched Ryzen mobile then abandoned driver development to OEMs?)

AMD's biggest opponent is not Nvidia. It's AMD. Their rocky launches and the copium their community produces do not translate to sales. Strong launches with solid drivers and strong product software make sales.
I would agree if it were 2014, but it's not. AMD has come a long way since then. The B550 USB bug was the last hiccup I've heard of with the Zen 2/3/4 platform; AMD systems have been as stable as Intel's since then. Their GPUs work perfectly, too. Some 5700-series cards had heat issues, but the 6000 series is just as solid as Nvidia's.