
Intel Previews Arc A750 Graphics Card Performance

The problem is the A750 needs to be $300 at most but they won't announce the price...
 
At this point I would just like them to be competent enough to release the Arcs to the rest of the world besides China. The rumor is that the release for the USA is delayed until Fall. Not sure about the rest of the world. Not that I would buy one anyway but there does seem to be some interest in them.
There's also interest in horrific train crashes.
 
The competition vs. AMD & Nvidia will be welcomed and will help with pricing a bit; however, I don't think Intel will come close enough just yet to have much of an effect. Maybe 5 or so years from now, but not in the present or near future.
AMD's pricing has come down, so I mean $300 isn't bad for a 6600/XT.
 
If the GN video was any indication, Tom (the engineer) knows this and basically said as much.

Also, some refreshing absence of PR talk regarding performance targets and expectations regarding the Driver.
Intel, at least to some point, got the price memo and set a revised $129-$139 MSRP for the A380 instead of $150 at their initial release in China.
 
I'm not as pessimistic as you guys. AMD took ages to get a competing GPU past the mid-range. Intel did a good job on the first try; now bring pricing down and improve the drivers.
Up to the Radeon 300 series, AMD(/ATI) was participating in the high-end segment. It was just during the RX 400/500/5000 series that they only had products in the lower mid-range and below.
 

Also, some refreshing absence of PR talk regarding performance targets and expectations regarding the Driver.
Intel, at least to some point, got the price memo and set a revised $129-$139 MSRP for the A380 instead of $150 at their initial release in China.

Yeah exactly. I will buy one, because this is TechPowerUp, an enthusiast forum. I'm curious about the technology and how it will age, and about the era in general, with Intel coming in as a dGPU competitor, not just whether their dGPUs will compete. Lots of cynics in this thread. Maybe I'm too much of a dreamer chasing my sense of adventure; my life was way more fun when I didn't treat hardware purchases like betting on dogs at a race track. Can't wait to "throw my money away" on the experience.
 

Also, some refreshing absence of PR talk regarding performance targets and expectations regarding the Driver.
Intel, at least to some point, got the price memo and set a revised $129-$139 MSRP for the A380 instead of $150 at their initial release in China.
The initial release in China was $129 + VAT; I posted about it back then, when we saw the +25% performance/$ vs. RX 6400 figures.
 
The problem is the A750 needs to be $300 at most but they won't announce the price...

Isn't the 3060 12GB, with which it's competing, $400? A $100 discount at launch seems a bit much; maybe $50, because you're kind of beta testing after all.
 
The performance the A750 Limited Edition is achieving in those titles is way higher than the difference vs. the Gunnir Photon A380 should allow.
Either the Gunnir in the original GamersNexus review wasn't running at 2450MHz (the official turbo spec of that model at 92W TBP) but at something less (2GHz perhaps, to better represent all A380 models? 2GHz is the official reference turbo frequency when the A380 is powered only through the PCI Express slot's 75W), or the A750 LE isn't a 24-Xe design but a 28-Xe one (probably the former, meaning a 2GHz turbo in the GamersNexus preview).
I can't find any other explanation for why the numbers are so high vs. what the A380 can do.
Edit: I just watched the GamersNexus A750 video, and at 14:16 Tom Petersen, pointing to the A380 model, says it may be the 75W variant and therefore 2GHz. If that's the case (Tom Petersen didn't seem too sure about it), it's strange that in the previous video (the A380 preview) Steve Burke didn't mention anything about it!
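The scaling argument above can be sanity-checked with some back-of-the-envelope arithmetic. This is only a sketch: the A380 figures are the spec numbers quoted in the post, the A750 LE clock is an assumed placeholder, and real performance never scales linearly with cores × clock.

```python
# Crude throughput proxy: Xe cores x clock (MHz).
# Ignores memory bandwidth, drivers, and everything else.

def throughput_proxy(xe_cores, clock_mhz):
    return xe_cores * clock_mhz

a380_full = throughput_proxy(8, 2450)   # Gunnir Photon A380 at its 92 W turbo spec
a380_75w  = throughput_proxy(8, 2000)   # A380 at the 75 W PCIe-only reference turbo
a750_24xe = throughput_proxy(24, 2050)  # 24-Xe A750 LE; 2050 MHz is an assumed clock

# If GamersNexus tested the A380 at 2 GHz instead of 2.45 GHz,
# the apparent A750-vs-A380 gap looks about 22.5% (2450/2000) wider
# than the hardware difference alone would suggest.
print(a750_24xe / a380_75w)   # gap with a 2 GHz A380
print(a750_24xe / a380_full)  # gap with a full-spec 2.45 GHz A380
```

Either hypothesis (a down-clocked A380 or a wider A750) inflates the measured gap the same way; the arithmetic just shows how much headroom the 2 GHz explanation accounts for.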
 
Isn't the 3060 12GB, with which it's competing, $400? A $100 discount at launch seems a bit much; maybe $50, because you're kind of beta testing after all.
The 3060 is overpriced right now; besides, Nvidia being the market leader commands a premium, not unlike Intel did only 3 years back in the CPU space!
 
The games are not optimized for Intel; these are just the games that happen to perform better.


Considering that Nvidia and AMD are about to release their new generations in the next few months, a price of ~$200 would be decent if the TDP is also good?


It was surely a good card, even though some claimed Nvidia launched it to make AMD look bad.
Back when 1080 Ti was about to launch, I was planning to buy one, and since I had one GPU fail I bought a 1060 as a temporary solution. Then prices and availability went crazy, and I'm still rocking my 1060 :p
I dare say the launch prices of the new generation will jump so much that they won't affect the competitiveness of Intel's products, or even the current lineups of both AMD and Nvidia. lol
 
Might need to jump through a couple of hoops to get one in my region but now I'm interested again. Means A770 might just land around 3070 too! Yeah, in most games it's gonna blow but hey, I like a good underdog story, even if the underdog is somehow Intel.
 
What I found interesting from a lot of the graphs was the strong 0.1% and 1% lows that were there. So if there are driver optimisations to be had, I can see them being similar to how AMD's RX 6xxx cards got free performance just through driver maturity.

How much they will get is the big question, but hopefully Intel will see how their 1st-generation cards are used and then optimise the architecture, perhaps in Celestial.
 
Performance results shown here are from a small subset of the games, that work very well with Intel® Arc™ and the Alchemist architecture. I’m not asserting that ALL GAMES will show these results, but it’s a view of what Intel Arc A-series cards are capable of with the right software and engineering enablement.
I think that's the main takeaway from this story, a decently honest claim from Intel.
 
So about the same ballpark as the 6600 XT. It should be around the same price then, too.
 
Your fault, my fault, nobody's fault...

it won't matta...

It's STILL gonna be DOA if the price (and drivers) aint right - Big Jake McCandles (aka John Wayne) sorta :)
 
Up to the Radeon 300 series, AMD(/ATI) was participating in the high-end segment. It was just during the RX 400/500/5000 series that they only had products in the lower mid-range and below.

I think people have a hard time remembering, because prior to RDNA2 AMD struggled to compete in the high end post-2013. They made a bad bet on HBM, crippling the Fury X, which might have ended up a decent product had it shipped with 8GB; after that, they gave up trying to compete in the high end until RDNA2 shipped.
 
Right, how many years did the RX 480 last?
 
Isn't the 3060 12GB, with which it's competing, $400? A $100 discount at launch seems a bit much; maybe $50, because you're kind of beta testing after all.
Mid-range Nvidia cards are at a standstill right now; they're still asking spring crypto prices, and as a result the products aren't selling (3050, 3060, and 3060 Ti are all collecting dust at my local brick-and-mortar store).

A 3060 is not a $400 card; it's basically an RTX 2070 from 2018 (nobody is buying them right now at the inflated price).

A better comparison would be the RX 6600 (which is almost identical to a 3060), currently selling for $300, while the next tier up (6600 XT) can be had for as low as $340.
 
2 points nobody has mentioned - if these fail hard, they are going to be worth $$$ some years in the future. The DG1 most likely will anyways. Also, concerns about driver quality are a double-edged sword. Sure, they might not be great, but that also means that the performance is only going to increase from here.
 
Because of course their A770/A780 is going to be an RTX 3090 Ti killer LMAO.

These GPUs are trash, no two ways about it. I'm gonna enjoy laughing at the people posting "I bought an Intel GPU but it runs slow in <random game> REEEEEE". Then in a few more years, another round of pointing and laughing at the "Has Intel abandoned Arc, why are there no driver updates" posts.
Because killing the 3090 Ti is the only way to launch a product line. Do people only buy $1500+ GPUs? Come on!
 