
Sparkle Arc A750 Titan OC

I can't see any performance improvement here. A 150 MHz increase in frequency, and the power consumption during gaming is significantly higher, yet it makes literally no difference in performance.
I hope this will be addressed in future releases of the cards, and that Intel will get something ranked higher in the performance charts.
 
The horrific power figures are because, at the end of the day, the GPU this is based on is huge for its performance class: it's about 400 mm². When you compare it to something like AMD's Navi 31, which has a ~300 mm² GCD (a fair comparison, because Intel's chips have similar L2 caches and no L3), Navi 31 is way faster and more power efficient. It's kind of crazy that they need this much die area.

There is something terribly wrong with how their chips are designed; the architecture is too granular, and only 8 shaders per compute unit is crazy.
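A quick back-of-the-envelope on the area point, as a minimal sketch; the ~400 mm² and ~300 mm² figures are the rough die sizes mentioned above, not exact measurements:

```python
# Illustrative area comparison using the approximate figures from this post.
acm_g10_mm2 = 400      # Arc A750/A770 die (ACM-G10), approximate
navi31_gcd_mm2 = 300   # Navi 31 compute die only (GCD), approximate

ratio = acm_g10_mm2 / navi31_gcd_mm2
print(f"ACM-G10 uses ~{ratio:.2f}x the compute-die area of Navi 31's GCD")  # ~1.33x
```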
 
I would not say horribly wrong just yet, but the arch definitely needs some refinement. Maybe the potential is there, but Intel needs to bring it out. The die size is big for what it offers, but like I said, maybe it just needs some tweaks here and there and it will work out.
 
Nah, there really is something wrong. I can't help but notice that Intel still designs their GPUs as though they were CPUs; Larrabee's phantom is still in there.

Control logic is expensive both in terms of area and power; that's why GPU manufacturers generally try to cram as many shaders per compute unit as they can: AMD is at 64 and Nvidia at 128 per CU. For some reason Intel insists on doing it the complete opposite way, and it shows.
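To put rough numbers on that, a minimal sketch; the ~3584 FP32 shaders for the A750 and the per-block widths (8 for Intel's vector engines, 64 per AMD CU, 128 per Nvidia SM) are commonly quoted figures, treat them as approximate:

```python
# For a fixed shader budget, count how many independently scheduled blocks
# each architecture needs; every block carries its own control/scheduling
# logic, so more blocks means more non-ALU area and power.
total_fp32_shaders = 3584   # roughly the A750's shader count (assumed)

widths = {
    "Intel Xe vector engine (8-wide)": 8,
    "AMD CU (64 shaders)": 64,
    "Nvidia SM (128 shaders)": 128,
}

for name, width in widths.items():
    print(f"{name}: {total_fp32_shaders // width} blocks")
# -> 448 vs. 56 vs. 28 blocks for the same shader count
```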
 
Perhaps, but that does not mean it cannot be addressed and refined to be more suitable for GPUs. I mean, this is Intel's first GPU iteration after so many years. It is obviously not the best and the performance could have been better, but like I say, it is a first iteration. I think it would be better to wait for the next generation of Intel GPUs and see how far they have come in performance, wattage and die size. That will give us a comparison against what they have currently, and we will be able to see the progress.
My concern is whether Intel will be able to start catching up, at least a little bit, and narrow the gap. On the other hand, at the beginning there are always simpler things that can be done to improve performance; the obstacles show up once they get deeper into the tweaks and refinement. I hope Intel's path will at least be easy at the start of that refinement.
 
The Sparkle Arc A750 Orc is available for $189 USD at NewEgg (US). Hard to beat that.

It has terrible drivers: the specs are quite impressive, but the performance is relatively mediocre.
And the power consumption is a disaster: 225 W with a 550 W PSU requirement vs. 132 W with a 300 W PSU requirement.

[Spec-sheet screenshots: Arc A750 vs. RX 6600]
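Just to quantify the gap from those spec sheets, a small sketch using only the numbers quoted above (board power and recommended PSU, not measured gaming draw):

```python
# Ratio of the spec-sheet power figures quoted in this post.
a750_board_w, a750_psu_w = 225, 550
rx6600_board_w, rx6600_psu_w = 132, 300

print(f"Board power: {a750_board_w / rx6600_board_w:.2f}x")    # ~1.70x
print(f"Recommended PSU: {a750_psu_w / rx6600_psu_w:.2f}x")    # ~1.83x
```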
 
I really don't want to repeat myself, but since it's present in every other GPU review...
6 nanometer production process
Is NOT a Pro (or a Con); it's just a technical detail. Good power efficiency would be a Pro, but that's not even the case here.
 
Room for performance improvements but not really a disaster anymore.
No, it's still a disaster. Just not a hurricane + flash flooding + wildfires at the same time. I have an A750 and an A770... It's so bad that I'd rather use my RTX A2000 SFF rig or RX 6800M gaming laptop than the Arc systems.

VR is totally busted for me, and I play several of the games that have been on the "we will fix it soon" list since December or January. It's just not worth messing with for me right now.
 
I really don't want to repeat myself, but since it's present in every other GPU review...
6 nanometer production process
Is NOT a Pro (or a Con); it's just a technical detail. Good power efficiency would be a Pro, but that's not even the case here.
You are absolutely right, this will be removed in future reviews, until a new tech comes around
 
You are absolutely right, this will be removed in future reviews
Thank you for considering removing the 6nm argument.
..., until a new tech comes around
But do you mean by this that you plan to again name 5 nm (or an MCM setup), or whatever comes next, as an argument for or against a graphics card?
Because, again, this will not be of any use to the end user if it doesn't result in good performance and/or power consumption, so it is not an argument in its own right. (Directly end-user-relevant tech, like an updated DP or HDMI interface version, is of course a valid point.)
 
I get your point, and you are technically right, will think about it
 
No, it's still a disaster. Just not a hurricane + flash flooding + wildfires at the same time. I have an A750 and an A770... It's so bad that I'd rather use my RTX A2000 SFF rig or RX 6800M gaming laptop than the Arc systems.

VR is totally busted for me, and I play several of the games that have been on the "we will fix it soon" list since December or January. It's just not worth messing with for me right now.
Weird, my cards and game combos, even old ones, are working great.
 
Glad to hear that you are one of the 5 people that's the case for.

Been using the cards since November, and doing bug reporting through the Intel Discord just as long. "It works perfectly for everything I do" is not common among Arc owners.
And, somehow, the Arc fanboys tend to be even worse than the AMD guys back during the wacky guerrilla marketing campaign they ran circa 2016.
 
Not really, I mean there are far more, I'm sure. It doesn't work perfectly, but with all due respect, it seems you are just really mad because it didn't play whatever your %favorite game% was perfectly.

Just sell it and buy something else? There really isn't any worth in our subjective experiences, but overall Arc is doing well enough to make sales and take market share.
 
It's not that one game is the problem. It's something like half of my library that has some kind of issue with Arc. Bethesda games in general have performance and stuttering issues. Unity games have stuttering and frame-pacing issues. Lots of older DX9/10/11 games are still performing at about half of what they should (or even worse). Vulkan titles still have random weird problems with memory allocation, a year after bug reports were filed. Random web page corruption with browser hardware acceleration enabled, and displays randomly blanking out when using AV1 (one of the main selling points for many). Heck, some of the software that was bundled with the Arc cards at the end of last year still doesn't work reliably.

Is it fun to tinker with on a side system? Sure, it's oddball hardware.
Would I put it in my daily driver? No.
Would I recommend it to anyone who isn't all about registry hacks, DLL overrides, checking for new drivers daily, and constantly running DDU? No.

Honestly, the only thing Arc is doing for many is getting them to finally forget about AMD driver problems from a decade or two ago. At least that's what the market share numbers look like.
 
Who doesn't use a GPU for 4-6 years!?!?!
GPUs are cheap. There are people who spend tens of thousands of dollars replacing a car because it's 3 years old. I know people who buy GPUs every single generation and whine about how expensive they are.
 
It all depends on the GPU tier and the disposable income one has.
 
The RT performance is also worse than AMD's, even last gen's: the 6700 XT is a comparable GPU to this, size-wise, and is easily ahead in RT per the review page, and the 7600 is about the same, so you can hardly say in the summary that it has better RT performance than AMD; that's simply not the case. In general, Intel GPUs are a train wreck, avoid them if possible. Didn't they want to release a successor this year, or even last year? lol, what happened to that?

"While I have to congratulate Intel on all their recent improvements on the software side, these comparisons show that Intel must lower their prices considerably. The mining boom is over, and graphics cards are readily available, which makes pricing a major factor again."

The issue with that is that this is a pretty big GPU, bigger than the 6700 XT, I think about 400 mm², so selling it even lower would be 100% at a loss; this GPU is simply a disaster. It's a complex GPU with a lot of shaders, but the performance is simply not there; it only competes with GPUs half its size. It seems Intel has cut back their GPU division anyway, and you can clearly see here why.
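As a rough illustration of the margin problem, a sketch: the ~400 mm² figure is from this post, the ~237 mm² for the RX 6600's Navi 23 is an approximate published figure, and silicon is of course only part of the bill of materials:

```python
# Illustrative: relative compute-die area for cards competing at a similar price.
acm_g10_mm2 = 400    # Arc A750, approximate (from this post)
navi23_mm2 = 237     # RX 6600 class it competes with, approximate

print(f"Roughly {acm_g10_mm2 / navi23_mm2:.1f}x the silicon area "
      "for a card sold at a similar price point")  # ~1.7x
```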
 
Why is Red Dead Redemption 2 tested with DX12, when the original 6700 XT review was done with Vulkan instead?
Also, when tested in May 2022, both 6750 XT reviews showed 77-80 FPS in RDR2 for the 3070 and 3070 Ti. Now all of them show 91-97.
Were those driver issues?
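For reference, the uplift being described works out to roughly this (simple arithmetic on the FPS figures quoted above; the pairing of old and new numbers is approximate):

```python
# Percentage change between the older and newer RDR2 results quoted above.
old_fps = (77, 80)   # 3070 / 3070 Ti, May 2022 reviews (as quoted)
new_fps = (91, 97)   # current review

for old, new in zip(old_fps, new_fps):
    print(f"{old} -> {new} FPS: +{(new / old - 1) * 100:.0f}%")   # ~+18% and ~+21%
```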
 