
Intel Arc A750 Trades Blows with GeForce RTX 3060 in 50 Games

Joined
Oct 3, 2019
Messages
139 (0.08/day)
Processor Ryzen 3600
Motherboard MSI X470 Gaming Plus Max
Cooling stock crap AMD wraith cooler
Memory Corsair Vengeance RGB Pro 16GB DDR4-3200MHz
Video Card(s) Sapphire Nitro RX580 8GBs
Storage Adata Gammix S11 Pro 1TB nvme
Case Corsair Carbide Air 540
What I don't get is how Apple could make a decent-performing GPU (integrated, at that) seemingly out of nowhere, while Intel has been developing this debacle since 2017 (check it out, that's when Xe discrete graphics was first announced, half a decade ago) and still ended up with... this clusterf__k.

I'm not trolling for Apple, honestly; I'd just like some kind of explanation. Is it that the Apple iGPU isn't required to support as many games, considering the relative scarcity of gaming on the Mac? What is it? I do get how difficult it is to develop a brand-new architecture, so how did Apple do it?
 
Joined
Mar 1, 2021
Messages
402 (0.34/day)
Location
Germany
System Name Homebase
Processor Ryzen 5 5600
Motherboard Gigabyte Aorus X570S UD
Cooling Scythe Mugen 5 RGB
Memory 2*16 Kingston Fury DDR4-3600 double ranked
Video Card(s) AMD Radeon RX 6800 16 GB
Storage 1*512 WD Red SN700, 1*2TB Crucial P5, 1*2TB Sandisk Plus (TLC), 1*14TB Toshiba MG
Display(s) Philips E-line 275E1S
Case Fractal Design Torrent Compact
Power Supply Corsair RM850 2019
Mouse Sharkoon Sharkforce Pro
Keyboard Fujitsu KB955
Well, I thought they were going the Larrabee route, but this looks like Vega, and within that another Koduri masterpiece. I guess they are shipping it soon to get rid of the first-gen inventory and will try to rescue it with Battlemage somehow. Investors must be screaming right now.
 
Joined
Dec 30, 2010
Messages
2,110 (0.43/day)
What I don't get is how Apple could make a decent-performing GPU (integrated, at that) seemingly out of nowhere, while Intel has been developing this debacle since 2017 (check it out, that's when Xe discrete graphics was first announced, half a decade ago) and still ended up with... this clusterf__k.

I'm not trolling for Apple, honestly; I'd just like some kind of explanation. Is it that the Apple iGPU isn't required to support as many games, considering the relative scarcity of gaming on the Mac? What is it? I do get how difficult it is to develop a brand-new architecture, so how did Apple do it?

It really is the same story with Intel's Arc now as with what Raja attempted with Vega: a GPU initially designed for compute, reworked into a gaming version. Chips like that usually didn't pass quality standards for gaming, and only with "some tricks" can you utilize them as gaming GPUs.

The tradeoff is that it may underperform in games but excel in compute, often at the cost of more power compared to a traditional, purpose-built "gaming" GPU, as you see with Nvidia's GeForce vs. Quadro split, or RDNA vs. CDNA.

Intel's drivers just suck. The first reviews couldn't get the card stable: lots of glitches, artifacts, underperformance, even crashes. If you buy one now and attempt to play the latest games, chances are it just won't work, or its performance will lag significantly.

The delays of over a year have nothing to do with COVID or anything else. The first batch just didn't cut it; it probably had bugs, the performance was whack, and they had to respin it to get it right, or at least working. These GPUs simply don't measure up, they consume more power, and games will be a gamble for the next few years in terms of drivers and/or performance.

I mean, Vega was in its way a good card: it excelled at compute, handled games, and over time ended up roughly equal to a 1080 Ti; it just required a tad more power, and both it and Polaris were clocked beyond their efficiency curves. Polaris was also bandwidth-starved in terms of performance; for example, the memory was only rated to "feed" the GPU up to about a 1000 MHz core clock, and anything above that was just a waste of power.

But they kind of had to in order to still compete with the 1060. The $250 price tag, however, made it good value, and it was the best 1080p card at the time.
 
Joined
Sep 6, 2013
Messages
3,034 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
What I don't get is how Apple could make a decent-performing GPU
They probably... borrowed some ideas from Imagination Technologies, hired some very talented and experienced people, and then threw a huge amount of money at the problem.
I really wonder why Intel didn't try to buy out Imagination. It would probably have been "cheap", and Imagination could have offered experienced, talented personnel and patents that might have helped Intel start producing better stuff sooner. After 2017, Imagination was probably begging someone to come and buy them.
 
Joined
Jun 11, 2019
Messages
492 (0.27/day)
Location
Moscow, Russia
Processor Intel 12600K
Motherboard Gigabyte Z690 Gaming X
Cooling CPU: Noctua NH-D15S; Case: 2xNoctua NF-A14, 1xNF-S12A.
Memory Ballistix Sport LT DDR4 @3600CL16 2*16GB
Video Card(s) Palit RTX 4080
Storage Samsung 970 Pro 512GB + Crucial MX500 500gb + WD Red 6TB
Display(s) Dell S2721qs
Case Phanteks P300A Mesh
Audio Device(s) Behringer UMC204HD
Power Supply Fractal Design Ion+ 560W
Mouse Glorious Model D-
Intel is probably lucky that AMD and Nvidia will probably not offer a cheap next-gen product early. Nvidia will start with the 4080 and up, and even if they throw a 4070 into the market, it's not going to be cheap. They lowered their profit margin from about 65% to 45% and started offering GPUs to their partners at much lower prices. Nvidia needs money to move forward, and with the strongest brand by far, 80% of the market, the knowledge that Intel is far behind, and AMD needing to keep focusing on EPYC because of capacity restrictions, they will not price low. I am expecting Nvidia to start with products costing over $800 in the first 4-6 months (just a random prediction to give an idea) and AMD to also start at higher-than-expected prices, like $600 for their cheapest RX 7000 option. So Intel will have plenty of time and room to play under $500. Fun fact: Raja is reliving his Polaris days. He just doesn't have a Vega equivalent yet.
They just had a second mining jackpot in 6 years and you think they need money to go forward? I don't think Huang's bookkeepers and analysts are so dumb that they expected the crazy times to roll on forever, especially considering that even back when the 2020 boom started, the ETH PoS switch was already on the map.

Here's my calculation: $500 adjusted for 20% inflation, plus $50 towards a new leather jacket for Jensen. MSRP is going to be $650.
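For what it's worth, a quick back-of-envelope check of that arithmetic (a sketch only; the 20% inflation figure is the poster's own assumption, not an official number):

```python
# Back-of-envelope check of the MSRP guess above: $500 base, 20% inflation, $50 on top.
base_msrp = 500
inflation = 0.20       # poster's assumed inflation since the previous launch
jacket_fund = 50       # the joke surcharge from the post

predicted_msrp = base_msrp * (1 + inflation) + jacket_fund
print(predicted_msrp)  # 650.0
```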
 
Joined
Feb 20, 2019
Messages
7,443 (3.89/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this, though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
After discovering that Intel's previous GPU performance slides bore little resemblance to real game performance, how is anyone expected to trust these Intel slides?

Intel, you were caught lying, and you haven't addressed that yet. Pull yourselves together FFS.
 
Joined
Sep 6, 2013
Messages
3,034 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
They just had a second mining jackpot in 6 years and you think they need money to go forward? I don't think Huang's bookkeepers and analysts are so dumb that they expected the crazy times to roll on forever, especially considering that even back when the 2020 boom started, the ETH PoS switch was already on the map.

Here's my calculation: $500 adjusted for 20% inflation, plus $50 towards a new leather jacket for Jensen. MSRP is going to be $650.
They were dumb. That's why they ordered a huge amount of wafers from Samsung, produced huge amounts of RTX 3000 cards, and are now discounting them like there's no tomorrow just to sell them. They were dumb enough to order a huge amount of 5 nm wafers from TSMC that they now wish to postpone receiving until 6 months after the agreed date, dumb enough to have to announce about $1.4 billion less revenue and to suffer a huge drop in profit margin from about 65% to about 45%. That's even lower than Intel's profit margin in Intel's current situation, and lower than AMD's profit margin, even though AMD is the little guy among the three.

Now, if you think a 12 GB card on a much more expensive process like TSMC's 5 nm, with performance close to the 3090 Ti, will start selling at $650, I wish with all my heart that you end up correct. But if, in their current position after all those price reductions, they are at a 45% profit margin, then selling a card with those characteristics at $650 would be suicidal. Not to mention the price reductions we would have to witness on the currently unsold RTX 3000 cards. Can you imagine an RTX 3090 selling for $550? An RTX 3070 for $300? I doubt we are in a GTX 970 era. Not to mention inflation.
And it doesn't matter how much money they have in their bank accounts. For companies of this size trying to stay on top in new markets like AI, with competition from huge companies like Intel, Amazon, Google, Alibaba, etc., the amount of money they have will never be enough.
 
Joined
Jun 10, 2014
Messages
2,906 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I'm curious to see details about framerate consistency across games.

Also, is it just me, or are the "heavier" games mostly at the lower end of that scale when compared against the RTX 3060?

Optimization was always a problem; it just seems bigger today because games are huge, developers try to get them to market as fast as possible, and frankly a 10+ core CPU and modern GPUs are huge carpets to sweep any performance problem under.
Adding cores will not solve any performance problem in a game. In a highly synchronized workload like a game, the returns from using more threads diminish very quickly and can quickly turn into unreliable performance or even glitching. What you're saying here is just nonsense.
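To illustrate the diminishing-returns point, here is a rough Amdahl's-law sketch; the 40% serial fraction is a made-up illustration, not a measured game profile:

```python
# Ideal speedup limit when a fixed fraction of the frame work is serial/synchronized
# (Amdahl's law). The 0.4 serial fraction below is purely illustrative.
def amdahl_speedup(threads: int, serial_fraction: float) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / threads)

for n in (1, 2, 4, 8, 16, 32):
    print(f"{n:>2} threads -> {amdahl_speedup(n, 0.4):.2f}x")
# 8 threads gives ~2.1x and 32 threads only ~2.4x: the curve flattens quickly,
# which is the "diminishing returns" described above.
```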

Also, an unoptimized game will sell more CPUs and GPUs than an optimized one, meaning not only can you get it to market faster, you can also get nice sponsor money from Nvidia, AMD, and Intel by partially optimizing for their architecture instead of for everyone's.
Firstly, no modern PC game is optimized for a specific GPU architecture; that would require using the GPU's low-level API instead of DirectX/Vulkan/OpenGL and bypassing the driver (translating those APIs is the primary task of the driver).

Your claims are approaching conspiracy territory. No game developer wants their game to perform poorly; that would make the gameplay less enjoyable for the majority of their customers. Game developers don't get a cut of GPU sales either, and when GPU makers "sponsor" games, that has more to do with technical assistance and marketing; even if developers were to receive any funds, those would be drops in the bucket compared to the budgets of big game titles.

Many games today are bloated and poorly coded for a host of reasons:
- Most use off-the-shelf game engines, writing little or no low-level code themselves and interfacing with the engine instead. This also means these engines have generic rendering pipelines designed to render arbitrary objects, not pipelines tuned to the specific game.
- Companies want quick returns, often resulting in short deadlines, changing scopes, and last-minute changes.
- Maintenance is often not a priority, as the code is rarely touched after launch, leading programmers to rush to meet requirements instead of writing good code. This is why game code is known as some of the worst in the industry.

I mean, Vega was in its way a good card: it excelled at compute, handled games, and over time ended up roughly equal to a 1080 Ti; it just required a tad more power, and both it and Polaris were clocked beyond their efficiency curves. Polaris was also bandwidth-starved in terms of performance; for example, the memory was only rated to "feed" the GPU up to about a 1000 MHz core clock, and anything above that was just a waste of power.

But they kind of had to in order to still compete with the 1060. The $250 price tag, however, made it good value, and it was the best 1080p card at the time.
Which Vega cards are you talking about?
The Vega 56 performed slightly above the GTX 1070 but cost $500 (with $100 "worth" of bundled games).

And how was Polaris bandwidth starved?
RX 480 had 224/256 GB/s vs. GTX 1060's 192 GB/s.
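Those figures follow directly from bus width times per-pin data rate; a minimal sketch (assuming the 224/256 GB/s split refers to the 7 Gbps 4 GB and 8 Gbps 8 GB RX 480 variants):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps.
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gbs(256, 7.0))  # RX 480 4GB (7 Gbps GDDR5): 224.0
print(mem_bandwidth_gbs(256, 8.0))  # RX 480 8GB (8 Gbps GDDR5): 256.0
print(mem_bandwidth_gbs(192, 8.0))  # GTX 1060   (8 Gbps GDDR5): 192.0
```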

Both Polaris and Vega underperformed due to poor GPU scheduling, yet they performed decently in some compute workloads, as some of them are easier to schedule.
 
Joined
Oct 3, 2019
Messages
139 (0.08/day)
Processor Ryzen 3600
Motherboard MSI X470 Gaming Plus Max
Cooling stock crap AMD wraith cooler
Memory Corsair Vengeance RGB Pro 16GB DDR4-3200MHz
Video Card(s) Sapphire Nitro RX580 8GBs
Storage Adata Gammix S11 Pro 1TB nvme
Case Corsair Carbide Air 540
They probably... borrowed some ideas from Imagination Technologies, hired some very talented and experienced people, and then threw a huge amount of money at the problem.
I really wonder why Intel didn't try to buy out Imagination. It would probably have been "cheap", and Imagination could have offered experienced, talented personnel and patents that might have helped Intel start producing better stuff sooner. After 2017, Imagination was probably begging someone to come and buy them.

They had some shares... up to 16% I see.

"On 22 June 2017, Imagination Technologies' board of directors announced it was putting the entire company up for sale[33] and, on 25 September 2017, they announced that the company was being acquired by Canyon Bridge, a private equity fund ultimately owned by the Chinese government.[34][35] In November 2017 the sale to Canyon Bridge was approved in a transaction which valued the business at £550 million (£1.82 per share)."

And you got the begging part right, too.
 
Joined
Sep 6, 2013
Messages
3,034 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
What you're saying here is just nonsense.
Why do people have to add this after posting their opinion? You said something, it looked logical, and then at the end you had to throw in an insult. Do you feel smarter by adding that sentence? Are you a game developer at a multi-billion-dollar company? Am I? This is a forum. Give your opinion and keep the "nonsense" comment in your head. Don't post it. And if you are NOT a game developer at a multi-billion-dollar company, how do you know that having more cores doesn't help? Today 4 cores, even with Hyper-Threading, are not enough; 12 threads are considered the minimum. That will become 16 threads in a year or two, and we might move to 24 or 32 threads as the minimum in 5-10 years to avoid performance problems. How do you explain that? That today we play with 12 or 16 threads to get the best performance? If this post were 5 years old, your post would have been the same, the only difference being me talking about 6+ core CPUs. And your arguments would be the same.

I totally ignored the rest of your post. You might have some good points there, but I really don't post here to get upset by every "I know better, you talk nonsense" individual.

PS: I just remembered you. You're that "I know everything, you know nothing" person.
OK, time to expand the ignore list. You are free to post whatever you like as a reply to my posts. I don't care.
 
Joined
Oct 27, 2020
Messages
789 (0.61/day)
After discovering that Intel's previous GPU performance slides bore little resemblance to real game performance, how is anyone expected to trust these Intel slides?

Intel, you were caught lying, and you haven't addressed that yet. Pull yourselves together FFS.
Which of Intel's Arc claims are you referring to, exactly?
Regarding the A380, they said that based on its Chinese SRP of 1030 yuan and the RX 6400's 1199 yuan, they had 25% better performance per price, which works out to the RX 6400 having about 6.87% worse performance than the A380 in the games they tested; TPU's own results came out quite close to that, correct?
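Here's how that 6.87% figure falls out of Intel's claim, a quick sketch using only the numbers quoted above:

```python
# Intel's claim: A380 (1030 yuan) has 25% better performance-per-price than the RX 6400 (1199 yuan).
a380_price, rx6400_price = 1030, 1199
perf_per_price_advantage = 1.25

# perf_a380 / 1030 = 1.25 * perf_6400 / 1199
# => perf_6400 / perf_a380 = 1199 / (1.25 * 1030)
relative_perf_6400 = rx6400_price / (perf_per_price_advantage * a380_price)
print(f"{(relative_perf_6400 - 1) * 100:.2f}%")  # about -6.87%, the figure quoted above
```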


Regarding the A750, they previously showed how it performed in 5 titles vs. the RTX 3060 (+6% up to +17%), saying at the same time that in some newer titles with DX12/Vulkan-based engines the performance delta could reach those levels if the game suits Arc's architecture, while also clarifying that the A750 won't look as good in all games and that many DX11 games will show low performance for the reasons they explained.
Now they have tested 50 games, and the performance claim is 3-5% higher than the RTX 3060.
I expect the TPU results to be very close to these claims, just like in the A380's case!
 
Joined
Oct 18, 2013
Messages
5,556 (1.44/day)
Location
Everywhere all the time all at once
System Name The Little One
Processor i5-11320H @4.4GHZ
Motherboard AZW SEI
Cooling Fan w/heat pipes + side & rear vents
Memory 64GB Crucial DDR4-3200 (2x 32GB)
Video Card(s) Iris XE
Storage WD Black SN850X 4TB m.2, Seagate 2TB SSD + SN850 4TB x2 in an external enclosure
Display(s) 2x Samsung 43" & 2x 32"
Case Practically identical to a mac mini, just purrtier in slate blue, & with 3x usb ports on the front !
Audio Device(s) Yamaha ATS-1060 Bluetooth Soundbar & Subwoofer
Power Supply 65w brick
Mouse Logitech MX Master 2
Keyboard Logitech G613 mechanical wireless
Software Windows 10 pro 64 bit, with all the unnecessary background shitzu turned OFF !
Benchmark Scores PDQ
Intel earlier this week released its own performance numbers
^^THIS^^

Let's see some real, INDEPENDENT, 3rd-party test results; then perhaps we can actually decide if these cards are OK, or just DOA :)

Come on TPU, surely someone here can convince team blue to give up a review sample ASAP, yes?
 
D

Deleted member 185088

Guest
Be sure to take us along for the ride; I find this product interesting, but I have enough issues to figure out as is :p
Also, DX11 performance is apparently bad as of yet, and seeing as one of the games I play the most is DX11, this card does not seem like it's for me ATM.
DX12 games are plagued with stutters due to shader compilation.
 
Joined
Aug 3, 2022
Messages
133 (0.20/day)
Processor i7-7700k @5ghz
Motherboard Asus strix Z270-F
Cooling EK AIO 240mm
Memory Hyper-X ( 16 GB - XMP )
Video Card(s) RTX 2080 super OC
Storage 512GB - WD(Nvme) + 1TB WD SDD
Display(s) Acer Nitro 165Hz OC
Case Deepcool Mesh 55
Audio Device(s) Razer Kraken X
Power Supply Asus TUF Gaming 650W Bronze
Mouse Razer Mamba Wireless & Glorious Model D Wireless
Keyboard Cooler Master K70
Software Win 10
Hmmm, finally the number of games has increased since the last leak. Now they are calling this a "special edition" something. Is their driver stable enough, or are they still doing workarounds?
These performance numbers seem pretty average :)
Waiting for the spec sheets of these special edition Arc cards, and also wondering what happened to their Arc 3 & 5 series. As of now, they have jumped straight to the Arc 7xx cards!
How will these cards play a vital role in the revenue game for Intel when the graphics division is at -$500M?

The marketing team is doing well enough to get these numbers out, and is working the YouTubers too; waiting for reviews of these special edition cards (FE vs. OEM).
 

64K

Joined
Mar 13, 2014
Messages
6,187 (1.66/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
^^THIS^^

Let's see some real, INDEPENDENT, 3rd-party test results; then perhaps we can actually decide if these cards are OK, or just DOA :)

Come on TPU, surely someone here can convince team blue to give up a review sample ASAP, yes?

IMO Intel is hesitant to release review samples because they are concerned that a thorough review, like the one it would get on this site, wouldn't make their GPUs look very good.
 
Joined
Jul 16, 2014
Messages
8,131 (2.26/day)
Location
SE Michigan
System Name Dumbass
Processor AMD Ryzen 7800X3D
Motherboard ASUS TUF gaming B650
Cooling Arctic Liquid Freezer 2 - 420mm
Memory G.Skill Sniper 32gb DDR5 6000
Video Card(s) GreenTeam 4070 ti super 16gb
Storage Samsung EVO 500gb & 1Tb, 2tb HDD, 500gb WD Black
Display(s) 1x Nixeus NX_EDG27, 2x Dell S2440L (16:9)
Case Phanteks Enthoo Primo w/8 140mm SP Fans
Audio Device(s) onboard (realtek?) - SPKRS:Logitech Z623 200w 2.1
Power Supply Corsair HX1000i
Mouse SteelSeries Esports Wireless
Keyboard Corsair K100
Software windows 10 H
Benchmark Scores https://i.imgur.com/aoz3vWY.jpg?2
All testing was done without ray tracing; performance enhancements such as XeSS or DLSS weren't used.
Every gamer I know would use these enhancements whenever possible. Typical Intel, changing the parameters to create a fake favorable result.
 
Joined
Sep 6, 2013
Messages
3,034 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Wasn't the A7XX card supposed to compete with the 3070? There were "leaks" posted nearly a year ago saying they were neck and neck with a 3070. I think we all know the fate of Arc. Intel is just going to milk this lame cow to get some scraps out of this failed project.
Rumors were saying 3080, then 3070, then just above the 3060 Ti for the top model.

Intel only needs problem-free hardware and software that offers some level of performance and compatibility that won't force consumers to return their systems to manufacturers/sellers. Intel can keep bleeding money and keep selling Arc or whatever future GPUs to OEMs at cost, or even below cost, if those GPUs are bought together with a big enough quantity of CPUs and chipsets. As long as Intel is improving its designs and software, it can afford to keep losing money. In the end it's an investment that could start generating billions in revenue, or even profit, for Intel in a few years. If they abandon GPUs, getting into supercomputers will become more and more difficult in the future. And if that happens, what are they going to do? Become the AMD of old in CPUs and try to stay alive through manufacturing?
 
Joined
Jul 16, 2014
Messages
8,131 (2.26/day)
Location
SE Michigan
System Name Dumbass
Processor AMD Ryzen 7800X3D
Motherboard ASUS TUF gaming B650
Cooling Arctic Liquid Freezer 2 - 420mm
Memory G.Skill Sniper 32gb DDR5 6000
Video Card(s) GreenTeam 4070 ti super 16gb
Storage Samsung EVO 500gb & 1Tb, 2tb HDD, 500gb WD Black
Display(s) 1x Nixeus NX_EDG27, 2x Dell S2440L (16:9)
Case Phanteks Enthoo Primo w/8 140mm SP Fans
Audio Device(s) onboard (realtek?) - SPKRS:Logitech Z623 200w 2.1
Power Supply Corsair HX1000i
Mouse SteelSeries Esports Wireless
Keyboard Corsair K100
Software windows 10 H
Benchmark Scores https://i.imgur.com/aoz3vWY.jpg?2
I hope they don't try that shit where you have to buy a K/unlocked version down the road to overclock your GPU.
An interesting twist, but it would indeed be sketchy.
 
Joined
Feb 20, 2019
Messages
7,443 (3.89/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this, though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Which of Intel's Arc claims are you referring to, exactly?
Maybe I'm missing something? Every site and streamer I read/follow (TPU, G3D, KG, GN, HUB, hell, even LTT) was massively disappointed by the A380 because it failed to live up to Intel's claims. I vaguely remember Intel saying that the A380 was originally supposed to be about on par with a GTX 1060. It's so late to market that Pascal was the relevant competition!

It turns out that yes, the A380 does actually match a 1060 (or falls somewhere between a 6400 and a 6500 XT), but only with ReBAR enabled, only in a modern motherboard with a PCIe 4.0 slot, only in games that benefit from ReBAR, only in DX12 or modern Vulkan titles, and only if the drivers actually work at all, which is not a given under any circumstances. That's on top of the fact that Intel clearly optimized for misleading synthetic benchmarks, as the 3DMark score is way out of line with the performance it demonstrates in any game titles.

The media being unanimously disappointed with Arc is not because of unrealistic expectations. Those expectations were set by Intel themselves, and the fact that post-launch (in China) Intel adjusted their claims to be more in line with how the card actually performs (including the whole laundry list of non-trivial caveats) is kind of just confirmation of that disappointment. It's even worse than it seems, too, because the target market for an A380 buyer isn't a brand-new machine playing modern AAA DX12/Vulkan titles. It's going to be someone looking to cheaply upgrade an old PCIe 3.0 board, likely playing older games, because the A380 isn't really good enough to give a great experience in the ReBAR-optimized AAA DX12 titles where Intel actually has acceptable performance.

Let's see if the results from independent reviewers match these official Intel graphs for the games shown when we actually have an official Arc A750 launch....
 
Joined
Apr 30, 2011
Messages
2,656 (0.56/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
We know that DX12 and Vulkan performance isn't totally bad. The thing is, for OpenGL, DX9-11, and power draw they keep the data close to their chest. Those will make or break their market share at the higher performance tiers, even with a better price than their competitors' GPUs.
 
Joined
Mar 18, 2015
Messages
178 (0.05/day)
Considering that it's likely to be a disaster in DirectX 9/10/11 and OpenGL titles, they can't price this thing at more than $200. At that price it might be worth taking a risk on, even if I expect Intel to abandon it in terms of driver support pretty quickly, because Linux and Mesa make that somewhat irrelevant. Any more than $200 and there's zero reason to consider it over an AMD or Nvidia alternative.
 
Joined
Mar 14, 2008
Messages
511 (0.09/day)
Location
DK
System Name Main setup
Processor i9 12900K
Motherboard Gigabyte z690 Gaming X
Cooling Water
Memory Kingston 32GB 5200@cl30
Video Card(s) Asus Tuf RTS 4090
Storage Adata SX8200 PRO 1 adn 2 TB, Samsung 960EVO, Crucial MX300 750GB Limited edition
Display(s) HP "cheapass" 34" 3440x1440
Case CM H500P Mesh
Audio Device(s) Logitech G933
Power Supply Corsair RX850i
Mouse G502
Keyboard SteelSeries Apex Pro
Software W11
Let me buy that...... oh.......
 
Joined
Apr 30, 2011
Messages
2,656 (0.56/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Considering that it's likely to be a disaster in DirectX 9/10/11 and OpenGL titles, they can't price this thing at more than $200. At that price it might be worth taking a risk on, even if I expect Intel to abandon it in terms of driver support pretty quickly, because Linux and Mesa make that somewhat irrelevant. Any more than $200 and there's zero reason to consider it over an AMD or Nvidia alternative.
Methinks they will try to sell those GPUs for $50 less than the GPUs they match in performance in DX12/Vulkan games. And if they don't sell well, they'll go on discount at a lower price.
 