
Intel Arc A750 Trades Blows with GeForce RTX 3060 in 50 Games

Joined
Feb 23, 2008
Messages
1,064 (0.18/day)
Location
Montreal
System Name Aryzen / Sairikiki / Tesseract
Processor 5800x / i7 920@3.73 / 5800x
Motherboard Steel Legend B450M / GB EX58-UDP4 / Steel Legend B550M
Cooling Mugen 5 / Pure Rock / Glacier One 240
Memory Corsair Something 16 / Corsair Something 12 / G.Skill 32
Video Card(s) AMD 6800XT / AMD 6750XT / Sapphire 7800XT
Storage Way too many drives...
Display(s) LG 332GP850-B / Sony w800b / Sony X90J
Case EVOLV X / Carbide 540 / Carbide 280x
Audio Device(s) SB ZxR + GSP 500 / board / Denon X1700h + ELAC Uni-Fi 2 + Senn 6XX
Power Supply Seasonic PRIME GX-750 / Corsair HX750 / Seasonic Focus PX-650
Mouse G700 / none / G602
Keyboard G910
Software w11 64
Benchmark Scores I don't play benchmarks...
Impressive numbers... Now, how are the visual quality, stability and crashes?
 
Joined
Jun 11, 2019
Messages
497 (0.27/day)
Location
Moscow, Russia
Processor Intel 12600K
Motherboard Gigabyte Z690 Gaming X
Cooling CPU: Noctua NH-D15S; Case: 2xNoctua NF-A14, 1xNF-S12A.
Memory Ballistix Sport LT DDR4 @3600CL16 2*16GB
Video Card(s) Palit RTX 4080
Storage Samsung 970 Pro 512GB + Crucial MX500 500gb + WD Red 6TB
Display(s) Dell S2721qs
Case Phanteks P300A Mesh
Audio Device(s) Behringer UMC204HD
Power Supply Fractal Design Ion+ 560W
Mouse Glorious Model D-
They were dumb. That's why they ordered a huge amount of wafers from Samsung, produced huge quantities of RTX 3000 cards, and are now discounting them like there's no tomorrow just to manage to sell them. They were dumb enough to order a huge amount of 5nm wafers from TSMC that they now wish to postpone receiving until 6 months after the agreed date, dumb enough to have to announce about $1.4 billion less revenue, and to suffer a huge drop in profit margin, from about 65% to about 45%. That's even lower than Intel's profit margin in Intel's current situation, and lower than AMD's, considering AMD is the little guy among the three.

Now if you think that a 12GB card on a much more expensive process, like TSMC's 5nm, with performance close to the 3090 Ti, will start selling at $650, then with all my heart I wish you end up correct. But if, in their current position after all those price reductions, they are at a 45% profit margin, then selling a card with those characteristics at $650 would be suicidal. Not to mention the price reductions we would have to witness on the currently unsold RTX 3000 cards. Can you imagine an RTX 3090 selling for $550? An RTX 3070 for $300? I doubt we are in a GTX 970 era. Not to mention inflation.
And it doesn't matter how much money they have in their bank accounts. For companies of this size, trying to be on top in new markets like AI, with competition from huge companies like Intel, Amazon, Google, Alibaba etc., the amount of money they have will never be enough.
Well, mainly, I don't think that Nvidia can afford to drop their market share at a time when growth is uncertain - it's their leverage for the near future. If we believe that 1) AMD is competitive and they actually care, and 2) they are in a better position margin-wise, then Jensen's pricing power should be limited by market rivals. Yeah, sure, they dropped all the way back to a number they haven't seen for more than a decade (you can't slow down CapEx immediately over one quarter like you cut prices!), but they're also in new territory where AMD can undercut at the high end (even if it's by $50) and Intel is about to release (admittedly bad) mainstream products that they promised to price aggressively.
And yes, I can imagine 3090s selling for $550. I mean, there are already used ones popping up at $850-900. Besides, even Rolex prices are coming back down this year, if we try to draw parallels with premium products, which these GPUs are. So yeah, I rest my case; I guess we'll see in a few months who's going to be right.
 
Joined
Jun 10, 2014
Messages
2,907 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Why do people have to add this after posting their opinion?
I posted nothing but facts; you posted incorrect statements. At the end of my correction I called your statement nonsense - that's not an insult. If you can't defend your statements, then you shouldn't make them. Grow up.
Here are your statements for the record:
... a 10+ core CPU and modern GPUs are huge carpets to hide underneath any performance problem.
Also an unoptimised game will sell more CPUs and GPUs than an optimized one, meaning not only you can market it faster, you can also get nice sponsor money from Nvidia, AMD and Intel, by partially optimizing for their architecture instead for everyones.

Today 4 cores, even with Hyper-Threading, are not enough. 12 threads are considered the minimum. That will become 16 threads in a year or two, and we might move to 24 or 32 threads as the minimum in 5-10 years to avoid performance problems. How do you explain that? That today we play with 12 or 16 threads to get the best performance? If this post were 5 years old, your post would have been the same, with the only difference being me talking about 6+ core CPUs. And your arguments the same.
Nice attempt of a straw man argument there. :rolleyes:
The literature on parallelization has been known since the 1960s, and the limits of scaling are described by Amdahl's law. This is basic knowledge in CS studies; don't attempt to approach this subject before understanding it. Assuming you're limited to the scope of gaming here: game simulation (the "game loop") and rendering are both pipelined workloads, which means you have to apply Amdahl's law to each step of the pipeline, and you need to resync all the worker threads before continuing. Combine this with fixed deadlines for each workload (e.g. a 100 Hz tick rate gives 10 ms for game simulation, a 120 Hz framerate gives 8.3 ms for rendering), and there is little wiggle room for using large quantities of threads for small tasks. Each synchronization increases the risk of delays, either from the CPU side (SMT) or from the OS scheduler. Delays at synchronization points pile up, and if the accumulated delay is large enough, it causes stutter or potentially game-breaking bugs. So in conclusion, there are limits to how many threads a game engine can make use of.
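To put rough numbers on this, here is a minimal sketch in Python of Amdahl's law applied against those fixed deadlines. The baseline times and parallel fractions are invented purely for illustration, not measurements from any real engine:

```python
def amdahl_speedup(parallel_fraction: float, threads: int) -> float:
    """Ideal speedup when only a fraction of the work can run in parallel."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / threads)

# (stage, single-thread time in ms, parallel fraction, deadline in ms)
stages = [
    ("simulation", 25.0, 0.70, 10.0),  # 100 Hz tick rate -> 10 ms budget
    ("rendering",  40.0, 0.90,  8.3),  # 120 Hz framerate -> 8.3 ms budget
]

for threads in (4, 8, 16, 32, 64):
    report = []
    for name, base_ms, p, deadline in stages:
        t = base_ms / amdahl_speedup(p, threads)
        report.append(f"{name} {t:5.1f} ms ({'ok' if t <= deadline else 'MISS'})")
    print(f"{threads:2d} threads: " + ", ".join(report))
```

Notice how the gains shrink quickly past 16 threads: with these made-up fractions, the serial portion of each stage sets the floor, not the core count.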

And if you think what I'm posting is opinions, then you're mistaken. What I'm doing here is citing facts and making logical deductions; these are essential skills for engineers.

Considering that it's likely to be a disaster in DirectX 9/10/11 and OpenGL titles, they can't price this thing at more than $200. At that price it might be worth taking a risk on, even if I expect Intel to abandon it in terms of driver support pretty quickly, because Linux and Mesa make that somewhat irrelevant. Any more than $200 and there's zero reason to consider it over an AMD or Nvidia alternative.
I think even $200 might be a little too much, considering AMD's and Nvidia's next gen will arrive soon. If the extra features like the various types of sync, OC, etc. in the control panel are still not stable, I would say much less. I think it might soon be time for a poll on what people would be willing to pay for it in its current state; for me it would probably be $120/$150 for the A750/A770, and I would probably not put it in my primary machine. But I want to see a deeper dive into framerate consistency on the A750/A770.
 

Draxivier

New Member
Joined
Aug 11, 2022
Messages
1 (0.00/day)
Maybe I'm dumb, but please can you check the numbers in the comparison table against the graphs?

On CoD Vanguard at 1440p, the A750 gets 75 fps and the RTX 3060 gets 107 fps.
But in the graph made by Intel, it says that they perform EXACTLY the same.
¿¿¿???
 
Joined
May 17, 2021
Messages
3,005 (2.70/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Maybe I'm dumb, but please can you check the numbers in the comparison table against the graphs?

On CoD Vanguard at 1440p, the A750 gets 75 fps and the RTX 3060 gets 107 fps.
But in the graph made by Intel, it says that they perform EXACTLY the same.
¿¿¿???

That will be corrected later, like everything in this "launch".
 
Joined
Jun 10, 2014
Messages
2,907 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Maybe I'm dumb, but please can you check the numbers in the comparison table against the graphs?

On CoD Vanguard at 1440p, the A750 gets 75 fps and the RTX 3060 gets 107 fps.
But in the graph made by Intel, it says that they perform EXACTLY the same.
¿¿¿???
Good spot.
Has anyone checked some of the rest?
It could be a mistake in either the table or the graph, but chances are the table is the more correct one.
 
Joined
May 17, 2021
Messages
3,005 (2.70/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Good spot.
Has anyone checked some of the rest?
It could be a mistake in either the table or the graph, but chances are the table is the more correct one.

A small family company doesn't have the people to check six graphs for errors. Give them a break, people.
 

64K

Joined
Mar 13, 2014
Messages
6,251 (1.67/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
I'm buying ARC even if it's total crap, just to keep Intel's dGPU division alive. God knows we need a big 3rd player in the dGPU market soooo badly.
Lisa & Huang must be kept in check, or their profit margins will keep going up and up and up...
Crypto Stealing GIF by KiwiGo (KGO)

The flip side of that coin is that if it is a turd, you will be rewarding Intel when they don't deserve it.
 
Joined
Aug 12, 2010
Messages
79 (0.02/day)
Location
Brazil
Processor Ryzen 7 5800X3D
Motherboard Biostar B450GT3
Cooling Wraith Max + 2x Noctua Redux NF-P12 / 1x NF-B9 / 1x NF-R8
Memory 2x8GB Corsair LPX DDR4-3200
Video Card(s) Gigabyte RX 6700 XT Gaming OC
Storage Netac N950e Pro 1TB NVMe
Display(s) Dell 32" S3222DGM - 1440P 165Hz
Case Riotoro CR1088
Power Supply Corsair VS550
Mouse Microsoft Comfort Mouse 4500
Keyboard Dell KB216
Software Windows 11 Pro
All I hope is that Intel can pull off the drivers side of the business. We desperately need a third player.
 
Last edited:
Joined
Oct 27, 2020
Messages
789 (0.60/day)
Maybe I'm missing something? Every site and streamer I read/follow (TPU, G3D, KG, GN, HUB, hell - even LTT) was massively disappointed by the A380 because it failed to live up to claims. I vaguely remember Intel saying that the A380 was originally supposed to be about on par with a GTX 1060. It's so late to market that Pascal was the relevant competition!

It turns out that yes, the A380 does actually match a 1060 (or falls somewhere between an RX 6400 and RX 6500 XT), but with the caveats that it's only with ReBAR enabled, only on a modern motherboard with a PCIe 4.0 slot, only in games that support ReBAR, only in DX12 or modern Vulkan titles, and only if the drivers actually work at all, which is not a given under any circumstances. Add to that the fact that Intel clearly optimised for misleading synthetic benchmarks, as the 3DMark score is way out of line with the performance it demonstrates in any actual game.

The media being unanimously disappointed with Arc is not because of unrealistic expectations. Those expectations were set by Intel themselves, and the fact that post-launch (in China) Intel adjusted their claims to be more in line with how it actually performs (including the whole laundry list of non-trivial caveats) is kinda just confirmation of that disappointment. It's even worse than it seems, too - because the target market for an A380 buyer isn't a brand-new machine playing modern AAA DX12/Vulkan titles. It's going to be someone looking to cheaply upgrade an old PCIe 3.0 board, likely playing older games, because the A380 isn't really good enough to deliver a great experience in the ReBAR-optimised AAA DX12 titles that Intel actually have acceptable performance in.

Let's see if the results from independent reviewers match these official Intel graphs for the games shown when we actually have an official Arc A750 launch....
I only replied to the following you said:
«After discovering that the previous slides of Intel's GPU performance bore little resemblance to the real game performance, how is anyone expected to trust these Intel slides?
Intel, you were caught lying, and you haven't addressed that yet»

Please send me the link to the previous Intel slides/statements that prove Intel was lying about gaming performance, because I'm not aware of them.

I said way back, when the rumors were that ARC-512 would land between the 3070 and 3070 Ti in performance, that it would be around or below the 3060 Ti, and that if the A380 matched the RX 570 it would be a miracle. I can't answer for the expectations of others.
I'm not aware of Intel setting these expectations, but leakers did (like I said in the past, they probably correlated the 3DMark scores that were leaked to them with actual gaming performance).
If you find any official Intel statement supporting these performance claims, please send it to me.
Sure, the hardware should be capable of near-3070 performance in some new DX12/Vulkan games well optimized for Intel's architecture (at some resolutions), and from what I can tell even Intel themselves thought the delta between synthetics and games would be smaller (so the software team underdelivered), but did they actually, officially, in a slide shown to the public, claim much higher performance than they claim here?
Edit:
I have a feeling that Intel's original internal performance forecast from a year ago was around 3060 Ti performance or slightly above for ARC-512, and around 13% below the GTX 1650 Super for ARC-128, so the software team underdelivered by around 10-12% imo.
 
Last edited:
Joined
Mar 1, 2021
Messages
413 (0.35/day)
Location
Germany
System Name Homebase
Processor Ryzen 5 5600
Motherboard Gigabyte Aorus X570S UD
Cooling Scythe Mugen 5 RGB
Memory 2*16 Kingston Fury DDR4-3600 double ranked
Video Card(s) AMD Radeon RX 6800 16 GB
Storage 1*512 WD Red SN700, 1*2TB Curcial P5, 1*2TB Sandisk Plus (TLC), 1*14TB Toshiba MG
Display(s) Philips E-line 275E1S
Case Fractal Design Torrent Compact
Power Supply Corsair RM850 2019
Mouse Sharkoon Sharkforce Pro
Keyboard Fujitsu KB955
You probably have to do the right rain dance, have the right software stack, the proper Windows update version, and of course the right game version to match Intel's claims. Oh, I forgot that you are forced to use a ReBAR system with the newest shiny UEFI so that your OpROM can talk to it.

Don't get me wrong, I would love one more player, but Intel did all the stuff that Raja did when he was at AMD: they pushed the hype train uphill and forgot to build rails downhill. That's typical PR vs engineering. We will see if they make it to Battlemage or not.
 
Joined
Sep 6, 2013
Messages
3,052 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later I got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Well, mainly, I don't think that Nvidia can afford to drop their market share at a time when growth is uncertain - it's their leverage for the near future. If we believe that 1) AMD is competitive and they actually care, and 2) they are in a better position margin-wise, then Jensen's pricing power should be limited by market rivals. Yeah, sure, they dropped all the way back to a number they haven't seen for more than a decade (you can't slow down CapEx immediately over one quarter like you cut prices!), but they're also in new territory where AMD can undercut at the high end (even if it's by $50) and Intel is about to release (admittedly bad) mainstream products that they promised to price aggressively.
And yes, I can imagine 3090s selling for $550. I mean, there are already used ones popping up at $850-900. Besides, even Rolex prices are coming back down this year, if we try to draw parallels with premium products, which these GPUs are. So yeah, I rest my case; I guess we'll see in a few months who's going to be right.
Nvidia mostly needs to expand into other markets. They have been doing it for years, by transforming the dumb graphics chip into a compute monster. They invested heavily in AI and machine learning, they spent billions on Mellanox, and they tried to become the next Intel by buying ARM, so they could dictate which direction the ARM architecture would prioritize - probably focusing on servers, desktops and laptops instead of mobiles. They failed, so they will have to play with what ARM offers them and what they can build on that, like Apple and others are doing. But to do so - to expand into other markets and be the protagonist there, not just the third or fourth option - they need money. They need higher profit margins than those they are "suffering" this quarter. So selling cheap, even if it would expand their market share, is probably NOT an option. Selling ultra-fast cards for cheap means low profit margins, AND those buying those cards wouldn't be customers again for the next 1-2-3-5 years. Selling for a premium means higher profit margins, another chance to get rid of RTX 3000 and RTX 2000 stock at prices that won't be below cost, and people without unlimited money buying cheaper models, meaning they will be customers again sooner.
Now, selling expensive (let's say the 3090 Ti at $700, a 4070 for $800, a 4080 for $1200 and a 4090 for $2000) will open up opportunities for the competition, right? Well, what can the competition do?
Intel. Absolutely NOTHING.
AMD. Follow Nvidia's pricing. Why? Because they don't have unlimited capacity at TSMC, and whatever capacity they have, they prioritize first for EPYC and Instinct, then for Sony and Microsoft console APUs, then CPUs and GPUs for big OEMs (with mobile probably coming before desktop), then retail Ryzen CPUs, and lastly retail GPUs. That's why AMD had such nice financial results: because they are the smaller player, with the least capacity, selling almost EVERYTHING they build to corporations and OEMs, not retail customers. We just get whatever is left. So, can AMD start selling much cheaper than Nvidia? No. Why? Because of capacity limitations. Let's say the RX 7600 is as fast as the RTX 3090 Ti and is priced at $500. A week after its debut, demand will be so high that the card will become unavailable everywhere. Its retail price will start climbing until it costs as much as the RTX 3090 Ti. AMD gets nothing from that jump from $500 to $700; retailers pocket the difference. We've already seen it. The RX 6900 XT, RX 6800 XT and RX 6800 came out with MSRPs that were more or less logical, if not outright good. Then the crypto madness started, and the mid-range and low-end models introduced later had MSRPs that looked somewhat higher than expected compared to the MSRPs of Nvidia's already-available models.

So, Nvidia can keep prices up and not fear losing 5-10% of market share, knowing that whatever losses they take to the competition will be manageable. On the other hand, their profits will be much higher, even with that minus 5-10% of market share.
Just an opinion of course. Nothing more.
 
Last edited:
Joined
Feb 20, 2019
Messages
7,484 (3.88/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I only replied to the following you said:
«After discovering that the previous slides of Intel's GPU performance bore little resemblance to the real game performance, how is anyone expected to trust these Intel slides?
Intel, you were caught lying, and you haven't addressed that yet»

Please send me the link to the previous Intel slides/statements that prove Intel was lying about gaming performance, because I'm not aware of them.

I said way back, when the rumors were that ARC-512 would land between the 3070 and 3070 Ti in performance, that it would be around or below the 3060 Ti, and that if the A380 matched the RX 570 it would be a miracle. I can't answer for the expectations of others.
I'm not aware of Intel setting these expectations, but leakers did (like I said in the past, they probably correlated the 3DMark scores that were leaked to them with actual gaming performance).
If you find any official Intel statement supporting these performance claims, please send it to me.
Sure, the hardware should be capable of near-3070 performance in some new DX12/Vulkan games well optimized for Intel's architecture (at some resolutions), and from what I can tell even Intel themselves thought the delta between synthetics and games would be smaller (so the software team underdelivered), but did they actually, officially, in a slide shown to the public, claim much higher performance than they claim here?
Edit:
I have a feeling that Intel's original internal performance forecast from a year ago was around 3060 Ti performance or slightly above for ARC-512, and around 13% below the GTX 1650 Super for ARC-128, so the software team underdelivered by around 10-12% imo.
Ah shit, you're going to ask me to find a specific slide from the last 3 years?
I am perhaps remembering it wrong, but I'm a deeply cynical man who expects the worst from everyone and I'm rarely disappointed. Intel have managed to achieve that.
The Intel leaks/PR/investor-talks/rumour mill has been strong, but it feels like 2-3 years of me constantly complaining about the flurry of news about a nonexistent product and broken promises.

It's late here; perhaps someone else can go back through 30+ months of Intel noise and the full history of Arc article hype to find one particularly incriminating slide, but I just have this vague feeling that Intel are constantly backtracking on what they said last time. I'll have a proper hunt through the dozens/hundreds of Intel Arc articles tomorrow if workload permits.

Claim to claim, it's all reasonable, but go back a year or so and they were promising the moon. If the A750 is out next week and matches these claims, then we're done here - Intel have delivered. Can you buy an A750 yet? No. Can you even buy an A380 in most of the world yet? No. They need to deliver a product to market (and not just to a foreign test slice of the market) before it's obsolete, and they need to deliver approximately the performance they promised. If the A750 doesn't launch worldwide for another 4-6 months, all of the promises made now are kind of worthless, because the price and performance of the current competition is in constant flux. As mentioned earlier, Intel's initial claims were comparisons to 2017's Pascal cards, because that's how overdue this thing is.

The flip side of that coin is that if it is a turd, you will be rewarding Intel when they don't deserve it.
This. We may need a third player in the market, but while AMD are the pro-consumer underdog compared to the incumbent market leader, Intel are so horrible (both anti-consumer and anti-competitive) that they are orders of magnitude worse than AMD and Nvidia combined. They have a full, varied, and longstanding history of well-documented monopolisation, anti-competitive malpractice, bribery, coercion, and blackmail.

Don't take my word for it: read the news articles from the last 30 years, Wikipedia, archive.org - whatever. You'd need to have been living under a rock to think Intel are the good guys. PC technology would likely be close to a decade ahead of where we are now if it wasn't for Intel playing dirty.
 
Last edited:

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.54/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
All I hope is that Intel can pull off the drivers side of the business. We desperately need a third player.
It won't reduce pricing, so it's moot.

unintel is in a big hole.

Ah shit, you're going to ask me to find a specific slide from the last 3 years?
I am perhaps remembering it wrong, but I'm a deeply cynical man who expects the worst from everyone and I'm rarely disappointed. Intel have managed to achieve that.
The Intel leaks/PR/investor-talks/rumour mill has been strong, but it feels like 2-3 years of me constantly complaining about the flurry of news about a nonexistent product and broken promises.

It's late here; perhaps someone else can go back through 30+ months of Intel noise and the full history of Arc article hype to find one particularly incriminating slide, but I just have this vague feeling that Intel are constantly backtracking on what they said last time. I'll have a proper hunt through the dozens/hundreds of Intel Arc articles tomorrow if workload permits.

Claim to claim, it's all reasonable, but go back a year or so and they were promising the moon. If the A750 is out next week and matches these claims, then we're done here - Intel have delivered. Can you buy an A750 yet? No. Can you even buy an A380 in most of the world yet? No. They need to deliver a product to market (and not just to a foreign test slice of the market) before it's obsolete, and they need to deliver approximately the performance they promised. If the A750 doesn't launch worldwide for another 4-6 months, all of the promises made now are kind of worthless, because the price and performance of the current competition is in constant flux. As mentioned earlier, Intel's initial claims were comparisons to 2017's Pascal cards, because that's how overdue this thing is.


This. We may need a third player in the market, but while AMD are the pro-consumer underdog compared to the incumbent market leader, Intel are so horrible (both anti-consumer and anti-competitive) that they are orders of magnitude worse than AMD and Nvidia combined. They have a full, varied, and longstanding history of well-documented monopolisation, anti-competitive malpractice, bribery, coercion, and blackmail.

Don't take my word for it: read the news articles from the last 30 years, Wikipedia, archive.org - whatever. You'd need to have been living under a rock to think Intel are the good guys. PC technology would likely be close to a decade ahead of where we are now if it wasn't for Intel playing dirty.
Applause lol

By the time they get these out, the newer GPUs will have trampled them.
 
Joined
Feb 21, 2006
Messages
2,014 (0.30/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Ca.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) AMD Radeon RX 7900 XTX 24GB (24.5.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 14TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c
I'm buying ARC even if it's total crap, just to keep Intel's dGPU division alive. God knows we need a big 3rd player in the dGPU market soooo badly.
Lisa & Huang must be kept in check, or their profit margins will keep going up and up and up...
Crypto Stealing GIF by KiwiGo (KGO)
Wrong gif

just-do-it-unacceptable.gif
 
Joined
Jan 15, 2021
Messages
337 (0.27/day)
I believe AMD and NVIDIA in the GPU market are a cartel. They must have the same shareholders. Why does NVIDIA need AMD? Because otherwise they would be the only company selling GPUs, which is a monopoly, and that is in theory illegal in the US.
 
Joined
Feb 21, 2006
Messages
2,014 (0.30/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Ca.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) AMD Radeon RX 7900 XTX 24GB (24.5.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 14TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c
I believe AMD and NVIDIA in the GPU market are a cartel. They must have the same shareholders. Why does NVIDIA need AMD? Because otherwise they would be the only company selling GPUs, which is a monopoly, and that is in theory illegal in the US.
So to follow this logic Intel and AMD are a cartel also?
 
Joined
Jul 29, 2022
Messages
383 (0.57/day)
Which Vega cards are you talking about?
Vega 56 performed slightly above the GTX 1070 but cost $500 (with a $100 "value" of bundled games).

And how was Polaris bandwidth-starved?
The RX 480 had 224/256 GB/s vs. the GTX 1060's 192 GB/s.

Both Polaris and Vega underperformed due to poor GPU scheduling, yet they performed decently in some compute workloads, as some of them are easier to schedule.
I don't know about Polaris, but Vega was very memory-inefficient. Vega 56 performed equal to Vega 64 once you set the 56 to the same memory clocks, despite the latter having more shaders. The TMUs were not being fed fast enough. Apparently Raja's team wanted to give it some form of tile-based rendering in hardware, but they couldn't get it working in the silicon, so the card needed a ton of memory bandwidth to perform. AMD fixed that issue with Navi AFAIK.
They also misconfigured the Vega cards with very high voltages, so they couldn't hold their boost clocks at all. On the Vega 56 I had, I lowered the GPU voltage a bit and it could keep its clock at 1700 MHz indefinitely, plus I bumped the memory speed to around 1100 MHz. It got around 24k in Fire Strike (graphics score), which put it between the 2070 and 2080, or the 1080 and 1080 Ti. This was the $400 Vega 56, mind you.
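For context, a back-of-the-envelope calculation of what that memory overclock means for peak bandwidth, assuming Vega 56's 2048-bit HBM2 bus (a rough sketch, not vendor-verified figures):

```python
# Peak HBM2 bandwidth, assuming Vega 56's 2048-bit bus (stock clock ~800 MHz).
def hbm2_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int = 2048) -> float:
    """Peak bandwidth in GB/s = clock * 2 (double data rate) * bus width / 8 bits per byte."""
    return mem_clock_mhz * 1e6 * 2 * bus_width_bits / 8 / 1e9

print(f"stock  800 MHz: {hbm2_bandwidth_gbs(800):.0f} GB/s")   # ~410 GB/s
print(f"OC    1100 MHz: {hbm2_bandwidth_gbs(1100):.0f} GB/s")  # ~563 GB/s
```

That 1100 MHz overclock would be roughly a 37% bump in raw bandwidth, which fits the point about the card being starved for it.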

This same guy who fucked up Vega is now working on Intel Arc, mind you.
 
Joined
Sep 6, 2013
Messages
3,052 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later I got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
They also misconfigured the Vega cards with very high voltages, so they couldn't hold their boost clocks at all.
I don't think this is misconfiguration. Let's say you produce 10 Vega GPUs and (random funny voltages) 5 need 2 V to work without problems at their base/turbo frequencies, 4 need 2.1 V, and one needs 2.2 V.
You can set the standard voltage for all cards at 2 V and throw away 5 cards, at 2.1 V and throw away only 1 card, or at 2.2 V and sell all of them as good working cards. I think companies just go with the third option. That's why we can undervolt most of our hardware to some degree and have no stability problems with it.
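Here is that trade-off as a toy model, using the same made-up voltages from the example above (hypothetical chips, purely illustrative):

```python
# Toy model of the binning trade-off: each chip has a minimum stable voltage;
# the factory picks one shipping voltage for the whole batch and discards
# whatever can't run at it. Voltages are the made-up ones from the post above.
required_v = [2.0] * 5 + [2.1] * 4 + [2.2] * 1  # 10 hypothetical chips

for shipping_v in (2.0, 2.1, 2.2):
    usable = sum(v <= shipping_v for v in required_v)
    print(f"ship at {shipping_v} V -> {usable}/10 chips usable, {10 - usable} thrown away")
```

Only the highest voltage ships all 10 chips, which is why the stock setting leaves undervolting headroom on the better samples.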
 
Joined
Jun 10, 2014
Messages
2,907 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I believe AMD and NVIDIA in the GPU market are a cartel. They must have the same shareholders. Why does NVIDIA need AMD? Because otherwise they would be the only company selling GPUs, which is a monopoly, and that is in theory illegal in the US.
I believe pretty much all the big American tech companies have the "same" shareholders, as the majority of their shares are owned by mutual funds, either retirement funds or funds owned by individuals. I seriously doubt these shareholders make a coordinated effort for AMD and Nvidia to maintain a duopoly.

I don't know about Polaris, but Vega was very memory-inefficient. Vega 56 performed equal to Vega 64 once you set the 56 to the same memory clocks, despite the latter having more shaders. The TMUs were not being fed fast enough. Apparently Raja's team wanted to give it some form of tile-based rendering in hardware, but they couldn't get it working in the silicon, so the card needed a ton of memory bandwidth to perform. AMD fixed that issue with Navi AFAIK.
Polaris and Vega have very similar performance characteristics, and they both have more memory bandwidth and TMU throughput than their Pascal counterparts, so that's not the issue, on paper at least. Their issue is resource management, which is why we see some synthetic workloads and the odd outlier do better than most games. The TMUs are responsible for transforming and morphing chunks of textures fed from the memory bus, but the efficiency of the entire GPU is very dependent on successful scheduling. Whenever you see simpler workloads scale significantly better than challenging ones, GPU scheduling is the major contributor, so this is very similar to the problems Intel have with Arc.

Keep in mind Vega 56 has the same scheduling resources as Vega 64, but fewer resources to schedule. We saw a similar sweet spot with the GTX 970 vs. the GTX 980, where the GTX 970 achieved more performance per GFLOP and came very close when run at the same clocks.

They also misconfigured the Vega cards with very high voltages, so they couldn't hold their boost clocks at all. On the Vega 56 I had, I lowered the GPU voltage a bit and it could keep its clock at 1700 MHz indefinitely, plus I bumped the memory speed to around 1100 MHz. It got around 24k in Fire Strike (graphics score), which put it between the 2070 and 2080, or the 1080 and 1080 Ti. This was the $400 Vega 56, mind you.
That's a misconception. If you think all Vega 56s would remain 100% stable in all workloads throughout the warranty period at such a reduced voltage, then you're wrong. The stock voltage is a safety margin to compensate for chip variance and wear over time; how large this margin is depends on the quality and characteristics of the chip.

This same guy who fucked up Vega is now working on Intel Arc, mind you.
I tend to ignore speculation about the effects of management, regardless of whether it's good or bad.
I know all the good engineering is done at the lower levels, but management and middle management still need to facilitate that, through priorities and resources.

But I think this guy is a good example of someone failing upwards.
 
Joined
Jun 10, 2014
Messages
2,907 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I find it quite misleading when Intel portrays newer games as a better fit for their architecture when their own numbers seem to show the opposite. Take for instance F1 2021, which was one of the first games they showcased, achieving +15% vs. the RTX 3060. But now look at the newer F1 2022, running the same engine: -11%. Now I don't know whether the newer game is somehow bad, but this doesn't bode well. So let's look at some other examples:
CoD Warzone -> CoD Vanguard: +53% -> -30% (same game engine)
Battlefield V -> Battlefield 2042: -10% -> 0% (same game engine)
So it's a mixed bag, to say the least.
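For anyone who wants to check the remaining games the same way: these deltas fall straight out of the fps columns. A minimal sketch using the CoD Vanguard numbers quoted earlier in the thread:

```python
# Sanity check of the deltas above, computed from the fps numbers quoted
# earlier in the thread (A750 75 fps vs. RTX 3060 107 fps, CoD Vanguard 1440p).
def relative_delta(a750_fps: float, rtx3060_fps: float) -> float:
    """A750 performance relative to the RTX 3060, in percent."""
    return (a750_fps / rtx3060_fps - 1.0) * 100.0

print(f"CoD Vanguard 1440p: {relative_delta(75, 107):+.0f}%")  # -> -30%
```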

Among the games viewed as favorable to Arc there are also many that are either very old or not particularly advanced graphically; to mention a few: Fortnite (Unreal), Deep Rock Galactic (Unreal), PUBG (Unreal), WoW, Arcadegeddon, Dying Light 2, Warframe, and a lot more Unreal games. So most of these are either not very advanced games (often Unreal) or ~10-year-old engines with patched-in DX12 support.

I'm not claiming any of these are bad games, or even invalid for a comparison. My point is that Tom Petersen and Ryan Shrout, in their PR tour with LTT, GN, PCWorld, etc., claimed newer and better games will do better on Intel Arc, and I don't see the numbers supporting that. To me it seems like the "lighter" (less advanced) games do better on the A750 while the "heavier" games do better on the RTX 3060, and my assessment based on that is that future games will probably scale more like the heavier ones. I would like to remind people of a historical parallel: years ago AMD loved to showcase AotS, which was demanding despite not-so-impressive graphics, and which was supposed to showcase the future of gaming. Well, did it? (No.)

What conclusions do you guys draw?
 
Joined
Jun 22, 2021
Messages
73 (0.07/day)
Location
CZ
Processor Ryzen 7 5800X
Motherboard B550 Aorus Elite v2
Cooling Arctic Liquid Freezer II 240 ARGB w/ Conductonaut
Memory 4x8 GB HyperX Fury 3600MHz
Video Card(s) MSI RTX 3080 Ti Gaming X Trio
Storage 970 EVO 1TB, 970 EVO Plus 2TB
Display(s) Omen 32q
Case SilentiumPC Regnum RG6V EVO TG ARGB
Audio Device(s) Topping D10s+L30, Emu Teak, AH-D5200&D2000, HD660S2, Focal Elegia, SR325x, 7Hz Zero:2, MDR-1AM2, etc
Power Supply Corsair RM850
Mouse Xtrfy M4
Keyboard Alloy Origins 60
Software Win11
Benchmark Scores very high of course
Maybe I'm dumb, but please can you check the numbers in the comparison table against the graphs?

On CoD Vanguard at 1440p, the A750 gets 75 fps and the RTX 3060 gets 107 fps.
But in the graph made by Intel, it says that they perform EXACTLY the same.
¿¿¿???

It seems they swapped the Warzone and Vanguard numbers in the 3060 column by mistake.
 