
Intel Arc Board Partners are Reportedly Stopping Production, Encountering Quality Issues

No, it's more like the AIBs can smell the blood in the water and don't want to be stuck holding buckets of blue chum when they already have red and green flavors of chum they can't get rid of.
Crashing prices and a slower-than-expected ramp of demand have led to severe overstock.
Combine all of that with rumors swirling about production and silicon-level defects, and the AIBs want nothing to do with Arc.

I was really hoping the rumors of a train wreck and mismanagement were just scuttlebutt, but it's looking more and more like Intel royally screwed the pooch and their entire product stack is now DOA.
 
Honestly, I was hoping Intel would come to the world of GPUs and help keep everyone on point.

They were probably bribed to not finish :D
 
This kind of quality-concern claim doesn't tell us much without knowing what kind of concern they are talking about. It could mean anything from a bad reference design/spec to die/package quality to all kinds of problems in the architecture, if it's true at all.

One thing that's important to keep in mind: the lessons Intel have learned from running this generation of GPUs will probably not be implemented in their next generation, but in the one after that, as their next gen is probably in the later design stages by now.

Intel wants AIBs to eat shit on profits by selling to OEMs and system integrators to gain marketshare, yet at the same time Intel is unwilling to subsidise the prices of their own fucking GPUs in order to build that same marketshare.
Assuming these claims are true at all, of course.

Anyways, I'm puzzled over this claim, as AIBs are usually not selling volumes to OEMs, so asking AIBs to sell to OEMs seems like a strange request, unless the AIBs have already made the cards and "can't" sell them. Nvidia and AMD usually sell reference designs to OEMs directly, as there is usually no reason for an AIB to get involved in those "bare bone" cards. And as we know, OEM GPUs are high volume and low margin products.
 
Compared to the aftermarket, OEM is the bigger cash flow, initial investment notwithstanding.
The number of people who buy dGPUs is incomprehensibly small compared to the volumes that Dell and HP move.
 
With every passing day, Moore's Law Is Dead's leak/report/hearsay seems more and more likely. The first time I heard about it was on LTT's WAN Show, and those guys rarely if ever cover the rumor mill.

 
Ahhhh, recession across the globe is affecting the rich now. I didn't see every market possible crashing when they raised prices of food, who would have thunk it?

Oh, well I'm canceling everything possible not needed right now to save money so I can still eat good and raise my family. GOOD LUCK EVERYONE ELSE!
Yup, I'm using my hard-earned pay to maintain my vehicle.

PC upgrades are dead last for me.

You're all weird, is what you are.

I wonder if this was an issue with the GPUs themselves, or a component? Look what happened to the 30 series at launch with the MOSFET issue.
So does the toilet water rotate in the opposite direction? Do you drive on the wrong side of the road? ;)

Honestly, I was hoping Intel would come to the world of GPUs and help keep everyone on point.

They were probably bribed to not finish :D
I disagree. Intel wouldn't have caused GPU prices to change, as they would have priced theirs where both AMD and Nvidia have theirs, given the time they spent engineering this trash just to recover their losses.

To me this is turning into another i740, or Larrabee: crappy drivers, as proven by the Intel iGPUs, aka a non-starter / dead on arrival.
 
Intel wants Intel-only builds from OEMs: no third-party Wi-Fi, Ethernet, graphics, etc.

I'm guessing they don't have enough GPUs to do so with whatever fault has popped up, and want those cards meant for retail to fill in the gap?
 
Since nothing is mentioned about these supposed QC issues, it may be too early to conclude it's a bad chip. It could be a reference design that needs improvement, but hey, there were issues with the Ampere release as well, if you recall. Considering that Intel is inexperienced in designing dGPUs, I don't expect them to deliver a flawless product. Objectively, Arc seems to be in a mess now, but seriously, nobody, including Intel, should expect flawless success on the first attempt. Deep pockets only mean more people working on R&D and more money for cutting-edge technology and marketing. That does not mean they WILL get it right. Veterans like Nvidia and AMD have stumbled before and are expected to stumble again at some point.
I do hope Intel keeps up the tempo, because I feel they have crossed a significant milestone by delivering a working dGPU. Performance may not be great, but it is still decent, let down mostly by driver stability, which should not be impossible to fix.
 
Too many delays killed it. This had to be released when there was a shortage of GPUs due to the cryptocurrency rush. Now it's just too late, unless they can price it right, but ROI might not allow that.
 
Fury and Vega were his "hype". Neither lived up to it. Yes, I still owned both…

It's so weird to me. I was perfectly active in the space when those came out, but I never saw this "hype", never saw things being overpromised or whatever.
I always thought, and still think, the Fury X is one of the coolest cards ever released: small, black/stealthy, water-cooled, with HBM memory, and because of that always the exception when a game said it needed 6 GB of VRAM; they always added that the Fury X would work as well because of its fast memory.

And Vega 56 at least was a superior price/performance card compared to Nvidia's equivalent. Yes, the 64 was not worth what they asked, sure, but the 56 was a fine option.

So I never understood, and still don't understand, the retroactive negativity it gets.
 
Raja! Raja! He is living in a bubble.
 
Intel needs to be broken into two companies: one that designs chips and one that runs the fabrication. The whole design-and-fab thing STILL isn't working for them.

If they want to continue on this course, they will end up relying on low-margin, high-quantity products to make money. The high-end desktop and server markets are being consumed by Apple, Arm, AMD, and Nvidia.

I seriously doubt Intel will be able to get their shit together.
 
Everybody loses; I don't get all the happiness.
 
Shit, Vega was his best work!
You might be right there. Not sure if that's a positive remark or not, though :D

1.5~2 years too late, zero profit margin, built on rare HBM, and no future roadmap in gaming ;)
Sounds remarkably like Arc, sans HBM. Heck it even shares the early availability woes!

Everybody loses; I don't get all the happiness.
There wasn't anything to lose with Intel, we didn't have anything yet. Except Raja showing us chips. And more chips.
 
Well, this time it got further than Larrabee.
 
It's so weird to me. I was perfectly active in the space when those came out, but I never saw this "hype", never saw things being overpromised or whatever.
I always thought, and still think, the Fury X is one of the coolest cards ever released: small, black/stealthy, water-cooled, with HBM memory, and because of that always the exception when a game said it needed 6 GB of VRAM; they always added that the Fury X would work as well because of its fast memory.

And Vega 56 at least was a superior price/performance card compared to Nvidia's equivalent. Yes, the 64 was not worth what they asked, sure, but the 56 was a fine option.

So I never understood, and still don't understand, the retroactive negativity it gets.
You must have had some pretty big blinders on



This wasn't just a GPU, man. This was 'an uprising', 'a revolution'; it was the mobilization of gamers worldwide to finally show team green what's what.
This was the culmination of efforts post-RX 480, Raja's favorite dropped baby:

Wow, look at all of you bashing Intel! If Intel hadn't tried to rush to market before the mining bubble collapsed, everyone would be praising Raja.

But since we got the Pin the Tail on the Donkey game out... Intel still employs Raja?
Hang on.. this was Intel 'rushing to market'? :oops:
 
There wasn't anything to lose with Intel, we didn't have anything yet. Except Raja showing us chips. And more chips.

We lost a possible competitor. I doubt they will continue the bet on GPUs for much longer, being pressed by the financial results. Either Gelsinger gets fired or he will can it to keep his job.
 
It's so weird to me. I was perfectly active in the space when those came out, but I never saw this "hype", never saw things being overpromised or whatever.
I always thought, and still think, the Fury X is one of the coolest cards ever released: small, black/stealthy, water-cooled, with HBM memory, and because of that always the exception when a game said it needed 6 GB of VRAM; they always added that the Fury X would work as well because of its fast memory.

And Vega 56 at least was a superior price/performance card compared to Nvidia's equivalent. Yes, the 64 was not worth what they asked, sure, but the 56 was a fine option.

So I never understood, and still don't understand, the retroactive negativity it gets.

Even though your Fury or Vega 56 worked well as a graphics card, it was originally designed as a compute-oriented card. Yes, both excelled when you threw computational workloads at them, but they performed badly to worse in games, sometimes drawing considerably more power than the competition (Nvidia). It was a card that was basically wasting quite a few resources to achieve the same result. Polaris was a good 1080p card, but a clear demonstration of a chip clocked past its efficiency sweet spot in order to compete with the 1060.

They excelled at mining, but they had issues with, for example, DX11 titles. Nvidia was just faster, and it took AMD quite a few driver revisions to get on par.

Now the exact same thing is happening with Intel and its Arc lineup. They are all leftovers from a computational graphics card line. They do provide graphics, but usually at 50% higher power consumption compared to the rest. On top of that, there are quite a few driver issues going on, which makes the card a complete gamble when you buy and try it. The delays were simply revisions because the card just didn't perform as expected, or didn't meet quality targets. Raja isn't a bad individual; I just think he needs to stop doing the obvious, because it's not working.

AMD redesigned their GPU architecture completely with RDNA, and with success. Raja left prior to that, and it was probably the best thing for them to let him go.
 
Polaris was a good 1080p card, but a clear demonstration of a chip clocked past its efficiency sweet spot in order to compete with the 1060.

That's not true; the 480 barely consumed more power than the 1060.

 
Hang on.. this was Intel 'rushing to market'?
I was leaning towards coincidence, since it wasn't that obvious, but the timing of events won out.
 
You must have had some pretty big blinders on



This wasn't just a GPU, man. This was 'an uprising', 'a revolution'; it was the mobilization of gamers worldwide to finally show team green what's what.
This was the culmination of efforts post-RX 480, Raja's favorite dropped baby:


Hang on.. this was Intel 'rushing to market'? :oops:
Kinda funny that you're calling out the seven-year-old RX 580, yet it beats this.
And it and Vega are still viable to use, as opposed to the Arc A###.

Not a dig, just an observation.
 
My take is the same as with any other Arc-related article: let's not hype it, let's not bury it - let's wait and see.
 
"All of this suggests that the new GPU lineup is on the verge of extinction, even before it has launched. However, we are sure that the market will adapt and make a case for the third GPU maker."

???
 
We lost a possible competitor. I doubt they will continue the bet on GPUs for much longer, being pressed by the financial results. Either Gelsinger gets fired or he will can it to keep his job.
For as long as they put finance guys in top-level technical positions, the results will always be the same. AMD and Nvidia have Lisa and Mr. Elon-wannabe Huang respectively (ironically, relatives), but those two are engineers first and good businesspeople second (did that come out PC enough?).

Intel, on the other hand, does the opposite: put in some big duck swingers with attitude and see the results. I honestly wonder how their board allows this. It looks to me like the board is none the wiser.
 
Since nothing is mentioned about these supposed QC issues, it may be too early to conclude it's a bad chip. It could be a reference design that needs improvement, but hey, there were issues with the Ampere release as well, if you recall. Considering that Intel is inexperienced in designing dGPUs, I don't expect them to deliver a flawless product. Objectively, Arc seems to be in a mess now, but seriously, nobody, including Intel, should expect flawless success on the first attempt. Deep pockets only mean more people working on R&D and more money for cutting-edge technology and marketing. That does not mean they WILL get it right. Veterans like Nvidia and AMD have stumbled before and are expected to stumble again at some point.
I do hope Intel keeps up the tempo, because I feel they have crossed a significant milestone by delivering a working dGPU. Performance may not be great, but it is still decent, let down mostly by driver stability, which should not be impossible to fix.
You make the most sense here compared to all the other comments.
 