
Intel Reorganises its Graphics Chip Division, Raja Koduri Seemingly Demoted

it is widely accepted that AMD has better image quality
Uhhh.... no? What you could say is that with DLSS, Nvidia gets much better FPS at the same image quality, though of course in games with FSR 2.x that lead shrinks. Same with ray tracing. I'll give AMD that their cards tend to age better thanks to driver improvements over time. AMD is great, but you'd think professional reviewers, members of the tech community, Tech Jesus and everyone else would criticize, or at least point out, any difference in image quality if one existed.

I'm not sure why you would say "widely accepted". Ask 100 tech-savvy or IT people, or whatever group you want, whether they think there is a difference in image quality between AMD and NVIDIA, and you'll get more NVIDIA votes than AMD, with 90+% answering "no difference"...
 
They are losing lots of money on this, plus there's no way they can come close to nVidia or AMD; better to kill it and move on to something else.
Were they expecting to strike gold after just one product release? If so, that was a very unrealistic expectation.

Optane was killed too early, what a wasted opportunity that was, and if they do the same here, people won't invest in first-gen Intel products as they might become abandonware quickly.

All they needed to do was loss-lead with a very attractive price point and wait until the drivers were ready so it made a better first impression, but the entire Arc thing smacks of impatience from the top.
 
I hope they continue to pursue GPUs but this restructuring sure doesn't look that way. It looks like the first steps in dissolving the entire division.
 
How many transistors are the critics here in charge of?

How many transistors are in an Arc GPU? About 21.7 billion? Who here has managed 20+ billion of anything? Who here has managed 200+ of anything? Store manager with 2 employees? Part time at a pizza shop?

It's always easy to throw criticisms and insults while standing outside looking in while making zero contribution to anything.

And your point? Whether he was in charge of a project with a 2-person team or 5,000 people doesn't change the fact that he over-promised and under-delivered, nor was this a one-off thing.
 
And your point? Whether he was in charge of a project with a 2-person team or 5,000 people doesn't change the fact that he over-promised and under-delivered, nor was this a one-off thing.

Nah, the over-promising and under-delivering is just an AMD thing, like what they are doing with the 7900 XTX without Raja.

Raja has been quietly working at Intel; hopefully they can make a breakthrough soon, the GPU market needs a competent competitor to Nvidia.
 
They are losing lots of money on this, plus there's no way they can come close to nVidia or AMD; better to kill it and move on to something else.
The A770 is around a 3060 Ti in performance, cheaper, with more VRAM for texture packs (and 3D rendering), and still in the early days of driver development.

I am getting myself one in January.
 
Hi,
Intel's software/firmware has always been pretty bad
But Raja, well, back when he was with Asus he fried lots of Haswell-E and Broadwell-E chips and boards, and just said on the ROG forum that it was because of weak chips :laugh:

Can his ego handle it ?

I had an Intel NIC sporadically dying on a perfectly working Asus AM4 board. Once it disappeared, it took the whole board with it two weeks later. Incredible how such hardware isn't tested properly before being put out.

As for Raja: he's still a key figure in designing compute-based hardware. That's what that whole line of Intel graphics cards really is. It's just a derivative of compute hardware that didn't meet the quality guidelines for a compute or professional card, just like Vega and Instinct, or GeForce and Quadro.
 
The A770 is around a 3060 Ti in performance, cheaper, with more VRAM for texture packs (and 3D rendering), and still in the early days of driver development.

I am getting myself one in January.

It also has, in some ways, higher hardware capabilities which may potentially be unlocked in future driver and software updates.

On the other hand, Intel might announce they are exiting discrete gaming cards, which would mean you bought abandonware that won't work properly with new games, etc...

It's a bit of a blow that the A770 isn't even on TechPowerUp's review charts for the RTX 4080 and RX 7900 XT - they only extend down to the RTX 3070 and RX 6800. I hope it will be included in the upcoming midrange card reviews.
 
Did that opinion really trigger you? It's called freedom of speech.


Could you provide us with scientific proof?
I don't really think Nvidia is that stupid.
They have done it before; it's just been a while, and people are quick to forget.
GTX 275 era. It's also not about stupidity but pride.

Huh, it appears they have done it at least 3 times over the decades; I was only aware of the one I experienced directly on the GTX 275, which made me sell it and go back to my 4850.
I use a 3080ti currently btw.


That said, this is about Intel, and currently Arc doesn't perform well enough to merit its use in modern games, doesn't properly support old games... and requires ReBAR, sooo.
 
Start his own firm, or join a smaller, lesser-known company as an executive?!



They are indirectly on topic.
Given the current state of affairs at AMD, maybe Lisa Su should follow because of the terrible graphics decisions... :rolleyes: Lower market share, dark forecast, bad product lineup, etc...
Dude, Lisa Su has brought stability to the CPU division!! The reason you can buy a $150 R5 5600 6-core CPU is partially because of her and what the rest of the team at AMD has done since 2017. Her track record in terms of execution, release schedule, and even performance has been spot on. Yes, they gambled on Vega & HBM and lost, but the 7900 XTX isn't a failure. I think/hope/pray they are on the cusp of coming up to par with Nvidia next gen. Put some respect on Lisa's name!! Lol
 
That's not how it works, the drivers need to be WHQL certified

But most of the drivers that AMD releases are NOT WHQL certified. Which opens another question - what exactly is this certification for, and does AMD even want to comply with it? :D
 
I've been using Arc for a month now, and the drivers are abysmally bad and updates are not coming at nearly a quick enough pace. Software is the only thing that will save the GPU division, and it looks grim.

Now you understand why Dr. Lisa Su fired him, he is pathetic.

Now one can see why AMD's graphics cards have always been playing second fiddle to NVIDIA. Why Intel thought it was a good idea to bring him on board, I don't know. Perhaps they should poach engineers from NVIDIA if they can and then make something decent? Intel's graphics card releases have been a disaster and not worth the materials they're made from.

I'd be embarrassed to release something as dysfunctional as that and would make an excuse, any excuse, not to release it.

Hope he gets better soon, though. One should still wish people better health regardless, especially at Christmas.
 
Were they expecting to strike gold after just one product release? If so, that was a very unrealistic expectation.

Optane was killed too early, what a wasted opportunity that was, and if they do the same here, people won't invest in first-gen Intel products as they might become abandonware quickly.

All they needed to do was loss-lead with a very attractive price point and wait until the drivers were ready so it made a better first impression, but the entire Arc thing smacks of impatience from the top.
They have been making GPUs forever now; drivers shouldn't have been an issue, nor should native support for DX11/10/9. They didn't fail where we expected them to (performance), but on the basic stuff. Killing it makes total sense, as they'll never compete with nVidia or AMD on performance or price; those two can just lower prices for their previous-generation GPUs and crush anything Intel comes up with.
The A770 is around a 3060 Ti in performance, cheaper, with more VRAM for texture packs (and 3D rendering), and still in the early days of driver development.

I am getting myself one in January.
Price-wise it is the same as a 3060 Ti/6700 XT, while being destroyed by both.
 
They have been making GPUs forever now; drivers shouldn't have been an issue, nor should native support for DX11/10/9. They didn't fail where we expected them to (performance), but on the basic stuff. Killing it makes total sense, as they'll never compete with nVidia or AMD on performance or price; those two can just lower prices for their previous-generation GPUs and crush anything Intel comes up with.

Price-wise it is the same as a 3060 Ti/6700 XT, while being destroyed by both.
3D rendering is an important part of my workflow - and for that, the 16 GB matters.
 
3D rendering is an important part of my workflow - and for that, the 16 GB matters.
Intel really can't do much for 3D rendering, because CUDA is the industry standard; OpenCL isn't really supported by anyone, and where it is supported, the performance isn't as good as on CUDA. AMD, Intel and Apple need to push support for their own compute APIs by paying software devs to support them.
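For what it's worth, at least one major renderer already has a non-CUDA path wired up: Blender's Cycles added a oneAPI backend for Arc in 3.3, alongside CUDA/OptiX and HIP. A rough sketch of switching backends from Blender's Python console, assuming Blender 3.3+ with an Arc card installed (treat it as a starting point, not gospel):

# Pick a Cycles GPU backend from Blender's Python console (assumes Blender 3.3+).
# "ONEAPI" targets Intel Arc; swap in "CUDA"/"OPTIX" for NVIDIA or "HIP" for AMD.
import bpy

cprefs = bpy.context.preferences.addons["cycles"].preferences
cprefs.compute_device_type = "ONEAPI"
cprefs.get_devices()                      # refresh the detected device list
for dev in cprefs.devices:
    dev.use = True                        # enable every device Cycles found

bpy.context.scene.cycles.device = "GPU"   # render on the GPU instead of the CPU

Whether the Arc backend is actually faster than CUDA on a comparable NVIDIA card is a separate question, but "nothing except CUDA is supported" is already slightly out of date.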
 
Wow! We can just feel the love in this thread..

Personally, I think Raja has done well. He took Intel's GPU efforts from passable IGPs to full-on, competitive, world-class discrete GPUs. They managed that in the middle of a pandemic and an economic recession. And all many of you can do is whine, complain and throw insults?

Sorry folks, but that's a fail for YOU, not Raja and his teams. Take a step back, use your brains for something more than a seat cushion and see reality for what it is! :shadedshu:
 
Wow! We can just feel the love in this thread..

Personally, I think Raja has done well. He took Intel's GPU efforts from passable IGPs to full-on, competitive, world-class discrete GPUs. They managed that in the middle of a pandemic and an economic recession. And all many of you can do is whine, complain and throw insults?

Sorry folks, but that's a fail for YOU, not Raja and his teams. Take a step back, use your brains for something more than a seat cushion and see reality for what it is! :shadedshu:

I also think his actual work (as an engineer) was good, but he was also the head of the project, and we can't forget the rest: the drivers and the communication were a disaster. Intel (which is in a bad spot) burned a lot of money on the project; you'd expect a bit more from it.
 
Lol, what is wrong with some people in this thread? I'd hate to be in a room with half of you trying to have a civil conversation. A lot of sad boys in armchairs throwing rocks at a giant. Raja could out-engineer and out-manage any one of you; the fact that he has the chops to be a lead architect and release a product AT ALL is astounding. I love that people are just making stuff up like "oh he got fired", just because you don't like him and that's what you wished had happened.

Then we have the tinfoil-hat image quality argument, which apparently started as a "joke", but when people claim that later it's usually just to save face. "nvidia did it 20 yers ago! I dun trust em!" There's no image quality difference between the two manufacturers, and anything that would produce image quality savings these days wouldn't give an appreciable performance boost (unless we're talking DLSS vs FSR, which no doubt the goalposts will now be moved to). Which goes to show how little most people understand about how far GPUs have moved in the last 10-20 years. Anisotropic filtering cheating isn't what it used to be, and you can't cheat floating-point precision in your shaders like you used to; there are hard standards you have to conform to to be DX12 certified. Even then, GPU testing is even MORE rigorous now than it used to be, much more empirical; any image quality difference would be reported far and wide through scientific testing.

Go back under your bridges.
 
Salty people will salt. That's the way it is.
Forums will see occasional rants.
Flamers will flame. They don't know otherwise.

I need to Haiku this.
 
His batting average hasn't been very good, but hopefully he can hit a homerun this time.
 
Intel really can't do much for 3D rendering, because CUDA is the industry standard; OpenCL isn't really supported by anyone, and where it is supported, the performance isn't as good as on CUDA. AMD, Intel and Apple need to push support for their own compute APIs by paying software devs to support them.
You seem to know better than the owner of the product, who chose it for specific reasons, what is best for them? As far as I know, Intel's discrete graphics driver issues relate mostly to performance when playing PC games. My impression is that in workflows with content creation and editing programs, the performance is better.
 
I disagree. No, they weren't perfect, but no one expected them to be. For an introductory product, they not only got a lot right, but they have also greatly improved and optimized what they didn't.

Software was a disaster. They clearly over-promised on the GPU side of things - performance, but especially the drivers, which were not ready for what we were told the product would be - and the old DX9 and DX10 compatibility situation should never have been a surprise for us to uncover like an easter egg. It was all very disastrous and a brand/PR nightmare.
 
I suspect nvidia is cheating at the driver level, providing lower image quality for higher framerates.

While on this topic, I feel like this is an important distinction... as consumers, our entire ability to compare video cards is based on the assumption that all frames are created equal. It's the common unit of measurement that allows us to compare across brands, BUT with the increasing encroachment of software manipulation through DLSS, FSR, and Nvidia's frame generation as the most egregious form, can we truly rest assured that a frame generated by an Nvidia card and one generated by an AMD card are exactly the same?

If they are not, and this may sound dramatic, then basically every review based on the FPS unit of measurement (which is literally every review) is worthless. Obviously the average consumer doesn't have the capability to verify that the frames are equivalent, so if some professional reviewers would tackle this issue, I feel like it would be of paramount importance to the entire community going forward.
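This is actually testable without vendor cooperation: capture the same frame on both cards (same scene, same settings, upscalers off) as lossless screenshots and compare them numerically. A minimal sketch in Python with NumPy and Pillow, assuming two hypothetical capture files named nvidia_frame.png and amd_frame.png:

# Compare two lossless frame captures pixel by pixel and report PSNR.
# Assumes identical resolution and identical in-game settings, DLSS/FSR disabled.
import numpy as np
from PIL import Image

def psnr(path_a, path_b):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    if a.shape != b.shape:
        raise ValueError("captures must have the same resolution")
    mse = np.mean((a - b) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

# Hypothetical file names for the two captures
print(f"PSNR: {psnr('nvidia_frame.png', 'amd_frame.png'):.2f} dB")

High PSNR (roughly 45 dB and up) is generally considered visually indistinguishable; a vendor systematically trading image quality for FPS would show up as consistently lower numbers across many captures, and something like SSIM could serve as a follow-up check.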
 