
Intel Reorganises its Graphics Chip Division, Raja Koduri Seemingly Demoted

I've been using Arc for a month now, and the drivers are abysmally bad, and updates aren't coming nearly quickly enough. Software is the only thing that will save the GPU division, and it looks grim.
Yep. Intel doesn't do so well software-wise and never has.
Now you understand why Dr. Lisa Su fired him. He's pathetic.
There's no doubt.
Hi,
Intel has always been pretty bad at software/firmware
Yep. The worst that I'd ever seen.
But Raja, back when he was with ASUS, fried lots of Haswell-E and Broadwell-E chips and boards, and just said it was because of weak chips on the ROG forum :laugh:

Can his ego handle it?
He doesn't have much choice, eh? :laugh:
I suspect Nvidia is cheating at the driver level, providing lower image quality for higher framerates.
They've been caught doing that before.
Where are all the defenders of Raja, saying he is awesome?
He has defenders? Well, if people will defend Jensen Huang, I guess it's not too much of a stretch. :D

Where's he gonna go now? I can't imagine the offers are filling his inbox. lol
Well, there's always Matrox! :D
 
Ah yes, Raja Koduri! He continues his tradition of overblown claims followed by abysmal failures.

His work on Vega got him kicked from AMD, and now his work on Arc has him demoted by Intel. The fact that Intel is keeping him shows how desperate they are.
He'll go down in history as the chip engineer with the most creative and innovative designs that didn't work.
 
He'll go down in history as the chip engineer with the most creative and innovative designs that didn't work.
Yup. It's kinda hard to feel sorry for him because you know that despite all of his failures, he's still making major bank. I wish that someone would pay me that well to fail at everything! :roll:
 
what we can only refer to as—demoted, given he's back to being chief architect rather than being in charge of the AXG business unit.

Curious. I dipped my toes into management for a few years and went back to engineering because I hated it. Not sure I'd call it a "demotion"; they are two different ladders at most big boy tech companies.

"Management" isn't the crown of the corporate ladder lol
 
Curious. I dipped my toes into management for a few years and went back to engineering because I hated it. Not sure I'd call it a "demotion"; they are two different ladders at most big boy tech companies.

"Management" isn't the crown of the corporate ladder lol
To me, whatever job pays the most is the crown of the corporate ladder. It's probably completely dependent on which company we're talking about.
 
To me, whatever job pays the most is the crown of the corporate ladder. It's probably completely dependent on which company we're talking about.

Not really. FAANG companies, for example, along with other big tech companies, and even ones you might not think are so big, like Micron or Logitech, certainly use different ladders.

These places aren't McDonald's. I certainly get paid more than the managers above me, and other principal engineers and architects get massive checks.
 
Let's hope Nvidia takes him on board :D

Nah, no need. They can pay anyone to stand around smiling holding a piece of silicon.
 
Nah, no need. They can pay anyone to stand around smiling holding a piece of silicon.

To be fair to the jacket, it probably is a really nice jacket.
 
You have to look at reality: Intel couldn't compete with two-year-old GPUs (the 770 is similar in size to the 3070). Intel came late to the party and knows that they won't be able to compete for several generations, and sales will be abysmal unless the cards are sold at a loss (as is probably the case with the 770).

iGPUs do count: they output to HDMI, DP, DVI, etc., decode video, and are compatible with all DX versions, hence they play games. You could argue it's difficult to scale them up, but it seems Intel's new GPU division didn't use that expertise. The performance could be excused, but the lack of compatibility with earlier DX versions and all the issues they have now cannot.
It should have been known from the start that they wouldn't be competing on performance, probably for at least two generations of products.

The DX and price stuff probably killed the market these GPUs could have been aimed at. I would have recommended a $100 Intel GPU to someone who wanted to play old games.

Top-end performance has historically been only a small part of the market.
 
You have to look at reality: Intel couldn't compete with two-year-old GPUs (the 770 is similar in size to the 3070). Intel came late to the party and knows that they won't be able to compete for several generations, and sales will be abysmal unless the cards are sold at a loss (as is probably the case with the 770).
The fact that the A770 didn't compete at the top end didn't surprise me. What I found lacking is its availability (still practically non-existent, months after release), its price, the terrible driver support, the high idle power consumption, and the inconsistent performance across different games. It does quite okay in some (especially with RT on), but completely tanks in others.

To be honest, I was more than ready to buy one. I couldn't care less that it performs at 2070-2080 level maximum, but the above things broke the deal for me.
 
Supporting very old DX versions is stupid. Whoever wishes to play such old games should use their old PC for that task, if it's still alive, instead of disposing of it as polluting e-waste.
 
The fact that the A770 didn't compete at the top end didn't surprise me. What I found lacking is its availability (still practically non-existent, months after release), its price, the terrible driver support, the high idle power consumption, and the inconsistent performance across different games. It does quite okay in some (especially with RT on), but completely tanks in others.

To be honest, I was more than ready to buy one. I couldn't care less that it performs at 2070-2080 level maximum, but the above things broke the deal for me.
How is availability non-existent? There are like 5 different Arc cards on Newegg…
 
How is availability non-existent? There are like 5 different Arc cards on Newegg…
Newegg... UK? Over here, there's literally one retailer that has it for £380.

Supporting very old DX versions is stupid. Whoever wishes to play such old games should use their old PC for that task, if it's still alive, instead of disposing of it as polluting e-waste.
I disagree. Just because a game is old, it doesn't mean I like it any less. Not having to mess with retro PCs just to play them is a must for me.
 
There are still quite a few people with old computers available. Your reasons are important to you personally. You cannot impose them on other people.
 
There are still quite a few people with old computers available. Your reasons are important to you personally. You cannot impose them on other people.
I don't. All I'm saying is, there are people out there who like playing both old and new games.
 
Hi,
I like the "seemingly demoted" part
He really got promoted, because now the blame for failure is someone else's problem, so it's back to team/Zoom meeting management :laugh:
 
There are still quite a few people with old computers available. Your reasons are important to you personally. You cannot impose them on other people.

Yet you are doing the exact same... are you ok, man?
 
Supporting very old DX versions is stupid. Whoever wishes to play such old games should use their old PC for that task, if it's still alive, instead of disposing of it as polluting e-waste.
One of the best things about PC gaming is the ability to play old and new games, with old games getting a boost in performance with new hardware.

Personally, I think lots of the new games are rubbish; we have 16-core desktop CPUs, yet we still don't have a game that comes close to Supreme Commander.

As for e-waste, I have a 10+ year-old laptop that's still perfect for word processing, Excel, web browsing, etc. (a 120 GB SSD can bring everything back to life). Instead of throwing away your old stuff, sell it or give it away.
 
One of the best things about PC gaming is the ability to play old and new games, with old games getting a boost in performance with new hardware.

Personally, I think lots of the new games are rubbish; we have 16-core desktop CPUs, yet we still don't have a game that comes close to Supreme Commander.

As for e-waste, I have a 10+ year-old laptop that's still perfect for word processing, Excel, web browsing, etc. (a 120 GB SSD can bring everything back to life). Instead of throwing away your old stuff, sell it or give it away.


I remember Total Annihilation; it was so fun to build thousands of robots. I haven't tried my hand at its spiritual successor, Supreme Commander. I don't think games nowadays are as experimental and creative as in the '90s to late 2000s; there's less room for that now, as companies need to make a profit.
 
Do people really think it will take only one iteration for Intel to reach AMD/Nvidia levels of polish?

Come on, it's a long-term project, both driver- and hardware-wise.

I think if you bought an Intel GPU, you also bought a "promise" that they'd improve the drivers over time. I think they will, but it will take time.


Otherwise, to the people claiming Nvidia is cheating on image quality... I don't know what to tell you. It's not like we have tools to measure image quality and back up those kinds of claims...
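
For anyone who actually wants to check: here's a minimal sketch, in Python with scikit-image, of how you could compare captures from two cards objectively. The file names are hypothetical placeholders; you'd need same-resolution screenshots of the exact same frame.

```python
# Minimal sketch: objectively comparing two same-resolution screenshots
# of the same frame. Requires scikit-image (pip install scikit-image).
# The file names below are hypothetical placeholders.
from skimage import io
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

reference = io.imread("card_a_frame.png")  # capture from card A
candidate = io.imread("card_b_frame.png")  # capture from card B

# SSIM: 1.0 means identical; noticeably lower values on the same static
# scene would suggest the drivers render visibly different output.
ssim = structural_similarity(reference, candidate, channel_axis=-1)

# PSNR in dB: higher means closer; identical images give infinity.
psnr = peak_signal_noise_ratio(reference, candidate)

print(f"SSIM: {ssim:.4f}  PSNR: {psnr:.2f} dB")
```

Run that on matched captures and you have actual numbers instead of forum claims.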

Price apart, the 7900 XTX is awesome, but so is the 4090. There's no need to use disinformation and lies to justify your preference, period.
 
I remember Total Annihilation; it was so fun to build thousands of robots. I haven't tried my hand at its spiritual successor, Supreme Commander. I don't think games nowadays are as experimental and creative as in the '90s to late 2000s; there's less room for that now, as companies need to make a profit.

Turning a profit doesn't seem to be the problem. Companies nowadays are just hell-bent on minimizing risk and maximizing profit. It's no longer simple risk aversion: even if a project has low risk and guaranteed profit, if it isn't enough of a profit, it won't be entertained. Until that changes, I don't believe MTXs, a direct symptom of this problem, will go away.
 
Otherwise, to the people claiming Nvidia is cheating on image quality... I don't know what to tell you. It's not like we have tools to measure image quality and back up those kinds of claims...
Besides, who's not cheating one way or another? AMD still has their "AMD Optimized" tessellation option, which I think caps the tessellation factor at a certain level, yet no one complains about it.
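
Conceptually (and this is just a rough Python illustration of the idea, not AMD's actual driver code), a driver-side tessellation cap boils down to clamping whatever factor the game requests:

```python
# Rough illustration of a driver-side tessellation cap (not real driver
# code): the factor the game requests is clamped to a driver-chosen max.
DRIVER_TESS_CAP = 16.0  # hypothetical cap; the real setting is user-tunable

def effective_tess_factor(requested: float) -> float:
    """Return the tessellation factor the hardware actually sees."""
    return min(requested, DRIVER_TESS_CAP)

# A game asking for 64x tessellation silently gets 16x:
print(effective_tess_factor(64.0))  # -> 16.0
```

Which is exactly the kind of quiet quality-for-speed trade people accuse Nvidia of, just with a settings toggle.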
 