
Intel Reorganises its Graphics Chip Division, Raja Koduri Seemingly Demoted

TheLostSwede

News Editor
Big things are afoot at Intel's graphics chip division once again: the company has just broken up its Accelerated Computing Systems and Graphics (AXG) business unit. For starters, Raja Koduri has been, for lack of a better word, demoted, as he returns to being chief architect rather than heading the AXG business unit. Some of his staff will move to other business units inside Intel as AXG ceases to exist. This doesn't mean Intel will stop making discrete consumer GPUs; at the very least, the Battlemage/Arc B-series launch is still planned to take place sometime in 2023.

At the same time, it looks like Raja Koduri will be out of action for what is likely to be at least a month, as he posted on Twitter that he's had emergency back surgery while on a business trip. How this will affect his transition back to the chief architect role is anyone's guess. However, he will not be focusing solely on GPUs in the future, but on Intel's broader range of products, particularly the integration of GPU, CPU and AI architectures. We've posted after the break the official statement Intel provided to Tom's Hardware. We also wish Raja a speedy recovery!




Discrete graphics and accelerated computing are critical growth engines for Intel. With our flagship products now in production, we are evolving our structure to accelerate and scale their impact and drive go-to-market strategies with a unified voice to customers. This includes our consumer graphics teams joining our client computing group, and our accelerated computing teams joining our datacenter and AI group.

In addition, Raja Koduri will return to the Intel Chief Architect role to focus on our growing efforts across CPU, GPU and AI, and accelerating high priority technical programs.

View at TechPowerUp Main Site | Source
 
"Raja Koduri demoted" what a surprise...
 
Raja the idiot
 
I've been using Arc for a month now, and the drivers are abysmal; updates aren't coming at nearly a quick enough pace. Software is the only thing that will save the GPU division, and it looks grim.
 
I've been using Arc for a month now, and the drivers are abysmal; updates aren't coming at nearly a quick enough pace. Software is the only thing that will save the GPU division, and it looks grim.
Now you understand why Dr. Lisa Su fired him, he is pathetic.
 
Hi,
Intel and software/firmware have always been a pretty bad combination.
But Raja, well, back when he was with Asus, fried lots of Haswell-E and Broadwell-E chips and boards, and just said it was because of weak chips on the ROG forum :laugh:

Can his ego handle it ?
 
Speaking of Lisa Su... I don't know if I can resist that 7900 XTX at 999 if it comes back in stock in next 30 or so days... hell, it beats a 4090 in AC Valhalla, and that is a game I was wanting to play soon. Would be nice getting 165 fps at 165hz 1440p ultra. :D

WHAT DO YOU THINK BOYS????? SHOULD LISA SU TAKE US TO THE MOON?!?!?!?!

:rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout::rockout:
 
"he's had emergency back surgery while on a business trip."
Occupational hazard for one who is bending over backwards to avoid being butt-shafted for a lacklustre GPU launch/product.
 
Im not insulting another member so...
 
I suspect Nvidia is cheating at the driver level, providing lower image quality for higher framerates.

It is widely accepted that AMD has better image quality.
 
Where are all the defenders of Raja, saying he is awesome?
 
Yes, it is a mystery why RTX 4090 is so much faster than RX 7900 XTX given pretty much the same resources on a hardware level.
Your comments are off topic, there's the door
 
Where is he gonna go now? I can't imagine the offers are filling his inbox. lol

Does he start his own firm, or join a smaller, less-known company as an executive?!

Your comments are off topic, there's the door

They are indirectly on topic.
Given the current state of affairs at AMD, maybe Lisa Su should follow because of the terrible graphics decisions... :rolleyes: Lower market share, dark forecast, bad product lineup, etc...
 
Yes, it is a mystery why RTX 4090 is so much faster than RX 7900 XTX given pretty much the same resources on a hardware level.

Not a mystery, they reduce the image quality, we just established this.
 
Someone is trying really hard... lmao
 
I suspect Nvidia is cheating at the driver level, providing lower image quality for higher framerates.

That's not how it works; the drivers need to be WHQL certified, which means a lot of testing and conformance. So, no.

Nvidia has a lot of experience with per-game optimizations, like 15+ years more than Intel in that domain.

Secondly, Nvidia uses a lot of power to get those high FPS; it's a power-hungry GPU.
 
That's not how it works; the drivers need to be WHQL certified, which means a lot of testing and conformance. So, no.

Nvidia has a lot of experience with per-game optimizations, like 15+ years more than Intel in that domain.

Secondly, Nvidia uses a lot of power to get those high FPS; it's a power-hungry GPU.
You literally just joined to freaking post that??
 
You literally just joined to freaking post that??
Yes I did.

I am aware that not everyone is a driver developer, and there are a lot of myths about that kind of thing. But there is no cheating, only lots of power and drivers optimized per game over years of iterations.

As for Intel, it's their first discrete GPU and it shows. I think they will get better over time, not only on the SW side but on the HW side. It's a decent GPU; once you start using 4K resolutions, it's clear the GPU has performance, it's just crippled by driver CPU overhead, most likely because of HW workarounds and whatnot.
 
Yes I did.

I am aware that not everyone is a driver developer, and there are a lot of myths about that kind of thing. But there is no cheating, only lots of power and drivers optimized per game over years of iterations.

As for Intel, it's their first discrete GPU and it shows. I think they will get better over time, not only on the SW side but on the HW side. It's a decent GPU; once you start using 4K resolutions, it's clear the GPU has performance, it's just crippled by driver CPU overhead, most likely because of HW workarounds and whatnot.
Erm, no. The i740 was Intel's first discrete AGP card, 25-30 years ago. They still can't get their iGPU drivers right. Raja just added fuel to the fire by being an idiot.
 
Yes I did.

I am aware that not everyone is a driver developer, and there are a lot of myths about that kind of thing. But there is no cheating, only lots of power and drivers optimized per game over years of iterations.

As for Intel, it's their first discrete GPU and it shows. I think they will get better over time, not only on the SW side but on the HW side. It's a decent GPU; once you start using 4K resolutions, it's clear the GPU has performance, it's just crippled by driver CPU overhead, most likely because of HW workarounds and whatnot.

Look boys!!! It's Elon!!! He resigned from Twitter and is now hanging out with us on TPU.

cool. cool cool cool.
 