Monday, February 18th 2019

Intel Acquires Indian Startup to Strengthen Position in Discrete GPU Tech

Several years ago, Ineda, a small startup from Hyderabad, India, made headlines when it developed custom-designed processors for wearable devices, optimized for high energy efficiency while still able to read out various sensors or listen for voice commands at the same time. Such improvements help extend battery life on devices that people don't want to recharge every day. Over the years the company has received several million dollars in funding from Samsung, Qualcomm, Imagination Technologies, and others.

It looks like this attracted enough attention at chip giant Intel, which is trying to come up with a competitive design for a discrete graphics processor that can take on AMD's and NVIDIA's offerings. While Ineda certainly holds patents that could come in useful, it seems Intel is more interested in the company's manpower. With around 100 engineers, the company has a lot of talent experienced in chip design and in making those chips energy efficient.
Besides the obvious implications for mobile devices, energy efficiency is the #1 constraint these days when it comes to graphics performance: power draw generates heat, which requires bigger and more expensive heatsinks and drives up fan noise levels. NVIDIA learned this after the GTX 480 and has since made constant improvements in this area, which is the foundation of their market dominance today.

Ineda Systems founder Dasaradha Gude used to be managing director of AMD India, which suggests ties with Raja Koduri, who headed AMD's Radeon Graphics department before moving to Intel not long ago to start up their discrete GPU project. This makes us speculate that Raja, who is also originally from Hyderabad, was instrumental in making this buy-out happen.

The Times of India reached out to Intel, which confirmed the all-cash purchase of Ineda but didn't provide many additional details:
"Intel acquired engineering resources from Ineda Systems, a silicon and platform services provider based in Hyderabad. This transaction provides Intel with an experienced SOC (system on chip) team to help build a world-class discrete GPU business."
Intel has a lot of software development staff in Hyderabad and is looking to expand that to several thousand engineers in the coming years. This is good news for the engineers, as they won't have to move, and it helps keep costs down because no relocations are needed. Source: Times of India

36 Comments on Intel Acquires Indian Startup to Strengthen Position in Discrete GPU Tech

#1
R0H1T
Interesting, wonder if BTA had any prior inside scoop?
#2
LocutusH
That render... i already want one
#3
Zubasa
LocutusH said:
That render... i already want one
As I remember this render was fan made, not from Intel.
#5
64K
I don't get hyped about anything anymore but I am certainly looking forward to what Intel brings next year in gaming GPUs. I hope they will bring a lineup at more reasonable prices than Nvidia because Nvidia badly needs the competition to come back down to earth. If Intel doesn't bring that then I haven't lost anything because right now we have AMD struggling due to lack of R&D money for GPUs and Nvidia overcharging just because they can imo.
#6
dorsetknob
"YOUR RMA REQUEST IS CON-REFUSED"
Do i smell a ATIntel brand in the offering :) Very Soon
#7
XXL_AI
intel already lost the cpu war, all of the operating systems shifting from complex instruction sets to dedicated ARM chips.

they are trying to catch the GPU train but it is too late for them. even if they release an expensive FPGA at a loss, it will be a huge problem for them to solve.
#8
Steevo
XXL_AI said:
intel already lost the cpu war, all of the operating systems shifting from complex instruction sets to dedicated ARM chips.

they are trying to catch the GPU train but it is too late for them. even if they release an expensive FPGA at a loss, it will be a huge problem for them to solve.
What crack pipe are you hitting? Intel lost the CPU war, I lost an ARM..... AMD they lost a leg. It's terrible this war, Intel had to layoff millions... The $USR is needing some CPU time, c'mon babe, just a little taste, a few extra cycles on that CISC with vector table lookup.
#9
XXL_AI
Steevo said:
What crack pipe are you hitting? Intel lost the CPU war, I lost an ARM..... AMD they lost a leg. It's terrible this war, Intel had to layoff millions... The $USR is needing some CPU time, c'mon babe, just a little taste, a few extra cycles on that CISC with vector table lookup.
you need to feel the reality stone impact on your forehead.
#10
Steevo
XXL_AI said:
you need to feel the reality stone impact on your forehead.
Is that like crack rock? Or more like pop rocks and soda?

I bet ARM could throw a football clear over those mountains. Like Uncle Rico. Just needs the chance. Man, if Intel didn't stop every other architecture from sucking at real world work, we could get things done!!!
#11
XXL_AI
Steevo said:
Is that like crack rock? Or more like pop rocks and soda?

I bet ARM could throw a football clear over those mountains. Like Uncle Rico. Just needs the chance. Man, if Intel didn't stop every other architecture from sucking at real world work, we could get things done!!!
funny kiddo. go ahead and play at another park.

there are tons of hardware coming with extended battery life and very good connectivity.
#12
Steevo
XXL_AI said:
funny kiddo. go ahead and play at another park.

there are tons of hardware coming with extended battery life and very good connectivity.
Is very good what we aim for anymore? How about OKish, cause when I want video games that look great I don't go much beyond OKish. Risc works well at one thing, reduced instruction computing it's even kinda in the name if you squint and look, and when put out in the real world it fails if asked to do anything beyond what it's meant for. Kinda like we don't use knives to eat soup, all the accelerated "stuff" people like to have like high def video, fast processing, high fps, eye candy, requires more silicon switches, and ARM cores are becoming more complex, but still suck compared to X86-64 for average compute loads.

But tell me how Intel and AMD can even sell a single CPU in the face of the monster of cell phones and cheap tablets for kids?
#13
64K
Steevo said:
Is very good what we aim for anymore? How about OKish, cause when I want video games that look great I don't go much beyond OKish. Risc works well at one thing, reduced instruction computing it's even kinda in the name if you squint and look, and when put out in the real world it fails if asked to do anything beyond what it's meant for. Kinda like we don't use knives to eat soup, all the accelerated "stuff" people like to have like high def video, fast processing, high fps, eye candy, requires more silicon switches, and ARM cores are becoming more complex, but still suck compared to X86-64 for average compute loads.

But tell me how Intel and AMD can even sell a single CPU in the face of the monster of cell phones and cheap tablets for kids?
I may be misunderstanding you but Intel and AMD sell tens of millions of CPUs for desktops and laptops every year. I read in one report that PC sales actually increased last year for the first time in 6 years. For example where I work we have around ten thousand desktops and laptops combined. That's just a conservative guess. It's probably more. The desktops are i3 2120s. It's a cheap CPU but it's plenty for MS Office.

Tablets are fine for surfing, email, videos etc but not so great for large Excel spreadsheets for one thing so I don't see Intel or AMD having any trouble selling CPUs any time in the near future.
#14
XXL_AI
Steevo said:
Is very good what we aim for anymore? How about OKish, cause when I want video games that look great I don't go much beyond OKish. Risc works well at one thing, reduced instruction computing it's even kinda in the name if you squint and look, and when put out in the real world it fails if asked to do anything beyond what it's meant for. Kinda like we don't use knives to eat soup, all the accelerated "stuff" people like to have like high def video, fast processing, high fps, eye candy, requires more silicon switches, and ARM cores are becoming more complex, but still suck compared to X86-64 for average compute loads.

But tell me how Intel and AMD can even sell a single CPU in the face of the monster of cell phones and cheap tablets for kids?
I'm not sure you and I living in the same era. most of the task on the computers now being handled by gpu acceleration, even web browsing. for that you don't need xeon, any arm cpu can handle a lot of task. for video editing, again the same story, for gaming? it is easy, not a rocket science to port the game instructions to run on ARM. start with NEON.
#15
bogmali
In Orbe Terrum Non Visi
Drop the namecalling please cause it is ruining the good conversation.
#16
Steevo
XXL_AI said:
I'm not sure you and I living in the same era. most of the task on the computers now being handled by gpu acceleration, even web browsing. for that you don't need xeon, any arm cpu can handle a lot of task. for video editing, again the same story, for gaming? it is easy, not a rocket science to port the game instructions to run on ARM. start with NEON.
Nah, 90% of the work is still CPU serial work. I use my GPU for folding at home and got a paper that tells me I know about computers that cost a lot to get. Maybe you should go design a ARM risc CPU to beat everyone, but that would make it a CISC architecture, either way, show us how it's done since you know better.
#17
Tartaros
XXL_AI said:
I'm not sure you and I living in the same era. most of the task on the computers now being handled by gpu acceleration, even web browsing.
You sure about that? Just because you can drop specific tasks on gpu doesn't mean you can go full risc and rely on gpu to do serious things. If that was the case then there wouldn't be so many fallen competitors from the risc side and powerpc would have survived in consumer grade products.

Just don't go that way. Sure ARM has evolved a lot but still it can do what it can do.
#18
XXL_AI
I invite you to take a look at Nvidia Jetson systems. you know, they can autonomously ride a car better than any pc worldwide.
#19
R0H1T
XXL_AI said:
they can autonomously ride a car better than any pc worldwide
Say what :wtf:
#20
Tartaros
XXL_AI said:
I invite you to take a look at Nvidia Jetson systems. you know, they can autonomously ride a car better than any pc worldwide.
Comparing apples to oranges won't get you anywhere.
#21
XXL_AI
R0H1T said:
Say what :wtf:
"What" is this; in the end of the day, everything can be done with software optimizations.

We're dedicated ourselves to run our applications on AARCH64 platform and we don't need x86_64 anymore. Because what we can do at a 30W power budget at a $1200 system is far more superior in the means of "realtime" processing compared to any PC on the planet.

Tartaros said:
Comparing apples to oranges won't get you anywhere.
Not comparing the two. you are so focused on using the hardware available to you, you stopped thinking about any other possiblity.
#22
dorsetknob
"YOUR RMA REQUEST IS CON-REFUSED"
XXL_AI said:
I invite you to take a look at Nvidia Jetson systems. you know, they can autonomously ride a car better than any pc worldwide.
link please or go home
#23
R0H1T
Because those are specialized apps & chips used for one particular task, in autonomous(?) cars. It's like saying AMD is better than Intel because they do encryption better.
#24
XXL_AI
dorsetknob said:
link please or go home
can you walk home or do you want me to summon my autonomous car to drop you off?
https://en.wikipedia.org/wiki/Tesla_Autopilot#Hardware_2


R0H1T said:
Because those are specialized apps & chips used for one particular task, in autonomous(?) cars. It's like saying AMD is better than Intel because they do encryption better.
nope, not even close to one particular task. there are around hundred examples that can be used with jetson processors.

follow the white rabbit;
https://elinux.org/Jetson_TX1
https://elinux.org/Jetson_TX2
https://elinux.org/Jetson_AGX_Xavier

I'm sure of one thing, you common-computer-fanboys won't understand what I'm talking about.
https://hackaday.com/2015/11/24/the-nvidia-jetson-tx1-its-not-for-everybody-but-it-is-very-cool/
https://hackaday.com/2017/03/14/hands-on-nvidia-jetson-tx2-fast-processing-for-embedded-devices/

a few of our examples include, firewalls, routers, signal generators, software-defined-radios, multiple audio video stream capture & upload hardware, single or multiple 4K capture hardware, VR systems.. and yeah, of course, autonomous cars.
#25
prtskg
64K said:
I may be misunderstanding you but Intel and AMD sell tens of millions of CPUs for desktops and laptops every year. I read in one report that PC sales actually increased last year for the first time in 6 years. For example where I work we have around ten thousand desktops and laptops combined. That's just a conservative guess. It's probably more. The desktops are i3 2120s. It's a cheap CPU but it's plenty for MS Office.

Tablets are fine for surfing, email, videos etc but not so great for large Excel spreadsheets for one thing so I don't see Intel or AMD having any trouble selling CPUs any time in the near future.
I think he was being sarcastic.