Wednesday, May 26th 2010

Intel Shelves Larrabee as a GPU Architecture

Intel has once again shelved plans to return to the discrete graphics market with its much-talked-about GPU, codenamed Larrabee. In a recent post on the company blog, Bill Kircos, Director of Product and Technology Media Relations, explained that the company's priorities at the moment lie with releasing industry-leading processors that have the graphics-processing horsepower for everyday computing. Intel HD graphics will only get better with the 2011 series of Core processors based on the Sandy Bridge architecture, where the graphics core will be completely integrated with the processing complex.

An unexpected yield of the Larrabee program seems to be that Intel has now perfected many-core processors. Since Larrabee is essentially a multi-core processor with over 32 IA x86 cores that handle graphics workloads, it could well give Intel a CPU that is a godsend for heavy-duty HPC applications. Intel has already demonstrated a derivative of this architecture, and is looking to induct it into its Xeon series of enterprise processors.

Source: Intel Blogs

16 Comments on Intel Shelves Larrabee as a GPU Architecture

#1
btarunr
Editor & Senior Moderator
Many Thanks to DanishDevil for the tip.
#2
FordGT90Concept
"I go fast!1!11!1!"
"Perfected many-core processors?" More like gave up on it. Well, not "gave up" (again), but delayed. At this rate, Duke Nukem Forever will be out first. :roll:

Disappointing but the project was always a major leap of the imagination and the leap appears to be too great for Intel to want to brave.
#3
WhiteLotus
I thought this was known ages ago?

EDIT: In fact I recall a news thread about it here.
#4
toyo
I could be wrong, but my opinion is that Intel ran some benchies with what Larrabee samples they had against the latest 5800 and GTX 480 and probably realised there's no competition... priority my a**... they just can't get the job done properly, so now they say it's not a priority any longer. Blah.
#6
Imsochobo
by: WhiteLotus
Yea here we go. From Dec. '09
they're dropping it altogether now; before, it was only dropped for consumers :P

but yeah, epic fail this time around i have to say.
#7
FordGT90Concept
"I go fast!1!11!1!"
Nope, just like before, Larrabee isn't dead; it has just been shelved indefinitely (last time, it was shelved until 2010). I suspect a derivative will surface as a Tesla-like product but the GPU functionality of it isn't too likely anymore.


Let's be perfectly honest, Larrabee as a viable GPU product was always a long shot. GPUs are fast because they are fundamentally simple. CPUs (the basis from which Larrabee comes) are fundamentally complex. The only way for complex to compete with simple is through much higher transistor counts. If Fermi was hot, a competitive Larrabee product would be positively melting. It's a shame Intel couldn't work their magic to make it happen but, ya can't beat physics at physics' own game. Intel hoped they could simplify x86 enough to make a viable GPU and, as it turns out, it just can't happen.
#8
slyfox2151
not to troll,

but i hope that IGPU is a typo.

nuff with the I srsly.
#9
FordGT90Concept
"I go fast!1!11!1!"
"iGPU" should be IGP (Integrated Graphics Processor), as opposed to discrete graphics card that contains a GPU. It annoyed me too.
#10
Initialised
Whatever happened to the x86 everywhere concept?
#11
Fourstaff
I have a feeling that they developed 'bee so that they could get a snapshot of how the GPU works, the same way AMD bought ATi. Hopefully we can shut off the internal graphics when it's not in use to conserve power. I support an architecture of two powerful processors aided by ~40 Atom-strength cores for heavily multithreaded apps.
#12
robn
I don't get Intel; they're so good at refining technology, but can't pull off brand new.

E.g. the latest CPUs are Nehalem+, derived from Core, which came from the Pentium M, which was a modified PIII, and that was an improved Pentium Pro. The chips perform excellently, so there's no reason not to do that, but possibly a whole new design could be even better. Of course, they did try a new architecture with the P4's NetBurst ...which ended up losing ground and got canned, whilst AMD came up with x86-64 and an IMC on the CPU die.

The 64-bit Itanium was also a mild disaster, so Intel now pushes x86 Xeons at business.

Two goes at powerful 3D graphics have failed (Larrabee, and the i740 back in 1998).

Use of RDRAM ...needs no elaboration!

All I can think of for success and innovation are the SSDs and sales / marketing teams! I was looking forward to Larrabee as well.
#13
1c3d0g
Give it up, Intel. You've been trying to make GPUs (in all kinds of forms) for years, yet you have consistently failed to produce even a partially working silicon prototype. Give_it_up! Buy a dedicated graphics company such as Imagination, Matrox, or, best of all, NVIDIA, and work your way up from there.

There's simply no need to WASTE billions on a doomed project when you can simply buy a company (you have plenty of cash, plus it's a great investment) that has the known expertise and, more importantly, a real working and ready-to-integrate graphics core, so you can implement it already and be done with this expensive Larrabee joke. :)
#14
NC37
Knew it was going to fail from day 1. If Intel was serious about graphics, they'd make actually competitive IGPs. Then maybe we'd have fewer people clogging up support forums asking why they can't run the apps they want to run on Intel crap.

Course the other part of it was that they tried to reinvent the wheel on how the GPU functions. Intel has been trying to push CPUs > GPUs for years, but it just isn't happening. So they came out with Larrabee instead, a GPU made out of CPUs. Intel may be the monster of the industry, but they have no standing power to change things when their current graphics tech just sucks.

It's like Intel as Stewart on Mad TV, suddenly sporadically flailing limbs and saying "hey, look what I can do!" Then ATI/NV go back to duking it out as if Intel never existed, seeing him as obviously not a threat.
#15
OneCool
Awwwww. Was the GPU market going too fast for ya, Intel?
#16
currahee440
I am thoroughly impressed with Intel's GMA solution. This is coming from someone who, as a kid, used a lot of integrated stuff. Running on a Radeon IGP 320, I could only run WolfET at 640x480 and still only got 12 fps if I was lucky. Gone are those days. The integrated stuff can run games, although not very well, and supports DX10. That in itself is an achievement, given there was once a huge discrepancy between integrated and discrete. Intel was really two steps behind back then; in fact, they didn't implement transform and lighting until their GMA X line of integrated processors.
They've really caught up, and it's not like 80% of the people in this world run games on their computers. If they do, they're just simple Flash games or casual games. Most mainstream users play on their Xboxes, Wiis, or PS3s.