Monday, December 7th 2009

Intel Larrabee Fails as a GPU Project

Intel's ambitious attempt at building a discrete GPU has been shelved, as reports emerge that the company is canceling the silicon's first implementation as a GPU and repurposing it as a "software development platform for internal and external use." In a statement issued to Internetnews.com, Intel spokesperson Nick Knupffer explained the company's current position on Larrabee, saying that development of both Larrabee's silicon (the chip) and its software is behind schedule. "Larrabee silicon and software development are behind where we hoped to be at this point in the project," he said. "As a result, our first Larrabee product will not be launched as a stand-alone, discrete graphics product, rather it will be used as a software development platform for internal and external use."

Larrabee as a discrete GPU made a lot of news in its short public life, and it was lent much credibility for coming from Intel, an IT industry heavyweight. At last month's SC'09 show, Intel demonstrated a Larrabee-based product, including an actual product design of the "Larrabee card." The company seemed to avoid calling it a discrete GPU, instead describing it as a "computational co-processor for the Intel Xeon and Core families." The description was reasonable, since by design Larrabee is a many-core processor that uses 32 IA cores interconnected by caches. At SC'09, Intel demonstrated its computational power, which peaked at over 1 TFLOP, but not before overclocking the chip.
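For a sense of where a 1 TFLOP figure comes from on a 32-core design, here is a rough back-of-the-envelope sketch in C. Only the 32-core count is taken from the report above; the 16-wide vector units, fused multiply-add (two floating-point operations per lane per cycle), and the roughly 1 GHz clock are assumptions drawn from Intel's public descriptions of the Larrabee architecture, not confirmed product specifications.

/* Back-of-the-envelope peak single-precision throughput for a
 * Larrabee-style many-core chip. The core count is from the article;
 * vector width, FMA, and clock are assumptions (see above). */
#include <stdio.h>

int main(void)
{
    const double cores          = 32.0; /* IA cores, per the article      */
    const double simd_lanes     = 16.0; /* assumed 512-bit / 32-bit lanes */
    const double flops_per_lane = 2.0;  /* assumed FMA: multiply + add    */
    const double clock_ghz      = 1.0;  /* assumed ~1 GHz core clock      */

    const double peak_gflops = cores * simd_lanes * flops_per_lane * clock_ghz;
    printf("Theoretical peak: %.0f GFLOPS (about %.1f TFLOP)\n",
           peak_gflops, peak_gflops / 1000.0);
    return 0;
}

Under those assumptions the theoretical peak works out to 1,024 GFLOPS, which is consistent with the chip needing an overclock to break the 1 TFLOP mark in a real workload.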

Market analyst Jon Peddie of Jon Peddie Research remains optimistic. "I believe they will definitely come back. Intel's commitment has not slackened. The part is being repositioned as an HPC co-processor, where I think it will do very well," he said. "They learned a whole lot from this. A whole lot. They are not going to throw that investment or knowledge away. I wouldn't be surprised to see them come back in a few years with a graphics part. Intel could decide to follow the high-performance trail like AMD is doing with Fusion," he added.
Source: Internetnews.com

41 Comments on Intel Larrabee Fails as a GPU Project

#2
pmrdij
somewhere Nelson Muntz can be heard saying...

- Robert (pMr)dEATHiNjUNE
#3
locoty
it should give NVIDIA some more time with Fermi (if it is real) :roll:
#4
afw
Well tried INTEL ... :toast: ... hope the next try bears fruit ...
more competition means better products, and the prices will be great ... :D ....
#5
WarEagleAU
Bird of Prey
Epic fail. I was actually looking forward to seeing how this would turn out and would have loved to try it myself. Hopefully it will get going eventually.
#6
VulkanBros
Well ... bad for competition ... but I would have been really surprised if Intel had succeeded in making an x86 product a serious competitor to Nvidia/ATI ....
#7
v12dock
Block Caption of Rainey Street
Where is the Like button...
#9
A Cheese Danish
Seems like Larrabee will just be a software development platform for now.
I was hoping for something cool :(
#10
Static~Charge
The ghost of the i740 is coming back to haunt Intel....
#11
FordGT90Concept
"I go fast!1!11!1!"
I hope they can get the DX/OGL version to market in 2011, if not late 2010. I'm sad it's getting pushed back, but I'm now happy with my HD 5870 purchase, seeing as I couldn't have waited over a year to upgrade (my 8800 GT was showing its age).
#12
erocker
*
Larrabee = in the trash can

Die shrink of Larrabee so it eats much less power = Larrabee 2.

It's not over.
#13
Fx
this is no surprise. Intel set themselves up for this

as stated though, I am sure they did learn a lot in a short period of time
#14
Easy Rhino
Linux Advocate
v12dock: Where is the Like button...
:slap:
#15
Solaris17
Super Dainty Moderator
i absolutely called this. i knew it.
#16
L|NK|N
Solaris17: i absolutely called this. i knew it.
#17
Morgoth
Fueled by Sapphire
sad ... RIP Larrabee. There go my Larrabee CrossFire daydreams.
#18
1c3d0g
Anybody still saying designing a decent-performing GPU is easy? Respect to NVIDIA & ATI for holding on as long as they have. But I'm saddened by this news, as we now have to keep facing piss-poor GMA-like performance from Intel's IGPs.
#19
panchoman
Sold my stars!
saw this one coming. stringing 20 Pentium 4s together to create a graphics chip?

#20
imperialreign
Pancho - nice to see you back around these parts! :toast:


Dang, when Intel fails . . . they fail.

Oh well - considering all the hype over the last year+, I'm sure ATI and nVidia have been chanting "I'll believe it when I see it" every time the name Larrabee cropped up. Back to business as usual.
#21
aj28
erocker: Larrabee = in the trash can

Die shrink of Larrabee so it eats much less power = Larrabee 2.

It's not over.
Oh yes it is!
The design was downright silly...
YOUR 32NM CANNOT SAVE YOU NOW!!!
#23
phanbuey
lol at the responses... you have to give them some credit for trying tho. I give them an A for effort.

Eventually they'll get frustrated, give up, and buy nvidia.
#24
Unregistered
By the time Larrabee 2 comes out, AMD will have 32-core MSM CPUs. So Intel is really going to have to ramp up the design or, as happened this time around, it will be obsolete before it is even demoed.
#25
rick22
Solaris17: i absolutely called this. i knew it.
You're the man, Solaris..... :nutkick: love ya man