Friday, April 10th 2009

Intel Displays Larrabee Wafer at IDF Beijing

Earlier this week, Intel conducted the Intel Developer Forum (IDF) Spring 2009 event in Beijing, China. Among several presentations on the architectural advancements of the company's products, including Nehalem and its scalable platforms, perhaps the most interesting was a brief talk on Larrabee by Pat Gelsinger, Senior Vice President and General Manager of Intel's Digital Enterprise Group. Larrabee is Intel's first "many cores" architecture designed to work as a graphics processor, and it will be thoroughly backed by low-level and high-level programming languages and tools from Intel.

French website Hardware.fr captured a timely snapshot from a webcast of the event, showing Gelsinger holding a 300 mm wafer of Larrabee dies. The theory that Intel has working prototypes of the GPU deep inside its labs gains weight. Using current-generation manufacturing technology, Intel is scaling performance across the chip's x86 processing elements, all 32+ of them. As you can faintly see from the wafer, Larrabee has a large die. The first generation of Larrabee is reported to be built on the 45 nm manufacturing process, and products based on the architecture may arrive by late 2009 or early 2010. With the company kicking off 32 nm production later this year, Larrabee may move to the newer process a little later.
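To make the "many cores as a graphics processor" idea concrete, here is a minimal sketch of how a software renderer might spread per-pixel work across 32 x86 threads. This is purely illustrative, not Intel code: the shade() routine and the scanline partitioning are assumptions for the example.

```c
/* Illustrative only: a software "shader" spread across many x86 threads,
 * the kind of data-parallel workload a many-core design like Larrabee
 * targets. shade() and the framebuffer layout are hypothetical. */
#include <pthread.h>
#include <stdint.h>

#define WIDTH   800
#define HEIGHT  600
#define NCORES  32            /* the "32+" x86 cores mentioned above */

static uint32_t framebuffer[WIDTH * HEIGHT];

static uint32_t shade(int x, int y) {
    /* Placeholder per-pixel work; a real rasterizer would sample
     * textures, interpolate attributes, etc. */
    return (uint32_t)(((x * 255 / WIDTH) << 16) | ((y * 255 / HEIGHT) << 8));
}

static void *worker(void *arg) {
    int id = (int)(intptr_t)arg;
    /* Each core takes an interleaved slice of scanlines. */
    for (int y = id; y < HEIGHT; y += NCORES)
        for (int x = 0; x < WIDTH; x++)
            framebuffer[y * WIDTH + x] = shade(x, y);
    return NULL;
}

int main(void) {
    pthread_t t[NCORES];
    for (int i = 0; i < NCORES; i++)
        pthread_create(&t[i], NULL, worker, (void *)(intptr_t)i);
    for (int i = 0; i < NCORES; i++)
        pthread_join(t[i], NULL);
    return 0;
}
```

Build with `cc -pthread`; the point is simply that the workload scales by adding more ordinary x86 threads rather than fixed-function hardware.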
Source: Hardware.fr

62 Comments on Intel Displays Larrabee Wafer at IDF Beijing

#27
a_ump
I honestly hope Larrabee is a huge flop; I don't see it being very successful. As stated before, it may do well in cheaper PCs, with its integrated GPU (or whatever the hell it's going to be called) destroying current integrated graphics.

Do I see it doing away with DirectX or OpenGL? Hell no. Why would something from one major company destroy or change everything Microsoft, NVIDIA, AMD, and the game developers know, when they didn't even make the move up to DX10.1 (except AMD)? I just see Larrabee and Fusion being great technology that could be implemented in mobile chips, with even smaller versions for netbooks and other portable devices. The way things are now won't change drastically, especially since it would require lots of time and resources to support a great change like a move away from DirectX and OpenGL.
#28
LittleLizard
This is a risky bet; if the performance results are not that good, Intel may be in trouble.
#29
a_ump
LittleLizard: This is a risky bet; if the performance results are not that good, Intel may be in trouble.
Couldn't agree more. The way games are made, the APIs and whatnot, isn't going to change for Intel's Larrabee, even if it is a huge breakthrough in tech with great potential.
#30
Mussels
Freshwater Moderator
a_ump: Couldn't agree more. The way games are made, the APIs and whatnot, isn't going to change for Intel's Larrabee, even if it is a huge breakthrough in tech with great potential.
You're all forgetting a feature of DX11: CPU emulation. They ran Crysis purely in software on a Core i7 for a tech demo; I'm pretty confident Larrabee will do better than the i7's 30 FPS.
#31
a_ump
Huh, I have no idea what you're talking about, lol. I'ma google it.
#32
Mussels
Freshwater Moderator
a_ump: Huh, I have no idea what you're talking about, lol. I'ma google it.
My memory failed me on the FPS count.

www.techpowerup.com/index.php?77466

"An Intel Core i7 was able to "run" Crysis, on a resolution of 800 x 600, churning out a proud 7.36 frames per second"

Larrabee is going to pwnnnnn an i7 at this.
#33
a_ump
Ah, I see. Larrabee may pwn at 800x600 with GPU emulation on a CPU; however, I see it being a far step behind what AMD and NVIDIA are offering now, and what they will be offering by Larrabee's release.
#34
FordGT90Concept
"I go fast!1!11!1!"
BradleyKZN: You guys know about NVIDIA's Tesla, right? Google it. Lots of info around, not only on NVIDIA's website!
Tesla is basically an NVIDIA graphics card without any display ports on it that runs CUDA code. They really aren't that special; mostly cheap marketing.
a_ump: Ah, I see. Larrabee may pwn at 800x600 with GPU emulation on a CPU; however, I see it being a far step behind what AMD and NVIDIA are offering now, and what they will be offering by Larrabee's release.
Intel is aiming for a very high-FLOP processor that is fully programmable. As such, DirectX and OpenGL will be able to run on a driver which in turn runs on the card. Intel's Larrabee will be in direct competition with AMD's Radeon and NVIDIA's GeForce. Moreover, Larrabee isn't hard-coded to do anything specific, so it can be made to do much more than AMD Stream and NVIDIA CUDA can, using the same programming paradigms as programming for an x86 processor. In essence, it is a supercomputer on a card.

Remember, GPUs, at their core, are just high-FLOP processors.
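To illustrate the point about the graphics APIs living entirely in software on a fully programmable part, here is a toy sketch; all names (DrawArrays, software_rasterize) are hypothetical stand-ins, not Intel's actual driver interface:

```c
/* Sketch of the idea that, on a fully programmable part, a graphics API
 * is "just software": API entry points dispatch to ordinary x86 routines
 * that could themselves execute on the card. All names are hypothetical. */
#include <stdio.h>

typedef void (*draw_fn)(int first_vertex, int count);

static void software_rasterize(int first_vertex, int count) {
    /* On a Larrabee-style part, this would be x86 code running on the
     * card's own cores rather than fixed-function hardware. */
    printf("rasterizing %d vertices starting at %d\n", count, first_vertex);
}

/* A one-entry "driver" table; a real driver would map the whole API. */
static draw_fn driver_draw = software_rasterize;

/* The application-facing call, analogous to a DirectX/OpenGL draw call. */
void DrawArrays(int first_vertex, int count) {
    driver_draw(first_vertex, count);
}

int main(void) {
    DrawArrays(0, 3);   /* "draw" one triangle */
    return 0;
}
```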
#35
eidairaman1
The Exiled Airman
Mussels: My memory failed me on the FPS count.

www.techpowerup.com/index.php?77466

"An Intel Core i7 was able to "run" Crysis, on a resolution of 800 x 600, churning out a proud 7.36 frames per second"

Larrabee is going to pwnnnnn an i7 at this.
:laugh:
#36
FordGT90Concept
"I go fast!1!11!1!"
7.36 FPS is actually rather respectable; I would have expected around 5. CPUs are generalized processors and not very strong on the FPU front. That's why we need GPUs.
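Some quick arithmetic puts that figure in perspective; the per-pixel FLOP count below is a made-up assumption for illustration:

```c
/* Back-of-the-envelope for the 7.36 FPS figure quoted above: even at
 * 800x600, a CPU renderer is shading millions of pixels per second.
 * The per-pixel FLOP count is an illustrative guess, not a measurement. */
#include <stdio.h>

int main(void) {
    const double fps = 7.36;
    const double pixels = 800.0 * 600.0;     /* 480,000 per frame */
    const double flop_per_pixel = 500.0;     /* hypothetical */

    double pix_per_sec = fps * pixels;       /* ~3.5 million */
    double gflops = pix_per_sec * flop_per_pixel / 1e9;

    printf("%.1f Mpix/s, roughly %.1f GFLOPS of shading work\n",
           pix_per_sec / 1e6, gflops);
    return 0;
}
```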
#37
eidairaman1
The Exiled Airman
If you want respectable FPS, 30 is the minimum, and that's with a lot of crap going on.
#38
Flyordie
Intel can't make a small CPU. It's genetics... they just can't. Sure, it may be 32 nm, but whew... how big is that i7, man? It's like 2x the size of the PII. If Intel can't even fit an 8-core CPU on a single socket at 45 nm, and AMD can with room to spare, then Larrabee won't see the mobile market at all.

I will believe it when I see it.
#39
Morgoth
Fueled by Sapphire
I don't get it, why are most people here so negative about it? Are you people scared that Intel is joining the high-end and mid-range GPU market?
Larrabee can do a lot more than a GPU can.
#40
Flyordie
Morgoth: I don't get it, why are most people here so negative about it? Are you people scared that Intel is joining the high-end and mid-range GPU market? Larrabee can do a lot more than a GPU can.
I'm not scared. I just don't think Intel can do it with their current mindset. They are thinking monolithic, not efficient.
#41
DrPepper
The Doctor is in the house
Morgoth: I don't get it, why are most people here so negative about it? Are you people scared that Intel is joining the high-end and mid-range GPU market? Larrabee can do a lot more than a GPU can.
Yeah, we're scared in case Intel destroys ATI and NVIDIA :(
#42
Error 404
DrPepper: Yeah, we're scared in case Intel destroys ATI and NVIDIA :(
Fat chance. :laugh:

However, I do see Larrabee as one of those things that isn't going to be used to its full power; sure, people will play games on it, and certain programs may even utilise some of its processing capabilities. But it's not truly useful if, for example, Windows can't use it as a CPU when booting! Loading times, when not bottlenecked by the HDD (a 24-SSD RAID array?), would be amazingly short; programs would load in the blink of an eye, and you could load your favorite game in seconds and (hopefully) play at good frame rates!

If Larrabee can't do that, then there's no point.
#43
nafets
I have no doubt about Intel's ability to produce exceptional CPUs and now, possibly, GPUs.

The problem comes with drivers and driver support. Intel has yet to release competent drivers for its current integrated graphics solutions. Not to mention that the actual onboard graphics chips are underpowered and absolute junk for any kind of 3D gaming.

Intel is going to need a huge new army of programmers/engineers to design and produce effective drivers which fully utilize Larrabee's potential. Without that, it's just a shiny new piece of silicon with nowhere to go...
#44
a_ump
Hmm, so Larrabee, going by all your posts, is a lot more capable than I originally thought. But still, as stated, I don't see Intel going from rather sad integrated GPUs to producing something that competes with AMD and NVIDIA. But we shall see; I wish they already had engineering samples. Of course, we should all remember that what they say it will do on paper is always better than how it will actually perform. I wonder about AMD's Fusion and how it will compare; I personally expect Fusion to be superior to Larrabee just because AMD has good experience with both GPUs and CPUs, whereas Intel only has solid CPU knowledge.
#45
Mussels
Freshwater Moderator
a_ump: Hmm, so Larrabee, going by all your posts, is a lot more capable than I originally thought. But still, as stated, I don't see Intel going from rather sad integrated GPUs to producing something that competes with AMD and NVIDIA. But we shall see; I wish they already had engineering samples. Of course, we should all remember that what they say it will do on paper is always better than how it will actually perform. I wonder about AMD's Fusion and how it will compare; I personally expect Fusion to be superior to Larrabee just because AMD has good experience with both GPUs and CPUs, whereas Intel only has solid CPU knowledge.
This is going to be used in more creative ways than just as a GPU. For example, using the WARP10 I linked before, these 32 cores could run 30 virtual machines in VMware and remotely serve them to thin-client PCs. Normally you'd need a lot of servers in a datacenter for something like that; with this, just the one add-in card.
#46
a_ump
Mussels: This is going to be used in more creative ways than just as a GPU. For example, using the WARP10 I linked before, these 32 cores could run 30 virtual machines in VMware and remotely serve them to thin-client PCs. Normally you'd need a lot of servers in a datacenter for something like that; with this, just the one add-in card.
Card? Isn't it just going to be a chip with a GPU die and a CPU die on the same PCB, inserted into a socket? LGA1366 is planned for that, I think, isn't it? It's going to be really interesting to see exactly what Larrabee can excel at compared to current technology.
#47
Mussels
Freshwater Moderator
a_ump: Card? Isn't it just going to be a chip with a GPU die and a CPU die on the same PCB, inserted into a socket? LGA1366 is planned for that, I think, isn't it?
I dunno, I'm throwing out theories at midnight here; you can't expect perfection.
I think it's a card... but I might be wrong.
#48
DrPepper
The Doctor is in the house
Mussels: I dunno, I'm throwing out theories at midnight here; you can't expect perfection. I think it's a card... but I might be wrong.
I'm almost 100% sure it is a card.
#49
FordGT90Concept
"I go fast!1!11!1!"
eidairaman1: If you want respectable FPS, 30 is the minimum, and that's with a lot of crap going on.
Which is why we don't game on CPUs. ;)

The GPU does most of the heavy lifting, and the CPU handles the other tasks like caching images, queuing up sounds, etc.
Flyordie: Intel can't make a small CPU. It's genetics... they just can't. Sure, it may be 32 nm, but whew... how big is that i7, man? It's like 2x the size of the PII.
Their processors are bigger because of their inclusive cache design, which means more memory, and more memory means bigger processors...
Flyordie: I'm not scared. I just don't think Intel can do it with their current mindset. They are thinking monolithic, not efficient.
Remember, Intel IGPs are the most popular graphics chips in the world...
#50
DrPepper
The Doctor is in the house
FordGT90Concept: Their processors are bigger because of their inclusive cache design, which means more memory, and more memory means bigger processors...
Indeed. Unless they can find a way to make that amount of cache smaller without taking it off the CPU die, CPUs will be fairly large. I think, though, that per square mm the i7 is more efficient than most if not all CPUs, with a few exceptions.
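A rough back-of-the-envelope of that trade-off, assuming Nehalem-like cache sizes (4 cores, 64 KB L1 + 256 KB L2 per core, 8 MB shared L3); the figures are illustrative only:

```c
/* Rough arithmetic behind the inclusive-cache point above. With an
 * inclusive L3, everything held in L1/L2 is duplicated in L3, so the
 * unique capacity is just the L3; an exclusive design would add the
 * levels together at the cost of more complex cache management. */
#include <stdio.h>

int main(void) {
    const int cores = 4;
    const int l1_kb = 64, l2_kb = 256;       /* per core */
    const int l3_kb = 8 * 1024;              /* shared */

    int per_core_kb  = cores * (l1_kb + l2_kb);  /* 1280 KB */
    int inclusive_kb = l3_kb;                    /* duplicates L1/L2 */
    int exclusive_kb = l3_kb + per_core_kb;      /* no duplication */

    printf("inclusive unique capacity: %d KB\n", inclusive_kb);
    printf("exclusive unique capacity: %d KB\n", exclusive_kb);
    return 0;
}
```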