
Intel Displays Larrabee Wafer at IDF Beijing

Well, the sad thing for you is I don't even claim to be informed. And "yesss! maybe it's like the Cell thing".

Oh then by all means come on in here and trash us. We will try to meet your "standards".
 
Yay, we're gonna have a CPU as big as that guy's head =]
 
I honestly hope Larrabee is a huge flop; I don't see it being very successful. As stated before, it may be successful for cheaper PCs due to the integrated GPU, or whatever the hell it's going to be called, destroying current integrated graphics.

Do I see it doing away with DirectX or OpenGL? Hell no. Why would something from one major company destroy or change everything Microsoft, NVIDIA, AMD, and game developers know, when they didn't even make the move up to DX10.1 (except AMD)? I just see Larrabee and Fusion as great technology that could be implemented in mobile chips, with even smaller versions for netbooks and other portable devices. The way things are now won't change drastically, especially since a big shift like moving away from DirectX and OpenGL would require lots of time and resources to support.
 
This is a risky bet; if the performance results are not that good, Intel may be in trouble.
 
This is a risky bet; if the performance results are not that good, Intel may be in trouble.

Couldn't agree more. The way games are made, the APIs and whatnot, is not going to change for Intel's Larrabee, even if it is a huge breakthrough in tech with great potential.
 
Couldn't agree more. The way games are made, the APIs and whatnot, is not going to change for Intel's Larrabee, even if it is a huge breakthrough in tech with great potential.

You're all forgetting a feature of DX11: CPU emulation. They ran Crysis purely in software on an i7 for a tech demo; I'm pretty confident Larrabee will do better than the i7's 30 FPS.
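If you want to poke at it yourself: WARP is just another driver type in Direct3D 11, so any D3D11 app can be pointed at the CPU with a single flag. A rough sketch (Windows SDK assumed, error handling stripped):

[CODE]
// Minimal sketch: create a Direct3D 11 device on the WARP software
// rasterizer instead of GPU hardware.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL    level;

    // D3D_DRIVER_TYPE_WARP selects the CPU-based rasterizer; the same
    // rendering code then runs entirely in software on the x86 cores.
    HRESULT hr = D3D11CreateDevice(
        nullptr,                  // default adapter
        D3D_DRIVER_TYPE_WARP,     // software rasterizer instead of a GPU
        nullptr, 0,
        nullptr, 0,               // default feature levels
        D3D11_SDK_VERSION,
        &device, &level, &context);

    if (SUCCEEDED(hr))
    {
        // ... create the swap chain, shaders, etc. exactly as for hardware
        context->Release();
        device->Release();
    }
    return 0;
}
[/CODE]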
 
Huh, I have no idea what you're talking about lol, I'm gonna Google it.
 
Ah, I see. But Larrabee may pwn at 800x600 with GPU emulation on a CPU; however, I see it being a far step behind what AMD and NVIDIA are offering now and what they will be offering by Larrabee's release.
 
You guys know about NVIDIA's Tesla, right? Google it. Lots of info around, not only on NVIDIA's website!
Tesla is basically an NVIDIA graphics card without any display ports on it that runs CUDA code. They really aren't that special--mostly cheap marketing.


Ah, I see. But Larrabee may pwn at 800x600 with GPU emulation on a CPU; however, I see it being a far step behind what AMD and NVIDIA are offering now and what they will be offering by Larrabee's release.
Intel is aiming for a very high-FLOP processor that is fully programmable. As such, DirectX and OpenGL will be able to run on a driver which in turn runs on the card. Intel's Larrabee will be in direct competition with AMD's Radeon and NVIDIA's GeForce. Moreover, Larrabee isn't hard-coded to do anything specific, so it can be made to do much more than AMD Stream and NVIDIA CUDA can, using the same programming paradigms as programming for an x86 processor. In essence, it is a supercomputer on a card.

Remember, GPUs, at their core, are just high-FLOP processors.
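To make the "x86 programming paradigms" point concrete, here is a rough sketch of my own (not Intel code, just an illustration): a data-parallel SAXPY kernel written with nothing but standard C++ threads. On a many-core part like Larrabee the same code would simply see more cores.

[CODE]
// Data-parallel SAXPY (y = a*x + y) split across ordinary x86 threads,
// no GPU API required.
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

void saxpy_chunk(float a, const float* x, float* y, size_t begin, size_t end)
{
    for (size_t i = begin; i < end; ++i)
        y[i] = a * x[i] + y[i];
}

void saxpy(float a, const std::vector<float>& x, std::vector<float>& y)
{
    const size_t n = x.size();
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;

    // Hand each core an equal slice of the array.
    for (unsigned c = 0; c < cores; ++c)
    {
        size_t begin = n * c / cores;
        size_t end   = n * (c + 1) / cores;
        pool.emplace_back(saxpy_chunk, a, x.data(), y.data(), begin, end);
    }
    for (auto& t : pool)
        t.join();  // plain x86 threading, nothing vendor-specific
}
[/CODE]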
 
7.36 FPS is actually rather respectable. I would have expected around 5. CPUs are generalized processors and not very strong on the FPU front. That's why we need GPUs.
 
If you want respectable FPS, 30 is the minimum, and that is with a lot of crap going on.
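For perspective, the frame budget math is simple: at N FPS each frame has 1000 / N milliseconds to finish everything. A quick sketch:

[CODE]
// Frame-time arithmetic: budget per frame = 1000 ms / target FPS.
#include <cstdio>

int main()
{
    const double targets[] = { 30.0, 60.0, 7.36 };  // 7.36 = the WARP Crysis run
    for (double fps : targets)
        std::printf("%6.2f FPS -> %7.2f ms per frame\n", fps, 1000.0 / fps);
    return 0;
}
// 30 FPS leaves ~33 ms to simulate and draw each frame; 7.36 FPS means ~136 ms.
[/CODE]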
 
Intel can't make a small CPU. It's genetics... they just can't. Sure, it may be 32nm, but whew... how big is that i7, man? It's like 2x the size of the PII. If Intel can't even fit an 8-core CPU on a single socket @ 45nm, and AMD can with room to spare, then Larrabee won't see the mobile market at all.

I will believe it when I see it.
 
I don't get it. Why are most people here so negative about it? Are you people scared that Intel is joining the high-end to mid-range GPU market?
Larrabee can do a lot more than a GPU can.
 
I don't get it. Why are most people here so negative about it? Are you people scared that Intel is joining the high-end to mid-range GPU market?
Larrabee can do a lot more than a GPU can.

I'm not scared. I just don't think Intel can do it with their current mindset. They are thinking monolithic, not efficient.
 
I don't get it. Why are most people here so negative about it? Are you people scared that Intel is joining the high-end to mid-range GPU market?
Larrabee can do a lot more than a GPU can.

Yeah, we're scared in case Intel destroys ATI and NVIDIA :(
 
Yeah, we're scared in case Intel destroys ATI and NVIDIA :(

Fat chance. :laugh:

However, I do see Larrabee as one of those things that isn't going to be used to its full power; sure, people will play games on it, and certain programs may even utilise some of its processing capabilities. But it is not truly useful if, for example, Windows can't use it as a CPU when booting! Loading times, when not bottlenecked by the HDD (a 24-SSD RAID array?), would be amazingly short: programs would load in the blink of an eye, and you could load your favorite game in seconds and (hopefully) be able to play at good frame rates!

If Larrabee can't do that, then there's no point.
 
I have no doubt about Intel's ability to produce exceptional CPU and now, possibly, GPU chips.

The problem comes with drivers and driver support. Intel has yet to release competent drivers for its current integrated graphics solutions. Not to mention the actual onboard graphics chips are underpowered and absolute junk with regard to any type of 3D gaming.

Intel is going to need a huge new army of programmers/engineers to design and produce effective drivers that fully utilize Larrabee's potential. Without that, it's just a shiny new piece of silicon with nowhere to go...
 
Hm, so from all your posts, Larrabee is a lot more capable than I originally thought. But still, as stated, I don't see Intel going from rather sad integrated GPUs to producing something that can compete with AMD and NVIDIA. But we shall see. I wish they already had engineering samples; of course, we should all remember that what they say it will do on paper is always better than how it will actually perform. I wonder about AMD's Fusion and how it will compare. I personally expect Fusion to be superior to Larrabee, just because AMD has good experience with both GPUs and CPUs, whereas Intel solely has good CPU knowledge.
 
Hm, so from all your posts, Larrabee is a lot more capable than I originally thought. But still, as stated, I don't see Intel going from rather sad integrated GPUs to producing something that can compete with AMD and NVIDIA. But we shall see. I wish they already had engineering samples; of course, we should all remember that what they say it will do on paper is always better than how it will actually perform. I wonder about AMD's Fusion and how it will compare. I personally expect Fusion to be superior to Larrabee, just because AMD has good experience with both GPUs and CPUs, whereas Intel solely has good CPU knowledge.

This is going to be used in more creative ways than just as a GPU. For example, using the WARP10 I linked before, these 32 cores could run 30 virtual machines in VMware and remotely serve them to thin-client PCs. Normally you'd need a lot of servers in a datacenter for something like that; with this, just the one add-in card.
 
This is going to be used in more creative ways than just as a GPU. For example, using the WARP10 I linked before, these 32 cores could run 30 virtual machines in VMware and remotely serve them to thin-client PCs. Normally you'd need a lot of servers in a datacenter for something like that; with this, just the one add-in card.

Card? Isn't it just going to be a CPU chip with a GPU die and a CPU die on the same PCB, but inserted into a socket? LGA1366 is planned for that, I think, isn't it? It's going to be really interesting to see exactly what Larrabee can excel at compared to current technology.
 
Card? Isn't it just going to be a CPU chip with a GPU die and a CPU die on the same PCB, but inserted into a socket? LGA1366 is planned for that, I think, isn't it?

I dunno, I'm throwing out theories at midnight here; you can't expect perfection.
I think it's a card... but I might be wrong.
 
I dunno, I'm throwing out theories at midnight here; you can't expect perfection.
I think it's a card... but I might be wrong.

I'm almost 100% sure it is a card.
 
If you want respectable FPS, 30 is the minimum, and that is with a lot of crap going on.
Which is why we don't game on CPUs. ;)

The GPU does most of the heavy lifting and the CPU handles the other tasks like caching images, queuing up sounds, etc.


Intel can't make a small CPU. It's genetics... they just can't. Sure, it may be 32nm, but whew... how big is that i7, man? It's like 2x the size of the PII.
Their processors are bigger because of their inclusive cache design, which means more memory, and more memory means bigger processors...


I'm not scared. I just don't think Intel can do it with their current mindset. They are thinking monolithic, not efficient.
Remember, Intel IGPs are the most popular graphics chips in the world...
 