
Intel Larrabee Die Zoomed in

btarunr

Editor & Senior Moderator
Staff member
Intel chose the occasion of the opening ceremony for the Intel Visual Computing Institute at Saarland University in Germany to conduct a brief presentation about the visual computing technologies the company is currently working on. The focal point was Larrabee, the codename for Intel's upcoming "many-cores" processor that will play a significant role in Intel's visual computing foray, way beyond integrated graphics. The die-shot reveals an intricate network of what look like the much-talked-about x86 processing elements that bring about computing parallelism in Larrabee. Another slide briefly describes where Intel sees performance demands heading, noting that demand grows near-exponentially with the size of common datasets.



View at TechPowerUp Main Site
 
That's a rather odd-looking die, nothing like I expected. Wonder how many processors it's going to have.
 
That's a rather odd-looking die, nothing like I expected. Wonder how many processors it's going to have.

Well, in the first picture I've counted what seem to be 32 individual cores plus some extra silicon. Good to see they have progressed to showing die shots; now I wanna see it running :)
 
I'm bored with it. Hurry up and show us what's after Larrabee ;)
 
Hey, we have been tricked! We were supposed to see a zoom-in of Larrabee dying.
/wink
 
Wow, so many pretty colors as Intel promised :D
 
That slide show is right--models are a PITA to compute. The average model has tens of thousands of triangles (not to mention most of them move), while your average 3D environment may have ten thousand triangles at most, and very few of them move. The reason the grass in S.T.A.L.K.E.R. doesn't move very much is the massive compute cost of making all those triangles transition.

But yeah, still smoke and mirrors. I want fps figures.
 
I don't get what this chip does. Do you plug the DVI straight into the chip, lol? Or does this chip just boost your graphics while needing an integrated/specialized motherboard to output to the monitor? I can do that with either ATI or Nvidia, so what's the point? So many questions, but no answer is certain atm. Wonder if AMD has something to counter this, or it could mean more trouble in an already troubled market.
 
Basically it's going to be a bunch of CPUs on one die, kinda like the Cell, but with a much more flexible design, so it can execute many different kinds of code.
 
It's just unbelievable Intel, in all its wisdom, chose to use the inefficient and outdated X-bloody-86 as its graphics architecture, when it had a chance to create something smart.
 
Yea, it seems ridiculous when I think about it too.

This Larrabee crap is based off their P54C, i.e. the original Pentium (like 14 years old).
 
Er, but so are Core 2 and i7.

Remember, you need very little x86 to build a basic code base. The key to Larrabee's performance is the vector register extensions and special vector operators. Here's an introduction to vector registers: http://en.wikipedia.org/wiki/SIMD

x86 is very clever as a base. It means ANYONE (who can program! lol) can code for it with a short learning curve, and existing IDEs can be used. Think of CUDA, but without having to learn something new or use new code libraries. Just use existing x86 code and add a few extra commands to handle the processing of vector data.
 
It's just unbelievable Intel, in all its wisdom, chose to use the inefficient and outdated X-bloody-86 as its graphics architecture, when it had a chance to create something smart.

You ever seen the IA-64 architecture? IMHO, it is one of the nicest and best-built architectures out there (way better than the messes that are x86 and SPARC). However, it really didn't take off because, OMG, companies would have to run their code through a different compiler (and rewrite some code if it didn't work straight through and/or they wanted to optimize it). That is even after Microsoft built a whole native IA-64 version of Windows, and it still didn't help.

As you can see, Intel is done trying to introduce new architectures to the market and will keep to its good old moneymaker (...errr, friend), the x86 architecture.
 
You ever seen the IA-64 architecture? IMHO, it is one of the nicest and best-built architectures out there (way better than the messes that are x86 and SPARC). However, it really didn't take off because, OMG, companies would have to run their code through a different compiler (and/or rewrite a little code if it didn't work straight through). That is even after Microsoft built a whole native IA-64 version of Windows, and it still didn't help.

As you can see, Intel is done trying to introduce new architectures to the market and will keep to its good old moneymaker (...errr, friend), the x86 architecture.

It would've helped if Itanic had been fast at anything other than three tasks :p.

Having 100 bagillion megs of cache was the only reason it wasn't even worse.
 
Hmmm... Well, the thing isn't going to be good for gaming (well, with this chip at least).
 
It would've helped if Itanic had been fast at anything other than three tasks :p.

Having 100 bagillion megs of cache was the only reason it wasn't even worse.

Never said that beauty == better. Look at Agena vs Conroe... ;) But don't forget, I was speaking of the pre-EM64T era, when all Intel had was NetBurst.

Anyway, this Larrabee looks like something I'm definitely going to keep my eyes on!
 
Sceptical.

I don't know how much experience Nvidia and ATI have compared to Intel.

Intel has mostly money and CPU designs; they haven't really done GPU work since the i740, which the whole GMA line is built on, so it must have been quite some transition!

I guess they will get more in the game once they've gained some more experience.

I guess ATI's market strategy is one of the best there is:

one architecture becomes many, many chips.

A small memory bus with GDDR5 for less complexity (4-layer PCB vs 8 on the high end), and so on.

Multi-GPU.

However, I ain't fully against Larrabee being a good strategy; I'm just guessing that the 3rd gen will start to become what we look at as a solid card, and the 1st gen will be a card with little polish compared to ATI and Nvidia.
I don't suspect it to have bad raw power, and the strategy is surely interesting, but I'm wondering how well x86 can really do GPU work. They might capture many customers that need rendering and math power with little API work (which CUDA and ATI Stream computing need); this is in fact something Intel aims for.
 
They have a screenshot of World of Warcraft, which I have run on the 845 chipset, so...........
 
However, I ain't fully against Larrabee being a good strategy; I'm just guessing that the 3rd gen will start to become what we look at as a solid card, and the 1st gen will be a card with little polish compared to ATI and Nvidia.
I don't suspect it to have bad raw power, and the strategy is surely interesting, but I'm wondering how well x86 can really do GPU work. They might capture many customers that need rendering and math power with little API work (which CUDA and ATI Stream computing need); this is in fact something Intel aims for.

Intel is banking on the hope that people will adopt its easy programming model to accelerate normal tasks like video encoding, etc. I believe this will be an all-in-one that can be programmed to do both GPU and CPU work, which means if they get enough support, more powerful ones could dominate ATI and Nvidia in the CAD and ray-tracing theatres.
 
Intel is banking on the hope that people will adopt its easy programming model to accelerate normal tasks like video encoding, etc. I believe this will be an all-in-one that can be programmed to do both GPU and CPU work, which means if they get enough support, more powerful ones could dominate ATI and Nvidia in the CAD and ray-tracing theatres.

Really? I seriously didn't see Larrabee impacting ATI or Nvidia for at least another 2-4 years. I mean, they live to make GPUs, and Intel is just starting this new way to handle GPU tasks (lol, I don't know all the technical terms). So, as with most new things, I expect it to be slightly buggy and take time to improve into something to look at. But then, it's Intel, and with all their money I bet they're testing it like no other chip to ensure a good, solid launch with good performance.
 
Hyperthreading isn't very useful in GPU architectures--only CPUs. All hyperthreading really does is use parts of the processor that would otherwise sit idle. Since all the cores in these GPUs are about as simple as they can be made, there is very little that is idle, and what is idle would take more work than it's worth to access.
 
Really? I seriously didn't see Larrabee impacting ATI or Nvidia for at least another 2-4 years. I mean, they live to make GPUs, and Intel is just starting this new way to handle GPU tasks (lol, I don't know all the technical terms). So, as with most new things, I expect it to be slightly buggy and take time to improve into something to look at. But then, it's Intel, and with all their money I bet they're testing it like no other chip to ensure a good, solid launch with good performance.

Well, Intel's idea for Larrabee is lots of processor cores working in parallel, like a GPU. So it's pretty much x86 processors acting like GPU cores. Intel lives to make CPUs, and they make them like no other; as well as being able to act like a GPU, these cores can be programmed to do CPU work too. Larrabee isn't meant to render as fast as ATI and Nvidia (maybe future ones will), but Larrabee is something different: it can do whatever a CPU can do, and that's why it's exciting.
 