
Intel Larrabee Die Zoomed in

Discussion in 'News' started by btarunr, May 12, 2009.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,864 (11.07/day)
    Thanks Received:
    13,716
    Location:
    Hyderabad, India
    Intel chose the occasion of the opening ceremony of the Intel Visual Computing Institute at Saarland University in Germany to give a brief presentation on the visual computing technologies the company is currently working on. The focal point was Larrabee, the codename for Intel's upcoming "many-core" processor that will play a significant role in Intel's visual computing foray well beyond integrated graphics. The die shot reveals an intricate network of what look like the much-talked-about x86 processing elements that provide Larrabee's computing parallelism. Another slide briefly describes where Intel sees performance demands heading, saying that they grow near-exponentially with common dataset sizes.


    Source: PC Games Hardware
     
    Matt Sakko and a_ump say thanks.
  2. a_ump

    a_ump

    Joined:
    Nov 21, 2007
    Messages:
    3,620 (1.41/day)
    Thanks Received:
    376
    Location:
    Smithfield, WV
    That's a rather odd-looking die, nothing like I expected. I wonder how many processors it's going to have.
     
  3. human_error

    human_error

    Joined:
    Nov 10, 2008
    Messages:
    1,757 (0.80/day)
    Thanks Received:
    491
    Well, in the first picture I've counted what seem to be 32 individual cores plus some extra silicon. Good to see they have progressed to showing die shots; now I want to see it running :)
     
  4. iStink

    iStink New Member

    Joined:
    Sep 22, 2008
    Messages:
    648 (0.29/day)
    Thanks Received:
    49
    I'm bored with it. Hurry up and show us what's after Larrabee ;)
     
  5. a111087

    a111087

    Joined:
    Apr 2, 2007
    Messages:
    2,766 (0.99/day)
    Thanks Received:
    201
    Location:
    US
    Are you on drugs? :laugh:
     
  6. ShogoXT

    ShogoXT New Member

    Joined:
    Feb 14, 2008
    Messages:
    974 (0.39/day)
    Thanks Received:
    84
    Location:
    Cincinnati, Ohio
    Hey, we've been tricked! We were supposed to see a zoom-in of Larrabee dying.
    /wink
     
  7. PCpraiser100 New Member

    Joined:
    Jul 17, 2008
    Messages:
    1,062 (0.46/day)
    Thanks Received:
    68
    Wow, so many pretty colors as Intel promised :D
     
  8. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    13,963 (6.24/day)
    Thanks Received:
    3,803
    Location:
    IA, USA
    That slide show is right--models are a PITA to compute. The average model has tens of thousands of triangles (not to mention most of them move), while your average 3D environment may have ten thousand triangles at most, and very few of them move. The reason S.T.A.L.K.E.R.'s grass doesn't move very much is the massive computational cost of making all those triangles move.

    But yeah, still smoke and mirrors. I want fps figures.
     
    Crunching for Team TPU
  9. my_name_is_earl

    my_name_is_earl New Member

    Joined:
    Feb 22, 2009
    Messages:
    228 (0.11/day)
    Thanks Received:
    3
    Location:
    Grand Prairie Texas
    I don't get what this chip does. Do you plug the DVI straight into the chip, lol? Or does the chip just boost your graphics while needing an integrated/specialized motherboard to output to the monitor? I can't do that with either ATI or Nvidia, so what's the point? So many questions, but no answer is certain atm. I wonder if AMD has something to counter this, or it could mean more trouble in an already troubled market.
     
  10. KainXS

    KainXS

    Joined:
    Sep 25, 2007
    Messages:
    5,601 (2.14/day)
    Thanks Received:
    502
    Basically it's going to be a bunch of CPUs on one die, kind of like Cell, but with a much more flexible design, so it can execute many different kinds of code.
     
  11. largon

    largon

    Joined:
    May 6, 2005
    Messages:
    2,782 (0.80/day)
    Thanks Received:
    433
    Location:
    Tre, Suomi Finland
    It's just unbelievable that Intel, in all its wisdom, chose to use the inefficient and outdated X-bloody-86 as its graphics architecture when it had the chance to create something smart.
     
  12. KainXS

    KainXS

    Joined:
    Sep 25, 2007
    Messages:
    5,601 (2.14/day)
    Thanks Received:
    502
    Yeah, it seems pretty backwards when I think about it too.

    This Larrabee stuff is based off their P54C, i.e. the original Pentium core (like 14 years old).
     
  13. lemonadesoda

    lemonadesoda

    Joined:
    Aug 30, 2006
    Messages:
    6,267 (2.08/day)
    Thanks Received:
    968
    Er, but so are Core 2 and i7.

    Remember, you need very little x86 logic to build a basic core. The key to Larrabee's performance is the vector register extensions and special vector operators. Here's an introduction to vector registers: http://en.wikipedia.org/wiki/SIMD

    x86 is very clever as a base. It means ANYONE (who can program! lol) can code for it with a short learning curve, and existing IDEs can be used. Think of CUDA, but without having to learn something new or use new code libraries: just use existing x86 code and add a few extra instructions to handle the vector data.
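    A rough sketch of that idea in C, using 128-bit SSE intrinsics (Larrabee's vector unit is reportedly 512 bits wide, so this only illustrates the SIMD principle, not actual Larrabee code):

    ```c
    #include <xmmintrin.h>  /* SSE intrinsics, available on any modern x86 compiler */
    #include <assert.h>

    /* scalar version: four separate adds, one per loop iteration */
    void add_scalar(float *out, const float *a, const float *b) {
        for (int i = 0; i < 4; i++)
            out[i] = a[i] + b[i];
    }

    /* vector version: one SIMD instruction adds all four floats at once */
    void add_simd(float *out, const float *a, const float *b) {
        __m128 va = _mm_loadu_ps(a);            /* load 4 floats into a vector register */
        __m128 vb = _mm_loadu_ps(b);
        _mm_storeu_ps(out, _mm_add_ps(va, vb)); /* 4 additions in one operation */
    }

    int main(void) {
        float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40};
        float s[4], v[4];
        add_scalar(s, a, b);
        add_simd(v, a, b);
        for (int i = 0; i < 4; i++)
            assert(s[i] == v[i]); /* both paths produce the same result */
        return 0;
    }
    ```

    The point is exactly what's described above: the surrounding code stays ordinary x86 C, and only the hot loop changes to use vector operations.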
     
    1c3d0g says thanks.
  14. devguy

    devguy

    Joined:
    Feb 17, 2007
    Messages:
    1,239 (0.44/day)
    Thanks Received:
    171
    Location:
    SoCal
    Have you ever seen the IA-64 architecture? IMHO, it's one of the nicest, best-built architectures out there (way better than the messes that are x86 and SPARC). However, it really didn't take off because, OMG, companies would have had to run their code through a different compiler (and rewrite some of it if it didn't work straight through and/or they wanted to optimize it). That's even after Microsoft built a whole native IA-64 version of Windows, and it still didn't help.

    As you can see, Intel is done trying to introduce new architectures to the market and will stick to its good old moneymaker (...errr, friend), the x86 architecture.
     
    Last edited: May 13, 2009
    lemonadesoda says thanks.
  15. TheGuruStud

    TheGuruStud

    Joined:
    Sep 15, 2007
    Messages:
    1,643 (0.62/day)
    Thanks Received:
    173
    Location:
    Police/Nanny State of America
    It would've helped if the Itanic had been fast at more than three tasks :p.

    Having a bajillion megs of cache was the only reason it wasn't even worse.
     
  16. OmegaAI

    OmegaAI New Member

    Joined:
    Jan 24, 2009
    Messages:
    369 (0.17/day)
    Thanks Received:
    28
    Location:
    Stuff
    Hmmm... well, the thing isn't going to be good for gaming (with this chip, at least).
     
  17. devguy

    devguy

    Joined:
    Feb 17, 2007
    Messages:
    1,239 (0.44/day)
    Thanks Received:
    171
    Location:
    SoCal
    Never said that beauty == better. Look at Agena vs. Conroe... ;) But don't forget, I was speaking of a pre-EM64T era, when all Intel had was NetBurst.

    Anyway, this Larrabee looks like something I'm definitely going to keep my eye on!
     
  18. Imsochobo New Member

    Joined:
    Feb 19, 2009
    Messages:
    514 (0.24/day)
    Thanks Received:
    35
    Location:
    I live in Norway, in the province Buskerud.
    Skeptical.

    I don't know how much experience Nvidia and ATI have compared to Intel.

    Intel mostly has money and CPU designs; they haven't really done GPU work since the i740, which the whole GMA line is built on, so it must have been quite a transition!

    I guess they'll get more into the game once they've gained some experience.

    I'd say ATI's market strategy is one of the best there is:

    one architecture becomes many, many chips;

    a small memory bus with GDDR5 for less complexity (4-layer PCB vs. 8 on the high end), and so on;

    multi-GPU.

    However, I'm not entirely against Larrabee as a strategy. I'm just guessing the third generation will start to become what we'd call a solid card, while the first generation will have little polish compared to ATI and Nvidia.
    I don't expect it to lack raw power, and the strategy is certainly interesting, but I'm wondering how well x86 can really do GPU work. They might capture many customers who need rendering and math power with little API work (which CUDA and ATI Stream require); this is in fact something Intel is aiming for.
     
  19. v12dock

    v12dock

    Joined:
    Dec 18, 2008
    Messages:
    1,611 (0.74/day)
    Thanks Received:
    321
    They have a screenshot of World of Warcraft, which I have run on the 845 chipset, so...
     
  20. DrPepper

    DrPepper The Doctor is in the house

    Joined:
    Jan 16, 2008
    Messages:
    7,483 (2.98/day)
    Thanks Received:
    813
    Location:
    Scotland (It rains alot)
    Intel is banking on the hope that people will adopt its easy programming model to accelerate normal tasks like video encoding, etc. I believe this will be an all-in-one chip that can be programmed to do both GPU and CPU work, which means that if they get enough support, more powerful ones could dominate ATI and Nvidia in the CAD and ray-tracing arenas.
     
  21. a_ump

    a_ump

    Joined:
    Nov 21, 2007
    Messages:
    3,620 (1.41/day)
    Thanks Received:
    376
    Location:
    Smithfield, WV
    Really? I seriously didn't see Larrabee impacting ATI or Nvidia for at least another 2-4 years. I mean, they live to make GPUs, and Intel is just starting this new way of handling GPU tasks (lol, I don't know all the technical terms). So, like most new things, I expect it to be slightly buggy and to take time to improve into something worth looking at. But then again, it's Intel, and with all their money I bet they're testing it like no other chip they have, to ensure a solid launch with good performance.
     
  22. MikeX New Member

    Joined:
    Jun 15, 2006
    Messages:
    125 (0.04/day)
    Thanks Received:
    10
    Intel should make them hyper-threaded.
     
  23. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    13,963 (6.24/day)
    Thanks Received:
    3,803
    Location:
    IA, USA
    Hyperthreading isn't very useful in GPU architectures--only CPUs. All hyperthreading really does is use parts of the processor that would otherwise sit idle. Since all the cores in these GPUs are about as simple as they can be made, there is very little that sits idle, and what is idle would take more work than it's worth to access.
     
    Crunching for Team TPU
  24. DrPepper

    DrPepper The Doctor is in the house

    Joined:
    Jan 16, 2008
    Messages:
    7,483 (2.98/day)
    Thanks Received:
    813
    Location:
    Scotland (It rains alot)
    Well, Intel's idea for Larrabee is that there are lots of processor cores working in parallel, like a GPU. So it's pretty much x86 processors acting like GPU cores. Intel lives to make CPUs, and they make them like no other; as well as acting like a GPU, these cores can be programmed to do CPU work too. Larrabee isn't meant to render as fast as ATI and Nvidia (maybe future ones will), but Larrabee is something different: it can do whatever a CPU can do, and that's why it's exciting.
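    The "lots of x86 cores working in parallel" model can be sketched with ordinary POSIX threads. This is only an analogy (a real Larrabee core adds hardware threads and wide vector units), but the work-splitting idea is the same:

    ```c
    #include <pthread.h>
    #include <assert.h>

    #define N_WORKERS 4      /* stand-in for "many cores" */
    #define N_ITEMS   1024

    static float data[N_ITEMS];

    /* each worker handles its own slice of the array,
       the way a many-core GPU splits pixels across cores */
    static void *worker(void *arg) {
        int id = *(int *)arg;
        int chunk = N_ITEMS / N_WORKERS;
        for (int i = id * chunk; i < (id + 1) * chunk; i++)
            data[i] *= 2.0f;
        return 0;
    }

    int main(void) {
        pthread_t tid[N_WORKERS];
        int ids[N_WORKERS];
        for (int i = 0; i < N_ITEMS; i++)
            data[i] = (float)i;
        for (int i = 0; i < N_WORKERS; i++) {
            ids[i] = i;
            pthread_create(&tid[i], 0, worker, &ids[i]);
        }
        for (int i = 0; i < N_WORKERS; i++)
            pthread_join(tid[i], 0);
        assert(data[10] == 20.0f); /* every element was doubled by some worker */
        return 0;
    }
    ```

    Because each worker is a normal x86 thread, the same code could just as easily do "CPU work"; that flexibility is the point being made above.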
     
