
Intel Larrabee Capable of 2 TFLOPs

Discussion in 'News' started by btarunr, Jul 6, 2008.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,749 (11.14/day)
    Thanks Received:
    13,680
    Location:
    Hyderabad, India
    German tech journal Heise caught up with Intel's Pat Gelsinger for an article discussing the company's past and future as the silicon giant heads towards 40 years of service this 18th of July.

    Among the several topics discussed, the most interesting was visual computing and Intel's plans for it, with 'Larrabee' as the buzzword. It is the codename of Intel's upcoming graphics processor (GPU) architecture, with which the company plans to take on established players such as NVIDIA and AMD, among others.

    What's unique (so far) about Larrabee is that it's entirely made up of x86 processing cores; it is likely to have 32 of them. Here's a surprise: these cores are based on the design of the Pentuim P54C, a 13+ year-old x86 processor. The design will be shrunk to the 45 nm fabrication process, each core will be assisted by a 512-bit SIMD unit, and the cores will support 64-bit addressing. Gelsinger says that 32 of these cores clocked at 2.00 GHz could deliver 2 TFLOPs of raw computational power, close to that of the upcoming AMD R700. Heise also reports that this GPU could have a TDP of as much as 300 W (peak).
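The 2 TFLOPs figure can be sanity-checked with back-of-the-envelope arithmetic. A minimal sketch, assuming each 512-bit SIMD unit retires one 16-wide single-precision multiply-add per clock (an assumption on my part, not something stated in the Heise article):

```python
# Back-of-the-envelope peak-FLOPs estimate for the rumored Larrabee config.
# Assumption (not confirmed in the article): each 512-bit SIMD unit issues
# one 16-wide single-precision multiply-add (2 flops per lane) per cycle.

CORES = 32
CLOCK_HZ = 2.0e9            # 2.00 GHz
SIMD_WIDTH = 512 // 32      # 16 single-precision lanes per 512-bit vector
FLOPS_PER_LANE = 2          # one multiply + one add per cycle

peak_flops = CORES * CLOCK_HZ * SIMD_WIDTH * FLOPS_PER_LANE
print(f"{peak_flops / 1e12:.3f} TFLOPs")  # → 2.048 TFLOPs
```

Under those assumptions the numbers line up almost exactly with Gelsinger's claim, which suggests 2 TFLOPs is a theoretical peak rather than a measured figure.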

    With inputs from Heise
     
    Last edited: Jul 6, 2008
    DrPepper and OnBoard say thanks.
  2. tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,958 (2.35/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    I want actual pics of the card...
     
  3. magibeg

    magibeg

    Joined:
    May 9, 2006
    Messages:
    2,000 (0.65/day)
    Thanks Received:
    203
    wow 300 watts eh? Could heat the lower level of my house. I get the strange feeling this is either going to horribly flop or do incredibly well. Very little middle ground :p
     
  4. Morgoth

    Morgoth

    Joined:
    Aug 4, 2007
    Messages:
    3,795 (1.43/day)
    Thanks Received:
    250
    Location:
    Netherlands
    i'm not sure but i thought it was an integrated GPU on Nehalem with 2 cores + HT :S
    now i have seen this [IMG]
    i'm starting to get confused lol



    btarunr: It's spelled Larrabee.
     
    Last edited: Jul 6, 2008
    btarunr says thanks.
  5. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,749 (11.14/day)
    Thanks Received:
    13,680
    Location:
    Hyderabad, India
    So, that 150W connector is 10-pin? (I'll eliminate 12-pin since two 6-pin connectors have a blank pin each that could be shared with pin #3)
     
  6. 1c3d0g

    1c3d0g

    Joined:
    Dec 9, 2007
    Messages:
    699 (0.28/day)
    Thanks Received:
    59
    300 watts. Can you say nuclear reactor? What happened to efficiency, Intel? :(

    I don't like where GPUs are heading. There's too much power draw for so little performance increase, and this goes for all GPU makers. Something needs to be done to bring power demands back in line with the rest of a computer's components. They have enough trouble as it is squeezing last-generation high-end GPUs into notebooks, but this is just ridiculous.
     
  7. KieranD

    KieranD

    Joined:
    Aug 16, 2007
    Messages:
    8,043 (3.05/day)
    Thanks Received:
    822
    Location:
    Glasgow, Scotland
    most whack gpu i've ever seen

    whoever thought of taking 32 old cpus and making a gpu out of them is either incredibly stupid or amazingly crafty

    i'm not even going to bother saying much because we all know nothing matters until we get results

    STILL, i know for a fact 300w is a lot for a gpu, i mean you could nearly run a full pc on that
     
  8. Voyager

    Joined:
    Jun 18, 2008
    Messages:
    23 (0.01/day)
    Thanks Received:
    2
    x86 cores :twitch:
    Now we can run existing software on GPU :toast:
     
  9. DonInKansas

    DonInKansas

    Joined:
    Jun 2, 2007
    Messages:
    5,096 (1.88/day)
    Thanks Received:
    1,265
    Location:
    Kansas
    I'd never have to run the heater in the winter; I'll just play more games!:p
     
  10. KieranD

    KieranD

    Joined:
    Aug 16, 2007
    Messages:
    8,043 (3.05/day)
    Thanks Received:
    822
    Location:
    Glasgow, Scotland
    wonder if you could run programs on the gpu instead of the cpu LOL this card boggles me completely
     
  11. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,749 (11.14/day)
    Thanks Received:
    13,680
    Location:
    Hyderabad, India
    Why so much fuss about its TDP? Wasn't the HD2900 XT like 200W (peak)?
     
  12. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,400 (11.53/day)
    Thanks Received:
    9,698
    based on the design of Pentuim P54C
    Pentuim? is that some CPU i never heard of?

    *cough spellcheck*

    300W TDP... gack.
     
  13. jyoung75 New Member

    Joined:
    Jul 6, 2008
    Messages:
    15 (0.01/day)
    Thanks Received:
    4
    2 TFLOPS from Larrabee a year from now is nice, but I can get 2.4 TFLOPS from the Radeon 4870x2 a month from now. And the Radeon cards are already rumored to be ray-tracing monsters (used for ray tracing HD scenes in Transformers): http://www.tgdaily.com/content/view/38145/135/
     
  14. TheGuruStud

    TheGuruStud

    Joined:
    Sep 15, 2007
    Messages:
    1,631 (0.63/day)
    Thanks Received:
    171
    Location:
    Police/Nanny State of America
    Since when is a general-purpose cpu going to be able to process graphics at a respectable rate?

    If that were the case, everyone with a quad core would be getting 50 FPS in 3DMark's CPU test (I don't care if it has high-speed ram and cache attached or not). I'm calling intel retarded, again.

    edit: Or it's more FUD. Like that 10 GHz Pentium 4 they just had laying around :laugh:
     
    WarEagleAU says thanks.
  15. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,749 (11.14/day)
    Thanks Received:
    13,680
    Location:
    Hyderabad, India
    If a ~70 GFLOPs Core 2 Extreme can do ~6 fps, guess what 2000 GFLOPs can.
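btarunr's extrapolation can be made explicit with a naive estimate. A sketch (the linear-scaling assumption is mine; real rendering workloads rarely scale this cleanly with peak FLOPs):

```python
# Naive linear extrapolation: if ~70 GFLOPs of CPU yields ~6 fps in a
# software-rendered test, what would 2000 GFLOPs yield at the same
# efficiency? Assumes frame rate scales linearly with peak FLOPs.
baseline_gflops, baseline_fps = 70.0, 6.0
larrabee_gflops = 2000.0

estimated_fps = baseline_fps * (larrabee_gflops / baseline_gflops)
print(f"~{estimated_fps:.0f} fps")
```

That comes out to roughly 171 fps under these assumptions; the point is only that the raw FLOPs put it in playable territory on paper, not that this number would survive contact with a real renderer.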
     
  16. dracoonpit

    dracoonpit New Member

    Joined:
    Jun 25, 2008
    Messages:
    17 (0.01/day)
    Thanks Received:
    2
    Location:
    Germany
    I disagree. Relative to the performance increase, GPUs have become more efficient with every new model. The performance-per-watt ratio of new GPUs keeps improving, no matter if they draw 200 watts.
     
  17. TheGuruStud

    TheGuruStud

    Joined:
    Sep 15, 2007
    Messages:
    1,631 (0.63/day)
    Thanks Received:
    171
    Location:
    Police/Nanny State of America
    That's a 100% theoretical max. I have a lot more faith in a C2E than in some untested, whack design that's supposedly derived from an old architecture. If it's a new architecture, or at least mostly built from the ground up, then I'll be quiet. What they're claiming is just ridiculous.

    To me, this is like M$ saying the Xbox 360 is fast b/c it has three cores and runs at 3.2 GHz. But in reality there aren't many transistors and it just can't push much data.
     
  18. eidairaman1

    eidairaman1

    Joined:
    Jul 2, 2007
    Messages:
    13,075 (4.88/day)
    Thanks Received:
    1,652
    intel can claim this and claim that, but by the time they release the card it will already have been made obsolete by NVIDIA and AMD, hell, even VIA.
     
  19. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,400 (11.53/day)
    Thanks Received:
    9,698
    the only thing this has over video cards is the x86 architecture. that means you can effectively add 32 CPU cores to any machine, and ANY app should be able to use it (games, encoding/decoding apps, etc.)
     
  20. TheGuruStud

    TheGuruStud

    Joined:
    Sep 15, 2007
    Messages:
    1,631 (0.63/day)
    Thanks Received:
    171
    Location:
    Police/Nanny State of America
    VIA!!! Fastest. Stuff. Ever. :laugh:

    Seriously, though, VIA was cool back in the day, but they pissed me off when the athlon 64s came out. Those boards were slow and buggy.
     
  21. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,749 (11.14/day)
    Thanks Received:
    13,680
    Location:
    Hyderabad, India
    Just as you say you'd be quiet if it were something built from scratch, you can't be loud about this either. As for 'scratch', these are 'old' processors, but shrunk, clocked to 2 GHz, .....etc. When something of this sort comes from Gelsinger, it's better we not jump to the assumption that it's a 'bad' architecture, since we've seen nothing to prove it's bad just yet.
     
  22. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,400 (11.53/day)
    Thanks Received:
    9,698
    since the Core architecture (Core Solo/Duo, and then on to Core 2 Duo/Quad) designs came from the Pentium III Tualatin, his argument really falls down anyway. old cores that hit a tech limit can really be revitalised with new tech and die shrinks.
     
  23. panchoman

    panchoman Sold my stars!

    Joined:
    Jul 16, 2007
    Messages:
    9,595 (3.60/day)
    Thanks Received:
    1,200
    WTF, you call that a gpu? thats not a gpu! thats 32 old Pentium cores stuck together on a card with a 300w tdp after a huge-ass die shrink! WTF intel.. i expected much much much more from you, come on.. 32 Pentium cores stuck together to make a gpu...
     
  24. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,749 (11.14/day)
    Thanks Received:
    13,680
    Location:
    Hyderabad, India
    A quad-core QX9770 draws 130W, isn't 32 cores @ 300W an improvement?
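The comparison behind that question can be put in cores-per-watt terms. A rough sketch (this ignores that a P54C-class core and a Penryn core have very different per-core throughput, so it is only a packaging comparison, not a performance one):

```python
# Cores-per-watt for a quad-core QX9770 (130 W TDP) vs. the rumored
# 32-core, 300 W Larrabee. Per-core capability differences are ignored.
qx9770 = {"cores": 4, "tdp_w": 130}
larrabee = {"cores": 32, "tdp_w": 300}

qx_ratio = qx9770["cores"] / qx9770["tdp_w"]
lrb_ratio = larrabee["cores"] / larrabee["tdp_w"]
print(f"QX9770: {qx_ratio:.3f} cores/W; Larrabee: {lrb_ratio:.3f} cores/W; "
      f"advantage: {lrb_ratio / qx_ratio:.2f}x")
```

On that naive count Larrabee packs roughly 3.5 times as many cores per watt, which is the improvement btarunr is pointing at.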
     
  25. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,400 (11.53/day)
    Thanks Received:
    9,698
    well, 300W TDP isn't so bad... oh wait, yes it is. TDP isn't the max, so real draw could even go up to 400W.

    That said, this is intel. they could easily throw in some power-saving features and have its power usage scale really well (a modified SpeedStep, for example)
     
