
NVIDIA Quashes Larrabee and Fusion Hype

Discussion in 'News' started by btarunr, Aug 25, 2008.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,713 (11.16/day)
    Thanks Received:
    13,669
    Location:
    Hyderabad, India
    NVIDIA took potshots at Intel's upcoming Larrabee graphics processor and AMD's "GPU on CPU" device, Fusion. Speaking to the media ahead of the opening of the annual NVISION expo on Monday, Andy Keane, general manager of NVIDIA's GPU computing group, said there is an "incredible amount about Larrabee that's undefined," commenting on what little is known about Intel's GPU.

    "You can't just say 'it's x86 so it's going to solve the massively parallel computing problem.'" said Keane. "Look at the PC," he continued. "With an OS they don't control, and applications coming from everywhere... to say arbitrarily that everything's going to scale to 32 cores seems to me to be a bit of a stretch. " he added.

    John Mottram, the chief architect of the NVIDIA G200 graphics processor, raised further doubts about Larrabee's real-world performance, brushing aside Intel's announcements as a PR stunt. He is quoted as saying:

    "They've put out a certain amount of technical disclosure in the past five weeks," he noted, "but although they make Larrabee sound like it's a fundamentally better approach, it isn't. They don't tell you the assumptions they made. They talk about scaling, but they disregard memory bandwidth. They make it sound good, but we say, you neglected half a dozen things."

    "Every GPU we make, we always consider this type of design, we do a reasoned analysis, and we always conclude no. That's why we haven't built that type of machine."

    Essentially, NVIDIA feels Intel is creating too much hype over something that doesn't look like it can take on established players, as it would need some mysteriously powerful machinery beyond its 32 x86 cores. Peter Glaskowsky, a CPU architect and blogger, says that Larrabee in 2010 will have the same level of performance as a GPU from NVIDIA or ATI had back in 2006.

    Mottram didn't spare AMD either. In the line of fire was AMD's upcoming Fusion processor, a CPU with a graphics processor embedded.

    "Joining both components on the same die doesn't buy you that much," he commented. "It's not like there's a real bottleneck there. And every square millimeter you add to the die is a very expensive millimeter. It's an incremental expense, not a linear function. It's cheaper to separate them."

    Andy Keane doubted whether buyers would even care about Fusion. "The class of buyer who buys that type of graphics isn't really buying graphics," he argued. "They don't care about it."

    "You're working out what you can really afford to put on a CPU, and you're selling it to a customer who doesn't care. The economics don't make sense." he added.

    Source: PC Pro
     
    newconroer says thanks.
  2. newconroer

    newconroer

    Joined:
    Jun 20, 2007
    Messages:
    3,159 (1.18/day)
    Thanks Received:
    340
    Thanks BT, good to hear some responses to Larrabee and Fusion, I suppose.

    I think Larrabee, even if it flops, is still worth the time spent, as it might open avenues for developers; but Fusion, I have to agree, seems more novel than anything. Though if it brings some light to AMD's darkened corner of existence, then by all means!
     
  3. Lillebror New Member

    Joined:
    Jul 28, 2007
    Messages:
    720 (0.27/day)
    Thanks Received:
    88
    Location:
    Denmark
    I think most of those who are going to buy a thing like that are using it with programs that use a GPU - because then you have a GPU and a CPU in one and the same chip!
     
  4. W1zzard

    W1zzard Administrator Staff Member

    Joined:
    May 14, 2004
    Messages:
    14,962 (3.92/day)
    Thanks Received:
    11,768
    That's the whole point of Fusion: dirt-cheap OEM systems that need to be able to run Vista Aero, play back some basic video and be cheap, cheap, cheap. This is by far the biggest market in the PC industry, about 5,000 times bigger (educated guess) than all this GTX 260/280, 4870 X2 stuff.
     
  5. [I.R.A]_FBi

    [I.R.A]_FBi New Member

    Joined:
    May 19, 2007
    Messages:
    7,664 (2.82/day)
    Thanks Received:
    540
    Location:
    c:\programs\kitteh.exe
    If it's cheaper and does the same stuff, they don't care; chances are they won't even notice.
     
  6. Siman0 New Member

    Joined:
    Jul 6, 2008
    Messages:
    19 (0.01/day)
    Thanks Received:
    0
    NVIDIA needs to watch what they are saying, mostly to Intel. What if Intel says, "Oh well, we don't want that old multi-GPU thing on our boards, it just takes up space"?
     
  7. Swansen New Member

    Joined:
    Nov 18, 2007
    Messages:
    182 (0.07/day)
    Thanks Received:
    9
    I think that is something NVIDIA is missing here: the fact that "those people won't care" is exactly why it will work. Like someone else said, it will be able to run a 3D desktop as well as some minor 3D applications, all in a smaller space. Seems like a good idea to me, especially for laptops and UMPCs.
     
  8. Jansku07 New Member

    Joined:
    May 24, 2008
    Messages:
    171 (0.07/day)
    Thanks Received:
    24
    Location:
    Finland
    The users won't notice this, but OEMs will. Every cent that cheapens the PC is a big win for them. Fusion, if carried out successfully, will be a great success for both OEMs and DAAMIT.
     
  9. [I.R.A]_FBi

    [I.R.A]_FBi New Member

    Joined:
    May 19, 2007
    Messages:
    7,664 (2.82/day)
    Thanks Received:
    540
    Location:
    c:\programs\kitteh.exe

    And that is what is going to rake in the sales.
     
  10. Jansku07 New Member

    Joined:
    May 24, 2008
    Messages:
    171 (0.07/day)
    Thanks Received:
    24
    Location:
    Finland
    Says the head architect of the G200 core... :roll:
     
    Zubasa, DrPepper, WarEagleAU and 3 others say thanks.
  11. HTC

    HTC

    Joined:
    Apr 1, 2008
    Messages:
    2,240 (0.93/day)
    Thanks Received:
    303
    Quoted for truth!
     
  12. lemonadesoda

    lemonadesoda

    Joined:
    Aug 30, 2006
    Messages:
    6,260 (2.10/day)
    Thanks Received:
    967
    I think he means exponentially increasing, not incremental.

    Cheaper to separate them? No. That's why the CPU contains FPU math today, whereas before it was on a separate x87 chip. It's FAR cheaper to combine them, up to some critical point that is a combination of heat and bad dies: cost = fn(heat, bad dies), which we can also write as fn(power consumption, total die size, process node, error rate per mm² of die).

    Oh look! The nVidia equation isn't good. Too much power consumption. Too big a die. Too coarse a process node. And too many failures per mm².

    So nVidia wouldn't be able to pull it off. But I'm sure Intel or AMD could! :roll:
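
    That fn(...) is roughly the textbook die-cost relation. A minimal sketch under a Poisson yield assumption; the wafer cost, wafer area and defect density below are invented purely for illustration:

    ```python
    import math

    def cost_per_good_die(wafer_cost: float, wafer_area_mm2: float,
                          die_area_mm2: float, defects_per_mm2: float) -> float:
        """Poisson yield: yield = exp(-die area * defect density), so the
        cost of a *working* die grows super-linearly with die area."""
        dies_per_wafer = wafer_area_mm2 / die_area_mm2  # ignores edge losses
        die_yield = math.exp(-die_area_mm2 * defects_per_mm2)
        return wafer_cost / (dies_per_wafer * die_yield)

    # Doubling die area from 200 to 400 mm^2 roughly triples the cost here:
    # exponentially increasing, not incremental.
    for area_mm2 in (100, 200, 400):
        print(f"{area_mm2} mm^2 -> ${cost_per_good_die(5000, 70000, area_mm2, 0.002):.2f} per good die")
    ```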
     
    Zubasa and WarEagleAU say thanks.
  13. [I.R.A]_FBi

    [I.R.A]_FBi New Member

    Joined:
    May 19, 2007
    Messages:
    7,664 (2.82/day)
    Thanks Received:
    540
    Location:
    c:\programs\kitteh.exe
    nvidia + gpu + cpu = fail
     
  14. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,713 (11.16/day)
    Thanks Received:
    13,669
    Location:
    Hyderabad, India
    It won't happen; they've yet to get a proper x86 license. So maybe this is true:

    ...because they can't. They've yet to get into out-of-order execution processors.
     
    WarEagleAU says thanks.
  15. xfire

    xfire New Member

    Joined:
    Nov 22, 2007
    Messages:
    1,395 (0.55/day)
    Thanks Received:
    193
    Location:
    Hyderabad,India
    Who's still waiting for the can to open?
    People buying the can are buying graphics :p
     
  16. Morgoth

    Morgoth

    Joined:
    Aug 4, 2007
    Messages:
    3,795 (1.44/day)
    Thanks Received:
    250
    Location:
    Netherlands
    NVIDIA should STFU; they're still in the game because of Intel and AMD.
     
    WarEagleAU says thanks.
  17. truehighroller1 New Member

    Joined:
    Oct 20, 2007
    Messages:
    189 (0.07/day)
    Thanks Received:
    9
    I want to see what Intel says about this outburst.
     
  18. Morgoth

    Morgoth

    Joined:
    Aug 4, 2007
    Messages:
    3,795 (1.44/day)
    Thanks Received:
    250
    Location:
    Netherlands
    I think as soon as Intel gets a good GPU, and NVIDIA still doesn't have an x86 license, NVIDIA will probably be on its own.
     
  19. X1REME

    X1REME New Member

    Joined:
    Jan 3, 2008
    Messages:
    84 (0.03/day)
    Thanks Received:
    6
    NVIDIA is building its own x86 CPU and is about to announce it any day now. They don't have an x86 license, so I don't understand what they're going to do about that (maybe use VIA = Atom).

    NVIDIA trashing AMD is not good, as AMD lets NVIDIA put its chipsets/GPUs/motherboards into AMD-based servers/desktops/laptops.
     
  20. Nkd New Member

    Joined:
    Sep 15, 2007
    Messages:
    93 (0.04/day)
    Thanks Received:
    18
    I am an NVIDIA owner, but they seem to be getting desperate. They know they have no answer for Fusion, and AMD will take a lot of OEM systems by eliminating the need for dedicated graphics. NVIDIA is just downplaying Fusion because they have no answer for it, or their answer will be late.
     
  21. Darkrealms

    Joined:
    Feb 26, 2007
    Messages:
    852 (0.30/day)
    Thanks Received:
    23
    Location:
    USA
    That is a lot of talking from NVIDIA right now. They had better have something good with their GTX 300 series or they could be in trouble.

    To an earlier comment about NVIDIA on Intel chipsets (sorry, didn't see your post again when I scanned through): that is the board manufacturers putting NVIDIA's SLI chip on the boards, not Intel. As long as NVIDIA has a large fan base and a good product, I think the manufacturers will still work with NVIDIA. Right now NVIDIA is fighting head to head with ATI, so trying to cut them out would be a big mistake. If ATI/AMD and Intel can build something that clearly outperforms NVIDIA, then I think NVIDIA will be out.

    I still think the first chance NVIDIA gets, they will buy an x86 company/license.
     
  22. PP Mguire

    PP Mguire New Member

    Joined:
    Aug 15, 2008
    Messages:
    5,005 (2.21/day)
    Thanks Received:
    453
    Location:
    Venus, Texas
    No, because their chipset market is only a small percentage of where their money comes from.
     
  23. wolf2009 Guest

    Agreed! Although it's a smart company, it will come up with something.
     
  24. Swansen New Member

    Joined:
    Nov 18, 2007
    Messages:
    182 (0.07/day)
    Thanks Received:
    9
    Intel is huge; I think they couldn't care less.
     
  25. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,713 (11.16/day)
    Thanks Received:
    13,669
    Location:
    Hyderabad, India
    Actually, this is a result of Intel rubbishing the growing popularity of NVIDIA's CUDA. It's a counteroffensive. Dig through our news archives and you'll find the story.
     
