
First AMD Fusion Specifications, Hint Toward RV710 Specs.

Discussion in 'News' started by btarunr, Aug 22, 2008.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    33,583 (9.46/day)
    Thanks Received:
    17,149
    Location:
    Hyderabad, India
    AMD Fusion could well be the first CPU to feature a graphics processor core. It will incorporate a graphics processor with specifications identical to those of the RV710. The CPU will be built on a 45 nm silicon fabrication process and manufactured at the Taiwan Semiconductor Manufacturing Company (TSMC). The GPU will be called "Kong". Here are its specifications:
    • Core frequency between 600 and 800 MHz
    • 128-bit wide memory bus (DDR3, Side-port supportive)
    • 40 Stream Processors
    • 8 TMUs, 4 ROPs
    • DirectX 10.1 support
    • UVD

    The GPU will not connect to the rest of the processor over HyperTransport; instead, a new interface referred to as "Onion" will replace it. Keeping with the "green" naming scheme, the GPU will also use a "Garlic" memory interface to enhance read/write performance and reduce latencies. Performance-wise, it is expected to perform 50% better than the RS780 chip. The addition of this GPU will step up power consumption by 5~8 W (load) and 0.4~0.6 W (idle). This is the first implementation of such a design methodology; AMD wants to step into the pool only after dipping its toes.
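The bullet specs above lend themselves to quick back-of-the-envelope math. A minimal sketch of the theoretical throughput, assuming one MADD (two FLOPs) per stream processor per clock and one texel/pixel per TMU/ROP per clock, as is typical of RV7xx-class parts; the function name and resulting figures are illustrative, not from the source:

```python
# Rough theoretical throughput for the rumored "Kong" IGP.
# Assumptions (not from the article): one MADD (2 FLOPs) per stream
# processor per clock, one texel per TMU per clock, one pixel per
# ROP per clock.

def kong_throughput(clock_mhz, sps=40, tmus=8, rops=4):
    """Return (GFLOPS, GTexel/s, GPixel/s) at the given core clock."""
    ghz = clock_mhz / 1000.0
    return (sps * 2 * ghz,   # shader throughput, GFLOPS
            tmus * ghz,      # texture fillrate, GTexel/s
            rops * ghz)      # pixel fillrate, GPixel/s

# At the two ends of the rumored 600~800 MHz range:
print(kong_throughput(600))  # (48.0, 4.8, 2.4)
print(kong_throughput(800))  # (64.0, 6.4, 3.2)
```

Even at the top of the range that lands in low-end desktop IGP territory, consistent with the 50%-over-RS780 claim.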

    Source: IT.com.cn
     
    Last edited: Aug 22, 2008
  2. From_Nowhere New Member

    Joined:
    Jun 13, 2008
    Messages:
    661 (0.20/day)
    Thanks Received:
    77
    This will be awesome for laptops and HTPC's.

    As long as it doesn't smell like an onion or a clove of garlic.
     
  3. tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,956 (1.77/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    Garlic = ring bus. This will definitely put the Intel Atom to shame, as the K8 architecture excels at lower clock speeds; moreover, an IGP equivalent to today's low-end GPUs is present. It's faster than my old 9550 :(
     
    10 Year Member at TPU
  4. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    45,865 (9.87/day)
    Thanks Received:
    13,322
    Location:
    Australalalalalaia.
    garlic and onion? wtf?
     
    10 Year Member at TPU
  5. From_Nowhere New Member

    Joined:
    Jun 13, 2008
    Messages:
    661 (0.20/day)
    Thanks Received:
    77
    To make the vampiric Intel cry and run away? Dunno, just nicknames.
     
    Last edited: Oct 17, 2009
  6. laszlo

    laszlo

    Joined:
    Jan 11, 2005
    Messages:
    1,056 (0.23/day)
    Thanks Received:
    347
    Location:
    66 feet from the ground
    So the future is here already; I expected this to happen after 2010, but it seems we won't need stand-alone GPUs in a few years. This is bad news for NVIDIA and also for ATI, because OEM-built PCs (not the expensive ones) have weak GPUs, so they can save money now...
     
    10 Year Member at TPU
  7. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    33,583 (9.46/day)
    Thanks Received:
    17,149
    Location:
    Hyderabad, India
    = stinks when raw, yummy when cooked.

    I'd say Garlic = sideport, so even before implementing DDR3, or on cheap boards without it, the DDR3 sideport memory should make use of memory chips present on the motherboard.
     
  8. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    45,865 (9.87/day)
    Thanks Received:
    13,322
    Location:
    Australalalalalaia.
    I can see a few problems arising from this setup.

    #1. Motherboard needs support - otherwise you have no outputs
    #1.1. You're really screwed if you want more/different outputs (adding HDMI, etc)
    #2. What will happen if you add an unsupported video card (NVIDIA, for example)? Will it be able to be disabled?

    bonus#. This will be fun for ITX and really tiny systems. Only one heatsink needed...
     
    10 Year Member at TPU
  9. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,318 (6.20/day)
    Thanks Received:
    3,777
    Or alternatively, the graphics core could be used strictly as a GPGPU device. It could offload physics and the like. They would just need to make sure its driver registers it as a co-processor instead of a GPU; that way, if you wanted to add an NV card in Vista, you could retain some functionality from the GPU core in the CPU and not have to worry about Vista disabling one of your graphics drivers.
     
    10 Year Member at TPU
  10. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    33,583 (9.46/day)
    Thanks Received:
    17,149
    Location:
    Hyderabad, India
    Yup.

    Why? Isn't it the same with boards with onboard graphics? Aren't you equally screwed if you need, say, HDMI and the board gives you only D-Sub? (D-Sub to DVI to HDMI using MacGyver dongles won't work; you'd need wiring on a DVI port that carries HDMI, as happens with, say, ATI cards that come with DVI-HDMI dongles.)

    The same as what happens when you use one on a 780G board: nothing. Fusion gives out video through the board's connectors. When using ATI Hybrid Graphics, your monitor plugs into the connector on the board, not the card(s), so I'm not sure that on boards without display output(s) the output of Fusion would go to a graphics card. (I'm just guessing.)

    Yes, ITX, notebooks....only 0.7 W (for the graphics) in idle is awesome.
     
  11. tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,956 (1.77/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    #1. Fusion uses different socket anyway
    #1.1 Read above
    #2. Duh.
     
    10 Year Member at TPU
  12. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    45,865 (9.87/day)
    Thanks Received:
    13,322
    Location:
    Australalalalalaia.
    Hey, I never stated they were facts or anything guys, just stating the obvious really.

    Using a different socket is good; I wasn't aware of that.
     
    10 Year Member at TPU
  13. Tatty_One

    Tatty_One Senior Moderator Staff Member

    Joined:
    Jan 18, 2006
    Messages:
    19,377 (4.64/day)
    Thanks Received:
    5,632
    Location:
    Worcestershire, UK
    Yes, my thoughts too... leading to the true concept of multiple hybrid GPUs in a single system, a more cost-effective way of enhancing performance... providing, of course, the CPUs and motherboards are not too expensive.
     
    10 Year Member at TPU
  14. tigger

    tigger I'm the only one

    Joined:
    Mar 20, 2006
    Messages:
    10,476 (2.54/day)
    Thanks Received:
    1,695
    I think these chips would need bigger coolers than a CPU alone, though; you have a hot CPU and a hot GPU in the same package.

    Nice idea though.
     
    10 Year Member at TPU
  15. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    45,865 (9.87/day)
    Thanks Received:
    13,322
    Location:
    Australalalalalaia.
    "The addition of this GPU will step-up power consumption by 5~8 W (load) and 0.4~0.6 W (idle)"

    When you factor in that the average CPU wattage is around 70 W (TDP) these days, it's basically irrelevant to temps.
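The point checks out numerically; a quick sketch using the ~70 W average CPU TDP figure from the post above and the article's upper 8 W load delta (both figures are the posters', not mine):

```python
# Share of total package power the IGP would take at load,
# using the article's worst-case 8 W against a ~70 W CPU TDP.
cpu_tdp_w = 70.0
igp_load_w = 8.0   # upper end of the article's 5~8 W range

share = igp_load_w / (cpu_tdp_w + igp_load_w)
print(f"IGP share of package power: {share:.1%}")  # IGP share of package power: 10.3%
```

About a tenth of package power at worst, and well under 1% at idle, so the cooling impact is indeed minor.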
     
    10 Year Member at TPU
  16. WhiteLotus

    WhiteLotus

    Joined:
    Jul 30, 2007
    Messages:
    6,560 (1.81/day)
    Thanks Received:
    865
    This looks like a very interesting method of doing things, and I think it can dramatically improve the HTPC without doing much.

    As for NVIDIA and ATI losing out on discrete graphics (above^): ATI and AMD are the same company now, so ATI won't be losing out at all, and AMD will have a great selling point as well. This is something Intel can't do (I think), so AMD could be the sole provider of such chips to that market.
     
  17. laszlo

    laszlo

    Joined:
    Jan 11, 2005
    Messages:
    1,056 (0.23/day)
    Thanks Received:
    347
    Location:
    66 feet from the ground
    just my 2 cents

    Imagine a 22 nm chip with a dual-core CPU and an HD 4870; would you buy a discrete graphics card? Because this will happen, and we already have mobos with onboard RAM; I don't think it's hard to put 1 GB of RAM on a mobo.
     
    dawgnbirfar says thanks.
    10 Year Member at TPU
  18. mdm-adph

    mdm-adph

    Joined:
    Mar 28, 2007
    Messages:
    2,479 (0.66/day)
    Thanks Received:
    340
    Location:
    Your house.
    There pretty much always has been, and always will be, standalone video. :p There's always going to be a need for a discrete, ultra-powerful, dedicated video solution. Maybe it'll become more expensive and in the domain of workstations in the future, but it'll always be there.
     
    10 Year Member at TPU
  19. wolf2009 Guest

    I love the names AMD comes up with: Venice, Brisbane, Shanghai, and now Garlic and Onion. Lol, let's see tomatoes and oranges and apples too.
     
    10 Year Member at TPU
  20. MilkyWay

    Joined:
    Aug 16, 2007
    Messages:
    7,181 (1.99/day)
    Thanks Received:
    731
    What happens when you upgrade the CPU? Do you have to upgrade the GPU at the same time?

    What happens when the GPU is outdated but the CPU is still okay, or the opposite?

    These questions lead me to the conclusion that they only want to get rid of the IGP, not GPUs altogether.
     
  21. jydie New Member

    Joined:
    Feb 2, 2006
    Messages:
    209 (0.05/day)
    Thanks Received:
    3
    I agree with your conclusion... This will probably be used in low cost (basic) systems, laptops, Home Theater Systems, etc. If you want to do some decent gaming, then you will need a good video card. The way I see it, this simply moves the GPU for integrated graphics from the motherboard to the CPU.

    This will be great for laptops!! If you can focus most of the heat buildup to one area, then I would think that more efficient cooling methods would follow.
     
    Last edited by a moderator: Feb 3, 2016
    10 Year Member at TPU
  22. PCpraiser100 New Member

    Joined:
    Jul 17, 2008
    Messages:
    1,062 (0.33/day)
    Thanks Received:
    68
    40 stream processors? That's very little.
     
  23. MrMilli

    MrMilli

    Joined:
    Mar 1, 2008
    Messages:
    234 (0.07/day)
    Thanks Received:
    44
    Location:
    Antwerp, Belgium
    AMD Fusion could well be the first CPU to feature a graphics core?
    To name a few before Fusion: the Cyrix MediaGX (later AMD Geode), VIA Mark CoreFusion, and Intel Timna (cancelled, though)! So it's really not the first of its kind.

    Fusion will probably be a MCM chip which would imply that only the GPU will be produced at TSMC and the CPU in AMD's own fabs.

    To answer MilkyWay: Fusion will have PCI-E integrated, so you could add a discrete card.
     
    Last edited by a moderator: Feb 3, 2016
  24. substance90

    substance90 New Member

    Joined:
    Sep 22, 2007
    Messages:
    71 (0.02/day)
    Thanks Received:
    2
    Location:
    Bulgaria
    Too bad, I don't see this getting into netbooks and UMPCs! The VIA Nano is also better than the Atom, but still, do you know a single device using the Nano?
     
  25. Cuzza

    Cuzza

    Joined:
    Jul 3, 2007
    Messages:
    1,317 (0.36/day)
    Thanks Received:
    217
    Location:
    New Zealand
    That's what I'm talking about! MacGyver can do anything! Sorry, this is totally off topic.
     
    dawgnbirfar says thanks.
