
First AMD Fusion Specifications Hint Toward RV710 Specs

Discussion in 'News' started by btarunr, Aug 22, 2008.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,750 (11.14/day)
    Thanks Received:
    13,683
    Location:
    Hyderabad, India
    AMD Fusion could well be the first CPU to feature a graphics processor core. It will incorporate a graphics processor with specifications identical to the RV710. The CPU will be built on a 45 nm silicon fabrication process and manufactured at the Taiwan Semiconductor Manufacturing Company (TSMC). The GPU core will be called "Kong". Here are its specifications:
    • Core Frequency between 600 ~ 800 MHz
    • 128-bit wide memory bus (DDR3, Side-port supportive)
    • 40 Stream Processors
    • 8 TMUs, 4 ROPs
    • DirectX 10.1 support
    • UVD
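For scale, the listed specs can be turned into theoretical throughput numbers. A minimal sketch, assuming RV710-style shaders that execute one MAD (2 FLOPs) per clock and one texel/pixel per TMU/ROP per clock (standard assumptions, not stated in the article):

```python
# Theoretical throughput implied by the listed "Kong" specs.
# Assumption (not from the article): RV710-style shaders execute one
# MAD (2 FLOPs) per clock; each TMU/ROP handles one texel/pixel per clock.

def throughput(clock_mhz, sps=40, tmus=8, rops=4):
    """Return (GFLOPS, GTexels/s, GPixels/s) at the given core clock."""
    ghz = clock_mhz / 1000.0
    return (sps * 2 * ghz,  # shader math rate
            tmus * ghz,     # bilinear texture fillrate
            rops * ghz)     # pixel fillrate

for mhz in (600, 800):  # the quoted 600~800 MHz clock range
    gflops, gtex, gpix = throughput(mhz)
    print(f"{mhz} MHz: {gflops:.0f} GFLOPS, {gtex:.1f} GT/s, {gpix:.1f} GP/s")
```

At the top of the quoted clock range that works out to roughly 64 GFLOPS, in line with other RV710-class parts.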

    The GPU will not connect to the rest of the processor over HyperTransport; instead, a new interface referred to as "Onion" replaces it. Keeping with the "green" naming scheme, the GPU will use a "Garlic" memory interface to improve read/write performance and reduce latencies. Performance-wise, it's expected to perform 50% better than the RS780 chip. The addition of this GPU will step up power consumption by 5~8 W (load) and 0.4~0.6 W (idle). This is the first implementation of such a design methodology; AMD wants to step into the pool only after dipping its toes.

    Source: IT.com.cn
     
    Last edited: Aug 22, 2008
  2. From_Nowhere New Member

    Joined:
    Jun 13, 2008
    Messages:
    661 (0.28/day)
    Thanks Received:
    77
    This will be awesome for laptops and HTPC's.

    As long as it doesn't smell like an onion or a clove of garlic.
     
  3. tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,958 (2.35/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    Garlic = ring bus. This will definitely put the Intel Atom to shame, as the K8 architecture excels at lower clock speeds; moreover, the IGP is equivalent to today's low-end GPUs. It's faster than my old 9550 :(
     
  4. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,401 (11.53/day)
    Thanks Received:
    9,700
    garlic and onion? wtf?
     
  5. From_Nowhere New Member

    Joined:
    Jun 13, 2008
    Messages:
    661 (0.28/day)
    Thanks Received:
    77
    To make the vampiric Intel cry and run away? Dunno, just nicknames.
     
    Last edited: Oct 17, 2009
  6. laszlo

    laszlo

    Joined:
    Jan 11, 2005
    Messages:
    891 (0.25/day)
    Thanks Received:
    105
    Location:
    66 feet from the ground
    So the future is here already; I expected this to happen after 2010, but it seems we won't need standalone GPUs in a few years. This is bad news for Nvidia and also for ATI, because OEM-built PCs (not the expensive ones) have weak GPUs anyway, so they can spare them now...
     
  7. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,750 (11.14/day)
    Thanks Received:
    13,683
    Location:
    Hyderabad, India
    = stinks when raw, yummy when cooked.

    I'd say garlic = Sideport, so even before implementing DDR3, or on cheap boards without it, the DDR3 Sideport memory should make use of memory chips present on the motherboard.
     
  8. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,401 (11.53/day)
    Thanks Received:
    9,700
    I can see a few problems arising from this setup.

    #1. Motherboard needs support - otherwise you have no outputs
    #1.1. You're really screwed if you want more/different outputs (adding HDMI, etc.)
    #2. What will happen if you add an unsupported video card (Nvidia, for example)? Will it be able to be disabled?

    bonus#. This will be fun for itx and really tiny systems. Only one heatsink needed...
     
  9. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (8.23/day)
    Thanks Received:
    3,778
    Or alternatively, the gfx core could be used strictly as a GPGPU device. It can offload physics and things like that. They would just need to make sure its driver registers it as a co-processor instead of a GPU; that way, if you wanted to add an NV card in Vista, you could retain some sort of functionality from the GPU core in the CPU, and not have to worry about Vista disabling one of your gfx drivers.
     
  10. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,750 (11.14/day)
    Thanks Received:
    13,683
    Location:
    Hyderabad, India
    Yup.

    Why? Isn't it the same with boards with onboard graphics? Aren't you equally screwed if you need, say, HDMI and the board gives you only D-Sub? (D-Sub to DVI to HDMI via MacGyver dongles won't work; you'd need wiring on the DVI port that carries HDMI, as with ATI cards that ship with DVI-HDMI dongles.)

    The same as what happens when you use one on a 780G board: nothing. Fusion gives out video through the board's connectors; with ATI Hybrid Graphics, your monitor plugs into the connector on the board, not the card(s). So I'm not sure that on boards without display out(s) the output of Fusion would go to a graphics card (I'm just guessing).

    Yes, ITX, notebooks... only 0.4~0.6 W (for the graphics) at idle is awesome.
     
  11. tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,958 (2.35/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    #1. Fusion uses different socket anyway
    #1.1 Read above
    #2. Duh.
     
  12. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,401 (11.53/day)
    Thanks Received:
    9,700
    Hey, I never stated they were facts or anything guys, just stating the obvious really.

    Using a different socket is good, I wasn't aware of that.
     
  13. Tatty_One

    Tatty_One Senior Moderator Staff Member

    Joined:
    Jan 18, 2006
    Messages:
    16,769 (5.23/day)
    Thanks Received:
    2,557
    Location:
    Worcestershire, UK
    Yes, my thoughts... leading to the true concept of multiple hybrid GPUs in a single system, a more cost-effective way of enhancing performance... providing of course the CPUs and motherboards are not too expensive.
     
  14. tigger

    tigger I'm the only one

    Joined:
    Mar 20, 2006
    Messages:
    10,183 (3.23/day)
    Thanks Received:
    1,399
    I think these chips would need bigger coolers than a CPU-only part though; you have a hot CPU and a hot GPU in the same package.

    Nice idea though.
     
  15. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,401 (11.53/day)
    Thanks Received:
    9,700
    "The addition of this GPU will step-up power consumption by 5~8 W (load) and 0.4~0.6 W (idle)"

    When you factor in that the average CPU wattage is around 70 W (TDP) these days, it's basically irrelevant to temps.
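To put that in numbers, a quick sketch using the 70 W TDP ballpark above and the power deltas quoted from the article (the idle comparison against TDP is only illustrative, since an idle CPU draws far less than its TDP):

```python
# Share of a 70 W CPU power budget taken by the quoted GPU power deltas.
# 70 W is the ballpark TDP cited above; the deltas are from the article.

CPU_TDP_W = 70.0

def added_share(delta_w, base_w=CPU_TDP_W):
    """Return the GPU's added power as a fraction of the CPU budget."""
    return delta_w / base_w

for label, (lo, hi) in {"load": (5, 8), "idle": (0.4, 0.6)}.items():
    print(f"{label}: +{lo}~{hi} W is "
          f"{added_share(lo):.1%}~{added_share(hi):.1%} of {CPU_TDP_W:.0f} W")
```

Even the worst case adds barely a tenth of the CPU's own budget, which backs up the point about temperatures.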
     
  16. WhiteLotus

    WhiteLotus

    Joined:
    Jul 30, 2007
    Messages:
    6,551 (2.47/day)
    Thanks Received:
    857
    This looks like a very interesting method of doing things, and I think it can dramatically improve the HTPC without doing much.

    As for Nvidia and ATI losing out on discrete graphics (above^): ATI and AMD are the same company now, so ATI won't be losing out at all, and AMD will have a great selling point as well. This is something Intel can't do (I think), so AMD could be the sole provider of such chips to that market.
     
  17. laszlo

    laszlo

    Joined:
    Jan 11, 2005
    Messages:
    891 (0.25/day)
    Thanks Received:
    105
    Location:
    66 feet from the ground
    just my 2 cents

    Imagine a 22 nm chip with a dual-core CPU and an HD 4870; would you buy a discrete graphics card? Because this will happen, and we already have mobos with RAM; I don't think it's hard to put 1 GB of RAM on a mobo.
     
    dawgnbirfar says thanks.
  18. mdm-adph

    mdm-adph New Member

    Joined:
    Mar 28, 2007
    Messages:
    2,478 (0.89/day)
    Thanks Received:
    340
    Location:
    Your house.
    There pretty much always has been, and always will be, standalone video. :p There's always going to be a need for a discrete, ultra-powerful, dedicated video solution -- maybe it'll become more expensive and the domain of workstations in the future, but it'll always be there.
     
  19. wolf2009 Guest

    I love the names AMD comes up with: Venice, Brisbane, Shanghai, and now Garlic and Onion. Lol, let's see tomatoes and oranges and apples too.
     
  20. KieranD

    KieranD

    Joined:
    Aug 16, 2007
    Messages:
    8,043 (3.05/day)
    Thanks Received:
    822
    Location:
    Glasgow, Scotland
    What happens when you upgrade the CPU? Do you have to upgrade the GPU at the same time?

    What happens when the GPU is outdated but the CPU is still okay, or the opposite?

    These questions lead me to the conclusion they only want to get rid of the IGP, not GPUs altogether.
     
  21. jydie New Member

    Joined:
    Feb 2, 2006
    Messages:
    209 (0.07/day)
    Thanks Received:
    3
    I agree with your conclusion... This will probably be used in low cost (basic) systems, laptops, Home Theater Systems, etc. If you want to do some decent gaming, then you will need a good video card. The way I see it, this simply moves the GPU for integrated graphics from the motherboard to the CPU.

    This will be great for laptops!! If you can focus most of the heat buildup to one area, then I would think that more efficient cooling methods would follow.
     
  22. PCpraiser100 New Member

    Joined:
    Jul 17, 2008
    Messages:
    1,062 (0.46/day)
    Thanks Received:
    68
    40 stream processors? That's very little.
     
  23. MrMilli

    MrMilli

    Joined:
    Mar 1, 2008
    Messages:
    216 (0.09/day)
    Thanks Received:
    35
    Location:
    Antwerp, Belgium
    AMD Fusion could well be the first CPU to feature a graphics core?
    To name a few before Fusion: Cyrix MediaGX (later AMD Geode), VIA Mark CoreFusion & Intel Timna (cancelled though)! So it's really not the first of its kind.

    Fusion will probably be a MCM chip which would imply that only the GPU will be produced at TSMC and the CPU in AMD's own fabs.

    To answer KieranD: Fusion will have PCI-E integrated, so you could add a discrete card.
     
  24. substance90

    substance90 New Member

    Joined:
    Sep 22, 2007
    Messages:
    71 (0.03/day)
    Thanks Received:
    2
    Location:
    Bulgaria
    Too bad, I don't see this getting into netbooks and UMPCs! The Via Nano is also better than the Atom, but still, do you know a single device using the Nano?
     
  25. Cuzza

    Cuzza New Member

    Joined:
    Jul 3, 2007
    Messages:
    1,318 (0.49/day)
    Thanks Received:
    207
    Location:
    New Zealand
    That's what I'm talking about! MacGyver can do anything! Sorry, this is totally off topic.
     
    dawgnbirfar says thanks.
