
ATI May Be Considering Multi-Die GPUs for R700 Family

Discussion in 'News' started by Polaris573, Dec 6, 2007.

  1. Polaris573

    Polaris573 Senior Moderator

    Joined:
    Feb 26, 2005
    Messages:
    4,281 (1.21/day)
    Thanks Received:
    718
    Location:
    Little Rock, USA
    After being late to market with high-performance graphics offerings a number of times, ATI, the graphics product group of Advanced Micro Devices, is reportedly considering high-end graphics solutions that use two or, perhaps, even more physical dice. The approach has been used successfully by Intel Corp., but will it be feasible for graphics processors too?

    The ATI Radeon HD 2900 (R600) graphics chip, which contains about 700 million transistors, had a power consumption of 160W or more, yet still failed to match the performance of the Nvidia GeForce 8800 GTX, a solution that also demands a large amount of power and is rather expensive to manufacture. The ATI Radeon HD 3800 (RV670) graphics processing unit, by contrast, is made on a 55nm process and offers the same horsepower as the R600, but is cheaper to build and consumes less energy. While two such RV670 chips would still consume quite a lot of power, they could offer performance and features that were not available before, without the need to develop a chip with about 1.3 billion transistors, a transistor count that would require a very fine process technology to keep the GPU cheap enough to manufacture, and quite a lot of time to design and verify.

    It is projected that the ATI Radeon HD 3800 X2, a graphics board running two ATI RV670 processors, will be announced at the Consumer Electronics Show or at another time early next year. While this graphics card will be the first multi-GPU consumer board developed by the former ATI Technologies in years, it seems that multi-GPU is the future, at least for AMD's graphics product group (GPG).

    Little is known today about the products code-named ATI R700, but, according to an article on the PC Watch web-site, the next generation of graphics solutions from AMD may use a multi-chip module (MCM) concept instead of a multi-chip GPU concept, at least in the high-end. Even though both approaches have drawbacks compared to single-chip solutions, some issues are easier to solve with a homogeneous MCM. Intel Corp., the world's largest maker of chips, puts two physical dice, each containing two processing engines, onto a single piece of substrate to create quad-core central processing units. This boosts Intel's yields, as a monolithic quad-core microprocessor would have a larger die and would be more expensive to manufacture, according to Intel.

    It is rumored that AMD's GPG may think the same way and cease developing large GPUs, concentrating instead on making smaller chips that work efficiently together. But ATI/AMD MCM graphics solutions would not be similar to Intel's MCM CPUs. Instead of using an external bus to connect the two dice, a special chip-to-chip interface (or high-speed link) is expected to be used, which should improve performance when the dice work together. It is also reported that AMD's GPG will attempt to use shared memory on its multi-die ATI R700 graphics solutions and rely on the link between the GPUs to organize access to each other's memory pools. The GPU dice are projected to be able to enter an idle mode when their computing power is not needed, thus reducing power consumption.
    At present, graphics processors in a multi-GPU configuration, or on a multi-chip graphics board, communicate using special multi-GPU interfaces, dubbed ATI CrossFire and Nvidia SLI, or via the PCI Express bus. While the bandwidth of CrossFire and SLI is believed to be relatively low, the bandwidth of a PCI Express 2.0 x16 bus is 8GB/s in each direction, still well below graphics cards' memory bandwidth, which can exceed 100GB/s. However, chip-to-chip interfaces like Rambus' FlexIO can provide speeds of 76.8GB/s (32GB/s read and 44.8GB/s write) and beyond, so the problem of the chip-to-chip interface can be solved.

    An obvious problem with homogeneous multi-die GPUs, however, is that a high-end GPU consisting of two dice will nearly always be about two times faster than a performance-mainstream GPU with one die. Of course, AMD's GPG could still sell lower-clocked two-die GPUs, but it is not obvious that such a solution would be viable from a financial standpoint. With a large gap between the price and performance of single-chip and dual-chip products, it may therefore be hard to form a comprehensive graphics card lineup that covers all price and performance segments. If AMD decides to install four homogeneous chips/dice on a high-end graphics card, three on a performance-mainstream board, two on a mainstream solution, and one for the low-end, its driver team will have to spend a substantial amount of time tweaking each video game for single-, dual-, triple- and quad-GPU/die graphics sub-systems, a task that neither ATI nor Nvidia has so far been truly successful at. The performance of all modern [homogeneous] multi-GPU solutions depends on drivers, and if the driver does not recognize an application, the performance of a dual-, triple- or quad-GPU graphics solution may be no better than that of a single-chip graphics card.
    Theoretically, ATI Catalyst driver developers could force so-called alternate frame rendering (AFR), a multi-GPU rendering technique, for all unknown applications in CrossFire configurations, but this may add lag in numerous games.

    While homogeneous graphics solutions have been widely discussed in recent years, the first successful consumer 3D graphics accelerators, as well as professional 3D boards for workstations, used heterogeneous multi-chip architectures until fairly recently, with all (or nearly all) the chips on board having different functionality. Maybe this is the way to go? Nowadays, different units within a graphics processor have different memory bandwidth requirements; moreover, certain parts of the chip do not communicate with others at all. As a result, a heterogeneous multi-GPU board, or a heterogeneous multi-die GPU, may become a feasible alternative to ultra-complex single-chip GPUs and to triple-/quad-die homogeneous multi-GPU/multi-die solutions. Unfortunately, heterogeneous multi-GPU/multi-die solutions will almost certainly not be viable for mainstream and entry-level graphics cards, where price matters a great deal. Thus, ATI/AMD would have to develop single-chip solutions for the price-sensitive segments of the market and heterogeneous multi-chip/multi-die graphics products for those who demand ultimate performance. In both cases, the gap between the price and performance of mainstream and high-end graphics sub-systems is likely to be fairly large.

    In the end, both homogeneous and heterogeneous multi-GPU/multi-die GPUs have their advantages and disadvantages. So maybe a single-chip high-performance graphics card still has a reason to live?
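    The bandwidth figures quoted above can be put side by side with a quick back-of-the-envelope sketch. The numbers come from the article itself; the ratio is simple arithmetic, and the "typical memory bus" figure is just the article's ">100GB/s" ballpark, not a measurement:

```python
# Interconnect bandwidths quoted in the article, in GB/s.
links = {
    "PCI Express 2.0 x16 (per direction)": 8.0,
    "Rambus FlexIO (32 read + 44.8 write)": 32.0 + 44.8,
    "High-end GPU memory bus (ballpark)": 100.0,
}

pcie = links["PCI Express 2.0 x16 (per direction)"]
flexio = links["Rambus FlexIO (32 read + 44.8 write)"]

# FlexIO offers 76.8 / 8 = 9.6x the per-direction bandwidth of
# PCIe 2.0 x16, which is why a dedicated chip-to-chip link makes
# shared memory between GPU dice look practical.
print(f"FlexIO / PCIe 2.0 x16: {flexio / pcie:.1f}x")  # prints 9.6x
```

    Even FlexIO falls short of the memory bus itself, which is presumably why the R700 rumor pairs the link with shared memory pools rather than full remote-memory traffic.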

    Officials for ATI, the graphics product group of AMD, did not comment on the news story.

    Source: X-bit Labs
     
    Grings says thanks.
  2. phanbuey

    phanbuey

    Joined:
    Nov 13, 2007
    Messages:
    5,205 (2.05/day)
    Thanks Received:
    975
    Location:
    Miami
    WOOOOOW... so after touting its TRUE QUAD CORE for soooooo loooong, AMD is now making Intel-esque (minus the bus) GPUs. Hey, if you can't beat 'em, join 'em... no?

    Always thought this was a great idea... hopefully it will work - I would ditch my 8800GT for a 3870x2 or x4 without blinking... I really like ATI's IQ.
     
    Last edited: Dec 6, 2007
  3. OrbitzXT

    OrbitzXT New Member

    Joined:
    Mar 22, 2007
    Messages:
    1,969 (0.71/day)
    Thanks Received:
    59
    Location:
    New York City
    That is one giant ass wall of text...
     
  4. Grings

    Grings New Member

    Joined:
    Nov 16, 2006
    Messages:
    2,303 (0.79/day)
    Thanks Received:
    184
    Location:
    Blighty
    Nice, I always thought a true multi setup (as opposed to SLI- or CrossFire-style links) was more viable; hopefully it scales a lot better.

    And nice one Polaris, good to see a long, more detailed article (personally I'd like to see more, though I imagine that's easier said than done, given the brief nature of most source articles/press releases etc.)
     
  5. sam0t New Member

    Joined:
    Oct 5, 2007
    Messages:
    99 (0.04/day)
    Thanks Received:
    4
    Yeah, tell me about it, I only needed to glance at it to get the feeling "not going to read that". Anyway, multi-core cards are welcome, if they manage to keep the power consumption at decent levels.
     
  6. marsey99

    marsey99

    Joined:
    Jul 18, 2007
    Messages:
    1,592 (0.60/day)
    Thanks Received:
    299
    Ty Pol, that's some news story. :toast:

    Do you think AMD/ATI can feel that Intel is also going to launch a real GPU and really put them between a big green rock and a hard place? :nutkick:

    This is the future of GPUs, and it's only a matter of time before Nvidia follows suit too. I just hope they all can get their drivers to work better for these than they do for CrossFire/SLI. :banghead:
     
  7. largon

    Joined:
    May 6, 2005
    Messages:
    2,780 (0.80/day)
    Thanks Received:
    432
    Location:
    Tre, Suomi Finland
    This has been publicly known for over a year.
     
  8. Neohazard New Member

    Joined:
    Sep 5, 2006
    Messages:
    237 (0.08/day)
    Thanks Received:
    0
    Location:
    Brasil - São Paulo - São Caetano do Sul
    They are very smart to develop a multi-GPU with low cost and small dies, but wasn't a PPU supposed to be coming to graphics cards too? I want more realism in games; maybe we can expect more from ATI in that area too, right? :rockout:
     
  9. Dixxhead

    Dixxhead New Member

    Joined:
    Sep 13, 2007
    Messages:
    214 (0.08/day)
    Thanks Received:
    18
    I just hope this time the multi-GPU thing really manages to emerge. The last time someone tried going multi-GPU was with the Voodoo 5, and we all know what happened to 3Dfx (damn the missing hardware T&L; other than that, the Voodoo 5 series was kick-ass :p)
     
  10. tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,958 (2.36/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    I would expect AMD to make something like a "true dual-core" GPU, something like the Phenom. I hope AMD makes the card function just like a single card and NOT like two cards in CrossFire. It would be extremely annoying if it were treated as two cards in CrossFire.
     
  11. hacker111

    hacker111 New Member

    Joined:
    Nov 6, 2007
    Messages:
    332 (0.13/day)
    Thanks Received:
    1
    Location:
    MA United States
    I already knew this.. lol
     
  12. TUngsten

    TUngsten

    Joined:
    Nov 24, 2006
    Messages:
    1,044 (0.36/day)
    Thanks Received:
    64
    Location:
    CT, USA
    This is such a better idea than X-fire/SLI.
     
  13. mdm-adph

    mdm-adph New Member

    Joined:
    Mar 28, 2007
    Messages:
    2,478 (0.89/day)
    Thanks Received:
    340
    Location:
    Your house.
    Hey, the Voodoo 5 -- my brother had one of those in his Macintosh, just sold it finally last year to some collector. I think he had to mortgage his house to get it back in the day.

    Though, what do you mean about the last time someone tried a multi-GPU approach? What about the 7950GX2?
     
  14. Scrizz

    Scrizz

    Joined:
    Aug 22, 2007
    Messages:
    2,950 (1.13/day)
    Thanks Received:
    408
    Location:
    Florida, US
    i want one now!
    make it two :rockout:
     
