
AMD Radeon HD 5970 Specs Surface

Discussion in 'News' started by btarunr, Nov 11, 2009.

  1. Jizzler

    Jizzler

    Joined:
    Aug 10, 2007
    Messages:
    3,607 (1.24/day)
    Thanks Received:
    715
    Location:
    Geneva, FL, USA
    So we might see Eyefinity Crossfire support in the next set of drivers?
     
  2. gaximodo

    gaximodo

    Joined:
    Sep 7, 2008
    Messages:
    278 (0.11/day)
    Thanks Received:
    72
    Location:
    Canberra Australia
    :roll: It will run at the 5870's clocks only if they make it 15" long.
     
  4. theorw

    theorw New Member

    Joined:
    Jul 5, 2007
    Messages:
    771 (0.26/day)
    Thanks Received:
    50
    Location:
    Athens GREECE
    I wanna see someone hard volt-mod this card!!!!
    It will be interesting!! I bet you'll need to solder 12 volts directly to the PCB!!! :D:p:nutkick:
     
  5. Weer New Member

    Joined:
    Aug 15, 2007
    Messages:
    1,417 (0.49/day)
    Thanks Received:
    94
    Location:
    New York / Israel
    I'm not going to blame you, because you haven't been here long enough, but it is actually AMD that started this entire fiasco. Not just by changing the names, but mostly by how quickly they pushed forward to the next series, largely to generate hype. nVidia just had to follow suit.

    It's like this:

    DAAMIT

    ATI - X-series - Spring 2004
    ATI - X1k-series - Fall 2005 [+ 1.5 Years]
    AMD - HD2k-series - Spring 2007 [+ 1.5 Years]
    AMD - HD3k-series - Fall 2007 [+ 0.5 Years]
    AMD - HD4k-series - Spring 2008 [+ 0.5 Years]
    AMD - HD5k-series - Fall 2009 [+ 1.5 Years]

    Highlight: HD 2900 XT -> 6 Months -> HD 3870 -> 6 months -> HD 4870

    nVidia

    nVidia - 6000-series - Spring 2004
    nVidia - 7000-series - Spring 2005 [+ 1 Year]
    nVidia - 8000-series - Fall 2006 [+ 1.5 Years]
    nVidia - 9000-series - Winter 2007/2008 [+ ~2 Years]
    nVidia - GTX 200-series - Spring 2008 [+ 0.5 Years]
    nVidia - GTX 300-series - Fall 2009 [+ 1.5 Years]

    Highlight: 8800 GT/GTS 512 -> 2 Months -> 9800 GT/GTX -> 4 Months -> GTX 280

    So, as you can see, the healthy interval between new series from either graphics card manufacturer is 1.5 years. The unhealthy intervals are 0.5 years and 2 years.
    After AMD bought and merged with ATI, they failed to deliver a solid-performing chip in the R600. So, in order to compete with nVidia, they needed hype, and they got it by cycling through two series in a single year. What should have been the HD 2950, etc. was instead named HD 3850, as part of the new and completely fraudulent HD3k series.
    Then nVidia got wind of this and needed a move to match the hype. So they used the exact same GPU from the 8000 series, the G92, in the 9000 series, which was even worse than what AMD was doing, because nVidia was blatantly re-marketing its product under a superior name solely to garner hype. Thus, they too jumped through two series in roughly the same amount of time.
    In the end, AMD pulled themselves up by the trousers and fashioned an actually competitive, and new, GPU, which started the HD4k series and lasted the healthy 1.5 years. nVidia again followed suit with the GTX 200-series, which will also last 1.5 years.
    So, in the meantime, all is well in the graphics card kingdom, and the terror of the HD3k and 9000 series is forgotten. But who knows when these big companies will again try to trick us, too scared, in this almost childish mindset, to lose any piece of market share.
    All I can say is men like me will be here to enlighten the masses and protect the commoners.
     
    Last edited: Nov 11, 2009
  6. MadClown

    MadClown New Member

    Joined:
    Jun 24, 2008
    Messages:
    1,362 (0.52/day)
    Thanks Received:
    108
    Location:
    NY, the state where you can't defend yourself
    There's my card.
     
  7. niko084

    niko084

    Joined:
    Dec 5, 2006
    Messages:
    7,636 (2.41/day)
    Thanks Received:
    729
    No, the card is its own subsystem.
    And although it has 4 GB of RAM, only 2 GB is probably usable; just as with two 2 GB cards in CrossFire, you only really have 2 GB of video RAM for all practical purposes.
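
    As a rough illustration of why (a minimal Python sketch with made-up per-GPU sizes, not AMD's actual memory layout): under alternate-frame rendering each GPU keeps its own full copy of the textures and buffers, so the pool a game can use is one GPU's share, not the sum printed on the box.

        def effective_vram(gb_per_gpu: float, gpu_count: int, mirrored: bool = True) -> float:
            """VRAM a game can actually use, in GB."""
            total = gb_per_gpu * gpu_count
            # Under AFR each GPU holds a full copy of the working set,
            # so the usable pool is capped at one GPU's share.
            return gb_per_gpu if mirrored else total

        print(effective_vram(2.0, 2))         # 2.0 -> what the application can use
        print(effective_vram(2.0, 2, False))  # 4.0 -> what the sticker on the box adds up to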

    This card is going to be massive and heavy.
    Get ready to build your braces!
     
  8. niko084

    niko084

    Joined:
    Dec 5, 2006
    Messages:
    7,636 (2.41/day)
    Thanks Received:
    729
    You have it all wrong...

    ATI was releasing new technology, new cards, new cores.
    Nvidia was just doing a small die shrink and renaming the 8800 GT to the 9800 GT.

    Calling what should be a 5870 X2 a 5890 is a bit stupid.
    But they did not take a 2600 and rename it a 3600.
     
  9. Binge

    Binge Overclocking Surrealism

    Joined:
    Sep 15, 2008
    Messages:
    6,981 (2.78/day)
    Thanks Received:
    1,753
    Location:
    PA, USA
    I foresee some strangely high amount of suck coming from this card! It will probably beat the 5870 under load by a good margin, but it will probably be the same as two 5870s in CrossFire at idle. This isn't good for the 5970.
     
  10. wolf

    wolf Performance Enthusiast

    Joined:
    May 7, 2007
    Messages:
    5,547 (1.84/day)
    Thanks Received:
    847
    A wave of disappointment washes over... Dual 5850s with all 1600 SPs is not what I wanted from this card. *sigh*

    Seems like it will actually let a 5870 down in trifire.
     
  11. ToTTenTranz

    ToTTenTranz New Member

    Joined:
    Sep 8, 2009
    Messages:
    865 (0.40/day)
    Thanks Received:
    167
    Location:
    Porto
    That makes no sense at all! The memory subsystem is handled by the GPU itself, not the CPU.

    Only if the PC used a UMA architecture like the consoles (or systems with IGPs) could that constraint exist.
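
    To put numbers on the distinction (a toy Python sketch; the RAM size and carve-out are made up): on a UMA system the GPU's frame buffer is carved out of the same RAM the CPU uses, whereas a discrete card's VRAM is its own pool and costs the CPU nothing.

        SYSTEM_RAM_GB = 4.0

        def cpu_visible_ram(uma: bool, gpu_carveout_gb: float = 0.5) -> float:
            # UMA (consoles, IGPs): the GPU's frame buffer comes out of system RAM.
            # Discrete card: VRAM is a separate pool on the card itself.
            return (SYSTEM_RAM_GB - gpu_carveout_gb) if uma else SYSTEM_RAM_GB

        print(cpu_visible_ram(uma=False))  # 4.0 -> discrete card, nothing taken from system RAM
        print(cpu_visible_ram(uma=True))   # 3.5 -> IGP/console-style shared memory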
     
  12. Binge

    Binge Overclocking Surrealism

    Joined:
    Sep 15, 2008
    Messages:
    6,981 (2.78/day)
    Thanks Received:
    1,753
    Location:
    PA, USA
    It's a valid question regardless.
     
    niko084 says thanks.
  13. gumpty

    gumpty

    Joined:
    Apr 29, 2008
    Messages:
    744 (0.28/day)
    Thanks Received:
    134
    Location:
    Auckland
    I doubt it - the article said it was an early engineering sample - they just bolted a couple of heat-sinks on so they could test it. I imagine the retail-ready piece will be like the normal stock coolers.
     
  14. niko084

    niko084

    Joined:
    Dec 5, 2006
    Messages:
    7,636 (2.41/day)
    Thanks Received:
    729
    Agreed, some people don't understand these things that well, and it's a very common assumption.

    To the point where I have heard techs at Microcenter tell customers that, hopefully just because they didn't feel like explaining the truth, but I doubt it :roll:
     
  15. ToTTenTranz

    ToTTenTranz New Member

    Joined:
    Sep 8, 2009
    Messages:
    865 (0.40/day)
    Thanks Received:
    167
    Location:
    Porto
    Nonetheless, calling it HD5900 instead of HD5800X2 makes sense to me.

    It's about time we get to have shorter names in ATI's lineup.


    In fact, I think it's about time that both ATI and nVidia drop the Radeon and Geforce brands. It's been 10 years now, we need new names!

    I still remember when nVidia was so confident about NV30 that they even publicly stated they'd drop the Geforce brand and call it something different. When the architecture turned out as badly as it did, they had to use the Geforce brand again, for marketing purposes.
     
  16. Binge

    Binge Overclocking Surrealism

    Joined:
    Sep 15, 2008
    Messages:
    6,981 (2.78/day)
    Thanks Received:
    1,753
    Location:
    PA, USA
    Nice point? With whom are you arguing?
     
  17. ToTTenTranz

    ToTTenTranz New Member

    Joined:
    Sep 8, 2009
    Messages:
    865 (0.40/day)
    Thanks Received:
    167
    Location:
    Porto
    With this post and the consequent discussion about naming schemes.
     
  18. pr0n Inspector

    pr0n Inspector

    Joined:
    Dec 8, 2008
    Messages:
    1,334 (0.55/day)
    Thanks Received:
    164
    Tell that to those who (still insisting on a 32-bit OS) lost another chunk of their 4 GB of RAM to the vRAM.
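
    Rough arithmetic behind that (a back-of-the-envelope Python sketch; the aperture sizes are assumptions, not 5970 figures): a 32-bit OS can only map 4 GB of physical addresses, and the video card's memory aperture plus other PCI reservations come out of that same 4 GB, pushing installed RAM out of reach.

        ADDRESS_SPACE_GB = 4.0   # what a 32-bit OS can map (no PAE remapping)
        INSTALLED_RAM_GB = 4.0

        mmio_reservations_gb = {
            "video card aperture": 0.5,  # assumed; grows with bigger frame buffers
            "other PCI / chipset": 0.5,  # assumed typical reservation
        }

        usable_ram = min(INSTALLED_RAM_GB,
                         ADDRESS_SPACE_GB - sum(mmio_reservations_gb.values()))
        print(f"Usable RAM on 32-bit: ~{usable_ram:.1f} GB of {INSTALLED_RAM_GB:.0f} GB installed")
        # -> ~3.0 GB; the rest of the 4 GB map is taken by device apertures.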
     
  19. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    9,039 (2.54/day)
    Thanks Received:
    1,637
    I just jizzed..........in my pants.


    Anyway, will two of my MCW-60R blocks fit?
     
    Last edited: Nov 11, 2009
    10 Million points folded for TPU
  20. 1933 Poker

    1933 Poker New Member

    Joined:
    Oct 30, 2009
    Messages:
    93 (0.04/day)
    Thanks Received:
    8
    Location:
    Australasia
    Good to know, thanks for the post! Right on! :slap:
     
  21. WarEagleAU

    WarEagleAU Bird of Prey

    Joined:
    Jul 9, 2006
    Messages:
    10,812 (3.26/day)
    Thanks Received:
    547
    Location:
    Gurley, AL
    That actually looks like two Zalman-type coolers, or it could be the DuoOrb; either way it is sexy. What I don't like is two DVI and one mini-DisplayPort. This was supposed to be a six-screen behemoth. Hell, if they're not going to make it a six-screen monster, they should have kept it at two DVI, one HDMI (which I use and I've seen monitors with; I haven't seen a DisplayPort monitor, not saying there isn't one) and one DisplayPort.
     
  22. devguy

    devguy

    Joined:
    Feb 17, 2007
    Messages:
    1,240 (0.40/day)
    Thanks Received:
    171
    Location:
    SoCal
    This may sound lame, but I would actually prefer it if they delayed the launch of the 5970 until Fermi's release. My reasoning is that this card coming out is going to further reduce the stock of available RV870 chips, and thus mean even worse shortages of the more practical 5850/5870.

    I mean, in all honesty, how many people buy these $500+ cards at launch? I know only a few who did, and when they sold them to buy a 5870, they only got around $200 (which didn't even come close to covering their 5870 cost). That is the cost of buying into new technology, sure, but I'd rather AMD focus on getting more 58xx-series cards onto the market. Plus, the 5970 coming out when Fermi does will serve as a sort of distraction for nVidia.

    And as for the clocks, nVidia did almost exactly the same thing with the GTX 295. It was a GTX 285 with the memory bandwidth of the GTX 260 and similar clocks to it, yet with the full shader count. And that became the GTX 275.
     
  23. JrRacinFan

    JrRacinFan Served 5k and counting ...

    Joined:
    Mar 17, 2007
    Messages:
    19,418 (6.34/day)
    Thanks Received:
    4,495
    Location:
    Youngstown, OH
    OK, since this is a dual-GPU card, will we see a 5900-series single-GPU card for trifire? Sorry for the rhetorical question. I just figured I'd bring up the point that if a certain person picks one of these up, and they indeed keep the naming scheme as 5970, we won't see CrossFireX with it paired with a current card. Well, at least going by the information we have today ...
     
  24. ToTTenTranz

    ToTTenTranz New Member

    Joined:
    Sep 8, 2009
    Messages:
    865 (0.40/day)
    Thanks Received:
    167
    Location:
    Porto
    What?!
    You're talking about losing memory to the IGP? But that's predictable, with or without a 64-bit OS.
     
  25. inferKNOX

    inferKNOX

    Joined:
    Jul 17, 2009
    Messages:
    921 (0.42/day)
    Thanks Received:
    123
    Location:
    SouthERN Africa
    He's talking about the rename of the die-shrunk nV 8-series into the 9-series, and the blatant rename of the 9800 GTX+ to the GTS 250. :p

    Dude-bra, that's kinda lame... :eek:

    The amount of memory on the card cannot exceed the amount in the system; thus 4 GB+ would be necessary in the system, which could only be utilised by a 64-bit OS.
    Binge, I've noticed some huge hate coming from you for anything non-nV. :wtf:
    And what happened to your specs? I saw them with your nV card one day, then just the name the next.
    Agreed totally. ;)
     
