
AMD Planning Updated Radeon HD 5670

Discussion in 'News' started by btarunr, May 18, 2010.

  1. zithe

    zithe

    Joined:
    Jun 16, 2008
    Messages:
    3,107 (1.18/day)
    Thanks Received:
    356
    Location:
    North Chili, NY
    Oh wow. Now they'll have two names for the 5770.
     
  2. Aleksander

    Joined:
    Dec 2, 2009
    Messages:
    3,261 (1.56/day)
    Thanks Received:
    310
    5670 becomes 5730 (or 6570)
    5750 - 5760
    5770 - 5780
    5830 - 5840
    5850 - 5860
    5870 - 6770 :p
    5970 - 6850 :p
     
  3. mdm-adph

    mdm-adph New Member

    Joined:
    Mar 28, 2007
    Messages:
    2,478 (0.81/day)
    Thanks Received:
    340
    Location:
    Your house.
    Hey -- now maybe the 5670 is finally worth buying. :p
     
  4. xtremesv

    xtremesv

    Joined:
    Mar 11, 2010
    Messages:
    115 (0.06/day)
    Thanks Received:
    11
What about texture units and ROPs? Are they going up as well? Right now the 5670 has 20 TUs and 8 ROPs, which limits it. If the starting point is performance similar to a 4770, then the TU and ROP counts have to increase as well, to something like 32 TUs and 16 ROPs. If that's the case, the new 5670 would be very interesting for CrossFire users. See it this way: the 5830 isn't exactly an appealing option given its price/performance ratio. I think you can get a 5830 for around $230, but you could buy two 5670s for $220 and get 1280 shaders versus 1120, 64 TUs versus 56 and 32 ROPs versus 16. That would definitely be a pretty competitive alternative, with performance fitting neatly between the 5830 and 5850 and confronting the incoming GTX 465 ($250?).
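For a rough sense of that math, here is a minimal Python sketch that doubles the specs of the hypothetical updated HD 5670 described above (640 shaders, 32 TUs, 16 ROPs and roughly $110 per card are the post's assumptions, not confirmed specs) and lines the totals up against a single HD 5830. Treating CrossFire as a straight doubling of units is optimistic, since real scaling is never perfect.

```python
# Rough spec/price tally for the scenario in the post above. The "updated
# 5670" figures (640 shaders, 32 TUs, 16 ROPs, ~$110 each) are the post's
# assumptions, not confirmed specs; the HD 5830 figures are the published
# ones. CrossFire is naively treated as a straight doubling of units.

updated_5670 = {"shaders": 640, "tmus": 32, "rops": 16, "price_usd": 110}
hd_5830      = {"shaders": 1120, "tmus": 56, "rops": 16, "price_usd": 230}

crossfire_pair = {k: v * 2 for k, v in updated_5670.items()}

for key in ("shaders", "tmus", "rops", "price_usd"):
    print(f"{key:>9}: 2x updated 5670 = {crossfire_pair[key]:4d}  vs  HD 5830 = {hd_5830[key]:4d}")
```

On paper the pair comes out ahead on every unit count while costing slightly less, which is exactly the comparison being made above.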
     
  5. $immond$ New Member

    Joined:
    Nov 30, 2009
    Messages:
    394 (0.19/day)
    Thanks Received:
    22
    Location:
    Edmonton
Two of these in CrossFire will perform like a single 5770, right?
     
  6. xtremesv

    xtremesv

    Joined:
    Mar 11, 2010
    Messages:
    115 (0.06/day)
    Thanks Received:
    11
    With current specs, yes.
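For reference, here is a minimal sketch of the theoretical numbers behind that answer, using the published specs of the current cards (HD 5670: 400 SPs, 20 TMUs, 8 ROPs at 775 MHz; HD 5770: 800 SPs, 40 TMUs, 16 ROPs at 850 MHz). Treating the CrossFire pair as a plain doubling ignores scaling losses, so real results should land a little below the pair's totals.

```python
# Theoretical throughput of two current HD 5670s vs one HD 5770, using the
# published specs. Doubling the 5670's numbers ignores CrossFire overhead.

def rates(shaders, tmus, rops, clock_mhz):
    ghz = clock_mhz / 1000.0
    return {
        "GFLOPS":   shaders * 2 * ghz,  # 2 FLOPs per stream processor per clock
        "GTexel/s": tmus * ghz,
        "GPixel/s": rops * ghz,
    }

hd5670 = rates(400, 20, 8, 775)
hd5770 = rates(800, 40, 16, 850)

for metric, value in hd5670.items():
    print(f"{metric:>9}: 2x HD 5670 = {2 * value:7.1f}   1x HD 5770 = {hd5770[metric]:7.1f}")
```

The pair lands just under the single 5770 on every metric because of the 5770's higher clock, so "roughly a 5770" is about the best case once CrossFire overhead is factored in.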
     
  7. ToTTenTranz

    ToTTenTranz New Member

    Joined:
    Sep 8, 2009
    Messages:
    865 (0.40/day)
    Thanks Received:
    167
    Location:
    Porto
    Calm down guys, the name for the new card isn't out yet.

The original news author just kept calling it HD5670 because the new card will take over its current price point.
     
  8. DrPepper

    DrPepper The Doctor is in the house

    Joined:
    Jan 16, 2008
    Messages:
    7,483 (2.69/day)
    Thanks Received:
    813
    Location:
    Scotland (It rains alot)
It will still be called 5670, AFAIK.
     
  9. Flanker

    Joined:
    Apr 8, 2010
    Messages:
    175 (0.09/day)
    Thanks Received:
    17
Awww, when I read the title I thought they were going to try their 28/32 nm transistors already :roll:
Guess I'm a bit too optimistic.
     
  10. Cheeseball

    Joined:
    Jan 2, 2009
    Messages:
    707 (0.29/day)
    Thanks Received:
    96
This is going to fail if they just increase the shader count from 400 to 640. :p They also have to increase the ROP and TU count if they want it to be worthwhile.

It's overpriced for the amount of power it currently has.
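As a back-of-the-envelope illustration of that bottleneck argument, here is a small sketch comparing the current HD 5670 with a hypothetical 640-shader version that keeps the same 775 MHz clock and the same TU/ROP counts (that configuration is an assumption for illustration, not a confirmed spec): shader throughput rises 1.6x while texture and pixel fill rates stay flat.

```python
# Back-of-the-envelope look at why a shader-only bump may not help much.
# The current HD 5670 specs (400 SPs, 20 TMUs, 8 ROPs, 775 MHz) are the
# published ones; keeping a rumored 640-SP part at the same clock with the
# same TMU/ROP counts is an assumption made purely for illustration.

def theoretical_rates(shaders, tmus, rops, clock_ghz):
    return {
        "GFLOPS":          shaders * 2 * clock_ghz,  # 2 FLOPs per SP per clock
        "GTexel/s":        tmus * clock_ghz,
        "GPixel/s (fill)": rops * clock_ghz,
    }

current = theoretical_rates(400, 20, 8, 0.775)
rumored = theoretical_rates(640, 20, 8, 0.775)  # only the shader count changes

for metric in current:
    gain = rumored[metric] / current[metric]
    print(f"{metric:<16} {current[metric]:7.1f} -> {rumored[metric]:7.1f}  ({gain:.2f}x)")
```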
     
  11. eidairaman1

    eidairaman1

    Joined:
    Jul 2, 2007
    Messages:
    14,206 (4.77/day)
    Thanks Received:
    2,072
NV did it and it worked for them, so I say why not. Instead of just 640 shaders, make it 648. Rest assured the 400-shader 5670 will be phased out. Who knows, I say those with the 400-shader models should try to see if they can unlock the 240 extra shaders.
     
  12. Imsochobo New Member

    Joined:
    Feb 19, 2009
    Messages:
    514 (0.22/day)
    Thanks Received:
    35
    Location:
    I live in Norway, in the province Buskerud.

Because Nvidia is selling off years-old tech under a new name.
ATI never does that with dedicated hardware, although the IGPs get a rename every now and then with minor changes.
It goes: "Hey, I've got an 8800 GT." "Oh yeah? I've got a 9800 GT." "Ah yeah? I've got a GT240." While they're all the same, I've come across people wanting to upgrade from an 8800 GT to a 9800 GT; that's the problem!
If you don't understand that, you've got a problem.
This one can't be complained about "much", although 5690 would be a welcome name.
The Mobility 5870 and so on can be complained about, but at least it's new tech, unlike Nvidia and their "GTX260M" that's really an 8800 GT.

The fuss is all about the fact that Nvidia renamed one die more times than ATI has done in their lifetime... that's the reason for all this hate.
G92, G80 and so on are technological marvels, but I hate them for the sole reason that they ripped people off, upgrading from an 8800 GT to a 9800 GT.
     
    Last edited: May 19, 2010
  13. Aleksander

    Joined:
    Dec 2, 2009
    Messages:
    3,261 (1.56/day)
    Thanks Received:
    310
Yes, Nvidia is like Intel: four chipsets with different names which all do the same things.
But ATI is the first to "construct" the chip, and then Nvidia copies from ATI's technology while making some innovations. (AnandTech mentioned it and it is true.)
This time Nvidia killed itself, and that makes me happy, because at least ATI surely won't do the same thing with its chips. People who buy ATI know about hardware, while Nvidia is about publicity, to be precise. That is why more people bought Nvidia: people who don't know about cards and only want to spend money on games. All the big brands like Acer, Dell etc. are switching to ATI.
Why? Because it is better for less money. The GTX470 is a card that was "invented" to absorb money. That is why they ran into the problem ATI had with the 4770, something they didn't realize with the GT240 or the other stupid cards they sold. So ATI has the crown now and wants it until the end of the year, because it deserves to be the best.
     
  14. wahdangun

    wahdangun New Member

    Joined:
    Oct 2, 2008
    Messages:
    1,512 (0.60/day)
    Thanks Received:
    114
    Location:
    indonesia ku tercinta
    ^
    ^
Hmm, are you sure? Can you give me a link showing that Nvidia stole ATI's design (regarding chipsets)?
     
  15. Meaker New Member

    Joined:
    May 27, 2005
    Messages:
    52 (0.01/day)
    Thanks Received:
    6
With tweaking, 2x 5670 will overtake a 5770 (and that includes OCing the 5770).
     
  16. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    21,581 (6.05/day)
    Thanks Received:
    7,485
    Yeah...not so much.

The 8800GT had been out for about 9 months when the 9800GT came out, a tad shy of the two years that would validate your claim of "years old tech".

    The 8800GS was out for a whole 3 months before the 9600GSO came out, also a tad shy of validating your claim.

    The 9800GTX+ was out for 6 months before the GTS250 came out, again a tad shy of validating your claim.

Now, you can try to make the argument that really the GTS250 was the same as the 9800GTX, and many do, but they use different GPUs: one 65nm and one 55nm. The GPUs did progress; a die shrink is a progression.

In terms of actual tech, in the form of new features, nVidia didn't really do anything on that front. They just tweaked what they already had to make it slightly better. There wasn't much need to add any new features. ATi did the same, with pretty much every GPU from the HD3800 and HD4800 series being based on R600. RV670 was R600 tweaked and shrunk, and RV770 was RV670 with a GDDR5 memory controller (because GDDR4 was dead) and more shaders.

And really GT200 was just G92 with more shaders, so there really wasn't a reason to redesign a mid-range GPU. If nVidia had, they would have just released a GPU that was weaker than G92 already was. ATi on the other hand did re-design their mid-range GPU, and sure enough it was weaker than RV670. I actually would have preferred they just use RV670 to fill the mid-range; we would have seen better cards.

    Sure they do, I've given two examples already.


I guess. Then again, I've seen people wanting to upgrade from an HD3850/70 to an HD4670. I'm guessing the people wanting to upgrade from an 8800GT to a 9800GT are the same ones that would upgrade from an HD3850 to an HD4670. Guess which upgrade is better... yep, the 9800GT, because at least they would get the same performance and not less.

    I understand it, it just isn't a problem created by renaming as much as it is a problem created by consumer stupidity. And the problem wouldn't have been any different if nVidia had released an entirely different cored card with the same performance and price points and called it the 9800GT.


    Actually, HD5690 is just as bad of a name. I've already covered why.


    Man, nVidia's mobile market and naming schemes have been completely jacked the fuck up...don't even get me started.

At least performance-wise, they were close though, as a GTX260M does compete pretty well with the Mobility HD4850, just like the desktop counterparts. Largely thanks to ATi horribly underclocking the mobile HD4850.

Not really; the 8800GT and 9800GT used the same G92 die, but the GTS240 doesn't. (And by the way, the GT240 uses a totally different die from all three.)

What is the reason? You haven't made a valid point.

    And I hate that people upgrading from an HD3850 to an HD4670 got "ripped off"...though I wouldn't call it ripped off actually. If they are too stupid to do the research and find out how something performs before they buy it, then they aren't getting ripped off when they get the same performance, they are getting lucky.
     
  17. wahdangun

    wahdangun New Member

    Joined:
    Oct 2, 2008
    Messages:
    1,512 (0.60/day)
    Thanks Received:
    114
    Location:
    indonesia ku tercinta
First of all, if ATI had just kept using the same RV670 die for the mid-to-low HD 4xxx cards, that would not have been progress: they would still have had broken AA (and you know how bad it is when AA is broken), high power consumption and so on, and that would have been bad for the tech industry (low power is exactly why ATI cards became so popular with the HTPC crowd).

And people were more confused by Nvidia's naming scheme, because people usually associate performance with the second digit and newer tech with the first one (I know because I have a computer store; even my friend's store said that the 9800 GT is more powerful than the 8800 GT).

So the chance of people upgrading from an 8800 GT to a 9800 GT is higher than from an HD 3850 to an HD 4670.

So in the case of Nvidia, you must research the card you are looking for.

And people who upgrade from an HD 3850 to an HD 4670 either want a quieter card (for use in an HTPC) or just don't have a clue what a graphics card is.

After all, the tech industry is based on innovation.
     
    ToTTenTranz says thanks.
  18. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    21,581 (6.05/day)
    Thanks Received:
    7,485
    The RV670 would have still outperformed the RV730, even with AA on.

And power consumption? It was something like 20W between the two... hey, I think that might actually be worse than the difference between the 9800GTX and GTS250 (when they're clocked the same, that is).

    Yeah, I know, ATi's naming is always so easy. It is almost like going from an HD2900XT to an HD3870. It is so easy to tell that you will get better performance. I mean, it is the new generation, and the highest card available! Wait...it doesn't give better performance...oh.

    I know, that logic makes perfect sense, and always works with new GPUs. Hey, you can even look at nVidia. Look at the 8800GTX, and the 9800GTX which uses a totally different and new GPU. The 9800GTX has to be faster, right? Oh...it isn't...damn...

It's cool though, it's fine to buy without doing any research, the names of the cards tell us all... :shadedshu

And now we'll have people buying an HD5670 expecting to CrossFire it with another HD5670, only to get it home and find out they can't.

The same people that don't have a clue what a graphics card is are the same ones you say are stupid enough to upgrade from an 8800GT to a 9800GT.

I know, the quieter card was why I upgraded from an HD3850 to an HD4670... oh damn... that little stock fan on the HD4670 is one of the loudest on the market when idle (how my HTPC sits all the time)... that sucks... :banghead:

Hmm... turns out it is hard to say anything about a card, be it performance, power draw, or fan noise, without actually doing some research. The name really tells you nothing. So I have no sympathy for those who buy without doing research, and they usually end up worse off for their ignorance.

All of that being said, I actually do hate how nVidia handled the 8800GT to 9800GT transition. I was really excited when I first heard of the 9800GT and the original plans for it: basically the 8800GT on a 9800GTX PCB, allowing for Tri-SLI with 9800GTs, which would have been sweet, and the better power setup would have allowed higher clocks. But the AIB partners fought nVidia to get them to just re-use the 8800GT PCB, which kept the card exactly the same.

    And I've really had an issue with nVidia's naming scheme as of late. I've said it before, and I'll say it again; G92 should have been the 8850 series. So:

    8800GT/9800GT = 8850GT
    8800GTS 512MB = 8850GTS
    9800GTX = 8850GTX

Then the G92b variants would add the + symbol (and still allow for SLI with the older version).

    8850GT+
    8850GTS+
    8850GTX+

    Then the GT200 cards should have been the 9800 series. So:

    GTX260 = 9800GT
    GTX260 Core 216 = 9800GTS
    GTX280 = 9800GTX

    Then the GT200b cards should have also had the + symbol added:

    GTX260 Core 216 = 9800GTS+
    GTX275 = 9800GSO+
    GTX285 = 9800GTX+

    That just makes more sense to me...
     
    Last edited: May 20, 2010
  19. eidairaman1

    eidairaman1

    Joined:
    Jul 2, 2007
    Messages:
    14,206 (4.77/day)
    Thanks Received:
    2,072
The 3870 came about due to the 2900XT's heat and power draw, basically what the GF 480 is facing today. Rest assured NV will release a fix for the 480 under the name 485, or scrap the GF 400 line and go straight to 500. On another point, my card still beats the 8600, 2600 and 3600 series in performance across the board, despite being considered obsolete.
     
  20. Aleksander

    Joined:
    Dec 2, 2009
    Messages:
    3,261 (1.56/day)
    Thanks Received:
    310
It is not that it steals the design; it just waits for ATI to jump first and then Nvidia makes its move, but normally it watches as ATI "jumps" and then makes some improvements.
I can't remember the link, but it was in a thread here on TPU and it was on the anandtech.com website.
     
  21. Aleksander

    Joined:
    Dec 2, 2009
    Messages:
    3,261 (1.56/day)
    Thanks Received:
    310
  22. DrPepper

    DrPepper The Doctor is in the house

    Joined:
    Jan 16, 2008
    Messages:
    7,483 (2.69/day)
    Thanks Received:
    813
    Location:
    Scotland (It rains alot)
    No
     
  23. Aleksander

    Joined:
    Dec 2, 2009
    Messages:
    3,261 (1.56/day)
    Thanks Received:
    310
When will this card come out, guys? I am waiting so damn much for it :(
     
