Tuesday, May 18th 2010

AMD Planning Updated Radeon HD 5670

AMD is planning a major update for the ATI Radeon HD 5670 mainstream graphics card. The current card is based on the 40 nm Redwood core, with 400 stream processors and 512 MB or 1 GB of GDDR5 memory across a 128-bit wide memory interface. The updated HD 5670 will instead be based on the larger Juniper core (on which the HD 5750 and HD 5770 are based), with 640 of its 800 stream processors enabled, while retaining the 128-bit GDDR5 memory interface clocked at 1000 MHz (4 GHz effective).
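
For reference, the memory figures above work out to the following peak bandwidth; a minimal back-of-the-envelope sketch using only the numbers stated in the article:

```python
# Peak memory bandwidth implied by the article's figures. GDDR5 is quad
# data rate, so a 1000 MHz memory clock moves data at 4000 MT/s per pin.
bus_width_bits = 128
memory_clock_mhz = 1000
effective_mtps = memory_clock_mhz * 4           # 4000 MT/s ("4 GHz effective")

bandwidth_gb_s = (bus_width_bits / 8) * effective_mtps / 1000
print(f"Peak memory bandwidth: {bandwidth_gb_s:.1f} GB/s")  # -> 64.0 GB/s
```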

The core will be clocked at 750 MHz. Since the nomenclature remains the same, it is safe to assume that the price will remain unchanged: around $90 for the 512 MB variant and $110 for the 1 GB variant. Leading AIB partners such as PowerColor and Sapphire appear to be ready with board designs that are essentially identical to those of their Radeon HD 5700 series products. Unlike the reference HD 5770, the new HD 5670 will be able to make do without an additional PCI-Express power input. Looking purely at the specifications, the new HD 5670 should perform on par with the Radeon HD 4770 in present applications, with the added benefit of DirectX 11 and hardware tessellation support. With CrossFireX connectors on some designs, these cards will be able to pair in configurations of more than two. There is no word yet on availability.
Source: inpai.com.cn

47 Comments on AMD Planning Updated Radeon HD 5670

#26
GSquadron
5670 becomes 5730 (or 6570)
5750 - 5760
5770 - 5780
5830 - 5840
5850 - 5860
5870 - 6770 :P
5970 - 6850 :P
Posted on Reply
#27
mdm-adph
Hey -- now maybe the 5670 is finally worth buying. :P
Posted on Reply
#28
xtremesv
What about texture units and ROPs? Are they going up as well? Right now the 5670 has 20 TUs and 8 ROPs, which limits it. If the starting point is performance similar to a 4770, then the TU and ROP counts have to increase, to something like 32 TUs and 16 ROPs. If that's the case, the new 5670 would be very interesting for CrossFire users. See it this way: the 5830 isn't exactly an appealing option given its price/performance ratio. I think you can get a 5830 for around $230, but you could buy two 5670s for $220 and get 1280 shaders versus 1120, 64 TUs versus 56, and 32 ROPs versus 16. That would definitely be a pretty competitive alternative, with performance fitting between the 5830 and 5850, confronting the incoming GTX 465 ($250?).
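
A quick tally of those paper specs; the per-card TU and ROP counts are the speculation above, and the doubled totals assume ideal CrossFire scaling:

```python
# Tally of the speculated specs from the post above: two new HD 5670s in
# CrossFire versus one HD 5830. The 32 TU / 16 ROP figures per 5670 are
# speculation; the totals assume perfect 2x CrossFire scaling, which real
# games never reach.
hd5670_new = {"shaders": 640, "TUs": 32, "ROPs": 16, "price_usd": 110}
hd5830 = {"shaders": 1120, "TUs": 56, "ROPs": 16, "price_usd": 230}

crossfire_total = {key: value * 2 for key, value in hd5670_new.items()}
print("2x HD 5670 (CF):", crossfire_total)  # 1280 shaders, 64 TUs, 32 ROPs, $220
print("1x HD 5830:     ", hd5830)           # 1120 shaders, 56 TUs, 16 ROPs, $230
```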
Posted on Reply
#30
xtremesv
$immond$: 2 of these in CrossFire will perform like a single 5770, right?
With current specs, yes.
Posted on Reply
#31
ToTTenTranz
Calm down guys, the name for the new card isn't out yet.

The original news author just kept calling it HD5670 because the new card will take over its current price point.
Posted on Reply
#32
DrPepper
The Doctor is in the house
ToTTenTranz: Calm down guys, the name for the new card isn't out yet. The original news author just kept calling it HD5670 because the new card will take over its current price point.
It will still be called 5670, AFAIK.
Posted on Reply
#33
Flanker
Aww, when I read the title I thought they were going to try their 28/32 nm transistors already :roll:
Guess I'm a bit too optimistic.
Posted on Reply
#34
Cheeseball
Not a Potato
This is going to fail if they just increase the shader count from 400 to 640. :P They also have to increase the ROP and TU counts if they want it to be worthwhile.

It's overpriced for the amount of power it currently has.
Posted on Reply
#35
eidairaman1
The Exiled Airman
Cheeseball: This is going to fail if they just increase the shader count from 400 to 640. :P They also have to increase the ROP and TU counts if they want it to be worthwhile. It's overpriced for the amount of power it currently has.
NV did it and it worked for them, so I say why not. Instead of just 640 shaders, make it 648. Rest assured, the 400-SP 5670 will be phased out. Who knows; I say those with 400-SP models should try to see if they can unlock the 240 extra shaders.
Posted on Reply
#36
Imsochobo
newtekie1: I never got why nVidia caught so much hell. They both do this kind of shit, and it is completely idiotic. ATi was doing similar last generation too, but no one seemed to care, it was all "nVidia is evil" talk.
Because Nvidia is selling off years-old tech under a new name.
ATI never does it to dedicated hardware, although the IGPs get a rename every now and then with minor changes.
Like: hey, I got an 8800 GT; oh yeah, I got a 9800 GT; ah yeah? I got a GT240. While they're all the same, I've come across people wanting to upgrade from an 8800 GT to a 9800 GT. That's the problem!
If you don't understand that, you've got a problem.
This one can't be complained about "much", although 5690 would be a welcome name.
The Mobility 5870 and so on can be complained about; at least it's new tech, unlike Nvidia and their "GTX260M" that's an 8800 GT.

The fuss is all about Nvidia having renamed one die more times than ATI has in their lifetime... That's the reason for all this hate.
G92, G80 and so on are technological marvels, but I hate them for the sole reason that they ripped people off when upgrading from the 8800 GT to the 9800 GT.
Posted on Reply
#37
GSquadron
Yes, Nvidia is like Intel: four chipsets with different names, which all do the same things.
But ATI is the first to "construct" the chip, and then Nvidia copies it from ATI's technology while making some innovations (AnandTech mentioned it, and it is true).
This time Nvidia killed itself, and that makes me happy, because at least ATI surely won't do the same thing with its chips. People who buy ATI know about hardware; Nvidia, to be precise, is about publicity. That is why more people bought Nvidia: people who don't know about cards and only want to spend money on games. All the big brands like Acer, Dell etc. are switching to ATI.
Why? Because it is better for less money. The GTX 470 is a card that was "invented" to absorb money. That is why they ran into the same problem ATI had with the 4770, something they didn't realize with the GT240 or the other stupid cards they sold. So ATI has the crown now and wants to keep it till the end of the year, because it deserves to be the best.
Posted on Reply
#38
Unregistered
^
Hmm, are you sure? Can you give me a link showing that Nvidia stole ATI's design (regarding chipsets)?
#39
Meaker
xtremesv: With current specs, yes.
With tweaking, 2x 5670 will overtake a 5770 (and that includes OCing the 5770).
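
For rough context on that claim, here is the theoretical shader throughput; a back-of-the-envelope sketch using the article's clocks and the HD 5770's reference specs, ignoring real-world CrossFire scaling:

```python
# Theoretical single-precision throughput: SPs x 2 FLOPs (multiply-add) per
# clock. Uses the article's 640 SP / 750 MHz figures for the new HD 5670 and
# the HD 5770's 800 SP / 850 MHz reference specs; CrossFire losses ignored.
def gflops(stream_processors: int, core_clock_mhz: int) -> float:
    return stream_processors * 2 * core_clock_mhz / 1000

print(f"2x new HD 5670: {2 * gflops(640, 750):.0f} GFLOPS")  # 1920
print(f"1x HD 5770:     {gflops(800, 850):.0f} GFLOPS")      # 1360
```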
Posted on Reply
#40
newtekie1
Semi-Retired Folder
Imsochobo: Because Nvidia is selling off years-old tech under a new name.
Yeah...not so much.

The 8800GT had been out for about 9 months when the 9800GT came out, a tad shy of the two years that would validate your claim of "years-old tech".

The 8800GS was out for a whole 3 months before the 9600GSO came out, also a tad shy of validating your claim.

The 9800GTX+ was out for 6 months before the GTS250 came out, again a tad shy of validating your claim.

Now, you can try to make the argument that the GTS250 was really the same as the 9800GTX, and many do, but they use different GPUs: one 65 nm and one 55 nm. The GPUs did progress; a die shrink is a progression.

In terms of actual tech, in the form of new features, nVidia didn't really do anything on that front. They just tweaked what they already had to make it slightly better. There wasn't much need to add any new features. ATi did the same, with pretty much every GPU from the HD3800 and HD4800 series being based on R600. RV670 was R600 tweaked and shrunk, and RV770 was RV670 with a GDDR5 memory controller (because GDDR4 was dead) and more shaders.

And really, GT200 was just G92 with more shaders, so there really wasn't a reason to redesign a mid-range GPU. If nVidia had, they would have just released a GPU that was weaker than G92 already was. ATi, on the other hand, did redesign their mid-range GPU, and sure enough it was weaker than RV670. I actually would have preferred they just use RV670 to fill the mid-range; we would have seen better cards.
Imsochobo: ATI never does it to dedicated hardware, although the IGPs get a rename every now and then with minor changes.
Sure they do, I've given two examples already.
Imsochobo: Like: hey, I got an 8800 GT; oh yeah, I got a 9800 GT; ah yeah? I got a GT240. While they're all the same, I've come across people wanting to upgrade from an 8800 GT to a 9800 GT. That's the problem!
I guess. Then again, I've seen people wanting to upgrade from an HD3850/70 to an HD4670. I'm guessing the people wanting to upgrade from an 8800GT to a 9800GT are the same ones that would upgrade from an HD3850 to an HD4670. Guess which upgrade is better... yep, the 9800GT, because at least they would get the same performance and not less.
Imsochobo: If you don't understand that, you've got a problem.
I understand it; it just isn't a problem created by renaming so much as a problem created by consumer stupidity. And the problem wouldn't have been any different if nVidia had released a card with an entirely different core at the same performance and price points and called it the 9800GT.
Imsochobo: This one can't be complained about "much", although 5690 would be a welcome name.
Actually, HD5690 is just as bad of a name. I've already covered why.
Imsochobo: The Mobility 5870 and so on can be complained about; at least it's new tech, unlike Nvidia and their "GTX260M" that's an 8800 GT.
Man, nVidia's mobile market and naming schemes have been completely jacked the fuck up...don't even get me started.

At least performance-wise they were close though, as the GTX260M does compete pretty well with the Mobility HD4850, just like the desktop counterparts, largely thanks to ATi horribly underclocking the mobile HD4850.
Imsochobo: The fuss is all about Nvidia having renamed one die more times than ATI has in their lifetime...
Not really. The 8800GT and 9800GT used the same G92 die, but the GTS240 doesn't. (And by the way, the GT240 uses a totally different die from the three.)
Imsochobo: That's the reason for all this hate.
What is the reason? You haven't made a valid point.
Imsochobo: G92, G80 and so on are technological marvels, but I hate them for the sole reason that they ripped people off when upgrading from the 8800 GT to the 9800 GT.
And I hate that people upgrading from an HD3850 to an HD4670 got "ripped off"... though I wouldn't call it ripped off, actually. If they are too stupid to do the research and find out how something performs before they buy it, then they aren't getting ripped off when they get the same performance; they are getting lucky.
Posted on Reply
#41
Unregistered
newtekie1: [full quote of post #40 snipped]
First of all, if ATI had just reused the same RV670 die for the mid-to-low HD 4xxx range, that would not have been progress: it would still have had broken AA (and you know how bad it is when AA is broken), high power consumption and so on (low power draw is why ATI cards were so popular with the HTPC crowd), and that would have been bad for the tech industry.

And people were more confused by Nvidia's naming scheme, because people usually associate performance with the second number and newer tech with the first one (I know because I have a computer store; even my friend's store said that the 9800 GT is more powerful than the 8800 GT).

So the chance of people upgrading from an 8800 GT to a 9800 GT is higher than from an HD 3850 to an HD 4670.

So in Nvidia's case, you must research the card you are looking for.

And people who upgrade from an HD 3850 to an HD 4670 either want a quieter card (for use in an HTPC) or just don't have a clue what a graphics card is.


After all, the tech industry is based on innovation.
Posted on Reply
#42
newtekie1
Semi-Retired Folder
wahdangun: First of all, if ATI had just reused the same RV670 die for the mid-to-low HD 4xxx range, that would not have been progress: it would still have had broken AA (and you know how bad it is when AA is broken), high power consumption and so on (low power draw is why ATI cards were so popular with the HTPC crowd), and that would have been bad for the tech industry.
The RV670 would have still outperformed the RV730, even with AA on.

And power consumption? It was something like 20 W between the two... hey, I think that might actually be worse than the difference between the 9800GTX and GTS250 (when they're clocked the same, that is).
wahdangun: And people were more confused by Nvidia's naming scheme, because people usually associate performance with the second number and newer tech with the first one (I know because I have a computer store; even my friend's store said that the 9800 GT is more powerful than the 8800 GT). So the chance of people upgrading from an 8800 GT to a 9800 GT is higher than from an HD 3850 to an HD 4670. So in Nvidia's case, you must research the card you are looking for.
Yeah, I know, ATi's naming is always so easy. It is almost like going from an HD2900XT to an HD3870. It is so easy to tell that you will get better performance. I mean, it is the new generation, and the highest card available! Wait...it doesn't give better performance...oh.

I know, that logic makes perfect sense and always works with new GPUs. Hey, you can even look at nVidia: look at the 8800GTX, and the 9800GTX, which uses a totally different and new GPU. The 9800GTX has to be faster, right? Oh... it isn't... damn...

It's cool though, it's fine to buy without doing any research; the names of the cards tell us all... :shadedshu

And now we'll have people buying an HD5670 expecting to CrossFire it with another HD5670, only to get it home and find out they can't.
wahdangun: And people who upgrade from an HD 3850 to an HD 4670 either want a quieter card (for use in an HTPC) or just don't have a clue what a graphics card is.
The same people that don't have a clue what a graphics card is are the same ones you say are stupid enough to upgrade from an 8800GT to a 9800GT.

I know, a quieter card was why I upgraded from an HD3850 to an HD4670... oh damn... that little stock fan on the HD4670 is one of the loudest on the market at idle (how my HTPC sits all the time)... that sucks... :banghead:

Hmm... turns out it is hard to say anything about a card, be it performance, power draw, or fan noise, without actually doing some research. The name really tells you nothing. So I have no sympathy for those who buy without doing research; they almost always end up worse off for their ignorance.

All of that being said, I actually do hate how nVidia handled the 8800GT to 9800GT transition. I was really excited when I first heard of the 9800GT and the original plans for it: basically, the 8800GT on a 9800GTX PCB, allowing for Tri-SLI with 9800GTs, which would have been sweet, and the better power setup would have allowed higher clocks. But the AIB partners fought nVidia to get them to just re-use the 8800GT PCB, which kept the card exactly the same.

And I've really had an issue with nVidia's naming scheme as of late. I've said it before, and I'll say it again: G92 should have been the 8850 series. So:

8800GT/9800GT = 8850GT
8800GTS 512MB = 8850GTS
9800GTX = 8850GTX

Then the G92b variants would add the + symbol (and still allow for SLI with the older version):

8850GT+
8850GTS+
8850GTX+

Then the GT200 cards should have been the 9800 series. So:

GTX260 = 9800GT
GTX260 Core 216 = 9800GTS
GTX280 = 9800GTX

Then the GT200b cards should have also had the + symbol added:

GTX260 Core 216 = 9800GTS+
GTX275 = 9800GSO+
GTX285 = 9800GTX+

That just makes more sense to me...
Posted on Reply
#43
eidairaman1
The Exiled Airman
The 3870 came about due to the 2900XT's heat and power draw, basically what the GF 480 is facing today. Rest assured NV will release a fix for the 480 in the name of a 485, or scrap the GF 400 series and go straight to 500. On another point, my card still beats the 8600, 2600 and 3600 series in performance across the board, despite being considered obsolete.
Posted on Reply
#44
GSquadron
wahdangun: Hmm, are you sure? Can you give me a link showing that Nvidia stole ATI's design (regarding chipsets)?
It is not that Nvidia steals the design; it just waits for ATI to jump first and then makes its move. Normally it looks like ATI "jumps" and then Nvidia makes some improvements.
I can't remember the link, but it was in a thread here on TPU and on the anandtech.com website.
Posted on Reply
#46
DrPepper
The Doctor is in the house
No
Posted on Reply
#47
GSquadron
When will this card come out, guys? I am waiting so damn much for it :(
Posted on Reply