Wednesday, November 11th 2009

AMD Radeon HD 5970 Specs Surface

In a few days from now, AMD will unveil its new flagship graphics accelerator, the ATI Radeon HD 5970, with which it intends to cement the brand's performance leadership over every product from rival NVIDIA. The HD 5970, codenamed "Hemlock", is a dual-GPU accelerator, with two "Cypress" GPUs in an internal CrossFireX configuration.

Built on the 40 nm process, these GPUs feature 1600 stream processors each, and each has a 256-bit wide GDDR5 memory interface connecting to 2 GB of memory (4 GB total on the card). The clock speeds are where these GPUs differ from their single-GPU counterpart, the Radeon HD 5870: the core is clocked at 725 MHz, while the memory runs at 1000 MHz (4000 MHz effective).
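
As a rough sanity check (a sketch using only the figures above, and assuming the usual GDDR5 quad-data-rate convention), the memory bandwidth works out to about 128 GB/s per GPU:

```python
# Back-of-the-envelope memory bandwidth from the quoted HD 5970 specs.
# Assumes the standard GDDR5 convention: a 1000 MHz command clock moves
# data at 4000 MT/s ("4000 MHz effective").
bus_width_bits = 256        # per GPU
effective_mtps = 4000       # mega-transfers per second

bytes_per_transfer = bus_width_bits / 8                    # 32 bytes
per_gpu_gbps = bytes_per_transfer * effective_mtps / 1000  # ~128 GB/s per GPU

print(f"Per GPU: {per_gpu_gbps:.0f} GB/s, whole card: {2 * per_gpu_gbps:.0f} GB/s")
```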

The accelerator's rear panel is not identical to those of other Radeon HD 5000 series accelerators. The usual broad air vent occupies one slot, while the other holds two DVI-D connectors and one mini DisplayPort (DP) connector. The mini DP connector can provide DVI output through a dongle, so support for ATI Eyefinity technology remains intact. The NDA covering this accelerator is said to expire on the 19th of November, not very far away.
Source: TechConnect Magazine

147 Comments on AMD Radeon HD 5970 Specs Surface

#26
gaximodo
:roll: it will run @5870's clock only if they make it 15" long
Posted on Reply
#28
theorw
gaximodo: :roll: it will run @5870's clock only if they make it 15" long
I wanna see someone hard volt-mod this card!!!!
Will be interesting!! I bet you'll need to solder 12 volts directly to the PCB????:D:p:nutkick:
Posted on Reply
#29
Weer
CHAOS_KILLA: call it the hd5870x2, stop changing the naming schemes, dont follow in nvidias footsteps! lol
I'm not going to blame you, because you haven't been here long enough, but it is actually AMD that started this entire fiasco. Not just with changing the names, but mostly with the frequency of going forward to the next series, mostly to generate hype. nVidia just (had) to follow suit.

It's like this:

DAAMIT

ATI - X-series - Spring 2004
ATI - X1k-series - Fall 2005 [+ 1.5 Years]
AMD - HD2k-series - Spring 2007 [+ 1.5 Years]
AMD - HD3k-series - Fall 2007 [+ 0.5 Years]
AMD - HD4k-series - Spring 2008 [+ 0.5 Years]
AMD - HD5k-series - Fall 2009 [+ 1.5 Years]

Highlight: HD 2900 XT -> 6 Months -> HD 3870 -> 6 months -> HD 4870

nVidia

nVidia - 6000-series - Spring 2004
nVidia - 7000-series - Spring 2005 [+ 1 Year]
nVidia - 8000-series - Fall 2006 [+ 1.5 Years]
nVidia - 9000-series - Winter 2007/2008 [+ ~2 Years]
nVidia - GTX 200-series - Spring 2008 [+ 0.5 Years]
nVidia - GTX 300-series - Fall 2009 [+ 1.5 Years]

Highlight: 8800 GT/GTS 512 -> 2 Months -> 9800 GT/GTX -> 4 Months -> GTX 280

So, as you can see, the healthy timeline for the release of a new series from either graphics card manufacturer is 1.5 years. The unhealthy ones are 0.5 years, and also 2 years.
After AMD bought and merged with ATI, they failed to deliver a solid-performing chip in the R600. So, in order to be able to compete with nVidia, they required hype. They gained this by going through two series in a single year. What should have been the HD 2950, etc., was thus named 3850, as part of the new and completely fraudulent HD3k series.
Then, nVidia got wind of this and needed to make a move to equal the hype. So, they used the exact same GPU from the 8000 series, the G92, in the 9000 series, which was even worse than what AMD were doing, because nVidia was blatantly re-marketing its product under a superior name, solely in order to garner hype. Thus, they also jumped through two series in roughly the same amount of time (going by the actual timeline).
And in the end, AMD took themselves by the trousers and fashioned an actually competitive, and new, GPU, which started the HD4k series that lasted for the healthy 1.5 years. nVidia thus again followed suit with their GTX 200-series, which will also last for 1.5 years.
So, in the meantime, all is well in the graphics card kingdom, and the terror of the HD3k and 9000 series is forgotten. But who knows when these big companies will again try to trick us, because they are too scared, in this almost childish mindset, to lose any piece of market share.
All I can say is men like me will be here to enlighten the masses, and protect the commoners.
Posted on Reply
#31
niko084
theorw: WOW, so now to use this card u need a 64-bit OS since it has 4 GB, right???:wtf:
No, the card is its own subsystem.
And although it has 4 GB of RAM, only 2 GB is probably usable, just as if you had two 2 GB cards in CrossFire: you only really have 2 GB of video RAM for all practical purposes.
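
A tiny sketch of that point (assuming the usual AFR mode, where each GPU mirrors the working set; the numbers are just the card's quoted ones):

```python
# Toy illustration: why a 4 GB dual-GPU card behaves like a 2 GB card.
# Assumption: AFR-style CrossFire, where each GPU keeps its own full copy
# of textures and buffers rather than pooling memory.
per_gpu_vram_gb = 2
gpu_count = 2

physical_gb = per_gpu_vram_gb * gpu_count   # 4 GB soldered onto the card
usable_gb = per_gpu_vram_gb                 # data is mirrored, not pooled

print(f"{physical_gb} GB physical, ~{usable_gb} GB effectively usable")
```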

This card is going to be massive and heavy.
Get ready to build your braces!
Posted on Reply
#32
niko084
Weer: I'm not going to blame you, because you haven't been here long enough, but it is actually AMD that started this entire fiasco. Not just with changing the names, but mostly with the frequency of going forward to the next series, mostly to generate hype. nVidia just (had) to follow suit.
You have it all wrong...

ATI was releasing new technology, new cards, new cores.
Nvidia was just doing a small die shrink and calling an 8800GT a 9800GT.

Calling what should be a 5870x2 a 5890, that is a bit stupid.
But they did not take a 2600 and rename it a 3600.
Posted on Reply
#33
Binge
Overclocking Surrealism
I foresee some strangely high amount of suck coming from this card! It will probably beat the 5870 under load by a good margin, but it will probably be the same as 2x 5870 in CrossFire at idle. This isn't good for the 5970.
Posted on Reply
#34
wolf
Performance Enthusiast
A wave of disappointment washes over... Dual 5850's with all 1600sp's is not what I wanted from this card. *sigh*

Seems like it will actually let a 5870 down in trifire.
Posted on Reply
#35
ToTTenTranz
theorw: WOW, so now to use this card u need a 64-bit OS since it has 4 GB, right???:wtf:
That makes no sense at all! The memory subsystem is handled by the GPU itself, not the CPU.

Only if the PC used a UMA architecture like the consoles (or systems with IGPs) could that constraint exist.
Posted on Reply
#36
Binge
Overclocking Surrealism
ToTTenTranz: That makes no sense at all! The memory subsystem is handled by the GPU itself, not the CPU.

Only if the PC used a UMA architecture like the consoles (or systems with IGPs) could that constraint exist.
It's a valid question regardless.
Posted on Reply
#37
gumpty
theorw: WTF??? Is it gonna have that DuOrb-like cooler from stock? Cos I see ATI brand stickers on...?
I doubt it - the article said it was an early engineering sample - they just bolted a couple of heatsinks on so they can test it. I imagine the retail-ready piece will be like the normal stock coolers.
Posted on Reply
#38
niko084
Binge: It's a valid question regardless.
Agreed, some people don't understand these things as well, and it's a very common assumption.

To the point that I have heard techs at Microcenter tell customers that, hopefully just because they didn't feel like explaining the truth, but I doubt it :roll:
Posted on Reply
#39
ToTTenTranz
Nonetheless, calling it HD5900 instead of HD5800X2 makes sense to me.

It's about time we get to have shorter names in ATI's lineup.


In fact, I think it's about time that both ATI and nVidia drop the Radeon and Geforce brands. It's been 10 years now, we need new names!

I still remember when nVidia was so confident about NV30 that they even publicly stated that they'd drop the Geforce brand and call it a different name. When the architecture turned up as bad as it was, they had to use the Geforce brand again, for marketing purposes.
Posted on Reply
#40
Binge
Overclocking Surrealism
ToTTenTranz: Nonetheless, calling it HD5900 instead of HD5800X2 makes sense to me.

It's about time we get to have shorter names in ATI's lineup.


In fact, I think it's about time that both ATI and nVidia drop the Radeon and Geforce brands. It's been 10 years now, we need new names!

I still remember when nVidia was so confident about NV30 that they even publicly stated that they'd drop the Geforce brand and call it a different name. When the architecture turned up as bad as it was, they had to use the Geforce brand again, for marketing purposes.
Nice point? With whom are you arguing?
Posted on Reply
#41
ToTTenTranz
Binge: Nice point? With whom are you arguing?
With this post and consequent discussion about the naming schemes.
Posted on Reply
#42
pr0n Inspector
ToTTenTranz: That makes no sense at all! The memory subsystem is handled by the GPU itself, not the CPU.

Only if the PC used a UMA architecture like the consoles (or systems with IGPs) could that constraint exist.
Tell that to those who (still insisting on using a 32-bit OS) lost another chunk of their 4 GB of RAM due to the vRAM.
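
A rough sketch of what that complaint amounts to on a 32-bit OS (the aperture sizes here are illustrative assumptions, not measured HD 5970 figures):

```python
# Why a 32-bit OS "loses" RAM to graphics cards: the 4 GB address space
# must also hold the PCI/MMIO apertures that the GPUs are mapped into.
address_space_gb = 4.0      # 32-bit OS, no remapping of RAM above 4 GB
installed_ram_gb = 4.0
gpu_apertures_gb = 0.5      # assumed MMIO/VRAM window(s) for the graphics card
other_devices_gb = 0.5      # chipset, audio, NICs, etc. (assumed)

addressable_ram_gb = address_space_gb - gpu_apertures_gb - other_devices_gb
usable_ram_gb = min(installed_ram_gb, addressable_ram_gb)
print(f"~{usable_ram_gb:.1f} GB of {installed_ram_gb:.0f} GB RAM left usable")
```
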
Posted on Reply
#43
Steevo
I just jizzed..........in my pants.


Anyway, will two of my MCW-60R blocks fit?
Posted on Reply
#44
1933 Poker
Good to know thanks for the post! Right On!:slap:
Posted on Reply
#45
WarEagleAU
Bird of Prey
That actually looks like two Zalman-type coolers, or it could be the DuOrb; either way it is sexy. What I don't like is two DVI and one mini DisplayPort. This was supposed to be a six-screen behemoth. Hell, they should have kept it two DVI, one HDMI (which I use and I've seen monitors with; haven't seen a DisplayPort monitor, not saying there isn't one) and one DisplayPort, if they are not going to make it a six-screen monster.
Posted on Reply
#46
devguy
This may sound lame, but I would actually prefer it if they delayed the launch of the 5970 until Fermi's release. My reasoning is that this card coming out is going to further reduce the stock of available RV870 chips, and thus worsen the shortages of the more practical 5850/5870.

I mean, in all honesty, how many people buy these $500+ cards at launch? I know only a few who did, and when they sold them to buy a 5870, they only got around $200 (didn't even come close to covering their 5870 cost). That is the cost of buying into new technology, sure, but I'd rather AMD focus on getting more 58xx series cards onto the market. Plus, the 5970 coming out when Fermi does will serve as a sort of distraction to nVidia.

And as for the clocks, nVidia did almost the exact same thing with the GTX 295. It was a GTX 285 with the memory bandwidth of the GTX 260 and similar clocks to it, yet had the full shader count. That configuration became the GTX 275.
Posted on Reply
#47
JrRacinFan
Served 5k and counting ...
OK, since this is a dual-GPU card, will we see a 5900-series single-GPU card for trifire? Sorry for the rhetorical question. I just figured to bring up the point that if a certain person picks one of these up, and they indeed keep the naming scheme as 5970, we won't see CrossFireX with it paired with a current card. Well, for what information we have today...
Posted on Reply
#48
ToTTenTranz
pr0n Inspector: Tell that to those who (still insisting on using a 32-bit OS) lost another chunk of their 4 GB of RAM due to the vRAM.
What?!
You're talking about losing memory to the IGP? But that's predictable, with or without a 64-bit OS.
Posted on Reply
#49
inferKNOX
Weer: ... actually AMD that started this entire fiasco. Not just with changing the names, but mostly with the frequency of going forward to the next series, mostly to generate hype. nVidia just (had) to follow suit....
He's talking about the renaming of the die-shrunk nV 8-series to the 9-series & the blatant rename of the 9800GTX+ to GTS250.:p
Weer: DAAMIT
Dude-bra, that's kinda lame...:o
ToTTenTranz: That makes no sense at all! The memory subsystem is handled by the GPU itself, not the CPU.

Only if the PC used a UMA architecture like the consoles (or systems with IGPs) could that constraint exist.
The amount of memory on the card cannot exceed the amount in the system, thus 4GB+ would be necessary in the system, which could only be utilised by a 64-bit system.
Binge: I foresee some strangely high amount of suck coming from this card! It will probably beat the 5870 under load by a good margin, but it will probably be the same as 2x 5870 in CrossFire at idle. This isn't good for the 5970.
Binge, I've noticed some huge hate coming from you for anything non-nV.:wtf:
And what happened to your specs? I saw them with your nV card 1 day, then just the name the next.
ToTTenTranz: Nonetheless, calling it HD5900 instead of HD5800X2 makes sense to me.

It's about time we get to have shorter names in ATI's lineup.


In fact, I think it's about time that both ATI and nVidia drop the Radeon and Geforce brands. It's been 10 years now, we need new names!

I still remember when nVidia was so confident about NV30 that they even publicly stated that they'd drop the Geforce brand and call it a different name. When the architecture turned up as bad as it was, they had to use the Geforce brand again, for marketing purposes.
Agreed totally.;)
Posted on Reply
#50
DanishDevil
59xx is dual GPU. If you want trifire with 2 cards, you pair a 5970 with a 5870.

I find it interesting that after the die shrink, they still had to underclock the cards to get them to run cool enough. ATi is feeling the heat (from their own cards).
Posted on Reply