Friday, September 18th 2009

Sapphire Radeon HD 5870 and HD 5850 Smile for the Camera

Here are the first pictures of Sapphire's Radeon HD 5800 series offerings: the Radeon HD 5870 1GB and the Radeon HD 5850. The cards sport the usual sticker design of a CGI girl against a reddish background. Since these cards have the cosmetic "red streak" cleaving the cooler shroud down the center, the sticker is split the same way. This is also perhaps the first public picture of the Radeon HD 5850, and our size projections were right: while the Radeon HD 5870 maintains a long PCB, the HD 5850 is about as long as a Radeon HD 4870 (reference design). Both accelerators stick to the reference AMD design.

* Images removed at the request of Sapphire * Google for an alternate source
Source: Hermitage Akihabara

148 Comments on Sapphire Radeon HD 5870 and HD 5850 Smile for the Camera

#26
Imsochobo
newtekie1: I'm just trying to figure out why ATi is changing their strategy all of a sudden. I wouldn't be surprised if they are still having issues with 40nm, and that is part of the decision.
***Note not full quote.

Well, ATI have been doing it for quite some time.
The 4850 is crippled to some degree, but all shaders live.
The 4830 is crippled, shaders removed.
The 4860 is crippled, shaders removed; less crippled than the 4830 (it's a crippled 4890).

ATI rushed to get the 4xxx out, while the 5xxx has been a longer process; they might have picked up more crippled cores due to a longer production run than the 4xxx had at the start.

Might not get a clear answer on that from ATI :P
#27
newtekie1
Semi-Retired Folder
Imsochobo: ***Note not full quote.

Well, ATI have been doing it for quite some time.
The 4850 is crippled to some degree, but all shaders live.
The 4830 is crippled, shaders removed.
The 4860 is crippled, shaders removed; less crippled than the 4830 (it's a crippled 4890).

ATI rushed to get the 4xxx out, while the 5xxx has been a longer process; they might have picked up more crippled cores due to a longer production run than the 4xxx had at the start.

Might not get a clear answer on that from ATI :P
Yes, I realize they have done it in the last generation, but not at first; they haven't done it at the onset of a product life cycle in a long time. And I mentioned the HD 4850 having a crippled memory subsystem, and that is actually what I was expecting on the HD 5850 as well: the full core, with weaker memory.
#28
mdm-adph
newtekie1: I'm just trying to figure out why ATi is changing their strategy all of a sudden. I wouldn't be surprised if they are still having issues with 40nm, and that is part of the decision.
(Man -- always with the FUD, aren't you? Do you get paid for this? Just enjoy your Nvidia cards, and let ATI shine for a few months -- come on, give it a break...)

I think you're really looking too far into this. I don't think it's foundry problems -- I think it's more to do with ATI finally gaining a bit of revenue and market share, and finally having the time and money for those same practices that other video card producers perform, having long enjoyed their time at the top.
#29
TheMailMan78
Big Member
mdm-adph: (Man -- always with the FUD, aren't you? Do you get paid for this? Just enjoy your Nvidia cards, and let ATI shine for a few months -- come on, give it a break...)

I think you're really looking too far into this. I don't think it's foundry problems -- I think it's more to do with ATI finally gaining a bit of revenue and market share, and finally having the time and money for those same practices that other video card producers perform, having long enjoyed their time at the top.
This week on Dawson's Creek.
#30
mdm-adph
TheMailMan78: This week on Dawson's Creek.
Oh, give over. You can be just as bad sometimes, capitalist running-dog. :laugh:
#31
btarunr
Editor & Senior Moderator
newtekie1: Not useless, just crippled. The same way crippling the core would hinder it. I don't believe dropping to GDDR3 would have hurt it any more than disabling shaders.
It would have, since it needs memory bandwidth.

And even if it's harvesting "defective" cores, that's purely academic, not the consumer's concern at all. The consumer gets a warranty-backed product.
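
For anyone wondering what "harvesting" actually means here: dies from the same wafer are tested, and those with defective shader clusters get the bad clusters fused off and are sold as the lower SKU instead of being scrapped. Below is a toy sketch of the idea; the SKU names are the real cards, but the thresholds and the 20-SIMD figure are my assumptions based on the leaked 1600/1440-SP counts, and real binning happens on test equipment at the fab, not in software like this:

```python
# Toy model of die harvesting. Cypress is rumoured to have 20 SIMD clusters
# (1600 SPs); the HD 5850's leaked 1440-SP count would mean 18 active SIMDs.
# Dies that fail full-spec testing are fused down rather than thrown away.

def bin_die(working_simds: int, total_simds: int = 20) -> str:
    """Assign a SKU based on how many SIMD clusters passed testing (hypothetical logic)."""
    if working_simds == total_simds:
        return "HD 5870 (fully enabled die)"
    if working_simds >= 18:
        return "HD 5850 (fused down to 18 SIMDs)"
    return "rejected / stockpiled for a possible future cut-down part"

# Example: one batch of tested dies.
for simds in (20, 19, 18, 14):
    print(f"{simds}/20 SIMDs good -> {bin_die(simds)}")
```

Note that fully working dies can also be fused down if demand for the cheaper SKU outstrips the supply of defective dies, which is why "crippled" doesn't necessarily mean "defective".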
#32
TheMailMan78
Big Member
mdm-adph: Oh, give over. You can be just as bad sometimes, capitalist running-dog. :laugh:
Yeah man but to everyone. You seem to be stalking newtekie1.
#34
mdm-adph
TheMailMan78: Yeah man but to everyone. You seem to be stalking newtekie1.
Not at all, but I do believe he's being paid to spread FUD.

I think there's a "preponderance of evidence." :laugh:
#35
MoonPig
Ahhh... they look good.

I don't understand the fecking CGI girl though... what's the point? Crap logo.

What's the release date of the 5850? I forget...
#38
Imsochobo
btarunr: It would have, since it needs memory bandwidth.

And even if it's harvesting "defective" cores, that's purely academic, not the consumer's concern at all. The consumer gets a warranty-backed product.
I support bt in that statement, and I'll add to it.
newtekie1: Yes, I realize they have done it in the last generation, but not at first; they haven't done it at the onset of a product life cycle in a long time. And I mentioned the HD 4850 having a crippled memory subsystem, and that is actually what I was expecting on the HD 5850 as well: the full core, with weaker memory.
Disabling those cores does less damage than going down to GDDR3; with GDDR3 it would be like a 4850 with 4850 X2 shader power, and the 4850 did suck, you know: low memory bandwidth, limited to 1680x1050.

As a reminder, 256-bit GDDR3 on high-end cards does about 73 GB/s;
256-bit midrange GDDR5 does about 140 GB/s.
Going to 512-bit GDDR3 would make it more expensive than the 5870 and still deliver less performance.
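
For reference, bandwidth figures like these fall straight out of bus width times effective data rate. A minimal sketch of the arithmetic, using reference memory clocks as I remember them (the exact numbers depend on which clocks you assume, so treat the outputs as approximate):

```python
def mem_bandwidth_gbs(bus_width_bits: int, effective_mtps: float) -> float:
    """Peak memory bandwidth in GB/s = (bus width in bytes) x (effective transfer rate)."""
    return (bus_width_bits / 8) * effective_mtps / 1000  # MT/s -> GB/s

# GDDR3 transfers 2x per memory clock; GDDR5 transfers 4x.
print(mem_bandwidth_gbs(256, 2 * 993))   # HD 4850, 993 MHz GDDR3  -> ~63.6 GB/s
print(mem_bandwidth_gbs(256, 4 * 900))   # HD 4870, 900 MHz GDDR5  -> ~115.2 GB/s
print(mem_bandwidth_gbs(256, 4 * 1200))  # HD 5870, 1200 MHz GDDR5 -> ~153.6 GB/s
```

Whatever the exact clocks, GDDR5 roughly doubles the bandwidth on the same bus width, which is the point above: a 512-bit GDDR3 bus could buy the bandwidth back, but at a much higher board cost.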
#39
mechtech
And I..........Jizz in my PANTS
#40
PVTCaboose1337
Graphical Hacker
Does it have dual HDMI, or is one of those HDMI and the other DisplayPort?
#41
newtekie1
Semi-Retired Folder
btarunr: It would have, since it needs memory bandwidth.

And even if it's harvesting "defective" cores, that's purely academic, not the consumer's concern at all. The consumer gets a warranty-backed product.
I don't think it's so much more special than previous GPUs that it needs 100 GB/s+ of memory bandwidth to be functional. Again, I'm just wondering if it really is because the memory bandwidth is needed, or because there are still problems with 40nm.

And the cores being defective doesn't matter; as I've said, it is standard practice.
TheMailMan78: Yeah man but to everyone. You seem to be stalking newtekie1.
Maybe if we don't feed him, he'll go back under his bridge...
#42
ZoneDymo
I know the look of the actual card should be the very last thing you worry about, but IMO this design is boring as hell.

The Sapphire Vapor-X coolers for the 4870 and 4890, now THAT is design!
#43
morphy
Agreed ^^... I'm just waiting for the Vapor-X edition of the 5870, and by then the prices will have come down too. Win-win :toast:
#44
Unregistered
newtekie1: I don't think it's so much more special than previous GPUs that it needs 100 GB/s+ of memory bandwidth to be functional. Again, I'm just wondering if it really is because the memory bandwidth is needed, or because there are still problems with 40nm.

And the cores being defective doesn't matter; as I've said, it is standard practice.

Maybe if we don't feed him, he'll go back under his bridge...
Yes, it will definitely be bandwidth-limited. Even the RV770 became bandwidth-limited when equipped with GDDR3 (HD 4850), while equipped with GDDR5 (HD 4870) its performance increased drastically (and it could compete with the GTX 260). So I say it's useless to use GDDR3 for that amount of power.
#45
morphy
MoonPig: Lol... which is?
Sept 23rd.
#46
Zubasa
newtekie1: I don't think it's so much more special than previous GPUs that it needs 100 GB/s+ of memory bandwidth to be functional. Again, I'm just wondering if it really is because the memory bandwidth is needed, or because there are still problems with 40nm.

And the cores being defective doesn't matter; as I've said, it is standard practice.
The bandwidth really hinders the 4850's performance, as you can see from reviews of the 4830.
The 4830's core is both crippled and clocked lower, but it still manages to keep up with the 4850.
www.techpowerup.com/reviews/Powercolor/HD_4830/26.html
The 4850 is less than 10% faster due to its suckass bandwidth.

Not to mention the 4770, which can take on a 4850 simply because it is clocked higher (and its memory clocks much better).
The number of TMUs and shaders doesn't seem to matter as much as the number of ROPs;
this might be due to the fact that most games are still optimized for Shader Model 3.0.
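
Laying the theoretical rates side by side makes the argument easy to see. A quick sketch (specs quoted from memory, so treat them as approximate):

```python
# Rough theoretical throughput comparison (specs from memory, approximate).
# name: (stream processors, TMUs, ROPs, core MHz, memory bandwidth GB/s)
cards = {
    "HD 4830": (640, 32, 16, 575, 57.6),  # 900 MHz GDDR3, 256-bit
    "HD 4850": (800, 40, 16, 625, 63.6),  # 993 MHz GDDR3, 256-bit
    "HD 4770": (640, 32, 16, 750, 51.2),  # 800 MHz GDDR5, 128-bit
}

for name, (sps, tmus, rops, mhz, bw_gbs) in cards.items():
    gflops = sps * 2 * mhz / 1000   # 2 FLOPs per SP per clock (MADD)
    pixel_fill = rops * mhz / 1000  # Gpixels/s
    texel_fill = tmus * mhz / 1000  # Gtexels/s
    print(f"{name}: {gflops:.0f} GFLOPS, {pixel_fill:.1f} Gpix/s, "
          f"{texel_fill:.1f} Gtex/s, {bw_gbs} GB/s")
```

If those specs are right, the 4850 has roughly a third more shader and texture throughput than the 4830 yet only about 9% more pixel fillrate and 10% more bandwidth, which lines up with the sub-10% real-world gap cited above.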
#48
ArmoredCavalry
Well, comparing my reference Sapphire 4870 to these, I have to say the 5k series looks much better...

Now where are the benchmarks!!!! :cry:
#49
Unregistered
Not that I would buy Sapphire ever again; that a-hole GG is such a tit it has put me off Sapphire permanently.
#50
Zubasa
ArmoredCavalry: Well, comparing my reference Sapphire 4870 to these, I have to say the 5k series looks much better...

Now where are the benchmarks!!!! :cry:
I am still worried that the 5870 might get bottlenecked pretty hard by the bandwidth,
since it has basically twice the power of a 4870 but the bandwidth does not seem to grow nearly as much. :ohwell:
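
One way to put that worry in numbers is compute per byte of bandwidth. A sketch using the specs circulating at the time of writing (the 5870 figures are unconfirmed until reviews land):

```python
# FLOPs available per byte of memory bandwidth. A higher ratio means the
# shader array is more likely to sit idle waiting on memory.
# Specs: HD 4870 = 1200 GFLOPS / 115.2 GB/s; HD 5870 (leaked) = 2720 GFLOPS / 153.6 GB/s.

def flops_per_byte(gflops: float, bandwidth_gbs: float) -> float:
    return gflops / bandwidth_gbs

print(f"HD 4870: {flops_per_byte(1200, 115.2):.1f} FLOPs/byte")  # ~10.4
print(f"HD 5870: {flops_per_byte(2720, 153.6):.1f} FLOPs/byte")  # ~17.7
```

Compute grows by about 2.3x while bandwidth grows by only about 1.3x, which is exactly the imbalance being worried about; whether it bites in real games depends on how bandwidth-bound the workload actually is.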