
HD 5870 Discussion thread.

Indeed, I will be disappointed in this card if it turns out not to be able to max the first wave of DX11 games, but that's unlikely. Also, I agree with erocker in being disappointed in Nvidia; I would expect at least some retaliation (even if just some 'our cards will be better' waffle), but the total silence seeds a lot of doubt about whether they really can combat the new ATI cards.
 
I like that concept erock :toast: Although trying it is such an expensive proposition, and it's so cheap to speculate :D Until I get a GTX 295+ level of power on a single GPU I won't be satisfied. Tell me about your 5850! I would jump on these ATI cards if I knew the drivers were up to snuff.
 

Lol, so far going from the 4890, I notice absolutely no difference in the games I play! But I just got this thing, so far no issues. It's relatively quiet, cool and sucks less power while increasing performance. Win!
 
This thread is somewhat disappointing.

I dunno where the word disappointed even came from. I'm there on 'below expectations', but in NO way am I 'disappointed'.
 
I am waiting for the 5870 2GB, and this time I am not going to bother with memory cooling, as it seems a waste. None of the cards shown have benefited from memory cooling, and neither does your system memory unless you really overvolt it. It doesn't help one bit with overclockability unless you are voltage/heat limited, and these cards are not that on the memory.
 
I never said I was disappointed; as the thread title says, 'below expectations'. Honestly, I had expected the HD 5850 to be at or just below HD 4870 X2 performance levels, and the HD 5870 at or just above it. But I still have that expectation once drivers mature by Christmas.
 
Well, were quad cores exactly four times as fast as single cores when first introduced? No, they weren't, unless running an extremely optimised program. IMO it's the same case here. Unless games and software are specifically coded to use all the shaders, we won't see the full potential of these new transistor-packed beasts.
 
Yeah, I think they will be a ton faster in things that actually use their capability. In a DX11 title with tessellation and all the other fancy visuals, the 5870 will slaughter a 4870 X2 that is trying to produce the same image quality.

But even now the card is under $400, which is amazing, and it is the fastest single-GPU card on the planet by a decent margin. I think ATI did a phenomenal job with the card, and there's the option of an X2 in the very near future. They could have made it less ugly though. I think the ugliness was below my expectations.
 

I like how the card looks though :) so it is a matter of taste.
 
Lol, I too like the appearance of the cards, and I had forgotten about upcoming things like tessellation. From screenshots it looks amazing, so it'll probably work the cards hard, or I could be completely wrong XD.
 
Yeah, my expectations were FAR exceeded when it came to the look of the card.

I was expecting the 5870 to look exactly like the 48xx/38xx's before it, and they blew my mind with the awesome black and red colour scheme and back plate.

Truly, sexy.
 
Second that, the color scheme is what I fell for. :toast:
My case has a side window, and that ATi Radeon just gives me an e-peen erection :roll:
 
To the Batmobile!
 
There's more of a problem with the on-chip cache slowing down this GPU than with the memory bandwidth, but the low bus width does not help matters. There needs to be a room stocked with food and overclockable computers where ATI can keep a few level-headed enthusiasts as a design conscience. It seems that more than one of us can see a flaw in the design.

Hmm, interesting about the cache part. I'm wondering how much of a bottleneck this cache would be. Perhaps ATI did not add enough L1/L2 cache, although it was "new" to the Radeon series.

Yeah, before ATI had an excess of memory bandwidth and plenty of buffer (HD 2900 XT 1GB, 512-bit GDDR4), and now ATI has way too little bandwidth for so much GPU power!

Hey, how do you know there's more of a problem with the cache?
 

LoL, I didn't even think of that. The 512-bit 2900 XT was a bandwidth monster; I guess they learned from that blunder :laugh:

Most internal caches within the HD 5xxx have doubled or more, if I'm not mistaken. Perhaps with the increased size, latency has become an issue?
 
This is just something that I've been wondering since I saw the results of the HD 5870's performance in W1zzard's review here.

Like the graphs show, the HD 5870 only ever outperforms the HD 4870 X2 at 1024x768, which doesn't really matter since no one in their right mind who games at that res would purchase such a card. But I thought I'd heard a few times that ATI might have trouble getting all their shaders utilized, and that they had some trouble utilizing even the 800 shaders on the HD 4890/70/50. Now I don't own any HD 58xx series card (yet), but I had thought that since the HD 5870 has all 1600 shaders on a single die and double the ROPs, it would outperform the HD 4870 X2 by a decent margin, especially once AA was enabled, but it doesn't. Am I the only person that thinks this? I know we can expect at least 10% more performance with mature drivers, probably up to 20%, but I'm still curious whether what I'd read a few times is or could be true, with ATI having a difficult time managing all 1600 shaders.

It depends on what you expected out of a 5870. Here is how the 8800 GTX, 9800 GTX, GTX 280, 4870 and 5870 performed when they were released:

[charts: performance relative to the previous flagship at each card's launch]


Not as impressive as an 8800 GTX or HD 4870, but miles ahead of a 9800 GTX, and even better in comparison to the GTX 280.

Its impact in performance over the previous gen is right between the impact made by the GTX 280 over the 9800 GTX and the 8800 GTX over the 7900 GTX.
 
You are correct in that aspect. Performance in comparison to ATI's previous flagship HD 4870 isn't what disappointed me; it was the specifications, and the performance given those specs, that were less than what I had expected. But as I've now come to realize and say, give it 3-5 months and I think performance will increase 10-20%.
 

LoL, come on. You know as well as anyone that the 9xxx series was not a new generation. And you should compare the GTX 280 with the 8800 GT rather than the 9800 GTX, because spec-wise the 8800 GT is closer to half a GTX 280 than the 9800 GTX is.

GTX 280 => 32 ROPs, 80 TMUs, 240 SPs, 602 MHz core, 1296 MHz shaders. Most importantly: 19 GPixel/s, 48 GTexel/s, 933 GFLOPS, 140 GB/s.

1/2 GTX 280 => 16 ROPs, 40 TMUs, 120 SPs, 602 MHz core, 1296 MHz shaders. Most importantly: 9.5 GPixel/s, 24 GTexel/s, 466 GFLOPS, 70 GB/s.

9800 GTX => 16 ROPs, 64 TMUs, 128 SPs, 738 MHz core, 1836 MHz shaders. Most importantly: 11.8 GPixel/s, 47 GTexel/s, 705 GFLOPS, 70 GB/s.

8800 GT => 16 ROPs, 56 TMUs, 112 SPs, 600 MHz core, 1500 MHz shaders. Most importantly: 9.6 GPixel/s, 33.6 GTexel/s, 504 GFLOPS, 57.6 GB/s.


The GTX 280 is 2x faster than the 8800 GT, and as you can see, even the 8800 GT is significantly more than half a GTX 280 in most of the specs, so taking that into account, spec-wise the GTX 280 did very well. And as a_ump said, it's spec-wise where the HD 5870 "disappoints". The 9800 GTX simply destroys the 1/2 GTX 280 spec-wise, so a two-fold increase in performance was not expected. The GTX 285, on the other hand, is almost twice as fast as the 9800 GTX, thanks to higher clocks: http://www.techpowerup.com/reviews/HIS/HD_5770/30.html
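For anyone who wants to check those headline figures, here's a quick back-of-the-envelope sketch (my own, not from W1zzard's review) using the usual rules of thumb: pixel fill = ROPs x core clock, texture fill = TMUs x core clock, and for these NVIDIA parts FLOPS = shaders x shader clock x 3 ops per clock.

```python
# Rough theoretical throughput from the unit counts and clocks quoted above.
# Assumptions: pixel fill = ROPs * core MHz, texture fill = TMUs * core MHz,
# FLOPS = shaders * shader MHz * 3 (MAD + MUL per clock on G80/GT200-class GPUs).

cards = {
    #            ROPs TMUs SPs  core  shader (MHz)
    "GTX 280":  (32,  80,  240, 602,  1296),
    "9800 GTX": (16,  64,  128, 738,  1836),
    "8800 GT":  (16,  56,  112, 600,  1500),
}

for name, (rops, tmus, sps, core, shader) in cards.items():
    gpix = rops * core / 1000          # GPixel/s
    gtex = tmus * core / 1000          # GTexel/s
    gflops = sps * shader * 3 / 1000   # GFLOPS
    print(f"{name}: {gpix:.1f} GPixel/s, {gtex:.1f} GTexel/s, {gflops:.0f} GFLOPS")

# GTX 280:  19.3 GPixel/s, 48.2 GTexel/s, 933 GFLOPS
# 9800 GTX: 11.8 GPixel/s, 47.2 GTexel/s, 705 GFLOPS
# 8800 GT:   9.6 GPixel/s, 33.6 GTexel/s, 504 GFLOPS
```

Halve the GTX 280 row and you land on the "1/2 GTX 280" figures listed above, which is the whole point of the comparison.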

9800 GTX => GTX 280

Pixel fill rate = 1.6x
Texture fill rate = 1x
GFLOPS = 1.3x
Performance at 1920x1200 = 1.7x according to W1zzard's review above.
Performance at 2560x1600 = 2x

HD 4890 => HD 5870

2x, 2x, 2x...

Performance at 1920x1200 = 1.45x
Performance at 2560x1600 = 1.53x

It's because of the above that it has "disappointed".
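And the ratio arithmetic behind those multipliers, continuing the same sketch. Note the HD 4890 and HD 5870 unit counts (16 vs 32 ROPs, 40 vs 80 TMUs, 800 vs 1600 shaders, both at 850 MHz, 2 FLOPs per shader per clock) are filled in from the public specs; they aren't quoted in the post above.

```python
# Theoretical throughput ratios between each old flagship and its successor.
def ratio(old, new):
    return tuple(round(n / o, 2) for o, n in zip(old, new))

# (GPixel/s, GTexel/s, GFLOPS) using the same formulas as the previous sketch
gtx280 = (19.3, 48.2, 933)
g9800  = (11.8, 47.2, 705)
hd4890 = (16 * 0.850, 40 * 0.850, 800 * 850 * 2 / 1000)    # 13.6, 34.0, 1360
hd5870 = (32 * 0.850, 80 * 0.850, 1600 * 850 * 2 / 1000)   # 27.2, 68.0, 2720

print(ratio(g9800, gtx280))   # (1.64, 1.02, 1.32) -> the 1.6x / 1x / 1.3x above
print(ratio(hd4890, hd5870))  # (2.0, 2.0, 2.0)    -> the "2x, 2x, 2x"
```

So theoretical throughput doubled while measured performance went up roughly 1.5x, which is exactly the gap this post is complaining about.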
 
You miss the fact that one can have almost the performance of a 500+ euro GTX 295 with a 299 euro card. As W1zzard proved in the performance summary, the 295 is 114% of a single 5870, while the 5870 is 40-45% cheaper. That alone is not BELOW EXPECTATIONS, is it??
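Putting quick numbers on that price/performance claim (a sketch based on the euro prices and the 114% figure quoted in this post, not official pricing):

```python
# Performance per euro using the figures quoted above.
hd5870_perf, hd5870_price = 100, 299   # HD 5870 as the 100% baseline, ~299 EUR
gtx295_perf, gtx295_price = 114, 500   # GTX 295 at 114% of the 5870, ~500 EUR

hd5870_value = hd5870_perf / hd5870_price   # ~0.33 perf points per euro
gtx295_value = gtx295_perf / gtx295_price   # ~0.23 perf points per euro

print(f"HD 5870 delivers {hd5870_value / gtx295_value - 1:.0%} more performance per euro")
# -> roughly 47% more performance per euro at those prices
```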
 
Jesus Christ man, you need to spend some time away from your computer screen, and computers altogether for that matter. (edit: @ benetanegia)
 

:laugh::laugh::laugh: Many guys here need that!!
 

Basing performance against a GTX 295 is why I bought one. Having owned a GTX 295, this card definitely nips at its heels overall and, IMO, is a smoother experience.

Not to mention after an OC, and my 5870 does 950 MHz on stock voltage. It's so sweet.
 

The 5870 is way below expectations if you pay attention to the specs. I'm not saying they are not outstanding cards, especially for the money, but most people were expecting much more considering the specs. And IMO anyone (esp. ATI fans) who says they're not disappointed by the performance is lying.

What if... what if Nvidia pulls out a 2x increase in performance? What's going to happen with the prices then, and with the involved parties? How much would it take ATI to recover from that? The last time that happened, ATI went from a 55% market share to 20%. So will Nvidia try to make big profits, or will they try to own the whole market and almost erase ATI from the face of the earth? Either way we lose.

But that being said, I'd prefer Nvidia achieving a 2x increase in performance. After all, I'm an enthusiast, and unbelievably fast hardware is what I want. I prefer the ZOMG! moment we would experience over paying less for my hardware. I would NOT pay more than I wanted anyway, and neither would anybody else. The market does know how to look after itself...
 