
AMD Cypress Graphics Accelerator Pictured

Whoever says it's going to be faster than the 4870X2 is crazy.

If it's not, I don't think people will pay $400 for it... I know I won't. I'll hold on to my 8800 GTS till Nvidia forces the prices down.
 
Either way ~ this will be interesting :)
Can't wait for the reviews!
 
One cannot rely on rumour and innuendo. I will believe it all when I see it, even the pictures.
 
Whoever says it's going to be faster than the 4870X2 is crazy.

Why? The 4870 was faster than the 3870X2 in a lot of real-world game tests (not synthetic benchmarks, but I don't play synthetic benchmarks every night). It wouldn't be unrealistic to believe the 5870 can be faster than the 4870X2 in real-world game testing. I guess we'll have to wait a couple of weeks for some decent reviews to come out before we can know for definite.
 
Wonder why "Cypress"?
 
How they could just kill a man... Nvidia
 
The 3870X2 had faster clocks but fewer shaders, so if the 5870 has as many shaders and faster clocks and memory, it should outdo the 4870X2.
 
The 3870X2 had faster clocks but fewer shaders, so if the 5870 has as many shaders and faster clocks and memory, it should outdo the 4870X2.

wut? :wtf:
 
...Why is it so hideous? That is some of the cheapest-looking plastic I've ever seen on a PC part. Those red vents look like shit you'd see on a Happy Meal in the 90s. Even without them, the quality of the finish on the rest of it is so absurdly sub-par. I'm officially considering this card nonexistent until they fix the quality issue. Maybe glossing up the main body and removing the vents would require the least redesign effort.

All I can say to your opinion is WOW. That finish is a matte finish. It's by no means cheaper or more expensive than a gloss finish, because all that's different is the mould the plastic is injected into. Personally, I find gloss finishes hideous ;). Those vents look awesome, just like a Formula 1 car's sidepod intakes.

And anyway, what's so official about what you said there?
 
So people would rather try and buy a defective Nvidia card with a glossy cooler than buy a card that works.

It's the same with cars: Americans might love big engines and American cars, but they go with a Toyota because it doesn't break...

Nvidia has had a lot of issues lately; I wonder if they can get the G300 out on time and bug-free.
 
It's already tomorrow :D
 
First thing: I don't think I'll go for a reference card.
 
I'm really loving the native HDMI support instead of using the converter that comes with the current-gen cards. I'm eager to see how well this card does... and the inevitable 5850 version when it's announced, as that's likely what I'll end up with if I buy a new-generation card.

My 4850 is still monster enough to do anything I want now, so there's no hurry, but with the native HDMI support, having the extra power wouldn't be a bad idea at all. :)

Kei
 
...Why is it so hideous? That is some of the cheapest-looking plastic I've ever seen on a PC part. Those red vents look like shit you'd see on a Happy Meal in the 90s. Even without them, the quality of the finish on the rest of it is so absurdly sub-par. I'm officially considering this card nonexistent until they fix the quality issue. Maybe glossing up the main body and removing the vents would require the least redesign effort.

I agree it looks hideous and like a happy meal toy from the 90s.

BUT, some people like happy meal toys... and some people will like this hideous look.
 
I agree it looks hideous and like a happy meal toy from the 90s.

BUT, some people like happy meal toys... and some people will like this hideous look.

Actually, the plastic looks about the same as our prototypes that have been created with a 3D printer. So don't expect this to be final.
 
Wow, that looks very powerful for $299. That thing runs hot; ATI would never use such a cooler unless it's really needed :rockout:

I'll still believe $299 when I see it, especially if it is 50% faster than a 4890 (I really hope it is, though :rockout:), but the cooling seems weak to me. Just look at the size of the fan: it's small, and with the hot air from the VRMs and memory having to travel up the length of that card and out the back, I'd imagine that unless the fan is on leaf-blower mode it's gonna get warm in there! Why on earth can't both sides put a low-RPM angled 80mm fan in there for better airflow and quietness? The design looks pretty minimalistic though, and I like that... "flash" is so nineties :)
 
Yeah, I remember seeing leaked pics that looked exactly like that, and we all denied it. It looks fine to me; could be better, but eh, idc, I just want it :). Though I thought it was going to be $299. I think if they priced it at $349 for the 5870 and $279 for the 5850 it'd be perfect. But idc lol, I'm getting one as soon as it's released :D
 
Yeah, I remember seeing leaked pics that looked exactly like that, and we all denied it. It looks fine to me; could be better, but eh, idc, I just want it :). Though I thought it was going to be $299. I think if they priced it at $349 for the 5870 and $279 for the 5850 it'd be perfect. But idc lol, I'm getting one as soon as it's released :D
The thing is, though, I don't see how that pricing is aggressive. :shadedshu
A $299 5870 and a $199 5850 would simply kick nVidia's ass and nuts up and down the stairs. :rockout:
 
Can someone please explain to me why those holes at the end of the card are meant to be intakes?

As I understand it, those fans work by drawing air in from the top and pushing it radially out in all directions (physical barriers can be used to direct that flow). So to me, those holes at the end seem more likely to be exhaust than intake, which would also explain the small exhaust out the back.
The problem with this idea, though, is that the air venting out of these holes into the case isn't going to cool much on the board; maybe the power circuits, though?

Feel free to correct me on that though.
 
Would it blow your mind if I suggested the fan ran... Backwards!? =O

Anyway, I am largely against the high-end coolers made by both ATi and nVidia. Arctic Cooling's excuse (when they stopped manufacturing them to exhaust out the back, their primary selling point for the longest time) had to do with the air pressure difference between the inside of the case and the outside. It's certainly a thought, to be sure, though I'm pretty sure few (if any) people on this forum (including myself) are qualified to do much more than speculate on it.

Regardless though, death to those fans!! Why aren't the blades longer, anyway? Especially if they're going to utilize the side (or better yet the bottom) for intake, they don't need to leave that much space between the blades and the bearing.

Most cases have a shit design and use negative air pressure; every moron and his dog gets an erection over cases with huge 200mm exhaust fans.

In a case with more air going out than coming in, the case would suck air in through the same hole the GPU fan was trying to blow out of, effectively making the card run hotter, or just sucking the hot air back in on either side of the card.
 
I think I'll stick with my 4870X2 for now. As for looks, who cares as long as it performs? Personally, I'd like to take all the cooling off, run it on water, and then overclock it.
 
If those 1600 SPs are in 80 groups of 20, like the 800 SPs of the 4870/90 were in 80 groups of 10, then my crystal ball says RV870 will only be 25-40% faster than RV770. In highly optimized games it could be higher.

The problem is that ATI drivers had a hard time loading up all those groups of 10; imagine how difficult it will be with groups of 20. So older/current games will only feel a little tingling sensation and no real performance boost. Whatever.

So one 4870X2 will be better than one 5870. Kinda sad, really.

My crystal ball has been known to be wrong, but come tomorrow I will not need it for this particular problem.
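For what it's worth, the crystal-ball math above can be sketched in a few lines. This is a pure back-of-envelope toy, assuming RV870 doubles RV770's SP count and that drivers only keep some fraction of the extra shaders busy; the 25-40% utilization figures are the post's speculation, not measured data:

```python
# Back-of-envelope speedup estimate for RV870 vs RV770.
# Hypothetical model: doubling shaders only helps in proportion
# to how much of the extra hardware the driver can keep busy.

def scaled_speedup(sp_ratio, utilization):
    """Effective speedup when only a fraction of the extra SPs does useful work."""
    return 1 + (sp_ratio - 1) * utilization

sp_ratio = 1600 / 800  # RV870 has 2x the SPs of RV770

low = scaled_speedup(sp_ratio, 0.25)   # pessimistic utilization -> 1.25x
high = scaled_speedup(sp_ratio, 0.40)  # optimistic utilization  -> 1.40x
print(f"Estimated speedup: {low:.2f}x to {high:.2f}x over RV770")
```

With perfect utilization (1.0) the same formula gives the full 2x, which is roughly the "highly optimized games" case the post mentions.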
 
Sweet, but I'm still sticking with my GTX 200s... not unless DX11 titles are out the same day that card comes out. Otherwise, it's a waste of money and effort like those DX10.1 cards.
 