
ATI R680 PCB Design and Cooling Early Pics

Where's the mem on the card? :p


All sixteen memory chips (8 on the front and 8 on the back) are missing from the board, probably because it's at an early development stage (this is not a finished product).

Read, folks, don't just look at the pretty pix0rz.
 
This killed 3dfx.

I think only partially.... Mostly it was Nvidia and ATI that killed 3dfx, that and 3dfx thought it was a good idea to make THEIR OWN 3D API.

Why not just make a dual-core GPU? Why bother with these huge boards? Maybe they should make a separate add-in card to hold the voltage regulators and MOSFETs (joking)? :confused::laugh:

And don't a lot of the 8800 series cards have 16 ROPs, just like the HD2900 and 3850/70? Yet Nvidia still pulls ahead with the same number of ROPs... I don't think it's entirely the ROPs' fault; it's gotta be something else.
 
If the price is good, then it's great for AMD/ATI, and it could lower the price of current cards. But if it's not cheap, why not just go the CrossFire way?
 
I think only partially.... Mostly it was Nvidia and ATI that killed 3dfx, that and 3dfx thought it was a good idea to make THEIR OWN 3D API.

Why not just make a dual-core GPU? Why bother with these huge boards? Maybe they should make a separate add-in card to hold the voltage regulators and MOSFETs (joking)? :confused::laugh:

And don't a lot of the 8800 series cards have 16 ROPs, just like the HD2900 and 3850/70? Yet Nvidia still pulls ahead with the same number of ROPs... I don't think it's entirely the ROPs' fault; it's gotta be something else.

Read this
 
ATI is the new 3dfx

V5onblack.jpg


This killed 3dfx.
I have one of those in my closet. The Voodoo5 5500 was a badass card. So even if ATI does go down, at least they will go out with a bang.
 
The 7950 GX2 did not kill Nvidia, so why would this card bury AMD/ATI?

To me this seems much more elegant than Nvidia's Frankenstein (the 7950 GX2), two cards strapped together as one with some MacGyver tape.
 
The 7950 GX2 did not kill Nvidia, so why would this card bury AMD/ATI?

To me this seems much more elegant than Nvidia's Frankenstein (the 7950 GX2), two cards strapped together as one with some MacGyver tape.

:laugh::roll::laugh:

Because at that time, when ATI was dominating the market with the X1900 and X1950, Nvidia was busy working on the G80 because DX10 was around the corner, and it paid off. Nvidia was well prepared for DX10, unlike ATI, which rolled out its first DX10 offering months after Nvidia. Even today, Nvidia is two steps ahead of ATI.

BTW, the 7950 GX2 is the FASTEST DX9 card. Period. It was just a makeshift card to tackle the X1950 XTX and keep consumers' attention.
 
Question is, could they run four of these in CrossFireX? Imagine harnessing the power of 8 GPUs! I'm sure support for something like that is far off, but if it were properly usable, that would be quite a performance feat.

Looks interesting for sure, but I'm waiting to see what the actual product is capable of.

:toast:


A spider has eight legs; the implications of the name aren't terribly subtle.
 
7950GX2 did not kill Nvidia, so why would this card bury AMD/ATI ?

To me this seems much more elegant than Nvidias Frankenstein (7950GX2) with two cards strap up as one with some Macgyver tape.

Nvidia wasn't worth under 5 billion at that time. Remember, AMD/ATI is the underdog; they don't sell as much as Intel/Nvidia. Intel owns like 80% or more of the market even when AMD was better, and Nvidia pretty much does the same with its marketing etc.

AMD has been in a tight situation; they're worth less than what they paid for ATI.


I think the Spider name might have to do with quad-core CPUs and quad GPUs.
 
Well, there's also the issue ATi fans have to deal with: "The Way It's Meant to be Played." Since the Green Giant has more money than ATi, they can easily pay developers to "harness" bits of their cards, causing a performance differential. Fuck fairness, of course; it's about who has the money. I'm not a fanboy, and I don't care which card I use, but I'm also highly averse to the notion of developers showing any degree of favoritism in the development process of games.
 
Nvidia's developer relations played a key role in grabbing its market share. It knows that people read benchmarks before they buy a card, so if it fools around with a few numbers like FPS or 3DMarks, it's got the market. C'mon, 320 stream processors, 740 MHz core, 512-bit memory... all of that should've translated into something. Poor HD2900 XT.
 
Nvidia's developer relations played a key role in grabbing its market share. It knows that people read benchmarks before they buy a card, so if it fools around with a few numbers like FPS or 3DMarks, it's got the market. C'mon, 320 stream processors, 740 MHz core, 512-bit memory... all of that should've translated into something. Poor HD2900 XT.

read this
 
So how much power consumption are we talking about, even with their new power-saving design? I can't help but think that this card will wind up like the 2900 XT: great performance, but overpriced and underwhelming for its specs.
 
651px-Rage_fury_maxx_board.jpg


Ah, but this didn't kill ATI (it did flop, however).
 
Fingers crossed this eats the 8800 GTX; dunno how it'll fit in this LanBox Lite though :laugh:
 
Back to 2900 XT heat and power levels, I see. At least it should be faster than what Nvidia has out now. But maybe when they roll out the 9*00 series it won't be king of the hill anymore... hope this is just a stopgap for ATI.
 
If y'all are that desperate for a card to beat the 8800 GTX... sad... especially considering this is a dual-GPU card, while the 8800 GTX has been out over a year and has a single GPU...

ATi is screwed...
 

I don't really get what you want to point out with the link, except how different those architectures are from each other. First, the link is about the R600, and it is compared to the G80 there. I'm sure he was talking about the G92. The HD2000 and HD3000 series have a higher pixel fillrate than the G92:
HD2900 = 11888 MP/s
HD3870 = 12400 MP/s
HD3850 = 10700 MP/s
8800GT = 9600 MP/s
The article at AnandTech even suggests that AMD's ROP architecture is more balanced, to say the least, and never worse:
If we compare this setup with G80, we're not as worried as we are about texture capability. G80 can complete 24 pixels per clock (4 pixels per ROP with six ROPs). Like R600, G80 is capable of 2x Z-only performance with 48 Z/stencil operations per clock with AA enabled. When AA is disabled, the hardware is capable of 192 Z-only samples per clock. The ratio of running threads to ROPs is actually worse on G80 than on R600. At the same time, G80 does offer a higher overall fill rate based on potential pixels per clock and clock speed.
So if ROP capacity were the problem, the G92 would never perform better than the Radeons. Back when they wrote the article, someone could have concluded that even if the ROPs on the G80 are not as efficient, it has more of them and thus performed better. That's not the case, though, as the G92 proves otherwise.
In the end, he is right when he says it's gotta be something else. Whether it is texturing power or shading power is really difficult to say, since on the G80/G92 the SPs and TUs are tied together, and at the same time doubling the texture addressing power in the G92 didn't bring any significant improvement over the G80.
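Those fillrate figures fall straight out of core clock times ROP count. A minimal sketch of the arithmetic (the clock and ROP numbers below are the commonly quoted specs for these cards, so treat them as approximate):

```python
# Peak pixel fillrate in megapixels per second, assuming one pixel
# per ROP per clock: fillrate (MP/s) = core clock (MHz) x ROP count.

def pixel_fillrate_mps(core_mhz: float, rops: int) -> float:
    """Theoretical peak pixel fillrate in MP/s."""
    return core_mhz * rops

# (core clock in MHz, ROP count) -- commonly quoted specs
cards = {
    "HD2900 XT": (743, 16),
    "HD3870":    (775, 16),
    "HD3850":    (669, 16),
    "8800 GT":   (600, 16),
}

for name, (clock, rops) in cards.items():
    print(f"{name}: {pixel_fillrate_mps(clock, rops):.0f} MP/s")
```

With all four cards at 16 ROPs, the fillrate ranking is decided purely by core clock, which is why the 600 MHz 8800 GT comes out lowest on this metric despite winning in games.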
 

EDIT: I have just realized you could have been trying to express exactly what I said instead of being contrary to his opinion. Sorry if that's the case. :o:o:o:o:o
 
If y'all are that desperate for a card to beat the 8800 GTX... sad... especially considering this is a dual-GPU card, while the 8800 GTX has been out over a year and has a single GPU...

ATi is screwed...

If what you say is true, then it's a sad day for the video card market.

I switched over to nVIDIA a year ago, but I want to see AMD/ATI survive.
 
Well, there's also the issue ATi fans have to deal with: "The Way It's Meant to be Played." Since the Green Giant has more money than ATi, they can easily pay developers to "harness" bits of their cards, causing a performance differential. Fuck fairness, of course; it's about who has the money. I'm not a fanboy, and I don't care which card I use, but I'm also highly averse to the notion of developers showing any degree of favoritism in the development process of games.

I won't go so far as to say Nvidia doesn't have more money, because I don't know. But the links below suggest the picture should be otherwise: not only has ATI's annual revenue been higher over the last 4 years, but so has its net income over the last two, and the revenue gap is much larger. If Nvidia does have more cash, I would say it was ATI's own fault.

http://www.marketwatch.com/tools/quotes/financials.asp?symb=ATI&sid=160919&report=1&freq=1
http://www.marketwatch.com/tools/quotes/financials.asp?symb=NVDA

Also, don't forget that ATI was in the game 8 years before Nvidia. From this perspective, who is the giant?
 
As much as I would like to see AMD put up a fight against Intel and Nvidia, they are not doing much to put one up. AMD messed up when they bought ATi: not only were they reeling at the time, but they are dragging down a good company. I used to be an ATi fanboy, but now that I have an Nvidia product, I like it, and it has more upside. AMD is going to go bankrupt; the Phenom is not going to be as great as they expect, and Intel is destroying them in sales. You could say the Core 2 Duo architecture was the best thing to happen to Intel.

AMD is going to go down, and prices are going to skyrocket in the processor war. But once Intel brings in its GPUs, it will be a good fight between them and Nvidia.
 
651px-Rage_fury_maxx_board.jpg


ah, but this didnt kill ati (it did flop however)


IIRC, that was a desperate response to 3dfx's Voodoo5 series


I think only partially.... Mostly it was Nvidia and ATI that killed 3dfx, that and 3dfx thought it was a good idea to make THEIR OWN 3D API.

Yeah, but the Glide API was superior to standard OpenGL, and their cards handled OpenGL applications much faster than Nvidia's or ATI's cards of the time.




TBH, with all this leaked info coming out of ATI (which is extremely unusual, but has become more and more common over the last year), who's to say they're not going for some sleight of hand? Y'know, leaking info on stuff supposedly "in development" to get their competition to look the other way while they develop the real ass-kicker behind lock and key? ATI has done it before: releasing a sub-par "flagship" product while they get the hardware ironed out for the big dog to be taken off the chain. For example, look at the X1800 series and how Nvidia reacted to it while ATI prepared the X1900 series for launch.
 
Bullseye!

Remember, before the HD2900 XT came out, there were pictures of some long prototype cards which people thought would be launched as ATI's answer to the 8800 GTX?

It was the RV680.

ATI has this habit of devising certain things in advance and testing the market before bringing out a shocker.

After the 7800 GTX, it was the X1800 XT (a flop). Then they came up with the X1900 XTX (which beat the 7800 GTX); later the 7900 GTX was launched and was countered by the X1950 XTX and Pro.

So there's a pattern, and speculators like us should look for clues.
 
:laugh::roll::laugh: Nvidia was well prepared for DX10, unlike ATi

They were so prepared that they couldn't even get proper GPU virtualization to work; they barely got anything working without crying to Microsoft to get DX10 nerfed so they were back in the game. ATI's architecture is waaay ahead of NV's; driver support may be another issue, resulting in a knee-capped card.

NV two steps ahead?? I think not...
 