Sunday, December 16th 2007

ATI R680 PCB Design and Cooling Early Pics

Here are some leaked pictures of the ATI R680, courtesy of ChipHell. What you'll see on the PCB are two RV670XT GPUs and one PLX chip handling communication between the two cores. All sixteen memory chips (eight on the front and eight on the back) are missing from the board, probably because this is an early development stage (it's not a finished product). The source said the card uses 0.7 ns GDDR4 memory.
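As a rough guide, a memory chip's nanosecond rating is its cycle time, so the rated clock is simply its reciprocal; since GDDR4 is double-data-rate, the effective data rate is twice that. A quick sketch of the arithmetic (approximate, and assuming the 0.7 ns figure is the cycle-time rating):

```python
# Convert a DRAM ns rating to a clock speed: rated clock = 1 / cycle time.
# GDDR4 transfers data on both clock edges, so effective rate = 2x clock.
cycle_time_ns = 0.7
clock_mhz = 1_000 / cycle_time_ns       # 1000 MHz per 1 ns of cycle time
effective_mhz = 2 * clock_mhz           # DDR: two transfers per clock
print(f"~{clock_mhz:.0f} MHz clock, ~{effective_mhz:.0f} MHz effective")
# -> ~1429 MHz clock, ~2857 MHz effective
```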
Source: ChipHell
Add your own comment

77 Comments on ATI R680 PCB Design and Cooling Early Pics

#1
Basard
by: wazzledoozle
This killed 3dfx.
I think partially.... Mostly it was Nvidia and ATI that killed 3dfx; that, and 3dfx thought it was a good idea to make THEIR OWN 3D API.

Why not just make a dual-core GPU? Why bother with these huge boards? Maybe they should make a separate add-in card to hold the voltage regulators and MOSFETs (joking)? :confused::laugh:

And don't a lot of the 8800 series have 16 ROPs, just like the HD2900 and 3850/70? Yet Nvidia still pulls ahead with the same number of ROPs... I don't think it's entirely the ROPs' fault; it's gotta be something else.
Posted on Reply
#2
regan1985
If the price is good then it's great for AMD/ATI, and it could lower the price of current cards, but if it's not cheap, why not just go the CrossFire way?
Posted on Reply
#3
[I.R.A]_FBi
by: Basard
I think partially.... Mostly it was Nvidia and ATI that killed 3dfx; that, and 3dfx thought it was a good idea to make THEIR OWN 3D API.

Why not just make a dual-core GPU? Why bother with these huge boards? Maybe they should make a separate add-in card to hold the voltage regulators and MOSFETs (joking)? :confused::laugh:

And don't a lot of the 8800 series have 16 ROPs, just like the HD2900 and 3850/70? Yet Nvidia still pulls ahead with the same number of ROPs... I don't think it's entirely the ROPs' fault; it's gotta be something else.
Read this
Posted on Reply
#4
jaystein
by: wazzledoozle
ATI is the new 3dfx



This killed 3dfx.
I have one of those in my closet. The Voodoo5 5500 was a badass card. So even if ATI does go down, at least they'll go out with a bang.
Posted on Reply
#5
sam0t
The 7950 GX2 did not kill Nvidia, so why would this card bury AMD/ATI?

To me this seems much more elegant than Nvidia's Frankenstein (7950 GX2), with two cards strapped together as one with some MacGyver tape.
Posted on Reply
#6
btarunr
Editor & Senior Moderator
by: sam0t
The 7950 GX2 did not kill Nvidia, so why would this card bury AMD/ATI?

To me this seems much more elegant than Nvidia's Frankenstein (7950 GX2), with two cards strapped together as one with some MacGyver tape.
:laugh::roll::laugh:

Because at that time, when ATI was dominating the market with the X1900 and X1950, NVidia was busy working on the G80 because DX10 was around the corner, and it paid off. NVidia was well prepared for DX10, unlike ATi, which rolled out its first DX10 offering months after NV. Even today, NV is two steps ahead of ATi.

BTW, the 7950 GX2 is the FASTEST DX9 card. Period. It was just a makeshift card to tackle the X1950 XTX and keep consumers' attention.
Posted on Reply
#7
prophylactic
by: Kursah
Question is, could they run 4 of these in CrossfireX? Imagine harnessing the power of 8 GPUs! I'm sure support for something like that is far off and distant, but I'm pretty sure if properly usable, that would be quite a performance feat.

Looks interesting for sure, but I'm waiting to see what the actual product is capable of.

:toast:
A spider has eight legs. The implications of the name aren't terribly subtle.
Posted on Reply
#8
Chewy
by: sam0t
The 7950 GX2 did not kill Nvidia, so why would this card bury AMD/ATI?

To me this seems much more elegant than Nvidia's Frankenstein (7950 GX2), with two cards strapped together as one with some MacGyver tape.
NV wasn't worth under 5 billion at that time. Remember, AMD/ATI is the underdog; they don't sell as much as Intel/Nvidia. Intel owns like 80% or more of the market even when AMD was better, and NV pretty much does the same with its marketing etc.

AMD has been in a tight situation; they're worth less than what they paid for ATI.

I think the Spider might have to do with quad-core CPUs and quad GPUs.
Posted on Reply
#9
prophylactic
Well, there's also the issue ATi fans have to deal with: "The Way It's Meant to be Played." That is to say, since the Green Giant has more money than ATi, they can easily pay developers to "harness" bits of their cards, causing a performance differential in this regard. Fuck fairness, of course; it's about who has money. I'm not a fanboy, and I don't care which card I use, but I'm also highly averse to the notion of developers expressing any degree of favoritism in the development process of games.
Posted on Reply
#10
btarunr
Editor & Senior Moderator
NVidia's developer relations played a key role in grabbing its market share. It knows that people read benchmarks before they buy a card, so if they fool around with a few numbers like FPS or 3DMarks, they've got the market... c'mon, 320 stream processors, 740 MHz core, 512-bit memory... all of that should've translated into something... poor HD2900 XT.
Posted on Reply
#11
[I.R.A]_FBi
by: btarunr
NVidia's developer relations played a key role in grabbing its market share. It knows that people read benchmarks before they buy a card, so if they fool around with a few numbers like FPS or 3DMarks, they've got the market... c'mon, 320 stream processors, 740 MHz core, 512-bit memory... all of that should've translated into something... poor HD2900 XT.
read this
Posted on Reply
#12
acperience7
So how much power consumption are we talking about, even with their new power-saving design? I can't help but think this card will wind up like the 2900 XT: highly priced, decent performance, but underwhelming for its specs and price.
Posted on Reply
#13
Grings


Ah, but this didn't kill ATI (it did flop, however).
Posted on Reply
#14
rhythmeister
Fingers crossed this eats the 8800 GTX; dunno how it'll fit in this LanBox Lite though :laugh:
Posted on Reply
#15
kwchang007
Back to 2900 XT heat and power levels, I see. At least it should be faster than what Nvidia has out now. But maybe when they roll out the 9x00 series it won't be king of the hill anymore... hopefully this is just a stopgap for ATI.
Posted on Reply
#16
TXcharger
If y'all are that desperate for a card to beat the 8800 GTX... sad... especially considering this is a dual-GPU card and the 8800 GTX has been out over a year with a single GPU...

ATi is screwed...
Posted on Reply
#17
DarkMatter
by: [I.R.A]_FBi
Read this
I don't really get what you want to point out with the link, except how different those architectures are from each other. First, the link is about the R600, and it's compared to the G80 there. I'm sure he was talking about the G92. The HD2000 and HD3000 series have a higher pixel fillrate than the G92:
HD2900 = 11888 MP/s
HD3870 = 12400 MP/s
HD3850 = 10700 MP/s
8800GT = 9600 MP/s
The article at AnandTech even suggests that AMD's ROP architecture is more balanced, to say the least, and never worse:
If we compare this setup with G80, we're not as worried as we are about texture capability. G80 can complete 24 pixels per clock (4 pixels per ROP with six ROPs). Like R600, G80 is capable of 2x Z-only performance with 48 Z/stencil operations per clock with AA enabled. When AA is disabled, the hardware is capable of 192 Z-only samples per clock. The ratio of running threads to ROPs is actually worse on G80 than on R600. At the same time, G80 does offer a higher overall fill rate based on potential pixels per clock and clock speed.
So if ROP capacity were the problem, the G92 would never outperform the Radeons. Back when the article was written, one could have concluded that even if the ROPs on the G80 are less efficient, it has more of them and thus performed better. That's not the case, though, as the G92 proves otherwise.
In the end, he is right when he says it's gotta be something else. Whether it's texturing power or shading power is really difficult to say, since on G80/G92 the SPs and TUs are tied together, and at the same time, doubling the texture addressing power in the G92 didn't bring any significant improvement over the G80.
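Those fillrate figures follow the simple rule pixel fillrate = ROPs × core clock. A quick sketch, assuming the commonly cited reference clocks (the clock values are my assumption, not from the post, so treat the exact numbers as approximate):

```python
# Pixel fillrate ~= ROP count * core clock (MHz) -> megapixels/s.
# All four cards have 16 ROPs; clocks are the usual reference-design values.
cards = {
    "HD2900 XT": (16, 743),
    "HD3870":    (16, 775),
    "HD3850":    (16, 670),
    "8800 GT":   (16, 600),
}
for name, (rops, clock_mhz) in cards.items():
    print(f"{name}: {rops * clock_mhz} MP/s")
```

This reproduces the HD2900 XT, HD3870, and 8800 GT figures above exactly; the HD3850 comes out at 10720 MP/s, close to the 10700 quoted.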
Posted on Reply
#18
DarkMatter
by: DarkMatter
I don't really get what you want to point out with the link, except how different those architectures are from each other. First, the link is about the R600, and it's compared to the G80 there. I'm sure he was talking about the G92. The HD2000 and HD3000 series have a higher pixel fillrate than the G92:
HD2900 = 11888 MP/s
HD3870 = 12400 MP/s
HD3850 = 10700 MP/s
8800GT = 9600 MP/s
The article at AnandTech even suggests that AMD's ROP architecture is more balanced, to say the least, and never worse:

So if ROP capacity were the problem, the G92 would never outperform the Radeons. Back when the article was written, one could have concluded that even if the ROPs on the G80 are less efficient, it has more of them and thus performed better. That's not the case, though, as the G92 proves otherwise.
In the end, he is right when he says it's gotta be something else. Whether it's texturing power or shading power is really difficult to say, since on G80/G92 the SPs and TUs are tied together, and at the same time, doubling the texture addressing power in the G92 didn't bring any significant improvement over the G80.
EDIT: I've just realized you could have been trying to express exactly what I said, rather than disagreeing with his opinion. Sorry if that's the case. :o
Posted on Reply
#19
jaystein
by: TXcharger
If y'all are that desperate for a card to beat the 8800 GTX... sad... especially considering this is a dual-GPU card and the 8800 GTX has been out over a year with a single GPU...

ATi is screwed...
If what you say is true, then it's a sad day for the video card market.

I switched over to nVIDIA a year ago, but I want to see AMD/ATI survive.
Posted on Reply
#20
DarkMatter
by: prophylactic
Well, there's also the issue ATi fans have to deal with: "The Way It's Meant to be Played." That is to say, since the Green Giant has more money than ATi, they can easily pay developers to "harness" bits of their cards, causing a performance differential in this regard. Fuck fairness, of course; it's about who has money. I'm not a fanboy, and I don't care which card I use, but I'm also highly averse to the notion of developers expressing any degree of favoritism in the development process of games.
I won't go so far as to say Nvidia doesn't have more money, because I don't know. But the links below suggest the picture is otherwise. If Nvidia has more cash, I would say it was ATI's fault: not only has ATI's annual revenue been higher over the last four years, but so has their net income over the last two. And the revenue is so much higher...

http://www.marketwatch.com/tools/quotes/financials.asp?symb=ATI&sid=160919&report=1&freq=1
http://www.marketwatch.com/tools/quotes/financials.asp?symb=NVDA

Also, don't forget that ATI has been in the game eight years longer than Nvidia. Looking at it from this perspective, who is the giant?
Posted on Reply
#21
TXcharger
As much as I would like to see AMD put up a fight against Intel and Nvidia, they're not doing much to put one up. AMD messed up when they bought ATi: not only were they reeling at the time, but they're dragging down a good company. I used to be an ATi fanboy, but now that I have an Nvidia product, I like it, and it has more upside. AMD is going to go bankrupt; the Phenom is not going to be as great as they expect, and Intel is destroying them in sales... You could say the Core 2 Duo architecture was the best thing to happen to Intel.

AMD is going to go down, and prices are going to skyrocket in the processor war... But once Intel brings in their GPUs, it will be a good fight between them and Nvidia.
Posted on Reply
#22
imperialreign
by: Grings


ah, but this didnt kill ati (it did flop however)
IIRC, that was a desperate response to 3dfx's Voodoo 5000 series.
I think partially.... Mostly it was Nvidia and ATI that killed 3dfx; that, and 3dfx thought it was a good idea to make THEIR OWN 3D API.
Yeah, but the Glide API was superior to standard OpenGL, and their cards handled OpenGL applications much faster than nVidia's or ATI's cards of the time.




TBH, with all this leaked info coming out of ATI (which is extremely unusual, but has become more and more common over the last year), who's to say they're not going for some sleight of hand? Y'know, leaking info on stuff supposedly "in development" to get their competition to look the other way while they develop the real ass-kicker behind lock and key? ATI has done it before: releasing a sub-par "flagship" product while they get their hardware ironed out for the big dog to be taken off the chain. For example, look at the X1800 series and how nVidia reacted to it while ATI prepared the X1900 series for launch.
Posted on Reply
#23
btarunr
Editor & Senior Moderator
Bullseye!

Remember, before the HD2900 XT came out, there were pictures of some long prototype cards which people thought would be launched as ATI's answer to the 8800 GTX?

It was the RV680.

ATI has this mantra of devising certain things in advance and testing the market before bringing out a shocker.

After the 7800 GTX, it was the X1800 XT (flop). Then they came up with the X1900 XTX (beat the 7800 GTX); later the 7900 GTX was launched and was countered by the X1950 XTX and Pro.

So there's a pattern here, and speculators like us should look for clues.
Posted on Reply
#24
mandelore
by: btarunr
:laugh::roll::laugh:NVidia was well prepared for DX10 unlike ATi
They were so prepared that they couldn't even get proper GPU virtualization to work; they barely got anything working without crying to Microsoft to get DX10 nerfed so they were back in the game. ATI's architecture is waaay ahead of NV's; driver support may be another issue, resulting in a knee-capped card.

NV two steps ahead?? I think not...
Posted on Reply
#25
DarkMatter
by: TXcharger
As much as I would like to see AMD put up a fight against Intel and Nvidia, they're not doing much to put one up. AMD messed up when they bought ATi: not only were they reeling at the time, but they're dragging down a good company. I used to be an ATi fanboy, but now that I have an Nvidia product, I like it, and it has more upside. AMD is going to go bankrupt; the Phenom is not going to be as great as they expect, and Intel is destroying them in sales... You could say the Core 2 Duo architecture was the best thing to happen to Intel.

AMD is going to go down, and prices are going to skyrocket in the processor war... But once Intel brings in their GPUs, it will be a good fight between them and Nvidia.
I don't really think that the fall of AMD is in Intel's best interest. The better scenario for Intel is a weak but still alive AMD, never a monopoly, IMHO. I will explain.
In this kind of business, it's very easy to enter a market where there isn't any competition. Just by entering you could take over ~20% of the market share, provided the product is good enough; not better than the competition, just good. Samsung has followed this strategy many times, and I don't think I have to say they have succeeded.
On the other hand, trying to enter a market where there is already competition is very difficult. Remember XGI? Their cards were good, more or less on par with Nvidia's or ATI's (they had some driver issues, but which new card doesn't nowadays?), but they were new to the game, and there were already alternatives (the better yet more expensive cards in each segment), so they didn't get any market share. You could buy the better Radeons or the worse but cheaper Nvidias; in this game there wasn't a place for XGI.
I know it's not the best example, since the Volari had severe rendering issues in some games, but they could have had some market share in non-gaming PCs, for example.
If AMD goes down, someone will buy it; they're just not going to let it totally disappear. The buyer could be IBM or Samsung, for example. If either of those buys AMD, it could mean big trouble for Intel, since what AMD lacks, both of them have in excess: money.
Posted on Reply