Monday, January 28th 2008

NVIDIA GeForce 9800 GX2 Dismantled

Pictures of the NVIDIA GeForce 9800 GX2 dual card have popped up online again, showing some exclusive details not seen before. More pictures can be found here.

Source: CHIPHELL

84 Comments on NVIDIA GeForce 9800 GX2 Dismantled

#1
PVTCaboose1337
Graphical Hacker
newtekie1 said:
Read the review on the 3870X2 here at TPU. One of the cores ran at 65°C under load, and the other ran at 80°C.
Ya the aluminum HS was on one core, and the copper HS on the other core... weird cooler design.
#2
Mussels
Moderprator
PVTCaboose1337 said:
Ya the aluminum HS was on one core, and the copper HS on the other core... weird cooler design.
it was designed so that the first GPU's heat wasn't entirely dumped into the second one. Didn't work so well, as the heat is still really unbalanced.
#3
Hawk1
Mussels said:
it was designed so that the first GPU's heat wasn't entirely dumped into the second one. Didn't work so well, as the heat is still really unbalanced.
That's why I'm waiting to see how the ASUS/GeCube dual-fan versions do as far as cooling each core, and if it causes any other significant heat problems for the card (obviously the heat remaining in the case will be an issue). I can't wait to see the 9800 and the cooling for it (how stock performs and what aftermarket goodies come about). Will be very interesting.
#4
imperialreign
newtekie1 said:
Read the review on the 3870X2 here at TPU. One of the cores ran at 65°C under load, and the other ran at 80°C.
:twitch:

I musta completely missed that reading W1z's review. That's a ton of difference, there!

newtekie1 said:

And wasn't it Alienware that first went with a dual-card solution using nVidia GPUs (talking modern GPUs here)? That is the first I ever remember hearing about multiple-GPU setups, and nVidia soon released their SLI. I don't even remember Crossfire being mentioned until after SLI was already on the market. In fact, SLI was on the market in June of 2004, and Crossfire wasn't on the market until September of 2005, more than a year later.

I think you have your time lines and who created what to compete with who confused. Crossfire was developed to compete with nVidia's SLI. And it only recently reached the level of performance improvement that SLI gives. ATI just finally got Crossfire working as solidly as SLI.
You're right, Crossfire was designed to compete with SLI, I never said otherwise; but nVidia didn't get on the ball with their technology until rumors were out as to what ATI was up to - but nVidia did not pioneer SLI; 3DFX did. nVidia originally acquired the technology when they bought 3DFX back in late 2000. They also acquired multi-GPU-per-PCB and multi-GPU/PCB + SLI designs, as 3DFX was also the company that pioneered those in their quest for supreme performance domination (the VooDoo5 6000 - which was never released - was to have 4 GPUs on one PCB, and was to have come with its own power supply: http://www.x86-secret.com/articles/divers/v5-6000/v56kgb-6.htm . . . actually, if you get a chance, check that whole site from page 1, lots of interesting info there!). But, like I said, after the acquisition, nVidia didn't re-introduce SLI until '04. ATI released Crossfire a year later, in '05 - not to get fanboyish here, but look who has come the furthest. nVidia acquired the technology and expanded on it; ATI designed theirs from the ground up.
#5
Mussels
Moderprator
Crossfire was designed to beat SLI, and failed miserably at the start. However, once they moved to an internal bridge like SLI, they finally got it right and have been equal to SLI since - the performance gain is higher than SLI's for whatever reason, but it seems to be compatible with fewer games, too (especially DX10 titles).

Both of them have potential; I like where ATI is heading with Fusion and CrossfireX
#6
imperialreign
Mussels said:
Crossfire was designed to beat SLI, and failed miserably at the start. However, once they moved to an internal bridge like SLI, they finally got it right and have been equal to SLI since - the performance gain is higher than SLI's for whatever reason, but it seems to be compatible with fewer games, too (especially DX10 titles).

Both of them have potential; I like where ATI is heading with Fusion and CrossfireX
the initial implementations were a little . . . bulky. That external dongle wasn't that great an idea, and the need for a master/slave setup was a little odd, too.

Not sure 100% about performance gains, though. At this point I think it's 50/50 - TBH, I think it also comes down to game devs, too - look at the *amazing* Crossfire performance increase everyone saw with the Crysis 1.1 patch :wtf:
#7
btarunr
Editor & Senior Moderator
The sole reason behind Crossfire > SLI is this:

The northbridge, be it the AMD 580X, 790FX, or the Intel X38, supplies all 32 PCI-E lanes to the video cards, which eases inter-GPU communication compared to the nForce 590 SLI and 680i SLI, where the northbridge and southbridge each supply a video card with 16 lanes independently and the HyperTransport bus between the two chips becomes relatively congested during multi-GPU rendering. The same factor partly brings down the efficiency of Crossfire setups on Intel P35-based boards, where the second video card not only gets just 4 PCI-E lanes, but those 4 lanes also come from the southbridge.
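A rough back-of-the-envelope sketch of why the lane split matters. This assumes the PCIe 1.x spec figure of 250 MB/s per lane per direction (the generation these chipsets use); the slot configurations are the ones described above, and the chipset labels are just illustrative.

```python
# Theoretical one-way PCIe 1.x slot bandwidth for the lane layouts
# described above. 250 MB/s per lane per direction is the PCIe 1.x
# spec figure; chipset groupings follow the post, not a datasheet.

PCIE1_MB_PER_LANE = 250  # MB/s per lane, per direction, PCIe 1.x

def slot_bandwidth(lanes: int) -> int:
    """Theoretical one-way bandwidth in MB/s for a slot with `lanes` lanes."""
    return lanes * PCIE1_MB_PER_LANE

# 790FX/X38 style: both cards get x16 from the northbridge
northbridge_slots = [16, 16]
# P35 style: x16 from the northbridge, x4 hanging off the southbridge
p35_slots = [16, 4]

for name, slots in [("790FX/X38", northbridge_slots), ("P35", p35_slots)]:
    per_slot = " + ".join(f"x{l}={slot_bandwidth(l)} MB/s" for l in slots)
    total = sum(slot_bandwidth(l) for l in slots)
    print(f"{name}: {per_slot} -> {total} MB/s aggregate")
```

The raw numbers (8000 MB/s aggregate vs. 5000 MB/s) understate the problem, since on the nForce and P35 layouts part of that traffic also has to cross the inter-chip link, which the post identifies as the real bottleneck.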
#8
newtekie1
Semi-Retired Folder
imperialreign said:
You're right, Crossfire was designed to compete with SLI, I never said otherwise; but nVidia didn't get on the ball with their technology until rumors were out as to what ATI was up to - but nVidia did not pioneer SLI; 3DFX did. nVidia originally acquired the technology when they bought 3DFX back in late 2000. They also acquired multi-GPU-per-PCB and multi-GPU/PCB + SLI designs, as 3DFX was also the company that pioneered those in their quest for supreme performance domination (the VooDoo5 6000 - which was never released - was to have 4 GPUs on one PCB, and was to have come with its own power supply: http://www.x86-secret.com/articles/divers/v5-6000/v56kgb-6.htm . . . actually, if you get a chance, check that whole site from page 1, lots of interesting info there!). But, like I said, after the acquisition, nVidia didn't re-introduce SLI until '04. ATI released Crossfire a year later, in '05 - not to get fanboyish here, but look who has come the furthest. nVidia acquired the technology and expanded on it; ATI designed theirs from the ground up.
nVidia didn't bring SLI out of mothballs because ATi was working on Crossfire; it's the other way around - ATi started developing Crossfire because nVidia was bringing SLI out of mothballs. And really, to say that the 3DFX SLI had anything to do with nVidia's current SLI is kind of off. The only thing the two share is the name and the concept; other than that, they are totally different. nVidia built their current SLI from the ground up, they simply took the concept from 3DFX. Comparing who came the furthest isn't really worth anything. Obviously ATi has come the furthest with the technology, because Crossfire was a piece of crap when it was released - they had the furthest to go to be competitive. SLI was a lot better when it was released, so nVidia hasn't needed to come as far.
#9
Xaser04
Hawk1 said:
That's why I'm waiting to see how the ASUS/GeCube dual-fan versions do as far as cooling each core, and if it causes any other significant heat problems for the card (obviously the heat remaining in the case will be an issue). I can't wait to see the 9800 and the cooling for it (how stock performs and what aftermarket goodies come about). Will be very interesting.
It should be interesting to read a review of the Asus card, as they have added an extra two DVI ports to it, which, whilst good on one hand, is completely dumb on the other, as it blocks up the exhaust vent. This, whilst not such an issue on a single-GPU card, could be more of a concern on a dual-GPU card, especially if the case cooling can't get rid of the extra heat quickly enough.