Monday, January 28th 2008

NVIDIA GeForce 9800 GX2 Dismantled

Pictures of the NVIDIA GeForce 9800 GX2 dual card have popped up online again, showing some exclusive details never seen before. More pictures can be found here.
Source: CHIPHELL

84 Comments on NVIDIA GeForce 9800 GX2 Dismantled

#26
Hawk1
candle_86never heard of RMA refusal for a scratch
No, but if it's clearly been tampered with/opened up, you may get some issues - I think it's EVGA that's the only one that allows aftermarket cooling on their cards without voiding the warranty.
Posted on Reply
#28
TheLostSwede
News Editor
Well, this card seems to use some kind of ribbon cable to connect the two PCBs together, by the looks of this image - www.techpowerup.com/img/08-01-28/1-3.jpg
This suggests that this really is just a pair of cards in SLI using a single PCIe interface.
It seems like the only reason the second PCB has an eight-pin power connector is because it can't draw any power from the PCIe interface, as the ribbon cable doesn't allow for power between the two cards. Seems like a strange setup and a bit of a bodge job in all honesty.
It really looks like Nvidia likes to do "quick fixes" these days like the nForce 780i chipset...
Posted on Reply
#29
shoman24v
This card(s) will definitely be over $1000.
Posted on Reply
#30
Hawk1
shoman24vThis card(s) will definitely be over $1000.
LOL, I don't think they would get many buyers (maybe Candle;)) at that price. Even if it does a million FPS in Crysis on a 30" screen, it would not be a great seller.
Posted on Reply
#31
candle_86
TheLostSwedeWell, this card seems to use some kind of ribbon cable to connect the two PCBs together, by the looks of this image - www.techpowerup.com/img/08-01-28/1-3.jpg
This suggests that this really is just a pair of cards in SLI using a single PCIe interface.
It seems like the only reason the second PCB has an eight-pin power connector is because it can't draw any power from the PCIe interface, as the ribbon cable doesn't allow for power between the two cards. Seems like a strange setup and a bit of a bodge job in all honesty.
It really looks like Nvidia likes to do "quick fixes" these days like the nForce 780i chipset...
Actually it makes perfect sense how they did it; if they did it like ATI did, there would be length concerns. The 7950GX2 is built very similarly and sold decently well. Also, don't expect a $1000 card; expect, say, $500.
Posted on Reply
#32
AddSub
This card(s) will definitely be over $1000.
I wouldn't be surprised. Not that this card is actually worth that much. Or half that much, since multi-GPU solutions in general are feeble and unimpressive performers, if similar products in the past are any kind of indicator. But this is nvidia we are talking about, and they know how to harness and exploit the hype around their products. Right now, nvidia is the Antec of GPUs. In other words, actual performance has nothing to do with price.

Since ATI is pricing their 3870 X2's at $450, these GX2's will go for about $550-$700, depending on whether it's an OC/ultra/extreme/supa-dupa version or not.
Posted on Reply
#33
pentastar111
Hawk1Wonder if temps are going to be an issue with the cores sandwiched? Aftermarket cooler/waterblock makers will have their work cut out for them on this one (if even possible).
My thoughts exactly...Have fun cooling that m/f down...looks like a thermal nightmare.:wtf:
Posted on Reply
#34
pentastar111
I think this card/cards will perform well... provided the drivers are up to snuff (cough, cough) - anyone remember the 7950 fiasco? nVidia has had and does have some really good stuff out there... I have 3 of their cards: a 7600 GT (best bang for the buck ever) and 2 640MB 8800GTS's. All of these products have performed flawlessly for over a year now. I don't believe this X2 card will be one of them... Heat will probably be just one of the factors making this card destroy all fun within a 3 mile radius:twitch:... drivers will be the next, and the third of course is going to be price... While I don't think it will top a grand, I don't think $600 to $750 a card will be uncommon. I could be wrong... we shall see... right now those ATI's are looking pretty sweet... And that is good... more competition spurs more innovation and better products AND (cough, cough) lower prices:toast:
Posted on Reply
#35
PVTCaboose1337
Graphical Hacker
I'm thinking about $600 or they will have no market.
Posted on Reply
#36
WOutZoR
pentastar111My thoughts exactly...Have fun cooling that m/f down...looks like a thermal nightmare.:wtf:
Let's wait until ViperJohn gets his hands on the 9800 GX2 :D

He's the first to watercool the 7900 GX2, remember?
Posted on Reply
#37
erocker
*
WOutZoRLet's wait until ViperJohn gets his hands on the 9800 GX2 :D

He's the first to watercool the 7900 GX2, remember?
Yeah, that's nice and all, but it's too bad the drivers weren't any good.
Posted on Reply
#38
tkpenalty
That cooler is just an 8800GT cooler multiplied by two and soldered together... I smell overheating. Unless these are the 8800GT G92s and underclocked, both of these cards will run around the 90°C mark, unless that fan is super loud.

I'd still prefer the HD3870X2, a far easier option if you want aftermarket cooling. Yes, water cooling is possible with this one too, but it's not easy.



That definitely looks nice, however.

I can expect this card to perform somewhat slower than the HD3870 in some cases, with the disadvantage that SLI brings. Moreover, will our PCs detect this as one or two GPUs? If it's two... this card won't sell.
Posted on Reply
#39
imperialreign
TBH, to me this card looks like a last minute panic response to the 3870x2 that's been hyped up over the last month or two.

Just my opinion, unless nVidia gets their SLI on-par with ATI's Crossfire, nVidia is going to be out of their league in multi-GPU setups. This is going to become very interesting - I'm looking forward to seeing how this card performs when compared to ATI's new monstrosity.


I like how it looks, though. That casing around it is sweet.
Posted on Reply
#40
TooFast
lol, that's the best nvidia could do, slap two cards together. lol, let me guess, you need an SLI board to run it! And all that heat in the case. OMG
Posted on Reply
#41
Ripper3
This is what Nvidia does for a living. The 7950GX2 was their answer to the X1950XTX... couldn't get a decent card with a single GPU, so they stuck two together instead.
Seems they're doing the same with the 8800.
The cooling won't be a problem, methinks. The 65nm 8800 doesn't seem to be bothered by heat at all, so this cooling solution will still work. I'm sure that cooler might even get better temps than the stock 8800GT one, since that looks like a very big, very chunky fan. If the new bigger fan on the single-slot stock 8800GT cooler is anything to go by, this might be a slow-spinning fan, trying to keep the noise down and only just keeping the cores alive (the 60mm ran at extremely high speeds to keep the GPU at, say, 83°C and made a hell of a racket, while the newer 70mm model can keep the GPU a tiny bit cooler, at ~80°C, at much lower RPMs and hardly audible, seconding what the released material stated).
Replacing the cooling may be trouble, since the two GPUs are facing each other, and that ribbon cable connecting the PCBs doesn't look like it likes stretching (hell, looks flimsy enough that it might come out while putting on the stock cooler, much less after-market).
Watercooling with a single replacement block in the middle is definitely a possibility, and someone's probably already thinking it up.

I must say that the ATi arrangement for the 3870 X2 looks to be much better, in terms of simplicity, and in terms of having a tidier layout (no chance of misplacing that second PCB layer when doing maintenance). Yes, on air cooling the second GPU will be receiving the hotter air, but I'm unsure if it will make such a large difference. Watercooling, meanwhile, should be a piece of cake. Just get two separate waterblocks, like the Maze 5, along with two packs of RAM 'sinks and a low-profile chipset 'sink for the PCI-E switcher between the cores.
Posted on Reply
#42
Mussels
Freshwater Moderator
DaMultaToms has a pic with the cooler on

even if it's dual-PCB, that looks quite good and is still only two slots WITH the cooler.
Posted on Reply
#43
AddSub
I must say that the ATi arrangement for the 3870 X2 looks to be much better
Yup. X2 looks much more accessible as far as custom cooling solutions go. Heck, you could fix a water block on one GPU, and go passive with the other one. Not that it would be a good idea to do so, but it shows how much room you got to maneuver.

GX2 smells like desperation on part of nvidia. And the latest delay for the 9xxx lineup doesn’t help their image, an image they try so hard to keep up. Let’s face it, a large part of nvidia is in fact their “GeForce” branding, and in smaller part their “nVidia” branding. They could take some 10 year old Voodoo boards and rename them GeForce VoDo and people would eat em up. Certain type of people that is.

Anyways, this whole “delay” in order to work out some bugs, or whatever, seems like pure unfiltered guano to begin with. I mean, the only time a firm like nvidia would admit to having some serious bugs in their latest product lineup is when there is some bigger issue to hide, like, oh I don’t know, maybe performance issues?
Posted on Reply
#44
Unregistered
Hawk1Yeah, now that I have a second look at it, it may be easier to WC this thing compared to the 7950's. It looks like it may be difficult to take apart/put back together without undue damage/scratches to the outer shell (thinking for warranty purposes, if you have any issues with the card).
Well, if you're watercooling you'll be voiding the warranty anyways... I think that there will actually be some after-market waterblocks for these because of the simplicity of it.

Oh boy, I can't wait for this card. Since they're delaying because of driver issues, it makes me confident that this card will be pretty solid by its release. And I am very sure this card will be around $450 to $550. Those are very affordable prices for a card of this power.

Maybe I'm being over optimistic, but we'll see...

-Indybird
Posted on Edit | Reply
#45
imperialreign
AddSubYup. X2 looks much more accessible as far as custom cooling solutions go. Heck, you could fix a water block on one GPU, and go passive with the other one. Not that it would be a good idea to do so, but it shows how much room you got to maneuver.

GX2 smells like desperation on part of nvidia. And the latest delay for the 9xxx lineup doesn’t help their image, an image they try so hard to keep up. Let’s face it, a large part of nvidia is in fact their “GeForce” branding, and in smaller part their “nVidia” branding. They could take some 10 year old Voodoo boards and rename them GeForce VoDo and people would eat em up. Certain type of people that is.

Anyways, this whole “delay” in order to work out some bugs, or whatever, seems like pure unfiltered guano to begin with. I mean, the only time a firm like nvidia would admit to having some serious bugs in their latest product lineup is when there is some bigger issue to hide, like, oh I don’t know, maybe performance issues?
not so sure on the performance thing . . . but their kicker is staying ahead of ATI as far as performance goes.

The 8800 GTX/Ultra cards are going to be very, very hard for nVidia to top, performance-wise - you can only go so far with a processor architecture. The last few models of 8800s being pumped out varied more in the amount of DRAM than anything else, it seemed - almost like they were milking the architecture for all it's worth.

But this product is typical of nVidia, too: whenever they start to feel threatened by ATI, they literally slap something together and throw it out the door. TBH, this card setup really doesn't look like much thought was put into it - two PCBs, two GPUs, one cooler . . . :wtf:

If it wasn't for that pretty chassis enshrouding the two PCBs, it'd be one fugly VGA adapter - a very hack & slash approach.

I'm curious to see if they're actually cooking up a single PCB/dual GPU offering for the second revision 9800 GX2s or 9800 Ultras (if they go the "Ultra" route again).

It wouldn't surprise me, though, if we start seeing more jimmy-rigged looking setups as we get closer to a release of the secretive R700.
Posted on Reply
#46
phanbuey
I don't think you guys realise the performance advantage nVidia has had until now. And for how long... and how long the R600 got pushed back in the beginning. Let's face it, the 8800 series were the best... this X2 doesn't "dominate" by any means, only in a few games and at extremely high res... ATI still doesn't have all their garbage in one bag - and the fact that they recycled their R600 architecture from the HD 2900 to the HD 3800 is no different from what nvidia did with their "milking" of the G80 core... still, the X2, which is two new chips, gets beaten by NV cards that are over a year old now in some games - that should not happen, irrelevant of who wrote the game and blah blah. The VLIW architecture is just very hit and miss depending on the application, which ultimately makes the R600 cards unreliable performers.

In reality, I think the next gen is going to be ludicrously fast... ATi and Nvidia are both buying time with their X2 and GX2 cards.

I love ATI, and I think this X2 is amazing even with crap drivers, but don't forget that Nvidia was Forbes' company of the year out of EVERY industry - those guys are rolling in money. Just pray for ATI that the 9800GX2 is not twice as fast as the X2. Also, I thought the GX2 was announced a Looooong time ago... like months before the G92 core was even out.
Posted on Reply
#47
candle_86
TooFastlol, that's the best nvidia could do, slap two cards together. lol, let me guess, you need an SLI board to run it! And all that heat in the case. OMG
and all AMD could do was slap two GPUs on one PCB, make it longer, and have them share the air; both are good cards, stop being a hater.


As for performance, remember this isn't 8800GT SLI; this is the 8800GTS G92, so 2x128 SPs, meaning simply it will give the HD3870 a KO.
Posted on Reply
#48
candle_86
phanbueyI don't think you guys realise the performance advantage nVidia has had until now. And for how long... and how long the R600 got pushed back in the beginning. Let's face it, the 8800 series were the best... this X2 doesn't "dominate" by any means, only in a few games and at extremely high res... ATI still doesn't have all their garbage in one bag - and the fact that they recycled their R600 architecture from the HD 2900 to the HD 3800 is no different from what nvidia did with their "milking" of the G80 core... still, the X2, which is two new chips, gets beaten by NV cards that are over a year old now in some games - that should not happen, irrelevant of who wrote the game and blah blah. The VLIW architecture is just very hit and miss depending on the application, which ultimately makes the R600 cards unreliable performers.

In reality, I think the next gen is going to be ludicrously fast... ATi and Nvidia are both buying time with their X2 and GX2 cards.

I love ATI, and I think this X2 is amazing even with crap drivers, but don't forget that Nvidia was Forbes' company of the year out of EVERY industry - those guys are rolling in money. Just pray for ATI that the 9800GX2 is not twice as fast as the X2. Also, I thought the GX2 was announced a Looooong time ago... like months before the G92 core was even out.
That's because of the design of the chip: it has 5 groups of 64 shaders, while nvidia has 8 groups of 16 shaders, so nvidia can basically load the card better. If, say, you need 13 shader instructions, that leaves Nvidia 7 more units (112 shaders), but AMD cuts in and takes a whole 64-unit group to do it, leaving it with 4 groups left. AMD's could be used better, but they were so late that most game devs didn't optimize for the R600 GPU and thus access it the same way as the G80, which leaves R600 with a major disadvantage.
Posted on Reply
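The grouping arithmetic in the post above can be sketched in a few lines (purely an illustration using the poster's figures of 8 groups of 16 vs. 5 groups of 64; real G80/R600 scheduling is more involved, and `utilization` is a made-up helper, not any real API):

```python
import math

def utilization(work_items: int, group_width: int) -> float:
    """Fraction of the occupied shader units doing useful work when
    `work_items` independent items are issued onto fixed-width groups."""
    groups_busy = math.ceil(work_items / group_width)
    return work_items / (groups_busy * group_width)

# Poster's figures: nvidia as 8 groups of 16, AMD as 5 groups of 64.
# 13 work items tie up one 16-wide group on the narrow design (7 groups free),
# but a whole 64-wide group on the wide design (4 groups free).
print(utilization(13, 16))  # 0.8125 -> the busy group is ~81% utilized
print(utilization(13, 64))  # 0.203125 -> the busy group is only ~20% utilized
```

The wider the group, the more work per batch is needed to keep it full, which is roughly the "hit and miss depending on the application" behaviour described a few posts up.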
#49
TooFast
candle_86and all AMD could do was slap two GPUs on one PCB, make it longer, and have them share the air; both are good cards, stop being a hater.


As for performance, remember this isn't 8800GT SLI; this is the 8800GTS G92, so 2x128 SPs, meaning simply it will give the HD3870 a KO.
Hater! It's the truth, it's two cards glued together! The X2 is a true single card, even if it had 16 GPUs on it. As for the glued nvidia card, it will surely be a $600+ card with LOWER CLOCKS, IT MIGHT NOT EVEN BEAT THE X2
Posted on Reply
#50
xvi
I wonder if I could go eBay my old ATI Radeon 9800 for $500 when this comes out.
"9800 GRAPHICS CARD".. "Play games faster than ever before!"
Posted on Reply