
NVIDIA GeForce 9800 GTX Card Specs and Pictures

malware

VR-Zone has obtained details and photos of the first single-GPU high-end GeForce 9 series card. The 65 nm G92-based GeForce 9800 GTX (G92-P392) will come with a 12-layer PCB. It will be clocked at 673 MHz for the core and 1683 MHz for the shaders, while the memory clock speed is yet to be determined. The memory interface is 256-bit, with 512 MB of 136-pin BGA GDDR3 memory onboard. The card will come with two DVI-I ports and one HDTV-out. As mentioned earlier, all GeForce 9800 GTX cards will have two SLI connectors (3-way SLI ready) and two 6-pin PCIe power connectors. The card will be cooled by the Cooler Master TM67 cooler, whose fan is rated at 0.34 A, 4.08 W, 2900 rpm and 34 dBA. At 100% load the card will consume around 168 W. The GeForce 9800 GTX is set to be released around April.



View at TechPowerUp Main Site
 
So is this basically an overclocked 8800 GTS (512 MB) with dual SLI connectors?
 
That's about what it sounds like. Same core, so...
 
It's nice they had to add 2-3" to the card just to fit the second SLI connector.
 
I find it interesting that they are using a 256-bit memory interface with 512 MB on these ultra-high-end cards and still manage to get higher performance than "last-gen" cards that shipped with a 512-bit memory interface and 768 MB of VRAM. Wouldn't this slimmer interface cause a bottleneck now more than ever? And if this is a cost-reduction measure, that's an odd thing to do for a flagship card.

broke_s

edit: nvm, I didn't realize that they are using the same core as the 8800 GT (which had the cost-reduction measures applied to stay competitive), so I guess we can just overclock an 8800 GT and we'll know how these will perform?
 
They are actually losing some performance by sticking with 256-bit. It's not a lot, but it is noticeable at very large resolutions with high AA/AF.

The main idea behind cutting back to 256-bit is that they found the price difference really isn't worth the performance difference, and it costs a lot less to make a 256-bit card than a 512-bit one.
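To see why the narrower bus is a defensible trade-off, note that theoretical peak bandwidth is just bus width (in bytes) times the effective memory clock. A minimal Python sketch; since the 9800 GTX memory clock is still unannounced, the 2200 MHz effective figure below is an assumption (borrowed from typical fast GDDR3 of the era), while the 384-bit / 1800 MHz effective numbers are the 8800 GTX's stock configuration:

```python
def memory_bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Theoretical peak bandwidth in GB/s:
    bytes moved per transfer x transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

# Hypothetical 9800 GTX: 256-bit bus, 2200 MHz effective GDDR3 (assumed clock)
print(memory_bandwidth_gbs(256, 2200))  # 70.4 GB/s

# 8800 GTX: 384-bit bus, 1800 MHz effective GDDR3 (stock)
print(memory_bandwidth_gbs(384, 1800))  # 86.4 GB/s
```

So a 256-bit card with fast enough memory can land within striking distance of a wider bus, which is the cost/performance argument above: the wider PCB routing and extra memory chips buy relatively little headroom.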
 
True, but couldn't they have realized that, what was it, two years ago, before the first 512-bit cards came out?
 
Well, they could have, but they are always being pushed to get cards out quickly, and they have to build something fast enough to make people with the upper-tier cards want to replace them. So taking an idea that is indeed faster and shipping it beats completely reworking it...

Now that they've learned their lesson, they won't make that mistake again.
 
The old GTX/Ultras are 384-bit IIRC, not 512. The old 8800 GTS was 320-bit as well.
 
Yes, the 2900 XT was 512-bit; the 2900 Pro came in both 256-bit and 512-bit versions.
 
I think the 9800 GTX is getting more ROPs and 12 more shaders than the G92 GTS. I think the GTX also uses a different core revision, which might explain the extra shaders and ROPs.
 
Well, I hope it's another revision.

A 256-bit bus also cuts down overclocking headroom; why do you think they were always using 2900 XTs for world-record 3DMark06 runs? I think it's time for the GPU business to make 512-bit a standard on high-end cards.
 
The smart thing to do would be to wait until the 9800 GTX comes out, then buy a new 8800 GTS (G92) with an aftermarket cooler, or go water cooling.
 
Revision to the core: the 9 series has an updated PureVideo engine. Crisper Hasselhoff and 2 more frames in Crysis, for an MSRP of $399.

- Christine
 
Aye, but a crisper suave, cool, '80s Hasselhoff, or a crisper drunken, overweight, hamburger-eating present-day Hasselhoff?
 
Is this thing longer than the 8800 GTX?
 
what a long **** of a card ...
 
Fail card is fail
 
Even though this is bad news for most, I find it to be just the opposite: I just purchased a G92 GTS, and it will remain one of the top cards for another year. =) I'm sure others with a G92-based card can relate.
 
I guess that's kinda true, although I was hoping to do a step-up to this card. It doesn't look like it's worth the trouble, though. :shadedshu
 
Man, NVIDIA and ATI are just pooping out cards left and right. There are so many choices these days that it's hard to figure out which cards are legit and which are just marketing tools used to piss off the competition.
 
Times are changing.. folks used to want the card they just bought to last a while.. now the majority seem to want the opposite.. a new toy every month.. I reckon the average "enthusiast" is getting younger.. he he

trog
 
Seems like NVIDIA is out of fresh ideas for now; no new architecture, just revisions... a good time for ATI to deliver something like the X800 revolution. :nutkick:
 