Wednesday, September 5th 2007

NVIDIA G92 is GeForce 8700 GTS

VR-Zone has learned more about the upcoming NVIDIA G92, whose official marketing name is most likely to be GeForce 8700 GTS. As revealed earlier, G92 is built on a 65 nm process and has a 256-bit memory bus. The reference 8700 GTS card uses an 8-layer PCB and comes with 512 MB of 1 ns GDDR3 memory clocked between 900 MHz and 1 GHz. The core clock is not yet known.

Source: VR-Zone
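As a rough illustration of what these rumored specs imply, the peak theoretical memory bandwidth can be worked out from the bus width and memory clock. This sketch assumes the quoted 900 MHz–1 GHz figure is the base clock and that GDDR3's double data rate applies:

```python
# Theoretical memory bandwidth = bus width (in bytes) x effective data rate.
# GDDR3 is double-data-rate, so the effective rate is twice the base clock.

def gddr3_bandwidth_gb_s(bus_width_bits: int, base_clock_mhz: float) -> float:
    """Peak theoretical bandwidth in GB/s for a GDDR3 memory subsystem."""
    bytes_per_transfer = bus_width_bits / 8        # 256-bit bus -> 32 bytes
    effective_rate_hz = base_clock_mhz * 1e6 * 2   # DDR: two transfers per clock
    return bytes_per_transfer * effective_rate_hz / 1e9

# The rumored 8700 GTS: 256-bit bus, 900 MHz to 1 GHz GDDR3.
low = gddr3_bandwidth_gb_s(256, 900)     # 57.6 GB/s
high = gddr3_bandwidth_gb_s(256, 1000)   # 64.0 GB/s
print(f"{low:.1f} - {high:.1f} GB/s")
```

By this estimate the card would sit between roughly 57.6 and 64 GB/s, double what the same memory clocks would give on a 128-bit bus.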

36 Comments on NVIDIA G92 is GeForce 8700 GTS

#1
PVTCaboose1337
Graphical Hacker
The memory bus is very nice...

The bus I mean.
#2
mullered07
isn't the mobile version of this what's meant to be able to play Crysis maxed?
#3
Grings
by: mullered07
isn't the mobile version of this what's meant to be able to play Crysis maxed?
possibly, though sometimes they will give a mobile part a model number even though the thing is completely different to the desktop equivalent. The best example of this was ATI's mobile 9700, which was a boosted 9600 core (4 pipelines, 128-bit memory) and nothing like the desktop 9700 (8 pipelines, 256-bit memory)
#4
cdawall
where the hell are my stars
8700 sounds like an HD2950PRO beater? Same bus width, same shaders, same amount of GDDR.....
#5
Nemesis881
Are you serious?????!!! I just bought an 8600gts... :banghead:
#6
pbmaster
So does this mean the rumored specs we saw earlier are still relevant?
#7
erocker
This thing may beat out 8800's.
#8
Demos_sav
by: erocker
This thing may beat out 8800's.
I don't think so. If it does then I'll :cry:
#9
Ketxxx
Heedless Psychic
One company has woken up and realised a 128-bit memory bus is totally wank now for midrange cards; one more to go.
#10
niko084
Buh... Where is Ati's 2700xt/2800xt....... :(
256 bit ram...... You know my complaints...
#11
Weer
Ha!

I told you people G92 wasn't 9800 GTX.
#12
Grings
by: Ketxxx
One company has woken up and realised a 128-bit memory bus is totally wank now for midrange cards; one more to go.
i bet the first geforce 9000 series/radeon 3000 series midrange cards launch with 128 bit again, the midrange 'refresh' cards usually go to 256 (x800gt/o, x1950pro etc), then the first of the next generation always take a step backwards

It's like they have short-term memory loss or something
#13
Chewy
by: Weer
Ha!

I told you people G92 wasn't 9800 GTX.
woot, my card's not getting pushed over :P it WILL run Crysis maxed, I'm sure :P (maybe needing some overclocking :)).

I got my card fairly cheap! was nice :D. Glad I didn't wait for G92 :p
#14
Weer
Of course it will run Crysis on max. Crytek said that 3-year-old computers could run it, so the second best in the world can't?
#15
Ketxxx
Heedless Psychic
by: Grings
i bet the first geforce 9000 series/radeon 3000 series midrange cards launch with 128 bit again, the midrange 'refresh' cards usually go to 256 (x800gt/o, x1950pro etc), then the first of the next generation always take a step backwards

It's like they have short-term memory loss or something
I haven't seen that since the 7600GT; its raw numbers let it beat out a 6800 (just). After that, it was clear midrange cards would have to start using a 256-bit bus. Both ATi and nVidia haven't seemed to grasp two rather simple concepts: 1) games are more and more complex with crappier and crappier programming, and graphics companies must accommodate this in their midrange layout; 2) using a 256-bit bus far from gives midrange cards the ability to come within spitting distance of flagship cards. It's easily balanced by each company using a 256-bit bus but slower memory. Essentially it should now be very clear cut for graphics: midrange cards have exactly half the spec of a flagship model, and an entry-level card exactly half the spec of a midrange card.
#16
Chewy
by: Weer
Of it will run Crysis on Max. Crytek said that 3 year-old computers could run it, so the second best in the world can't?
run it, but prob on low detail. BioShock pushes my card (as it is atm, 550 core) with everything maxed.. but that could be partly because I haven't tweaked Vista this install yet.. and running msn/sidebar in the background, I have seen my comp lag some playtimes.

they might have meant run it on low detail with low res too... you know how they rate the min system requirements... so the game can hardly play lol :P
#17
OnBoard
Mmm, interesting, probable candidate for my next card. Not that there is really any game out there that kills my card in 1280x1024 yet :) So no hurry to buy one, but 65nm sounds good after this burning cloud of hydrogen I have now :P
#18
InfDamarvel
Ati already has the HD2950PRO in the making, which is supposed to match 2900XT performance with less power use and less heat. It's only a midrange card though.
#19
WarEagleAU
Bird of Prey
doesn't seem special really, except for the die shrink.
#20
Mussels
Moderprator
by: mullered07
isn't the mobile version of this what's meant to be able to play Crysis maxed?
this should be close to the 8800GTS 320MB. Slower GPU, but more (and decent) memory.

and yes, this is the card meant to run Crysis maxed in DX10, at an unspecified resolution (I assume DX10, 1024x768 with 0xAA)

This is the one with the new AGP bridge, so it will have an AGP version as well. Die shrink = cooler/less power too, so this should be a great card. Good speed, low power (low noise) and an AGP variant. kickass.
#21
Xaser04
Judging by the card's specs so far, I would assume performance is similar to that of the X1950XT from ATI. That would make it a good deal faster than the 8600GTS whilst not quite as powerful as the 8800GTS. (This would make sense, as nvidia's naming system means this card falls directly between the 8600 and the 8800 and is unlikely to match the 8800 in terms of performance.)
#22
Weer
by: Chewy
run it, but prob on low detail. BioShock pushes my card (as it is atm, 550 core) with everything maxed.. but that could be partly because I haven't tweaked Vista this install yet.. and running msn/sidebar in the background, I have seen my comp lag some playtimes.

they might have meant run it on low detail with low res too... you know how they rate the min system requirements... so the game can hardly play lol :P
Please.
I can play Bioshock on a 7600GT on HIGH.
And you can't play it on a card that is 4 times better?

If you can't play Crysis Maxed-Out @ 1280x800 with 2xAA, I'll eat my hat.
#23
tigger
I'm the only one
i play BioShock maxxed at 1280.

this 8700gts is lookin fiiiiine. Thank god it's not another possibly decent card ruined with a 128-bit bus (ie 7600gt)

oh and death to 128-bit mem buses
#24
Xaser04
by: tigger69


this 8700gts is lookin fiiiiine. Thank god it's not another possibly decent card ruined with a 128-bit bus (ie 7600gt)
I don't quite understand your comment about the 7600gt here.

It was a cracking card. It matched the performance of the previous high end card of the time (the 6800Ultra) despite having fewer pixel pipelines and only a 128bit memory interface.

Yes it could have been better with a 256bit memory interface however at the time there was no need for it (due to a complete lack of competition from ATI).
#25
Mussels
Moderprator
by: Xaser04
I don't quite understand your comment about the 7600gt here.

It was a cracking card. It matched the performance of the previous high end card of the time (the 6800Ultra) despite having fewer pixel pipelines and only a 128bit memory interface.

Yes it could have been better with a 256bit memory interface however at the time there was no need for it (due to a complete lack of competition from ATI).
^ Backing him up here: the 76GT owned X1650pros, which were the price equivalent around here.

I'm really looking forward to SLI results on these...