
NVIDIA GeForce 9900 GTX and 9900 GTS GT200 Powered Graphics Slated for July Launch

X850 XT PE > 6800 Ultra
X1950 XTX > 7900 GTX

The first comparison is right; the second one is wrong. The X1950 XTX is good when it comes to image quality, but the 7900 GTX wins in benchmarks, with slightly higher frame rates than the X1950 XTX.
 
And there's one thing I hate about the GT200: it's still GDDR3. ATI has already released GDDR4 cards, and by the time the 9900 GTX comes out ATI will probably have released GDDR5. :(
 
It's time to prepare for the next GPU battle... may some random non-duopoly company win!
 
Damn, gonna have to twist the wife's arm again. Oh well, it's well worth getting divorced for, lol.
 
Well, if the GT200 rumors are true, we can expect 384 real shader units, 64 ROPs, a 512-bit bus, and 1 GB of RAM.

A 512-bit bus? Not happening. The 9800 series had a 256-bit bus compared to the 384-bit bus of the high-end G80 cores; they obviously rolled back for a reason.

ATI did the same thing after realizing that the 512-bit bus on the 2900 XT was far too expensive for the small performance boost it gave.
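One rough way to see why a wider bus costs more: the bus width sets a minimum chip count (and PCB routing complexity). A quick sketch, assuming each GDDR chip of this era exposes a 32-bit interface (an assumption; some configurations differ):

```python
# Rough sketch: why a wider memory bus is expensive to build.
# Assumption: each GDDR chip exposes a 32-bit interface
# (typical for GDDR3/GDDR4 of this period; other modes exist).
CHIP_WIDTH_BITS = 32

def chips_needed(bus_bits: int) -> int:
    """Minimum number of memory chips required to populate a bus."""
    return bus_bits // CHIP_WIDTH_BITS

for bus in (256, 384, 512):
    print(f"{bus}-bit bus -> at least {chips_needed(bus)} chips")
```

So going from 256-bit to 512-bit roughly doubles the memory chip count and the traces that have to be routed, which is why both vendors retreated to 256-bit when the bandwidth wasn't needed.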
 
I'm an NVIDIA fan, but I think both are doing just fine. As for the new cards, it comes down to choice: if you like ATI, buy ATI; if you like NVIDIA, buy NVIDIA. We don't need to start a war over this. IMO we just need to buy the one with the better price for us, and the one we like more. That's all...

Let's leave image quality and things like that aside, because in the end we'll be playing the same games at the same quality no matter whether you choose NVIDIA or ATI.

Don't get me wrong....
 
And there's one thing I hate about the GT200: it's still GDDR3. ATI has already released GDDR4 cards, and by the time the 9900 GTX comes out ATI will probably have released GDDR5. :(

Can you really notice a big performance improvement between GDDR4 and GDDR3?

The 8800 GT with GDDR3 kills the 3870 with GDDR4, and I'm not even going to mention the 8800 GTS G92. Correct me if I'm wrong...
 
Can you really notice a big performance improvement between GDDR4 and GDDR3?

The 8800 GT with GDDR3 kills the 3870 with GDDR4, and I'm not even going to mention the 8800 GTS G92. Correct me if I'm wrong...

I _think_ GDDR4's improvement is mostly in power consumption (with some performance gain as well), while GDDR5 combines GDDR4's lower consumption with monstrously high clocks.

"According to Qimonda, the new 512Mbit GDDR5 memory chips being sampled are three times faster than 800MHz GDDR3 RAM, and they can achieve bandwidth of 20GB/s per module. If I've got my math right, that translates to "effective" DDR speeds of around 5GHz and total maximum theoretical bandwidth of 160GB/s for a graphics card with a 256-bit memory bus and eight GDDR5 chips."

That's an enormous amount of bandwidth. I think my 8800 GT's memory only manages somewhere around 58 GB/s.
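The quoted arithmetic checks out; a quick sketch of the math (the 8800 GT figures of 256-bit GDDR3 at 900 MHz, 1800 MHz effective, are from memory, so treat them as an assumption):

```python
# Theoretical memory bandwidth = (bus width in bytes) x (effective data rate).
def bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    """Peak bandwidth in GB/s (1 GB/s = 1e9 bytes/s, as in the quoted article)."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# The quoted GDDR5 figure: 256-bit bus at ~5 GHz effective.
gddr5 = bandwidth_gb_s(256, 5000)   # 160.0 GB/s, matching the article

# 8800 GT for comparison (assumed: 256-bit GDDR3, 1800 MHz effective).
g92 = bandwidth_gb_s(256, 1800)     # 57.6 GB/s

print(f"GDDR5 example: {gddr5:.1f} GB/s, 8800 GT: {g92:.1f} GB/s")
```

Equivalently, eight 20 GB/s chips give the same 160 GB/s total, so the two ways of stating it in the quote agree.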
 
It's time to prepare for the next GPU battle... may some random non-duopoly company win!

I agree with that!
This is going to be one hell of a showdown.
 
U can notice a big improvement in terms of performance between DDR4 and DDR3???

8800GT DDR3 kills the 3870 DDR4 and im not going to talk about 8800GTS G92 correct me if im wrong...

Here are the specs of both:

NVIDIA: [spec screenshot attached]

ATI (compared to an 8800 GT): [spec screenshot attached]


The ATI card has higher clocks for both memory and GPU, but it runs much cooler than the NVIDIA. I'm assuming that's because of the GDDR4, though I may be wrong.
 
Holy shit, 160 GB/s of bandwidth? :cry: :cry:
 
The first comparison is right; the second one is wrong. The X1950 XTX is good when it comes to image quality, but the 7900 GTX wins in benchmarks, with slightly higher frame rates than the X1950 XTX.

No, NVIDIA didn't pull ahead until the 8800 released. They spun out the 7950 series just after the X1950's release to try to counter it, but they were at best equals.
 
I'm not exactly sure how the shader clocks work on ATI cards, but I know that's usually where they fall behind NVIDIA in performance. NVIDIA runs its shaders at a much higher clock than the core, while ATI runs them at the core clock. However, I could be very wrong about this; please enlighten me if I am...
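To put rough numbers on that clock difference (the specs below are from memory and approximate, so treat them as assumptions, not gospel): peak shader throughput scales as shader count × shader clock × FLOPs per clock, which is how NVIDIA's fewer-but-faster shaders keep up.

```python
# Rough peak-throughput sketch: shaders x shader clock x FLOPs per clock.
# All figures below are approximate and from memory (assumptions).
def peak_gflops(shaders: int, shader_clock_mhz: float, flops_per_clock: int) -> float:
    """Theoretical peak shader throughput in GFLOPS."""
    return shaders * shader_clock_mhz * flops_per_clock / 1000.0

# 8800 GTX (G80): 128 SPs at a separate 1350 MHz shader clock, MADD+MUL = 3.
g80 = peak_gflops(128, 1350, 3)    # ~518 GFLOPS

# HD 2900 XT (R600): 320 SPs at the 742 MHz core clock, MADD = 2.
r600 = peak_gflops(320, 742, 2)    # ~475 GFLOPS

print(f"G80: {g80:.0f} GFLOPS, R600: {r600:.0f} GFLOPS")
```

So despite having far fewer shader units, G80's roughly 2x higher shader clock lands it in the same theoretical ballpark, and real-game utilization then decides the rest.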
 
Well, my plan is to use the EVGA Step-Up program, going from the 8800 GTS to the 9800 GX2, and if the 9900 GTX is faster than that, I'll move up to it.
 
All thumbs up from me! Competition is great for us.

I'm no fanboy, but as long as NVIDIA continues to block SLI on Intel chipsets, it's in my interest that ATI has decent cards available. If the two cards are comparable, I'll be going ATI on my X38 to keep my options open. I guess many others will do likewise.
 
I'm not exactly sure how the shader clocks work on ATI cards, but I know that's usually where they fall behind NVIDIA in performance. NVIDIA runs its shaders at a much higher clock than the core, while ATI runs them at the core clock. However, I could be very wrong about this; please enlighten me if I am...

That's about it, although ATI's HD 4000 series is supposed to address this issue. We'll see how it goes. :cool:
 
That's about it, although ATI's HD 4000 series is supposed to address this issue. We'll see how it goes. :cool:

Yeah, it seems with every card they address some kind of issue, so maybe this time around there won't be much to complain about. But then again, NVIDIA has been doing the same, so I'm looking forward to the battle...
 
What's with all the guys joining? I've seen like 20 join so far; I'm wondering if a few of those are the same person.
 
What's with all the guys joining? I've seen like 20 join so far; I'm wondering if a few of those are the same person.

Well, when stuff like this gets posted and people Google "GT200 core" or "9900 GTX", this comes up as one of the first links, so people want to join and post their two cents about it. That's what forums are for: talking about and discussing things like this. But I doubt people would join more than once... unless they intend to spam us or something. :toast:

That's kinda off topic, BTW... sorry, mods.
 
The first comparison is right; the second one is wrong. The X1950 XTX is good when it comes to image quality, but the 7900 GTX wins in benchmarks, with slightly higher frame rates than the X1950 XTX.

Obviously you must not have read about NVIDIA's shady driver tricks; they always dropped image quality to gain FPS, which is why they led by so much. I have cards from both teams, so I don't care. Just give me another 8800 GT vs. 3870 price war and I'm fine.
 
The turnover of high-end cards over the last two years has been astounding.
 