Monday, May 26th 2008

Next-gen NVIDIA GeForce Specs Unveiled, Part 2

Although it's a bit risky to post information taken from sources unknown to us, not to mention that the site is German, it's worth a try. The guys over at Gamezoom (Google translated) reported yesterday that during NVIDIA's Editors Day, the same event where the F@H project for NVIDIA cards and the buyout of RayScale were announced, NVIDIA also unveiled the final specs for its soon-to-be-released GT200 cards. This information complements our previous Next-gen NVIDIA GeForce Specs Unveiled story:
  • GeForce GTX 280 will feature 602MHz/1296MHz/1107MHz core/shader/memory clocks, 240 stream processors, a 512-bit memory interface, and GDDR3 memory, as already mentioned.
  • GeForce GTX 260 will come with 576MHz/999MHz/896MHz reference clock speeds, 192 stream processors, a 448-bit memory interface, and GDDR3 memory.
The prices are 449 U.S. dollars for the GTX 260 and more than $600 for the GeForce GTX 280. That's all for now.
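For context, the quoted bus widths and memory clocks translate directly into peak memory bandwidth. A minimal sketch of that arithmetic, assuming GDDR3's usual double-data-rate transfers (the function name is ours, and these are the rumored clocks above, not confirmed figures):

```python
def peak_bandwidth_gbs(bus_width_bits: int, mem_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s.

    GDDR3 transfers data on both clock edges, so the effective
    data rate is twice the quoted memory clock.
    """
    bytes_per_transfer = bus_width_bits / 8
    effective_rate_hz = mem_clock_mhz * 1e6 * 2
    return bytes_per_transfer * effective_rate_hz / 1e9

print(f"GTX 280: {peak_bandwidth_gbs(512, 1107):.1f} GB/s")  # ~141.7 GB/s
print(f"GTX 260: {peak_bandwidth_gbs(448, 896):.1f} GB/s")   # ~100.4 GB/s
```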
Source: Gamezoom

108 Comments on Next-gen NVIDIA GeForce Specs Unveiled, Part 2

#76
Megasty
TheGuruStud: I hope you are being sarcastic.

#1 30-60 fps sucks big time, looks horrible

#2 My cutoff in framerate is about 85 before I can't tell

#3 My refresh point is about 100 Hz

#4 I game at 1280x1024 at 100 Hz with vsync of course b/c tears blow

#5 You're using a shitty driver if you can't select 75 on an LCD

#6 You all need to get new monitors
You sir are superhuman & yes, I was & still am being sarcastic :D
Posted on Reply
#77
TheGuruStud
Megasty: You sir are superhuman & yes, I was & still am being sarcastic :D
just superhuman in bed... :D:D:D
Posted on Reply
#78
wolf
Performance Enthusiast
Come on people, I thought we had actually gotten over comparing a single-GPU config against a dual.

Surely the 4870X2 might beat a GTX 280; however, given these specs I'm fairly sure that both new NVIDIA cards will tear the ass out of a single 4870/50.

sure.
Posted on Reply
#79
laszlo
a_of: You must be joking. I have been playing FPS games for as long as I can remember, and I can easily distinguish between 30 FPS and 60 FPS. Hell, I can even notice the difference between 60 and 80. Perhaps you use a flat panel? I still use a CRT; nothing beats it for serious gaming. That's why, when playing Counter-Strike with my clan a few years ago, I bought the best money could buy at that moment.
I also use a CRT, and if the FPS delivered by the video card stays above 30 without dropping below it, the difference can't be noticed.

Refresh rate has nothing to do with FPS: hometheater.about.com/od/televisionbasics/qt/framevsrefresh.htm

When you watch a movie at 30 FPS, do you see the frames?

It's the same principle whether you play a game or watch a movie; you don't need 200 FPS for it to run smoothly.
Posted on Reply
#80
Hayder_Master
X-TeNDeR: Six hundred dollars!!! :eek: awww..

Specs look good, but man... NVIDIA is hunting for fat wallets again :twitch:
In my country, shops sell the WinFast 8800 for about $950. I bought a Gigabyte 8800 GT for $300; the 9600 and 9800 haven't arrived yet. I don't know if the 280 will come here; I might find it for about $1500 on the market :banghead: :cry:
Posted on Reply
#81
trt740
TheGuruStud: I hope you are being sarcastic.

#1 30-60 fps sucks big time, looks horrible

#2 My cutoff in framerate is about 85 before I can't tell

#3 My refresh point is about 100 Hz

#4 I game at 1280x1024 at 100 Hz with vsync of course b/c tears blow

#5 You're using a shitty driver if you can't select 75 on an LCD

#6 You all need to get new monitors
Television-quality video is about 25 FPS. If 60 FPS isn't good enough for you, your eyes have a problem. A sustained FPS of 30 and above is more than enough.
Posted on Reply
#82
TheGuruStud
trt740: Television-quality video is about 25 FPS. If 60 FPS isn't good enough for you, your eyes have a problem. A sustained FPS of 30 and above is more than enough.
Television is bearable b/c they trick your eyes with motion blur (and a couple of other things, I think). If you look at any vertical object moving, you can clearly see profound jerking.

If it's below 60, I can easily see those damn frames. It's a really fast stutter closer to 60, but annoying as hell.

And don't even get me started on refresh rate haha. I get a headache in literally 2 minutes with a low refresh rate (low is below 100 haha).
Posted on Reply
#83
trt740
TheGuruStud: Television is bearable b/c they trick your eyes with motion blur (and a couple of other things, I think). If you look at any vertical object moving, you can clearly see profound jerking.

If it's below 60, I can easily see those damn frames. It's a really fast stutter closer to 60, but annoying as hell.

And don't even get me started on refresh rate haha. I get a headache in literally 2 minutes with a low refresh rate (low is below 100 haha).
Sounds like a tumor :slap: just kidding :D
Posted on Reply
#85
Guru Janitor
TheGuruStud: Television is bearable b/c they trick your eyes with motion blur (and a couple of other things, I think). If you look at any vertical object moving, you can clearly see profound jerking.

If it's below 60, I can easily see those damn frames. It's a really fast stutter closer to 60, but annoying as hell.

And don't even get me started on refresh rate haha. I get a headache in literally 2 minutes with a low refresh rate (low is below 100 haha).
He could be seeing up to 85 FPS; the AVERAGE for humans is around 30, but every person sees FPS differently. Some might not even be able to tell that they are playing a game at 20 FPS.
Posted on Reply
#86
acperience7
laszlo: I also use a CRT, and if the FPS delivered by the video card stays above 30 without dropping below it, the difference can't be noticed.

Refresh rate has nothing to do with FPS: hometheater.about.com/od/televisionbasics/qt/framevsrefresh.htm

When you watch a movie at 30 FPS, do you see the frames?

It's the same principle whether you play a game or watch a movie; you don't need 200 FPS for it to run smoothly.
I can also notice a difference between 35 and 60 FPS, at least in Test Drive Unlimited for the PC and Halo (PC). Oh yeah, I use a 19'' LCD by the way; maybe that has something to do with it.
Posted on Reply
#87
DaMulta
My stars went supernova
So let me get this right: the GT200 is not the 9900 GTX now?
Posted on Reply
#88
largon
"GT200" doesn't exist.
G200 = GeForce GTX 260 & 280
Posted on Reply
#89
DaMulta
My stars went supernova
Thank you


Too many cards are coming out...
Posted on Reply
#90
imperialreign
As to all the FPS debating going on...

36-60 FPS is perfectly fine for slower-paced games (like Splinter Cell, Thief, most RTS, etc.), but for fast-paced games (like Doom 3 on Nightmare, etc.) it's unacceptable.

As to what people can visually see, it depends on the person and on your setup. Even here, I can usually tell when FPS starts dropping below 60... but the only game where I have an issue with "tearing", even over 60 FPS, is Crysis, and vsync causes too hard of a hit to enable it even with triple buffering. I've never experienced tearing with any other game, and multi-GPU setups are a ton more susceptible to it than single GPUs.

The higher the screen res you play at, the more noticeable tearing will be. Even if you have high FPS and a beefcake GPU to handle it, that's a ton more pixels that the GPU has to render and keep pace with.

Your set refresh rate has a major impact on this if you're using LCDs: a 60Hz refresh on a native 75Hz LCD will lead to tearing or other artifacts.

But this all boils down to the individual's ability to perceive what's on screen. Everyone is different, some more sensitive to it than others.
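To put rough numbers on the resolution point, here's a minimal sketch of the pixel-load arithmetic (the resolutions are just the ones mentioned in this thread):

```python
# Pixels the GPU must fill per second at a given resolution and frame rate.
def mpix_per_sec(width: int, height: int, fps: int) -> float:
    return width * height * fps / 1e6

print(mpix_per_sec(1280, 1024, 60))  # ~78.6 Mpix/s
print(mpix_per_sec(1920, 1200, 60))  # ~138.2 Mpix/s, ~76% more work per frame
```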
Posted on Reply
#91
wolf
Performance Enthusiast
OK, as for this silly FPS debate...

Like others have mentioned, some games can be fine at a lower FPS: Crysis is perfectly playable above 30 FPS, and Test Drive Unlimited can be 45-50 and smooth as... BUT:

First-person shooters are a different story.

Surely our eyes cannot see the difference above 60 Hz; however, your hands can feel the difference in responsiveness at 100 FPS as opposed to 60 FPS.

UT3 is a great example of this: if you run the game stock, it's capped at 62 FPS, which is more than smooth, don't get me wrong, but when I set max FPS to 150, I usually get 90-110, and there is a HUGE difference in how fast you can react, especially in such an intense and fast-paced game as UT3.

So the short answer is: maybe we can't SEE the difference, but we can FEEL the difference of faster gameplay.
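To put numbers on that feel, a minimal frame-time sketch (simple arithmetic, nothing game-specific):

```python
# Time between frames in milliseconds for a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 100, 150):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.1f} ms per frame")

# 60 FPS -> 16.7 ms, 100 FPS -> 10.0 ms: the hands can feel that ~7 ms
# difference in input-to-display latency even if the eyes can't see it.
```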

Thus, 75+ FPS is always what I aim for in those kinds of games, and to get 75+ FPS @ 1920x1200 + 4xAA + 16xAF in every game, your graphics card needs to be high end. And I will absolutely be going for the new NVIDIA high end when it arrives.

Especially because it will sock it to the ever more cruddy-looking 48xx series.

All they've done is add 50% more shader units and 50% more texture units, and give it faster memory.

The NVIDIA cards have more ROPs, more memory, more bus width, more shaders, more texture units, and a refined design of what we KNOW works. NVIDIA has literally made this card beefier as opposed to trying to fill in its shortcomings.

NV FTW.
Posted on Reply
#92
magibeg
wolf: Especially because it will sock it to the ever more cruddy-looking 48xx series.

All they've done is add 50% more shader units and 50% more texture units, and give it faster memory.

The NVIDIA cards have more ROPs, more memory, more bus width, more shaders, more texture units, and a refined design of what we KNOW works. NVIDIA has literally made this card beefier as opposed to trying to fill in its shortcomings.

NV FTW.
Ok, you were good until you said that. -1
Posted on Reply
#93
TheGuruStud
wolf: OK, as for this silly FPS debate...
Surely our eyes cannot see the difference above 60 Hz; however, your hands can feel the difference in responsiveness at 100 FPS as opposed to 60 FPS.

UT3 is a great example of this: if you run the game stock, it's capped at 62 FPS, which is more than smooth, don't get me wrong, but when I set max FPS to 150, I usually get 90-110, and there is a HUGE difference in how fast you can react, especially in such an intense and fast-paced game as UT3.

So the short answer is: maybe we can't SEE the difference, but we can FEEL the difference of faster gameplay.
Refresh rate is pretty analogous to FPS. Fluorescent lights flicker to me b/c I can easily see 60 Hz (monitors are even worse; think page flipping). Even 85 Hz starts to strain my eyes after a couple of minutes (that's an annoying fast flicker).

But I agree, a higher framerate does play better (that's how I got good at CS: 120 with vsync on). But I also don't know how anyone plays without vsync. Sure, you can get a massive performance penalty (up to 50% if your card can't handle it), but to me that's better than the entire screen tearing and disrupting my view (and pissing me off haha). Of course, my refresh is 100, so it would be 50 FPS minimum and not 30 :D
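A minimal sketch of where those numbers come from, assuming classic double-buffered vsync with no triple buffering (the function is illustrative, not from any driver):

```python
def vsync_fps(refresh_hz: float, raw_fps: float) -> float:
    """Effective frame rate under double-buffered vsync.

    Each finished frame waits for the next vblank, so delivery snaps
    down to refresh/1, refresh/2, refresh/3, ... whichever step the
    card can actually sustain.
    """
    divisor = 1
    while refresh_hz / divisor > raw_fps:
        divisor += 1
    return refresh_hz / divisor

print(vsync_fps(60, 55))   # 30.0 -> the "up to 50%" penalty on a 60 Hz panel
print(vsync_fps(100, 55))  # 50.0 -> the 50 FPS floor at 100 Hz noted above
```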
Posted on Reply
#94
wolf
Performance Enthusiast
magibeg: Ok, you were good until you said that. -1
lol, kudos to ATi for giving NV the competition, but they are thoroughly prawned this time around, and we will soon see about next gen.

In any case it's each to their own, and the price is always right for both products, so there's no dumb choice.
Posted on Reply
#95
farlex85
TheGuruStud: Refresh rate is pretty analogous to FPS. Fluorescent lights flicker to me b/c I can easily see 60 Hz (monitors are even worse; think page flipping). Even 85 Hz starts to strain my eyes after a couple of minutes (that's an annoying fast flicker).

But I agree, a higher framerate does play better (that's how I got good at CS: 120 with vsync on). But I also don't know how anyone plays without vsync. Sure, you can get a massive performance penalty (up to 50% if your card can't handle it), but to me that's better than the entire screen tearing and disrupting my view (and pissing me off haha). Of course, my refresh is 100, so it would be 50 FPS minimum and not 30 :D
Everybody can see fluorescent lights flicker. And yes, monitors can flicker too. 60 FPS is not too slow in any game, unless it is so fast-paced that things are moving 3-4 times as fast as they normally would. People don't see different Hz (at least not to any degree worth noting); people just differ in how much they care, and occasionally in how much it strains their eyes. Me, I can run a game at 25-30 FPS with occasional tearing and not really mind, even in a game like Crysis. Some need 60 FPS w/ no tearing to be happy; that's fine too. I personally am glad it doesn't bother me.
Posted on Reply
#96
magibeg
I just think it's a little soon to be counting anyone out quite yet. Not only did ATI double the number of shaders, but they also increased the shader clock by a fair margin, from 775 to 1050 MHz, a solid increase from any point of view. They also doubled their TMUs. ROPs supposedly don't have as much importance anymore when things are shader-focused. Even so, the NVIDIA cards listed there are MUCH more expensive than the ATI cards that are coming out.
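For scale, the theoretical shader-throughput gain implied by those two changes multiplies out like this (just the numbers quoted above; actual performance will differ):

```python
clock_ratio = 1050 / 775   # shader clock bump -> ~1.35x
shader_ratio = 2.0         # "doubled the number of shaders"

print(f"clock increase: +{(clock_ratio - 1) * 100:.0f}%")                   # ~ +35%
print(f"theoretical shader throughput: {shader_ratio * clock_ratio:.2f}x")  # ~2.71x
```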
Posted on Reply
#97
farlex85
magibeg: I just think it's a little soon to be counting anyone out quite yet. Not only did ATI double the number of shaders, but they also increased the shader clock by a fair margin, from 775 to 1050 MHz, a solid increase from any point of view. They also doubled their TMUs. ROPs supposedly don't have as much importance anymore when things are shader-focused. Even so, the NVIDIA cards listed there are MUCH more expensive than the ATI cards that are coming out.
I would wager it won't be much different from the way it's been recently: NVIDIA will remain on top, but ATI will offer very compelling price/performance. I really do hope the 4870 competes w/ these GTXes though; then we wouldn't have to pay such outrageous prices for them, and it would drive the whole market down. I hope *crosses fingers*.
Posted on Reply
#98
TheGuruStud
magibeg: I just think it's a little soon to be counting anyone out quite yet. Not only did ATI double the number of shaders, but they also increased the shader clock by a fair margin, from 775 to 1050 MHz, a solid increase from any point of view. They also doubled their TMUs. ROPs supposedly don't have as much importance anymore when things are shader-focused. Even so, the NVIDIA cards listed there are MUCH more expensive than the ATI cards that are coming out.
I bet the price will be astronomical (the currently rumored price) until ATI releases; then they will be priced closer to real-world performance. They're just trying to bank some cash b/c the chips cost like $130 each to make (read that somewhere). Which is freaking insane haha.
Posted on Reply
#99
farlex85
TheGuruStud: I bet the price will be astronomical (the currently rumored price) until ATI releases; then they will be priced closer to real-world performance. They're just trying to bank some cash b/c the chips cost like $130 each to make (read that somewhere). Which is freaking insane haha.
ATI is actually supposed to release theirs 3 days before these.
Posted on Reply
#100
TheGuruStud
farlex85: ATI is actually supposed to release theirs 3 days before these.
Well, IDK then :laugh:

Either there's a marketing stunt going on, or these are very bad details. *shrugs*
Posted on Reply