
Next-gen NVIDIA GeForce Specs Unveiled, Part 2

Yeah, anything over 60 fps doesn't matter anyway because your monitor won't keep up. CRTs are stuck at 60 Hz as a refresh rate while LCDs can go to 75 Hz, but OSes & drivers keep them at 60 Hz anyway. Just because a card is running a game at 90+ fps doesn't mean you're seeing those frames unless you're superhuman :rolleyes:

I hope you are being sarcastic.

#1 30-60 fps sucks big time, looks horrible

#2 My cutoff in framerate is about 85 before I can't tell

#3 My refresh point is about 100 Hz

#4 I game at 1280x1024 at 100 Hz with vsync of course b/c tears blow

#5 You're using a shitty driver if you can't select 75 on an LCD

#6 You all need to get new monitors
 
I hope you are being sarcastic.

#1 30-60 fps sucks big time, looks horrible

#2 My cutoff in framerate is about 85 before I can't tell

#3 My refresh point is about 100 Hz

#4 I game at 1280x1024 at 100 Hz with vsync of course b/c tears blow

#5 You're using a shitty driver if you can't select 75 on an LCD

#6 You all need to get new monitors

You, sir, are superhuman & yes, I was & still am being sarcastic :D
 
Come on people, I thought we had actually gotten over comparing a single-GPU config against a dual-GPU one.

Surely enough the 4870X2 might beat a GTX 280, however given these specs I'm fairly sure that both new NVIDIA cards will tear the ass out of one 4870/50.

sure.
 
You must be joking. I have been playing FPS games for as long as I can remember, and I can easily distinguish between 30 fps and 60 fps. Hell, I can even notice the difference between 60 and 80. Perhaps you use a flat panel? I still use a CRT; nothing beats it for serious gaming. That's why, when playing Counter-Strike with my clan a few years ago, I bought the best money could buy at the time.


I also use a CRT, and if the fps delivered by the video card stays over 30 without dropping below that, it can't be noticed.

Refresh rate has nothing to do with fps: http://hometheater.about.com/od/televisionbasics/qt/framevsrefresh.htm

When you watch a movie at 30 fps, do you see the frames?

It's the same principle whether you play a game or watch a movie: you don't need 200 fps to run it smoothly.
 
Six hundred dollars!!! :eek: awww..

Specs look good, but man.. NVidia is hunting for fat wallets again :twitch:


In my country the shops sell the WinFast 8800 for about $950. I bought a Gigabyte 8800 GT for $300; the 9600 and 9800 haven't come yet. I don't know if the 280 will come; maybe I'll find it for about $1500 on the market :banghead: :cry:
 
I hope you are being sarcastic.

#1 30-60 fps sucks big time, looks horrible

#2 My cutoff in framerate is about 85 before I can't tell

#3 My refresh point is about 100 Hz

#4 I game at 1280x1024 at 100 Hz with vsync of course b/c tears blow

#5 You're using a shitty driver if you can't select 75 on an LCD

#6 You all need to get new monitors

Television-quality video is like 25 fps. If 60 fps isn't good enough for you, your eyes have a problem. A sustained FPS of 30 and above is more than enough.
 
Television-quality video is like 25 fps. If 60 fps isn't good enough for you, your eyes have a problem. A sustained FPS of 30 and above is more than enough.

Television is bearable b/c they trick your eyes with motion blur (and a couple other things I think). If you look at any vertical object moving you can clearly see profound jerking.

If it's below 60, I can easily see those damn frames. It's a really fast stutter closer to 60, but annoying as hell.

And don't even get me started on refresh rate haha. I get a headache in literally two minutes with a low refresh rate (low being below 100 haha).
 
Television is bearable b/c they trick your eyes with motion blur (and a couple other things I think). If you look at any vertical object moving you can clearly see profound jerking.

If it's below 60, I can easily see those damn frames. It's a really fast stutter closer to 60, but annoying as hell.

And don't even get me started on refresh rate haha. I get a headache in literally two minutes with a low refresh rate (low being below 100 haha).

Sounds like a tumor :slap: just kidding :D
 
I agree with trt
 
Television is bearable b/c they trick your eyes with motion blur (and a couple other things I think). If you look at any vertical object moving you can clearly see profound jerking.

If it's below 60, I can easily see those damn frames. It's a really fast stutter closer to 60, but annoying as hell.

And don't even get me started on refresh rate haha. I get a headache in literally two minutes with a low refresh rate (low being below 100 haha).

He could be seeing up to 85 FPS; the AVERAGE for humans is around 30, but every person perceives FPS differently. Some might not even be able to tell that they are playing a game at 20 fps.
 
I also use a CRT, and if the fps delivered by the video card stays over 30 without dropping below that, it can't be noticed.

Refresh rate has nothing to do with fps: http://hometheater.about.com/od/televisionbasics/qt/framevsrefresh.htm

When you watch a movie at 30 fps, do you see the frames?

It's the same principle whether you play a game or watch a movie: you don't need 200 fps to run it smoothly.

I can also notice a difference between 35 and 60 fps, at least in Test Drive Unlimited for the PC and Halo (PC). Oh yeah, I use a 19'' LCD by the way; maybe that has something to do with it.
 
So let me get this right: the GT200 is not the 9900 GTX now?
 
"GT200" doesn't exist.
G200 = GeForce GTX 260 & 280
 
Thank you


Too many cards are coming out...
 
As to all the FPS debating going on . . .

36-60 FPS is perfectly fine for slower-paced games (like Splinter Cell, Thief, most RTS, etc.), but for fast-paced games (like Doom 3 on nightmare, etc.) it's unacceptable.

As to what people can visually see, it depends on the person and on your setup - even here, I can usually tell when FPS starts dropping below 60 . . . but the only game I have an issue with "tearing" in, even over 60 FPS, is Crysis, and vsync causes too hard of a hit to enable it with triple buffering. I've never experienced tearing with any other game, and multi-GPU setups are a ton more susceptible to it than single GPUs.

The higher the screen res you play at, the more noticeable tearing will be - even if you have high FPS and a beefcake GPU to handle it, that's a ton more pixels the GPU has to render and keep pace with.

Your set refresh rate has a major impact on this if you're using LCDs. A 60 Hz refresh on a native 75 Hz LCD will lead to tearing or other issues.

But this all boils down to individual ability to perceive what's on screen - everyone is different, some more sensitive to it than others.
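
As a rough illustration of the "more pixels to keep pace with" point above, here is a back-of-the-envelope sketch (my own numbers, not from the thread) of raw pixel throughput at a couple of common resolutions:

Code:
# Hypothetical illustration: raw pixels a GPU must fill per second at a given
# resolution and frame rate (ignores overdraw, AA and everything else).
def pixels_per_second(width, height, fps):
    return width * height * fps

for (w, h), fps in [((1280, 1024), 60), ((1920, 1200), 60), ((1920, 1200), 75)]:
    print(f"{w}x{h} @ {fps} fps -> {pixels_per_second(w, h, fps) / 1e6:.0f} Mpixels/s")

1920x1200 at the same frame rate is roughly 1.75x the pixels of 1280x1024, which is why tearing and frame drops show up sooner at higher resolutions.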
 
OK, as for this silly fps debate...

Like others have mentioned, some games can be fine at a lower fps: Crysis is perfectly playable above 30 fps, and Test Drive Unlimited can be 45-50 and smooth as.... BUT:

First-person shooters are a different story.

Sure enough, our eyes cannot see the difference above 60 Hz; however, your hand can feel the difference in responsiveness at 100 fps as opposed to 60 fps.

UT3 is a great example of this: if you run the game stock, it's capped at 62 fps, which is more than smooth, don't get me wrong, but when I set max fps to 150, I usually get 90-110, and there is a HUGE difference in how fast you can react, especially in such an intense and fast-paced game like UT3.

So the short answer is maybe we can't SEE the difference, but we can FEEL the difference of faster gameplay.

Thus, to me, 75+ fps is always what I aim for in those kinds of games, and to get 75+ fps @ 1920x1200 + 4xAA + 16xAF in every game, your gfx card needs to be high-end. And I will absolutely be going for the new NVIDIA high end when it arrives.
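
For what it's worth, the frame-time math behind the "feel" argument is simple enough to sketch, using the frame rates mentioned in the post above (the input-lag reading of it is my own simplification):

Code:
# Time between frames at each frame rate -- a rough upper bound on how stale
# the image you are reacting to can be (ignoring engine and display latency).
for fps in (30, 60, 62, 100, 150):
    print(f"{fps:3d} fps -> {1000.0 / fps:5.1f} ms per frame")

Going from 62 fps to around 100 fps cuts the per-frame interval from roughly 16 ms to 10 ms, which is the kind of difference you can feel in a twitch shooter even if you can't consciously see it.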

Especially cos it will sock it to the ever more cruddy-looking 48xx series.

All they've done is add 50% more shader units, 50% more texture units, and given it faster memory.

The NVIDIA cards have more ROPs, more memory, more bus width, more shaders, more texture units, and a refined design of what we KNOW works. NVIDIA has literally made this card beefier as opposed to trying to fill in its shortcomings.

NV FTW.
 
Especially cos it will sock it to the ever more cruddy-looking 48xx series.

All they've done is add 50% more shader units, 50% more texture units, and given it faster memory.

The NVIDIA cards have more ROPs, more memory, more bus width, more shaders, more texture units, and a refined design of what we KNOW works. NVIDIA has literally made this card beefier as opposed to trying to fill in its shortcomings.

NV FTW.

Ok you were good until you said that -1
 
OK, as for this silly fps debate...
Sure enough, our eyes cannot see the difference above 60 Hz; however, your hand can feel the difference in responsiveness at 100 fps as opposed to 60 fps.

UT3 is a great example of this: if you run the game stock, it's capped at 62 fps, which is more than smooth, don't get me wrong, but when I set max fps to 150, I usually get 90-110, and there is a HUGE difference in how fast you can react, especially in such an intense and fast-paced game like UT3.

So the short answer is maybe we can't SEE the difference, but we can FEEL the difference of faster gameplay.

Refresh rate is pretty analogous to FPS. Fluorescent lights flicker to me b/c I can easily see 60 Hz (monitors are even worse, think page flipping). Even 85 Hz starts to strain my eyes after a couple of minutes (that's an annoyingly fast flicker).

But I agree, a higher framerate does play better (that's how I got good at CS - 120 with vsync on). But I also don't know how anyone plays without vsync. Sure, you can get a massive performance penalty (up to 50% if your card can't keep up), but it's better to me than the entire screen tearing and disrupting my view (and pissing me off haha). Of course, my refresh is 100 so it would be 50 fps minimum and not 30 :D
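
That "50 fps minimum and not 30" remark is the classic vsync quantisation effect. A minimal sketch of the idea (my own simplified model, assuming plain double buffering with no triple buffering or driver queueing):

Code:
import math

# With vsync and double buffering, a frame that misses a refresh deadline waits
# for the next one, so the displayed rate snaps to refresh_hz / n for integer n.
def vsync_capped_fps(refresh_hz, render_ms):
    interval_ms = 1000.0 / refresh_hz
    intervals = math.ceil(render_ms / interval_ms)  # whole refresh periods consumed
    return refresh_hz / intervals

print(vsync_capped_fps(100, 11.0))  # just misses the 10 ms deadline -> 50.0 fps
print(vsync_capped_fps(60, 17.0))   # just misses the 16.7 ms deadline -> 30.0 fps

So on a 100 Hz screen the first step down under vsync is 50 fps, whereas on a 60 Hz screen it is 30 fps, which is exactly why the floor here is 50 rather than 30.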
 
Ok you were good until you said that -1

lol, kudos to ATI for giving NV the competition, but they are thoroughly prawned this time around, and we will soon see about next gen.

In any case it's each to their own, and the price is always right for both products, so there's no dumb choice.
 
Refresh rate is pretty analogous to FPS. Fluorescent lights flicker to me b/c I can easily see 60 Hz (monitors are even worse, think page flipping). Even 85 Hz starts to strain my eyes after a couple of minutes (that's an annoyingly fast flicker).

But I agree, a higher framerate does play better (that's how I got good at CS - 120 with vsync on). But I also don't know how anyone plays without vsync. Sure, you can get a massive performance penalty (up to 50% if your card can't keep up), but it's better to me than the entire screen tearing and disrupting my view (and pissing me off haha). Of course, my refresh is 100 so it would be 50 fps minimum and not 30 :D

Everybody can see fluorescent lights flicker. And yes, monitors can flicker also. 60 fps is not too slow in any game, unless it is so fast-paced that things are moving 3-4 times as fast as they normally would. People don't see different Hz (at least not anything worth noting); people just differ in how much they care, and occasionally how much it strains their eyes. Me, I can run a game at 25-30 fps with occasional tearing and not really mind, even in a game like Crysis. Some need to have 60 fps w/ no tearing to be happy; that's fine too. I personally am glad it doesn't bother me.
 
I just think it's a little soon to be counting anyone out quite yet. Not only did ATI double the number of shaders, but they also increased the shader clock on them by a fair margin, from 775 to 1050 MHz, a fair increase from any view. They also doubled their TMUs. ROPs supposedly don't have as much importance anymore when things are shader-focused. Even so, though, the NVIDIA cards listed there are MUCH more expensive than the ATI cards that are coming out.
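
Taking just the numbers quoted in that post at face value, the raw shader-throughput scaling works out roughly like this (a back-of-the-envelope sketch that ignores architecture, efficiency and memory bandwidth entirely):

Code:
# Relative ALU throughput implied by the post's figures alone:
# 2x the shader units, shader clock going from 775 MHz to 1050 MHz.
shader_ratio = 2.0            # "double the number of shaders"
clock_ratio = 1050.0 / 775.0  # quoted shader clocks, MHz

print(f"theoretical shader throughput: ~{shader_ratio * clock_ratio:.1f}x")  # ~2.7x

Roughly 2.7x the previous part on paper, which is why it seems early to count ATI out even if the NVIDIA cards end up faster overall.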
 
I just think it's a little soon to be counting anyone out quite yet. Not only did ATI double the number of shaders, but they also increased the shader clock on them by a fair margin, from 775 to 1050 MHz, a fair increase from any view. They also doubled their TMUs. ROPs supposedly don't have as much importance anymore when things are shader-focused. Even so, though, the NVIDIA cards listed there are MUCH more expensive than the ATI cards that are coming out.

I would wager it won't be much different from the way it's been recently. NVIDIA will remain on top, but ATI will offer very compelling price/performance. I really do hope that the 4870 competes w/ these GTXs though; then we wouldn't have to pay such outrageous prices for them, and it would drive the whole market down. I hope *crosses fingers*.
 
I just think it's a little soon to be counting anyone out quite yet. Not only did ATI double the number of shaders, but they also increased the shader clock on them by a fair margin, from 775 to 1050 MHz, a fair increase from any view. They also doubled their TMUs. ROPs supposedly don't have as much importance anymore when things are shader-focused. Even so, though, the NVIDIA cards listed there are MUCH more expensive than the ATI cards that are coming out.

I bet the price will be astronomical (the current rumored price) until ATI releases; then they will be close in price according to real-world performance. They're just trying to get some bank b/c they cost like $130 (read that somewhere) to make per chip. Which is freaking insane haha.
 
I bet the price will be astronomical (the current rumored price) until ATI releases; then they will be close in price according to real-world performance. They're just trying to get some bank b/c they cost like $130 (read that somewhere) to make per chip. Which is freaking insane haha.

ATI is actually supposed to release theirs 3 days before these.
 