Originally Posted by BigMack70
Didn't expect to see a "human eye can only see [x] fps" troll in this thread
I'm not even gonna get into that nonsense...
First of all, I'm not trolling. Second, what I said is correct, and it's not something to take lightly. But I did make the mistake of applying it to gaming; movies really are another story. (I'm a professional video editor, not a gamer at all.)
Just to polish my argument, take a Blu-ray movie: if it's NTSC it runs at about 30fps (29.97, strictly), and most film content is actually 24fps. You don't see any movie dropping frames at that framerate, even a computer-animated movie like Toy Story, unless the disc gets dirty or something and it starts stuttering. Of course, Toy Story was rendered offline before its final release, not in real time. That makes a difference in games: talking with my colleagues about it, motion blur and other effects may not look good when rendered at 30fps, and more frames really are needed.
Also, on the subject of real-time rendering, a colleague of mine demoed a 3D scene to me where some elements were rendered at a different pace. Depending on the developer, some elements can be rendered at a higher framerate than others within the same scene. So a video card that reaches 120 frames per second can overcome this issue very easily.
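To give an idea of what "different elements at different paces" can look like in code, here is a minimal sketch (my own illustration, not from my colleague's demo) of a game loop where one element updates every frame and another only every second frame; the names and the 60-frame loop are just assumptions for the example.

```python
# Sketch of per-element update rates in a single render loop:
# a "fast" element updates every frame, a "slow" element every other frame.

def run_frames(total_frames):
    fast_updates = 0  # e.g. the player character, updated at the full framerate
    slow_updates = 0  # e.g. distant scenery, updated at half the framerate
    for frame in range(total_frames):
        fast_updates += 1          # updated on every frame
        if frame % 2 == 0:
            slow_updates += 1      # updated only on even frames
    return fast_updates, slow_updates

print(run_frames(60))  # over 60 frames: fast gets 60 updates, slow gets 30
```

With a card pushing 120fps overall, even the "slow" elements in such a scheme would still land at 60 updates per second, which is why raw framerate headroom helps here.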