Originally Posted by MatTheCat
My point was that V-sync off = tearing. No matter what the refresh rate of the monitor is or how fast the GFX card is.
It depends on the engine. Tearing is worse in some games than in others. For instance, Rage (id Software's) looks like an untracked VHS tape, while Source games and BF3 barely have any tearing (IMO).
I personally don't notice tearing unless it's REALLY bad - to me, as an FPS gamer, control input LAG is a death sentence, especially if you are playing online. Your performance just plummets.
When you say FPS gamers, I am assuming you mean someone like me: someone with a mouse pad the size of a couch pillow and a low-to-mid-DPI DeathAdder or some other high-speed-sensor mouse, who plays FPS games for that bit of adrenaline. FPS gamers, to me, are people who can, in the middle of a run, jump up, do a full 180, and land a headshot in well under a second, and who complain about shot registration on servers with a low tick rate.
These are the type of people who, back in the day, used to hack their IntelliPoint drivers to make the USB port poll at 1000 Hz instead of 250 Hz for that millisecond of an edge, who have entire forum threads on optimum sensitivity, and who are developing a time machine to go back and assassinate the people responsible for mouse acceleration and mouse smoothing.
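For anyone wondering why polling rate matters, the worst-case delay before the OS even sees a mouse movement is one polling interval. A quick back-of-the-envelope calc (plain Python, just to illustrate the arithmetic, not from any actual driver):

```python
def polling_interval_ms(rate_hz):
    """Worst-case delay before a mouse event is picked up, in milliseconds."""
    return 1000.0 / rate_hz

for rate in (125, 250, 500, 1000):
    print(f"{rate:>4} Hz -> up to {polling_interval_ms(rate):.1f} ms of input delay")
# 125 Hz -> up to 8.0 ms of input delay
# 250 Hz -> up to 4.0 ms of input delay
# 500 Hz -> up to 2.0 ms of input delay
# 1000 Hz -> up to 1.0 ms of input delay
```

So the jump from 250 Hz to 1000 Hz shaves up to 3 ms off the worst case - a small but real edge, and definitely milliseconds rather than microseconds.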
For those people, being able to see and react at 120 FPS is much more important than avoiding tearing, so yes, it gives a noticeable advantage. For everyone else, I am sure it is nice to have and makes things a bit smoother, but any advantage is minimal.
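The refresh-rate side of the trade-off works the same way. In a simplified model (ignoring driver buffering, triple buffering, etc.), a frame that misses the vertical blank with V-sync on has to wait for the next refresh, so the worst-case added latency is roughly one refresh interval:

```python
def refresh_interval_ms(hz):
    """Time between vertical refreshes, in milliseconds."""
    return 1000.0 / hz

def vsync_worst_case_added_latency_ms(hz):
    # Simplified assumption: a frame that just misses the vertical
    # blank waits one full refresh interval before it is displayed.
    return refresh_interval_ms(hz)

for hz in (60, 120):
    print(f"{hz} Hz: refresh every {refresh_interval_ms(hz):.1f} ms, "
          f"V-sync can add up to ~{vsync_worst_case_added_latency_ms(hz):.1f} ms")
# 60 Hz: refresh every 16.7 ms, V-sync can add up to ~16.7 ms
# 120 Hz: refresh every 8.3 ms, V-sync can add up to ~8.3 ms
```

That is why the 120 Hz crowd cares: even with V-sync on, the latency penalty is halved compared to 60 Hz, and with it off the penalty drops to nearly zero at the cost of tearing.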