I think it's more a combination of setup, software, and an individual's natural ability to perceive it. On my setup, though, AMD was much more guilty of it. Hard to say why for sure, but note that I said my setup. Again, YMMV.
Movies and games are produced differently. Movies contain motion blur, which affects perceived smoothness; that blur is introduced by the camera's shutter at capture time. Games generate their images rather than capturing them, so every frame is rendered perfectly sharp. The human eye DOES perceive the difference: blur fools the eye and brain into seeing smooth movement.
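To illustrate the idea, here's a minimal sketch in Python/NumPy (my own toy illustration, not anything a real engine ships) of accumulation-style motion blur: instead of sampling a moving object at one exact instant per frame the way a game does, you average several sub-frame samples across a simulated shutter interval, which is roughly what a camera does.

```python
import numpy as np

WIDTH = 64           # 1-D "screen" width in pixels
SHUTTER_SAMPLES = 8  # sub-frame samples averaged per output frame

def render_instant(pos):
    """Render a 1-D frame of a 4-pixel-wide object at an exact instant.
    This is how a game renders: one perfectly sharp sample per frame."""
    frame = np.zeros(WIDTH)
    left = int(pos) % WIDTH
    frame[left:left + 4] = 1.0
    return frame

def render_blurred(pos, velocity, shutter=1.0):
    """Simulate a camera: average several samples taken while the
    'shutter' is open, smearing the object along its motion path."""
    acc = np.zeros(WIDTH)
    for i in range(SHUTTER_SAMPLES):
        t = (i / SHUTTER_SAMPLES) * shutter  # fraction of frame time
        acc += render_instant(pos + velocity * t)
    return acc / SHUTTER_SAMPLES

if __name__ == "__main__":
    # Object moving 12 pixels per frame.
    sharp = render_instant(10)
    blurred = render_blurred(10, velocity=12)
    # The sharp frame is a hard 4-pixel block; the blurred frame
    # spreads the same energy over ~16 pixels, so consecutive
    # frames overlap and the motion reads as continuous.
    print("sharp  :", np.nonzero(sharp)[0])
    print("blurred:", np.nonzero(blurred)[0])
```

The point of the sketch: the blurred frames overlap each other along the motion path, so your brain fills in the movement between them, while the sharp frames are disjoint blocks that jump from position to position.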
The few games that do have some sort of motion blur option generally feel much smoother at much lower framerates. Look at Crysis 1 as an example: with motion blur enabled, it felt smooth on most setups all the way down into the 30s, whereas other games need a much higher framerate to achieve the same level of smoothness.
Besides, you don't have to be able to see each individual frame to recognize when something isn't smooth. Most people I show these issues to first hand can't put a finger on what's wrong, but they perceive microstuttering as something that's just a little off and doesn't feel quite right.
EDIT: Found what I was looking for to prove my point. Even at 60 fps, some settings show a noticeable difference. It's even more pronounced if you view it on a high-quality CRT.