When did people start saying you need 60fps for gaming? (probably 60fps @ 1280x1024 at that time)
I don't think there was ever a single-year changeover, just a gradual 1995-2005 trend (roughly the DirectX 7-8 era) that became normalized:
- 30fps - Most 1980-1996 DOS games (Arena, Daggerfall, etc) ran at 20-40fps. The Doom engine (1993-1996) was capped at 35fps, i.e. one tick per two refreshes of its 70Hz VGA mode (see the fixed-tick sketch after this list). Infinity Engine games (Baldur's Gate (1998), Planescape: Torment (1999), etc) had a 30fps cap. Jagged Alliance 2 (1999) was 30fps, Diablo 2 (2000) was 25fps, etc. Some early 2000's Flash games were capped at 30fps, and the Adventure Game Studio engine had a 40fps cap, but for many genres like point & click adventures it didn't impact gameplay much.
- 60fps - Quake 1-3 (1996-1999), Thief 1-2 (1998-2000), Unreal 1 (1998), Deus Ex (2000), No One Lives Forever 1-2 (2000-2002), System Shock 2 (1999), Neverwinter Nights (2002), Morrowind (2002), Serious Sam FE & SE (2001-2002), Medal of Honor: Allied Assault (2002), and many more could all run at 60fps.
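For anyone curious how those hard caps actually worked: engines of that era usually ran their game logic on a fixed tick and rendered once per tick. Below is a minimal sketch of that pattern, not actual Doom source; it assumes POSIX clock_gettime() and busy-waits where a real engine would render or sleep.

```c
/* A minimal sketch (not actual Doom source) of a fixed-tick game loop:
 * game logic advances at a constant TICRATE no matter how fast the CPU
 * is, which is where hard fps caps like Doom's 35fps came from. */
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <time.h>

#define TICRATE 35 /* tics per second; Doom used 35, half its 70Hz VGA mode */

static long long now_ms(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (long long)ts.tv_sec * 1000 + ts.tv_nsec / 1000000;
}

int main(void) {
    long long start = now_ms();
    int tic = 0;

    while (tic < TICRATE * 3) { /* run ~3 seconds for the demo */
        /* scheduled wall-clock time of the next tick, drift-free */
        long long due = start + (long long)tic * 1000 / TICRATE;
        if (now_ms() >= due) {
            tic++; /* game logic advances exactly one fixed tick here */
            if (tic % TICRATE == 0)
                printf("simulated %d tics (~%d s)\n", tic, tic / TICRATE);
        }
        /* rendering once per tick (instead of as fast as possible)
         * is what makes the cap show up as "35fps" on any hardware */
    }
    return 0;
}
```

The upside of tying logic to a fixed tick was determinism: on the Doom engine, demos and multiplayer worked by replaying per-tic inputs, which only stays in sync if every machine simulates the same 35Hz timeline.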
So it's really the late 90's where games changed technically, and probably the early 2000's where the expectation of 60fps shifted to the point where it would have felt weird / out of place to release games like Call of Duty (2003), Far Cry (2004), FEAR (2005), Oblivion (2006), BioShock (2007), etc, with 30fps caps.
Were you able to game at 60fps on your first PC?
Not a chance in hell, given my first PC was a 16MHz 286 with 1MB RAM, 256KB VRAM and a 40MB HDD.