
Since when did 60fps gaming become a must?

At the time I was a high schooler with a job and some money, my game of choice was Quake 3. That game's physics calculations were fps dependent and if you were at 125fps, you could make some jumps that you otherwise couldn't. So that was the fps target, resolution and quality settings be damned.
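For context, that quirk exists because the game integrates movement once per frame, so the result depends on the frame time. Here's a minimal, made-up sketch of the general effect (this is not id Tech 3's actual code, and the real 125 fps quirk in Quake 3 is usually attributed to how the engine quantizes frame times and rounds movement each frame):

```python
# Minimal sketch, NOT id Tech 3's actual code: integrating a jump once per
# frame makes the apex come out slightly different at different framerates.
# All numbers are made up for illustration.
def jump_apex(fps, jump_velocity=270.0, gravity=800.0):
    dt = 1.0 / fps
    height, velocity, apex = 0.0, jump_velocity, 0.0
    while velocity > 0.0 or height > 0.0:
        height += velocity * dt   # move with this frame's velocity...
        velocity -= gravity * dt  # ...then apply gravity for the frame
        apex = max(apex, height)
    return apex

for fps in (30, 60, 125):
    print(f"{fps:>3} fps -> apex {jump_apex(fps):.2f} units")
```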
 
Oh shit, keeping the balls (other than mine) clean... I don't miss ball mice. :3

The person who invented optical mice is a hero.
The person who invented laser mice is a hero too. So is the one who invented LCD. I couldn't play or do anything on the PC for longer than 2-3 hours with CRT monitors without getting severe headaches.

PC games have a positive effect on brain structure

Hippocampus: Switching point in the brain
The hippocampus is an area in the brain where information from different senses converges, is processed and transmitted to the cortex. Thus, it is crucial for the formation of new memories and the transfer of memory content from short-term to long-term memory. Each hemisphere of the brain has a hippocampus.

The hippocampus adapts to current challenges. This is because the brain is not a rigid structure, but changes continuously with our personal experiences, for example through the formation and networking of new nerve cells.

Cognitive tests and magnetic resonance imaging studies of the brain show that computer games are well suited as a challenge for seniors and can compensate for the consequences of a lack of exercise, at least in the brain.

Physical activity has been shown to help prevent and treat dementia. Computer games have similar effects on brain structure, although those affected hardly move at all. They also lead to an increase in the size of the hippocampus. After a certain training phase, the challenges contained in the game map themselves onto the relevant areas of the brain.

It is crucial that the players develop a three-dimensional imagination, i.e. that they move in virtual space. It then doesn't matter to their brain whether they train with real or virtual movement - the hippocampus grows and with it the performance of the memory.

In this case, this applies to everyone, not just the elderly.
That's interesting - especially the part that says games substitute physical exercise for the brain. No wonder I never go to the gym. :D Although, I don't think it necessarily has to be reflex-based shooters. Any game can be a form of brain exercise.
 
I'm probably considered crazy, but I've gamed at 30FPS for years, though the only 'online' gaming I do is coop play with a friend in Far Cry New Dawn and Far Cry 6. I have always purposely set the frame rate limiter to 30FPS, mostly because it cuts down on the noise and heat output, and frankly I can't tell the difference between 30 and 60+ in the games I play.

This also allows me to enjoy many modern games on very low end hardware too!:D On my Windows 98 retro PC I don't see an easy way to limit frames, so on that system whatever it's got, I use!
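For what it's worth, a frame limiter isn't magic; conceptually it's just "render, then sleep until the next frame is due", which is exactly why it cuts heat and noise. A rough sketch under that assumption (illustrative only - RTSS and driver-level caps use far more precise timing than this):

```python
import time

# Rough sketch of a sleep-based 30 fps cap. Real limiters (RTSS, in-game
# caps, driver-level caps) use higher-precision timers, but the idea is
# the same: finish the frame, then idle until the next frame is due.
TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS

def run(num_frames=90):
    next_deadline = time.perf_counter() + FRAME_BUDGET
    for _ in range(num_frames):
        time.sleep(0.005)                        # stand-in for "render one frame"
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)                # idle -> less heat, less fan noise
            next_deadline += FRAME_BUDGET
        else:
            # missed the budget; resync so we don't "catch up" with a burst of frames
            next_deadline = time.perf_counter() + FRAME_BUDGET

run()
```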
 

do you play with a controller?

I play some games with a controller, and I feel that with it, 30 or 60 fps doesn't make much difference (putting competitive gaming aside), as the controller's range of motion is much slower. With a mouse and keyboard, even in slow games you can tell the difference when moving the mouse around. More than 60 fps is more debatable.
 
No, definitely not on my first PC, which was a 486 variant with a 50MB hard drive that could run Doom. It wasn't until I bought a high-end video card much later that I began to appreciate high framerates. 60fps or higher is better for your gaming experience; motion looks smoother the greater your framerate.
 
Oh yeah. First game I played in 60 fps was Tanki Online on my current PC that I bought because Tanki X was launching. Before that, I used to play it on 20-30 fps on some really old PC.
 
Since when did people become allowed to have personal preferences?
You're free to play at <10 fps slideshows if you want to. It's your right as well.
Just like it's other people's right to expect 60+ fps.

I actually remember the first time I saw confirmed 60+ fps, back in 2000-2001 in Colin McRae Rally 2.0. A buddy showed me how smoothly it ran on his Voodoo 3 as opposed to my paltry Riva TNT2 M64. On mine it was playable, but seeing that smoothness for the first time kind of blew me away.
 
Yep, 60 fps synced with Triple Buffering was the aim. Crysis didn't help... I've moved with the times and now aim for 120/144 fps synced where I can.

I remember reading that I had to enable "Triple Buffering" in the Radeon display driver options (9800 Pro/X800), combined with Vsync, to get a smoother 60 FPS.

I'm not sure if this is done automatically these days, because I haven't seen a Triple Buffering option in the Radeon display driver for some time, or I just can't find it. Do Nvidia cards still have this option? I haven't used their cards for a while, so I'm not sure if it's still in their driver options.

Does anybody still use this option? Or was/is it for OpenGL only?
 
I'm too old to care about FPS; either the game feels smooth or it doesn't. If you need an FPS counter to see if you're having fun, you're running a benchmark, not playing a game.

I'm in the same boat, well pretty much except the too old part.:oops: 'turning 33 in 1+ week'

Until my mid high school years, I never checked the FPS in games. I was semi-seriously playing Unreal Tournament 2003/4 at the time, and I have no idea what FPS it was running at, but it was smooth enough not to hinder my performance/fun.

Nowadays I almost only play single player games and rarely MMOs and in those I'm totally fine with anything around 45-50 FPS/75Hz display as long as it doesn't have crazy stutters and such.
Usually I try to tweak the settings according to that and then disable Afterburner/Riva overlay and just keep playing/enjoying the game as it is.

At least this way I don't have to spend a fortune on a new GPU every few years or less.:) 'Whatever GPU I'm buying next, will have to serve me for the next 3-4 years as usual'
 
Back in the day, 30fps was playable even with a keyboard and a mouse because all of our displays were interlaced CRTs... and it was rare for a game to be able to run at 60fps. Back in the CRT days, on multisync monitors you could cheat a bit if the game had an INI you could tweak, by locking the game at, say, 50Hz instead of 60... but trusty old vsync at 30fps was good enough because that's what most of our video cards could handle.

60fps became more of a 'necessary thing' when LCDs took over the display market in the early 2000s and our ginormous 19" flatscreen 4:3 CRT monitors hit the recycle bin... sorry, I meant the trash. Nobody recycled in 2000 :D
 
For me, 40-ish is good enough for most games, 60 for online FPS shooters, and as close to 90 as possible in VR games.
On last-gen consoles 30 FPS was fine for me, as they somehow space out the frames evenly, so 30 FPS on a console feels much smoother than 30 FPS on PC.
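That "spacing out the frames" is frame pacing, and it's why an average fps number alone doesn't tell you how a game feels. A made-up toy comparison (not console scheduler code): two traces that both average 30 fps, where only the evenly paced one would feel smooth:

```python
# Two frame-time traces with the same ~30 fps average but very different
# pacing. Made-up numbers for illustration only.
even   = [33.3] * 30              # evenly paced 30 fps
uneven = [16.7, 50.0] * 15        # same average, alternating fast/slow frames

def summarize(name, frame_times_ms):
    avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
    worst   = max(frame_times_ms)
    print(f"{name}: avg {avg_fps:.1f} fps, worst frame {worst:.1f} ms")

summarize("even  ", even)
summarize("uneven", uneven)
# Both report ~30 fps on an average counter, but the uneven trace spends half
# its time on 50 ms frames, i.e. every other frame feels like 20 fps.
```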

Yep, 60 fps synced with Triple Buffering was the aim. Crysis didn't help... I've moved with the times and now aim for 120/144 fps synced where I can.

I remember reading that I had to enable "Triple Buffering" in the Radeon display driver options (9800 Pro/X800), combined with Vsync, to get a smoother 60 FPS.

I'm not sure if this is done automatically these days, because I haven't seen a Triple Buffering option in the Radeon display driver for some time, or I just can't find it. Do Nvidia cards still have this option? I haven't used their cards for a while, so I'm not sure if it's still in their driver options.

Does anybody still use this option? Or was/is it for OpenGL only?
I think that option would add smoothness but also latency, since it pre-renders frames, so it might not be up to today's standards. :D
 
I remember in 2006 I got my first own desktop PC with an ATi X300 SE GPU; at that time I was happy I could play GTA SA. :rockout:
Later on I upgraded to an ATi X1600 XT, playing NFS games, GTA SA, Far Cry, and I don't remember what else.
Later came some more GPU upgrades and so on.
I wasn't really focusing on fps at the time; I could even play Crysis at one point. :D

When did people start saying/wanting that you need 60 fps for gaming? (probably 60 @ 1280x1024 at that time)

Were you able to game at 60 fps on your first PC?
I'm almost certain that you want a higher FPS to drive games that have a lot of activity/movement occurring on the screen. Just to give an example, in the Front Lines game mode in World of Tanks, where it's 30 vs. 30, you want a higher frame rate so that when it dips, it's still fluid enough for you to control your aiming, tank movement, etc., when you end up with 40+ tanks fighting it out at the same objective.
 
I've wanted 60+ since BF2 (2005). My computer back then could do 25 FPS. It was terrible. Now I prefer to see 100+; I will lower settings until I reach that point. My monitor is 1440p 165Hz, but my 3060 video card isn't powerful enough to reach that frame rate in most games at 1440p. If crypto stays down, I will probably buy a 4000 series card or equivalent.
 
A very good question that I wonder about myself.

I remember gaming magazines in the early 2000s used to call 25 fps "playable" and 30 fps "good". I still hold that opinion, to be honest, with one addition: 40 fps is "excellent". I can't tell the difference above that. :laugh:

I think the high refresh rate mania is just an artificial creation to sell 6900 XTs and 3090s and crappy TN panel monitors, and for Intel to be able to label their products "the best gaming CPU". Nothing more.

Edit: DOS games didn't run anywhere near 60 fps, but we still enjoyed the heck out of them. Youngsters of this day and age don't know what gaming meant back then.
Ludicrous take. Even if you supposedly can't tell the difference above 40fps, which I find very hard to believe, the vast majority of gamers can, and do.
 
I remember in 2006 I got my first own desktop PC with an ATi X300 SE GPU; at that time I was happy I could play GTA SA. :rockout:
Later on I upgraded to an ATi X1600 XT, playing NFS games, GTA SA, Far Cry, and I don't remember what else.
Later came some more GPU upgrades and so on.
I wasn't really focusing on fps at the time; I could even play Crysis at one point. :D

When did people start saying/wanting that you need 60 fps for gaming? (probably 60 @ 1280x1024 at that time)

Were you able to game at 60 fps on your first PC?
Surprise, surprise, but it's been that way from very early days. Basically, after the 3dfx Voodoo launched, people became aware of higher fps and how important it is, and soon magazines like Maximum PC started pushing it. And frankly they were damn right back then, because in those days fps fluctuated a lot in games due to single-core chips and vastly different level complexity. My FX 5200 128MB could run Far Cry quite well in most levels - I got 40-50 fps - but the volcano level brought it down to a 15 fps average. In UT 2004 it could do ~80 fps, but if you flicked the mouse in the right direction it could drop to 30 fps, and those GOTY levels were unplayable due to low fps. Beyond the averages, 1% lows on most hardware used to be terrible and you could do nothing about it other than buying the best hardware you could.

Well, that's one thing; perhaps CRTs and their flickering also played a role before. But then again, even arcades ran at a locked 60 fps, and Sega launched the Dreamcast, which ran pretty much all games at 60 fps. 30 fps has always been more of a poor excuse for pushing graphics the hardware can't handle, or for pushing games consoles can't reasonably handle. Too bad that most people didn't care and we ended up with the X360/PS3 era of consoles, which in plenty of games couldn't even reach 720p and ran them with a 30 fps lock, but dipped way below that, to something like 10 fps.

And if you want a more humanistic take: most people had no idea what fps was at all, just used what they had and adjusted game settings by feel, and once they became aware of the definition of fps, they started to associate a good experience with 60 fps. So, as always, a lot of misinformation was spread and then compounded by newbies who don't even understand what fps is, and the cycle continued as the audience of gamers, computer users and internet users grew a lot through the 2000s.
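For anyone who hasn't met the term: "1% lows" are roughly the framerate of your slowest frames, which is what captures the dips that averages hide. A sketch of one common way to compute it (review tools differ in the exact methodology, so treat this as illustrative only):

```python
# Illustrative 1% low calculation from a list of frame times (ms):
# the average fps of the slowest 1% of frames.
def one_percent_low(frame_times_ms):
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    count = max(1, len(worst) // 100)              # the worst 1%
    avg_worst_ms = sum(worst[:count]) / count
    return 1000 / avg_worst_ms

# 200 smooth frames at 12.5 ms (80 fps) plus a couple of ~65-70 ms hitches:
frames = [12.5] * 200 + [65.0, 70.0]
avg_fps = 1000 / (sum(frames) / len(frames))
print(f"average: {avg_fps:.0f} fps, 1% low: {one_percent_low(frames):.0f} fps")
```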

Nah - the monitor is 250€ because the tournament resolution is 1080p.
The graphics card was the worst part.


[Attachment 251053]

I am one of those stupids... :D :roll:

[GIF: serve music video, by Polyvinyl Records]
I'm sorry mate, but that is TV.

On a serious note, I've always been crap at reflex-based online shooters. Never liked them, anyway. Maybe that's why I don't need 60 fps in my games up to this day.
Not even Unreal Tournament, Quake or Doom?

The person who invented laser mice is a hero too. So is the one who invented LCD. I couldn't play or do anything on the PC for longer than 2-3 hours with CRT monitors without getting severe headaches.
An optical mouse is always a laser mouse. What gets advertised as laser mice are just mice with acceleration.

Until my mid high school years, I never checked the FPS in games. I was semi-seriously playing Unreal Tournament 2003/4 at the time, and I have no idea what FPS it was running at, but it was smooth enough not to hinder my performance/fun.
That's mostly down to the fact that it was ridiculously easy to run well. Even my FX 5200 128MB ran it at ~60-80 fps at 1024x768 medium-high, and that card is roughly equivalent to a GeForce 2 GTS. It's even playable on a 6150 Go, crappy integrated graphics from that time that were 5 times slower than the FX 5200 - so about Voodoo 2 level performance. And by playable I mean it wasn't really fine, but it ran at 640x480 with ~45 fps, of course with terrible dips.
 
When did people start saying/wanting that you need 60 fps for gaming? (probably 60 @ 1280x1024 at that time).
I don't think there was ever a "single year" changeover, just a gradual 1995-2005, DirectX 7-8 era trend that became normalized:

- 30fps - Most 1980-1996 DOS games (Arena, Daggerfall, etc.) were 20-40fps. The Doom engine (1993-1996) was capped at 35fps. Infinity Engine games (Baldur's Gate (1998), Planescape: Torment (1999), etc.) had a 30fps cap. Jagged Alliance 2 (1999) was 30fps, Diablo 2 (2000) was 25fps, etc. There were some early 2000s Flash games capped at 30fps, plus the Adventure Game Studio engine's 40fps cap, but for many genres like point & click adventures it didn't impact gameplay much.

- 60fps - Quake 1-3 (1996-1999), Thief 1-2 (1997-1998), Unreal 1 (1998), Deus Ex (2000), No One Lives Forever 1-2 (2000-2002), System Shock 2 (1999), Neverwinter Nights (2002), Morrowind (2002), Serious Sam FE & SE (2001-2002), Medal of Honor: Allied Assault (2002), etc, and many more could all run at 60fps.

So it's really the late 90's where games changed "technically", and probably the early 2000's where the "expectation" of 60fps changed to the point where it would have been weird / felt out of place to have released games like Call of Duty (2003), Far Cry (2004), FEAR (2005), Oblivion (2006), Bioshock (2007), etc, with 30fps caps.

Were you able to game at 60 fps on your first PC?
Not a chance in hell given my first PC was a 16MHz 286 with 1MB RAM, 256kb VRAM and 40MB HDD. :)
 
Hi,
To me, fluidity is a must and always has been.
No counting frames per second; most on-screen overlay stuff causes more issues than it's supposed to help with.
 
Frankly, I don't think real gamers care about this arbitrary number. It is just some reviewers who obsess over it for no logical reason. I think we should just look at their benchmarks; their opinions are just as valuable as any other gamer's - they have their biases, and 60fps is just one of them.

I do play at a high refresh rate and can notice the difference between 120, 100, 60 fps, etc., but I don't feel it's all that important. I am a sucker for true high resolution (no DLSS or FSR shenanigans), good textures and especially lots of polygons - I like to see curvy things displayed correctly!!!
 
Since when? It's still not a 'must', imo. But I'm guessing around the time of the previous console gen, when PC vs console gaming became more and more comparable.
When marketing gets more focus than the actual product/IP itself. Consumer society. Generated needs, not actual needs.
But good things that are actually useful can still appear among the massive crap we get as 'progress'.
Maybe a bit of a harsh opinion, but it is what it is.

Above 25 fps and without tearing is okay for me ;)
 
I don't remember much about the first gaming computer I had, but I wouldn't call it a gaming computer...it was just good enough to run Half-Life and the Counter Strike mod back in '99.

My first official gaming PC that I purchased had an AMD X2 3800+ (Manchester) on socket 939, 4GB of DDR2, and ran 7600GT cards in SLI. That system powered through Oblivion (which had just released around the time I got my PC) like it was nothing. I learned how to OC on that system. Took it up from the stock 2.0 to 3.1GHz... took a bit of tinkering with RAM settings to get things to run stable at that speed.

I never, not once, back then checked what my FPS was and I really don't do that today. I just play games and try to enjoy them. So I don't know what kind of FPS any system of mine really does in any game. If the performance looks good, I play. If things are choppy with my settings, I adjust them down and play.
 
Yep, 60 fps synced with Triple Buffering was the aim. Crysis didn't help... I've moved with the times and now aim for 120/144 fps synced where I can.

I remember reading that I had to enable "Triple Buffering" in the Radeon display driver options (9800 Pro/X800), combined with Vsync, to get a smoother 60 FPS.

I'm not sure if this is done automatically these days, because I haven't seen a Triple Buffering option in the Radeon display driver for some time, or I just can't find it. Do Nvidia cards still have this option? I haven't used their cards for a while, so I'm not sure if it's still in their driver options.

Does anybody still use this option? Or was/is it for OpenGL only?

Triple buffering is still in NVCP and I enable it sometimes for framecapped games.

Sometimes it might help a little with game engines that have horrible framepacing (Halo Infinite) or just cannot function without Vsync (War Thunder), but it mostly doesn't do anything over plain Vsync.

The industry has long since moved away from Vsync and triple buffering anyway, since high refresh rates enable much lower input lag, and both Vsync and triple buffering are completely antithetical to reducing latency. For smoothness, Freesync/Gsync even at 60fps gives you the smoothness of Vsync without the penalty.
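To show why triple buffering was recommended alongside Vsync in the first place, here's a toy model with assumed numbers (not how any real driver schedules its swap chain): plain double-buffered Vsync rounds frame times up to whole refresh periods, so barely missing 16.7 ms drops you straight to 30 fps, while a third buffer lets the GPU keep rendering at its raw rate - at the cost of the extra buffered frame of latency mentioned above.

```python
import math

# Toy arithmetic only (not a real driver model): effective framerate under
# double-buffered Vsync vs. an idealized triple-buffered swap chain at 60 Hz.
REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ           # ~16.7 ms between scanouts

def double_buffered_fps(render_ms):
    # With two buffers the GPU waits for the next vblank before it can reuse
    # the back buffer, so frame time rounds UP to whole refresh periods.
    frame_ms = math.ceil(render_ms / REFRESH_MS) * REFRESH_MS
    return 1000 / frame_ms

def triple_buffered_fps(render_ms):
    # A third buffer lets the GPU keep rendering instead of stalling, so you
    # get roughly the raw render rate, capped by the refresh rate, at the
    # cost of up to one extra buffered frame of input latency.
    return min(1000 / render_ms, REFRESH_HZ)

for render_ms in (14, 18, 20, 25):
    print(f"{render_ms} ms/frame: double-buffered Vsync ~"
          f"{double_buffered_fps(render_ms):.0f} fps, "
          f"triple-buffered ~{triple_buffered_fps(render_ms):.0f} fps")
```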
 
I grew up around PSX and PC. There were a lot of games that ran like crap when 3d gaming was still maturing so I'm conditioned to be fairly tolerant to poor performance. Yes, games running in the 20s (PAL...) always felt crappy, but "that's just how it is". However as technology improved, my expectations adjusted accordingly.

60 FPS became a "must" as soon as it became realistically attainable. There were games where it just would not happen, but 60 was always the goal. Nowadays, not so much. I admit, I was a late adopter of high refresh rate monitors, but as soon as I got myself a 120Hz monitor there was no going back; 120 is now the goal. Nowadays, gaming on a 240Hz monitor, 240 is the goal for games where it makes sense, but for casual games I'm still happy with 120. Suffice it to say, it's hard to go back to consoles nowadays, especially the Switch. Their performance reminds me too much of the bad old days. I'm also "one of those people" that uses motion interpolation to bring TV content up to 120 FPS, regardless of artefacting - more fluid more better.

As to when exactly 60 became a "must", I think in the mid-2000s it started becoming a strong want (I certainly wished my P4 GeForce 6600 machine would have run BF2 better than it did) and by 2010 it was pretty much a must.
 
Every time I get above 60, graphics advancements knock me back down :toast:

It started when powerful cards were causing tearing and display stutter. G-Sync/FreeSync and other limiters would cap the frame rates at 30/60. Most monitors back then were 60Hz, so that became the goal.
 
This thread, guys... I remember 30 FPS being the norm, but 60 now? I get it, but if you don't have the internet speed or the ping, one should not even bother gaming.
 
60 probably became the baseline because when you're doing testing, you need a baseline of some sort. 60 makes a lot of sense because that's the default refresh rate of the majority of monitors out there, and if you can maintain 60fps, you're probably doing pretty well on the screen tear and smoothness front (assuming reasonable frametime averages). So without knowing a given reader's hardware and tolerance for performance hiccups, 60's a more-or-less reasonable target for a hypothetically smooth experience.
 