
That time when you had to play at low settings..

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
14,102 (3.06/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / console
Processor 5800X @ PBO +200 / i5-8600K @ 4.7GHz
Motherboard ROG Crosshair VII Hero / ROG Strix Z370-F
Cooling Custom loop CPU+GPU / Custom loop CPU
Memory 32GB DDR4-3466 / 16GB DDR4-3600
Video Card(s) Asus RTX 3080 TUF / Powercolor RX 6700 XT
Storage 3TB SSDs + 3TB / 372GB SSDs + 750GB
Display(s) 4K120 IPS + 4K60 IPS / 1080p projector @ 90"
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Sony WH-CH720N / Hecate G1500
Power Supply EVGA G2 750W / Seasonic FX-750
Mouse MX518 remake / Ajazz i303 Pro
Keyboard Roccat Vulcan 121 AIMO / Obinslab Anne 2 Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis
Well, what can I say. Completed FEAR with a GF4 Ti 4200 64MB and Crysis with a 6800 GS. That was truly years ago. IIRC, FEAR was at about the minimum settings; Crysis was at 1024x768 low (effects high), though I had to drop to 800x600 in the last level.

Still finished both tho. I kinda miss those days when a constant 60fps wasn't that necessary.
 
Doom3. Even though I bought a 6600 GT just for that game, man did it barely work.
 
PC mainboard with an SiS 6326 IGP, a Slot 1 Celeron 333, and 256MB of PC66 SDRAM (IBM-compatible AT).

It couldn't handle Expert PC Software SEGA Saturn games that well, and it was a total lag fest in Motocross Madness 2. That was in 2001.
 
Pentium MMX 233 + 32MB SIMM FP DRAM + S3 Virge = 5 FPS in 640x480 on software rendering mode in Half-Life 1.

Which is why I snicker at anyone who thinks their 10 year old PC is slow.
 
Doom3. Even though I bought a 6600 GT just for that game, man did it barely work.
Wasn't the 6600 GT fine for Doom 3 at medium settings? At least it was in my recent benchmarks (AGP version)..
 
The core of my PC is pretty near retirement age at this point; the CPU and GPU are both "tired", so that's basically every other game. :laugh:
 
Doom3. Even though I bought a 6600 GT just for that game, man did it barely work.
Gotta love that old gem of a game, still recall the sweaty armpits from being scared... I had an X800 XT displaying through a 1280x1024 CRT monitor back then. I had no idea about fps measurements. All I knew then was that it wasn't a slideshow! :D
 
I kinda miss those days when a constant 60fps wasn't that necessary.
It's your choice whether constant 60 fps is necessary or not. For me, it's not. Anything above 30, I'm happy. ;)

On topic: I played a bit of Cyberpunk at low settings on a GTX 1650 while I was saving up for something better. The experience wasn't bad at all. Sometimes I wonder why high end graphics cards exist nowadays.
 
It's your choice whether constant 60 fps is necessary or not. For me, it's not. Anything above 30, I'm happy. ;)

On topic: I played a bit of Cyberpunk at low settings on a GTX 1650 while I was saving up for something better. The experience wasn't bad at all. Sometimes I wonder why high end graphics cards exist nowadays.
Depends on the game.. these days a stable 60fps is a must, at least. :3
 
Pentium MMX 233 + 32MB SIMM FP DRAM + S3 Virge = 5 FPS in 640x480 on software rendering mode in Half-Life 1.

Which is why I snicker at any who thinks their 10 year old PC is slow.
Same here. My first PC was basically the same as the one you mentioned. I used it for 6 years... in an era when you literally had to build a new PC every year!

Depends on the game.. these days a stable 60fps is a must, at least. :3
Not for me, it's not. ;)
 
Not for me, it's not. ;)
We all have our own taste, I guess :D

I'll probably lower the settings in the future when I get a 4K60 monitor. The 1080 Ti is still a monster, but it's not enough for modern games at Ultra settings, especially at 4K.
 
Depends on the game.. these days a stable 60fps is a must, at least. :3
True that, and optimisation for the PC platform under Windows as well. Playing Wolfenstein: The New Order atm at 1440p Ultra settings; it's capped at 60fps, but I've had lows of 55fps in some parts with an RX 6800 XT paired with an OC'd i7-11700K.
 
True that, and optimisation for the PC platform under Windows as well. Playing Wolfenstein: The New Order atm; it's capped at 60fps, but I've had lows of 55fps in some parts with an RX 6800 XT paired with an OC'd i7-11700K.
What the hell, I remember streaming the whole game at 1080p maximum settings; I had a 980 and a totally solid 60fps the whole time.

Weird.
 
We all have our own taste, I guess :D

I'll probably lower the settings in the future when I get a 4K60 monitor. The 1080 Ti is still a monster, but it's not enough for modern games at Ultra settings, especially at 4K.
This is why I'm not planning on upgrading from 1080p60. I'm more than happy with my 6500 XT running it. :)

And when I'm not, I've got a 2070 on the shelf that I'm not using because it's too noisy for my taste.

In the 90s, we played with noisy computers at 30 fps max (if we were lucky). Since then, some people became FPS freaks. I didn't. I only became a silence freak. :D

I'd choose 30 FPS with a silent PC over 60 FPS with a noisy one any time.
 
This is why I'm not planning on upgrading from 1080p60. I'm more than happy with my 6500 XT running it. :)

And when I'm not, I've got a 2070 on the shelf that I'm not using because it's too noisy for my taste.

In the 90s, we played with noisy computers at 30 fps max (if we were lucky). Since then, some people became FPS freaks. I didn't. I only became a silence freak. :D

I'd choose 30 FPS with a silent PC over 60 FPS with a noisy one any time.
4K is wonderful because 1080p doesn't need any scaling, just double the pixels if I need to run something at 1080p. :)
 
Which is why I snicker at anyone who thinks their 10 year old PC is slow.
How about a 5 year old PC getting 6 FPS on an older game?

Assassin's Creed: Origins on integrated graphics. A slideshow it was, but the real killer was input lag. Actually, even Wolfenstein: Enemy Territory (the multiplayer one) benefitted considerably from an upgrade to discrete graphics. I couldn't figure out how I improved so much so suddenly; the input lag on the iGPU was worse than my ping to a server in Europe from the USA. Or maybe the two just added up to something noticeable.
 

I grew up with a very low end XP machine. 1.5GHz Pentium 4, 256MB PC100 RAM, 64MB GeForce2 MX400. I mainly played older games (from the mid-late 90s), but I still had a fair few "modern" games like Max Payne, Mafia, Vice City, etc, and the framerates on those could be in the low-mid 20s (sometimes even in the high teens). But I put up with it, because my parents wouldn't buy anything better. As a result, I'm very forgiving of low framerates in games, and really don't understand people who say they can't enjoy a game unless it's at 60fps or higher. Spoiled brats.
 
Pentium MMX 233 + 32MB SIMM FP DRAM + S3 Virge = 5 FPS in 640x480 on software rendering mode in Half-Life 1.

Which is why I snicker at anyone who thinks their 10 year old PC is slow.
It's not, if you have a Sandy/Ivy i7 and an SSD.
 
I grew up with a very low end XP machine. 1.5GHz Pentium 4, 256MB PC100 RAM, 64MB GeForce2 MX400. I mainly played older games (from the mid-late 90s), but I still had a fair few "modern" games like Max Payne, Mafia, Vice City, etc, and the framerates on those could be in the low-mid 20s (sometimes even in the high teens). But I put up with it, because my parents wouldn't buy anything better. As a result, I'm very forgiving of low framerates in games, and really don't understand people who say they can't enjoy a game unless it's at 60fps or higher. Spoiled brats.
Low-end and P4? Yo, I played games the whole summer of 2004 with a P2-400 @ 450, 160MB RAM, and a GF2 MX (the basic MX with 128-bit SDR).

And hell, I did enjoy that. :D
 
I grew up with a very low end XP machine. 1.5GHz Pentium 4, 256MB PC100 RAM, 64MB GeForce2 MX400. I mainly played older games (from the mid-late 90s), but I still had a fair few "modern" games like Max Payne, Mafia, Vice City, etc, and the framerates on those could be in the low-mid 20s (sometimes even in the high teens). But I put up with it, because my parents wouldn't buy anything better. As a result, I'm very forgiving of low framerates in games, and really don't understand people who say they can't enjoy a game unless it's at 60fps or higher. Spoiled brats.
I would have given my left leg for a GeForce 2 when I was stuck with an S3 Virge for 6 years! :oops:

Then I got a new computer with a Radeon 9600 in it, so I was happy. :D
 
How about a 5 year old PC getting 6 FPS on an older game?

Assassin's Creed: Origins on integrated graphics. A slideshow it was, but the real killer was input lag. Actually, even Wolfenstein: Enemy Territory (the multiplayer one) benefitted considerably from an upgrade to discrete graphics. I couldn't figure out how I improved so much so suddenly; the input lag on the iGPU was worse than my ping to a server in Europe from the USA. Or maybe the two just added up to something noticeable.

Fair enough, but keep in mind this was a 1997 CPU running a 1998 game...
 
Then I got a new computer with a Radeon 9600 in it, so I was happy. :D
I probably had a GF4 Ti 4200-8X when you had your Radeon. The GF4 was a solid performer, but damn, when I upgraded to a used 9700 Pro... that boost was something amazing.
 