
That time when you had to play at low settings..

Yeah! When games were just games, not sequels or prequels. Man, I miss those times!
Totally agree. Though the early 2000s were the shit, as we had the PS2 :)
 
The last time I played on minimum settings was Wolfenstein: The New Order in 2016, on my previous lappy, as I didn't have a desktop at the time. It was running on a GT 745M at minimum settings and 720p, and I could only manage 30fps average. I don't miss those days at all.
 
I grew up with a very low end XP machine. 1.5GHz Pentium 4, 256MB PC100 RAM, 64MB GeForce2 MX400. I mainly played older games (from the mid-late 90s), but I still had a fair few "modern" games like Max Payne, Mafia, Vice City, etc, and the framerates on those could be in the low-mid 20s (sometimes even in the high teens). But I put up with it, because my parents wouldn't buy anything better. As a result, I'm very forgiving of low framerates in games, and really don't understand people who say they can't enjoy a game unless it's at 60fps or higher. Spoiled brats.
Still was better than my AT system.

I got a P4 Willamette 1.7 with a Hercules 3D Prophet II GTS Pro 64MB and 1GB of RAM at that point, then a Northwood 2.4 (533 FSB) with 2GB of DDR333 and an ATI Radeon AIW 9700 Pro, then moved to Athlon XP. Still on the FX here (thinking about getting a 6900 XT Nitro for it lol)
 
Still was better than my AT system.

I got a P4 Willamette 1.7 with a Hercules 3D Prophet II GTS Pro 64MB and 1GB of RAM at that point, then a Northwood 2.4 (533 FSB) with 2GB of DDR333 and an ATI Radeon AIW 9700 Pro, then moved to Athlon XP. Still on the FX here (thinking about getting a 6900 XT Nitro for it lol)
Damn, that sounds more than insane, but who am I to judge.. :D
 
Spoiled brats.
Or people who've experienced worse, and hated it. My whole childhood, from the late 80s through 2001-ish, was on the same 486 DX2 Packard Bell with no hardware GPU acceleration and 8MB of RAM.

Now I can get better, and I insist on it. No more sub-60fps 320x240 TIE Fighter for me.
 
Damn, that sounds more than insane, but who am I to judge.. :D
Not insane just to have fun with it, do some testing of my own.
 
I played about 2000 hours' worth of MMOs at sub-30 FPS. WoW wasn't buttery smooth for me... but Tera and Guild Wars 2 on a throttling laptop were the best: all the way from 10 to 19 FPS of pure goodness. It was mindless zerg, so whatever, images moved and there was loot at the end :)
 
What I can still remember:
  • Tiberian Sun still ran pretty poorly on a K6-2 500MHz.
  • Jedi Outcast got 20 avg FPS on GF2 Pro + K6-2 500.
  • Warcraft 3 was close to unplayable on a K6-2 500 even with a GF2.
  • The Celeron 433A was a ton faster than the K6-2 in everything.
 
Me and my brother bought a shiny new PC. We could have chosen a 3dfx card, but why would we when we could have a PowerVR card? What we suffered with that choice. Much worse than having to choose low settings.
 
I don't remember if I ever played anything at low settings, but Doom 3 and Crysis, and probably a few more, were the ones where I had to compromise a lot on the graphics settings.
 
4K is wonderful because 1080p doesn't need any scaling, just double the pixels if I need to run something at 1080p. :)
Nah, it doesn't work that way at all. 1080p stretched on 4K with no scaling looks like ass, because the pixels aren't perfect squares of one colour - they're multiple colours shining together in odd shapes.
 
Nah, it doesn't work that way at all. 1080p stretched on 4K with no scaling looks like ass, because the pixels aren't perfect squares of one colour - they're multiple colours shining together in odd shapes.
There shouldn't be odd shapes if you scale up from an even multiple. Upscaling 1080p to 4K in an ideal world would use 4 pixels (in a square) on the 4K display to represent a single pixel of the 1080p output. Pixels are squares of 3 colors, and having 4 pixels in a larger square is better than going from, say, 1440p to 4K, because then it's no longer 1 pixel mapping to 4 pixels (in a square) - it's part of a pixel in a square. So ideally, 1080p should look better than upscaling from a resolution that doesn't divide evenly into 4K.
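For the curious, here's a rough sketch in Python (numpy assumed, names made up) of what that even-multiple mapping means: each 1080p pixel simply gets repeated into a 2x2 block on the 4K grid, so nothing is ever blended between neighbouring pixels.

import numpy as np

def integer_upscale(frame, factor=2):
    # Repeat each pixel 'factor' times along height and width (frame is H x W x 3).
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # dummy 1080p frame
frame_4k = integer_upscale(frame_1080p, factor=2)
print(frame_4k.shape)  # (2160, 3840, 3): every source pixel became a clean 2x2 block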
 
Not insane just to have fun with it, do some testing of my own.
You can crank up the graphics settings but your CPU will be a bottleneck.

Was playing GTA Vice City on my first own laptop back in 2003; it had a 2GHz Celeron and an ATI Radeon Xpress chipset. :D
No idea how many fps it ran at, since I didn't use an fps counter back then, but to my eyes Vice City ran pretty smooth. :D
Also tried other games, I just don't remember the titles, but I also had fun with TAXI.
I also bought Far Cry in 2004, but I was unable to play it on my laptop because it required a newer shader version, I believe, which my laptop didn't have. :ohwell:
 
I still sometimes do that
 
There shouldn't be odd shapes if you scale up from an even multiple. Upscaling 1080p to 4K in an ideal world would use 4 pixels (in a square) on the 4K display to represent a single pixel of the 1080p output. Pixels are squares of 3 colors, and having 4 pixels in a larger square is better than going from, say, 1440p to 4K, because then it's no longer 1 pixel mapping to 4 pixels (in a square) - it's part of a pixel in a square. So ideally, 1080p should look better than upscaling from a resolution that doesn't divide evenly into 4K.
That's what I meant, so basically 1080p is easy on a 4K monitor.
 
I remember building my first gaming PC: a Pentium 4 2.8GHz on an Intel motherboard with 2GB of DDR1, paired with an ATI Radeon X1650 512MB (AGP model). It ran Resident Evil 4 without a hiccup, not a single frame drop, with a few settings turned down. I still have the graphics card to this day, but the rest of the system went to e-waste because the board suffered cap leaks. Looking back now, it was great while it lasted :)
 
I remember building my first gaming PC: a Pentium 4 2.8GHz on an Intel motherboard with 2GB of DDR1, paired with an ATI Radeon X1650 512MB (AGP model). It ran Resident Evil 4 without a hiccup, not a single frame drop, with a few settings turned down. I still have the graphics card to this day, but the rest of the system went to e-waste because the board suffered cap leaks. Looking back now, it was great while it lasted :)
Damn, I can't remember what card I had back then when I also played the first RE4 PC version... I just remember that we beat the crap out of the Mercenaries mode with my stepbrother. :toast: Still one of the best games ever, can't wait for the remake!
 
Damn, I can't remember what card I had back then when I also played the first RE4 PC version... I just remember that we beat the crap out of the Mercenaries mode with my stepbrother. :toast: Still one of the best games ever, can't wait for the remake!
Resident Evil 3 had the Mercenaries mode on PC, same as on the PlayStation 1, but there's nothing like it in the remake. Currently I'm playing the modded Resident Evil 2 on PC, the SourceNext version.
 
I'm happy to say that I have never had to play a game on low to enjoy it. I bought my first PC at 27 years old. In those 22 years since then, I have always bought upper mid-range to high-end graphics cards. If I ever had to play a game with "low" settings then I'd just buy a better video card and then play.
 
The first decent PC I got was a Dell OptiPlex GX100, a Celeron 600MHz with Intel i810 graphics. Missing textures in RtCW, though MoHAA ran fine at around 10fps, and worse, GTA3 ran below 10fps even at 640x480. I still played it though, since that's the only thing I had. Too bad there was no AGP slot, but the i810 does have something like a 4MB framebuffer on the motherboard. The Intel graphics driver was horrible.

My older PC, a Pentium non-MMX 166MHz with an S3 Trio64V+, barely ran Red Alert 2: loading a mission took 15 minutes, and it couldn't even manage 1fps once it finished.
 
I think in the beginning it was more common than it is now, when games depended exclusively on CPUs, and it got worse later when dedicated GPUs came along. It was much harder then than it is today to play games at decent settings. Now you don't even have to spend a lot to play decently at 1080p.

The Voodoo launched at $299, about $570 in today's money, and there was no cheaper version. It's as if the cheapest GPU today cost $570 and there wasn't even a used one to buy. There would be a lot of ultra-low-settings gaming :D
 
Trying to play TW Medieval 2 using a GTS 450.
 