You're still missing the point. I wanted to explain what I mean by "unstable but high framerate", not talk about old and dual-core CPUs. (The COD and Cyberpunk parts are quite stuttery, btw.)
I think that I have found what you mean:
Basically you need a fast chip, but one that is limited by cache or core count. An overclocked i3-7350K is a good example of an otherwise fast CPU that just doesn't have enough cores. While its average performance was quite okay, in games like Far Cry Primal it had inconsistent frame times, while an i5 was doing a lot better.
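Just to illustrate what I mean by "okay average, bad frame times" - here's a quick Python sketch of how frametime logs usually get summarized into average fps vs. 1% lows. The numbers are made up for illustration, not taken from the actual i3/i5 benchmarks:

```python
# Rough sketch: why "average fps" hides stutter.
# Frame times are in milliseconds (hypothetical numbers).
def summarize(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 / (sum(frame_times_ms) / n)
    # 1% low: average fps over the slowest 1% of frames
    worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_1pct_fps = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_1pct_fps

# A "fast but stuttery" CPU: mostly 8 ms frames with an occasional 40 ms spike.
stuttery = [8.0] * 99 + [40.0]
# A slower but consistent CPU: steady 12 ms frames.
steady = [12.0] * 100

print(summarize(stuttery))  # ~(120.2, 25.0) - high average, terrible lows
print(summarize(steady))    # ~(83.3, 83.3)  - lower average, zero stutter
```

Both averages look fine on a bar chart, but only the 1% low number tells you which one actually feels smooth.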
Or you can just pair a decent CPU with poopy RAM to simulate a poor cache on it; unfortunately, frame times were still decently consistent:
In 2021 it's really hard to find a chip that performs well overall yet stutters or has an unstable framerate. So I found this:
To me that's an adequately inconsistent framerate, but even then it still performs somewhat predictably. And you really can't top those Athlons without L3 cache in terms of random stuttering and otherwise good but unstable performance:
That I agree with. A GPU bottleneck gives you lower but stable framerates, which is more desirable than a stuttery CPU bottleneck.
Depends on how old you wanna go with such a statement. AGP cards with fewer pipelines (like 4) often had unstable framerates. Like this FX 5200:
In Doom it could get 50 fps in one area, while in others it was 15-25 fps. BTW that's the 64-bit model, not the faster 128-bit model, although they all had rather unpredictable performance in many games. Since it was the GT 710-GT 730 of its time, I would expect new potatoes to have similar problems, particularly the DDR3 models.
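For anyone wondering why the 64-bit bus hurts so much, the napkin math is just bus width in bytes times effective transfer rate. The clocks below are typical reference specs (400 MT/s DDR for the FX 5200, around 1800 MT/s for a DDR3 GT 730), so treat the results as ballpark figures:

```python
# Back-of-the-envelope memory bandwidth:
# GB/s = (bus width in bytes) * (effective transfer rate in MT/s) / 1000
def bandwidth_gbs(bus_bits, effective_mts):
    return (bus_bits / 8) * effective_mts / 1000

print(bandwidth_gbs(128, 400))   # FX 5200 128-bit, 400 MT/s DDR -> 6.4 GB/s
print(bandwidth_gbs(64, 400))    # FX 5200 64-bit                -> 3.2 GB/s
print(bandwidth_gbs(64, 1800))   # GT 730 64-bit DDR3 (typical)  -> 14.4 GB/s
```

Halving the bus halves the bandwidth outright, which is exactly where the 64-bit cards fall apart once a scene gets texture-heavy.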
Well, either you were, or you were making a pointless platitude that "at some level, performance is unacceptably low", which ... yes, I think it's safe to assume we all know that. In which case you misunderstood the point of the analogy you were responding to.
My point is that some low-end hardware is quite okay if you play at potato settings and can even be enjoyable, but there is hardware so rubbish that it isn't good under any circumstances - it just sucks. I had one such unfortunate experience with the nVidia GeForce 6150 Go. It was insufferable garbage. Nothing was playable at 640x480, except for Unreal Tournament 2004, which had a very unstable framerate averaging 30-50 fps, and the resolution was so low that it was legitimately hard to see opponents. The infuriating thing is that the same laptop had a Turion X2 TL-60 (2GHz K8 cores) and 2x2GB of DDR2. So it was an otherwise very capable machine, ruined by integrated graphics. Originally it had a single-core Sempron at 1.8GHz and 512MB of RAM, so it's not like it came with the Turion. I just decided to upgrade an old laptop.
Chill is a bit of a case of a great solution without a matching problem. The idea is superficially great, but flawed in that it assumes that low user input means low framerates are acceptable. To work well, such a system at least needs to account for on-screen movement. After all, if you're camping in a corner with a sniper rifle, it's hardly ideal that the GPU slows you down to 30fps just because you aren't moving. I used Chill for a bit when playing Divinity: Original Sin, which on the surface seemed like an ideal use case for a system like that, but in the end it proved far too aggressive, slowing down framerates annoyingly in relatively action-heavy parts of the game. Ultimately, to select the parts of a game where it's okay to show things at a lower frame rate, you need to know what is going on in the game, not just whether or not the user is touching the input devices. And even on-screen movement is difficult to judge - detailed facial animations look far better at 60fps than at 30, after all, yet a cutscene close-up of someone talking is hardly a screen with much movement.
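To make the flaw concrete, here's a toy sketch of what an input-driven limiter like that presumably does - to be clear, this is my guess at the general shape of the heuristic, not AMD's actual algorithm - plus the kind of motion term I'm arguing it would need:

```python
# Toy Chill-style limiter (speculative, not AMD's real algorithm):
# the framerate target rises with recent input activity and decays when idle.
FPS_MIN, FPS_MAX = 30, 144  # user-configurable bounds, like Chill's min/max

def target_fps(seconds_since_last_input):
    # Full speed right after input, decaying toward FPS_MIN over ~2 seconds.
    if seconds_since_last_input < 0.1:
        return FPS_MAX
    decay = max(0.0, 1.0 - seconds_since_last_input / 2.0)
    return FPS_MIN + (FPS_MAX - FPS_MIN) * decay

# The flaw described above: a camping sniper produces no input, so this
# drops to FPS_MIN even when the on-screen action demands more. A better
# heuristic would also feed in some on-screen motion estimate:
def target_fps_with_motion(seconds_since_last_input, screen_motion):
    # screen_motion in [0, 1], e.g. derived from motion-vector magnitudes
    return max(target_fps(seconds_since_last_input),
               FPS_MIN + (FPS_MAX - FPS_MIN) * screen_motion)
```

Even the motion-aware version still can't tell a talking-head cutscene from a static wall, which is the deeper problem: the driver simply doesn't know what's going on in the game.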
Well, you can set minimum and maximum Chill framerates for that. Unfortunately, when I set Chill to 60-60 and tried to play CS:GO, I got a very unstable and even stuttery framerate. It was mostly in the 50s, but sometimes dropped into the 40s. It clearly didn't work as advertised.