
Is 4 cores and 8 threads enough for today's computing/gaming in 2023?

Nothing beats point and click at 4K high fps.....
 
I guess I'm lucky that slow-paced single player is the only thing I'm interested in. :D (with a few exceptions)
Hella agree. I'm just so paranoid about goofing something up in a multiplayer game (unless it's dm/tdm) that I'd rather play single player.

Nothing beats point and click at 4K high fps.....
Heh, why not. Heroes III is hella dope to play with HD mod and a 32" 4K monitor. :D
 
Hella agree. I'm just so paranoid about goofing something up in a multiplayer game (unless it's dm/tdm) that I'd rather play single player.
As for me, the atmosphere and a good story are all I care about. I feel no desire to mingle with angry teenagers in a bland and emotionless jumper-shooter. :)
 
Just retired my old 3770K and bought a 5800X3D. I am amazed at how fast it is, and how slow my old system was. I play old and new games, and I can feel it in all of them.
 
Just retired my old 3770K and bought a 5800X3D. I am amazed at how fast it is, and how slow my old system was. I play old and new games, and I can feel it in all of them.
This. @Sithaer, I was talking about this. You are comfortable with an i3-12100F. You will be even more comfortable with a faster CPU. Of course I understand you might be the opposite of rich, but getting an i5-13600K or something along these lines when budget-appropriate is a completely sane thing to do.
 
Question is, do you actually notice that 5 fps difference without an fps overlay? If it's not reaching that 'magical' 60, is it really so far off that it's suddenly not enough?

Personally I don't, and I don't play with an overlay constantly enabled because it's a distraction to me and bothers my immersion, so after I'm done tweaking my settings I turn off the statistics overlay and let it run in the background while I focus on the game instead.
I've yet to run into any game where my 12100F couldn't offer me an enjoyable experience, and most of the time it's still my 3060 Ti that's the limiting factor when I crank up the settings in newer games.

Ofc part of that is entirely subjective, since anything over a 50 fps average is totally fine with me, especially in single-player games.
I definitely feel lag when my fps averages 45 on my 9400F though. When it reaches 55 fps, I don't feel like something is wrong.

It's true an fps meter can be distracting, but it's useful for knowing the limits of your current CPU. With that information it's safe to recommend at least a 12400F or 5600 to others who want to buy a new build. But for those of us who already bought something like a 12100F, I think it's fair enough for now.

Yes, I think a 50 fps average minimum is fine for me too for now. But for a new build I still think it's worth saving up a bit more for a 12400F.
 
I definitely feel lag when my fps averages 45 on my 9400F though. When it reaches 55 fps, I don't feel like something is wrong.

It's true an fps meter can be distracting, but it's useful for knowing the limits of your current CPU. With that information it's safe to recommend at least a 12400F or 5600 to others who want to buy a new build. But for those of us who already bought something like a 12100F, I think it's fair enough for now.

Yes, I think a 50 fps average minimum is fine for me too for now. But for a new build I still think it's worth saving up a bit more for a 12400F.
For a new build, yeah, I would agree a 12400/13400/5600 is a wise idea, but if someone is on a budget the 12100/13100 is a good deal imo.
I bought this CPU in February 2022, not long after its launch date, and so far I've had no issues with it; it still gets the job done in everything I play.

This. @Sithaer, I was talking about this. You are comfortable with an i3-12100F. You will be even more comfortable with a faster CPU. Of course I understand you might be the opposite of rich, but getting an i5-13600K or something along these lines when budget-appropriate is a completely sane thing to do.

You cannot compare an old-generation 4-core/8-thread CPU to a 12th/13th gen i3, since the latter is much faster and beats both the i5 10400 and 11400 in gaming (you can check TPU's reviews for that).
I definitely don't feel that my system is slow for general everyday use, nor for the gaming I do with it.
Again, my monitor is only a 75 Hz one and I have a global 73 fps cap enabled, and this CPU is still capable of pushing that, or at least close enough, in the majority of games, especially the ones I'm playing, so there is zero reason for me to upgrade at this point in time.

Even if I do upgrade, I'm not going higher than a 13400, possibly a 14400, but I will have to see the reviews first, so maybe sometime in 2024 I will consider it.
 
I remember people disabling HT back in the day on their 2600Ks because "it generates too much heat". Why the hell did people go for the 2600K instead of the 2500K then, only to disable the i7's main feature...
People did that because they thought the i7 was also a better die/bin. I honestly never saw complaints about heat; that happened with Ivy Bridge, not Sandy. Sandy was frosty as fck!

I think what was more of an issue with HT back then was its limited purpose. I mean, SB on release was so blazing fast it would destroy all content then and for the next 7 years ahead of it. You simply didn't need HT, and nothing ever used more than 1-2 cores either, and if there was core load on 3 and 4, it was low. Disabling HT did provide OC / temp headroom.

This is where mighty CPUs come to the rescue. They limit the stuttering, they get rid of freezing, and they also hit "cyber sport" 100+ FPS more easily than their lesser-cored siblings.

As a high-FPS addict I totally disapprove of skimping on the CPU. The snailness of your GPU hurts the experience less than that of your CPU.

Cyberpunk, though, despite being neither particularly fast-paced nor multiplayer, is way nicer at triple-digit framerates than at 60. 144 FPS is a sweet spot, as you will most likely fail to distinguish 240 from 180, but 144 vs 100 is huge. I left it vsynced at 80 FPS and adjusted settings (including going 1080p) so that it never drops below the low 70s. Significantly sweeter than 50 to 60 FPS at 4K.
Yeah, high FPS also means the time a frame is displayed is shorter, so you can resolve more detail in motion. The more detailed games get, the more they'll fancy higher framerates, simply to resolve it all properly without becoming a mess. But there is a certain charm in 50-ish FPS too, for reasons similar to avoiding the soap opera effect. I find that in lower-detail games, a very high framerate isn't really adding to the game's immersion, quite the contrary. Additionally, many single-player console games (third-person action falls under that category for me), stuff like Sekiro, God of War, etc., are still annoying at 30 FPS IMHO, but anywhere from 40 up is palatable and a stable 50 is pure joy. It's really ye olde PAL standard; perhaps new generations of gamers don't even have that reference.

Totally agree on the Cyberpunk experience. Played it on a 1080 and recently on a 7900 XT for 2.5x the frames... world of difference. What's funny is that The Witcher 3 is not the same; it's absolutely fine to me at 50~70 FPS, which is what I played it at. Most shooters / first person though... give me 100+ every time.
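Quick aside on the frame-time math behind the "144 vs 100 is huge, 240 vs 180 isn't" point above: frame time is simply 1000 / fps, so the per-frame gain shrinks rapidly as the framerate climbs. A throwaway Python sketch:

for fps in (60, 100, 144, 180, 240):
    print(f"{fps} FPS = {1000 / fps:.1f} ms per frame")
# 60 FPS = 16.7 ms, 100 = 10.0 ms, 144 = 6.9 ms, 180 = 5.6 ms, 240 = 4.2 ms
# Going 100 -> 144 shaves ~3.1 ms off every frame; 180 -> 240 only ~1.4 ms.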

For a new build, yeah, I would agree a 12400/13400/5600 is a wise idea, but if someone is on a budget the 12100/13100 is a good deal imo.
I bought this CPU in February 2022, not long after its launch date, and so far I've had no issues with it; it still gets the job done in everything I play.



You cannot compare an old-generation 4-core/8-thread CPU to a 12th/13th gen i3, since the latter is much faster and beats both the i5 10400 and 11400 in gaming (you can check TPU's reviews for that).
I definitely don't feel that my system is slow for general everyday use, nor for the gaming I do with it.
Again, my monitor is only a 75 Hz one and I have a global 73 fps cap enabled, and this CPU is still capable of pushing that, or at least close enough, in the majority of games, especially the ones I'm playing, so there is zero reason for me to upgrade at this point in time.

Even if I do upgrade, I'm not going higher than a 13400, possibly a 14400, but I will have to see the reviews first, so maybe sometime in 2024 I will consider it.
It's a fact that most CPUs are pretty overpowered for what they generally do, especially gaming. Games don't generally choke on raw CPU grunt; they choke on peak loads somewhere in the pipeline. Better cache alleviated a big one there: the data can be transported faster. The core power was always there; that's why those X3Ds fly in a lot of games. Better multithreading alleviated another. But even today, not all games make use of all available core power, not even with new APIs. They just can't feed it readily enough, game logic is sequential, etc.
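To put a rough number on that "game logic is sequential" point, Amdahl's law says the speedup from n cores is 1 / ((1 - p) + p / n), where p is the fraction of each frame's work that can actually run in parallel. A toy Python sketch (the p = 0.6 figure is purely illustrative, not measured from any real game):

def speedup(p, cores):
    # Amdahl's law: the serial fraction (1 - p) caps the benefit of extra cores.
    return 1 / ((1 - p) + p / cores)

for cores in (2, 4, 8, 16):
    print(f"{cores} cores: {speedup(0.6, cores):.2f}x")
# 2 cores: 1.43x, 4 cores: 1.82x, 8 cores: 2.11x, 16 cores: 2.29x
# Even 16 cores barely beat 8 when 40% of the frame's work can't be parallelised.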
 
You cannot compare an old-generation 4-core/8-thread CPU to a 12th/13th gen i3, since the latter is much faster and beats both the i5 10400 and 11400 in gaming (you can check TPU's reviews for that).
Not the point.
I definitely don't feel that my system is slow for general everyday use, nor for the gaming I do with it.
I didn't argue with this.
Again, my monitor is only a 75 Hz one and I have a global 73 fps cap enabled, and this CPU is still capable
Yet a faster CPU, like a high-tier 13th gen i5, will provide a smoother frametime graph. Which is very pleasant.

I was in similar shoes a couple of years ago when I had an i3-10100F. I was forced to sell my GTX 1070 (luckily, for a stupid amount of money, about $500), and since that was my only GPU, I bought an engineering sample CPU with 8 cores, a newer architecture, and an iGPU. I'd be a complete phony if I told you the i3 does not suck at gaming compared to my current CPU. It really sucks.
 
Yet a faster CPU, like a high-tier 13th gen i5, will provide a smoother frametime graph. Which is very pleasant.
There are only a few months until we have 14th gen; my motherboard supports 12th, 13th and 14th gen CPUs (ASRock B760).

That's rather rare, one Intel motherboard supporting 3 generations, although you could be lucky enough to have an AMD motherboard that supports all the way up to Ryzen 5000.
 
There are only a few months until we have 14th gen; my motherboard supports 12th, 13th and 14th gen CPUs (ASRock B760).

That's rather rare, one Intel motherboard supporting 3 generations, although you could be lucky enough to have an AMD motherboard that supports all the way up to Ryzen 5000.

There's nothing wrong with keeping the same board and CPU for 3 or 5 years; if you get an i7 14700K you don't need to upgrade anytime soon....
 
rather rare, one Intel motherboard supporting 3 generations
The first one this century actually.

Socket 478, 2 gens.
LGA775, 2 gens of NetBurst + 2 gens of Core.
LGA1156, 2 gens.
LGA1155, 2 gens.
LGA1150, 1 gen + refresh.
LGA1151, 2 + 2 gens, 4 in total.
LGA1200, 2 gens.

As you can see, none of them equals 3.

Probably HEDT and more extreme platforms had 3-gen support at some time but I doubt it.
only a few months until we have 14th gen
I know and this was an example CPU, not a "you should buy exactly this model."

Anyway, this "AMD has more gens per socket" argument is balls, because normally you buy a PC every 6 years or even less frequently, with some folks upgrading once a decade. You don't care what your motherboard supports. You know for sure that even the best CPU you can slot into it will be obsolete and irrelevant by then, so you buy an entirely new platform. The number of those who REALLY need upgrades more often is negligible. I bought my LGA1200 platform in late 2020 or early 2021 and I'm planning on using it till the late 2020s, probably 2028. Then, probably, i3s (or whatever they roll out) will outperform the i9-11900 in every task.

What's really driving me nuts is the total absence of desktop Intel APUs. There is space for a serious iGPU, something like a halved RTX 4060 die with lowered clocks. With DDR5, especially in quad channel, it could've been sick. Alas...
 
The first one this century actually.

Socket 478, 2 gens.
LGA775, 2 gens of NetBurst + 2 gens of Core.
LGA1156, 2 gens.
LGA1155, 2 gens.
LGA1150, 1 gen + refresh.
LGA1151, 2 + 2 gens, 4 in total.
LGA1200, 2 gens.

As you can see, none of them equals 3.

Probably HEDT and more extreme platforms had 3-gen support at some time but I doubt it.

I know and this was an example CPU, not a "you should buy exactly this model."

Anyway, this "AMD has more gens per socket" argument is balls, because normally you buy a PC every 6 years or even less frequently, with some folks upgrading once a decade. You don't care what your motherboard supports. You know for sure that even the best CPU you can slot into it will be obsolete and irrelevant by then, so you buy an entirely new platform. The number of those who REALLY need upgrades more often is negligible. I bought my LGA1200 platform in late 2020 or early 2021 and I'm planning on using it till the late 2020s, probably 2028. Then, probably, i3s (or whatever they roll out) will outperform the i9-11900 in every task.

What's really driving me nuts is the total absence of desktop Intel APUs. There is space for a serious iGPU, something like a halved RTX 4060 die with lowered clocks. With DDR5, especially in quad channel, it could've been sick. Alas...

LGA 775 was actually longer lived than that, it had Prescott, Presler, Cedar Mill, Conroe/Kentsfield and Wolfdale/Yorkfield, spanning DDR1, DDR2 and DDR3, as well as AGP, PCIe 1.0 and PCIe 2.0. It was definitely the longest lived socket around, maybe even longer lived than the Socket 7 which was multi-vendor and went from P5 Pentium to Sharptooth K6-IIIs.

It was LGA 1156 that had only one gen (that's Lynnfield and Clarkdale, the latter just being a dual-core processor/adaptation of mobile Arrandale for desktop); 1366 had Nehalem/Bloomfield and Westmere/Gulftown, so in the HEDT range it did have 2 gens.
 
The first one this century actually.

Socket 478, 2 gens.
LGA775, 2 gens of NetBurst + 2 gens of Core.
LGA1156, 2 gens.
LGA1155, 2 gens.
LGA1150, 1 gen + refresh.
LGA1151, 2 + 2 gens, 4 in total.
LGA1200, 2 gens.

As you can see, none of them equals 3.
Why, Ryzen does it.

So nice to have 3 generations of options when you want more power; it's just sad you're limited to Z motherboards if you have a K CPU and want to do real OC, not just BCLK.
 
Why, Ryzen does it.

So nice to have 3 generations of options when you want more power; it's just sad you're limited to Z motherboards if you have a K CPU and want to do real OC, not just BCLK.

Likely to avoid the crapshoot socket AM4 was. It took its entire service life for AMD to iron out all of the bugs, perfect memory training routines, etc.
 
Z motherboards if you have a K CPU and want to do real OC

There isn't much "real" OC'ing left with Intel these days; you just set all P-cores to the highest boost speed.
 
You can have one without E-cores, like the 12100F :)
 
Not the point.

I didn't argue with this.

Yet a faster CPU, like a high-tier 13th gen i5, will provide a smoother frametime graph. Which is very pleasant.

I was in similar shoes a couple of years ago when I had an i3-10100F. I was forced to sell my GTX 1070 (luckily, for a stupid amount of money, about $500), and since that was my only GPU, I bought an engineering sample CPU with 8 cores, a newer architecture, and an iGPU. I'd be a complete phony if I told you the i3 does not suck at gaming compared to my current CPU. It really sucks.
Well, that is not the experience I have, so there is that, plus you can check any reputable review that had the same overall experience with the 12100/13100.
That, and the 10100 is quite a bit weaker than a 12th/13th gen i3.
I could also compare your CPU to a 12900K/13900K and call yours bad, but what good does that do for your actual use-case experience anyway? (We could argue this all day; what sucks to you might be perfectly enough for the average budget user.)
 
People did that because they thought the i7 was also a better die/bin. I honestly never saw complaints about heat; that happened with Ivy Bridge, not Sandy. Sandy was frosty as fck!

I think what was more of an issue with HT back then was its limited purpose. I mean, SB on release was so blazing fast it would destroy all content then and for the next 7 years ahead of it. You simply didn't need HT, and nothing ever used more than 1-2 cores either, and if there was core load on 3 and 4, it was low. Disabling HT did provide OC / temp headroom.
I think there was also an issue with thread allocation. That is, if a game that used, let's say, 2 threads got both of them assigned to the same core, you could lose performance versus having those threads allocated to separate cores.
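For what it's worth, you can still see (and control) that on Linux today: logical CPUs that share a physical core are listed as SMT siblings in sysfs, and a process or thread can be pinned away from them with sched_setaffinity. A minimal Python sketch (Linux-only; the CPU IDs at the end are placeholders, pick non-sibling ones from the topology files on your own box):

import os

# Which logical CPUs share a physical core with CPU 0 (its SMT/HT siblings)?
with open("/sys/devices/system/cpu/cpu0/topology/thread_siblings_list") as f:
    print("SMT siblings of CPU 0:", f.read().strip())   # e.g. "0,4" or "0-1"

# Pin this process to two logical CPUs that are NOT siblings, so two busy
# threads can't end up fighting over one physical core's execution resources.
# {0, 2} is only a placeholder set; check the topology files above first.
os.sched_setaffinity(0, {0, 2})
print("Allowed CPUs now:", sorted(os.sched_getaffinity(0)))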
 
LGA 775 was actually longer lived than that, it had Prescott, Presler, Cedar Mill, Conroe/Kentsfield and Wolfdale/Yorkfield, spanning DDR1, DDR2 and DDR3, as well as AGP, PCIe 1.0 and PCIe 2.0. It was definitely the longest lived socket around, maybe even longer lived than the Socket 7 which was multi-vendor and went from P5 Pentium to Sharptooth K6-IIIs.

It was LGA 1156 that had only one gen (it's Lynnfield and Clarkdale, the latter just being a dual-core processor/adaptation of mobile Arrandale for desktop), 1366 had Nehalem/Bloomfield and Westmere/Gulftown, so at the HEDT range it did have 2 gens.

Didn't 478 have 3 gens as well?

Willamette, Northwood and Prescott.

Does the P4EE Gallatin slip in there also?
 
Prescott.
Crap, forgot about it. Yeah, then 3 and a half.
Clarkdale, the latter just being a dual-core processor/adaptation of mobile Arrandale for desktop
It is different. Lynnfield is 45 nm and closer to Wolfdale than to Clarkdale, which, like Arrandale, is 32 nm.
Yeah, I forgot about this arch.
I could also compare your CPU to a 12900K/13900K and call yours bad
And you would be correct. My CPU is bad compared to an i9-12900.
 
I remember the later Super Socket 7 boards supported Intel, AMD and Cyrix (the 6x86, I believe). Intel's Socket 7 CPUs were so expensive my friends and I had no choice but Cyrix and AMD. There was nothing as fun as setting a bunch of DIP switches to get a Super Socket 7 motherboard to work with a given CPU, and having to get an entire EEPROM shipped to you if the BIOS needed updates.
 
Didn't 478 have 3 gens as well?

Willamette, Northwood and Prescott.

Does the P4EE Gallatin slip in there also?

Yup, Gallatin is also on 775 and we both forgot about Smithfield
 
Yup, Gallatin is also on 775 and we both forgot about Smithfield

Yeah, believe it or not, I've got a P4 EE here in its original box (775), and I used to run a couple of Smithfields, Pentium Ds if I'm not mistaken :)
 