
love this article about CPU threads and gaming

I was a great fan of the i7-5775C. I used one for the past 5 years before upgrading in May. But now I honestly believe it is somewhat overrated. I had mine with a moderate overclock of 4.0 GHz, which would fit in a ~70 W power budget. Now, looking back, I was severely CPU limited in many games. I felt this especially when trying to play on a 155 Hz monitor.

Examples:
1. Shadow of the Tomb Raider: in extreme cases I had 50-60 FPS with the 5775 (with my current GPU, an RTX 3070). With my current 10850K, in the same situation I get 120 FPS, with the GPU reaching its limits.
2. Doom Eternal: in Super Gore Nest I had frame rates of 80-90 FPS with almost max CPU load. Now I reach the limit of the monitor.
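
Some quick frame-time budget math (my own numbers, just to illustrate why high refresh is so much harder on the CPU):

```python
# Time the CPU has to prepare each frame at a given FPS target.
for fps in (60, 90, 120, 155):
    print(f"{fps:>3} FPS -> {1000 / fps:.1f} ms per frame")
# 60 FPS leaves ~16.7 ms of CPU time per frame; 155 FPS leaves only ~6.5 ms.
```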

So, in spite of the 128 MB of L4 cache, which by the way had ~50 GB/s bandwidth (not so great by modern standards), there were many instances where I felt limited by the performance of the CPU. No amount of cache can help in these situations.

I've read the Anandtech article and I don't really agree with the conclusion. They did not show instances where the 4C/8T of the 5775C are at the limit, with the game selection and testing methodology they used.

I mean, isn't Shadow of the Tomb Raider like *the* demanding CPU benchmark amongst modern games? If you play games that just need more threads or simply scale like crazy on core freq, then obviously no amount of cache will save you from 1% lows. If I push my 4790K under water and pair it with fast RAM I can get by in a lot of games, but I'm under no illusion how it compares to a 6 or 7-year newer CPU.

Whether the 5775C is a substitute for a 10900K or 5900X is a silly question and misses the point. Of course it won't be, who on earth would upgrade their CPU then? The point is that the 5775C demonstrates that cache is the low-hanging fruit with modern CPUs, and there's a lot to gain there for relatively little cost compared to costly redesigns of other parts of the core.

Don't discount 50 GB/s. You need DDR4-3466-3533 to break the 50 GB/s mark on copy performance, and even then it probably needs 16-19-19 or lower, and the recent non-APU Ryzens almost always suffer on bandwidth to some degree. And then there's the question of latency; nothing after Comet Lake has been impressive.
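
Napkin math on where that 50 GB/s line sits (dual-channel DDR4, 64-bit per channel; the ~90% copy efficiency is just my assumption, real numbers depend on the IMC and timings):

```python
# Peak theoretical bandwidth for dual-channel DDR4, plus a rough copy-test estimate.
def peak_bw_gbs(mt_per_s, channels=2, bus_bytes=8):
    return mt_per_s * bus_bytes * channels / 1000  # GB/s

for speed in (3200, 3466, 3600):
    peak = peak_bw_gbs(speed)
    print(f"DDR4-{speed}: ~{peak:.1f} GB/s peak, ~{peak * 0.9:.0f} GB/s-ish copy")
# DDR4-3466 peaks around 55 GB/s, so ~50 GB/s in a copy test is roughly where it lands.
```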

L3 saved Ryzen 3000's gaming perf from the gutter. Part of the reason why Rocket Lake sometimes is in the gutter is down to the lack of L3 (amongst other reasons, overloaded ringbus, idiotic IMC design, etc.). So it's easy to see why AMD's about to milk Zen 3 for another generation.
 
pours an ice-cold bucket of reality onto all the fanboys who have been screaming "more cores equal better gaming" (you know who you are) on these forums for the past several years



Did they copy information from Hardware Unboxed? :D
 
I had my 5775 paired with a GTX 1080 for 4 years and I was mostly playing at 1440p/60Hz, so it was a good combo. But I wanted to try high refresh rate gaming, so I bought a 1440p/155Hz monitor, and as you probably know, one upgrade leads to the next, until there is nothing more to upgrade. :D

I am still using the 5775 in a secondary, low power system with the iGPU. Still a great CPU!

For my current needs, it was not enough anymore. If you want high refresh gaming, there is no way around using a new(ish) CPU.
 
I haven't read every comment but I think some are missing the point of this article.

This isn't about how many cores someone needs; it's more about why Intel 10th gen scales from 6 to 8 to 10 cores the way it does in some CPU-demanding games when the core/ring frequency is matched and paired with a high-end GPU... not about how viable a 4-8 thread CPU is with a slow GPU gaming at 60 Hz.

My 5800X is significantly smoother in Warzone and BFV when paired with a 2080 Ti than my 9900K was at 5 GHz, running low-latency 4000 memory on both, regardless of the fact that they are both 8-core CPUs. I would see a similar benefit with a 10900K, and that has nothing to do with the core count.

Technically that's an upper mid-range GPU now... I'm sure with my 3080 Ti the difference would be even more pronounced.

Does anyone need more than a 9900K for high framerates? Nope. But are there CPUs that give a much better gaming experience? Yep.
 
I went from a 7700K (4/8) to a 2700X (8/16) to a 3900X (12/24), and trust me folks, it makes a hell of a difference in VR when playing proper games.
 
Well yeah, I've got a 500 core CPU and therefore get 500 fps in everything. Stands to reason, dunnit? ;)
 
You can still play triple-A games on a 4-core/4-thread i5 today.


unless you want high refresh rate gaming, a 4c/8t CPU would do plenty fine

Yes, you can play Minesweeper on it too. And a metric ton of indie games. And shooters. But that doesn't make an i5 quad-core from the Ivy/Haswell days a sensible CPU to game on. Will it run? Sure. But it's also...
- very prone to stutter. I could NOT - I repeat - simply NOT - run World of Warcraft stutter-free on an overclocked i5 3570K. Similarly, Total War games stuttered. FPS would take a nosedive when background tasks started running. And that was pre-patch WoW Legion, so with the older, lighter engine. We're talking 1080p here, but a higher res would definitely not help anything as it is even more data through the pipeline.

If you want an eye opener, I've produced these screenshots before... Total War pre-upgrade and post-upgrade. The performance jump is immense and all the dips are gone. It's night and day - on the exact same GPU.

- runs on slower DDR3 most likely, and if you're opting for 16GB, that's money down the drain. Upgrade paths are very bad for your wallet, never mind the moment something fails, like your mobo.

As for 4c8t... it runs into similar issues because if your 4 physical cores are pegged, HT won't help you all that much. And they're likely to be, or close to it.

So, sure you CAN run games on quads and on 4c/8t, but it's far from a smooth experience, and if you think it is... you've not played a whole lot on it, or you're avoiding the stuff that will tank it completely.

TL;DR - The bottom line with system upgrades: timing matters, and falling way behind or going way ahead of the mainstream norm is going to hurt your wallet and not necessarily increase your fun factor. I've got two decades' worth of experience in that direction, and every time I had to draw this same conclusion.
 
I personally feel 4c/8t are still viable gaming CPUs even for modern games as long as you are not chasing high FPS. For a new build with a decent budget it's a different story.

I also see posters (even on this forum) stating 6c/12t are obsolete for gaming and 8c CPUs are the bare minimum for gaming, even after CPUs like the AMD 5600X launched, so it's still a modern argument (mostly to justify people's purchase of high-core-count CPUs).
OK, so quads with HT I think are fine and will do OK if the user's expectations are tamed. I still build quad-based PCs for anyone with very limited funds, but I do tell them to expect some constraints given a bit more time.

As for people saying 6/12t are obsolete for gaming, I have not seen anyone say that in TPU land, proof please.

A few can be found pushing 8 cores as the go-to, but who's actually saying 6 cores are not enough (without going back to the Phenom II era)?
 
Before my current rig I used an i3 4160 for almost 3 years; it was fine since all I did was play single-core-heavy games with dated engines/no optimization (crappy MMOs and such).
Then I wanted to play some new games like Far Cry 5 and Just Cause 3 and they completely murdered that poor 2c/4t i3, to the point of even bottlenecking my GTX 950. :laugh:

When I built my current rig in May 2018 I was kinda torn between going for first-gen Ryzen like the 1600/X or staying with Intel and picking the i5 8400, since they were the same price here at the time.
Ended up going for the 1600X 'cause I remembered what happened with my i3 and I was like, well, I'd rather lose some single-core perf but have the extra threads, just in case the same thing happens in a few years.

To be honest I'm kinda glad I did. I still have no issues with this CPU with whatever game I play, and if I ever need faster cores I can still drop a second-hand 3600 into my current mobo just fine.
Yeah, I know the 1600X would bottleneck anything over an RX 5700/XT/3060, but since I'm not into that kind of price range it's a non-issue for me (also only have to drive a 75 Hz monitor anyway).
 
Kinda stupid test.

If they target a high-FPS panel, where you actually need CPU horsepower, that means 144 FPS as a minimum, with game settings turned down to medium if that's what it takes to hold a stable frame rate at that mark... Instead they do a mashup and a single resulting graph? The heck? It's like counting apples and mushrooms in one basket.

On my older rig I did it like them: use the highest settings, let the game engine stumble into the GPU limit, call it a day at 60 FPS, and "you don't need anything more" for sure...

They can't really evaluate the difference if they run into the GPU limit and do not target high FPS. They have Rainbow Six in the bench... that one clearly illustrates the math, that something actually does scale up.
 
As for people saying 6/12t are obsolete for gaming, I have not seen anyone say that in TPU land, proof please.
I have seen that being said a couple of times here, but mostly elsewhere. I don't recall who said it though... I also disagree with it. Sure, one day it won't be enough, but not today :toast:
 
I have seen that being said a couple of times here, but mostly elsewhere. I don't recall who said it though... I also disagree with it. Sure, one day it won't be enough, but not today :toast:

I think it's related to the consoles having 8 customized/cut-down Zen 2 cores with 16 threads... I partly agree that anyone who can fit an 8+ core 10th gen/Ryzen 3000 CPU or newer into their budget should, but my guess is 6-core/12-thread parts will be fine for the next 2-3 years unless you're chasing the maximum performance. At the same time, who knows what a 6-core/12-thread CPU will look like in 3 years; they could have 40-50% higher IPC vs Zen 3.

At the same time, the majority of people with limited budgets who fall in the sub-200 USD CPU range likely can't afford a high-end RDNA2 or Ampere GPU anyway.
 
People need to stop justifying their choices based on the "randos on the internet forum" excuse. If you wanna go eight-core, go eight-core. If you wanna stay on four cores, so be it. Just don't tell me how great your four-core is, cuz it isn't.
 
I think it's related to the consoles having 8 customized/cut-down Zen 2 cores with 16 threads... I partly agree that anyone who can fit an 8+ core 10th gen/Ryzen 3000 CPU or newer into their budget should, but my guess is 6-core/12-thread parts will be fine for the next 2-3 years unless you're chasing the maximum performance.
- PS5 has 8c/16t Zen2 at max 3.5GHz.
- Xbox Series X has 8c/16t Zen2 at 3.66GHz (or 8c/8t at 3.8GHz).
In the context of the article that started this thread, both only have 8 MB of L3 cache.

Basically, to get the same performance from 6 cores, those would need to run at around 4.6 GHz (quick math at the end of this post). This is a bit more than what we currently have.
On the other hand, consoles have Zen 2 and a small L3 cache, while we are at Zen 3 now on desktop, with additional IPC improvements and 16/32 MB of L3 cache.
Consoles also have dedicated cores for OS/background stuff, which the PC does not.

When it comes to consoles driving the needs, 6-core should be just fine for this console generation.
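
Rough back-of-the-envelope for that 4.6 GHz figure (my own arithmetic, assuming perfectly linear scaling across cores, identical IPC, and ignoring the cache and OS-reservation caveats above):

```python
# Naive core-GHz equivalence: 8 console cores at ~3.5 GHz vs 6 desktop cores.
console_cores, console_clock = 8, 3.5   # PS5-ish figures from above
desktop_cores = 6

equivalent_clock = console_cores * console_clock / desktop_cores
print(f"{desktop_cores} cores would need ~{equivalent_clock:.2f} GHz")  # ~4.67 GHz all-core
```

Obviously games don't scale perfectly with cores, so treat that as an upper bound on what the 6-core would need.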
 

I was mostly just pointing out why some are jumping to the conclusion that 6-core/12-thread CPUs are dead... I even pointed out that future variants with much higher IPC are likely to be more than viable, sorta like how a 2700X will likely never be better than a 5600X.
 
My kids play just fine on a 3770K.

5600X is an absolute beast, I love mine. 5900X is good too.. but not as fun.

I roll my eyes when people say 6 cores are dead.. there is definitely more to computing than Cinebench..
Agreed. I have a 3770K at 4.4 GHz and it's fine for my purpose (single-player games at 75 Hz with a 2060).

Seems like 90% of the AAA titles I've played in the last 24 months have either had moderate CPU requirements or have been very well optimized.
 
I think it's related to the consoles having 8 customized/cut-down Zen 2 cores with 16 threads... I partly agree that anyone who can fit an 8+ core 10th gen/Ryzen 3000 CPU or newer into their budget should, but my guess is 6-core/12-thread parts will be fine for the next 2-3 years unless you're chasing the maximum performance. At the same time, who knows what a 6-core/12-thread CPU will look like in 3 years; they could have 40-50% higher IPC vs Zen 3.

At the same time, the majority of people with limited budgets who fall in the sub-200 USD CPU range likely can't afford a high-end RDNA2 or Ampere GPU anyway.

That, but it's just a new argument. I can prob regurgitate the posts where people bought a 9th-gen Skylake 8c/8t (;)) and started signalling it was oh so much better for gaming than the 8700K.

Boy were they wrong. HT/SMT is utilized now ;) Something about mainstream adoption hmmmm :D

In other words, the epeen crowd will always find reasons to tell all the peasants that's what they are. All I hear/see is a shriveled penis hanging slightly to the left there with a desperate need to stand taller.
 
That, but it's just a new argument. I can prob regurgitate the posts where people bought a 9th-gen Skylake 8c/8t (;)) and started signalling it was oh so much better for gaming than the 8700K.

Idk, I always hated the 9700K and 9600K from the minute they were announced, and even though I enjoyed my 9900K, it was a dumb CPU for 499 USD at launch; but that was mostly because I felt Intel was shitting on its consumer base more so than them being bad products... The only thing that annoyed me during the 9th gen/Ryzen+ era was Intel fanboys harping on how much faster they were at gaming while rocking 1050-1060s, meaning a potato CPU like an R5 1600 would perform the same.

The 8700K may be the last Intel Skylake product that ages well and didn't start life at 500+ USD... What is it now, nearly 4.5 years old? And it still rocks. I do feel bad for the 7700K people: 10 months later a substantially better CPU came out with 50% more cores for the same price, all 'cause Intel was feeling the heat from AMD.
 
I honestly have no use for 12 cores, I was perfectly OK with 6.. but the little guy won't do 5 GHz+ single-core speeds.
Blasphemy! There are never enough cores for BOINC, never!

lmao, are all the quad-core diehards going to come out of the woodwork now?
The problem is that quads are still fine. This is a good comparison:

You really don't lose performance linearly with core count. The i3 might be only somewhat slower, due to less L3 cache and, sometimes, due to lacking extra threads (which mostly go unused anyway). But notice how well the i3 is utilized: it's not pegged at 100%, it still has some unused headroom. In many games 8 threads aren't utilized and they seem to only need 4. Some games utilize closer to 6-7 threads, but still not a single title utilizes them all. So really, there is no reason to push a perfectly adequate chip into the retirement home. That's just stupid. Only in Tomb Raider is there a significant difference between the hexa and the quad, but even then the framerate was still way above what you need for the game to remain playable. Quads are still fine and will last for some years. Hexas are quite future-proof (well, at least as future-proof as computers can realistically be). Anything more is, so far, simply excessive in games. HWUB was stupid not to test any quad, and that just shows how credible they are, other than for stirring up drama.
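
To put the "not linear" point in numbers, here is a crude toy model (entirely my own sketch, assuming the engine can only keep about 6 threads busy; real games also depend on cache, clocks and background load):

```python
# Toy scaling model: performance tracks the number of cores the game can actually
# load, capped by what the engine can use, so extra cores beyond that add nothing.
def relative_perf(cores, usable_threads=6):
    return min(cores, usable_threads) / usable_threads

for c in (4, 6, 8, 10):
    print(f"{c} cores -> ~{relative_perf(c):.0%} of the 6-core result")
# 4 cores -> ~67%, 6/8/10 cores -> 100% in this crude model.
```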
 
Modern quads are definitely fine with their modern IPC. However, I was more referring to people with old stuff like Skylake. That stuff is slow today, and it's especially slow considering how people game today with all the other crap running: streamers, chat, music, more chat on top of chat, voice servers, etc. Hell, I am always multi-tasking, so I hardly ever strictly just run a simple game. I'd never buy a quad today... lol. But again, whatever floats your boat.
 
I personally feel 4c/8t are still viable gaming CPUs even for modern games as long as you are not chasing high FPS. For a new build with a decent budget it's a different story.

I also see posters (even on this forum) stating 6c/12t are obsolete for gaming and 8c CPUs are the bare minimum for gaming, even after CPUs like the AMD 5600X launched, so it's still a modern argument (mostly to justify people's purchase of high-core-count CPUs).

All depends on what you play and how old the 4-core CPU is. When I jumped from a 3770 to a 3900, Red Dead played much smoother. Sadly I don't have a newer 4-core CPU to test or I would have by now, just to see if that would have made all the difference; thinking it would, tbh.
 
The one cool thing about boosting, at least on the AMD side.. is that they boost to where the little guy pretty much tops out, and then keep going.. they really are pretty savage. I almost bought a 6700 XT 10 minutes ago but chickened out :D

Meow..

ugh.
 