
Love this article about CPU threads and gaming.

Yeah, that's it o_O o_O o_O. Single core is life, single core is the future o_O

Let's all go back to single core and click, wait, click again, wait, click, wait... What, why did it open it 3 times...
Good ol' times...
While single core CPUs can't be considered adequate today, dual cores backed by an SSD (with enough RAM) can still be used to browse the web comfortably.
This is what I'm using daily :laugh:
 
Good ol' times...
While single core CPUs can't be considered adequate today, dual cores backed by an SSD (with enough RAM) can still be used to browse the web comfortably.
This is what I'm using daily :laugh:
Nice try, but go single core :slap:. I can't believe you, you're also running dual channel memory. Shame on you, now set it up so it runs single channel. Single core and single channel memory, that's the proper way to run a PC in 2021 o_O

Or even better, go buy a 5950X and run it single core as I do, and remember to deactivate SMT :nutkick:
 
pours an ice-cold bucket of reality onto all the fanboys who have been screaming "more cores equal better gaming" (you know who you are) on these forums for the past several years


It's a new article on old news. It should be common knowledge by now that a game only benefits from more cores if the game itself is programmed for them. Either way, we're not going to see games optimized for high core count CPUs anytime soon, if at all, since it doesn't make any sense at this point. If you strictly want to redline games and such, you definitely shouldn't buy an AMD Threadripper or Intel X-series CPU.
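To put that in code: here's a minimal C++ sketch (my own illustration, not from the article) of the same entity update written serially and split across threads. Only the second version can benefit from extra cores, and the engine has to be written to do that split.

```cpp
// Minimal sketch: a game's per-frame update only scales with core count
// if the engine explicitly partitions the work. Entity and the update
// math are made up for illustration.
#include <cstddef>
#include <thread>
#include <vector>

struct Entity { float x = 0.0f, vx = 1.0f; };

// Serial path: one core does everything; extra cores sit idle.
void update_serial(std::vector<Entity>& es, float dt) {
    for (auto& e : es) e.x += e.vx * dt;
}

// Threaded path: the same work split into chunks. This is the part a
// game must be programmed to do before more cores help at all.
void update_parallel(std::vector<Entity>& es, float dt, unsigned n_threads) {
    std::vector<std::thread> pool;
    const std::size_t chunk = es.size() / n_threads;
    for (unsigned t = 0; t < n_threads; ++t) {
        const std::size_t lo = t * chunk;
        const std::size_t hi = (t == n_threads - 1) ? es.size() : lo + chunk;
        pool.emplace_back([&es, dt, lo, hi] {
            for (std::size_t i = lo; i < hi; ++i) es[i].x += es[i].vx * dt;
        });
    }
    for (auto& th : pool) th.join();
}
```

And even then it only pays off when the work per frame is big enough to outweigh the thread overhead, which is exactly why many games stop scaling past 6-8 cores.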
 
Seems like HWUB missed the perfect opportunity to test the i7-5775C. Granted, LGA1150 boards are scarce, but the fact that the 5775C keeps up with modern chips while the 4790K has fallen by the wayside speaks volumes about the importance of 3D cache. Broadwell had like zero OC headroom while 4790Ks were hitting 4.6GHz.

It's also the philosophy Intel themselves adopted for the high end quad cores of Tiger Lake / Tiger Lake H35, and the massive Willow Cove L3 seems to be doing well there despite only being quad cores.

Then they shot themselves in the foot with Rocket Lake, but I guess 24MB of L3 on 14nm wouldn't fit the LGA1200 package LOL

afaik Anandtech is the only one seriously testing 5775C lately (and they tend to gimp everything else by running rated memory speed only) but the point stands.
I was just about to link Anandtech but then I finished reading your post. :D

Yeah, they seem to be the only ones who've tested the 5775C as of late. Would certainly be interesting to see how it'd line up with more realistic memory timings.
 
I just ordered a Ryzen 5 5600X this morning. Party time!

The money I saved over the Ryzen 7 5800X will be going towards a 2TB NVMe 4.0x4 SSD.
 
I just ordered a Ryzen 5 5600X this morning. Party time!

The money I saved over the Ryzen 7 5800X will be going towards a 2TB NVMe 4.0x4 SSD.
All that really matters is that it's doing what you want for what you want to pay. Good buy.
 
I also see posters (even on this forum) stating 6c/12t is obsolete for gaming and 8-core CPUs are the bare minimum, even after CPUs like the AMD 5600X launched, so it's still a modern argument (mostly to justify people's purchases of high core count CPUs).
Many "I just bought myself a new toy and that should now be everyone's baseline" types regularly forget not everyone cares about 144-360Hz gaming as the single metric of immersion. I tested a 75Hz 34" 3440x1440 Ultrawide the other day, and I definitely found actually being able to see more + the larger screen size much more immersive than simply throwing more of the same frames at 24-27" 16:9 monitors, especially for single player games. Even older Indie games like The Vanishing of Ethan Carter, there was this "damn..." reaction to being able to take so much in on screen at once that I never got with 60 -> 144Hz. And the higher res you go the more of the load ends up on the GPU, usually ending in 4K CPU comparisons being "flatlined" anyway. The way Techspot only ever test at 1080p may be great for eliminating bottlenecks and "isolating" the CPU, but it also often gives exaggerated % gains for 1440p and higher users, and beyond a certain point money is usually better spent on more GPU than overspending on CPU.
 
At least on my second rig, going from i7 920 (4c/8t @ 3.8GHz) to Xeon X5650 (6c/12t @ 4.2GHz) felt like a nice boost.
 
More cores, better (psychological) gaming performance lol
 
I'll be retiring my i7-7700K with Gigabyte Z270X Extreme Gaming 9 shortly (waiting on delivery from the UK of my new CPU, an i9-9900K, delidded with liquid metal) to see how good the performance uplift will be for gaming.

I only did this slight upgrade due to the TPM 2.0 requirement for the Windows 11 upgrade when it's finally released to the general population, plus my i7-7700K isn't on the official Intel list for the Windows 11 upgrade path... Bugger!!!!
 
I honestly have no use for 12 cores; I was perfectly OK with 6... but the little guy won't do 5GHz+ single core speeds.

Sig rig is 5.0 across all lol

More cores, better (psychological) gaming performance lol

It has been tested on YouTube that a multi-core FX is smoother than an Intel quad in 2020/2021.
 
lmao, are all the quad core diehards going to come out of the woodwork now?
 
Sig rig is 5.0 across all lol
I had maybe 15-20 Intel CPUs over the last 15 years or so, and I came close a few times, but 5GHz was always "just" out of reach for me. Such a tease... left me twisting in the wind a few times. At the time I wasn't willing to spend on a GPU, so I bought my 5900X instead :D But an all-core OC is too much of a bear; I can barely keep 4750 @ 1.45V in check for something like R23... it's just got too much beef to handle. My last run was 240W from this poor 105W CPU :rockout:
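For anyone wondering how a "105 W" part ends up at 240 W: dynamic power scales roughly with V² x f per core, so a voltage plus clock bump compounds quickly across 12 loaded cores. A rough sketch with an assumed stock operating point (the baseline numbers are guesses, not measured values):

```cpp
// Dynamic power scaling sketch: P is roughly proportional to C * V^2 * f.
// The stock voltage/clock point below is an assumption for illustration.
#include <cstdio>

int main() {
    const double v0 = 1.20, f0 = 4.00;  // assumed stock all-core point (V, GHz)
    const double v1 = 1.45, f1 = 4.75;  // the all-core OC mentioned above
    const double scale = (v1 / v0) * (v1 / v0) * (f1 / f0);
    std::printf("relative per-core power: %.2fx\n", scale);  // ~1.73x
    // Multiply that across all 12 cores running flat out in R23 and the
    // package blows straight past its paper TDP.
}
```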
 
This most certainly is NOT news.

The fact that secondary CPU cache has been a major influence on chip performance has been known for decades.

Look at Intel Pentium vs. Intel Celeron in the late Nineties. Back in that era, many of the Celeron processors had no secondary cache and their performance reflected this.

This is possibly the least newsworthy "finding" that TechSpot has published.

Remember that one can pick any random CPU benchmark, and many of them won't expose weaknesses in CPU cache size. That's misleading, because it's not how one typically uses a microprocessor on a day-to-day basis.

Moreover, most videogames aren't even meaningfully multi-threaded. You really need to be ultra-dense to think that the number of CPU cores correlates with videogame performance.

Videogame performance is largely dictated by single thread performance relative to cache size.
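The cache effect is easy to demonstrate yourself. A crude C++ sketch (mine, with arbitrary buffer sizes): walk a buffer that fits in cache versus one that spills to RAM, doing the same total number of reads.

```cpp
// Crude cache demo: equal total work, very different memory behavior.
// Buffer sizes are arbitrary; a sequential walk understates the effect
// (the prefetcher helps), but the gap still shows up.
#include <chrono>
#include <cstdio>
#include <vector>

static long long walk(const std::vector<int>& buf, int passes) {
    long long sum = 0;
    for (int p = 0; p < passes; ++p)
        for (int v : buf) sum += v;
    return sum;
}

static double time_walk(const std::vector<int>& buf, int passes) {
    const auto t0 = std::chrono::steady_clock::now();
    volatile long long s = walk(buf, passes);  // volatile: keep the work
    (void)s;
    const auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double>(t1 - t0).count();
}

int main() {
    std::vector<int> small(64 * 1024, 1);         // ~256 KB, cache-resident
    std::vector<int> large(64 * 1024 * 1024, 1);  // ~256 MB, spills to RAM
    // Same number of element reads either way: small is walked 1024x more.
    std::printf("cache-resident: %.3f s\n", time_walk(small, 1024));
    std::printf("RAM-bound:      %.3f s\n", time_walk(large, 1));
}
```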
 
I have a 4-core CPU.
It's chugging in some big AAA games.
Ima jump to 6 cores / 12 threads soon.
 
I moved from 6 slow cores (2600X) to 6 fast ones (5600X) and could not be happier.

The last time I had a quad core was 2016, and things were pretty choppy back then, especially GTA V hitting 100% usage on the i5 3570K.
 
I had maybe 15-20 Intel CPUs over the last 15 years or so, and I came close a few times, but 5GHz was always "just" out of reach for me. Such a tease... left me twisting in the wind a few times. At the time I wasn't willing to spend on a GPU, so I bought my 5900X instead :D But an all-core OC is too much of a bear; I can barely keep 4750 @ 1.45V in check for something like R23... it's just got too much beef to handle. My last run was 240W from this poor 105W CPU :rockout:
I'm air cooled too. Just look at the CPU and cooler in my sig rig, you'll be shocked; in fact I don't need 2 fans on the CPU cooler either.

I stress tested using Ryzen Blender back then, never froze lol
 
Good ol' times...
While single core CPUs can't be considered adequate today, dual cores backed by an SSD (with enough RAM) can still be used to browse the web comfortably.
This is what I'm using daily :laugh:
Yep, I know where you're coming from. Still use my old FX-8350 for daily browsing too, and I've even got an older i7-860 rig I use as a kitchen computer whilst cooking and browsing! Great investments those were back in the day, still paying dividends today!

I think the article the thread links to should be more specific about which game engines benefit from more threads. Take the infamous Creation Engine from Bethesda: in FO4, for example, HWiNFO consistently shows all 12 threads on my 2600X working throughout the game, no matter where in that world the player is.
 
Seems like HWUB missed the perfect opportunity to test the i7-5775C. Granted, LGA1150 boards are scarce, but the fact that the 5775C keeps up with modern chips while the 4790K has fallen by the wayside speaks volumes about the importance of 3D cache. Broadwell had like zero OC headroom while 4790Ks were hitting 4.6GHz.

It's also the philosophy Intel themselves adopted for the high end quad cores of Tiger Lake / Tiger Lake H35, and the massive Willow Cove L3 seems to be doing well there despite only being quad cores.

Then they shot themselves in the foot with Rocket Lake, but I guess 24MB of L3 on 14nm wouldn't fit the LGA1200 package LOL

afaik Anandtech is the only one seriously testing 5775C lately (and they tend to gimp everything else by running rated memory speed only) but the point stands.
I was a great fan of the i7-5775C. I used one for the past 5 years before upgrading in May. But now I honestly believe it's somewhat overrated. I had mine with a moderate overclock of 4.0 GHz, which fit in a ~70 W power budget. Looking back now, I was severely CPU limited in many games. I felt this especially when trying to play on a 155 Hz monitor.

Examples:
1. Shadow of the Tomb Raider: in extreme cases I had 50-60 FPS with the 5775C (with my current GPU, an RTX 3070). With my current 10850K, in the same situations I get 120 FPS, with the GPU reaching its limits.
2. Doom Eternal: in Super Gore Nest I had frame rates of 80-90 FPS with almost max CPU load. Now I reach the limit of the monitor.

So, in spite of the 128 MB of L4 cache, which by the way had ~50 GB/s of bandwidth (not so great by modern standards), there were many instances where I felt limited by the performance of the CPU. No amount of cache can help in those situations.

I've read the Anandtech article and I don't really agree with the conclusion. With the game selection and testing methodology they used, they did not show the instances where the 4C/8T of the 5775C is at its limit.
 
As a gamer, I went from quad cores for almost a decade (various i5 chips) to an affordable 8-core (the 2700X in my current Linux build), then to my current 6-core (the 5600X in my W10 build), two of the CPUs mentioned in the Techspot article itself. I can honestly say that just skipping a generation gives more performance uplift and allows for actually noticeable improvements even with the same GPU. I was blessed to get everything at MSRP, and even a few sales, back when I built my current rig.
 
pours an ice-cold bucket of reality onto all the fanboys who have been screaming "more cores equal better gaming" (you know who you are) on these forums for the past several years


I've seen the video version, but it's still a great test idea, with surprising results! :)
 
pours an ice-cold bucket of reality onto all the fanboys who have been screaming "more cores equal better gaming" (you know who you are) on these forums for the past several years



If you look at the 5900X review vs the 5600X, it gets about 20 FPS more in several titles. I think that has more to do with the 5900X being able to boost higher on 1-3 cores than the 5600X is capable of... so I suppose that's what it boils down to in the end: the speed of a few cores. I almost wish they would make a 4-core CPU that could do like 6-7 GHz instead of always doing more cores... I bet we would get more FPS in games that way, but who knows.
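That's basically Amdahl's law: if only a fraction p of a frame's work parallelizes, speedup = 1 / ((1 - p) + p / n), so core count hits a wall fast and per-core speed keeps mattering. A quick sketch (p = 0.6 is a made-up figure; the real fraction varies per game):

```cpp
// Amdahl's law: speedup = 1 / ((1 - p) + p / n) for parallel fraction p.
// p here is an assumption for illustration, not a measured game profile.
#include <cstdio>

int main() {
    const double p = 0.6;  // assumed parallel fraction of a frame's work
    const int cores[] = {1, 2, 4, 6, 8, 12, 16};
    for (int n : cores) {
        const double speedup = 1.0 / ((1.0 - p) + p / n);
        std::printf("%2d cores -> %.2fx\n", n, speedup);
    }
    // Caps at 1 / (1 - p) = 2.5x no matter how many cores you add,
    // while a straight clock bump speeds up the whole frame.
}
```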
 
If you look at the 5900X review vs the 5600X, it gets about 20 FPS more in several titles. I think that has more to do with the 5900X being able to boost higher on 1-3 cores than the 5600X is capable of... so I suppose that's what it boils down to in the end: the speed of a few cores. I almost wish they would make a 4-core CPU that could do like 6-7 GHz instead of always doing more cores... I bet we would get more FPS in games that way, but who knows.
Intel Core i1, 9.3 GHz, 1 core with HT.
 
Intel Core i1, 9.3 GHz, 1 core with HT.

lol, it would be interesting to see what that's actually capable of. Probably bad, actually. Maybe a dual core at 8-9 GHz... would be interesting to see what that would look like in game benches. But yeah, I wish we would stop the core wars and stick with 6-8 cores max and higher clock goals...

The fact my 2500K did 5 GHz 24/7 for like 7 years straight... it's pretty dang sad we haven't gained anything in terms of actual clocks in over a decade... I understand clock speed isn't everything, no need to go on that rant... just saying...
 