
Is more than 8 cores still overkill for high-end gaming at 1440p with an RTX 4090?

True, very true. However, if a game cannot utilize more than, say, 4 cores, your CPU's performance is limited to its 4 best cores (plus a little, since other tasks aren't competing for them). If a game can use 128 threads, a Threadripper 3990X is a better choice than a 7950X despite having ~40% lower single-core performance.
Especially for badly coded open-world / multiplayer games, for sure.
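The cores-vs-clocks tradeoff above is essentially Amdahl's law. Here's a minimal sketch; the parallel fractions and per-core speed ratios are illustrative assumptions, not measured figures:

```python
def effective_speedup(parallel_fraction, n_cores, per_core_speed=1.0):
    """Amdahl's law, scaled by relative per-core speed.

    parallel_fraction: share of the frame's work that scales across cores.
    per_core_speed: single-core speed relative to a 1.0 baseline chip.
    """
    serial = 1.0 - parallel_fraction
    return per_core_speed / (serial + parallel_fraction / n_cores)

# A game that only scales to ~4 threads: a few fast cores dominate.
print(effective_speedup(0.75, 4, per_core_speed=1.4))   # 3.2x baseline

# A 128-thread-capable engine: many slower cores beat fewer fast ones.
many_slow = effective_speedup(0.99, 64, per_core_speed=0.6)
few_fast = effective_speedup(0.99, 8, per_core_speed=1.0)
print(many_slow > few_fast)   # True
```

With a highly parallel workload, even a ~40% single-core deficit is swamped by the extra cores; with a 4-thread game, it never is.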

Here's Cyberpunk before and after they fixed the AMD 6-core thread-detection bug. More cores let you weather poorly coded open-world games (my favorite variety) much better.

[Chart: Cyberpunk 2077 CPU benchmarks before and after the AMD 6-core thread-detection fix]


Notice the 5600X losing to much lesser processors. That's really because the game detected the CPU improperly and spawned far more threads than it needed. The 5800X, 5900X, and 5950X all had the same issue, but they were better able to deal with that nonsense.

I can still CPU-bind my 12600K and peg it to 100% in certain areas. GPU usage drops to 80% and it feels like G-Sync just switched off: everything is a bit jittery, with frame pacing noticeably struggling to stay constant. An issue I did not have on the 10850K, go figure.
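The failure mode in that Cyberpunk bug, sizing a worker pool from a misdetected core count, can be sketched like this (the function and numbers are hypothetical, not the game's actual code):

```python
def plan_worker_threads(reported_cores, reserved=2):
    """Size a job-system worker pool from the detected core count,
    keeping a couple of cores free for the main/render threads."""
    return max(1, reported_cores - reserved)

# Correct detection on a 6-core chip: a pool that fits the CPU.
print(plan_worker_threads(6))    # 4 workers

# Bug: 12 logical threads reported as cores -> 10 workers fighting
# over 6 physical cores, which is what wrecks frame pacing.
print(plan_worker_threads(12))   # 10 workers
```

A chip with more physical cores simply absorbs the oversubscription better, which matches the benchmark behavior described above.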
 
For what? Is there some certain performance threshold the CPU needs to meet?

Some CPUs play games better than others. Go look at reviews for the AMD 7950X and Intel 13900K; they're the newest flagship models from both companies. The gaming benchmarks should include other processors, including older generations as well as the 5800X3D. Decide what performance you want at what budget.

Whether the L3 cache is 64MB, 96MB, 128MB or 69MB doesn't really matter as long as the CPU in its entirety provides the performance you are seeking.

There is always a new technology around the corner that will run Program X faster than yesterday's tech. Will the 7800X3D outperform the 5800X3D? It better otherwise AMD engineering didn't do their job.

No one here knows the 7800X3D's future specs, nor its price, nor how it will compete with the Intel products released between now and its debut.

Some people find it fun to speculate about unannounced products but it doesn't give you any data points to make an informed purchase decision NOW.

And after the 7800X3D is released, the 14900 and later 8800X3D will probably beat it.

Just buy what will provide adequate performance for your specific usage case. If it doesn't exist, then keep waiting... Nothing wrong with that, most people don't marry the first non-family member they meet.

And even if you can find a CPU that provides enough performance, just remember that at some point, some evil game developer will write a game that will require even more performance than what your gear can do.
Best single threaded 8 core 16 thread CPU with best IPC for gaming?

How do you think the 7800X3D will compare to a 13900K with e-cores disabled, all P-cores overclocked to 5.8 to 6GHz, and DDR5-7200?

Obviously, tuning a 7800X3D's RAM will have far less impact, as all Zen 4 parts seem limited to DDR5-6000, and overclocking headroom is far less consistent on AMD than on Intel CPUs.

So: stock 7800X3D vs a 5.8GHz 13900K with e-cores off and DDR5-7200. I know we will not know until the 7800X3D hits, but if you had to guess, which will be better? Will one be a lot better, or will they be close?
 
Best single threaded 8 core 16 thread CPU with best IPC for gaming?

(truncated for brevity)

I know we will not know until Ryzen 7800X3D hits, but if you had to guess what will be better and will one be a lot better than the other or will they be close??

Here's the difference between you and me: I don't try to guess.

All I know is that no matter how good the 7800X3D ends up, something else will eventually surpass it. It might take six weeks or six months; I have no idea. But the 7800X3D's reign on the gaming throne will be counted in months, not years. If Moore's Law is dead, it's dead for everyone.

I own a Ryzen 5900X (in the build in my System Specs). Is it the best CPU for gaming? Not anymore. It used to be second only to the 5950X. Today there are quite a few that outperform the 5900X in some benchmarks. Would I get better performance with the 5800X3D (which came out after I purchased my 5900X)? Why yes, I probably would but I don't worry about it.

That build is for gaming at 4K anyhow; I'm completely GPU-bound in recent titles.

Amusingly, I have a secondary gaming build: Ryzen 5600X + GeForce RTX 3060 Ti for gaming at 1440p. It works fine for all but the most demanding titles. So despite the much lower-specced hardware, it does just fine and I have plenty of fun using both systems.

For the 3060 Ti machine, I go to a game's graphics settings, press the Ultra button, then dial most sliders back one notch. I despise effects that imitate optical deficiencies in camera equipment, so I try to disable most of those: motion blur, chromatic aberration, lens flare, lens dirt. I'll even minimize depth of field.

Oh yeah, I don't play with framerate counters running. No RTSS stats overlay. Some games even have in-game fps counters; those are turned off too. I really don't care if I'm getting 113 or 118 fps. My old and tired eyes can't tell the difference.

If the game isn't stuttering, then I can concentrate on gameplay. There's a lot more to game enjoyment than frames per second. I am just as happy to play The Legend of Zelda: Breath of the Wild on my docked Nintendo Switch, outputting 1080p at whatever frame rate it pushes out, as whatever AAA title my 3080 Ti tackles on my 4K@120Hz television.
 
Here's the difference between you and me: I don't try to guess.

(truncated for brevity)


Of course something else will eventually surpass it. But after the 7800X3D, nothing new is coming from either AMD or Intel until at least late 2023, except possible refreshes. So at least a year as the best CPU.
 
Of course something else will eventually surpass it. But after the 7800X3D, nothing new is coming from either AMD or Intel until at least late 2023, except possible refreshes. So at least a year as the best CPU.

What happens if AMD releases a 7900X3D after the 7800X3D? And then maybe a 7950X3D?

Or if Intel decides to put out an all P-core chip with huge L3 cache?

There's no guarantee that won't happen.
 
Are there any games that actually meaningfully benefit from more than 8 cores??
It really depends on the game. A high-performance 8-core CPU should last you a while. A Ryzen 5900X would be a rock-solid option, but right now the Ryzen 5800X3D is the gaming king. Get one of those and you will not need to upgrade your PC anytime soon (read: 2.5 to 3 years). It's that good!
 
You honestly probably can't go wrong with anything made in the last 2 years if you are looking at upper midrange and up. Both Intel and AMD are rockin it.
 
What happens if AMD releases a 7900X3D after the 7800X3D? And then maybe a 7950X3D?

(truncated for brevity)

Yes, true. Though the AMD 7900X3D will presumably be two 6-core CCDs with 96MB of L3 each, and thus carry an inter-CCD latency penalty unless you lock the game's threads to one CCD?

Intel, yeah, they could. Though since Alder Lake I have been waiting for a 10 P-core part, and here we are almost a year later with nothing.

Or an 8 P-core chip with huge L3 cache. Is that on their roadmap? Not saying it cannot happen, but do you ever remember a CPU coming out within a few months that was not on either company's roadmap?

AMD does have 3D cache versions of Ryzen 7000 on their roadmap.

Intel does not appear to have anything other than a 13900KS and, of course, lower-tier non-K Raptor Lake SKUs until Meteor Lake. And even Meteor Lake seems very unknown, other than a late 2023 release, unless it is pushed back, which given the meager details seems likely.

Though I would love it if Intel came out with a 10 P-core Raptor Lake with more L3 cache, or even an 8 P-core Raptor Lake with double or triple the L3. A drop-in upgrade, unlike Meteor Lake.
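Locking a game's threads to one CCD, as suggested above, is doable from user space. A minimal Linux-only sketch using Python's `os.sched_setaffinity`; the "CCD0" set here is a stand-in (half of this machine's CPUs), since the real core-to-CCD mapping is machine-specific (check `lscpu --extended` for the L3 topology on a dual-CCD part):

```python
import os

def pin_to_cores(pid, cores):
    """Confine a process to a subset of logical CPUs (Linux only).
    On a dual-CCD Ryzen you would pass the first CCD's CPU numbers."""
    os.sched_setaffinity(pid, cores)
    return os.sched_getaffinity(pid)

# Demo: use the first half of this machine's CPUs as a stand-in "CCD0".
all_cpus = sorted(os.sched_getaffinity(0))
ccd0 = set(all_cpus[: max(1, len(all_cpus) // 2)])
allowed = pin_to_cores(0, ccd0)    # pid 0 = the calling process
print(allowed == ccd0)             # True: now confined to "CCD0"
```

On Windows, tools like Process Lasso or Task Manager's affinity dialog do the same job without code.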
 
I vote Intel leave the Core naming scheme behind, since they have nothing to do with any previous Core iterations :D

Less syllables are gooder ;)
 
Yes, true. Though the AMD 7900X3D will presumably be two 6-core CCDs with 96MB of L3 each, and thus carry an inter-CCD latency penalty unless you lock the game's threads to one CCD?

(truncated for brevity)

If you want the best, then just wait for the X3D variants. Meteor Lake is the next big Intel release, and it's a ways out still.
 
You honestly probably can't go wrong with anything made in the last 2 years if you are looking at upper midrange and up. Both Intel and AMD are rockin it.
This is a good point!
I vote Intel leave the Core naming scheme behind, since they have nothing to do with any previous Core iterations :D

Less syllables are gooder ;)
Also a good point!
 
Exceptionally few games. However, every flagship GPU was hailed as a miracle when it launched: the 3090, the 2080 Ti, the 1080 Ti all got the same treatment. The truth is, the RTX 4090 is not special. It is extremely powerful, but it will age just like all the GPUs that came before it, and it, too, will lose its throne.

Don't worry yourself about bottlenecks too much. A 5800X3D will probably run the RTX 50 series cards perfectly well, especially at the high resolutions you should be running these GPUs at. Cheers
 
If you don't want to wait, pick up a 5800X3D, 13700K, or 7700X... If you don't mind waiting, pick up whatever 8-core X3D chip AMD releases.

No need to make it complicated; all these CPUs will last you years. The only upside of AM5 is you'll likely get a decent in-socket upgrade in 2025.

In order of importance, what matters most for games:

1. Single-threaded performance
2. L2/L3 cache speed/amount
3. Supported DDR speed/latency, although more cache reduces this dependency
4. Core count

As an example, if you stuck the same amount of cache that comes on the 13900K onto the 13600K and ran them at the same frequency, they'd likely be within margin of error of each other.
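Point 2 above is easy to feel directly: a pointer-chase whose working set fits in cache runs much faster per hop than one that spills past L3. A rough Python sketch, with illustrative sizes; Python's interpreter overhead blunts the effect compared to C, so the gap you see understates the hardware reality:

```python
import random
import time

def chase(n, steps=200_000):
    """Follow a random permutation of n slots for `steps` hops and
    return the elapsed time. A small n stays cache-resident; a huge n
    makes most hops a trip toward main memory."""
    nxt = list(range(n))
    random.shuffle(nxt)
    i = 0
    t0 = time.perf_counter()
    for _ in range(steps):
        i = nxt[i]
    return time.perf_counter() - t0

small = chase(4_096)        # tiny working set: cache-friendly
large = chase(2_000_000)    # tens of MB: spills past a typical L3
print(f"small: {small:.3f}s  large: {large:.3f}s")
```

This is also why a 96MB X3D cache helps exactly the scattered-access workloads (open-world games, simulations) that punish small caches.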
 
If you do go with the 13 series, you really don't have to disable the e-cores. I wouldn't be too fixated on their presence.

Edit: then there is this:

But sounds like something that needs verification.
 
If you do go with the 13 series, you really don't have to disable the e-cores. I wouldn't be too fixated on their presence.

Edit: then there is this:

But sounds like something that needs verification.


If I want to overclock and stay on air, wouldn't heat output be much lower with e-cores disabled?

Plus, I will be running Win 10.

And how are e-cores less of a nuisance on 13th gen than on 12th gen? They are still there, with the same scheduling issues and such. The only thing they don't seem as bad at is dragging the ring clock down; otherwise it's the same. And I am sure the ring can still be clocked much higher with them off, like over 5GHz, whereas on 12th gen I read about 4.8GHz with them off and 4.2GHz max with them on.
 
I am trying to future-proof the build a bit so it can last through video card upgrades, and potentially even take a drop-in CPU upgrade. I know AMD stated they are supporting AM5 through 2025, but does that mean it will work with Zen 5? If not, it is almost pointless except for a 3D V-Cache upgrade, because it's possible they keep the same socket but require a new chipset, like Intel has done before.
Still, 8 physical cores are fine. 8 threads, not so much, if you get half of them from SMT ;)

6-12 threads is the sweet spot right now, and if it goes up, it will be 8c/16t at best. Still fine with 8 cores.

I've been on an 8700K for quite a while now and nothing suffers. 2025 is nowhere near the life expectancy; think 2027-2028, rather. My CPUs generally do fine for gaming for 5-7 years. It's been like that since the quad-core era, and you will be fine with midrange even then.

And note I do play CPU-focused games too, grand strategy and stuff.
 
My second rig is a 7700K with a GTX 1080, and it does 1440p lovely. Not ultra settings, but still what I'd call eye candy, all on 4 cores @ 5GHz.
 
If I want to overclock and stay on air, wouldn't heat output be much lower with e-cores disabled?

(truncated for brevity)
I don't think you will be overclocking on air with the 8-core parts; they run too hot for that.

I am on high-end air cooling, Win 10, and a 12900KS. The e-cores aren't an issue for me; I haven't felt the need to disable them, and the processor just tears through gaming workloads with ease. Rather than overclocking on air, I underclock and undervolt: I run the 12900KS with the multiplier at 50x and a load voltage of 1.18V vcore (using an offset), and in gaming workloads temperatures are around 65-70C. These chips are quite reasonable on air coolers if you reduce the clock speed, and the performance drop isn't noticeable for gaming.
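Why the underclock/undervolt above pays off so well: dynamic CPU power scales roughly as C·V²·f, so a voltage cut counts twice. A quick sketch; the stock voltage and clock figures are assumptions for illustration, not measured 12900KS values:

```python
def relative_dynamic_power(v_ratio, f_ratio):
    """Dynamic power ~ C * V^2 * f; capacitance C cancels in a ratio."""
    return (v_ratio ** 2) * f_ratio

# Hypothetical stock ~5.2 GHz @ 1.35 V vs the 5.0 GHz @ 1.18 V above.
drop = relative_dynamic_power(1.18 / 1.35, 5.0 / 5.2)
print(f"{drop:.2f}x stock dynamic power")   # ~0.73x, i.e. roughly 27% less heat
```

A ~4% frequency loss buying a ~27% heat reduction is why the gaming performance drop is barely noticeable while temperatures fall dramatically.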
 
I don't think you will be overclocking on air with the 8 core parts, these parts are too hot for that.

I am on high end aircooling and Win 10 and 12900KS, the e core isn't an issue for me, I haven't felt the need to disable it, the processor just tears through gaming workloads with ease. Contrary to overclocking on air, I underclock and undervolt on air, I run the 12900KS with multiplier at 50x and the load voltage 1.18vcore (using offset), in gaming workloads the temperatures are around 65-70C. They are quite reasonable on aircoolers if you reduce the clock speed, the performance drop isn't noticeable for gaming.


Even with only 8 cores and the e-cores shut down, with a well-binned chip and a top-tier air cooler like the Noctua NH-D15??

When I had one before (I sold it; now I want to get back into gaming, since I should have lots of time again in the foreseeable future due to changes in my life over the past year), I was able to easily cool a 12700K at 5GHz all-core with e-cores shut off on a Noctua NH-D15S, and temps peaked in the mid-80s in Cinebench R23. Of course it hits 100C in the Small FFT portion of Prime95 Blend, but that test is beyond torture.

Even in Linpack Xtreme, temps peaked in the low 90s and only averaged in the low 80s. Real-world usage never goes above the 70s, and usually not above the high 60s.

So would I really struggle to cool a Raptor Lake 13700K or 13900K with e-cores shut down and all cores clocked to 5.5-6GHz, given these chips are much better binned and clock so much higher, when I could easily cool a 12700K at 5GHz core / 4.8GHz ring on a good air cooler?
 
Thanks for the explanation. That's fine if your attention span for gaming is only an hour and you mostly get your kicks from playing with the hardware. I never tell anyone that they must play video games. If you want to use an RTX 4090 as a doorstop, you are free to do so.

Hell, my own interest in video games is limited and my sessions are usually over in a couple of hours.

Why don't you just run benchmarks instead? Some are very pretty and you don't have to worry about your character dying and inconveniently interrupting your admiration of the RTSS statistics overlay.
I downloaded Cyberpunk for that, it makes my RX 580 cry and now robs my time, but man, it's pretty.

You honestly probably can't go wrong with anything made in the last 2 years if you are looking at upper midrange and up. Both Intel and AMD are rockin it.
An i9-12900T exists: 35 watts, 1.4/1.0GHz base clocks. :fear:
 
Even with only 8 cores and the e-cores shut down, with a well-binned chip and a top-tier air cooler like the Noctua NH-D15??

(truncated for brevity)
Alder Lake at 5GHz is easy on an air cooler. It gets progressively harder with each +100MHz above that without throttling. I am running my 12900KS at 5GHz with the e-cores on under an NH-D15S, so our setups aren't that different. I think Raptor Lake is the same deal, but it was just released, so maybe there will be more info soon. Might be worth using the aftermarket mounting bracket, which is claimed to produce an ~8C drop; that might be enough for an additional +100MHz.
 
I doubt you'll be able to run a 13900K at 6GHz on air cooling, even with the e-cores disabled.
 
Alder Lake at 5GHz is easy on an air cooler.

(truncated for brevity)


Oh yes, getting past 5GHz was so much harder; each extra 100MHz needed a big vcore jump, and temps went through the roof.

Though with Raptor Lake I wonder if it has another 500-600MHz, or even 1000MHz, of headroom before hitting the thermal wall.

I doubt you'll be able to run a 13900K at 6GHz on air cooling, even with the e-cores disabled.


What is the fastest you think I could do on air? 5.8GHz?? 5.6GHz?? I see Tom's Hardware had a 13900K at 5.6GHz all-P-core on a standard AIO, and Buildzoid had a 13600K at 5.6GHz all-P-core doing just fine with a mediocre 240mm AIO. And the NH-D15S is, I believe, comparable to mediocre 240mm AIOs, or maybe even a little better.

And isn't the 13600K binned worse than the 13700K and 13900K? Though those have 2 more P-cores to deal with.
 
Oh yes, getting past 5GHz was so much harder; each extra 100MHz needed a big vcore jump, and temps went through the roof.

(truncated for brevity)

You should definitely match the 240mm AIO, unless he was blasting 3000RPM fans with it. Kinda depends what you plan on doing with it, I guess, and whether you are OK removing the 100C Tjmax in the BIOS. I almost want you to get one just so you can report back how hard you can push it. Maybe @ir_cow could test this; not sure if he has a D15S, though maybe he has a similar cooler.

You may want to wait for the KS model if you plan on doing something like this; it should be much better binned than the vanilla K model.

And yes, while the 13600K is typically binned worse than the i7 and i9 models, that isn't always the case. Someone could get lucky with the i5 and get a golden sample; the silicon lottery is still a thing with all these chips.
 
Last edited: