
Are more than 8 cores still overkill for high-end gaming at 1440p with an RTX 4090?

If you're buying new-gen stuff, I'd go with an i7-13700K or a 7950X for that RTX 4090.
 
I am trying to future-proof the build a bit so it can last through video card upgrades and maybe even take a drop-in CPU upgrade later. I know AMD stated they are supporting AM5 through 2025, but does that mean it will work with Zen 5? If not, it's almost pointless except for a 3D V-Cache upgrade, because it's possible they keep the same socket but require a new chipset, like Intel has done before.

AMD wasn't specific but logic/common sense would tend to indicate that whatever they call their next CPU generation would also work on the AM5 platform. They have not actually stated what they are calling it, whether it's Zen 4+, Zen 5, or SuperBanana 6969.

They could change chipsets and break backwards compatibility.

No one here has a crystal ball and no one here will guarantee that whatever you buy today will work with everything AMD releases three years hence.

The past is no guaranteed predictor of the future but there are probably market forces that will pressure AMD to offer some sort of backwards compatibility. Hopefully that is enough for you because AMD isn't going to promise you anything more.
 
Anything more than 8 cores is still extreme overkill. Games don't use those cores (look at any Ryzen 9 review), and likely never will in a realistic timeframe. Game consoles are still limited to 6c/12t for games, and you've got another 5 years before the next major generation. Unless that generation uses a Ryzen 9, the next one will likely still be 6c/12t.
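(One rough way to sanity-check the "games don't use those cores" claim on your own machine is to sample per-core load while the game runs. A minimal sketch, assuming Python 3 with the psutil package installed; the 60-second window is arbitrary.)

```python
# Sample per-core CPU usage once a second for 60 seconds while a game is running,
# then print the average load on each logical processor.
import psutil

samples = []
for _ in range(60):
    samples.append(psutil.cpu_percent(interval=1.0, percpu=True))  # one reading per second

num_cpus = len(samples[0])
for i in range(num_cpus):
    avg = sum(s[i] for s in samples) / len(samples)
    print(f"logical CPU {i}: {avg:.1f}% average load")
```

If only 6-8 logical processors show meaningful load, the extra cores of a Ryzen 9 are mostly sitting idle for that title.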
 
But in reality, in your opinion, for gaming only (including multiplayer), if you are not streaming or running intensive background tasks beyond the basics plus HWiNFO64 to monitor temps and MSI Afterburner to monitor frame rates, is there any tangible benefit in any game to more than 8 cores on your CPU?
A lot of the time, no, and an i3-12100F is perfectly capable, for example. But my point is you always want as much power as you can reasonably afford and want to afford. Sure, the i3 might be fine usually, but there may be a few games that benefit from more cores. Even right now, there are games that benefit a bit from more cores, even if the i3 still gets you about 90% of the performance.

And if you think yes, how are the 7900X and 7950X? Is there a penalty from having two CCDs, where cross-CCD latency could severely dip performance if game threads swap CCDs or have to communicate with one another across CCDs? Or is it not an issue at all if a heavily threaded game is coded correctly?
Sorry, can't answer that, I don't have Ryzen. My last "modern" AMD chip was FX 6300.
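(For what it's worth, the usual workaround people try for the cross-CCD worry is simply pinning the game to one CCD. A minimal sketch, assuming Python 3 with psutil, a hypothetical process name, and that logical CPUs 0-15 map to CCD0 on a 7950X; check the actual layout in Task Manager or HWiNFO before relying on it.)

```python
# Restrict an already-running game to the first CCD of a dual-CCD Ryzen
# so its threads never have to hop across the Infinity Fabric link.
import psutil

GAME_EXE = "game.exe"            # hypothetical name, replace with the real executable
CCD0_CPUS = list(range(16))      # assumption: logical CPUs 0-15 belong to CCD0

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and proc.info["name"].lower() == GAME_EXE:
        proc.cpu_affinity(CCD0_CPUS)     # may require running the script elevated
        print(f"Pinned PID {proc.pid} to CCD0")
```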

Of course I wish there were a CPU with more than 8 strong cores on a single ring/CCD. The last CPU like that was Intel's Comet Lake series, which had 10, and some of the Broadwell-E and Haswell-E chips had 10 on a ring as well. But those architectures' IPC is far worse and outdated compared to modern CPUs, so they would not be an answer.
True, but there was an interesting aberration: Broadwell. It had L4 cache. Obviously it's ancient now, but back then it was kind of like the 5800X3D. Yet it didn't sell well at all, and the media didn't care, despite it often beating Skylake chips. Forget about it, it's just me being grumpy about a decade-old injustice.
 
X3D right now is “only” good for games.

And power efficiency too I guess.
 
If you want to test if you really need eight cores, go get a G3258 and stick it with the 4090.
 
If you want to test if you really need eight cores, go get a G3258 and stick it with the 4090.
That's a moronic recommendation. First of all, it doesn't have good IPC anymore, and if it performs like crap, that says nothing about modern quad-core CPU performance. Right now there's still not much need even for an i5; an i3 is fine until you get something like an RTX 3080 or higher.

I would rather be interested in which games the i3 falls apart in and you start needing an i5, but those still aren't the 8-core chips we are talking about.
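(If you actually want to find the titles where the i3 falls apart, frametime logs say more than average fps. A minimal sketch that computes the average and 1% lows from a PresentMon-style CSV, assuming the classic msBetweenPresents column and a hypothetical file name.)

```python
# Parse a PresentMon capture and report average fps and 1% low fps.
import csv

frame_times_ms = []
with open("presentmon_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        frame_times_ms.append(float(row["msBetweenPresents"]))

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
frame_times_ms.sort(reverse=True)                       # longest (worst) frames first
worst = frame_times_ms[: max(1, len(frame_times_ms) // 100)]
low_1pct_fps = 1000.0 * len(worst) / sum(worst)
print(f"average: {avg_fps:.1f} fps, 1% low: {low_1pct_fps:.1f} fps")
```

A game where the i3's 1% lows collapse while the average stays fine is exactly the case where stepping up to an i5 pays off.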
 
That's a moronic recommendation. First of all, it doesn't have good IPC anymore, and if it performs like crap, that says nothing about modern quad-core CPU performance. Right now there's still not much need even for an i5; an i3 is fine until you get something like an RTX 3080 or higher.

I would rather be interested in which games the i3 falls apart in and you start needing an i5, but those still aren't the 8-core chips we are talking about.
You really don't know how to take a joke, do you?
 
The one good thing about the big multi-core CPUs is their boost speed, hella fast compared to stock.
 
A lot of the time, no, and an i3-12100F is perfectly capable, for example. But my point is you always want as much power as you can reasonably afford and want to afford. Sure, the i3 might be fine usually, but there may be a few games that benefit from more cores. Even right now, there are games that benefit a bit from more cores, even if the i3 still gets you about 90% of the performance.


Sorry, can't answer that, I don't have Ryzen. My last "modern" AMD chip was FX 6300.


True, but there was an interesting aberration: Broadwell. It had L4 cache. Obviously it's ancient now, but back then it was kind of like the 5800X3D. Yet it didn't sell well at all, and the media didn't care, despite it often beating Skylake chips. Forget about it, it's just me being grumpy about a decade-old injustice.


True, though Intel did stick 10 P-cores on Comet Lake, which was released in May 2020 at the beginning of the pandemic. I wish they would release a 10 P-core Alder Lake or Raptor Lake, which I think they could, instead of the only option being those E-cores, which I am not fond of at all. They could have done both, for those who want gaming plus a little extra leeway in case future games do start to benefit from more than 8 cores (which none currently do), and for those who do not want to deal with the hybrid arch.
 
In my experience, it seemed as though going from 8600K (6 core) to 12900KS (8 core) made a big difference on GTA V.
 
True, though Intel did stick 10 P-cores on Comet Lake, which was released in May 2020 at the beginning of the pandemic. I wish they would release a 10 P-core Alder Lake or Raptor Lake, which I think they could, instead of the only option being those E-cores, which I am not fond of at all. They could have done both, for those who want gaming plus a little extra leeway in case future games do start to benefit from more than 8 cores (which none currently do), and for those who do not want to deal with the hybrid arch.
There weren't any P-cores back then; they were just cores.

It seemed as though going from 8600K (6 core) to 12900KS made a big difference on GTA V.
Weird, because it stutters badly if it runs at too high an fps.
 
There weren't any P-cores back then; they were just cores.


Well yeah, the only reason P-cores became a thing is the hybrid arch. They were strong cores for SMP, though.

To me the E-cores are shunned, so Alder Lake and Raptor Lake are just 8-core CPUs; the P-cores are the only cores that count to me as good cores that can be used without the hybrid arch crap.
 
This is what GTA V (2014) does with my CPU usage

[Attachment: Screenshot 2022-10-21 213803.png]
 
Well yeah, the only reason P-cores became a thing is the hybrid arch. They were strong cores for SMP, though.

To me the E-cores are shunned, so Alder Lake and Raptor Lake are just 8-core CPUs; the P-cores are the only cores that count to me as good cores that can be used without the hybrid arch crap.
The E-cores are useful; they run the background stuff.
 
The E-cores are useful; they run the background stuff.
Can vouch. My wife's 12600KF has prevented any issues with background tasks stealing the P-cores. Even just four E-cores help with the Windows background goodies, and probably will on future chips too.
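(The affinity trick from earlier works in the other direction too: if you don't trust the scheduler, you can shove a background tool onto the E-cores yourself. A minimal sketch, assuming Python 3 with psutil, a hypothetical process name, and that logical CPUs 12-15 are the four E-cores on a 12600KF; verify the layout in Task Manager first.)

```python
# Keep a background/monitoring process off the P-cores by pinning it to the E-cores.
import psutil

BACKGROUND_EXE = "hwinfo64.exe"      # hypothetical name, replace with the real executable
E_CORE_CPUS = [12, 13, 14, 15]       # assumption: last four logical CPUs = E-cores

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and proc.info["name"].lower() == BACKGROUND_EXE:
        proc.cpu_affinity(E_CORE_CPUS)   # may require running the script elevated
        print(f"Moved PID {proc.pid} onto the E-cores")
```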
 
Yet it didn't sell well at all, and the media didn't care, despite it often beating Skylake chips. Forget about it, it's just me being grumpy about a decade-old injustice.
That was because it was late, made in limited quantities, and basically EOL'd about 6 (9?) months later by the 6700K. That was the first real sign of major node troubles for Intel, though originally 22nm was also late by a quarter or two, IIRC.
 
Well yeah, the only reason P-cores became a thing is the hybrid arch. They were strong cores for SMP, though.

I don't think hybrid architecture is really a key point. Reviewers have disabled P-cores and E-cores for benchmarking purposes and still run the same software.

P-cores/E-cores really became a thing because efficiency is valued by some.

Apple lives by the performance-per-watt mantra because most of their business is iPhone; over 85% of Mac sales come from notebooks. That's why they were really the first to widely market a device with differentiated CPU cores (yes, in the A-series SoC for iPhone).

Many people here at TPU (and other PC sites for that matter) ignore the fact that enterprise computing is a major influencer in how PC hardware develops.

CPU core differentiation (performance and efficiency) is being driven largely by organizations who also value performance-per-watt. The US federal government has power efficiency mandates that extend to computer equipment. It's not Joe Gamer who wants E-cores, it's the General Accounting Office purchasing agent who needs 5,000 desktop PCs from Dell, HP, etc.

When the operating system supports it and the task scheduler is properly configured, workloads will be directed to the more appropriate silicon. Apple does this pretty well with iOS/iPadOS/macOS. I think I read somewhere that Apple claimed that their Blizzard (efficiency) cores provide something like 80% of the performance of the Avalanche (performance) cores at a fraction of the power. Maybe my figures aren't exact but that's the point. Most mundane workloads can be handled by efficiency cores; the performance cores are waiting for those rarer instances when the system needs as much performance as it can get.

Because of Intel's botched migration from their 14nm process node, their power consumption skyrocketed, which probably forced them to adopt P-cores and E-cores faster. But don't worry, AMD will likely have to implement them at some point if they want to stay competitive for enterprise sales.

And remember, datacenter customers are all about performance-per-watt.
 
Hi,
Running a 1600.us + 4090 and nitpicking core count (or in Intel's case thread count) is pretty insane :laugh:
 