
Is Intel going to deliver a processor/chipset worth waiting for?

I'm comparing the 13600KF to the 7800X3D. According to this very website the X3D is 10% faster at 1080p with a 4090, and pretty much identical at higher resolutions. Also, the i5 is faster in every other task.

The X3D was for the most part at 420 to 450 euros while the 13600KF was usually at or below 300. You ignored the obvious value and bought the X3D instead, yet here you are talking about the value of the 7600X. Okay...

Oh yeah, cause that was so obvious. What a facking troll reply. Regardless, in actual CPU-limited scenarios the gap will be a lot bigger than 10%. And the 13600KF is 2300 DKK here, and the 7800X3D is 2900 DKK - not a 50% gap.

The 7600X is the same speed in games as the 13600KF and a lot cheaper.


 
I'm comparing the 13600KF to the 7800X3D. According to this very website the X3D is 10% faster at 1080p with a 4090, and pretty much identical at higher resolutions. Also, the i5 is faster in every other task.
Nobody can deny that the 7800X3D is an excellent one-trick pony. Is it a good universal CPU? Not at all.

I must say I have not read this entire thread, but to the title question I would reply that Intel first needs to demonstrate and prove that their product is worth buying, because the last product from them really worth buying was Alder Lake, which brought a significant improvement in performance and was notably better than the AMD competition at that time (though not in energy efficiency).
 
Oh yeah, cause that was so obvious. In actual CPU-limited scenarios the gap will be a lot bigger than 10%.

The 7600X is the same speed in games as the 13600KF and a lot cheaper.


How much more CPU-limited can you get? TPU tested 720p with a 4090, and the average difference between the X3D and the 13600K/14600K is 11%. I mean come on now man....

I mean I'm willing to test even lower resolutions for you; I'll turn off 2 P-cores on my 12900K to simulate the 13600K. Wanna drop to 480p in The Last of Us?
 
Yeah....

[popcorn GIF]
 
Nobody can deny that the 7800X3D is an excellent one-trick pony. Is it a good universal CPU? Not at all.

I must say I have not read this entire thread, but to the title question I would reply that Intel first needs to demonstrate and prove that their product is worth buying, because the last product from them really worth buying was Alder Lake, which brought a significant improvement in performance and was notably better than the AMD competition at that time (though not in energy efficiency).
I don't know man, the way I'm looking at it, as an owner of a 12900K the only reasonable upgrades would be the 7950X3D and the 14900K. If someone suggested I upgrade to the 7800X3D I'd just laugh them out of the room; it's a huge downgrade for me.
 
How much more CPU-limited can you get? TPU tested 720p with a 4090, and the average difference between the X3D and the 13600K/14600K is 11%. I mean come on now man....

I mean I'm willing to test even lower resolutions for you; I'll turn off 2 P-cores on my 12900K to simulate the 13600K. Wanna drop to 480p in The Last of Us?

It's 10% on AVERAGE, cause it varies on a game-by-game basis - some games are never CPU-limited, regardless of settings, while others are massively CPU-limited. And in the ones that are actually CPU-limited, the 7800X3D roflstomps all over the CPUs that are "on average" 10% slower.

borderlands-3-1280-720.png


cyberpunk-2077-1280-720.png


elden-ring-1280-720.png


But it's quite obvious at this point that you are just trolling...
 
I don't know man, the way I'm looking at it, as an owner of a 12900K the only reasonable upgrades would be the 7950X3D and the 14900K. If someone suggested I upgrade to the 7800X3D I'd just laugh them out of the room; it's a huge downgrade for me.
I am running a slow, eco-efficient 14900K (I think now at 5200/4200 MHz), don't feel the need to buy anything else on the market now, and honestly I have doubts about what Intel will bring as their next CPU. I will need to see it first to believe it.

I do not believe my 14900K would last long at default settings. The insane power draw and frequencies Intel ships some CPUs with should be pointed out frequently so that people do not destroy their CPUs for no good reason.
 
1280x720 :D
 

Are you one of those intelligent people who don't understand why comparing CPU performance should be done while eliminating the GPU bottleneck?
 
It's 10% on AVERAGE, cause it varies on a game-by-game basis - some games are never CPU-limited, regardless of settings, while others are massively CPU-limited. And in the ones that are actually CPU-limited, the 7800X3D roflstomps all over the CPUs that are "on average" 10% slower.

borderlands-3-1280-720.png


cyberpunk-2077-1280-720.png


elden-ring-1280-720.png


But it's quite obvious at this point that you are just trolling...
Of course it's 10% on average, just like the 5% you quoted between the 7600X and the 13600K was the average. When you were making your argument, averages were fine; when I'm copying your argument, averages are misleading? Lol man

Easy example: in AOE, according to TPU, the 13600K is 30% faster than the 7600X. See, I can find extreme scenarios too. You really are using different metrics to suit your argument every time, and then you accuse me of trolling when you are the one fully engaging in trollish behavior.
 
Are you one of those intelligent people who don't understand why comparing CPU performance should be done while eliminating the GPU bottleneck?

I know right, but come on man....
 
Of course it's 10% on average, just like the 5% you quoted between the 7600X and the 13600K was the average. When you were making your argument, averages were fine; when I'm copying your argument, averages are misleading? Lol man

Easy example: in AOE, according to TPU, the 13600K is 30% faster than the 7600X. See, I can find extreme scenarios too. You really are using different metrics to suit your argument every time, and then you accuse me of trolling when you are the one fully engaging in trollish behavior.

The higher the fps a CPU is able to reach, the more the GPU bottleneck comes into play - hence why you only see the real difference between the 7800X3D and the rest in games that are not GPU-bottlenecked at the settings used, while the differences between midrange CPUs are apparent in nearly all games. And those are not extreme examples - Microsoft Flight Simulator is an extreme example. These just show the real difference between the 7800X3D and the rest.
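
To illustrate the point with a toy sketch (made-up numbers, not TPU's data - the game names and FPS figures below are invented just to show the effect):

```python
# Toy example: how GPU-bound games compress the "average" gap between two CPUs.
# cpu_a / cpu_b = FPS each CPU could deliver with an infinitely fast GPU (invented numbers).
games = {
    "cpu_limited_game":   {"cpu_a": 200, "cpu_b": 150, "gpu_cap": 400},  # CPU decides the FPS
    "gpu_limited_game_1": {"cpu_a": 220, "cpu_b": 200, "gpu_cap": 120},  # GPU caps both CPUs
    "gpu_limited_game_2": {"cpu_a": 180, "cpu_b": 170, "gpu_cap": 100},
}

ratios = []
for name, g in games.items():
    fps_a = min(g["cpu_a"], g["gpu_cap"])  # actual FPS = the lower of the two limits
    fps_b = min(g["cpu_b"], g["gpu_cap"])
    ratios.append(fps_a / fps_b)
    print(f"{name}: {fps_a} vs {fps_b} fps (+{(fps_a / fps_b - 1) * 100:.0f}%)")

average_gap = sum(ratios) / len(ratios)
print(f"average gap: +{(average_gap - 1) * 100:.0f}%")  # ~11%, despite +33% where the CPU actually matters
```

Same idea as the TPU average: the CPU-limited title shows +33%, the two GPU-capped ones show 0%, and the "average" lands at about +11%.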

Yes, AOE and TLOU are 2 of the games, countable on one hand, that actually make use of the E-cores... I wonder why you are so eager to compare exactly those... However, the fact of the matter is that most games don't even fully use 8 cores, hence why there is less than a 2% difference between the 7600X and the 7700X.

I know right, but come on man....

Come on what? If you are complaining about low res when testing CPU performance, then you clearly haven't understood the purpose.
 
Is Intel going to deliver a processor/chipset worth waiting for..

You clearly missed the thread title....
 
The higher the fps a CPU is able to reach, the more the GPU bottleneck comes into play - hence why you only see the real difference between the 7800X3D and the rest in games that are not GPU-bottlenecked at the settings used, while the differences between midrange CPUs are apparent in nearly all games. And those are not extreme examples - Microsoft Flight Simulator is an extreme example. These just show the real difference between the 7800X3D and the rest.

Yes, AOE and TLOU are 2 of the games, countable on one hand, that actually make use of the E-cores... I wonder why you are so eager to compare exactly those... However, the fact of the matter is that most games don't even fully use 8 cores, hence why there is less than a 2% difference between the 7600X and the 7700X.
AOE is Age of Empires; I don't think it really uses the E-cores. But it doesn't really matter, that's not the point. When you claimed that the average difference is 5% I accepted it, since that's the average. When I told you the difference is 10% you jumped on me and called me a troll cause of some extreme cases. The same extreme cases exist for the 7600X too.

Have you tried Cyberpunk with RT/PT on that thing? Tom's Diner hits 100% utilization and frames drop to as low as the 50s. If you want to use extreme cases then you can't tell me that the difference in gaming between a 13600K and a 7600X is 5%.

Most games actually use E-cores; I have no idea why you keep suggesting otherwise. So far I've tested Warzone, Cyberpunk, TLOU, Spider-Man Remastered and Starfield, and they all work better with E-cores on. Specifically, in Warzone the 1% lows went from 120 all the way up to 190 when E-cores were turned on.
 
AOE is Age of Empires; I don't think it really uses the E-cores. But it doesn't really matter, that's not the point. When you claimed that the average difference is 5% I accepted it, since that's the average. When I told you the difference is 10% you jumped on me and called me a troll cause of some extreme cases. The same extreme cases exist for the 7600X too.

Have you tried Cyberpunk with RT/PT on that thing? Tom's Diner hits 100% utilization and frames drop to as low as the 50s. If you want to use extreme cases then you can't tell me that the difference in gaming between a 13600K and a 7600X is 5%.

Most games actually use E-cores; I have no idea why you keep suggesting otherwise. So far I've tested Warzone, Cyberpunk, TLOU, Spider-Man Remastered and Starfield, and they all work better with E-cores on. Specifically, in Warzone the 1% lows went from 120 all the way up to 190 when E-cores were turned on.

cyberpunk-2077-1280-720.png
 
Isn't it obvious that he is not using RT and is testing in a very light area? A 12900K does not get anywhere near 170 fps in heavy areas with RT on. More like 90-100.

That obviously wasn't the point - you claim that E-waste cores make Cyberpunk run better; his testing clearly shows otherwise.
 
That obviously wasn't the point - you claim that E-waste cores make Cyberpunk run better; his testing clearly shows otherwise.
Obviously if the test is done in a very light area then extra cores won't do anything; do I really have to explain that? I can find areas in TLOU where the E-cores don't do anything either, cause they are very light.

It's common knowledge for Intel users that for maximum gaming performance you turn off HT and leave E-cores on. Anything else is just suboptimal. But if you wanna keep arguing, whatever, how can I convince you?
 
Are you one of those intelligent people who don't understand why comparing CPU performance should be done while eliminating the GPU bottleneck?
That obviously wasn't the point - you claim that E-waste cores make Cyberpunk run better; his testing clearly shows otherwise.
You had a different word there in place of "intelligent people". You also insulted most of the cores in my home PC's CPU.

Why don't you go for a walk for a while to calm down?
 
I am running a slow, eco-efficient 14900K (I think now at 5200/4200 MHz), don't feel the need to buy anything else on the market now, and honestly I have doubts about what Intel will bring as their next CPU. I will need to see it first to believe it.

I do not believe my 14900K would last long at default settings. The insane power draw and frequencies Intel ships some CPUs with should be pointed out frequently so that people do not destroy their CPUs for no good reason.
You'd get better performance with comparable efficiency with something like 5600/3200 MHz, or a per-P-core OC depending on the number of threads used, like 6.0, 6.0, 5.9, 5.8, 5.6, 5.6, 5.5, 5.5 GHz. E-cores are most efficient in power draw at 3.2 GHz.

You can also turn off hyperthreading in the BIOS and lock processes to specific cores for further performance using Process Lasso (the ring can be clocked higher, and so can RAM/general OC with HT off; not necessary with 16 E-cores anyway). E.g. all background tasks/messaging/Steam overlay etc. go to E-cores, everything else to P-cores, perhaps with two set aside for background tasks you want handled with priority. That way you get more ST performance without compromising efficiency.
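
If you'd rather script the same idea than use Process Lasso, here's a minimal sketch with Python's psutil. The logical-core indices, process names and game executable below are assumptions (roughly a 14900K with HT off, where 0-7 would be P-cores and 8-23 E-cores), so check your own layout in Task Manager first:

```python
# Minimal sketch (not Process Lasso itself): push background apps onto E-cores
# and keep the game on P-cores, using psutil's cpu_affinity().
# ASSUMED layout for a 14900K with HT off: logical CPUs 0-7 = P-cores, 8-23 = E-cores.
import psutil

P_CORES = list(range(0, 8))    # assumed P-core logical CPU indices
E_CORES = list(range(8, 24))   # assumed E-core logical CPU indices

BACKGROUND = {"Discord.exe", "steamwebhelper.exe", "Spotify.exe"}  # example names, adjust to taste
GAME = "Cyberpunk2077.exe"                                         # example game executable

for proc in psutil.process_iter(["name"]):
    try:
        if proc.info["name"] in BACKGROUND:
            proc.cpu_affinity(E_CORES)   # background stuff -> E-cores
        elif proc.info["name"] == GAME:
            proc.cpu_affinity(P_CORES)   # the game -> P-cores only
    except (psutil.AccessDenied, psutil.NoSuchProcess):
        pass  # skip protected/system processes we can't touch
```

Affinity set this way only lasts for the lifetime of each process, so it has to be re-run per session; the convenience of Process Lasso is that it keeps applying the rules for you.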

Per-core/all-core OC is the only thing I miss on my X3D. Locked clocks are great and my current scientific work doesn't require MT, though that may change in the future.

Oh, Intel validates their CPUs using advanced accelerated wear (higher temperature/voltage/heat cycles than they'll ever see in normal use) to last approx 20 years with stock settings under typical conditions (including at their thermal limit of 100/115 C), so don't worry about that.
 
Since I got my 12700K I have just left it as is and not turned off the E-cores. I have not found any detriment to having them enabled. The funny thing is how people who hate them don't accept that eventually all CPUs from Intel and AMD will have some form of big.LITTLE core setup; get used to it.
 
Yes, AOE and TLOU are 2 of the games, countable on one hand, that actually make use of the E-cores... I wonder why you are so eager to compare exactly those... However, the fact of the matter is that most games don't even fully use 8 cores, hence why there is less than a 2% difference between the 7600X and the 7700X.
The TPU review shows that "E-cores enabled" is faster on average... It would be interesting to see why this happens. I would have argued that the E-cores are faster because they handle background tasks, but my understanding is that W1zzard uses a very barebones configuration of Windows for reviews anyway:
RTX 4090 & 53 Games: Core i9-13900K E-Cores Enabled vs Disabled Review - Benchmark Results | TechPowerUp
 
You'd get better performance with comparable efficiency with something like 5600/3200 MHz, or a per-P-core OC depending on the number of threads used, like 6.0, 6.0, 5.9, 5.8, 5.6, 5.6, 5.5, 5.5 GHz. E-cores are most efficient in power draw at 3.2 GHz.
...
Per-core/all-core OC is the only thing I miss on my X3D. Locked clocks are great and my current scientific work doesn't require MT, though that may change in the future.

Oh, Intel tests their CPUs to last approx 20 years with stock settings, so don't worry about that.
I do not want high frequencies, in order to avoid high voltage.

E-cores at such a low frequency do not perform well.

How does per core setting work in normal AMD CPUs? With Intel you can limit individual P cores and groups of four E cores to different frequencies.

That thing about a 20-year CPU life at stock settings is simply not true anymore and INTEL LIES. Run your 14900K at stock settings, load it heavily, let it run at the thermal limit and see what will happen...
 