
RTX 4090 & 53 Games: Core i9-13900K E-Cores Enabled vs Disabled

W1zzard

In this week's TPU50 Megabench we're testing whether you can unlock additional gaming performance by disabling the E-Cores on a Raptor Lake Core i9-13900K processor. For our benchmarks we used the mighty GeForce RTX 4090, with 53 games at three resolutions.

 
Nice review! I wonder what difference it would make with many (background) applications open. I'd expect that E-cores enabled would fare much better, given they can handle the other tasks while the P-cores run the game.
 
@W1zzard Since overall performance is virtually the same, with a few outliers in either direction, did it have any positive, negative, or game-dependent effect on power consumption, or maybe on the 1% (5%) lows?

Since the average user will have more background processes running, I would venture a guess that in real-world usage it would tilt toward E-cores on rather than off, if it even has an impact.
 
The E-cores are to Intel what chiplets are to AMD: a very good investment, one the competition should adopt in order to stay competitive.

I wasn't aware that anybody was worried about the E-cores in games...

Anyway, the 13900K is a bad investment for gaming alone and should be avoided if you don't run other workloads. Put your money into the GPU instead.

But it shows nicely that you can safely grab a 13700, or better yet a 13600, for gaming only and leave the E-cores on.
 
I've left all cores enabled since my Alder Lake upgrade and have no plans to disable any; for me it runs just fine at stock.
 
Very, very interesting. With E-cores disabled, wouldn't a 13900K be essentially the same as an i7-13700K, save for some minor clock differences that can be shaved away by overclocking, and 6 MB more L3? (The i7 tops out at 5.4 GHz vs 5.8 GHz, but both have the same max P-core turbo of 5.4 GHz; the rest is Turbo Boost Max 3.0 or TVB.)

If you don't want to use E-cores, you could save a ton of money by going with an i7 and overclocking the snot out of it.

An i7 platform with the E-cores permanently disabled, paired with DDR5, is very enticing price-wise.
 
I found my frametimes are better with E-cores enabled. Even in Far Cry 5, where the average is a bit lower, the game feels smoother, not as jittery.
 
The E-cores are to Intel what chiplets are to AMD: a very good investment, one the competition should adopt in order to stay competitive.

I wasn't aware that anybody was worried about the E-cores in games...

Anyway, the 13900K is a bad investment for gaming alone and should be avoided if you don't run other workloads. Put your money into the GPU instead.

But it shows nicely that you can safely grab a 13700, or better yet a 13600, for gaming only and leave the E-cores on.
Maybe it's just the opposite? You can get a CPU without E-cores just for gaming and not lose any performance. E-cores exist to let Intel stay competitive in multithreading without losing too badly on efficiency.
 
The results are as expected. A small handful of games don't play great with E-cores, but the majority do, even in Windows 10.

I'd assume the final step in this review chain is the 13900K & 4090 on Windows 10 vs Windows 11, with E-cores enabled.
 
Very, very interesting. With E-cores disabled, wouldn't a 13900K be essentially the same as an i7-13700K, save for some minor clock differences that can be shaved away by overclocking, and 6 MB more L3? (The i7 tops out at 5.4 GHz vs 5.8 GHz, but both have the same max P-core turbo of 5.4 GHz; the rest is Turbo Boost Max 3.0 or TVB.)

If you don't want to use E-cores, you could save a ton of money by going with an i7 and overclocking the snot out of it.

An i7 platform with the E-cores permanently disabled, paired with DDR5, is very enticing price-wise.
Yes, but I've seen the i7-13700K max out its overclock at 5.5 GHz, so the P-cores on the i9-13900K might be binned for better clocks.
 
Gaming performance is determined mostly by the single-core performance of a handful of cores; if you have 5-7 such super-fast cores, you'll be flying in games.
Gaming is really a ridiculous load for modern CPUs: it demands near-infinite per-core speed, yet overall CPU utilization is extremely low, maybe 10-20% at most (for example, six fully loaded threads on a 32-thread 13900K is only about 19% overall utilization), while everything else sleeps.
 
To be fair, the OS's priority control is pretty coarse, like working with rough sandpaper: it only exposes a handful of fixed priority levels for dividing up 100% CPU utilization, as the sketch below illustrates.
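For illustration (my own sketch, not anything from the review): on Windows, psutil exposes exactly those few priority classes. This assumes Windows and an installed psutil, and the process name is made up.
[CODE]
# Minimal sketch of how coarse OS priority control is, using psutil.
# Assumes Windows (these priority-class constants only exist there)
# and `pip install psutil`; "game.exe" is a made-up process name.
import psutil

# Windows offers just six priority classes -- nothing in between.
PRIORITY_CLASSES = [
    psutil.IDLE_PRIORITY_CLASS,
    psutil.BELOW_NORMAL_PRIORITY_CLASS,
    psutil.NORMAL_PRIORITY_CLASS,
    psutil.ABOVE_NORMAL_PRIORITY_CLASS,
    psutil.HIGH_PRIORITY_CLASS,
    psutil.REALTIME_PRIORITY_CLASS,
]

def boost_process(name: str) -> None:
    """Raise every process matching `name` to the HIGH priority class."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == name:
            proc.nice(psutil.HIGH_PRIORITY_CLASS)

boost_process("game.exe")
[/CODE]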
 
Do you expect slightly bigger gains with E-cores enabled on Win11?
 
The results don't make sense to me. Aren't the games that run faster with E-cores enabled more likely just random variance? Far Cry being faster really makes no sense to me.
 
More intelligently assigning cores, along with more granular priority control that steers work away from the first core or sequentially accessed cores, is I think key to getting the most out of additional cores. Sometimes you need a bit of middle ground. Audio is one example: you can run into readily audible issues if it's set up improperly in exaggerated ways and under heavier workloads. That's true of programs in general; it's just that with audio the problem is audible to the end user, in the form of glitches.

I'd say software is far from perfect at fully leveraging hardware, as a rule of thumb. I wouldn't call it poor, but it can be far from exceptional on average. It has improved a lot in recent years, but that's because core and thread counts have gone up, expectations have risen with them, and more capable hardware makes it easier to see a noticeable uplift without lots of additional effort.
 
A great read. Well done @W1zzard, and I have to say, I love this new graph style; this is only the second or third time I've seen it, I think. I just really like it. It makes digesting the information so much easier than the older chart styles here.

A side note for you: Gigabyte motherboards have a software toggle for turning E-cores on and off (it doesn't even require their bloated main app; it's just a simple little toggle called DRM Fix Tool). I'll probably use it when I play Prey, but other than that I'll leave the E-cores on.
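If your board doesn't have such a toggle, a per-game alternative is pinning the game to the P-cores with CPU affinity. A minimal sketch of mine (not from the review), assuming psutil and the usual 13900K layout where the P-cores' 16 hyper-threads enumerate as logical CPUs 0-15:
[CODE]
# Per-process alternative to a BIOS/software E-core toggle: restrict a
# game to the P-cores via CPU affinity. Assumes a 13900K-style layout
# where the 8 P-cores' 16 hyper-threads are logical CPUs 0-15 -- verify
# the enumeration on your own system before relying on it.
import psutil

P_CORE_CPUS = list(range(16))  # logical CPUs assumed to back the P-cores

def pin_to_p_cores(name: str) -> None:
    """Restrict every process matching `name` to the P-cores."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == name:
            proc.cpu_affinity(P_CORE_CPUS)

pin_to_p_cores("Prey.exe")  # made-up executable name for the example
[/CODE]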
 
The biggest takeaway from this article is that you really shouldn't worry about the E-Cores.

@W1zzard Unfortunately, intended or not, that is not the picture your graphs are showing (literally). Clipping the graphs at +/-15% makes it look like there are significant differences. I really wish you wouldn't do that, in any of your comparison charts.
 
@W1zzard Unfortunately, intended or not, that is not the picture your graphs are showing (literally). Clipping the graphs at +/-15% makes it look like there are significant differences. I really wish you wouldn't do that, in any of your comparison charts.

Can you explain what clipping graphs means? Taking the game Prey as an example, can you put it in more context?
 
Hi W1zzard.

Can you please add an MMO to your games list? New World (extremely CPU/RAM intensive), Final Fantasy (CPU), WoW (RAM/CPU), something. We can compare the DX API or engine to get a ballpark guesstimate, but generally speaking MMOs operate at a different scale, and cross-comparisons with games like these don't translate well.

Thank you very much; I love the detailed breakdowns (DX11 vs. 12 vs. older games, etc.) and the continued effort. I really enjoyed the 5800X3D series, super helpful all around. Looking forward to more (SMT-disabled AMD next? DDR4 vs. DDR5 RAM differences on the 13900K? :))


signed,

~a fan
 
Can you explain what clipping graphs means? Taking the game Prey as an example, can you put it in more context?
The max value is not +/-100%, it's +/-15%. That makes a tiny 5% variance fill a third of the chart, which the brain really perceives as 33%. It's a means to emphasize small differences, but used without a disclaimer it's really just a way to condition the mind towards a certain outcome. A toy calculation of mine below puts numbers on it.
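Here's that sketch in Python:
[CODE]
# Toy calculation (mine, not from the review): the same 5% variance
# fills a third of a +/-15% axis but only a twentieth of a +/-100% one.
def displayed_fraction(variance_pct: float, axis_max_pct: float) -> float:
    """Fraction of the half-axis that a bar of `variance_pct` spans."""
    return variance_pct / axis_max_pct

for axis_max in (100, 15):
    frac = displayed_fraction(5, axis_max)
    print(f"+/-{axis_max}% axis: a 5% bar spans {frac:.0%} of the half-axis")

# Output:
# +/-100% axis: a 5% bar spans 5% of the half-axis
# +/-15% axis: a 5% bar spans 33% of the half-axis
[/CODE]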
 
The comparison is very interesting, but why was it done in Win10...
E-core scheduling support is only available in Win11, AFAIK.
 
Awesome job with the thorough testing. Handy to know for those wanting to get the absolute most out of the CPU for certain games.

I'm curious whether there's any noticeable difference in frametimes with E-cores on or off. It would be an interesting follow-up. Maybe mix in Hyper-Threading on or off as well: all cores and threads on; all cores on, no HT; P-cores only, no HT; and for fun, E-cores only, lol. Maybe even test with all cores locked at max clock vs. allowing them to downclock based on load. Test FPS and 1% and 0.1% lows as well (the sketch below enumerates the resulting matrix).

Just an idea for a slow day, I suppose. HT and turbo boosting might be negligible, but I do remember arguments back in the day about the possible impact on latency from HT, turbo speeds, frequency changes, etc. Maybe figure out which games or game engines prefer which environments. This could be a lot of work, though, so limiting it to a couple of games/engines would be understandable, especially if running just one batch of tests shows it's basically a pointless task.
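Just to show how the combinations stack up, a quick Python sketch (the configuration names are mine, not the review's):
[CODE]
# Enumerate the proposed follow-up test matrix; configuration names
# are made up for illustration.
from itertools import product

core_configs = [
    "all cores + HT",
    "all cores, no HT",
    "P-cores only, no HT",
    "E-cores only",
]
clock_modes = ["locked at max clock", "stock (downclocking allowed)"]
metrics = ["avg FPS", "1% lows", "0.1% lows"]

runs = list(product(core_configs, clock_modes))
print(f"{len(runs)} configurations, {len(metrics)} metrics each per game:")
for cores, clocks in runs:
    print(f"  {cores} @ {clocks}")
[/CODE]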

The max value is not +/-100%, it's +/-15%. That makes a tiny 5% variance fill a third of the chart, which the brain really perceives as 33%. It's a means to emphasize small differences, but used without a disclaimer it's really just a way to condition the mind towards a certain outcome.
Hah, reminds me of a video that I just watched today (or rather listened to)

 
The comparison is very interesting, but why was it done in Win10...
E-core scheduling support is only available in Win11, AFAIK.
A Windows 11 test on its own would be meaningless because, as you said, Windows 11 has the 'official' support for E-core scheduling, so you shouldn't typically expect less performance with E-cores enabled, right? That's also exactly what the 13900K review was.

This test shows that E-cores enabled aren't a detriment in the majority of games in Windows 10, so everyone can have a good time and will likely get the same performance in Windows 10 as in Windows 11 in games with the 4090.

But, like I said in my first comment on this review, I expect the logical conclusion to this chain of reviews to be Windows 10 vs Windows 11 with E-cores enabled, to test whether the performance is indeed basically the same.
 
Thanks for the exhaustive testing. I tried disabling E-cores in Spider-Man Remastered and got worse frametime consistency (Win11). I guess E-cores have their uses, despite some people claiming they're e-waste cores.
 