
RTX 4090 & 53 Games: Core i9-13900K E-Cores Enabled vs Disabled

You've kind of answered your own question: for percentage graphs, you should always go to 100%.

That, too.
Nope, there is no rule about fixing the % axis at 100% max. What do you do if you have a 150% or 2500% change? Or an important 0.5% change? (Not in game FPS, but in other scenarios.)
You need to fix it slightly above the maximum absolute value; that way you can see the relative change without zooming in, which is very inconvenient.
The data in the article is presented very clearly and in a good, convenient way.
Maybe you want to fix it at 100% to make it easier to ignore the result that E-cores don't really matter in games, so it is very much OK to leave them on.
I speculate that you don't fancy E-cores, correct?
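For what it's worth, here's a minimal matplotlib sketch of the scaling being argued for: cap the axis just above the largest bar instead of hard-coding it at 100%, so small relative differences stay visible. The game names and numbers are invented for illustration.

```python
# Sketch: scale a relative-performance chart just above the data's
# maximum rather than fixing the axis at 100%.
import matplotlib.pyplot as plt

games = ["Game A", "Game B", "Game C"]   # hypothetical titles
rel_perf = [99.2, 100.0, 101.5]          # % of E-cores-on baseline (made up)

fig, ax = plt.subplots()
ax.bar(games, rel_perf)
ax.set_ylim(min(rel_perf) * 0.98, max(rel_perf) * 1.02)  # small headroom above max
ax.set_ylabel("Relative performance (%)")
plt.show()
```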
 
One surprising thing to me was how DOTA 2 at 4K performed better with E-cores enabled. One would assume an older game would have a negligible difference.

edit: "Development of Dota 2 began in 2009 when IceFrog, lead designer of Defense of the Ancients, was hired by Valve to create a modernized remake for them in the Source game engine."

Ahh, Source engine; guess that makes sense now.
 
the 13900K doesn't even come close to 180 W while gaming.
Mine mostly sits between 60 and 80 W, with just a few spikes over 120-130 W (it depends on the game).


Actually, no. E-cores allow for a higher core count without being crazy expensive and power limited.


Actually, it is not "more optimized" for Windows 11. It is completely NOT OPTIMIZED in Windows 10, according to their own words:

[Attachment 269699]

So it is not very smart to use Windows 10 with 12th or 13th Gen Intel CPUs.


What about power consumption with those settings?
At what resolution do you game? If it's 60-80 W at 4K then that's impressive, but those are clearly not CPU-bound games.
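For anyone who wants to measure this themselves on Linux, here's a hedged sketch that estimates CPU package power from the standard RAPL powercap counter. The sysfs path is the usual one on recent Intel chips, but availability varies by kernel and platform, and on Windows you'd reach for a tool like HWiNFO instead.

```python
# Sketch: estimate CPU package power on Linux via the RAPL powercap
# interface (reads the microjoule energy counter and differentiates).
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package-domain counter

def read_uj():
    with open(RAPL) as f:
        return int(f.read())

def package_watts(interval=1.0):
    before = read_uj()
    time.sleep(interval)
    after = read_uj()
    # Counter is in microjoules and wraps eventually; ignored in this sketch.
    return (after - before) / 1e6 / interval

if __name__ == "__main__":
    while True:
        print(f"package power: {package_watts():6.1f} W")
```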
 
We absolutely know that the 13900K and the 4090 are a KILLER COMBO.

Unfortunately, anyone who didn't get the 4090 might as well NOT BUY ONE until the recalls on the cards are done and a safer option is available for the connectors.

I bought the 4090 ROG Strix on launch day and, to tell you the truth, I'm NOT using it right now - I'm using my 3090 Ti Kingpin.

I don't want my computer burning down my house.
 

Attachments

  • 311484405_3031177687192105_6757019300615541816_n.jpg (263.3 KB)
  • 315191559_1356345991797906_8845535410675067262_n.jpg (261.6 KB)
We absolutely know that the 13900K and the 4090 are a KILLER COMBO.

Unfortunately, anyone who didn't get the 4090 might as well NOT BUY ONE until the recalls on the cards are done and a safer option is available for the connectors.

I bought the 4090 ROG Strix on launch day and, to tell you the truth, I'm NOT using it right now - I'm using my 3090 Ti Kingpin.

I don't want my computer burning down my house.
I'll bite: what recall? Link a source, please.

I disagree. Intel forced the E-cores on us, tbh; no one asked for them. Without E-cores Intel would probably sell better, because the chips would be cheaper.
No one forces you into anything: go with an Intel CPU without E-cores, or with AMD, if the E-cores deter you so much for some reason :)

The E-cores concept is a wonderful thing, and AMD is on the way to adopting them in the future. If not, they will stay behind.

I too don't see the point of E-cores in HEDT CPUs.
E-cores are the way to go in HEDT, too. Just see how much a 13900K can do with its E-cores compared to a 7950X, and for considerably less money.
Generally, you will prefer a real core over an HT one (at high core counts, you still want at least those 8 P-cores for HEDT).
Going 8P+64E or 16P+32E would be a dream for many and would enable much better cost/performance.
Combine E-cores with Intel's version of chiplets and you get a really strong and affordable CPU.
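To make that cost/performance argument concrete, here's a back-of-the-envelope sketch. The area and throughput factors are assumptions for illustration only (roughly four E-cores in the die area of one P-core, each E-core at about half a P-core's throughput), not official Intel figures.

```python
# Back-of-envelope throughput-per-area comparison of hypothetical
# hybrid core configurations. AREA/PERF factors are rough assumptions.
P_AREA, E_AREA = 1.0, 0.25   # assumed: ~4 E-cores fit in one P-core's area
P_PERF, E_PERF = 1.0, 0.5    # assumed: an E-core gives ~half a P-core's throughput

def config(p_cores, e_cores):
    area = p_cores * P_AREA + e_cores * E_AREA
    perf = p_cores * P_PERF + e_cores * E_PERF
    return perf, area, perf / area

for p, e in [(8, 0), (8, 16), (8, 64), (16, 32)]:
    perf, area, ratio = config(p, e)
    print(f"{p}P+{e}E: throughput {perf:5.1f}, area {area:5.1f}, perf/area {ratio:.2f}")
```

Under those assumptions the hypothetical 8P+64E part gets the best throughput per unit of die area, which is the whole pitch.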
 
While this was done in the same format as the other tests, it really needs 0.1% lows and power consumption figures.


If gaming performance was the same but it used less power, ran cooler, or had fewer frame spikes, that's worth knowing about.
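For reference, here's a rough sketch of how 1% / 0.1% lows are commonly computed from a frame-time log (e.g. CapFrameX or PresentMon output). Exact definitions vary between tools; this uses the "FPS at the Nth-percentile frame time" variant, and the frame times are dummy data.

```python
# Sketch: derive average FPS and 1% / 0.1% lows from frame times (ms).
import statistics

def low_fps(frame_times_ms, percentile):
    """FPS at the given frame-time percentile (e.g. 99 for '1% low')."""
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * percentile / 100))
    return 1000.0 / ordered[idx]

frame_times = [16.7, 16.9, 17.1, 16.5, 40.0, 16.8] * 200  # dummy log with stutters
print("avg FPS :", 1000.0 / statistics.mean(frame_times))
print("1% low  :", low_fps(frame_times, 99))
print("0.1% low:", low_fps(frame_times, 99.9))
```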
 
The e-cores concept is a wonderful thing
Agreed.
and AMD are on the way to adopt them in the future. If not, they will stay behind.
Everyone knows that, except AMD. Yes, many people here expect to see a chip with one Zen 5 and one Zen 5c die in the next-generation Ryzen. But it's no more than thin speculation right now.

Will/would AMD stay behind without heterogeneous cores? Maybe. That's not a given. AMD and Intel are taking quite different paths to solve the same engineering problems, yet both achieve similar results and remain competitive.
 
I'll bite: what recall? Link a source, please.


No one forces you into anything: go with an Intel CPU without E-cores, or with AMD, if the E-cores deter you so much for some reason :)

The E-cores concept is a wonderful thing, and AMD is on the way to adopting them in the future. If not, they will stay behind.


E-cores are the way to go in HEDT, too. Just see how much a 13900K can do with its E-cores compared to a 7950X, and for considerably less money.
Generally, you will prefer a real core over an HT one (at high core counts, you still want at least those 8 P-cores for HEDT).
Going 8P+64E or 16P+32E would be a dream for many and would enable much better cost/performance.
Combine E-cores with Intel's version of chiplets and you get a really strong and affordable CPU.
Uhm, no.
 
@W1zzard
"On weaker GPUs, you'll be even more CPU limited, so the differences should be smaller."
Did you mean: even LESS CPU limited (or perhaps: even more GPU limited)?
 
I think the E-cores have been a great addition, as they certainly provide a significant uplift in pro workloads, especially software that utilises multiple cores, and, just as importantly, they take on all the background and low-priority tasks with smaller, more efficient cores. Using Windows 11 across rendering and encoding workloads on a 13700K, I have seen a huge performance uplift, and in the few games I play, mostly sim racing like AMS2, Assetto Corsa Comp, R3R and of course Cyberpunk, these 13th-gen CPUs are just stellar. Intel were in a quandary and this hybrid solution has come through; I can only see it getting better on smaller nodes with their new packaging technology on the horizon. Bottom line: competition is good.
 
@W1zzard
"On weaker GPUs, you'll be even more CPU limited, so the differences should be smaller."
Did you mean: even LESS CPU limited (or perhaps: even more GPU limited)?
Nice catch, should be "even more GPU limited" :)
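A toy model of the corrected sentence: delivered FPS is roughly the minimum of what the CPU and the GPU can each sustain, so the weaker the GPU, the smaller the visible E-cores on/off difference. All numbers here are hypothetical.

```python
# Sketch: delivered FPS as min(CPU-side cap, GPU-side cap).
def delivered_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

cpu_on, cpu_off = 240, 250        # assumed CPU caps: E-cores on vs off
for gpu_cap in (400, 200, 120):   # strong -> weak GPU
    on, off = delivered_fps(cpu_on, gpu_cap), delivered_fps(cpu_off, gpu_cap)
    print(f"GPU cap {gpu_cap}: E-on {on}, E-off {off}, delta {off - on}")
```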
 
Nice catch, should be "even more GPU limited" :)

Low-CL RAM affects the 7000 series' performance that much. 13600K vs 7600X or 13700K vs 7700X with the best RAM for both sides should be next, chief.
 

Low-CL RAM affects the 7000 series' performance that much. 13600K vs 7600X or 13700K vs 7700X with the best RAM for both sides should be next, chief.
Which "best ram" is more expensive and how much change?
 
Agreed.

Everyone knows that, except AMD. Yes, many people here expect to see a chip with one Zen 5 and one Zen 5c die in the next-generation Ryzen. But it's no more than thin speculation right now.

Will/would AMD stay behind without heterogeneous cores? Maybe. That's not a given. AMD and Intel are taking quite different paths to solve the same engineering problems, yet both achieve similar results and remain competitive.
Just imagine an 8+40 CPU; the multi-core performance would be insane.
 
One surprising thing to me was how DOTA 2 at 4K performed better with E-cores enabled. One would assume an older game would have a negligible difference.

edit: "Development of Dota 2 began in 2009 when IceFrog, lead designer of Defense of the Ancients, was hired by Valve to create a modernized remake for them in the Source game engine."

Ahh, Source engine; guess that makes sense now.
Dota 2 was arguably re-released in 2015 on Source 2 instead of Source 1. It's also had continuous technical upgrades.
What is curious is that the difference increases as the resolution increases.
 
Huge testing. The E-cores don't hinder performance, which is good to know, as it's bothersome to disable and re-enable them.

One correction though: at least in the US, you can get the 7950X for less than the 13900K.
 
Huge testing. The E-cores don't hinder performance, which is good to know, as it's bothersome to disable and re-enable them.

One correction though: at least in the US, you can get the 7950X for less than the 13900K.
Tbh, some mobos offer a key combo to toggle E-cores. Or you can use Process Lasso. But yeah, nothing beats working properly out of the box.
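In the same spirit as the Process Lasso approach, here's a hedged psutil sketch that pins a running game to the P-cores instead of rebooting to toggle E-cores in the BIOS. It assumes a 13900K-like layout where logical CPUs 0-15 are the eight hyperthreaded P-cores (verify your own topology first), it needs sufficient privileges, and the process name is just an example.

```python
# Sketch: pin a running process to the (assumed) P-core logical CPUs.
import psutil

P_CORE_THREADS = list(range(16))  # assumption: logical 0-15 = 8 P-cores with HT

def pin_to_pcores(exe_name):
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == exe_name:
            proc.cpu_affinity(P_CORE_THREADS)  # may need admin/root rights
            print(f"pinned PID {proc.pid} to P-cores")

pin_to_pcores("dota2.exe")  # hypothetical example process
```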
 
Need an 8-core, P-core-only chip from 13th gen now.
 
Need an 8-core, P-core-only chip from 13th gen now.
I'm sure if Intel were aware of that, they would bend over backwards to build one for you. :wtf:
 
I'm sure if Intel were aware of that, they would bend over backwards to build one for you. :wtf:
There's probably one coming; they did it for 12th gen, if I remember right. Or was that 6-core?
 
What is the power consumption difference with the E-cores enabled vs disabled in all these games?
 
Dota 2 was arguably re-released in 2015 on Source 2 instead of Source 1. It's also had continuous technical upgrades.
What is curious is that the difference increases as the resolution increases.
Source engines have always put more load on the CPU. Not sure why that helps at 4K, though.
@W1zzard
Time to do more Source engine game testing. ;)
 
Can the P-cores be disabled so we can have a P vs E core comparison? Duke out 8P vs 16E? That would be a pretty interesting comparison imo. I would also like another comparison like this one, but with SMT disabled.

In general, I don't trust OS CPU schedulers when it comes to oddities. AMD chips can have two or more CCDs that the scheduler has to balance properly, dividing up threads from a single application, and Intel now has big.LITTLE with the P- and E-cores the scheduler needs to balance properly. Sometimes you win, sometimes you lose. AMD likely has similar numbers, with some games doing better and some worse if you disable a CCD.

People love E-cores because they can do background tasks or crank out some very parallel threaded work. What I say to that: remove those E-cores and just give me more P-cores. Intel could have easily given us 12 P-cores on a smaller die, or gone slightly larger and given us 14 P-cores. Or even 12 P-cores and still fit in, say, 4 E-cores. It makes me feel like this is primarily for Intel to be able to compete with AMD on core count, where with the 13th generation Intel has the upper hand.

Efficiency is important, but I don't think this P+E layout is really delivering here. It doesn't seem to be giving the 12th- or 13th-gen chips an edge in any way. Maybe the E-cores are just helping to keep Intel's power numbers from exploding, compared to what they'd be with P-cores only?

But also, things don't make a lot of sense. Look at Dota 2: at 1080p and even 4K it is 7% better with the E-cores enabled. Why? On my 3700X, dota2.exe uses 2-8% CPU in a bot match running around, and that's at 4K getting over 200 FPS. So you wouldn't think that enabling E-cores would matter at all, even with a bunch of other stuff going on taking up CPU cycles.
Are there any behavior changes in the P-cores with the E-cores disabled or enabled? With the E-cores disabled, can the P-cores sustain higher clocks? Is there more latency to cache and/or memory if the E-cores are disabled? Is it the CPU scheduler doing weird stuff?
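For the "duke out 8P vs 16E" idea above, here's a rough psutil sketch that times the same work with the current process pinned to the assumed P-core threads versus the assumed E-core threads. The 13900K mapping (logical 0-15 = P-cores with HT, 16-31 = E-cores) is an assumption, and the workload is single-threaded, so it compares per-core speed rather than aggregate 8P-vs-16E throughput; a real comparison would run the actual game or a multithreaded benchmark.

```python
# Sketch: time a CPU-bound loop pinned to assumed P-core vs E-core threads.
import time
import psutil

P_THREADS = list(range(0, 16))    # assumed: 8 P-cores with HT
E_THREADS = list(range(16, 32))   # assumed: 16 E-cores

def busy_work(n=5_000_000):
    s = 0
    for i in range(n):
        s += i * i
    return s

me = psutil.Process()
for label, mask in (("P-cores", P_THREADS), ("E-cores", E_THREADS)):
    me.cpu_affinity(mask)         # may need admin/root rights
    t0 = time.perf_counter()
    busy_work()
    print(f"{label}: {time.perf_counter() - t0:.2f} s")
```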
 