RTX 4090 & 53 Games: Core i9-13900K E-Cores Enabled vs Disabled Review

Conclusion

The Core i9-13900K "Raptor Lake" really is the best processor you can buy for gaming, if money is no object. It can actually save you a few bucks over the AMD Ryzen 9 7950X, money you could spend on a faster GPU; the price difference between the i9-13900K and 7950X could get you an RX 7900 XTX instead of an RX 7900 XT. But we're not comparing the i9-13900K to the 7950X in this article; rather, we're comparing the i9-13900K against itself, with its E-cores disabled. Technically, the top "Raptor Lake" part is a 24-core/32-thread processor. The 8 P-cores support Hyper-Threading, giving 16 threads; the 16 E-cores lack HT, contributing 16 more, for a total of 32. In one sense, disabling the E-cores halves the number of logical processors; in another, it disables two-thirds of the cores. Intel determined 8 to be the ideal number of P-cores for the client-desktop platform right now, which means the company believes 8 cores ought to be enough for gaming. Look at AMD's 12-core and 16-core Ryzen chips: they have only 6 or 8 cores per CCD, and keeping gaming workloads localized to one of the two CCDs has a positive impact on frame rates. Enough theory.
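The core-and-thread arithmetic above can be sketched in a few lines; this is purely illustrative, restating the counts given for the i9-13900K:

```python
# Illustrative thread-count arithmetic for the Core i9-13900K.
p_cores, e_cores = 8, 16

# P-cores have Hyper-Threading (2 threads each); E-cores do not.
threads_enabled = p_cores * 2 + e_cores * 1
threads_disabled = p_cores * 2  # E-cores switched off in the BIOS

print(threads_enabled)   # 32
print(threads_disabled)  # 16

# Logical processors are halved (32 -> 16), but physical cores
# drop by two-thirds (24 -> 8).
```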

We begin our comparison at 4K Ultra HD (3840x2160 pixels), the resolution the GeForce RTX 4090 was designed for. Here the bottleneck sits firmly with the GPU. With its E-cores disabled, the i9-13900K is a negligible 0.1% slower than the stock i9-13900K (E-cores enabled), but that is when averaged across all 53 games; you have to pay attention to the individual game tests. Games like Far Cry 5 and Spider-Man Remastered see 2.6-4.3% frame-rate improvements with the E-cores disabled. On the other hand, games like Dota 2, Hitman 3, and AoE IV post 2.8-7.2% gains with the E-cores enabled. Every other title has a negligible performance delta, which is why this handful of games can almost be considered outliers.

Things begin to get interesting at 1440p. The RTX 4090 is overkill for this resolution, no matter the refresh rate. Still, the 53-game average delta ends up just 0.4% in favor of the stock i9-13900K. The average is a bit misleading, though: more games post noteworthy performance differences in either configuration. With E-cores disabled, Prey gains 11.3% performance, GreedFall picks up 8.8%, and Far Cry 5 gains 6.2%. Conversely, with the E-cores enabled, Far Cry 6 improves by 10.2%, Spider-Man Remastered by 8%, and Dota 2 by 7%. But these games are too few in number to move the average, and their gains cancel each other out. The average is also dragged down by the vast number of games with under-2% deltas, which is how we end up at 0.4%.

1080p is the new 720p in the context of our TPU50 articles, as this is where the bottleneck shifts firmly to the CPU: the RTX 4090 rips through frames faster than the CPU can feed it. Even with the highest IPC and clocks on the market, averaged across all games, the two configurations of the i9-13900K differ by just 0.9% in favor of the stock configuration with the E-cores enabled. Much like at 1440p, there are outliers that make the case for each setup: Prey, Metro Exodus, GreedFall, and Far Cry 5 each post significant 8-10% performance gains with the E-cores disabled, whereas Warhammer III, Spider-Man Remastered, Far Cry 6, Dota 2, and Civilization VI prefer the E-cores left untouched. Between these two extremes, the majority of games show insignificant variation, and we end up with a stalemate between the two configurations.

Looking across all the games, we can divide them into several groups. There are games that show no difference whether the E-cores are enabled or not (AC: Valhalla, Days Gone, DOOM Eternal, Monster Hunter, Witcher 3). The next group comprises titles that always run better with the E-cores enabled (Age of Empires, Dota 2). We also have games that run better with the E-cores disabled, but only as long as they are CPU-limited (Far Cry 5, Metro Exodus). The opposite behavior is observed, too, of course: titles such as Far Cry 6 and Divinity Original Sin 2 run better with the E-cores enabled while CPU-limited, but as soon as they become GPU-limited, the difference is only marginal.

The biggest takeaway from this article is that you really shouldn't worry about the E-cores. Even in worst-case scenarios, the performance differences aren't huge: up to 10% is something you'd barely notice, even with the fastest graphics card available. On weaker GPUs you'll be even more GPU-limited, so the differences should be smaller. Of course, it's always possible to fine-tune your hybrid system, for example when you only play one game. You can enable or disable the E-cores in the motherboard BIOS setup program, or simply set the CPU affinity in Windows for your game's executable, to pin it to the P-cores if it prefers the E-cores to be disabled.

We would still recommend that the E-cores be left untouched, for the simple reason that the P-cores don't appear to benefit from the freed-up power budget by sustaining their boost clocks better, as we hypothesized in this article's introduction. Leaving the E-cores enabled also benefits games because the OS can offload low-priority background processes (antivirus, firewall, etc.), as well as the audio and networking stacks, to the E-cores, which have plenty of compute power to deal with them. We wish Intel made it a touch easier to toggle the E-cores (maybe through XTU), so gamers wouldn't have to pull up the system BIOS or set core affinity. In a perfect world, Intel would provide predefined lists of games for which the E-cores automatically get skipped, with the user having an option to override the behavior. With two successful hybrid architectures from Intel, it's inevitable that game developers will notice the value in making future games more "aware" of E-cores, and start using them more appropriately.
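As a sketch of the affinity approach: on a stock i9-13900K, Windows typically enumerates the 16 P-core logical processors (8 cores with Hyper-Threading) before the 16 E-core logical processors, so a bitmask covering logical processors 0-15 pins a program to the P-cores. The snippet below builds that mask and shows the equivalent one-off Windows launch command; the game path is a placeholder, and the enumeration order should be verified on your own system (e.g. in Task Manager):

```python
# Build a CPU-affinity bitmask covering the first 16 logical processors,
# which on a stock Core i9-13900K typically map to the 8 P-cores
# with Hyper-Threading. Illustrative sketch; verify core enumeration
# on your own machine before relying on the mask.

P_CORE_THREADS = 16  # 8 P-cores x 2 threads each

# One bit per logical processor: bits 0..15 set
p_core_mask = (1 << P_CORE_THREADS) - 1
print(hex(p_core_mask))  # 0xffff

# Equivalent one-off launch from a Windows command prompt
# (start's /affinity switch takes the mask in hex, no 0x prefix;
# the game path is a placeholder):
#   start /affinity FFFF "" "C:\Games\game.exe"
```

This achieves the same thing as ticking CPU boxes in Task Manager's "Set affinity" dialog, but without having to redo it after every launch.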