| System Name | My second and third PCs are Intel + Nvidia |
|---|---|
| Processor | AMD Ryzen 7 7800X3D @ 45 W TDP Eco Mode |
| Motherboard | MSi Pro B650M-A Wifi |
| Cooling | be quiet! Shadow Rock LP |
| Memory | 2x 24 GB Corsair Vengeance DDR5-4800 |
| Video Card(s) | PowerColor Reaper Radeon RX 9070 XT |
| Storage | 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda |
| Display(s) | Dell S3422DWG 34" 1440 UW 144 Hz |
| Case | Corsair Crystal 280X |
| Audio Device(s) | Logitech Z333 2.1 speakers, AKG Y50 headphones |
| Power Supply | 750 W Seasonic Prime GX |
| Mouse | Logitech MX Master 2S |
| Keyboard | Logitech G413 SE |
| Software | Bazzite (Fedora Linux) KDE Plasma |
> That's the thing - all cores have the same amount of cache. The only thing that differs is the density of the circuits and the resulting difference in clock speed. The scheduler only needs to know which cores are faster, which it already does, since preferred cores were introduced with Intel's 11th gen and Zen 3.

100% agreed.
They used higher-density cache (and less of it), which is something the OS doesn't know or care about, so all of those core types appear the same to it.
The only thing needed is what the chipset driver already does, plus a way to push games onto the cores with more cache when they're available.
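For anyone curious what that could look like, here's a rough Linux-side sketch (not how the chipset driver actually does it, just an illustration): the kernel already exposes each core's L3 size under sysfs, so a userspace script can group cores by reported L3 and pin a game's process to the group with the most cache. The sysfs paths and `os.sched_setaffinity` call are Linux-specific, and the game PID passed on the command line is just an example.

```python
#!/usr/bin/env python3
"""Rough sketch: pin a process to the cores that report the largest L3 cache.

Linux-only illustration, not the actual chipset-driver logic. It reads the
per-CPU cache info the kernel exposes in sysfs and uses sched_setaffinity.
"""
import glob
import os
import re
import sys
from collections import defaultdict


def l3_size_kb(cpu_dir: str) -> int:
    """Return the L3 size (in KB) reported for one logical CPU, or 0 if none."""
    for index in glob.glob(os.path.join(cpu_dir, "cache", "index*")):
        try:
            with open(os.path.join(index, "level")) as f:
                if f.read().strip() != "3":
                    continue  # not the L3 entry
            with open(os.path.join(index, "size")) as f:
                return int(f.read().strip().rstrip("K"))
        except OSError:
            continue
    return 0


def cores_by_l3() -> dict[int, set[int]]:
    """Group logical CPU numbers by the L3 size they report."""
    groups: dict[int, set[int]] = defaultdict(set)
    for cpu_dir in glob.glob("/sys/devices/system/cpu/cpu[0-9]*"):
        cpu = int(re.search(r"cpu(\d+)$", cpu_dir).group(1))
        groups[l3_size_kb(cpu_dir)].add(cpu)
    return groups


if __name__ == "__main__":
    pid = int(sys.argv[1])            # PID of the game to pin (example argument)
    groups = cores_by_l3()
    best = groups[max(groups)]        # cores sharing the largest reported L3
    os.sched_setaffinity(pid, best)   # restrict the process to those cores
    print(f"Pinned PID {pid} to CPUs {sorted(best)} (L3 = {max(groups)} KB)")
```

On a single-CCD chip like the 7800X3D this just returns one group; the interesting case is the dual-CCD X3D parts, where only one CCD carries the stacked cache.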