Tuesday, July 16th 2024

Intel Planning P-core Only "Bartlett" LGA1700 Processor for 2025
In a surprising development, Intel plans to extend the longevity of its Socket LGA1700 platform even as the newer LGA1851 platform, led by the Core Ultra 200 "Arrow Lake," remains on track for a late-Q3/early-Q4 2024 debut. This is according to a sensational leak by Jaykihn. Intel plans to do this with brand-new silicon for LGA1700, codenamed "Bartlett." This should particularly interest gamers for what's on offer: imagine the "Raptor Lake-S" die, but with four additional P-cores replacing the four E-core clusters, making a 12-core pure P-core processor. That's "Bartlett." At this point we're not sure which P-core is in use: whether it's the current "Raptor Cove," or whether Intel will attempt to backport a variant of "Lion Cove" to LGA1700.
This wouldn't be the first pure P-core client processor from Intel after its pivot to heterogeneous multicore: the "Alder Lake" H0 die has six "Golden Cove" P-cores and lacks any E-core clusters. Intel is planning to launch an entire new "generation" of processor SKUs for LGA1700 that uses Intel's newer client processor nomenclature, the Core 200-series, but without the "Ultra" brand extension. There will be SKUs with the Core 3, Core 5, Core 7, and Core 9 brand extensions. Some of these will be Hybrid, based on the rehashed "Raptor Lake-S" 8P+16E silicon or the "Alder Lake-S" 8P+8E; but "Bartlett" will be distinctly branded within the series, probably using a letter next to the numerical portion of the processor model number. There will not be any Core 3 series chips based on "Bartlett," but there will be Core 5, Core 7, and Core 9. The Core 5 "Bartlett" series will feature an 8-core configuration: 8 P-cores and no E-cores. The Core 7 "Bartlett" will be 10-core with no E-cores. The Core 9 "Bartlett" will draw the most attention, being 12-core. If Intel is using "Raptor Cove" P-cores, these should be 8-core/16-thread, 10-core/20-thread, and 12-core/24-thread, respectively. Depending on whether they are K or non-K SKUs, these chips will feature a processor base power of 125 W, 65 W, or even 45 W.
Intel is planning to launch these non-Ultra Core Socket LGA1700 processors in Q1-2025, but the "Bartlett" silicon won't arrive before Q3-2025.
Source:
Jaykihn (Twitter)
151 Comments on Intel Planning P-core Only "Bartlett" LGA1700 Processor for 2025
It's weird, given Once Human's minimum requirement is only an i5-4460 and it recommends an i7-7700, both of which are only 4-core parts (the 7700 with better IPC and HT). So it really struggles with 8 cores and HT??
What about other games? Any need or big benefit from more than 8 cores, even at 4K?
I have heard so many conflicting stories on this.
But many gamers do not want hybrid or dual-CCD setups, so they stick to 8 P-core-only models with SMT/HT (7800X3D, 7700X, 5800X3D, or Intel 12th to 14th Gen with E-cores disabled), and if a 12 P-core part came about, many gamers who want no scheduling hassles, heterogeneous cores, or dual CCDs would jump on it, with a game like that being an example of wanting more than 8 cores.
But I have not heard anything since about this Bartlett Lake, so I'm not sure how much legs this rumor has.
Still, if Intel releases a 12P CPU with all AVX-512 extensions enabled, it'll make for a fine Core non-Ultra Series 2 processor line.
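For anyone who would rather gate their code paths on it than assume it's there, a quick runtime check is enough. This is just a generic GCC/Clang sketch, nothing specific to any rumored SKU:

```c
/* Minimal sketch: detect AVX-512 Foundation support at runtime with the
 * GCC/Clang CPU builtins, and fall back to an AVX2/scalar path otherwise. */
#include <stdio.h>

int main(void)
{
    __builtin_cpu_init();  /* initialize CPU feature detection */

    if (__builtin_cpu_supports("avx512f"))
        puts("AVX-512F present: 512-bit code path available");
    else
        puts("No AVX-512F: use the AVX2/scalar fallback");

    return 0;
}
```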
Or maybe it's aimed at the small-server segment that wants a ring bus and AVX-512, then packaged as an enthusiast LGA1700 chip as a beta test run for the niche of us gamers who want more than 8 P-cores, all on one die with a homogeneous arch, and if it works well, they produce more for the small-server and network-edge market.
Maybe they intend to use us as a beta test run, which could mean even worse problems: they milk us as we run out to buy the chip we waited so long for, then scrap it and hang up on our RMA requests when it degrades like a paper tiger and has stability issues like the current 8+16 chips do. And I am skeptical even of the microcode update.
But a new 12+0 die on a ring bus, maybe it's good. I may be willing to take that risk.
But we know most games are made for consoles with 8 cores, and the best PC gaming CPU has 8 cores. It will certainly do worse in multi-threaded performance. Still... a lot of people seem to really not like E-cores, so that alone might sell a good amount of chips.
I'm sure there are some niche use cases where it would be beneficial, otherwise why would they make it? I heard networking, but I'm not sure about that.
I like the Raptor Lake arch. The problem is the B0 8+16 stepping seems to degrade so fast and/or is not stable. The microcode update, I'm not sure if it's a real fix or a band-aid. And if it is a fix here and now, how long will they last without degrading, 6 months or a year?
And from what I have seen, you absolutely need a non-ASUS 2-DIMM board to run XMP RAM reliably, otherwise OCCT Large Dataset Variable will randomly throw WHEA errors or CPU core errors with XMP turned on on any 4-DIMM board, even with only 1DPC (2 DIMMs total).
But as in my other comment above, I have had other stability problems even on a 2-DIMM board that are not related to RAM: weeks down the road, a random WHEA error while running Cinebench, or a random BSOD or WHEA error during shader compilation, even with an undervolted, underclocked CPU.
I see you have a 12900K as your system in your profile. That is a more stable chip, but can it be tuned to match or beat a 7800X3D in gaming? You get extra cores, though only 8 E-cores, and with HT that's 24 threads total, and it does not have the degradation/stability issues 13th/14th Gen have. Benchmarks seem to suggest no, but I wonder if that is because the ring clock is crippled to 3600 MHz with E-cores on?? With E-cores off you can push the ring to close to 5 GHz stable, I think. But then E-cores off is a waste on that chip for gaming, and you might as well go 7800X3D/9800X3D if you're going to do that, which I would not.
Though I mention that because I think you can manually get the ring clock to 4200 or 4300 MHz (but I doubt any more, lol, maybe 4400 MHz if super lucky) stable on a 12900K with E-cores on, and I wonder if that would shoot gaming performance up a lot with well-tuned RAM, while also having those extra cores handy, without having to go the 13th/14th Gen route and deal with the stability/degradation fiasco.
I would like more than 8 cores, but the options for it suck. Raptor Lake would be the best bet if it were not for the degradation and stability issues, but those are there, so it's out. Maybe tuned Alder Lake???
Arrow Lake is an absolute flop.
The 7950X3D and 9950X3D both only have one CCD with 3D V-Cache, and the scheduling issues are bad: the Xbox Game Bar parks all the non-3D cores, so games that can use more threads can't, because only the 8 3D-cache cores can be used while gaming, and less important threads can't be sent off to the other CCD that way. Intel 12th to 14th Gen don't have this problem, as the Thread Director is good at the hardware level. So maybe 12th Gen, given the 13th and 14th Gen degradation/stability fiasco.
Or otherwise be stuck with only 8 cores on a 7800X3D/9800X3D and get hampered in Cities: Skylines 2 with large populations.
Sucks that the options are so limited.
And no, AMD is unfortunately not going to have 3D cache on both CCDs of the 9950X3D:
www.forbes.com/sites/antonyleather/2024/11/29/amd-revealing-ryzen-9-9950x3d-and-9900x3d-second-week-in-january/
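For what it's worth, the usual manual workaround for the dual-CCD X3D parking issue is to pin the game process to the V-cache CCD yourself. A rough sketch in C with the Windows API, assuming a 7950X3D-style layout where the V-cache CCD maps to logical CPUs 0-15; that mapping is an assumption and should be verified on your own system first:

```c
/* Rough sketch: pin a running game process to the assumed V-cache CCD
 * (logical CPUs 0-15 on a 7950X3D-style part -- verify on your own chip).
 * Usage: pin_ccd0 <pid> */
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: pin_ccd0 <pid>\n");
        return 1;
    }

    DWORD pid = (DWORD)strtoul(argv[1], NULL, 10);
    HANDLE proc = OpenProcess(PROCESS_SET_INFORMATION | PROCESS_QUERY_INFORMATION,
                              FALSE, pid);
    if (!proc) {
        fprintf(stderr, "OpenProcess failed: %lu\n", GetLastError());
        return 1;
    }

    /* Bits 0-15 -> logical CPUs 0-15, i.e. the 8 cores + SMT of CCD0. */
    DWORD_PTR vcache_ccd_mask = 0xFFFF;
    if (!SetProcessAffinityMask(proc, vcache_ccd_mask))
        fprintf(stderr, "SetProcessAffinityMask failed: %lu\n", GetLastError());

    CloseHandle(proc);
    return 0;
}
```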
Seriously, am I in upside-down land right now...? Why is everybody talking about Alder Lake and Raptor Lake beating the 7800X3D? I know in some select titles there is an advantage, but the 12900K, even if it's more stable, is definitely slower than the 14900K, by not an insignificant amount, and even the 14900K cannot beat the 7800X3D unless we are cherry-picking games.
It's about the options for more than 8 cores in a gaming rig, for now and the future (without upgrading the core system, just GPUs, as the go-to option).
AMD has the dual-CCD cross-CCD latency penalty and the weird scheduling issues, especially on dual-CCD chips where only one CCD has 3D cache.
Intel has a very good Thread Director on 12th to 14th Gen, but it's still a hybrid arch. Their hardware-level Thread Director is very good and works well, though.
So 13th and 14th Gen seem like the best choice if you want more than 8 cores and tune them right?? One problem: the degradation/stability fiasco.
So that leaves Intel 12th Gen, which does not have the degradation and stability issues. And 13th Gen was just a refined 12th Gen, so maybe 12th Gen can be configured to be almost or just as fast? If not, there are no good go-to options for more than 8 cores that are reliable and will age well.
And yeah, I know Hardware Unboxed did a video showing how some thought the 3950X would age better than the 5800X3D because it had 16 cores, and later videos showed that was false in gaming benchmarks, as the 5800X3D smokes it in today's games. However, let's not forget the 3950X is basically four 4-core CPUs glued together and has those weak-IPC Zen 2 cores. Zen 2 to Zen 3 was such a big IPC and latency improvement.
It's not unreasonable (though only time will tell) to think Intel 12th, 13th, and 14th Gen (if only they were reliable and did not degrade), and 16-core Zen 4, Zen 5, and maybe even Zen 3 parts, will age better than even their 8-core Zen 4 and Zen 5 X3D counterparts?
Of course, more than 8 homogeneous cores on a single die with at least Zen 4 X3D/Golden Cove/Raptor Cove IPC would be best, but no such option exists, and it appears none will for a long while, if ever.
That means no CPU is best all-around for games like Cities: Skylines 2 and others.
My 13700K gets better figures than what reviewers got, and not within the margin of error; typically 10-20% better. I'm assuming it's my RAM config. It could also be down to the fact that I tuned the power scheduler and have custom affinities.
The scheduler stuff that's happening affects P-cores as well: with the default setup, Windows will overload the preferred P-cores when it has a lot of threads, and they can end up saturated whilst other P-cores have very low load. I fixed it by making all my P-cores run at the same speed.
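If anyone would rather script the affinity side of that than sync clocks in the BIOS, Windows exposes which cores are P vs E through the EfficiencyClass field, so a P-core-only affinity mask can be built and applied. A rough sketch in C; the convention assumed here (E-cores report EfficiencyClass 0, P-cores a higher value on hybrid parts) is worth double-checking on your own chip:

```c
/* Rough sketch: enumerate physical cores on a hybrid CPU, collect the logical
 * CPUs whose EfficiencyClass is above the minimum (assumed to be P-cores),
 * and optionally restrict the current process to them.
 * Single processor group assumed. */
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    DWORD len = 0;
    GetLogicalProcessorInformationEx(RelationProcessorCore, NULL, &len);
    SYSTEM_LOGICAL_PROCESSOR_INFORMATION_EX *buf = malloc(len);
    if (!buf || !GetLogicalProcessorInformationEx(RelationProcessorCore, buf, &len)) {
        fprintf(stderr, "GetLogicalProcessorInformationEx failed: %lu\n", GetLastError());
        return 1;
    }

    KAFFINITY pcore_mask = 0;
    for (char *p = (char *)buf; p < (char *)buf + len; ) {
        SYSTEM_LOGICAL_PROCESSOR_INFORMATION_EX *info =
            (SYSTEM_LOGICAL_PROCESSOR_INFORMATION_EX *)p;
        /* On hybrid parts, E-cores report EfficiencyClass 0 and P-cores a
         * higher value; on homogeneous CPUs everything is 0, the mask stays
         * empty, and nothing below gets changed. */
        if (info->Processor.EfficiencyClass > 0)
            pcore_mask |= info->Processor.GroupMask[0].Mask;
        p += info->Size;
    }
    free(buf);

    printf("P-core affinity mask: 0x%llx\n", (unsigned long long)pcore_mask);
    if (pcore_mask)
        SetProcessAffinityMask(GetCurrentProcess(), (DWORD_PTR)pcore_mask);
    return 0;
}
```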