
Intel Readies Core 5 120F Socket LGA1700 Processor

I haven't finished reading up on Bartlett Lake yet but from what I have read that is not entirely true. It has HDMI 2.1 off the CPU, which is a characteristic of Raptor Lake not Alder Lake. Also certified for DDR5-5600 which is RPL not ADL.

P-cores are all +200 MHz vs RPL.

The 12th Gen CPUs have that capability as well. Perhaps some Z690 boards don't support HDMI 2.1 output, but the iGPU should, unless this is intentionally disabled when the driver detects a certain CPU model. The UHD 770 graphics core is the same on 12th, 13th/14th gen and Bartlett CPUs. All models released thus far (all in the embedded segment) are either Alder Lake "H0" or Raptor Lake "B0" derivatives without physical changes, the H0 being the "small" 12th Gen die with no E-cores and the B0 being i9-13900K cuts.

As you're well aware, the 14th gen is a fake generation as no models feature new silicon at all, and these Core 3/7/9 re-re-releases don't seem to, either.

That's the odd thing, the specs strongly suggest the 120F is not Raptor Lake. DDR5-4800 and 18 MB of L3 cache point to a mildly bumped 12400F Alder Lake chip.
Or, worded properly, it looks like the first leaked product is a refreshed Golden Cove-only chip instead of the expected Raptor Cove.
Based on leaks, the others probably are proper Raptor chips, just not the 120F.
Kinda sad that Intel is still doing this in 2025, Raptor Lake is almost 3 years old by now.

If you consider that Raptor Cove is just a cache-bumped Golden Cove, these CPUs are actually around 5 years old ISA-wise. 12th Gen released just a little after Zen 3. BTW, it's 100% going to be an Alder H0 chip if we go by the Core 3 201E processor, which is H0...

 
That's the odd thing, the specs strongly suggest the 120F is not Raptor Lake. DDR5-4800 and 18 MB of L3 cache point to a mildly bumped 12400F Alder Lake chip.
Or, worded properly, it looks like the first leaked product is a refreshed Golden Cove-only chip instead of the expected Raptor Cove.
Based on leaks, the others probably are proper Raptor chips, just not the 120F.
Kinda sad that Intel is still doing this in 2025, Raptor Lake is almost 3 years old by now.

Yeah and I can't download the spec sheet from Intel.



If that slide is correct, it's not Bartlett Lake. The interesting thing about Bartlett Lake is that in order to get 12 P-cores they need to rework the cache design, otherwise there's too much latency. My thought is that Bartlett Lake would be RPL with a Nova Lake cache design.

As you're well aware, the 14th gen is a fake generation as no models feature new silicon at all, and these Core 3/7/9 re-re-releases don't seem to, either.

14th gen and 13th gen are the same, but 12th and 13/14 are not.

Also, the HDMI comment was about output off the CPU itself. Alder Lake does not have HDMI 2.1 off the CPU.

There aren't huge differences between 12 and 13th gen, but they are there and make a difference.

The same criticism could be made of Zen 2 / Zen 3. When you look closely, the main difference is how many cores there are per CCX. The old 3300X Zen 2 4-core was a preview of Zen 3 performance back in the day. Zen 3 is Zen 2 without the latency, because most things can run on 8 cores.

 
This is just like what Intel did with Raptor Lake Refresh vs Raptor Lake: slap a new name on it and clock it another 100 MHz higher. At least with this renamed 12400F there are a few more differences than that. And no official DDR5-5200 support just adds insult to injury.
 
I haven't finished reading up on Bartlett Lake yet but from what I have read that is not entirely true. It has HDMI 2.1 off the CPU, which is a characteristic of Raptor Lake not Alder Lake. Also certified for DDR5-5600 which is RPL not ADL.
Intel Alder Lake CPUs that had an iGPU featured UHD Graphics 770 or lower-spec versions like the 730, and this was carried over to Raptor Lake. Some Bartlett Lake CPUs, like the 201 variants such as the 201E, support up to DDR5-4800, so they are clearly Alder Lake derived. That would fit, as they are 4P/8-thread parts and the Alder Lake i5 non-K die is Performance cores only, now used just for the 12400/F.
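One way to check which silicon a given SKU actually is, rather than going by the marketing name, is to read the CPUID family/model/stepping the kernel reports. A minimal sketch, assuming Linux and Python (nothing Intel-specific here; compare the output against a known 12400F or 13400F, since Alder-derived and Raptor-derived desktop dies report different model/stepping combinations):

```python
# Minimal sketch: print CPUID family / model / stepping from /proc/cpuinfo.
# Comparing against a known Alder Lake (e.g. 12400F) or Raptor Lake (e.g.
# 13400F) part shows which die family a "new" SKU is actually built on.

def cpuid_ids():
    wanted = {"cpu family", "model", "stepping", "model name"}
    info = {}
    with open("/proc/cpuinfo") as f:
        for line in f:
            if ":" not in line:
                break  # blank line: end of the first logical CPU's block
            key, value = (part.strip() for part in line.split(":", 1))
            if key in wanted:
                info[key] = value
    return info

print(cpuid_ids())
```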
 
Kinda sad that Intel is still doing this in 2025, Raptor Lake is almost 3 years old by now.
Did you share this sentiment in the thread about the "new" 5500X3D, based on the 2020 Zen 3 architecture? As far as I can tell, the 6-core GC die is smaller than a 6-core RC die, because of less cache, no E-cores, etc. So it makes sense for budget chips to be based off the non-tweaked core.
No, the price difference is minimal anyways.
As is typical with these products you're better off getting the significantly cheaper product or the slightly more expensive one with some E-cores.
These chips mainly go to OEMs at a bulk discount, hence the $300 Dell machines sold to pensioners and offices, with a pretty capable chip for office/web use.
 
14th gen and 13th gen are the same, but 12th and 13/14 are not.

Also, the HDMI comment was about output off the CPU itself. Alder Lake does not have HDMI 2.1 off the CPU.

There aren't huge differences between 12 and 13th gen, but they are there and make a difference.

The same criticism could be made of Zen 2 / Zen 3. When you look closely, the main difference is how many cores there are per CCX. The old 3300X Zen 2 4-core was a preview of Zen 3 performance back in the day. Zen 3 is Zen 2 without the latency, because most things can run on 8 cores.

Maybe a limitation on UHD 730? AFAIK the UHD 770 on ADL/RPL/BTL is 100% identical :confused:
 
Maybe a limitation on UHD 730? AFAIK the UHD 770 on ADL/RPL/BTL is 100% identical :confused:
Correct. UHD 730/750 were the gen before ADL, Rocket Lake. ADL was UHD 710/730/770, RPL UHD 730/770. BTL is UHD 770 only. There are two BTL non-graphics units, the 201EF and 211EF.
 
I won't bother with E-Cores.

Before the 6.15.0 kernel we had many changes to core parts because of those E-cores. This is still far from finished.

I still do not know what those E-cores can and cannot do. I doubt they can execute the same code as my Ryzen 7600X. I doubt AVX-512 can be executed by those Intel processors anyway, regardless of E-core or P-core. Should I optimise my code for 2005-era processors because Intel cannot provide proper silicon?
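FWIW, rather than optimising for any one vendor's silicon, the usual approach is to dispatch on the flags the CPU actually reports. A minimal sketch of that idea, assuming Linux and Python; it only reads the standard /proc/cpuinfo flag names:

```python
# Minimal sketch: pick a code path from the feature flags the kernel reports,
# instead of hard-coding for one CPU. Reads the standard /proc/cpuinfo flags.

def cpu_flags():
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
if "avx512f" in flags:
    print("AVX-512 path (e.g. a Ryzen 7600X)")
elif "avx2" in flags:
    print("AVX2 path (both P- and E-cores on Alder/Raptor Lake support AVX2)")
else:
    print("baseline SSE2 path")
```

As far as I know, the hybrid chips expose one common flag set across both core types (AVX-512 was fused off for exactly that reason), so whichever path gets picked will run on P- and E-cores alike.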
 
I really like Intel manufacturing CPUs without low-performance cores. I hope this is the start of a new trend.

But to really take advantage of that, they would need to put more performance cores in the die space freed up by removing the low-performance cores.
 
Did you share this sentiment in the thread about the "new" 5500X3D, based on the 2020 Zen 3 architecture? As far as I can tell, the 6-core GC die is smaller than a 6-core RC die, because of less cache, no E-cores, etc. So it makes sense for budget chips to be based off the non-tweaked core.
I care a whole lot less about manufacturers releasing a lower tier chip for a lower price which is appropriately named.
The 5500X3D checks all three boxes.
The 120F checks none of these boxes.
The only thing that would somewhat redeem the 120F is an actually great MSRP; everything else still sucks.
Like $100 would be neat. But it's Intel, so it won't happen.
I'd even go out on a limb and wager that this is just a chip for OEMs to sell as a "modern" chip when it clearly is not.

And let me be clear, I'm not excusing AMD from this BS either, but they've been behaving for desktop chips lately.
 
So the interesting part here for most on LGA 1700 would be the possible 12P 0E chip.

Yeah, saw this on OCN a while ago now, so it will be interesting what this 12p can do and how much cache it's going to get.

It's great to see socket 1700 still surviving! I can't remember the last Intel socket that's had this much longevity.
 
what this 12p can do
Just a smidge more than a 13900K (in games I mean), unless for some reason this 12P-CPU has a lot of cache improvements, be it sheer size or efficiency.
and how much cache it's going to get.
This, I can't tell but my wild guess is it's gonna be 48 MB of L3. Very unlikely they'll slap more but you never know.
 
makes it abundantly clear that gamers want maxed out 1080p...

If I were to buy something like that, it had better come in a "K" version. Otherwise it's an office / budget / welfare CPU. But I need not mention this because it's been said time and time again.
 
E-cores ain't stupid, they allow much more performance per die area. The problem with them is they don't cut it for gaming.

If Intel were to market an actual gaming CPU they woulda gone for something with 12 non-HT blazing fast cores plus some 3D V-Cache equivalent. But that's a gaming-only CPU, it woulda sucked in working tasks compared to much more efficient E-core models.

That is not performance that E-cores give; that is wasted space and energy on the processor. AMD makes better processors, because they give real performance, not Cinebench scores.
 
Finally Intel has realized E-cores are very stupid, and actually so is their whole LGA 1851 platform. Hopefully it will be an overclockable processor with the Asus B660-F and will have unlockable SA voltage.

Wow, a CPU without stinky E-cores. What next, are they gonna invent a bigger L3 cache?

That is not performance that E-cores give; that is wasted space and energy on the processor. AMD makes better processors, because they give real performance, not Cinebench scores.
Most gamers never cared that much about E-cores, yet they condemned Intel without any thought, for no reason. To put it plainly, E-cores aren't bad at all, because they're needed for work processes like office and business use. If gamers really cared about it, they could go into the BIOS and just turn off the E-cores (stay on P-cores only), but it wouldn't bring any significant improvement; performance stays much the same.

Giving biased comments isn't helpful; nobody wants to respond to that kind of emotional posting. That's why I try my best to avoid spreading any misinformation.
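For what it's worth, you don't even need a trip to the BIOS for a quick A/B test; a process can be pinned to the P-cores from software. A rough sketch of that idea, assuming Linux and Python (/sys/devices/cpu_core/cpus and /sys/devices/cpu_atom/cpus are the kernel's own P-core and E-core lists on hybrid parts):

```python
# Rough sketch: pin the current process to P-core threads only, as a software
# alternative to disabling E-cores in the BIOS. Hybrid Intel on Linux only;
# on non-hybrid CPUs the cpu_core/cpu_atom sysfs nodes don't exist.
import os

def parse_cpu_list(text):
    """Expand a kernel CPU list like '0-15' or '0-7,16-23' into a set of ints."""
    cpus = set()
    for part in text.strip().split(","):
        if "-" in part:
            lo, hi = part.split("-")
            cpus.update(range(int(lo), int(hi) + 1))
        elif part:
            cpus.add(int(part))
    return cpus

try:
    with open("/sys/devices/cpu_core/cpus") as f:
        p_cores = parse_cpu_list(f.read())
    os.sched_setaffinity(0, p_cores)  # 0 = this process
    print("pinned to P-core threads:", sorted(p_cores))
except FileNotFoundError:
    print("no cpu_core node found; not a hybrid CPU (or not Linux)")
```

On Windows, the same experiment is just a Task Manager affinity change (or Process Lasso) away.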
 
That is not performance that E-cores give; that is wasted space and energy on the processor. AMD makes better processors, because they give real performance, not Cinebench scores.
This is only how it works for gaming purposes. Games don't need a million cores, they need just a handful of very fast ones. Working applications sometimes don't mess around and want ALL cores in the world. E-cores give more performance per die area, that's how i7-13700 beats the crap outta similarly priced Ryzen 9700X in these tasks. If you don't care about them it doesn't mean nobody does.

We don't need to get rid of them, we need to optimise them. Have better scheduling, lower latencies, more efficient cache. And something to prevent stupid deaths from overvoltage and overwattage.
 
I haven't finished reading up on Bartlett Lake yet but from what I have read that is not entirely true. It has HDMI 2.1 off the CPU, which is a characteristic of Raptor Lake not Alder Lake. Also certified for DDR5-5600 which is RPL not ADL.
HDMI 2.1 TMDS or FRL? TMDS is supported by both architectures' integrated Generation 12 GPU but FRL isn't. This is from Intel's datasheet, it mentions that it does. Same deal with Raptor Lake's integrated GPU. One thing that I find odd is the mention of HDMI 2.0b in these NEX product brief PDFs (1 & 2), especially with this quote here 'HDMI 2.0b integrated, HDMI 2.1 supported with LSPCON'. Seems like it implies HDMI 2.1 FRL isn't supported on all Intel Gen12-based GPUs, it only works with an LSPCON chip. Wouldn't that mean they're all the same?

I reckon the Core 5 120F is a Core i5-12400F with a 0.1 GHz increase on Turbo Boost based on the limited details that came out from that X post, so I agree with Nostras and Dro's observations. Unless... the Intel 7 Ultra/10ESF+ process was used to fab it, which would be very unlikely. Maybe this is just Bartlett Lake-S being prepared for launch, not Bartlett Lake-S 12P.
BTL is UHD 770 only.
I don't know about Bartlett Lake-S 12P, but Bartlett Lake-S isn't using the UHD Graphics 770 across all of its models. I'll use the Core 3 201E as an example - if you took a closer look at its integrated GPU's PCI device ID, you'll notice that it is the same device ID used by the Core i3-12300 (UHD Graphics 730). Both CPUs have the same stepping as well, so I doubt Bartlett Lake-S is UHD 770-only. If it really did, the PCI revision ID would have been reported to be different or the number of execution units in the integrated GPU would have remained the same at 32 across all models. This also applies to the Core 5 211E, not surprising that it uses the same GPU PCI device ID as the Core i5-13400E. This time, even the GPU's frequency is the same so there is absolutely no way it can be a UHD Graphics 770.

Intel leaves very stupid typos in their CPU database sometimes, and for that I would recommend that you don't just trust - but verify at the same time.
 
This is only how it works for gaming purposes. Games don't need a million cores, they need just a handful of very fast ones. Working applications sometimes don't mess around and want ALL cores in the world. E-cores give more performance per die area, that's how i7-13700 beats the crap outta similarly priced Ryzen 9700X in these tasks. If you don't care about them it doesn't mean nobody does.

We don't need to get rid of them, we need to optimise them. Have better scheduling, lower latencies, more efficient cache. And something to prevent stupid deaths from overvoltage and overwattage.

And to add to this, E-cores, being area-efficient, cram 4 cores into the area of one bigger core, but each of those 4 cores isn't 4x slower, either. The Gracemont cores in these CPUs are easily at least as powerful as an i7-6700K's; it's really a mountain out of a molehill situation. Scheduling is acceptable, even on Windows 10, due to hardware feedback support. The "death" situation has been addressed and continuously refined and improved; new microcode (version 12F) was recently released to further polish the vmin shift behavior on the CPUs.

HDMI 2.1 TMDS or FRL? TMDS is supported by both architectures' integrated Generation 12 GPU but FRL isn't. This is from Intel's datasheet, it mentions that it does. Same deal with Raptor Lake's integrated GPU. One thing that I find odd is the mention of HDMI 2.0b in these NEX product brief PDFs (1 & 2), especially with this quote here 'HDMI 2.0b integrated, HDMI 2.1 supported with LSPCON'. Seems like it implies HDMI 2.1 FRL isn't supported on all Intel Gen12-based GPUs, it only works with an LSPCON chip. Wouldn't that mean they're all the same?

I reckon the Core 5 120F is a Core i5-12400F with a 0.1 GHz increase on Turbo Boost based on the limited details that came out from that X post, so I agree with Nostras and Dro's observations. Unless... the Intel 7 Ultra/10ESF+ process was used to fab it, which would be very unlikely. Maybe this is just Bartlett Lake-S being prepared for launch, not Bartlett Lake-S 12P.

I don't know about Bartlett Lake-S 12P, but Bartlett Lake-S isn't using the UHD Graphics 770 across all of its models. I'll use the Core 3 201E as an example - if you took a closer look at its integrated GPU's PCI device ID, you'll notice that it is the same device ID used by the Core i3-12300 (UHD Graphics 730). Both CPUs have the same stepping as well, so I doubt Bartlett Lake-S is UHD 770-only. If it really did, the PCI revision ID would have been reported to be different or the number of execution units in the integrated GPU would have remained the same at 32 across all models. This also applies to the Core 5 211E, not surprising that it uses the same GPU PCI device ID as the Core i5-13400E. This time, even the GPU's frequency is the same so there is absolutely no way it can be a UHD Graphics 770.

Intel leaves very stupid typos in their CPU database sometimes, and for that I would recommend that you don't just trust - but verify at the same time.

That's an interesting catch, as the 201E is listed as having UHD 770 on their product specification page:


And according to the whole "formerly Bartlett Lake" list, all models are listed as having UHD 770:


But 0x4692 is UHD 730, not UHD 770 (0xA780). Man, Intel :confused:
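If anyone wants to verify this on their own machine instead of trusting ARK, the iGPU's PCI IDs can be read straight from sysfs (lspci -nn shows the same thing). A quick sketch, assuming Linux and Python; the card0 path and the two-entry lookup table, which only covers the IDs mentioned above, are illustrative:

```python
# Quick sketch: read the GPU's PCI vendor/device IDs from sysfs and compare
# against the two IDs discussed above. card0 is assumed to be the iGPU; with
# a discrete card installed the Intel iGPU may show up as card1 instead.
from pathlib import Path

KNOWN = {
    0x4692: "UHD Graphics 730 (Alder Lake-S)",
    0xA780: "UHD Graphics 770 (Raptor Lake-S)",
}

dev = Path("/sys/class/drm/card0/device")
vendor = int(dev.joinpath("vendor").read_text(), 16)
device = int(dev.joinpath("device").read_text(), 16)

if vendor == 0x8086:  # Intel
    print(f"device id 0x{device:04X}: {KNOWN.get(device, 'not in this table')}")
else:
    print("card0 is not an Intel GPU; try card1")
```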
 
The "death" situation has been addressed and continuously refined and improved, new microcode (version 12F) was recently released to further polish the vmin shift behavior on the CPUs.
I meant prevention of the upcoming generations being as silly in this department more than anything else.
The Gracemont cores in these CPUs are easily at least as powerful as an i7-6700K's
With an asterisk. In games, they're approximately at an Ivy Bridge level.
 
I wonder if it's worth waiting for that 12P Core CPU, and whether it's worth replacing my 13700KF with it.
 
Love everyone that is trashing the hybrid design of Intel CPUs and saying that E-cores are a waste of time and money. Ever heard of the game BeamNG.drive?

It's one of the best-programmed games out there and will eat up every thread and core of a CPU that it can use. It loads up all the Performance cores first, then starts using the E-cores, and the E-cores on Arrow Lake are basically at the performance of Alder Lake P-cores, but somehow that's a bad thing.

I play the above mentioned game a lot, and have built multiple systems to play it, both AMD and Intel.

Generationally, from Raptor Lake to Arrow Lake there was a large increase in performance.

I stumbled upon this when I built a Core Ultra 5 225 system (6 P-cores, 4 E-cores, 10 threads).

For some reason the 225 was giving me the performance I was getting from a 14900K (8 P-cores with HT, 16 E-cores).



I won't even get into how much better my 285K system is in that game.

Sure, the game is a bit of an outlier in performance, but it shows exactly what happens when developers know what they are doing and can take advantage of CPU architectures.

That is not performance that E-cores give; that is wasted space and energy on the processor. AMD makes better processors, because they give real performance, not Cinebench scores.
So how come this crappy welfare garbage Intel CPU with E-cores (only 4 of them, plus 6 P-cores) gives better gaming performance than a 16-core, 32-thread AMD X3D chip? And before you say "OMFG it's a laptop CPU", it's a lower-power 7950X3D.......but why is the Intel chip better with a 22-thread handicap?


That sure isn't Cinebench scores.....

People like you are why I love reading forums, all BS and no facts or data to prove anything.
 
TPU isn't really the place to go to get technical.

overclock.net is that place.

Like, this is the impact of moving a high-resolution mouse around while playing Fortnite with the scheduler set to prefer P-cores only on a 14900K:

[screenshot]


This is with scheduling across all cores (including E-cores):

[screenshot]


Another tidbit: this is only true with DX12, not DX11.
 
Too technical??? That's lame, it wasn't really that technical, but I'll check out overclock.net.
 