
Intel Core i3-14100 "Raptor Lake Refresh" Listed for $150

Local brick and mortar sells the i5-12400 (6c/12t) for the exact same price as the 14100, so it's a pointless chip. The 14100 would need to be like $100 for the F model and in the $110-115 range for the non-F.
 
13th and 14th gen i3s are both DOA. The i3-12100 series is just cheaper in the first place and is also overclockable if you have a specific motherboard with a specific BIOS (13th and 14th gen non-Ks are fully locked out of that).

If you're desperate to spend more on your CPU, just pick an i5-12400.
 
They can execute threads fast enough that you don't need the additional cores.

Cores are just extenders of computational power: when a core can only go so fast, the only way to increase throughput is to add another core. If you have a core that's fast enough, you don't need the additional cores. The Alder Lake/Raptor Lake IPC is very strong and they're also clocked relatively high, so that's why the 4 cores can hang with the 6-core chips.
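To put rough numbers on that argument, here's a toy sketch; the per-core figures below are made up purely for illustration, not real benchmark data:

Code:
# Toy numbers, not benchmarks: total throughput ~ cores x per-core speed,
# as long as the workload can actually use all of those cores.
def throughput(cores, per_core_speed):
    return cores * per_core_speed

old_six_core = throughput(6, 1.00)   # hypothetical older 6-core at baseline speed
new_quad     = throughput(4, 1.45)   # hypothetical newer 4-core, ~45% faster per core

print(old_six_core, new_quad)        # 6.0 vs 5.8 -> same ballpark despite 2 fewer cores

In practice games rarely scale perfectly across cores, which is part of why a fast quad can keep up in averages but gives up some ground in 1% lows once every thread is busy.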
Well said, bro. I have a 12100F paired with an RTX 4070 (Asus Dual OC) in my PC, and it can run every game in my Steam library perfectly on a 2560x1440 (100 Hz) monitor. I was kinda confused by the recommended specs of many newly released games, since most of them call for at least a 6-core CPU (like Alan Wake 2, The Last of Us, etc.).

Not a noticeable upgrade, just... meh... :kookoo:
 
Well said, bro. I have a 12100F paired with an RTX 4070 (Asus Dual OC) in my PC, and it can run every game in my Steam library perfectly on a 2560x1440 (100 Hz) monitor. I was kinda confused by the recommended specs of many newly released games, since most of them call for at least a 6-core CPU (like Alan Wake 2, The Last of Us, etc.).

Not a noticeable upgrade, just... meh... :kookoo:
Wow, that is great proof of how sufficient the quad core is, and the upcoming 14100 looks very potent. The single- and multi-thread performance increases over its predecessor, the 13100, look like Intel really pushed the limits (in a responsible way by Intel standards: stock, non-overclocked, preserving longevity. I'm still running a 3770 here from 2012, never changed the thermal paste, playing Darktide etc.).

Local brick and mortar sells the i5-12400 (6c/12t) for the exact same price as the 14100, so it's a pointless chip. The 14100 would need to be like $100 for the F model and in the $110-115 range for the non-F.

I can understand that the 12400 is widely celebrated as being a bread-and-butter CPU, letting you field 6 cores with low power consumption,
but as a hybrid office/gaming user with the PC on 12+ hours a day, the 12400 has too high an idle power consumption, especially vs the insanely low 13100, which is like 3 or 4 watts.
 
the 12400 has too high an idle power consumption, especially vs the insanely low 13100, which is like 3 or 4 watts
Excuse me?
(attached screenshot: 1703524738161.png)
 
I can understand that the 12400 is widely celebrated as being a bread-and-butter CPU, letting you field 6 cores with low power consumption,
but as a hybrid office/gaming user with the PC on 12+ hours a day, the 12400 has too high an idle power consumption, especially vs the insanely low 13100, which is like 3 or 4 watts.

Are you using those on the same motherboard? Because I see very different idle usage on similar and sometimes the same Intel CPUs depending on motherboard settings and other weird things. I don't have anything as new as Alder Lake but I'll bet that with identical settings and Mobo, the 12400 and 13100 should both have very low idle power.

I've seen the same Core i5-8400 use anywhere from 2 W to 13 W at idle, depending on the motherboard and settings.

Edit: What @Beginner Micro Device posted.
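If anyone wants to sanity-check their own idle figure instead of trusting random numbers, here's a minimal sketch for Linux that reads the RAPL energy counter over a 10-second idle window. The sysfs path is an assumption (it can differ per system), it needs root, and it only covers CPU package power, so it won't match wall-socket readings:

Code:
# Rough idle package-power check on Linux via the RAPL powercap counter.
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"   # assumed path to package-0 energy

def read_uj():
    with open(RAPL) as f:
        return int(f.read())

start = read_uj()
time.sleep(10)                       # leave the system idle for 10 seconds
end = read_uj()

# microjoules -> joules, averaged over 10 s (counter wrap-around ignored for brevity)
watts = (end - start) / 1e6 / 10
print(f"average package power: {watts:.1f} W")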
 
Interesting, that is low... It changes my mind about the 12400 (I would probably choose the non-F), but still, when comparing the 12400 with the 12100/13100, where does the 12400 really shine?
A bit better 1% lows? Some gaming examples would be appreciated.

Are you using those on the same motherboard? Because I see very different idle usage on similar and sometimes the same Intel CPUs depending on motherboard settings and other weird things. I don't have anything as new as Alder Lake but I'll bet that with identical settings and Mobo, the 12400 and 13100 should both have very low idle power.

I've seen the same Core i5-8400 use anywhere from 2 W to 13 W at idle, depending on the motherboard and settings.

Edit: What @Beginner Micro Device posted.
I found the values online, and there is barely any info on Google about the 12400's idle power consumption. That is why I thought it was high, since some Reddit user mentioned something between 15-20 watts.
 
where does the 12400 really shine?
A bit better 1% lows? Some gaming examples would be appreciated.
I don't play anything more relevant than Cyberpunk 2077, and if you're a 60 FPS gamer you won't notice any difference whatsoever, but the 12400 will make all the sense in the world if you're aiming at 100+ FPS gaming. I'll bench my 12400F at 6C12T and at 4C8T with uncapped FPS so I can get exact numbers within the hour.

Competitive games like Counter-Strike and Fortnite also seriously benefit from a couple of additional cores. In the AAA gaming world, you're realistically bottlenecked by your GPU more than anything else, so 12100 vs 12400 is almost a draw, but the 12400 definitely has more stability and less difference between average and 1% lows.
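To make the "average vs 1% lows" part concrete, here's a minimal sketch that computes both from a frame-time CSV. The MsBetweenPresents column name and the capture.csv filename are assumptions based on PresentMon-style logs, and note there are a couple of slightly different conventions for what counts as a "1% low":

Code:
# Average FPS and "1% low" FPS from a frame-time log.
import csv
import statistics

def fps_stats(path):
    with open(path, newline="") as f:
        frametimes_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

    avg_fps = 1000 / statistics.mean(frametimes_ms)

    # one common convention: average FPS of the slowest 1% of frames
    slowest = sorted(frametimes_ms, reverse=True)
    slowest_1pct = slowest[:max(1, len(slowest) // 100)]
    low_1pct_fps = 1000 / statistics.mean(slowest_1pct)

    return avg_fps, low_1pct_fps

avg, low = fps_stats("capture.csv")
print(f"avg {avg:.1f} FPS / 1% low {low:.1f} FPS")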
 
I'd say more than four cores will help a lot with min FPS, not just above 60 FPS gaming. I know they're fast cores, but it's still a quad core in 2023.
 
I don't play anything more relevant than Cyberpunk 2077, and if you're a 60 FPS gamer you won't notice any difference whatsoever, but the 12400 will make all the sense in the world if you're aiming at 100+ FPS gaming. I'll bench my 12400F at 6C12T and at 4C8T with uncapped FPS so I can get exact numbers within the hour.
Would 75 FPS be a higher incentive to take a 12400 over a 14100? (The 14100 is an insane uplift over the 13100.)

I'd say more than four cores will help a lot with min FPS, not just above 60 FPS gaming. I know they're fast cores, but it's still a quad core in 2023.
On YouTube there are plenty of videos with FPS counters and a broad variety of games tested; I can't see a significant reason, even for 75 FPS, to take a 12400 in a new build over a 13100 or 14100.
I'm surprised myself that just a 4-core is so blazingly fast.
 
The 14100 is an insane uplift over the 13100.
Insanely low, I'd say. +200 MHz is all it provides over its predecessor. The 10100 over the 9100 is what could be called insane (4C4T → 4C8T, +100 MHz).
 
Insanely low, I'd say. +200 MHz is all it provides over its predecessor. The 10100 over the 9100 is what could be called insane (4C4T → 4C8T, +100 MHz).
Check the single- and multi-thread benchmarks; they are not that bad... putting the 14100 even over the 13400 in terms of single thread, etc.
 
Some gaming examples would be appreciated.
Decided to have a couple 3-minute sessions.

RX 6700 XT as a GPU, 1920x1080 with FSR: Performance to minimise GPU bottleneck.

As you can see, the 6C12T mode made my GPU work a little bit harder. It's hard to notice, but the median GPU load was a tad higher (I didn't record GPU utilisation, but you can get it through the power consumption graph). The 1% lows were also better in the 6C12T mode, and despite massive screen tearing due to vsync being off, I was able to tell the difference between tearing and FPS drops substantially below 60 FPS, which were not present in the 6C12T mode.
 

Attachments

  • 6cores12threads.png (2.8 MB)
  • 4cores8threads.png (2.5 MB)
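One way to approximate a 4C8T config like the one tested above, without a BIOS trip, is to restrict the game's CPU affinity to 8 logical processors. A sketch with psutil, where the process name and the CPU numbering are assumptions (and the full L3 cache stays shared, so it isn't a perfect stand-in for a real quad core or a 13100):

Code:
# Pin a running game to 8 logical processors to roughly emulate 4C8T.
# "Cyberpunk2077.exe" and the assumption that CPUs 0-7 map to the first
# 4 physical cores + HT are placeholders -- check your own topology first.
import psutil

TARGET = "Cyberpunk2077.exe"        # hypothetical process name
ALLOWED_CPUS = list(range(8))       # logical CPUs 0..7

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.cpu_affinity(ALLOWED_CPUS)   # restrict the game to those CPUs
        print(f"pinned PID {proc.pid} to CPUs {ALLOWED_CPUS}")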
Decided to have a couple 3-minute sessions.

RX 6700 XT as a GPU, 1920x1080 with FSR: Performance to minimise GPU bottleneck.

As you can see, the 6C12T mode made my GPU work a little bit harder. It's hard to notice, but the median GPU load was a tad higher (I didn't record GPU utilisation, but you can get it through the power consumption graph). The 1% lows were also better in the 6C12T mode, and despite massive screen tearing due to vsync being off, I was able to tell the difference between tearing and FPS drops substantially below 60 FPS, which were not present in the 6C12T mode.
Appreciate the work done, but a 13100 has different 4-core/8-thread output than a 12400 cut down to 4C8T in the motherboard.
The 13100 has a higher boost clock; that 100 MHz makes a difference.
 
Appreciate the work done, but a 13100 has different 4-core/8-thread output than a 12400 cut down to 4C8T in the motherboard.
The 13100 has a higher boost clock; that 100 MHz makes a difference.
And the 12400 still has 1.5 times the L3 cache, which matters way more than a couple hundred MHz in this particular game.

Oh, by the way, the 4-core turbo ratios are equal on the 13100 and the 12400; both boost to 4200 MHz.
 
Fake or not, it provides us with insufficient information. We need real gameplay footage with metrics, not just final numbers, which could have been produced in either an orthodox or an unorthodox way.
OK, but I must admit the 12400 looks more potent than expected...
My experience with Intel chips has been so good that I would not need to replace one after 5 years; as such, the 12400 might win out over my thoughts of getting a 14100, depending on the pricing then. Even a 12100 doesn't bother me if it just manages 60 FPS and occasionally 1440p at 75 FPS.
 
OK, but I must admit the 12400 looks more potent than expected...
My experience with Intel chips has been so good that I would not need to replace one after 5 years; as such, the 12400 might win out over my thoughts of getting a 14100, depending on the pricing then. Even a 12100 doesn't bother me if it just manages 60 FPS and occasionally 1440p at 75 FPS.

I had two PCs, one based on an i5-7600K and another one sporting an i5-8400: 4.6 GHz 4C4T vs 3.8 GHz 6C6T, basically the same architecture.

Even then, it was noticeable how much more stable the six-core model was. Now, the 7600K is just a funny joke and the 8400 can still run most (but not all) AAA projects smoothly. As a user who upgrades no more often than twice a decade, you should completely ignore lower-tier CPUs, as this backfires hard: Windows builds become heavier, browsers tax your hardware harder, antimalware runs more and more inefficiently, and bloatware... don't even get me started.

Overkill CPUs all the way. I'd pick something like a 12700/13600K if I were you. That way you can be 100% sure it won't need to be upgraded any time soon.
 
I had two PCs, one based on an i5-7600K and another one sporting an i5-8400: 4.6 GHz 4C4T vs 3.8 GHz 6C6T, basically the same architecture.

Even then, it was noticeable how much more stable the six-core model was. Now, the 7600K is just a funny joke and the 8400 can still run most (but not all) AAA projects smoothly. As a user who upgrades no more often than twice a decade, you should completely ignore lower-tier CPUs, as this backfires hard: Windows builds become heavier, browsers tax your hardware harder, antimalware runs more and more inefficiently, and bloatware... don't even get me started.

Overkill CPUs all the way. I'd pick something like a 12700/13600K if I were you. That way you can be 100% sure it won't need to be upgraded any time soon.

Agreed, but the power consumption is too high, and I don't mind if, say, around 2030 the CPU isn't so good anymore...
As such, the models I'm considering are: 13100/14100/12400 (thanks to you)/13400/14400; anything more is too high a TDP.
Running a 3770 since 2012 until now shows me this will be more than enough; it's up to the user to keep a system lean and mean and to clear it of malware etc. from time to time.
AMD is in instantly for me when they release a reasonably low-idle CPU which runs stable off the bat; then I'll give them a go, no problem.
 
Agreed, but the power consumption is too high, and I don't mind if, say, around 2030 the CPU isn't so good anymore...
As such, the models I'm considering are: 13100/14100/12400 (thanks to you)/13400/14400; anything more is too high a TDP.
Running a 3770 since 2012 until now shows me this will be more than enough; it's up to the user to keep a system lean and mean and to clear it of malware etc. from time to time.
AMD is in instantly for me when they release a reasonably low-idle CPU which runs stable off the bat; then I'll give them a go, no problem.
You can configure power consumption all you please; no one is stopping you. Just spend a couple of minutes in the BIOS and you're golden. Undervolting and underclocking guides are widely available; it's almost a zero-IQ hassle.

Idle power consumption is <10 W for all next-gen CPUs if the motherboard works correctly anyway.
Gaming power consumption can easily be lowered by enabling vsync and undervolting.
Productivity power consumption, too. It also doesn't matter that much if you get paid for your TFLOPS xD
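To make the "couple of minutes in the BIOS" part a bit more concrete for the Linux crowd: the long/short-term power limits can also be capped at runtime through the RAPL powercap interface. A sketch; the sysfs path and the 65 W / 90 W figures are illustrative assumptions, not recommendations, it needs root, and the change does not persist across reboots:

Code:
# Cap PL1/PL2 at runtime through the powercap sysfs (Linux, root required).
BASE = "/sys/class/powercap/intel-rapl:0"

def set_limit(constraint, watts):
    with open(f"{BASE}/constraint_{constraint}_power_limit_uw", "w") as f:
        f.write(str(watts * 1_000_000))   # the interface expects microwatts

set_limit(0, 65)   # constraint 0 = long-term limit (PL1), example value
set_limit(1, 90)   # constraint 1 = short-term limit (PL2), example value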
 
You can configure power consumption all you please; no one is stopping you. Just spend a couple of minutes in the BIOS and you're golden. Undervolting and underclocking guides are widely available; it's almost a zero-IQ hassle.

Idle power consumption is <10 W for all next-gen CPUs if the motherboard works correctly anyway.
Gaming power consumption can easily be lowered by enabling vsync and undervolting.
Productivity power consumption, too. It also doesn't matter that much if you get paid for your TFLOPS xD

OK, I see it's a tiny bit more flexible; carrying over the same number from the 3770 to e.g. a 13700 has its charm.
 
OK, I see it's a tiny bit more flexible; carrying over the same number from the 3770 to e.g. a 13700 has its charm.
Sorry, forgot to mention.

During the undervolting session you must take into account that every CPU is unique. One CPU can be brought down, say, from 220 W to 150 W at 100% performance, but another, despite being de jure identical, will only come down to 170 W without losing any performance whatsoever. This is called the silicon lottery.

At some point during undervolting, you will notice your system is no longer stable, or it even refuses to boot. It doesn't mean you damaged anything; your system is fine, it's just configured incorrectly. Reset your BIOS settings, try something less adventurous, and repeat if necessary. When you've found a stable number that doesn't crash in any stress test/benchmark/game/etc., just give your CPU another +10 mV on top of what you gave it already, to be 200% sure it won't BSoD. Also check benchmark results: unstable undervolting can result in lower performance even when it doesn't crash, for some reason.
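The procedure described above, written out as a loop, purely for illustration; apply_offset and run_stress_test are hypothetical placeholders for whatever undervolting tool and stress test you actually use:

Code:
# "Step down, stress test, back off" routine for finding a stable undervolt.
def apply_offset(mv):
    """Placeholder: set the core voltage offset in mV (negative = undervolt)."""
    raise NotImplementedError

def run_stress_test():
    """Placeholder: run your stress test / benchmark, return True if stable."""
    raise NotImplementedError

offset = 0
step = -10                      # try 10 mV more undervolt each round
while True:
    apply_offset(offset + step)
    if run_stress_test():
        offset += step          # still stable, keep going
    else:
        break                   # first unstable step found, stop here

offset += 10                    # give back +10 mV of safety margin, as advised above
apply_offset(offset)
print(f"settled on a {offset} mV offset")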
 