
Intel Core i9-14900K

I didn't ask you to rely on anything. You asked me why I bought a 14900K; I'm telling you because it's the fastest the way I run my CPU.
Good for you, but because most people rely on trusted reviews before they buy their CPUs, the 7800X3D is still the fastest gaming CPU for most gamers.

No one relies on what a random dude on a forum is able to achieve.

If we are supposed to care about your points here, then people should also care that I bought the 7700, because it's power efficient and has a very good price/performance ratio. Yup, it's only 11.5% slower in 1440p gaming than the 14900K, at a whopping 45% lower price.

That's what I will call priceless.
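For what it's worth, the price/performance claim above is easy to sanity-check with quick arithmetic. A sketch in Python, using TPU's 88.5% relative 1440p result for the 7700; the street prices are assumptions for illustration (roughly $329 for the 7700 and $589 for the 14900K), not figures from this thread:

```python
# Rough perf-per-dollar check. Prices are assumed street prices, not from TPU.
perf_7700 = 88.5       # relative 1440p gaming performance, 14900K = 100 (TPU chart)
perf_14900k = 100.0
price_7700 = 329.0     # USD, assumption
price_14900k = 589.0   # USD, assumption

slower_pct = (1 - perf_7700 / perf_14900k) * 100     # how much slower the 7700 is
cheaper_pct = (1 - price_7700 / price_14900k) * 100  # how much cheaper the 7700 is
value_7700 = perf_7700 / price_7700                  # performance points per dollar
value_14900k = perf_14900k / price_14900k

print(f"7700: {slower_pct:.1f}% slower, {cheaper_pct:.1f}% cheaper")
print(f"perf/$: 7700 = {value_7700:.3f}, 14900K = {value_14900k:.3f}")
```

Under those assumed prices, the 7700 comes out around 44% cheaper while delivering noticeably more performance per dollar.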
 
The 7700 is actually very expensive; it's only 10% faster than an i3-12100 and costs 300% more. Especially with the GPU you have, it will perform identically in games. You paid three times the money for 0% more performance.
 
[Chart: relative-performance-games-2560-1440.png]

The 12100 is at 68.9% there while the 7700 is at 88.5%, with the 14900K at 100%. That's almost a 20-point difference between the 7700 and the 12100. So where do you get your 10% from?
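Worth noting: the gap in that chart can be read either in percentage points or as a relative speedup, and the two numbers differ. A quick sketch using the figures quoted above:

```python
# Relative-performance figures from TPU's 1440p chart (14900K = 100).
p_12100 = 68.9
p_7700 = 88.5

point_gap = p_7700 - p_12100                  # difference in percentage points
relative_gain = (p_7700 / p_12100 - 1) * 100  # how much faster the 7700 is, relatively

print(f"{point_gap:.1f} percentage points")
print(f"7700 is {relative_gain:.1f}% faster than the 12100")
```

The chart gap is about 19.6 points, which works out to the 7700 being roughly 28% faster than the 12100 in relative terms, so neither reading supports a 10% figure.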

And as for rendering stuff like Cinebench R23, the 12100 is so slow that it's not even listed in the chart here.

[Chart: cinebench-multi.png]

So yeah, the 12100 is dirt cheap for a good reason.

EDIT: I have the CPU I have so I can upgrade to a new and much more powerful GPU later without having to upgrade the CPU as well.
 
That's with a 4090. You have a 3060 Ti. There is 0% difference in gaming with the 3060 Ti. You paid 300% more for 0% more performance. Very good choice, value for money.

If you think your 7700 is just 11% slower than a 14900K, then go ahead, post your numbers in TLOU :D
 
It's still a huge difference with a 4090.

Let me point it out again, in case you missed it:

I have the CPU I have so I can upgrade to a new and much more powerful GPU later without having to upgrade the CPU as well.

TPU shows it is 11.5% slower in gaming at 1440p.
 
Then post your numbers; let's see them.
Why should I post my numbers just to please you and prove the point?

TPU has already shown how powerful both the 7700 and the 7800X3D are in games with the best GPU possible. And this is about how powerful the CPUs themselves are when maxed out, not about how powerful MY computer is with my particular hardware combination.
 
Because we both know the difference isn't 11% but 30+%, but whatever, you do you.
 
I'm not here to please you or to follow rules based on how your computer is set up and how you overclock your CPU. I'm here to make a valid point that TPU has already proven.

With an RTX 4090, the 7700 is only 11.5% slower than the 14900K in games at 1440p, and the 7800X3D is the fastest CPU for gaming. What the hardware combination in my computer achieves is of no interest. What matters here is how the Ryzen 7 7700 and the Ryzen 7 7800X3D perform in comparison to the Core i9-14900K under the most optimal settings.

TPU has gathered those stats in the best possible way, by fully utilizing the CPUs, so we know how powerful they actually are.

So why should we care about skewed stats that can't be relied on, from a random overclocker with random overclocking settings?
 
Sorry if this was covered, but why was there such a huge difference between the manual and the auto overclock? 5.5 GHz manually "and it was too hot", but the AI that doesn't touch voltages was hitting 5.9 GHz with 8 active cores (i.e. all the P-cores). @W1zzard
 
I only do a manual all-core OC with a voltage offset, all cores at the same frequency, which is definitely suboptimal in many cases.

Good question though. I think at that config it will run into the power limit before it can reach 5.9 GHz on all cores.
 
The data for the 7950X in the 14900K temperature chart is very wrong.
 

Attachments

  • Screenshot (30).png
  • Screenshot (31).png
Core i9-14900K is Intel's new flagship with clock speeds of up to 6 GHz. It's actually clocked even higher than the 13900KS, thanks to an extra 100 MHz when more than two cores are active. Our review confirms that Raptor Lake Refresh is amazing for both applications and gaming, if you can live with the power consumption.
Out of curiosity, at what resolution did you run the gaming power consumption chart? Was it 1080p, 1440p or 4K? Would CPUs use less power at 4K, as the GPU does more of the work?
 
At 1080p, which typically produces the highest CPU power draw, but is still realistic, unlike 720p.
 
My 14900K was unstable from the beginning. This year, after 3-4 months with the new motherboard (before a Z690, now a Z790 refresh), I even occasionally get CPU parity errors in the Windows event log.

And according to online news, reports of crashes and the like are increasing, and it also concerns 13th gen.
 
So you've ruled out the motherboard by trying a different one. Have you ruled out the RAM yet?

If so, just RMA it. Intel's warranty is there to replace defective products, and the binning process isn't 100% perfect, so you might simply have a defective CPU that can't actually sustain its binned speeds.
 
RAM: Before, I used DDR4. Even at 2xxx speeds, the Intel XTU stress tests failed.
Now DDR5. The crashes aren't so frequent, maybe once every 2-3 months. One I can remember was with a freshly installed game, while it was compiling shaders (I guess); that's when it crashed.
 
@firejohn CPUs can come out of the factory bad, as rare as it may be. I suggest starting a new thread about your problems, to narrow down whether it's actually the CPU or not.
 
The 5800X3D gaming average power consumption must be incorrect. There's no way this CPU averages 52 W across those 13 games.
 