Intel Core i9-13900KS

Waste of money anyway, imo. Thousands of jiggahertzes don't mean anything when there's no real world performance benefit.

And that's up to the consumer to decide. Me liking or disliking a product has zero bearing on what someone else might think. I think every current-generation GPU minus the 4090 is trash due to pricing, but that doesn't make it true.
 
How much does a 10-year-old Haswell i5 computer cost?
 
Welcome to the internet. In a few years you will learn why low res game benchmarks are still there. Hint: It's NOT because people run games at low res.

Thanks for the warm welcome. Sorry for the missing /s. But you have to understand something - when I have to explain sarcasm, I feel that is the most horrible insult ever. You apparently need the training wheels when dealing with sarcasm. The previous phrase was a training wheel. And the previous one. And the previous one. There you go, now you are rolling!

If you don't see the ridiculous extreme of running way-too-low-resolution benchmarks, sure, go ahead and run your 300p tests for the *mysterious*, only-slightly-hinted reason that only a select few can truly understand. /s

The truth is that the difference between the processors in real scenarios is extremely small: ±1-2 fps at 150 fps. We don't need ridiculously low-resolution tests to artificially exacerbate that difference and make it look like one processor is severely stronger than the other. Unless your sponsor is strong-arming you to do that, or you are looking to validate a choice you already made.
 
The truth is that the difference between the processors in real scenarios is extremely small: ±1-2 fps at 150 fps. We don't need ridiculously low-resolution tests to artificially exacerbate that difference and make it look like one processor is severely stronger than the other. Unless your sponsor is strong-arming you to do that, or you are looking to validate a choice you already made.
I think what our colleague tried to say (with bad manners), is that reviewers do low resolution tests to simulate a future scenario. When you keep your CPU for 4-6 years, but you upgrade your GPU, you inadvertently run into a CPU bottleneck scenario. For that case, it is useful to know how much of a CPU bottleneck you can expect. It is not to test games at low res. It is not to test how much of a CPU bottleneck you will definitely have, as there is no way to accurately predict that. But it is a relatively good forecast to see how your CPU will perform with a theoretical future GPU in theoretical future games.
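To make that concrete, here's a minimal toy model (invented numbers, not anyone's actual benchmarking methodology): the frame rate you actually get is capped by whichever of the CPU or GPU is slower, so a low-res test that exposes the CPU ceiling roughly forecasts where a much faster future GPU would leave each chip.

```python
# Toy model of CPU vs GPU bottlenecks. All numbers are made up for illustration.
def delivered_fps(cpu_ceiling_fps: float, gpu_ceiling_fps: float) -> float:
    """The frame rate you actually see is limited by the slower component."""
    return min(cpu_ceiling_fps, gpu_ceiling_fps)

cpu_a = 160  # hypothetical CPU ceiling, as measured in a low-res test
cpu_b = 230  # a CPU that looks much faster in the same low-res test

# With today's GPU (say it tops out at 140 fps), both CPUs look identical at 4K:
print(delivered_fps(cpu_a, 140), delivered_fps(cpu_b, 140))   # 140 140

# With a future GPU that can push 300 fps, the CPU gap finally shows:
print(delivered_fps(cpu_a, 300), delivered_fps(cpu_b, 300))   # 160 230
```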
 
And that's why we, as enthusiasts, should be demanding that AMD and Intel (more Intel than AMD) support a socket and chipset for more than two years. That way, in four years, you can do a simple drop-in replacement of your CPU and instantly get another four years. This whole "replace the entire system just to upgrade the CPU" situation is why we even have this question in the first place.
 
It's more of a global estimate, but we need data from more market research companies, as they use different methodologies for gathering data.

Oh, you mean ARM as a whole. Yeah, I see ARM taking off in the mobile space, but I think Apple will remain an elite, luxury-focused brand. Their computers aren't cheap, and they aren't shy about letting anyone know that they're also not intended to be affordable.
 
I think it makes the most sense for people who want to try and pair it with DDR5-8000 memory on something like an Apex/Unify-X/Tachyon motherboard.

Like many high-end cars, audio gear, and homes, these 13900KS CPUs are probably a waste of money for those looking to accomplish day-to-day tasks.

Then there is the entire realm of folks who want a piece of cherry-picked gear to play with. Consider these a ticket to ride without as much risk of hoping to win the silicon lottery.

DDR5 is relatively new to me. I've probably only built about a dozen systems for myself and clients since September 2022. My first DDR5 system for my own use was a 13700K with some G.Skill DDR5-6400. Well, I was never able to run it at 6400 speeds, instead settling on 5600. I actually thought I had bad RAM; after returning kit after kit, I discovered it was the nature of the CPU I was dealt (purchased).

In February I decided to upgrade my Threadripper 2950X main rig to something more modern. I just could not believe how much more work CPUs can do since October 2018 when I built the Threadripper. Based on my rather annoying RAM experience, I spent the extra $200 and bought the expensive edition for $729 in February, then waited two weeks for it to get here. Apparently they are MUCH more expensive now, if you can find one. No matter. I will use this CPU for a few years, and if I get tired of it, it gets migrated to one of my other machines around the house or office.

I am certainly not wealthy, nor do I care about bragging rights. Given the hardware I have, I really do not even game any more. What I am is curious about what current state-of-the-art hardware feels like, so I can honestly advise clients from personal experience. Would I recommend any of my clients buy a 13900KS? No, probably not. Most people don't need it. I don't even "NEED" it. I work with computers 6 to 7 days a week. Time is money. I love fast hardware. I love trying new hardware and purchase quite a bit of it for myself or clients. So I guess I am the 1/10 of 1% this might appeal to.

Don't get me wrong, I don't buy every single CPU that hits the market just because it's top dog of the day. There has to be something of a perfect storm:
1 - It has to be much faster than what I was using.
2 - It has to offer new tech such as faster RAM, PCI Express, and NVMe standards.
3 - I usually only buy a top CPU every few years, not every generation.

The KS CPUs were never about being a mainstream product. There are plenty of other choices that fit people's budgets, workloads, and thermal envelope requirements. I have no reason to badmouth those making choices for whatever reasons they feel are important to them. It's a personal thing. I am just happy there has been real competition alive and well since 2017. The 10 or so years before that weren't that good for consumers, as Intel spoon-fed us 5% increases generation after generation.
 
Oh, you mean ARM as a whole. Yeah, I see ARM taking off in the mobile space, but I think Apple will remain an elite, luxury-focused brand. Their computers aren't cheap, and they aren't shy about letting anyone know that they're also not intended to be affordable.
True. ARM as a whole is almost entirely Apple, plus a few server designs. Qualcomm is bringing ARM to laptops next year, so it will be interesting.
 
This 350 W at stock (120 W in gaming) furnace of a CPU with 8P+8E cores is losing to a 77 W (50 W in gaming) 8-core CPU in gaming. Seems to me that Intel CPU architectures aren't as good anymore for enthusiast and sensible gamers.
 
This 350 W at stock (120 W in gaming) furnace of a CPU with 8P+8E cores is losing to a 77 W (50 W in gaming) 8-core CPU in gaming. Seems to me that Intel CPU architectures aren't as good anymore for enthusiast and sensible gamers.

I find it so funny when someone has so much spite that they become bloodthirsty and go in for the kill without even bothering to read anything. It has 16 E-cores, not 8. The one with 8 E-cores is the i7-13700K, or the previous-generation i9. But let's not kid ourselves: if you think that the 7800X3D is a "mere" 77W CPU, you're in for a world of disappointment.

If you're a so-called "sensible" gamer, you don't even look at this segment; you buy midrange parts like the 7800X3D, which in their segments have historically performed equal to or slightly ahead of flagships for years on end at this point. The i5-2500K was arguably the first to do this, but let's fast forward to recent times... the 5800X3D was better than the 5950X at games. That didn't make it a faster or better CPU overall, just like the 7800X3D is not better than the 7950X3D or this processor. It's just targeted at a different segment, which it serves well.
 
I'm really curious what Intel can do with the upcoming refresh; I mean, this is already pushed to its limits.
It will 100% come with improvements, because the new process has better performance, thermals, and power consumption.
 
I think what our colleague tried to say (with bad manners), is that reviewers do low resolution tests to simulate a future scenario.
I'm a serious person, so I didn't get his sarcasm before.

However, I wish you hadn't helped him out here, as I wanted him to learn how to use Google search. The hard way. With keystrokes.

(I don't blame you tho)

It will 100% come with improvements, because the new process has better performance, thermals, and power consumption.
What new process?
 
What new process?

The next-generation CPU, Meteor Lake, should be fabricated on Intel 4 (their 7 nm process). Raptor Lake is built on Intel 7 (a marketing rebrand of their 10ESF process). It's a 10 nm chip.
 
Might add: not only "technically can", it does, and it does so with ease, at least when paired with a motherboard that has an adequate VRM and with adequate cooling.
Though a boost on only 1-2 cores to reach that clock speed is kinda meh, more like a marketing trick. Just like the FX-9590 was the "first 5 GHz CPU" back in the day.
 
355 W CPU power and 115 degrees temperature, wow o_O
 
And it gives Intel the bragging rights to say that they have the first CPU on the market which can hit 6GHz.
Funny thing is, any 13700K/13900K can do 1-2 cores at 6 GHz, either through manual BIOS settings or by using a Gigabyte mobo with the right BIOS (see my spec). You don't need the KS for that.

The KS mainly gives you the "halo product" feeling of ownership and, supposedly, the uncompromising tag (if you disregard power completely).
The only reason to choose it over the regular 13900K, as I see it, is if you must have the better binning in order to undervolt and run it as cool as possible and/or with lower cooling capacity in an SFF build.
 
if you think that the 7800X3D is a "mere" 77W CPU

But it is a mere 77W CPU; that's what TPU found.

Regardless, you can't deny that the difference in power consumption during gaming between the two is comical:

(chart: CPU power consumption in gaming)


That's 150% more power for the same or lower performance.
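Rough back-of-the-envelope on that figure, using the ~120 W gaming draw mentioned earlier in the thread and the ~49 W average quoted below (the exact chart values may differ slightly):

```python
# Sanity check on the "150% more power" claim. The wattages are the rough
# gaming figures cited in this thread, not exact readings from the chart.
amd_w = 49.0     # 7800X3D, average gaming power
intel_w = 120.0  # 13900KS, average gaming power

extra = (intel_w - amd_w) / amd_w
print(f"~{extra:.0%} more power")  # ~145%, i.e. roughly the "150% more" ballpark
```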
 
That's 150% more power for the same or lower performance.

Yeah, but what I find interesting is the temps.

For an avg of 49W it should be sooo much cooler.

(chart: CPU temperature in gaming)
 
Yeah, but what I find interesting is the temps.

For an avg of 49W it should be sooo much cooler.

How much cooler? 65 °C is perfectly adequate; AMD's chips have higher thermal density than Intel's.
 
CPU thermals are irrelevant if the cooling solution one has allows the CPU to perform at or very close to its best. Power draw, on the other hand, is critical, as it raises the temperature of the room the PC is in. That's aside from the power cost.
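As a purely illustrative example of what that means for the room (assuming 4 hours of gaming a day and $0.15/kWh, neither of which comes from the review), using the gaming wattages discussed above:

```python
# Every watt a CPU draws ends up as heat dumped into the room, so the gaming
# power gap is also a "space heater" gap. Hours per day and electricity price
# are assumptions for illustration only.
hours_per_day = 4
price_per_kwh = 0.15  # USD, assumed

for label, watts in [("13900KS gaming", 120), ("7800X3D gaming", 49)]:
    kwh_per_day = watts * hours_per_day / 1000
    print(f"{label}: {watts} W of heat, ~{kwh_per_day:.2f} kWh/day, "
          f"~${kwh_per_day * price_per_kwh:.2f}/day")
```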
 
The next-generation CPU, Meteor Lake, should be fabricated on Intel 4 (their 7 nm process). Raptor Lake is built on Intel 7 (a marketing rebrand of their 10ESF process). It's a 10 nm chip.
Don't be fooled by the names. Intel 7 is much closer to TSMC's N7 than to TSMC's 10 nm process. Here are the relevant dimensions:

Process              | Contacted Gate Pitch (CPP, nm)      | Minimum Metal Pitch (MMP, nm) | Fin Pitch (nm)
Intel 7 (10nm ESF)   | 60                                  | 40                            | 34
TSMC N7 (7 nm)       | 64 (57 for high-density libraries)  | 40                            | 30
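For a crude sanity check on why those numbers put Intel 7 and N7 in the same class, a common back-of-the-envelope metric is that logic density scales roughly with 1/(CPP x MMP). This ignores cell height, track count, and all the DTCO tricks, so treat it only as a rough comparison of the table above:

```python
# Very rough relative-density estimate from the pitch numbers in the table above:
# density ~ 1 / (contacted gate pitch * minimum metal pitch). First-order only;
# ignores cell height, track counts, and design rules.
nodes = {
    "Intel 7 (10nm ESF)":          (60, 40),
    "TSMC N7 (standard libs)":     (64, 40),
    "TSMC N7 (high-density libs)": (57, 40),
}

ref_cpp, ref_mmp = nodes["Intel 7 (10nm ESF)"]
reference = ref_cpp * ref_mmp
for name, (cpp, mmp) in nodes.items():
    print(f"{name}: ~{reference / (cpp * mmp):.2f}x relative density")
# Intel 7 lands within ~5-10% of N7 either way, i.e. the same node class.
```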

The next-generation CPU, Meteor Lake, should be fabricated on Intel 4 (their 7 nm process). Raptor Lake is built on Intel 7 (a marketing rebrand of their 10ESF process). It's a 10 nm chip.
Nope. Intel 4 is closer to TSMC's N3 than to N5. TSMC's N7 isn't even worthy of consideration.

 
Don't be fooled by the names. Intel 7 is much closer to TSMC's N7 than to TSMC's 10 nm process.

Yes, I'm aware of its characteristics, but it remains a rebrand of (aka, it is) Intel's 10ESF process. The one used in Raptor Lake is further tweaked to exceed the performance seen in Alder Lake. It's just one more reason why nanometre figures are practically meaningless these days.

Good read: https://fuse.wikichip.org/news/525/...ntels-10nm-switching-to-cobalt-interconnects/
 
I thought Ryzen was on 4 nm already? The new 7040-series APUs are 4 nm, I thought? So why are we talking about old tech?
 