
Intel Launches Core i9-13900KS 8P+16E Flagship Processor at $700

That's true, but not an AMD problem. DDR5 is the same for both (and all those fancy Intel benchmarks use much more expensive DDR5 than the AMD results). As for motherboards, again AMD boards are cheaper. The cheapest B760 board released for Intel in Canada so far is $180 USD. As the transition to DDR5 speeds up this year, the old DDR4 stuff will go off the market. It is already happening. The RTX 3080 is gone too.

The AMD Asus TUF B650 ATX motherboard is much cheaper than the equivalent Z790 version.

The design of Zen 4 is a huge success (much faster than people were expecting from the early leaks), and it costs little for AMD to make, so it is a commercial success. A better product that you can make for less - that's a win.
You know why I really, really doubt the 3D will beat Intel? Because I've tested extensively: a 12900K running at 5.4 GHz with tuned DDR5 is around 15% slower than a STOCK 13900K with 7600 CL34 RAM. That 12900K at 5.4 GHz is surely faster than the 5800X3D and all of Zen 4 at stock, yet the 13900K is around 15% ahead of it.
 
Does it matter? They will be very close, but one of them will be energy efficient, and the other a power guzzler that heats your room with an extra 200 W for roughly the same performance - and games are rarely so CPU dependent that it makes any visible difference. Unless you buy an RTX 4090 and play at 1080p...
 
Does it matter? They will be very close, but one of them will be energy efficient, and the other a power guzzler that heats your room with an extra 200 W for roughly the same performance - and games are rarely so CPU dependent that it makes any visible difference. Unless you buy an RTX 4090 and play at 1080p...
Which CPU is using 200 W in gaming? What are you talking about, man... At 4K on a 13900K, the CPU usually sits between 60 and 90 watts...
 
Which CPU is using 200 W in gaming? What are you talking about, man... At 4K on a 13900K, the CPU usually sits between 60 and 90 watts...
Never said it draws that much in gaming. Although yours must be a golden sample, TechPowerUP measured quite a bit more even in gaming. But what do they know...

Since we're at 4K gaming, how much faster will the $700 CPU be from a "normal one" at that resolution?

"In 4K Ultra HD resolution, which is the native stomping ground for the GeForce RTX 4090, we're seeing that the Ryzen 7 5800X3D is matching the Core i9-13900K "Raptor Lake" very well. Averaged across all 53 games, the i9-13900K is a negligible 1.3% faster than the 5800X3D."
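For what it's worth, "averaged across all 53 games" in reviews of this kind usually means an average of per-game performance ratios, often a geometric mean (the exact method varies by reviewer). A minimal sketch of that calculation - the FPS numbers below are made up for illustration, not TPU's data:

```python
from math import prod

def relative_speed(fps_a, fps_b):
    """Geometric mean of per-game FPS ratios a/b - a common way
    to average relative performance across a game suite."""
    ratios = [a / b for a, b in zip(fps_a, fps_b)]
    return prod(ratios) ** (1 / len(ratios))

# Hypothetical per-game FPS numbers (NOT TPU's data):
i9  = [121, 98, 144, 203]
x3d = [119, 99, 142, 200]
print(f"{(relative_speed(i9, x3d) - 1) * 100:+.1f}%")
```

The geometric mean keeps a single outlier game from dominating the average the way summing raw FPS would.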
 
Never said it draws that much in gaming. Although yours must be a golden sample, TechPowerUP measured quite a bit more even in gaming. But what do they know...

Since we're at 4K gaming, how much faster will the $700 CPU be from a "normal one" at that resolution?

"In 4K Ultra HD resolution, which is the native stomping ground for the GeForce RTX 4090, we're seeing that the Ryzen 7 5800X3D is matching the Core i9-13900K "Raptor Lake" very well. Averaged across all 53 games, the i9-13900K is a negligible 1.3% faster than the 5800X3D."
Well, the same logic applies to the 3D. Since we are at 4K gaming, how much faster is the 5800X3D, or even the 7700X3D, compared to, let's say, the much cheaper 12400?

When you said it draws 200 W more, what did you refer to? MT workloads? Well, I can tell you that the 13900K is way, way more efficient than the 5800X3D in those workloads.
 
13900K and KS are not just gaming CPUs.
A theoretical 13600KS would perform the same. The 13900 range is for running games and apps at the same time, or demanding software. And yes, it consumes a lot and it's hot as hell.

The KS version specifically should not be criticised about the consumption or the temps though. It’s a halo product, a binned cpu that has its value to the people who know why they want it.
Intel did not develop it for the average or even the enthusiast user.
 
The same people who buy Zen 4, I guess? 6 cores for 350 euros, running at 95°C. Your description fits them perfectly.
350? Even here, in this corner of Europe where hardware in general is very expensive, the 7600X does not exceed €270. As for the 95°C, you probably haven't heard of PBO and Curve Optimizer.
 
350? Even here, in this corner of Europe where hardware in general is very expensive, the 7600X does not exceed €270. As for the 95°C, you probably haven't heard of PBO and Curve Optimizer.
That's the price today, after multiple price cuts because they weren't selling. It was originally released at 350, competing against the competition's 14 cores.

You can optimize every CPU, so how is that an argument? Even the 13900KS can be limited to 50 watts and be extremely efficient; that doesn't stop people from crying about its temperatures, does it?
 
Uhm, what? The 7700X is nowhere near the 13900K; what the heck are you talking about? The 7700X is a 12600K competitor in MT performance.

It's funny though, you think the 13900K consumes 280 W at 100 degrees playing CS:GO? Uhm, okay, does Valorant count? I tried it; the CPU was at 40 watts. So yeah, classic AMD hive mind spreading misinformation.
It is completely unreasonable/insane to buy a CPU with so many cores just for gaming. So the TDP problem is valid, as simulations and other workloads will run continuously for long periods on all cores.
 
It is completely unreasonable/insane to buy a CPU with so many cores just for gaming. So the TDP problem is valid, as simulations and other workloads will run continuously for long periods on all cores.
You know you can run any CPU at whatever wattage you want, right? There is the 13900T with a 35 W power limit.
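On Linux, that kind of cap doesn't even require a "T" SKU: the kernel's powercap/intel_rapl interface exposes the long-term package power limit in sysfs. A rough sketch, assuming the common single-package layout under `/sys/class/powercap/intel-rapl:0` (root required; the 35 W figure just mirrors the 13900T's limit):

```python
from pathlib import Path

def watts_to_uw(watts: float) -> int:
    """The powercap interface expects limits in microwatts."""
    return int(watts * 1_000_000)

def set_long_term_limit(watts: float,
                        rapl: Path = Path("/sys/class/powercap/intel-rapl:0")) -> None:
    # constraint_0 is the long-term (PL1) power limit of the package
    (rapl / "constraint_0_power_uw").write_text(str(watts_to_uw(watts)))

# Example (root, Intel CPU with the intel_rapl driver loaded):
# set_long_term_limit(35)
```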
 
13900K and KS are not just gaming CPUs.
A theoretical 13600KS would perform the same. The 13900 range is for running games and apps at the same time, or demanding software. And yes, it consumes a lot and it's hot as hell.

The KS version specifically should not be criticised about the consumption or the temps though. It’s a halo product, a binned cpu that has its value to the people who know why they want it.
Intel did not develop it for the average or even the enthusiast user.

Most people that buy it just want to feel special. It's a marketing product, and that marketing works on people. It's the same reason people buy $400 water cooling instead of a $100 cooler, or a $600 motherboard instead of a $300 one. You're kidding yourself if you think the people that buy it have a use for it. It's a dopamine hit - the same reason people go to the mall and buy stuff they don't need.
 
Halo products always seem to trigger folks. To each their own I say.
Shrug
If Ted wants a 13900ks to build his ultimate gaming rig with, he's going to get one and nothing anyone says is going to sway him.
But folks have this overpowering urge to tell other folks how they should or shouldn't spend their money (for a multitude of reasons). And so here we are, seeing people angry type about a CPU...what a hilarious world we live in.

Every gen it's the same arguments over and over again, as far back as I can remember.
 
As far as hobbies go, PC gaming even isn't that expensive.

In the last decade or two, most hobby equipment got its very expensive high-end range. And it doesn't matter if it's amateur sports, hobby photography, hi-fi equipment, home improvement tools and gadgets, or watches - there is a market for outrageously expensive stuff, and I don't mean "it's nearly $2,000!" expensive. You want to spend $200,000 on audio equipment? You can find just the speakers for that price; you're a bit short for the whole system!

Of course, it's difficult to do that with high-tech products that only make economic sense if they're produced in large enough quantities. But I wouldn't be surprised if we're seeing attempts to tap into that market segment by artificially positioning the top end as somehow ultra-hard to make - then we should see $10,000 CPUs and GPUs sooner or later.
 
When you said it draws 200 W more, what did you refer to? MT workloads? Well, I can tell you that the 13900K is way, way more efficient than the 5800X3D in those workloads.

Please, tell me honestly - are you being paid for this nonsense?




The 13900K consumes 95% more than the 5800X3D across all Phoronix benchmarks (and at the peak, 123% more - prepare your PSU for this).



its gaming performance is similar to that of the i9-11900K.

Where is the 11900K similar - maybe in this game, hm? (I happened to see a CPU test in this game today, so I bring it up.)

 
Please, tell me honestly - are you being paid for this nonsense?




The 13900K consumes 95% more than the 5800X3D across all Phoronix benchmarks (and at the peak, 123% more - prepare your PSU for this).





Where is the 11900K similar - maybe in this game, hm? (I happened to see a CPU test in this game today, so I bring it up.)

Are we seriously debating whether the 13900K is more efficient in MT workloads compared to the 5800X3D? Oh god.
 
I'm going to give one of my "Keynote Award Presentation Speech" posts for this one:
(Just imagine me being on a keynote stage saying this... heheheheh)

Ladies and gentlemen, almost exactly 9½ years ago, AMD released a CPU that was well-known as a fire-breathing monster.

It was clear even then that Intel was insanely jealous of it because for years Intel has stubbornly tried to replicate AMD's amazing feat.

Intel put in tremendous effort with a "never-say-die" doggedness, flatly refusing to give up but until today has sadly always fallen short.

If nothing else, Intel has proven to all of us that if you never give up on your dream, all the effort you put in will one day pay off in the biggest way!

Intel should be proud to have finally created a CPU that is every bit as cynical, wasteful, hot, overpriced and underperforming as the AMD FX-9590!

Congratulations to Intel on finally qualifying for that list! I can just imagine how proud you are and it's about time!

We have the FX-9590 on hand to welcome his brother-in-spirit, the i9-13900KS to this exclusive club.

PLEASE GIVE THEM BOTH A BIG ROUND OF APPLAUSE FOR THEIR RESPECTIVE ACHIEVEMENTS!!! :D
 
I'm going to give one of my "Keynote Award Presentation Speech" posts for this one:
(Just imagine me being on a keynote stage saying this... heheheheh)

Ladies and gentlemen, almost exactly 9½ years ago, AMD released a CPU that was well-known as a fire-breathing monster.

It was clear even then that Intel was insanely jealous of it because for years Intel has stubbornly tried to replicate AMD's amazing feat.

Intel put in tremendous effort with a "never-say-die" doggedness, flatly refusing to give up but until today has sadly always fallen short.

If nothing else, Intel has proven to all of us that if you never give up on your dream, all the effort you put in will one day pay off in the biggest way!

Intel should be proud to have finally created a CPU that is every bit as cynical, wasteful, hot, overpriced and underperforming as the AMD FX-9590!

Congratulations to Intel on finally qualifying for that list! I can just imagine how proud you are and it's about time!

We have the FX-9590 on hand to welcome his brother-in-spirit, the i9-13900KS to this exclusive club.

PLEASE GIVE THEM BOTH A BIG ROUND OF APPLAUSE FOR THEIR RESPECTIVE ACHIEVEMENTS!!! :D
I'm not going to defend the 13900KS because it isn't a great value CPU... but it's a pre-binned, pre-overclocked 13900K. So what were you expecting, exactly?
 
Halo products always seem to trigger folks. To each their own I say.
Shrug
If Ted wants a 13900ks to build his ultimate gaming rig with, he's going to get one and nothing anyone says is going to sway him.
But folks have this overpowering urge to tell other folks how they should or shouldn't spend their money (for a multitude of reasons). And so here we are, seeing people angry type about a CPU...what a hilarious world we live in.

Every gen it's the same arguments over and over again, as far back as I can remember.
I agree with you. I only have the 12th-gen i9 halo product because it was about the same price as the regular one (secondhand, eBay). It's a great bin though. The only downside to the halo products is if the average i9 K series is made worse when Intel takes the better-binned i9 dies for the KS. Otherwise I can't see any reason for anyone to care.
 
That's true, but not an AMD problem. DDR5 is the same for both (and all those fancy Intel benchmarks use much more expensive DDR5 than the AMD results). As for motherboards, again AMD boards are cheaper. The cheapest B760 board released for Intel in Canada so far is $180 USD. As the transition to DDR5 speeds up this year, the old DDR4 stuff will go off the market. It is already happening. The RTX 3080 is gone too.

The AMD Asus TUF B650 ATX motherboard is much cheaper than the equivalent Z790 version.

The design of Zen 4 is a huge success (much faster than people were expecting from the early leaks), and it costs little for AMD to make, so it is a commercial success. A better product that you can make for less - that's a win.

You're comparing apples to oranges in bad faith throughout. Z790 isn't positioned in the B650's tier, it's on the X670E's. It's the premium segment chipset for the LGA 1700 socket.

It's definitely not "very cheap to manufacture", but prices have been going down because of the lineup's poor commercial performance, for the reasons mentioned by Bwaze. Hell in the United States if you have access to Micro Center they are giving people a free DDR5 kit for simply buying one of these CPUs and it still hasn't tilted the market in its favor.

Simply put no one wants to buy an expensive and buggy platform and that is precisely what AM5 is thus far. With Raptor Lake you can buy an inexpensive Z690 motherboard and reuse your existing DDR4 memory and that has been a very popular path to take.

It's not much faster than what I personally expected; its only strength is AVX-512 support, which die-hard AMD fans ironically hated until their processor brand could do it too. Go figure. As for the general performance: unimpressive from the standpoint of someone on Zen 3, and downright weak if you're comparing it to Raptor Lake. Needs work - hence, the X3D chips.

Where is the 11900K similar - maybe in this game, hm? (I happened to see a CPU test in this game today, so I bring it up.)


TPU's own review that I linked in my initial comment, which I personally rank far above that Russian "GameGPU" website (whose results I have personally never been even close to replicating). GameGPU is UserBenchmark-tier SEO crap, that is to say... almost completely untrustworthy.
 
Somebody's talking about energy efficiency...
So, for what reason do you think they added some E-cores to it? Because P-cores are simply not energy-efficient enough, of course.
Then why didn't they make it all E-cores - say, 32 E-cores? Because it wouldn't be powerful enough.
So they make it hybrid, something they still dare not apply to the business segment to this day...
 
You're comparing apples to oranges in bad faith throughout. Z790 isn't positioned in the B650's tier, it's on the X670E's. It's the premium segment chipset for the LGA 1700 socket.

It's definitely not "very cheap to manufacture", but prices have been going down because of the lineup's poor commercial performance, for the reasons mentioned by Bwaze. Hell in the United States if you have access to Micro Center they are giving people a free DDR5 kit for simply buying one of these CPUs and it still hasn't tilted the market in its favor.

Simply put no one wants to buy an expensive and buggy platform and that is precisely what AM5 is thus far. With Raptor Lake you can buy an inexpensive Z690 motherboard and reuse your existing DDR4 memory and that has been a very popular path to take.

It's not much faster than what I personally expected; its only strength is AVX-512 support, which die-hard AMD fans ironically hated until their processor brand could do it too. Go figure. As for the general performance: unimpressive from the standpoint of someone on Zen 3, and downright weak if you're comparing it to Raptor Lake. Needs work - hence, the X3D chips.



TPU's own review that I linked in my initial comment, which I personally rank far above that Russian "GameGPU" website (whose results I have personally never been even close to replicating). GameGPU is UserBenchmark-tier SEO crap, that is to say... almost completely untrustworthy.
"You're comparing apples to oranges in bad faith throughout. Z790 isn't positioned in the B650's tier, it's on the X670E's."

Total lie. The B650 is the same as the Z790: both unlocked for overclocking. You got fooled by a letter. X670 adds USB and SATA ports, and it's a scam - it doesn't add overclocking. LMAO. Who did the positioning you talked about? You? Intel's dreams?

That's very rich coming from you, talking about arguing in bad faith when you start with such a whopper. We're done.

What is it with Intel users and the constant lying? B760 is not comparable to B650, or B650E, not even close. You know that. Don't lie.
 
At this point it might as well not come with an IHS and just ship with a special cooler for direct-die mounting. If Intel is going to go this far, they might as well go all out. With how much heat these dense chips produce, even a soldered IHS can mean a 14°C delta versus a direct-die cooler.
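For context on where a figure like that 14°C comes from: at steady state, the temperature drop scales linearly with package power times the thermal resistance removed from the cooling stack. A quick sketch - the 0.056 K/W and 250 W numbers are back-calculated assumptions for illustration, not measurements:

```python
def delta_t(power_w: float, r_th_removed_k_per_w: float) -> float:
    """Steady-state temperature difference from removing some thermal
    resistance (K/W) from the cooling stack at a given package power."""
    return power_w * r_th_removed_k_per_w

# A 14 K delta is consistent with direct-die shaving roughly
# 0.056 K/W off the stack at a ~250 W load (assumed numbers):
print(delta_t(250, 0.056))
```

This is also why deltas like that only show up under heavy all-core loads - at 60-90 W gaming draws, the same resistance difference is worth only a few degrees.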
 