
Intel Core i9-9900KS to be Available from October

It might be the best processor for gaming, but it sure as hell isn't the best one for gaming AND streaming/recording in x264.
I've been playing around with my 3900X, and if I set OBS's affinity to the last 12 threads plus CPU5 and CPU11 (the SMT threads in CCX0 and CCX1 of CCD0, respectively), the drop in FPS I experience when recording at 1080p medium 30 fps is a whopping 2-3 frames, with 120 fps as the reference point.
The quality is fantastic, there are zero dropped frames, and the in-game fps loss is minimal.

There is simply NO reason to buy a $500 Intel CPU right now when the objectively superior 3900X exists for the same price, especially since you can slot it into most B450/X470 motherboards. That's what I did: I upgraded from an R7 1700 to the R9 3900X on the same X470 motherboard I've got, and the difference is night and day. I can finally record/stream in x264 without any hiccups or dropped frames, and with a ridiculously minimal drop in game framerate.
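A minimal sketch of the affinity setup described above, assuming a 24-thread 3900X whose logical CPUs are numbered 0-23, so the "last 12 threads" are CPUs 12-23 (the second CCD) plus the SMT siblings CPU5 and CPU11 on the first CCD. This is pure arithmetic (no OS calls); the executable name in the comment is an assumption.

```python
def obs_affinity_cpus():
    """Logical CPUs to pin the encoder (e.g. OBS) to: the last 12
    threads (CPUs 12-23) plus the SMT siblings CPU5 and CPU11."""
    return sorted([5, 11] + list(range(12, 24)))

def affinity_hex(cpus):
    """Build the hex bitmask form that Windows' `start /affinity`
    accepts: bit N set means logical CPU N is allowed."""
    mask = 0
    for cpu in cpus:
        mask |= 1 << cpu
    return format(mask, "X")

print(affinity_hex(obs_affinity_cpus()))  # FFF820
```

On Windows the mask could then be applied at launch, e.g. `start /affinity FFF820 obs64.exe`, or set interactively via Task Manager's "Set affinity" dialog.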
 
Also the socket is dying already.
 
Unless Intel is willing to address the pricing of these chips, the only ones using them will be YouTubers, while real-world users will continue to buy and use AMD.
 
For gaming and emulation? LMAO... AMD CPUs are hit or miss performance-wise. Mostly miss if you're a 120-240 Hz gamer.
A 5% difference with a 2080 Ti at FHD. At 1440p, it's a 3% difference. That 3% surely misses when you target 120-240... LOL

With your 1080 Ti, there is about ZERO difference even at FHD between a 3900X and a 9900K.
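Quick arithmetic on the percentage gaps quoted above: what a 3-5% CPU-side deficit means in absolute frames at high-refresh targets (the 120/240 fps targets are illustrative).

```python
def gap_in_fps(target_fps, gap_pct):
    """Absolute frame difference implied by a percentage gap
    at a given framerate target."""
    return target_fps * gap_pct / 100

print(gap_in_fps(240, 3))  # 7.2 fps at a 240 fps target
print(gap_in_fps(120, 5))  # 6.0 fps at a 120 fps target
```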

[Attached images: 352617_csgo.jpg, CSGO.png (CS:GO benchmark charts)]


Meanwhile, this is how Intel is panicking in Slovakia:
[Attachment 131072]

and in the Czech Republic after AMD released Zen 2.
[Attachment 131073]
What do you want to say? AMD is outselling Intel 4 to 1 in Germany, Japan and I bet in other countries too.

You call 3-5% more a "rape"? Funny :)

Depends on the game, and I'd rather go by min fps than max.

The emulator is not optimized for Ryzen, which is a new processor. Just give it some time :)
Leave him, he is a blue fan. Fun fact: he is playing with emulators instead of buying a console while having a $500 CPU. LOL at him.
 
I don't agree with that; at the least, the 7700K OC'd to 5 GHz easily, which was rare with the 6700K.

And that 5 GHz didn't do shit in real gaming scenarios. Past 4.5 GHz you could squeeze out about 1 fps for each extra 100 MHz, but power consumption increased substantially. So yeah, the 7700K was a terrible release.
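A quick sanity check on the claim above (reading "100hz" as 100 MHz of CPU clock): compare the reported ~1 fps per 100 MHz against ideal linear clock scaling, assuming a fully CPU-bound game running at ~100 fps at 4.5 GHz.

```python
def ideal_fps(base_fps, base_ghz, new_ghz):
    """Best case: fps scales linearly with core clock
    (fully CPU-bound, no other bottlenecks)."""
    return base_fps * new_ghz / base_ghz

# Going 4.5 -> 5.0 GHz from a 100 fps baseline:
gain = ideal_fps(100.0, 4.5, 5.0) - 100.0
print(round(gain, 1))  # 11.1 fps ideal vs the ~5 fps reported for +500 MHz
```

The gap between the ideal +11 fps and the observed +5 fps is the point: gaming workloads are far from linearly clock-bound, so extra MHz buys less than it promises.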
 
And that 5 GHz didn't do shit in real gaming scenarios. Past 4.5 GHz you could squeeze out about 1 fps for each extra 100 MHz, but power consumption increased substantially. So yeah, the 7700K was a terrible release.
I doubt that every overclocker is just playing games.
 
The last CPU that could convince me to buy a Z390 chipset. I will wait and see the benchmarks.
 
Wasn't that the version with 512k L2?

That's not the only 512k Pentium III. The first ones were Katmai on Slot 1; they had dedicated cache ICs... For Tualatin it was on-die, yes... The SMP capability also tagged along.
 
That's not the only 512k Pentium III. The first ones were Katmai on Slot 1; they had dedicated cache ICs... For Tualatin it was on-die, yes... The SMP capability also tagged along.
I know, I had a Katmai also. Tualatin-512 was what I meant :)
 
Since we now officially have a 5 GHz all-core chip, when do we get the 10 GHz NetBurst one, Intel?
 
Intel are going to have to do something magical to run 8 cores / 16 threads at 5 GHz without some super expensive cooling solution...

I'm gonna believe it when I see it...

trog
 
For gaming and emulation? LMAO... AMD CPUs are hit or miss performance-wise. Mostly miss if you're a 120-240 Hz gamer.
Even the 9700K rapes the 3900X here. For less money.

I would never choose Ryzen for a gaming PC, unless maybe for 4K/UHD+ / GPU-bound gaming. For high-fps gaming, fps will be lower for sure compared to 8th or 9th gen Intel, especially when those are running at 5 GHz or more.

I would, however, choose a 3600 over the 6C i5's in the budget range tho, if you plan to use it for more than 2 years.
For high-end, nothing beats the 8700K, 8086K, 9700K and 9900K/9900KS with OC. As good as it gets for high-fps gaming for now, and this is a fact.

If you do emulation, like I do, an AMD CPU is a no-go unless you want to see terrible performance.
I'm blasting 100+ fps in Zelda BOTW at 5K using tons of graphic packs and the newest CEMU. Try that on Ryzen, good luck.

I'm a robotic Intel supporter & AMD is bad, the end.

Your comment in a nutshell.
 
This better be as good as Intel claims, or the Intel fan babies' heads will explode. ;)
 
This better be as good as Intel claims, or the Intel fan babies' heads will explode. ;)
Don't you think they've exploded already? Considering some of the posts here and in other threads, brainless monkeys must have written some of them.

Since we now officially have a 5 GHz all-core chip, when do we get the 10 GHz NetBurst one, Intel?
Oh yeah, NetBurst. On paper it looked amazing, and then reality came along and it all went down the drain. Bummer.
 
Don't you think they've exploded already? Considering some of the posts here and in other threads, brainless monkeys must have written some of them.


Oh yeah, NetBurst. On paper it looked amazing, and then reality came along and it all went down the drain. Bummer.

Since we now officially have a 5 GHz all-core chip, when do we get the 10 GHz NetBurst one, Intel?

And Bulldozer looked great on paper and crapped out in the real world. So what? AMD fanboys are hilarious; AMD actually has a killer CPU, and you bring up NetBurst. Why?
 
And Bulldozer looked great on paper and crapped out in the real world. So what? AMD fanboys are hilarious; AMD actually has a killer CPU, and you bring up NetBurst. Why?
I didn't bring it up; I responded to a post mentioning NetBurst, so what? Besides, this isn't about AMD. Not saying you are wrong, though.
 
5% difference with a 2080 Ti at FHD. At 1440p, it's a 3% difference.

No, it's more.
Which makes me wonder whether my memory is faulty or TPU has changed the results.
 
Intel are going to have to do something magical to run 8 cores / 16 threads at 5 GHz without some super expensive cooling solution...

I'm gonna believe it when I see it...

trog
Binned chips which consume less than a normal 9900K, I'd guess.
 
I'm a robotic Intel supporter & AMD is bad, the end.

Your comment in a nutshell.

Nah, my comment simply represents the truth. There are numbers all over the web that back it up.
I have a Ryzen 1700 in my home server, where it does a fine job. Underclocked and undervolted.

I would not use it in my gaming rig tho... That's for damn sure; even 3rd gen Ryzen would be a pretty big downgrade from my 9900K @ 5.2 GHz. I already tested emulation perf on my 1700 and it was terrible, even at 4 GHz, using 2933/CL14 and the newest AGESA. Even an old i5-2500K would probably smash it in emulation. Most emulators simply suck on AMD hardware. An AMD GPU can work in some, but an AMD CPU is mostly a no-go. Just like in tons of applications. They are simply optimized for Intel and therefore perform better, sometimes much better.

Ryzen is decent for some workloads; performance is hit or miss tho. With Intel you get solid performance across the board. Especially true for gaming and emulation.

This may or may not change over the years, but right now that is the reality.

I'm not sure why some people keep denying this (most are Ryzen owners tho, soooo I understand) ;)

You call 3-5% more a "rape"? Funny :)

Depends on the game, and I'd rather go by min fps than max.

The emulator is not optimized for Ryzen, which is a new processor. Just give it some time :)

Digital Foundry recently tested this in depth, and the 9700K was faster in min, max and avg by more than 3-5%... and it's ALSO cheaper than the 3900X.

Yes, give it some time and performance MIGHT improve... personally, I want good performance NOW, not in "some years". I guess we are all different.

We can discuss this all day long; what I'm saying is fact tho. All you have to do is read reviews and watch videos. Look for real-world testing and high-fps gaming to see the real truth, instead of Cinebench.

I'm not saying Ryzen is bad, it's just bad for what I do with my PC: emulation and high-fps gaming. Ryzen is nowhere near the performance my 9900K delivers in these workloads. Regardless of generation, AGESA, memory, etc.
 
I brought up Netburst because it also ran quite hot, and people above talked about the older Pentiums with the "S" at the end.

For my use case the 9900KS does not make sense. It will run way too hot.
And as for AMD's Bulldozer: at first I thought it would be an 8-module CPU. But then reality set in.

When it is released and benchmarked we will see how much improvement over the 9900K there really is, and whether there is even a noticeable power-consumption difference versus 9900Ks overclocked to 5 GHz all-core. For games which mostly use only one thread, there should not be any difference in performance. I am really curious to see which games can really benefit from the extra 300 MHz on all cores :)

@Ias The Ryzen 3000 chips are quite an improvement over the 1000 and 2000 ones in some areas. And benchmarking Intel is difficult, since most mainboard vendors ignore the official Intel specs and just boost longer and higher than officially specified (just a fact, with advantages and disadvantages). Also, Intel's architecture hasn't changed much over the years, which makes optimising for it quite worthwhile; AMD's is a new architecture, and programmers still have to get used to it.
Oh, do you by any chance know why the 9900K falls behind a 2700X when you game and stream? Both have 8 cores, which makes it kind of a mystery to me. It just does not make sense.
 
I brought up Netburst because it also ran quite hot, and people above talked about the older Pentiums with the "S" at the end.

For my use case the 9900KS does not make sense. It will run way too hot.
And as for AMD's Bulldozer: at first I thought it would be an 8-module CPU. But then reality set in.

When it is released and benchmarked we will see how much improvement over the 9900K there really is, and whether there is even a noticeable power-consumption difference versus 9900Ks overclocked to 5 GHz all-core. For games which mostly use only one thread, there should not be any difference in performance. I am really curious to see which games can really benefit from the extra 300 MHz on all cores :)

The 9900KS won't run hot with decent cooling. A 9900K can run 5 GHz on all cores with a cheap 240 AIO or dual-tower air coolers like the NH-D14/D15.

My 9900K at 5.2 GHz (no AVX offset) peaks at 50°C in gaming on custom water, with an 8-year-old block and low-fin-density rads... the same block I used on my 2600K back in the day, for 4.8 GHz.

Performance is clearly improved compared to stock (watch any review that tests the 9900K stock vs 5 GHz and you'll see the same).

Games don't just use one thread, lol?

The 9900K already beats the 3900X in most real-world testing, and the 9900KS will beat it by even more.

Anyone who has experience with high-fps gaming and emulation software knows what I'm talking about.
 
So Intel is pulling an AMD, with a "panic" launch that is 4 months late and is just a rebrand of an already existing chip? :rolleyes:

Good job, Intel, getting caught with your pants down yet again.
 