
Intel Releasing 10 core 20 thread i9-10900KF for $499 very soon... 5.2 Ghz boost

Status
Not open for further replies.
Joined
Sep 3, 2019
Messages
2,965 (1.78/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 150W PPT limit, 79C temp limit, CO -9~14
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F37h, AGESA V2 1.2.0.B
Cooling Arctic Liquid Freezer II 420mm Rev7 with off center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3600MHz 1.42V CL16-16-16-16-32-48 1T, tRFC:288, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~465W (366W current) PowerLimit, 1060mV, Adrenalin v24.2.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR1000
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v23H2, OSB 22631.3155)
For me, it's somewhere in the middle...
My opinion and assumption, based on what I'm seeing as we move down past the 14~12nm processes and what I'm hearing and reading across the net.

Clockspeeds are not irrelevant today. Most mainstream and some pro software has been built for years around the clockspeed/IPC gains of CPUs. Developers didn't bother too much to utilize the wider resources (cores/threads) because it's way easier to lean on clockspeed.
They have been lazy the past decade because CPUs kept increasing clock speed significantly on top of IPC gains. They have been sleeping on the job...
I believe that era is soon to be over. Intel is gasping the last breaths of clock increases. I bet its next all-new arch will be more AMD-like. Clockspeed cannot increase indefinitely, especially on those 7nm >> 5nm >> 3nm processes... It just can't happen without significant leakage.

Developers will eventually adopt the multi-core/threaded resources or go home, because there is a wall coming and it's coming fast... when Intel drops the clockspeed hunt... after 2020...
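The shift described here, from leaning on clockspeed to spreading work across cores, can be sketched in a few lines of Python. This is a generic illustration with made-up names and workload, not anything from the thread:

```python
from concurrent.futures import ThreadPoolExecutor

def work(n: int) -> int:
    # Stand-in for a per-item task (a physics tick, an image tile, ...)
    return sum(i * i for i in range(n))

def serial(items):
    # The "easy" path: one core, relies on clockspeed/IPC for speed.
    return [work(n) for n in items]

def parallel(items, workers: int = 8):
    # Same results, but items are spread across a pool of workers.
    # (For CPU-bound pure-Python code a ProcessPoolExecutor is the
    # usual choice, since the GIL serializes Python threads.)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(work, items))

assert serial([100, 200, 300]) == parallel([100, 200, 300])
```

The loop only parallelizes this cleanly because each item is independent; restructuring code where items *aren't* independent is the hard part devs keep avoiding.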
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.39/day)
Can we safely predict 10 cores at 5.2 GHz = 312 watts in Prime95 + AVX? Only if socket 1200 can even handle that power.
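The back-of-the-envelope behind a prediction like that: dynamic power scales roughly with cores × frequency × voltage². A quick Python sketch; every number below is an illustrative assumption, not a measurement:

```python
# Rough dynamic power scaling: P ~ cores * f * V^2.
# All baseline figures are assumed for illustration only.
def scale_power(p0, cores0, f0, v0, cores1, f1, v1):
    return p0 * (cores1 / cores0) * (f1 / f0) * (v1 / v0) ** 2

# Assume ~200 W for 8 cores at 5.0 GHz / 1.30 V under an AVX load,
# and a bump to 1.35 V to hold 5.2 GHz across 10 cores.
est = scale_power(200, 8, 5.0, 1.30, 10, 5.2, 1.35)
print(round(est))  # ~280 W, the same ballpark as the 312 W guess
```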
 
Joined
Sep 3, 2019
Messages
2,965 (1.78/day)
Don't get me wrong, I'm not ranting on devs. I would have gone the same way in their place... to be honest...
 
Joined
Oct 25, 2018
Messages
278 (0.14/day)
1. I have thought about them. That much should be clear.
2. Their clocks are higher... especially dual core and all core boost. Clocks are not pointless. Clearly.
3. Yep. I am talking about games and also made a distinction about content creation and users that can actually UTILIZE the cores and threads.
4. I pointed out that hex cores have been on the market for nearly a decade and most things, including games, don't utilize them. Adoption is slow. But make no mistake about it, the resources have been there for several years already and devs haven't done much in that time.
5. I have noticed that a couple of new titles can use more than 4c. And you may notice I covered that point as well... I clearly stated that a MINIMUM for gaming would be 4c/8t, while the sweetspot is 6c/12t today.

Intel's new CPU is really on the edge for me of mainstream and HEDT...

Let me quote myself for clarity.....

Capeesh?

They are on par with Intel or slightly ahead... that is awesome, but it has little to do with the obnoxious core count on the new Intel chip here and AMD's mainstream offerings.

Again, we've had hex cores out for almost a decade already. So I agree with this, but to what end today, when devs have had the chance for several years already? I think the new consoles will grease the wheels... but the resources have been there, and outside of power users, few can utilize the large core counts.




In the end, we'll all have to agree to disagree. I simply wish that we were NOT in a core war and that Intel hadn't followed suit... I wish there was more of a black and white line between mainstream and HEDT core counts instead of shoe-horning 16c/32t into mainstream (or anything 10c+, honestly).

I'm going to "not agree" rather than "disagree" also. Are you saying that you would want to pay $400 for an Intel 4-core CPU vs an Intel 10-core CPU for $500? I don't get that mindset.
If the core war had not started due to AMD being competitive again, this is where we would still be: $350-$390 for a 4-core i7. Then the HEDT CPUs usually went for $500-plus, from what I recall...
For people that use productivity software, these cores help A LOT!! It's nice having the extra power for multitasking too. And yes, games will start to scale better in the future. So if you're a gamer, it will matter someday.

Just my 2 cents. I think ultimately this helps all of us out. Cheaper parts!! I still can't get past the whole 2017-2018 mining craze... that was clearly robbery... I'm glad prices finally started to return to normal, though you'll notice they are still slightly above pre-mining-craze levels. Who knows, maybe Intel will surprise us with a good GPU; even if it's only good in price vs performance, it may still drive the two big players to lower some prices in the low to mid-low-end GPU segments.

Can we safely predict 10 cores at 5.2 GHz = 312 watts in Prime95 + AVX? Only if socket 1200 can even handle that power.

Yeah, that is a pretty safe bet. I have wondered the same about LGA1200 (or LGA1159) or whatever they use next... I was thinking that Intel had to add more Vcc pins to power these added cores...
But then take a look at this: https://hexus.net/tech/news/cpu/125...-analysis-shows-extra-power-pins-unnecessary/
So that's hard to say. I'm not a microprocessor design engineer... just a nerd...
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Are you saying that you would want to pay $400 for an Intel 4-core CPU vs an Intel 10-core CPU for $500? I don't get that mindset. If the core war had not started due to AMD being competitive again, this is where we would still be: $350-$390 for a 4-core i7. Then the HEDT CPUs usually went for $500-plus, from what I recall...
Nope. I never said that. To you, it seems like adding more cores/threads is mutually exclusive with competition and lower pricing. It isn't. Both companies could have stuck to 6/8c parts in the mainstream and kept the higher core count CPUs in HEDT... also priced better. Then in 2021, after the consoles do their thing for the core war (and again, those are simply 8c CPUs, right?), start making more cores available in the mainstream segment.
For people that use productivity software, these cores help A LOT!! It's nice having the extra power for multitasking too. And yes, games will start to scale better in the future. So if you're a gamer, it will matter someday.
My point exactly. There are those who can utilize core/thread counts above 8c/16t, but it isn't many. Most of those users stuck to the HEDT platform in the past. Now the lines are blurred. I said from the get-go that there is a small contingent using these at home as such. Otherwise, typically, offices (at least the three places I have worked, including a large pharma, a large utility, and AWS) buy mainstream potatoes for the office and HEDT/Xeon/TR for workstations.

Someday... we've been waiting for that for almost a decade now, no? You don't find it odd that quad cores were released over a decade ago, hex cores over 8 years ago, and only today are 4c/8t CPUs getting long in the tooth in a few titles? And yet, now that there are 10-16c products out, suddenly things will change, and quickly? I disagree, especially with the quickly part.

Just my 2 cents. I think ultimately this helps all of us out. Cheaper parts!! I still can't get past the whole 2017-2018 mining craze... that was clearly robbery... I'm glad prices finally started to return to normal, though you'll notice they are still slightly above pre-mining-craze levels. Who knows, maybe Intel will surprise us with a good GPU; even if it's only good in price vs performance, it may still drive the two big players to lower some prices in the low to mid-low-end GPU segments.
It does ultimately help all of us out, but it has little to do with the core count IMO (hence the agree to disagree, lololol). My contention with this new Intel CPU and the core wars is simply that it is premature, blurs the lines between mainstream and HEDT/workstations, and can easily confuse the average joe.
 
Joined
Mar 18, 2015
Messages
2,960 (0.90/day)
Location
Long Island
If a new CPU came out that had more cores and a smaller die size... what % of folks would you think would choose it over a competing product with fewer cores and a larger die size that ran their programs faster?
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
I think you are confusing things people would know (core count and performance) with something 99% don't (die size), and the relevance of such a thing.
 
Joined
May 31, 2016
Messages
4,323 (1.51/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 16GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
1. I have thought about them. That much should be clear.
2. Their clocks are higher... especially dual core and all core boost. Clocks are not pointless. Clearly.
3. Yep. I am talking about games and also made a distinction about content creation and users that can actually UTILIZE the cores and threads.
4. I pointed out that hex cores have been on the market for nearly a decade and most things, including games, don't utilize them. Adoption is slow. But make no mistake about it, the resources have been there for several years already and devs haven't done much in that time.
5. I have noticed that a couple of new titles can use more than 4c. And you may notice I covered that point as well... I clearly stated that a MINIMUM for gaming would be 4c/8t, while the sweetspot is 6c/12t today.
ad. 1 Sure :)
ad. 2 Pointless unless they can go much higher. 100 or 200 MHz is not much of a difference.
ad. 3 Good for you.
ad. 4 Where, in the server market? Or are you talking about Phenom? Open your eyes, nobody will support a processor that didn't have the performance to begin with.
ad. 5 So if you covered everything so precisely, where is this "if only a general consumer could utilize these" coming from?

BTW, AMD doesn't compete with clocks but with performance, and as you already know, Intel will not keep going 5 GHz and up through every node shrink, so this "higher clocks" argument is, in my eyes, juvenile.
"This car is better 'cause it has more BHP." C'mon man :)
 
Joined
Aug 6, 2017
Messages
7,412 (3.05/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Wait? Those are my choices? Oh, thank you!! I was unaware... lol... that wasn't my point though, lol.

It's clear to me that AMD cannot compete on clocks with the Ryzen arch. They knew it and went modular/wide instead. The benefits of that, today, aren't much for the average user or even most enthusiasts. With the PS5 or Xbox or w/e has AMD hardware in it coming out late next year, perhaps we will FINALLY see width and more cores and threads utilized by most users. But that will take TIME... remember, hex cores have been on the market for at least what, 8 years, and only today would 4c/8t be considered a 'minimum' for most. More than 10c/20t on the mainstream platform, either side, is just too much. AMD blurred the lines and, speaking strictly from a core count perspective, brought out products we don't need. The good thing about this is the cheap pricing... otherwise, yes, today, for someone buying a PC, unless you are a content creator etc., 6c/12t is the sweetspot while 8c/16t is enthusiast level; 10c/20t is just nuts for most... I just don't like seeing that many cores/threads in mainstream when the reality is very few can use them. ;)
quoted for truth.
 
Joined
Sep 3, 2019
Messages
2,965 (1.78/day)
For me, it's somewhere in the middle...
My opinion and assumption, based on what I'm seeing as we move down past the 14~12nm processes and what I'm hearing and reading across the net.

Clockspeeds are not irrelevant today. Most mainstream and some pro software has been built for years around the clockspeed/IPC gains of CPUs. Developers didn't bother too much to utilize the wider resources (cores/threads) because it's way easier to lean on clockspeed.
They have been lazy the past decade because CPUs kept increasing clock speed significantly on top of IPC gains. They have been sleeping on the job...
I believe that era is soon to be over. Intel is gasping the last breaths of clock increases. I bet its next all-new arch will be more AMD-like. Clockspeed cannot increase indefinitely, especially on those 7nm >> 5nm >> 3nm processes... It just can't happen without significant leakage.

Developers will eventually adopt the multi-core/threaded resources or go home, because there is a wall coming and it's coming fast... when Intel drops the clockspeed hunt... after 2020...
Don't get me wrong, I'm not ranting on devs. I would have gone the same way in their place... to be honest...

And to add something more to the whole conversation: Windows is also responsible for not properly utilizing the wider resources. The Windows scheduler in particular is dumb... it works great with the current Intel architecture though. Narrow, long and fast pipeline...
 
Joined
Feb 3, 2017
Messages
3,475 (1.33/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
And to add something more to the whole conversation: Windows is also responsible for not properly utilizing the wider resources. The Windows scheduler in particular is dumb... it works great with the current Intel architecture though. Narrow, long and fast pipeline...
Skylake is 8-issue wide and 14-19 pipeline stages. Zen is 10-issue wide and 19 pipeline stages, Zen2 should be pretty much the same.
Windows scheduler issues are primarily related to CCX/NUMA node handling with Zen's peculiar layout and fast(est) core issues with Zen2.
 

1000t

New Member
Joined
Sep 30, 2019
Messages
25 (0.02/day)
For me is somewhere in the middle...
My opinion and assumption according what I am seeing getting past (downwards) that 14~12nm process and what Im hearing and reading across the net.

Clockspeeds are not irrelevant today. Most mainstream and some pro software is build from years to now upon the clockspeed/IPC gains of CPUs mostly. Developers did not bother too much to utilize the wider resources (cores/threads) because its way easier to use clockspeed.
The have been lazy the past decade because CPUs kept increasing clock speed significantly along and over IPC. The have been sleeping on the job...
I believe this was an era that is soon to be over. Intel gasps the last breaths of clocks increases. I bet her next all new arch is a more AMD like. Clockspeed cannot increase infinitely, especially at those processes of 7nm >> 5nm >> 3nm... It just cant happen without significant leaks (EMI).

Developers will eventually adopt the multi core/threaded resources or go home because there is a wall coming and is coming fast... When Intel drops the clockspeed hunt... after 2020...
This post looks like it's a decade late (except the nm numbers). I'll explain.

Benefits of parallelism are extracted in two ways. One is within a core, at the instruction level, by an out-of-order architecture with the help of a wide pipeline or vector instructions; the second is multicore/multithreading. CPUs with 4c/4t or 4c/8t have been here for well over a decade, and dual cores even longer. I think in that time practically every developer tried to make their programs multithreaded. The faster processing that multithreading enables is not always worth the effort (developing/maintaining), because the amount of data a user processes is sometimes not that big.

Embarrassingly parallel tasks are the easiest to harvest and show the benefits of MT. But that's a given, and every such task today can utilize the resources if available (which, in mainstream CPUs, they are).
Tasks that are not obviously parallel are the hard part. They require a lot of thinking and can hit a limit where adding more threads does not help.
Then there are algorithms that cannot be parallelized, or where it's not worth the effort.

Now, what tasks does an average mainstream user run? More or less the same as 10 years ago. Developers had the incentive to rewrite programs for multithreading when quad cores arrived, and they mostly did it, piece by piece.

This post is not a defense of quad cores. It illustrates the diminishing returns the everyday PC user will see from more cores.
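The diminishing returns described above are exactly what Amdahl's law predicts. A minimal sketch; the 90% parallel fraction is just an illustrative assumption:

```python
def amdahl_speedup(p: float, n: int) -> float:
    # Speedup on n cores when a fraction p of the work parallelizes
    # and the remaining (1 - p) stays serial.
    return 1.0 / ((1.0 - p) + p / n)

# Even with 90% of the work parallel, 16 cores fall far short of 16x:
for n in (4, 8, 16):
    print(n, "cores ->", round(amdahl_speedup(0.90, n), 2), "x")
```

With p = 0.9 the speedup caps at 10x no matter how many cores are added, which is why the serial part, i.e. clockspeed/IPC, never stops mattering.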
 
Joined
Feb 3, 2017
Messages
3,475 (1.33/day)
Even games have had a very strong incentive to use 6-7 cores since 2013. Both the Xbox One and PlayStation 4 came with 8 (weak) cores, of which 1-2 are reserved for the OS.
Next year (supposedly), new consoles will move these goalposts further with 8c/16t Zen2 CPUs.
 

1000t

New Member
Joined
Sep 30, 2019
Messages
25 (0.02/day)
Even games have had a very strong incentive to use 6-7 cores since 2013. Both the Xbox One and PlayStation 4 came with 8 (weak) cores, of which 1-2 are reserved for the OS.
Next year (supposedly), new consoles will move these goalposts further with 8c/16t Zen2 CPUs.
Games and their gameplay are varied, and only some are big and complex enough to require that many cores. But that does not mean it's not a useful and welcome advancement.
 
Joined
Aug 6, 2017
Messages
7,412 (3.05/day)
How many games use over 12 threads? Quite a lot of them.
What is the difference? Core load is lower, but the performance increase is small.
What is the premium you pay for 8 cores over 6? Usually at least 60%. 4c to 6c is 50% more cores and you end up paying about 50% more. 6c to 8c is 33% more cores and the premium is 60% or more.
End of story. Single-threaded performance is still as relevant as it's always been; it doesn't matter how you achieve it - higher frequency, better IPC, faster memory or lower latency.
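The premium math in that post can be checked with a couple of lines. The prices here are hypothetical round numbers picked to match the stated percentages, not actual SKUs:

```python
def gains(cores_a, price_a, cores_b, price_b):
    # Relative core gain and price premium when stepping up a tier.
    core_gain = (cores_b - cores_a) / cores_a
    price_gain = (price_b - price_a) / price_a
    return core_gain, price_gain

# 4c at $200 -> 6c at $300: +50% cores for a +50% price premium.
print(gains(4, 200, 6, 300))
# 6c at $300 -> 8c at $480: +33% cores for a +60% premium.
print(gains(6, 300, 8, 480))
```

The point being made: past 6 cores, the price grows faster than the core count.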
 
Joined
Sep 17, 2014
Messages
20,776 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
It wasn't my point to tell you about your choices, but a sincere suggestion to think about them.
It is clear to me that Intel can't compete in core count with either of its archs against AMD. It is a good move, and for me the only one. Intel can't up clocks every new gen and you know that. So clocks are pointless in this case as an argument. Unless you have a different one? At least try to understand this.
The number of cores and threads that makes the sweet spot is your opinion, and you are clearly referring to games. So buy a console if you want to play games and stay away from computers, since they can do way more than run Minecraft or CS:GO.
Since you point out core and thread utilization: remember, first you need the resources, and afterwards somebody will utilize them. If you are stuck for so long with 4c (in a mainstream desktop) then nobody sane will develop anything beyond what is being given and cripple his own product. Now we have more resources; let's see what developers will do with them.
BTW, I'm sure you have already noticed that some new titles require more than 4c, since you are into gaming so much.

All those cores and resources go underused - that's the point - while the higher clocks do not. It can easily take another 5-7 years before AMD's design truly pays off. A convenient amount of time for everyone else to go wider too.

You're right that we need resources, but the push to 16-core desktop parts is silly. It's also a very bad balance with dual-channel memory. Zen is first and foremost a server part, and this core count is derived from that - the high core count is not a great push towards actually using it on desktop; it's just 'possible' because the dies exist. It wasn't designed 'for us' lowly MSDT plebs, but somehow we like to convince ourselves otherwise (you, not I). There is a difference between enough and too much.
 
Joined
May 31, 2016
Messages
4,323 (1.51/day)
All those cores and resources go underused - that's the point - while the higher clocks do not. It can easily take another 5-7 years before AMD's design truly pays off. A convenient amount of time for everyone else to go wider too.

You're right that we need resources, but the push to 16-core desktop parts is silly. It's also a very bad balance with dual-channel memory. Zen is first and foremost a server part, and this core count is derived from that - the high core count is not a great push towards actually using it on desktop; it's just 'possible' because the dies exist. It wasn't designed 'for us' lowly MSDT plebs, but somehow we like to convince ourselves otherwise (you, not I). There is a difference between enough and too much.
Well, for me this is not silly, and I'm sure a lot of people will agree with me. What is silly, on the other hand, is saying that there's no need for more cores and that boosting frequency is better. Also, saying that we don't need more than 4 or 6c is silly (they are being maxed out in games, or very close to 100% utilization).

BTW, when you buy a PSU, do you buy a 550 W unit and draw 500 watts out of it? No, you want some headroom. You get a 16c CPU with headroom, and yet you complain it is too much and you don't need it. That's just ridiculous.
You need to push tech forward and then use it. Game developers will not ask CPU manufacturers, or announce in the media, "Guys, it is time for you to start manufacturing 8c CPUs, because we software and game developers need it now". This is not how this goes. First you have the resources, and then software developers will eventually use them, or at least have a choice to balance if needed. Would you like to have a choice? Of course, and you have one now. You don't need to go 8c if you don't want to. You want high clocks? Go Intel. You want more cores? AMD (for a reasonable price). So do not bash it, or call it silly to have a 16c in the desktop market, just because you don't like it or you think it is not needed. I don't think any one of us is in a position to say that, BTW.
Saying that we don't need more cores is just stupid. It is not about cores but performance. Saying you don't want more cores is like saying you don't want more performance. Also, saying you would rather have more clock speed than cores is even more ridiculous. Clock speed is not something you will get just because you prefer it over cores. It won't happen, and neither Intel nor AMD will boost the clocks of their upcoming processors to 5.5 or 6 GHz to satisfy your liking.
I don't understand you people. You get more cores and more performance with them, and you complain because you want higher clock speed instead. There's just no pleasing you.
There is a difference between having a choice and being told what to pick.
 
Joined
Feb 3, 2017
Messages
3,475 (1.33/day)
Unless you really do productivity stuff - rendering or video encoding seem to be the main ones here - 16 cores is overkill. Even with pricing being almost linear (with a slight upwards trend), price/performance in consumer tasks gets out of hand for the more expensive CPUs. This is especially the case for gaming. Additionally, gaming generally benefits from high clock speeds but relatively little from additional cores past 8 (or threads past 8-10).

It all depends on how long you are planning to keep the CPU. Buying a 16-core Ryzen 3950X today for gaming or consumer use is overkill to the max. If you plan to keep it for 5 years, then maybe it makes some kind of sense. I wouldn't hold my breath though. The same applies to the Ryzen 3900X, to a lesser degree. Keep in mind that these are $749 and $499 CPUs, respectively.
 
Joined
May 31, 2016
Messages
4,323 (1.51/day)
Unless you really do productivity stuff - rendering or video encoding seem to be the main ones here - 16 cores is overkill. Even with pricing being almost linear (with a slight upwards trend), price/performance in consumer tasks gets out of hand for the more expensive CPUs. This is especially the case for gaming. Additionally, gaming generally benefits from high clock speeds but relatively little from additional cores past 8 (or threads past 8-10).

It all depends on how long you are planning to keep the CPU. Buying a 16-core Ryzen 3950X today for gaming or consumer use is overkill to the max. If you plan to keep it for 5 years, then maybe it makes some kind of sense. I wouldn't hold my breath though. The same applies to the Ryzen 3900X, to a lesser degree. Keep in mind that these are $749 and $499 CPUs, respectively.
That only tells me that you guys are limited and narrow-minded, with a lot of attitude. You perceive today's computing as gaming only, and thus you say a 16c CPU is not needed in the desktop market because nobody (or a marginal percentage) will benefit from it. I'm telling you, go console and your problems are over. On top of that, you can get that 16c for such a good price. You are against the advancement of technology; you just don't know what you want and you can't appreciate what you are getting.

That's you guys:
AMD, stop with this 16c madness in the desktop market, we've had enough. 4c is all you need, so stop this. Game developers won't ever use it. 16c is only for servers and professionals; for desktops it is overkill.
NV, stop releasing new GPUs. We've got the 2080 Ti, that's enough. We can play 1080p 144Hz, who needs more than that?
AMD, don't try to catch up with NV and release the new 5000 series. We've got the 2080 Ti, that's more than enough; besides, you are always the underdog, so stop.
Why go 4K RT when we can go 60 FPS RT at 1080p? Who needs more than that?
4K sucks, it is better to go 720p 1000FPS.

It is kinda funny to think about, but that is how I see you guys :) It would seem like you were just born to argue. Typical European way of living now :)
 
Joined
Oct 25, 2018
Messages
278 (0.14/day)
16 cores may be overkill as of today, but you have to remember: if the mainstream core count increases, the trend will continue to optimize software to use the extra resources. Can you run Windows now on a 2-core/2-thread CPU? Not very well; it's starting to choke. The mainstream CPU now has 4-6 cores with 8-16 threads and at least 8-16GB of RAM. That seems to make Windows 10 happy. 2-3 years ago that was not the case: some early versions of Win 10 ran just as well as Win 7 did. Now that is no longer true. Windows has become bloated.
 
Joined
Feb 3, 2017
Messages
3,475 (1.33/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
@ratirt you really like these straw men, don't you?

Edit:
I mean, gaming is the main performance hog on my system, and it is much more GPU-heavy than it is CPU-heavy. Other than occasional video encoding there is no production workload on my computer, and it will inevitably be replaced on a 2-3 year cadence. Is it surprising that I am focusing on what I see? Most of the people I know have an even more casual approach to using their PCs.

Looking at TPU's 9900KS review, my 170€ CPU from over a year ago loses to the 749€ 3900X by less than a percent at 1440p and about 7% at 720p. A noticeable difference at 720p, but at a 4-5x price difference - an extra 550-600€. The games in TPU's review are pretty well-threaded ones as well, with the exception of Witcher 3.
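As a back-of-the-envelope check on that value argument, here is the arithmetic using the prices and the ~7% 720p gap quoted above (a sketch of the comparison, not review data):

```python
# Relative value: performance-per-euro, using the figures quoted above
cheap_price, expensive_price = 170, 749  # euro prices from the post
perf_gap_720p = 1.07                     # 3900X ~7% faster at 720p

value_cheap = 1.0 / cheap_price                 # normalised perf per euro
value_expensive = perf_gap_720p / expensive_price

print(round(value_cheap / value_expensive, 1))  # cheaper CPU: ~4.1x the perf/euro
```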
 
Joined
Sep 17, 2014
Messages
20,776 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
That only tells me that you guys are limited and narrow-minded, with a lot of attitude. You perceive today's computing as gaming only, and thus you say a 16c CPU is not needed in the desktop market because nobody (or a marginal percentage) will benefit from it. I'm telling you, go console and your problems are over. On top of that, you can get that 16c for such a good price. You are against the advancement of technology; you just don't know what you want and you can't appreciate what you are getting.

That's you guys.
AMD, stop with this 16c madness in the desktop market, we've had enough. 4c is all you need, so stop this. The game developers won't ever use it. 16c is only for servers and professionals; for desktops it is overkill.
NV, stop releasing new GPUs. We've got the 2080 Ti, that's enough. We can play 1080p 144Hz, who needs more than that?
AMD, don't try to catch up with NV and release a new 5000 series, we've got the 2080 Ti, that's more than enough; besides, you are always the underdog, so stop.
Why go 4K RT when we can do 60FPS RT at 1080p, who needs more than that?
4K sucks, it is better to go 720p 1000FPS.

It is kinda funny to think about, but that is how I see you guys :) It would seem like you were just born to argue. Typical European way of living now :)

Right. Let's just say I'm glad you're not designing our CPUs. This is clearly way over your head. Read carefully, again, what's being said. Nobody is against progress.
 
Joined
Jun 2, 2017
Messages
7,789 (3.13/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
This is purely anecdotal, but if game developers started using (or are already using) Threadripper-based workstations to create games, would we not start to see more cores being utilized to run the game engine? Is AOTS a sign of the future or a flash in the pan?
 
Joined
Oct 25, 2018
Messages
278 (0.14/day)
I'll say this again: I love having competition back in the CPU market. That is nothing to complain about. Just because the software basically can't utilize the full potential of some new hardware is not a reason to get upset. Really, all of this is good; if you're into 4-core CPUs, then you should be happy as crap - the price has dropped almost $150 on average for a 4-core.
 
Joined
Jun 2, 2017
Messages
7,789 (3.13/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I'll say this again: I love having competition back in the CPU market. That is nothing to complain about. Just because the software basically can't utilize the full potential of some new hardware is not a reason to get upset. Really, all of this is good; if you're into 4-core CPUs, then you should be happy as crap - the price has dropped almost $150 on average for a 4-core.

Why stop there? You can get a 1900X for $149 on Amazon.com...
 