
AMD Zen 4 Desktop Processors Likely Limited to 16 Cores, 170 W TDP

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.19/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700MHz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVMe + 1TB Samsung 970 Pro NVMe + 1TB Intel 6000P NVMe USB 3.2
Display(s) Philips 32" 32M1N5800A (4K144), LG 32" (4K60) | Gigabyte G32QC (2K165) | Philips 328M6FJRMB (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Yeah, OK, I guess Core 2 Quads had the same per-core performance on DDR3 as Haswell then? Dude, just stop embarrassing yourself.
I said the core count hadn't changed; I literally said slow increases per generation... Kentsfield to Haswell was 2006 to 2013 - and what, 5 generations?

And this is what we got



Then the change Zen managed in 2 gens... (including a 4-core equivalent and then GASP there is no Zen 3 4-core chip! How dare they progress!)


(images look different, got them from different sources, but both PassMark)
 
Joined
Oct 10, 2009
Messages
786 (0.15/day)
Location
Madrid, Spain
System Name Rectangulote
Processor Core i9-9900KF
Motherboard Asus TUF Z390M
Cooling Alphacool Eisbaer Aurora 280 + Eisblock RTX 3090 RE + 2 x 240 ST30
Memory 32 GB DDR4 3600MHz CL16 Crucial Ballistix
Video Card(s) KFA2 RTX 3090 SG
Storage WD Blue 3D 2TB + 2 x WD Black SN750 1TB
Display(s) 2 x Asus ROG Swift PG278QR / Samsung Q60R
Case Corsair 5000D Airflow
Audio Device(s) Evga Nu Audio + Sennheiser HD599SE + Trust GTX 258
Power Supply Corsair RMX850
Mouse Razer Naga Wireless Pro / Logitech MX Master
Keyboard Keychron K4 / Dierya DK61 Pro
Software Windows 11 Pro
AMD pulling an Intel here?

Sounds like they're satisfied with their place in the market and have switched to stagnation mode.
You really need more than 8 cores on your daily home and gaming machine? Holy fuck.
 
Joined
Sep 1, 2020
Messages
2,026 (1.53/day)
Location
Bulgaria
Hmm, the only reason Zen 4 Raphael would be limited to 16 cores is probably the iGPU taking up part of the I/O chiplet. There are (or were) rumors and leaks that all mainstream AMD processors will have an iGPU.
 
Joined
Nov 13, 2007
Messages
10,232 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/5.5, 4.8GHz Ring, 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHz CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 Pro
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
AMD pulling an Intel here?

Sounds like they're satisfied with their place in the market and have switched to stagnation mode.


Yes they are... just like they did last time when they were ahead.

We aren't going to see 32 cores on the desktop until Apple or Intel start creeping back up.
 
Joined
Dec 6, 2016
Messages
748 (0.28/day)
I can see a 170W TDP 16-core 'extreme, water-cooled only' top-end part to try and claim king of the hill in reviews and media, with the rest being a lot more power efficient and better suited to actual gamers.

Yes, this is probably it. Further leaks from Patrick Schur:


Also, regarding the news: they should focus on single-core performance and efficiency for the first gen on AM5 and increase the core count in the second/third gen. At the moment, I simply can't see how pushing more cores would benefit the average consumer. If you need more cores, why not go Threadripper?
 
Joined
Oct 12, 2005
Messages
682 (0.10/day)
I think the key here is the power envelope and the intended buyers of these CPUs. AMD already offers up to 64 cores with Threadripper, so you could say they limit consumer desktops to 16 cores, but that is more than enough for most people.

A key selling point for desktop CPUs right now is still gaming performance. I suppose AMD wants to use 2 CCDs (2x8 cores) and give them more power to hit higher clocks, rather than adding another chiplet that would eat a significant portion of the power envelope and prevent the other cores from reaching higher frequencies.

Single-thread performance is still the key metric everyone needs to chase, no matter what people say here. The higher it is, the more performance you get from each core you add. You always have to balance getting single-thread performance as good as possible against scaling out to more cores.

Also, no matter what architecture or ISA you are on (x86-64, ARM, RISC-V), the key is feeding the big cores. Two channels of DDR5 (or even its four 32-bit subchannels, 2x2x32) might not be enough to feed another full CCD. Larger caches might help, but they won't fix the problem as a whole.
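As a rough back-of-the-envelope sketch of that point (the DDR5-5200 speed and core counts below are illustrative assumptions, not confirmed Zen 4 specs):

```python
# Rough per-core DRAM bandwidth on a dual-channel desktop platform.
# All figures are illustrative assumptions, not confirmed Zen 4 numbers.

transfer_rate_mts = 5200  # assumed DDR5-5200: million transfers per second
bus_width_bytes = 8       # one 64-bit channel moves 8 bytes per transfer
channels = 2              # dual channel (i.e. four 32-bit DDR5 subchannels)

bandwidth_gbs = transfer_rate_mts * bus_width_bytes * channels / 1000
print(f"Total: ~{bandwidth_gbs:.1f} GB/s")  # Total: ~83.2 GB/s
for cores in (8, 16, 24):
    print(f"{cores:2d} cores -> ~{bandwidth_gbs / cores:.1f} GB/s per core")
# 8 cores -> ~10.4 | 16 cores -> ~5.2 | 24 cores -> ~3.5 GB/s per core
```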

AMD also benefits a lot from its chiplet architecture, but there are drawbacks. I suspect they would have released in-between CPUs with Zen 2 (like 10-12 cores), but they had to go with 2 full CCDs. The next step is 24, which is another big jump.


We will see; I'm pretty sure that if this becomes a problem they will adapt, though it might end up being something to keep the crown rather than something widely available. AMD already has to reserve the best chips for the *950 and *900 CPUs to stay in the AM4 power envelope. They would have to do crazy binning to do the same thing with 3 chiplets instead of 2.

But as someone who primarily needs performance for gaming, I would be more interested in 8 cores with 96MB of L3 than in 24 cores with the same amount of L3 cache. Even more so if those 8 cores run at higher frequency (or are simply available to buy).
 
Joined
Feb 21, 2006
Messages
1,978 (0.30/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5003 AM4 AGESA V2 PI 1.2.0.B
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) AMD Radeon RX 7900 XTX 24GB (24.3.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 14TB
Display(s) LG 34GP83A-B 34 Inch 21:9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c
Lots of people here want to bag on Intel for "muh quad cores" without realizing that 6+ cores were available in HEDT and, more importantly, they offered 0 performance improvement.
You are right, 6 cores were available in HEDT.

I had a 6-core i7-970; however, you are incorrect in saying it offered no performance improvement. Thanks to the extra cores, that CPU can still play most recent BF games with a decent GPU, while an i7-920 cannot. Anything that can use those extra cores will, including HandBrake, compression applications, etc.
 
Joined
Mar 10, 2015
Messages
3,984 (1.20/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600MHz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
AMD pulling an Intel here?

Sounds like they're satisfied with their place in the market and have switched to stagnation mode.
I thought only single core mattered and moar cores were for fools?
 
Joined
Mar 20, 2019
Messages
556 (0.30/day)
Processor 9600k
Motherboard MSI Z390I Gaming EDGE AC
Cooling Scythe Mugen 5
Memory 32GB of G.Skill Ripjaws V 3600MHz CL16
Video Card(s) MSI 3080 Ventus OC
Storage 2x Intel 660p 1TB
Display(s) Acer CG437KP
Case Streacom BC1 mini
Audio Device(s) Topping MX3
Power Supply Corsair RM750
Mouse R.A.T. DWS
Keyboard HAVIT KB487L / AKKO 3098 / Logitech G19
VR HMD HTC Vive
Benchmark Scores What's a "benchmark"?
Be warned, this is a rant. With an inquisitive part at the end.
What is going on with CPUs? Since the Pentium 4 stupidity (I know, NetBurst was not bad in some scenarios, but for consumers it was mostly not great), and then Bloomfield being the horrible space heater it was, not to mention Bulldozer, which was actually one of the worst CPUs I ever owned - after all this, we were going in a good direction, with TDP going down and efficiency going up, to the point where even top-end consumer CPUs were reasonable for home use. I know several people who still use slightly overclocked 6700K/7700K/8700K with something like an RTX 3060/3070 with no disturbing bottlenecking. Nowadays it's somehow acceptable to sell a 170W, 16-core monstrosity as a consumer product, so we're back to the days of pumping more power and creating more heat just to get a higher number on a cardboard box. I know there are people who want this, and I'm sure AMD did their due diligence in market research, so: who are you people, and what do you use such consumer hardware for? This is honest curiosity, not condescension.

I know there is e-peen enlargement, also known as "bragging rights" (or conspicuous consumption, otherwise known as the Veblen effect if you're more academically inclined), which has probably been the main reason for high-end consumer hardware's existence since before home computers were a thing. If you need performance for work, you don't fart around; you buy big-boy stuff and write it off your taxes as a business expense - that's where your Xeons, Threadrippers and such live. I know people with multi-CPU servers with Tesla cards in their basements, but they use them to earn money or do scientific research. But a 170W, 16-core consumer CPU? To do what? Games can't even saturate eight cores properly without the GPU becoming the limiting factor, and consumer software is just not that demanding. What, you do time-critical building modeling with Revit as a hobby or something? I'd like to again stress that this is honest curiosity. I personally have a Xeon/Quadro machine at home, but I use it to do actual work and earn money, and that's not consumer hardware - hence my curiosity regarding the viability of such products in a consumer setting. Maybe I'm just oblivious to a fundamental change in consumer usage patterns, and people really, really need to have six hundred browser windows open at once, or a full suite of Autodesk software is now a must-have for everyone.
 

maze100

New Member
Joined
Dec 8, 2018
Messages
2 (0.00/day)
AMD pulling an Intel here?

Sounds like they're satisfied with their place in the market and have switched to stagnation mode.
The difference between an 1800X (Zen 1, 8 cores) and a 5800X (Zen 3, 8 cores, even without the optional V-Cache the architecture supports) is night and day; AMD improved ST performance by more than 50% and gaming performance by 2x-3x.

Intel Sandy Bridge to Skylake (3 newer architectures) is nowhere near that.
 
Joined
Sep 17, 2014
Messages
20,906 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6GHz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Be warned, this is a rant. With an inquisitive part at the end.
What is going on with CPUs? -Snipped for space- Nowadays it's somehow acceptable to sell a 170W, 16-core monstrosity as a consumer product. -Snipped for space- This is honest curiosity, not condescension.

Dunno, I think the streamer and creator groups can now easily work without having to move to workstations, and with that bar so low, the market has exploded. Many a 'pro' gamer or young creator is easily talked into "more is better". And the more mainstream this becomes, the more demand you get for bigger and better, even to the point of the nonsensical; look at top-end GPU land and it's the same thing.

And yes, they can use those vacant threads. Browser, stream(s), some transcoding and a bunch of other stuff will likely run concurrently.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,945 (2.60/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950MHz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Razer Huntsman Tournament Edition
Software Windows 11 Pro 64-Bit
Just because they aren't pushing more cores, that means they've stagnated?

Not to mention that threadripper exists?
This.

If they release chips on the mainstream platform with more cores, that's getting into TR territory and will totally cannibalize their own product line.
 
Joined
Jan 25, 2020
Messages
1,998 (1.29/day)
System Name DadsBadAss
Processor i7 13700K w/ HEATKILLER IV PRO Copper Nickel
Motherboard MSI Z790 Tomahawk Wifi DDR4
Cooling BarrowCH Boxfish 200mm-HWLabs SR2 420/GTX&GTS 360-BP Dual D5 MOD TOP- 2x Koolance PMP 450S
Memory 4x8GB HyperX Predator RGB DDR4 4000
Video Card(s) Asrock 6800xt PG D w/ Byski A-AR6900XT-X
Storage WD SN850X 1TB NVMe M.2 / Adata XPG SX8200 Pro 1TB NVMe M.2
Display(s) Acer XG270HU
Case ThermalTake X71 w/5 Noctua NF-A14 2000 IP67 PWM/3 Noctua NF-F12 2000 IP67 PWM/3 CorsairML120 Pro RGB
Audio Device(s) Klipsch Promedia 2.1
Power Supply Seasonic Focus PX-850 w/CableMod PRO ModMesh RT-Series Black/Blue
Mouse Logitech G502
Keyboard Black Aluminum Mechanical Clicky Thing With Blue LEDs, how's that for a name?!
Software Win11pro
Be warned, this is a rant. With an inquisitive part at the end.
What is going on with CPUs? -Snipped for space- Nowadays it's somehow acceptable to sell a 170W, 16-core monstrosity as a consumer product. -Snipped for space- This is honest curiosity, not condescension.

More than anything, it's cost. It's far cheaper to stay in the consumer realm with a 5950X system vs a Threadripper rig.
 
Joined
Jul 7, 2019
Messages
830 (0.47/day)
Be warned, this is a rant. With an inquisitive part at the end.
What is going on with CPUs? -Snipped for space- Nowadays it's somehow acceptable to sell a 170W, 16-core monstrosity as a consumer product, so we're back to the days of pumping more power and creating more heat just to get a higher number on a cardboard box. I know there are people who want this, and I'm sure AMD did their due diligence in market research, so: who are you people, and what do you use such consumer hardware for? This is honest curiosity, not condescension.

-Snipped for space- What, you do time-critical building modeling with Revit as a hobby or something? I'd like to again stress that this is honest curiosity. I personally have a Xeon/Quadro machine at home, but I use it to do actual work and earn money, and that's not consumer hardware - hence my curiosity regarding the viability of such products in a consumer setting. Maybe I'm just oblivious to a fundamental change in consumer usage patterns, and people really, really need to have six hundred browser windows open at once, or a full suite of Autodesk software is now a must-have for everyone.

I believe Jayz or Linus touched on this once or twice: part of it is that increased core counts and performance have made real-time or recorded HD, semi-professional video and photo editing much more accessible (notably for small businesses that can't afford, or don't need, prosumer-level hardware and software), and more easily enable aspiring "content creators" who game, livestream, and encode all at once on one PC. For livestreaming + PC gaming especially, both of them mentioned how it used to take two PCs - one configured for the encoding/recording/editing/streaming, and the other dedicated solely to gaming - which priced livestreaming out of range for individuals until AMD blew open the core counts (heck, 8 cores was already good enough, depending on the title, and was one of the bigger selling points of the 1700X and 1800X).

And now, with gaming studios taking advantage of more cores - scaling mostly up to 8, with a rare few offering options up to 16 - the bigger livestreamers and content creators who make their living off this are starting to buy 12- and 16-core CPUs, both to future-proof their rigs for a few years and to keep reserve cores for the real-time editing/encoding and other software they run. Sure, GPU makers have helped off-load some of the encoding from the CPU to the GPU (NVENC notably comes to mind), but not everyone uses GPU encoding, for one reason or another.

I myself am not familiar with all that goes into livestreaming, so I don't know the full extent of the CPU requirements for a high-end, do-everything rig, but I have a passing familiarity with using consumer-level hardware for "small, local business" ventures, since one of my friends uses a 5950X + 6700XT desktop for their 3-man photography and videography side business. As they're the one mostly doing the editing and post-processing for photos and videos taken by the group, they like not having to wait for extended periods to scrub through 4K video footage, or just as long for sampling and testing effects applied to large photos.

A less common but notable case was one Linus made: taking a high-end CPU + multi-GPU combo and splitting its cores across separate workstations. While Linus meme'd using a TR or EPYC CPU and multiple GPUs, he mentioned the viability of using, say, a 16c CPU split 4 ways into "4c terminals" in an environment that doesn't need much more than CPU processing and basic graphics (provided by the terminal's basic display output), or a 16c CPU + 2 GPU rig split into 2 "8c + GPU" terminals.
 
Joined
Sep 1, 2020
Messages
2,026 (1.53/day)
Location
Bulgaria
TSMC 5nm is not that different from 7nm... that power consumption is big
Did you learn physics? It's not enough to just count numbers to have the right opinion in this case, I think :rolleyes:
But wait, there's more: to make 7nm and 5nm, TSMC uses different models of machines from ASML. LoL :)
 
Joined
Apr 12, 2013
Messages
6,743 (1.67/day)
We aren't going to see 32 cores on the desktop until Apple or Intel start creeping back up.
Then you'll probably have to wait a decade, if not more :laugh:

Just an FYI, 32-core mainstream chips should become a reality come Zen 5/6 on TSMC 3nm.

AMD doesn't need to pull an Intel & I doubt they'll do so given the competitive landscape today, not to mention the real threat of ARM eating x86 everywhere for breakfast, lunch & dinner. If ARM had been even remotely competitive back in 2005 (on PC), then you can bet every dollar in crypto exchanges that Intel would either have released 8-core chips in the early 2010s or gone bankrupt by today!
 
Joined
Mar 28, 2018
Messages
1,794 (0.81/day)
Location
Arizona
System Name Space Heater MKIV
Processor AMD Ryzen 7 5800X
Motherboard ASRock B550 Taichi
Cooling Noctua NH-U14S, 3x Noctua NF-A14s
Memory 2x32GB Teamgroup T-Force Vulcan Z DDR4-3600 C18 1.35V
Video Card(s) PowerColor RX 6800 XT Red Devil (2150MHz, 240W PL)
Storage 2TB WD SN850X, 4x1TB Crucial MX500 (striped array), LG WH16NS40 BD-RE
Display(s) Dell S3422DWG (34" 3440x1440 144Hz)
Case Phanteks Enthoo Pro M
Audio Device(s) Edifier R1700BT, Samson SR850
Power Supply Corsair RM850x, CyberPower CST135XLU
Mouse Logitech MX Master 3
Keyboard Glorious GMMK 2 96%
Software Windows 10 LTSC 2021, Linux Mint
You really need more than 8 cores on your daily home and gaming machine? Holy fuck.
2006: You really need more than two cores on your daily home and gaming machine?

1981: You really need more than 640K of memory on your daily home and gaming machine?

What I'm saying is we need to keep an eye on what AMD is or isn't doing. Sure, eight cores may be overkill nowadays, but what about in a few years?

Intel perpetuated the idea that four cores were plenty from at least 2007 all the way up until Zen happened in 2017. Also notice that when Zen came out, all of a sudden, "regular" programs could take advantage of it.

I hate Intel as much as the next guy, but AMD isn't perfect either. They're both companies, and when they've gotten far enough ahead of their competitors, they'll slow down innovation until we get another Intel from Sandy Bridge to Kaby Lake situation.

The price increase Zen 3 received was the first red flag for me (but that didn't stop me from buying it!).
 
Joined
Oct 12, 2005
Messages
682 (0.10/day)
2006: You really need more than two cores on your daily home and gaming machine?

1981: You really need more than 640K of memory on your daily home and gaming machine?

What I'm saying is we need to keep an eye on what AMD is or isn't doing. Sure, eight cores may be overkill nowadays, but what about in a few years?

Intel perpetuated the idea that four cores were plenty from at least 2007 all the way up until Zen happened in 2017. Also notice that when Zen came out, all of a sudden, "regular" programs could take advantage of it.

I hate Intel as much as the next guy, but AMD isn't perfect either. They're both companies, and when they've gotten far enough ahead of their competitors, they'll slow down innovation until we get another Intel from Sandy Bridge to Kaby Lake situation.

The price increase Zen 3 received was the first red flag for me (but that didn't stop me from buying it!).
Those two quotes can't be compared. There is no law limiting how much data it's worth storing.

But there is a rule about multicore/multithread scaling: Amdahl's law https://en.wikipedia.org/wiki/Amdahl's_law

The thing is, a lot of what can easily be parallelized can be done by the GPU, or by larger cores with wider SIMD. So for desktop users, there isn't much benefit to a very large number of cores.

It has been shown in the past and will be shown again in the future: you are better off with fewer, faster cores than with more, slower cores. At least on desktop. On servers, depending on the workload, that might be another story.

In the end, all of that is child's play; it doesn't really matter. The only thing that really matters is how much performance you get from the chip. If someone released a 1-core CPU that blew everything we currently have out of the water, who would really care?
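For readers who want the formula behind that Amdahl's law link, here is a minimal sketch, where p is the fraction of the work that can run in parallel (the 95% figure is just an illustrative assumption):

```python
# Amdahl's law: speedup of a fixed-size task on n cores,
# where p is the fraction of the work that can run in parallel.

def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# A 95%-parallel workload can never beat 20x, no matter how many cores you add.
for n in (4, 8, 16, 64, 1024):
    print(f"{n:4d} cores -> {amdahl_speedup(0.95, n):5.2f}x")
#    4 cores ->  3.48x
#    8 cores ->  5.93x
#   16 cores ->  9.14x
#   64 cores -> 15.42x
# 1024 cores -> 19.64x
```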
 
Joined
May 8, 2021
Messages
1,978 (1.84/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
The same was said about 8 cores in 2012
And nothing has meaningfully changed. If you are a gamer, 4 cores will be utilized well, 6 not so well, and 8 are useless. Loads of software still can't parallelize properly either. 8 cores today are still generally pointless, unless you are 100% sure your software utilizes them.
 
Joined
Apr 24, 2020
Messages
2,560 (1.76/day)
But there is a rules about multicore/multithread scaling. Amdahl law https://en.wikipedia.org/wiki/Amdahl's_law

There's a 2nd law: https://en.wikipedia.org/wiki/Gustafson's_law

When you can't scale any higher, you make the work harder. Back in the '00s, why would you ever go above 768p? But today we are doing 1080p regularly and enthusiasts are all on 4K. With roughly 8x the pixels of 768p, you suddenly have 8x more work, and computers can scale to 8x the size.

4K seems to be the limit for what's reasonable (at least for now). But there's also raytracing coming up: instead of making more pixels, we're simply making each pixel much harder to compute. I expect Gustafson's law to continue to apply until people are satisfied with computer-generated graphics (and considering how much money Disney spends on rendering farms... you might be surprised how much compute power is needed to get a "good cartoony" look like Wreck-It Ralph's, let alone CGI Thanos in the movies).

Video game graphics are still very far from what people want. The amount of work done per frame can continue to increase according to Gustafson's law for the foreseeable future. More realistic shadows, even in non-realistic games (ex: Minecraft or Quake with raytracing), wow people.

--------------

For those who aren't in the know: Amdahl's law roughly states that a given task can only be parallelized up to a certain amount. For example: if you're looking one move ahead in chess, the maximum parallelization you can get is roughly 20 (there are only ~20 legal moves available at any given position).

Therefore, when the "chess problem" is described as "improve the speed of looking 1 move ahead", you run into Amdahl's law. You literally can't get more than a 20x speedup (one core looking at each of the 20 available moves).

But Gustafson's law is the opposite, and is the secret to scaling. If the "chess problem" is described as "improve the number of positions you look at within 30 seconds", this is Gustafson's law. If you have 400 cores, you have those 400 cores analyze 2 moves ahead (20x for the 1st move, 20x that for the 2nd move). All 400 cores have work to do for the whole time period. When you get 8000 cores, you have each core look at one of the 20x20x20 = 8000 positions 3 moves ahead, and now you can look at 8000 positions in the given timeframe. Etc., etc. As you can see, increasing the work done leads to successful scaling.

----------

Some problems are Amdahl-style (render this 768p image as fast as possible). Others are Gustafson-style (render as many pixels as you can within 1/60th of a second). The truth for any given problem lies somewhere between the two laws.
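A toy sketch of the chess framing above, under the post's own ~20-moves-per-position assumption: the fixed-size problem hits Amdahl's 20x wall, while the fixed-time-budget version (Gustafson) turns extra cores into extra plies of lookahead:

```python
# Amdahl vs Gustafson, using the post's chess example (~20 moves/position).

BRANCHING = 20  # assumed average number of legal moves (illustrative)

def amdahl_speedup_cap(cores: int) -> int:
    # Fixed problem: search exactly 1 ply as fast as possible.
    # Only BRANCHING independent subtrees exist, so extra cores idle.
    return min(cores, BRANCHING)

def gustafson_workload(cores: int) -> int:
    # Fixed time budget: deepen the search until every core has its own
    # leaf position to evaluate - the work grows with the core count.
    positions = BRANCHING
    while positions < cores:
        positions *= BRANCHING  # one more ply of lookahead
    return positions

for cores in (20, 400, 8000):
    print(f"{cores:5d} cores: Amdahl caps at {amdahl_speedup_cap(cores)}x, "
          f"Gustafson searches {gustafson_workload(cores):,} positions")
#    20 cores: Amdahl caps at 20x, Gustafson searches 20 positions    (1 ply)
#   400 cores: Amdahl caps at 20x, Gustafson searches 400 positions   (2 plies)
#  8000 cores: Amdahl caps at 20x, Gustafson searches 8,000 positions (3 plies)
```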
 
Joined
May 8, 2021
Messages
1,978 (1.84/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Be warned, this is a rant. With an inquisitive part at the end.
What is going on with CPUs? -Snipped for space- Nowadays it's somehow acceptable to sell a 170W, 16-core monstrosity as a consumer product. -Snipped for space- This is honest curiosity, not condescension.
I'm pretty sure that 170-watt part is just the basic part with the PPT jacked up. Really pointless chip.
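For context on the PPT remark: on AM4 the sustained package power limit (PPT) is typically about 1.35x the rated TDP; whether that multiplier carries over to AM5 is purely an assumption here, but it would put a 170W TDP at roughly 230W of sustained socket power:

```python
# On AM4, the sustained package power limit (PPT) is typically ~1.35x the
# rated TDP (e.g. 105W TDP -> 142W PPT). Assuming, for illustration only,
# that the same multiplier applies to these rumored AM5 parts:

AM4_PPT_FACTOR = 1.35

for tdp_w in (65, 105, 170):
    print(f"{tdp_w:3d}W TDP -> ~{tdp_w * AM4_PPT_FACTOR:.0f}W PPT")
#  65W TDP -> ~88W PPT
# 105W TDP -> ~142W PPT
# 170W TDP -> ~230W PPT
```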
 
Joined
Sep 1, 2020
Messages
2,026 (1.53/day)
Location
Bulgaria
But there is a rule about multicore/multithread scaling: Amdahl's law
This is temporary. Theoretically, Amdahl is right... for now.
Newton is also right, and his work on gravity has not been repealed. However, many other scientists lived and worked after him, including Einstein, and we now have an improved view of the laws of nature. Just as an example: if we relied only on Newton's formulas, we would not have precise global positioning, and maybe not even good navigation in space; for that you need relativity. It is possible that a genius has already been born who, without disproving Amdahl, will simply write other rules and laws that are also valid and less restrictive.
 
Joined
Dec 28, 2012
Messages
3,475 (0.84/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
You are right, 6 cores were available in HEDT.

I had a 6-core i7-970; however, you are incorrect in saying it offered no performance improvement. Thanks to the extra cores, that CPU can still play most recent BF games with a decent GPU, while an i7-920 cannot. Anything that can use those extra cores will, including HandBrake, compression applications, etc.
Were the most recent BF games benchmarked in 2010? I don't think so. Unless you came back from the future, nobody was doing that. For games of 2010, the 970 was no faster, and oftentimes a bit slower, due to lower clock speeds than the consumer quad cores. Most consumer software of the time could not use more than 4 cores, or even more than 2.

And while your 970 can still play games, I'd bet good money that a current-gen i3 quad-core would absolutely smoke it. Single-core performance has come a long way in the last decade, as has multicore performance. One can look up benchmarks for an i3-10320 and see it is ranked slightly higher than an i7-980.

I said the core count hadn't changed; I literally said slow increases per generation... Kentsfield to Haswell was 2006 to 2013 - and what, 5 generations?

And this is what we got



Then the change Zen managed in 2 gens... (including a 4-core equivalent and then GASP there is no Zen 3 4-core chip! How dare they progress!)


(images look different, got them from different sources, but both PassMark)
You suggested that the fundamental change in Intel's performance was due to DRAM speed increases rather than core changes. That you yourself have proven incorrect, as both Haswell and the Core 2 Quad can use DDR3. You said it, not me.

2006: You really need more than two cores on your daily home and gaming machine?

1981: You really need more than 640K of memory on your daily home and gaming machine?

What I'm saying is we need to keep an eye on what AMD is or isn't doing. Sure, eight cores may be overkill nowadays, but what about in a few years?

Intel perpetuated the idea that four cores were plenty from at least 2007 all the way up until Zen happened in 2017. Also notice that when Zen came out, all of a sudden, "regular" programs could take advantage of it.

I hate Intel as much as the next guy, but AMD isn't perfect either. They're both companies, and when they've gotten far enough ahead of their competitors, they'll slow down innovation until we get another Intel from Sandy Bridge to Kaby Lake situation.

The price increase Zen 3 received was the first red flag for me (but that didn't stop me from buying it!).
Well, when 8 cores finally isn't enough, AMD already has 12 and 16 cores available for you! And by that time, AMD will likely have higher-than-16-core parts available.

Funny how when the 2000 series came out and still featured 8 cores, nobody was losing their mind over AMD "stagnating", yet after the same amount of time at 16 cores, suddenly AMD is worse than Intel. This would be like people complaining that Intel was stagnating right after Sandy Bridge came out because it was still 4 cores, when the vast majority of games were still single-threaded. By the time it's relevant, this AM4/AM5 platform will be ancient history.
 
Joined
Oct 2, 2015
Messages
2,991 (0.96/day)
Location
Argentina
System Name Ciel
Processor AMD Ryzen R5 5600X
Motherboard Asus Tuf Gaming B550 Plus
Cooling ID-Cooling 224-XT Basic
Memory 2x 16GB Kingston Fury 3600MHz@3933MHz
Video Card(s) Gainward Ghost 3060 Ti 8GB + Sapphire Pulse RX 6600 8GB
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB
Display(s) AOC Q27G3XMN + Samsung S22F350
Case Cougar MX410 Mesh-G
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W
Mouse EVGA X15
Keyboard VSG Alnilam
Software Windows 11
What 11th gen disaster?

My 11700 uses 40-45 watts during gaming.
Try a K model with that new "last-ditch single-core boost" enabled: easily over 200W.
Regular models like the vanilla 11700 are fine; the moment Intel tries to force 14nm to be more than it is, you get CPUs that can't be cooled with a 360mm radiator.
 