
AMD Ryzen 9 3000 is a 16-core Socket AM4 Beast

It's a common misunderstanding that poor multicore scaling is primarily down to a lack of good software. I explained this some more here.
TL;DR: most real-world tasks can't scale across an arbitrary number of cores, so unless you're running more tasks, or running typical server workloads, more and more cores will only give you diminishing returns, and even lower performance if at some point you have to sacrifice per-core performance for more cores.

Single-core performance is essential and will only become more important in the coming years, even for processes that use many threads, due to synchronization overhead. But the clock speed race seems to be nearly over, so future gains will have to come from IPC increases.
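The diminishing-returns point can be made concrete with Amdahl's law: if a fraction p of a task parallelizes, speedup on n cores is capped at 1/((1-p)+p/n). A minimal Go sketch, purely illustrative (the 90% figure is an assumption, not from the thread):

```go
package main

import "fmt"

// amdahl returns the theoretical speedup on n cores for a task
// whose parallelizable fraction is p (0 <= p <= 1), per Amdahl's law.
func amdahl(p float64, n int) float64 {
	return 1.0 / ((1.0 - p) + p/float64(n))
}

func main() {
	// Even with 90% of the work parallelizable, 16 cores yield
	// only ~6.4x, and doubling to 32 cores adds little on top.
	for _, n := range []int{2, 4, 8, 16, 32} {
		fmt.Printf("p=0.90, n=%2d -> speedup %.2fx\n", n, amdahl(0.90, n))
	}
}
```

Sacrificing clock speed for core count shifts the whole curve down, which is the "even lower performance" case mentioned above.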
 
Unfortunately, understanding all that requires programming knowledge, which most people don't have.
At some point I crossed paths with a guy with several years of programming experience, rather well regarded within his team. When tasked with something that required a mild amount of concurrency, he said, "I'm going to need some time to get familiar with this threading thing." So if a guy who programs for a living can say that, good luck explaining cores and threads to the layman.
 
Sure. My main point is that it's not just a lack of willingness to adopt multithreading; hard problems are actually hard to solve.

Your story doesn't surprise me at all. Of all the programmers I've dealt with over more than a decade and a half, probably fewer than 5% are at the level of competence needed to deal with problems this complex. Even a typical programmer with 10 years of experience wouldn't fully grasp the problem, even if it were explained in detail. The vast majority of programmers (web developers, app developers, and most of those writing enterprise software in Java or C#) don't touch anything this low-level, so they never develop an understanding of how it works.
 
Well, if you think about it, higher-level languages actually trivialize multithreading (think Java's executors, Erlang's spawn, or Go's goroutines). The fact that, even with this help, programs don't fully load all cores is further proof that the things we routinely do don't really need that many cores.
On the other hand, if we programmers would bubble sort and brute force everything, the whiny bunch would actually be much happier. Their cores would suddenly be seeing 100% usage.
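The "goroutines make spawning trivial" point can be shown in a few lines of Go. This is an illustrative toy (the `squares` function is made up for the example); the mechanics really are this simple, even though deciding whether the work benefits is the hard part discussed above:

```go
package main

import (
	"fmt"
	"sync"
)

// squares computes i*i for i in [0, n) using one goroutine per
// element. Spawning is a one-liner; the WaitGroup joins them.
func squares(n int) []int {
	var wg sync.WaitGroup
	out := make([]int, n)
	for i := 0; i < n; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			out[id] = id * id // stand-in for real work; each goroutine writes its own slot
		}(i)
	}
	wg.Wait() // spawn-and-join is all the "threading" needed here
	return out
}

func main() {
	fmt.Println(squares(4)) // [0 1 4 9]
}
```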
 
This only really works if you're writing something that spawns independent worker threads, and each of them does a fairly large chunk of work (so the average overhead becomes small). This mostly applies to typical server workloads and is hard to apply efficiently in normal desktop applications.
Some languages have ways to distribute functions across several worker threads, but that usually creates more synchronization overhead and problems than it solves.
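The "fairly large chunk of work" point can be sketched in Go: rather than one goroutine per element, split the input into a few big contiguous chunks so the spawn-and-sync cost is paid once per chunk, not once per element. `chunkedSum` is a hypothetical example, not code from the thread:

```go
package main

import (
	"fmt"
	"sync"
)

// chunkedSum splits data into nWorkers contiguous chunks so each
// goroutine processes a large block; per-goroutine overhead is
// amortized over the whole chunk. Each worker writes only its own
// partial slot, so no locking is needed until the final reduce.
func chunkedSum(data []int, nWorkers int) int {
	partial := make([]int, nWorkers)
	var wg sync.WaitGroup
	chunk := (len(data) + nWorkers - 1) / nWorkers
	for w := 0; w < nWorkers; w++ {
		lo, hi := w*chunk, (w+1)*chunk
		if lo > len(data) {
			lo = len(data)
		}
		if hi > len(data) {
			hi = len(data)
		}
		wg.Add(1)
		go func(w, lo, hi int) {
			defer wg.Done()
			for _, v := range data[lo:hi] {
				partial[w] += v
			}
		}(w, lo, hi)
	}
	wg.Wait()
	sum := 0
	for _, p := range partial {
		sum += p
	}
	return sum
}

func main() {
	data := make([]int, 1000)
	for i := range data {
		data[i] = i
	}
	fmt.Println(chunkedSum(data, 4)) // 499500
}
```

With tiny per-element tasks instead, scheduling and synchronization can easily cost more than the work itself, which is the overhead problem described above.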
 
Details... :p

Still waiting for software to manage...whatever that means on the programming side. :)
 
Even so, you can have an architect or tech lead come up with the blueprints. My point was that, from a programming point of view, threading has become pretty trivial. Using threading to speed things up, like you point out, is a whole other story: not everything becomes faster because you spread the load over more threads. You can split off a game's AI and let it run amok, but unless that game happens to be chess, it still has to sync up with user input and whatnot.
Now let's try to get back on topic ;)
 
So octa-cores are OK now that they're mainstream, but any more is a no-no. Gotcha. Let us know when it's OK by you for us to use more than 8 cores.
 
Buying CPUs by core count while disregarding your actual needs is a no-no, but that is seemingly also beyond your comprehension.
Where the hell did you pull that 8-core limitation from anyway?
 
You could say the same about any high-end computer component purchase, when you can buy something for a fraction of the price that will still do the same job, or about the guys with 32/64 GB of RAM when 8/16 GB is sufficient for 90% of people. That's the key: 16-core CPUs won't be bought by the 90%. I won't buy one; I have 6c/12t and that's plenty for me, yet I don't try to tell other people what they should buy based on my use. I don't see what the big deal is. If you don't see a need for it, then you won't buy it, simple as that. Of course there are those with more money than sense who will buy it just because, but that's no different from how it's ever been, regardless of whether it's 8/16/32 cores or whatever.

The real winners here, and I've said this before, are the cheap quad/hex/octo w/SMT. Much more than that, few need to care.
And sorry, I'm still catching up with the thread; I was replying to this comment from the previous page. My bad.
 
So octa-cores are OK now that they're mainstream, but any more is a no-no. Gotcha. Let us know when it's OK by you for us to use more than 8 cores.
Jesus, this guy can't see the forest for the trees. :(

Edit: or just not finishing the thread before replying... as I just did, seeing that comment. Haha!

As I've said before, what this does is set up the lemmings and those not in the know (95% of people) to think more cores are better. And to an extent, that is true. But for most users a 6c/12t CPU is plenty and doesn't bottleneck anything (and won't for years)... so yes, I'm annoyed the mainstream is packing in cores. At least with clock speeds, EVERYONE benefits. Cores... few do.
 
Honestly, if this pans out to be true, I might consider going with this 16c chip. I'm already at the point of thinking about upgrading since X79, at this point, is a pretty dated platform and I wanted to replace it with a 2950X, but if I can get 16c/32t for 500 USD instead of 800, I'm all for it (forget the cost of a TR motherboard.) In reality I don't need all of that PCIe goodness and if dual-channel DDR4 can keep at least 12c/24t fully fed under memory intensive tasks, then I think I found my next upgrade.

I've waited this long, so let's just wait and see what happens. :D
 

Expect X570 boards to be pricey. This might be a reason why AMD is keeping the CPU prices on the low. Don't expect the exact pricing from AdoredTV's leak though, as prices have changed since then. Two weeks to go...
 
Expensive, sure, but TR expensive? Probably not. We'll know soon enough. :D
 
Will the 16-core be released later, or together with the other CPUs?
 
What is driving the price of X570 boards up so much? I guess we can use previous-generation boards if it's that bad?
 
Wait for the BIOS updates.
 

Possibly the lack of volume of X570 chipsets? I've read that Zen 2 was delayed to wait for the chipset, because the CPUs themselves had been ready for a while, but I'm not 100% sure whether that's true.

OTOH, it could very well be board makers attempting to squeeze us consumers ...
 
Expensive, sure, but TR expensive? Probably not. We'll know soon enough. :D

Probably not, but expect a $30-50 premium over current boards, if the board makers don't shoulder any of the cost.

Will the 16-core be released later, or together with the other CPUs?

Unclear. AMD hasn't told the board makers the launch lineup yet. But it's correct that there are 12- and 16-core parts with the board makers, as per various rumours.

What is driving the price of X570 boards up so much? I guess we can use previous-generation boards if it's that bad?

The extra parts needed for PCIe 4.0. A full PCIe 4.0 board needs two sets, one set at minimum, and these are in addition to the cost of the chipset itself. It seems there's no issue using current boards, though; you just don't get the new board-related features.

Possibly the lack of volume of X570 chipsets? I've read that Zen 2 was delayed to wait for the chipset, because the CPUs themselves had been ready for a while, but I'm not 100% sure whether that's true.

OTOH, it could very well be board makers attempting to squeeze us consumers ...

No squeezing, it's all about PCIe 4.0. Extra re-drivers and re-timers are needed, and these are costly for PCIe 4.0 right now since very few products need them. That will likely change over time.
 
I don't see 16-core CPUs as a bad thing; parallel processing is the only way to go from here until someone figures out how to overcome the current limitations. It seems the chip makers have already pushed the envelope about as far as they can, so IPC and more cores are the only ways for them to sell and market something new.
Things get hairy after 5 GHz-ish; efficiency drops off remarkably after that. Test results repeatedly show this.
Just remember: we would all still be on 4 cores if AMD hadn't forced 6- and 8-core CPUs into the mainstream.
AMD doing well is good for everybody, AMD and Intel fans alike.
I'm all Intel myself now, but if AMD puts out a good CPU at a good price, I could make use of all those extra cores.
There are a lot of people who would say the same.
 
I need this budget-priced 16c/32t CPU for some hobbyist Linux box fun!
 
Probably not, but expect a $30-50 premium over current boards, if the board makers don't shoulder any of the cost.
To get 16c/32t for 300 USD less than the 2950X? I could live with that.
 
Just remember: we would all still be on 4 cores if AMD hadn't forced 6- and 8-core CPUs into the mainstream.
AMD doing well is good for everybody, AMD and Intel fans alike.
I'm all Intel myself now, but if AMD puts out a good CPU at a good price, I could make use of all those extra cores.
There are a lot of people who would say the same.

While AMD did achieve something great with the release of their multi-core CPUs, it was Intel who first brought a quad-core CPU to market.
And while I agree that AMD capitalized on the market by producing 6- and 8-core CPUs, I don't think we'd still be stuck on four cores in the mainstream. Could be wrong though...
 