
AMD Readies Ryzen 9 3950X 16-core Processor to Awestrike Crowds at E3

Not sure if we need 16 cores in a desktop chip. 4 still does okay, 6 seems to be the sweet spot, and 8 is top end/overkill, unless you actually use HEDT programs.
 
Word from the press briefing before the E3 presentation is that AMD ran the CPU comparisons with:

Intel = No mitigation
Great if true; however, all OS & BIOS mitigations should be enabled by default. I wouldn't go out of my way to disable HT, but I also won't disable the latest security patches, and I suspect most users are the same way.
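For anyone curious whether their own box actually has mitigations on, the Linux kernel reports per-vulnerability status through sysfs. A minimal sketch (Linux-only; this directory is a standard kernel interface on 4.15+ kernels):

```python
# Minimal sketch: print the Linux kernel's reported CPU mitigation status.
# Each file under this sysfs directory names a vulnerability and its state.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")

for entry in sorted(vuln_dir.iterdir()):
    print(f"{entry.name:<24} {entry.read_text().strip()}")
```

Lines reading "Vulnerable" or "SMT vulnerable" are the ones a "no mitigation" benchmark setup would leave exposed.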
 
Not sure if we need 16 cores in a desktop chip. 4 still does okay, 6 seems to be the sweet spot, and 8 is top end/overkill, unless you actually use HEDT programs.
PS5 has "8 core Zen v2" .
 
I personally don't understand why we're limiting ourselves to low TDP values in desktops; it's not like we're dealing with notebooks, where you have limited cooling capacity and power usage limits. This is the desktop platform, the sky is the limit. If the processor needs more power, there are 1000-watt power supplies. If the processor needs more cooling then, you know, go balls to the wall with a massive liquid cooling system.

We're not; we can OC, can we not? And you can go balls to the wall, it just doesn't really pay off.

Turn the argument around: I do prefer my CPUs to be air cooled and still run close to max capacity ;) It wouldn't be good if you had to add half the price of the CPU in cooling to make it work properly, would it?
 
Intel will not have anything but 14nm refreshes on desktop until 2022 based on their own roadmap: Comet Lake up to 10c/20t next year (and a new socket again) and Rocket Lake probably 12c/24t in 2021.

Intel is stuck between a rock and a hard place right now.
Yet the main reason AMD is gaining market share is shortages of Intel chips.
 
I think we can agree that the Intel shortages will be just about over after this E3 keynote :-).
 
Has anyone figured why it's:

3700X, 3800X, 3900X, 3950X

instead of:

3700X, 3750X, 3800X, 3900X ???

It's an AMD thing to create a xx50 model that's a step above the rest. It's like Nvidia's "Ti" designation. The 3800X isn't enough of a step above the 3700X to get that branding.
 
599? take my money pls.
 
Sorry you lost me lol.

I meant: why does AMD make it look like there's a bigger step between the 3700X and 3800X, both 8C, while using such a small increment in model numbers between the 12C and the 16C?

Edit: Or are you actually saying that AMD has changed the model numbers after Computex? That list looks old..

Not a clue on their naming scheme. Here is the updated list from PC Gamer:

  • Ryzen 9 3950X—16C/32T, 3.5GHz to 4.7GHz, 72MB cache, 105W TDP, $749 (in September)
  • Ryzen 9 3900X—12C/24T, 3.8GHz to 4.6GHz, 70MB cache, 105W TDP, $499
  • Ryzen 7 3800X—8C/16T, 3.9GHz to 4.5GHz, 36MB cache, 105W TDP, $399
  • Ryzen 7 3700X—8C/16T, 3.6GHz to 4.4GHz, 36MB cache, 65W TDP, $329
  • Ryzen 5 3600X—6C/12T, 3.8GHz to 4.4GHz, 35MB cache, 95W TDP, $249
  • Ryzen 5 3600—6C/12T, 3.6GHz to 4.2GHz, 35MB cache, 65W TDP, $199
  • Ryzen 5 3400G—(Zen+) 4C/8T, 3.7GHz to 4.2GHz, 6MB cache, Vega 11 Graphics at 1400MHz, 65W TDP, $149
  • Ryzen 3 3200G—(Zen+) 4C/4T, 3.6GHz to 4.0GHz, 6MB cache, Vega 8 Graphics at 1250MHz, 65W TDP, $99
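Purely as a back-of-the-envelope on that list, here's the price-per-core math people tend to weigh (Python; launch prices from the PC Gamer figures above, nothing else assumed):

```python
# Back-of-the-envelope using the launch figures quoted above.
# Prints price per core and per thread for the CPU-only (non-APU) parts.
chips = [
    ("Ryzen 9 3950X", 16, 32, 749),
    ("Ryzen 9 3900X", 12, 24, 499),
    ("Ryzen 7 3800X", 8, 16, 399),
    ("Ryzen 7 3700X", 8, 16, 329),
    ("Ryzen 5 3600X", 6, 12, 249),
    ("Ryzen 5 3600", 6, 12, 199),
]

for name, cores, threads, price in chips:
    print(f"{name}: ${price / cores:.2f}/core, ${price / threads:.2f}/thread")
```

By that crude metric the 3900X and 3700X land in the same ~$41/core territory while the 3950X carries a premium, which fits the "sweet spot" talk further down the thread.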
 
That's kind of a... weird way of looking at it. I mean, if you're an enthusiast sure, you can throw power requirements and cooling requirements out the window. Have your 1000w power supply and your water chiller. But neither Intel nor AMD is selling chips that require water chillers and go way past the point of diminishing returns. 99% of desktop users are fine with a 65w or less processor, if they even use a desktop anymore. Laptops and smartphones are where it's at for a lot of people these days.
No, it's not weird at all. It's what an enthusiast would say versus someone else who would rather have maximized performance-per-watt over maximized performance. And, even environmentalists can be enthusiasts because if they're really committed they can set up a solar array with batteries so that their gaming PC doesn't generate much carbon.

Thing is, AMD under Su has been really big on small dies to maximize profits. Even Radeon VII has a small die. Small dies are rarely the optimal path for enthusiast performance. Of course, having a large die doesn't guarantee outstanding performance either (as we saw with Fiji). If those dies are full of irrelevant or quasi-relevant stuff like Tensor cores, then maybe what you're getting is help with hot spots more than the highest performance. (Or, if the architecture has a bottleneck...) It does seem to make some sense, though, to go to chiplets to spread heat out now that we're at 7nm, despite the latency. However, if we were talking about best performance we would be far nearer the reticle limit. Enthusiasts should be wary of getting excited about expensive products like Radeon VII that ship with small dies and high clocks. That's probably not optimal for performance-per-watt, and it's not optimal for pure performance either. What it is optimal for is margin for the company peddling it. Small dies with high stock clocks are great for planned obsolescence.

Hyperbole isn't very useful in this subject area, either. 105 watts is very little. Even if we were to see a return to the 250 watt 9590 (it used more than the 220 it was rated for), that would hardly be outlandish compared to what high-performance GPUs use. This is an industry where people used to run triple SLI; AnandTech and others included triple GTX 480 and 580 systems in their benchmarks. I am reminded of the sudden obsession with power consumption that happened with televisions once marketers started pushing LED-backlit TVs. Suddenly, it was utterly shameful not to have the most power-sipping LED technology, and anyone buying a CCFL model clearly wanted to drag their knuckles and doom the planet. It's fascinating to see how marketing pressure can change the narrative so rapidly and effectively.
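To put rough numbers on that comparison, here's a trivial bit of Python using the vendors' rated figures (rated TDPs/board power only, not measured draw):

```python
# Rough perspective math: rated CPU TDPs next to the GPU power budget of a
# triple GTX 480 setup (official 250 W board power per card).
gtx_480_board_power = 250
triple_sli = 3 * gtx_480_board_power  # 750 W of GPUs alone

for name, tdp in [("Ryzen 9 3950X (rated)", 105), ("FX-9590 (rated)", 220)]:
    print(f"{name}: {tdp} W, {tdp / triple_sli:.0%} of the triple-SLI GPU budget")
```

Even the 9590's rated 220 W is under a third of what that kind of GPU setup was budgeted for.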

If people are really so interested in performance-per-watt, then why enable companies to slap 3 fans and a high voltage on a 1660, chasing a clock that's well outside the efficient range of the chip? People buy cards like this. In fact, I don't think there's a single 3rd-party GPU with decent cooling that doesn't ship with some amount of overclock out of the box. GPU manufacturers know that enthusiasts care more about performance than they care about performance-per-watt efficiency. If Fiji, Vega, and Polaris had been able to lead in raw performance, then sales would have been considerably higher despite the worse performance-per-watt (outside the rare games that are more efficient on recent GCN). I have no doubt that a 500 watt GPU that led Nvidia's top card by 25%, or one that equalized performance at half the price, would sell very well.

Server CPUs typically have lots of cores with high performance-per-watt and a low TDP. Enthusiasts have been sold server-centric designs by AMD before. Remember Bulldozer/Piledriver? 8 cores with weak FPU performance back in 2011/2012 was definitely not optimal for gamers. It was, though, more useful for servers. What many don't realize is that AMD sold a good number of Piledriver Opterons for supercomputers. Those machines mainly used GPUs for the heavy lifting, making the cheapness of Piledriver attractive. Yes, Intel's Xeon had better performance-per-watt than Bulldozer/Piledriver, so AMD didn't do so well in the server market. (That doesn't mean, though, that the design wasn't more server-oriented than it should have been.) The supercomputer market cared less about power consumption, although that's becoming more of a marketing focus lately. AMD pioneered the many-small-cores design philosophy with Bulldozer, and what we're seeing with Ryzen is an extension of that philosophy, only without CMT.

Something like a 180 watt TDP would be perfectly reasonable for a 16 core 32 thread desktop CPU. 105 seems too low to maximize performance. It also seems like a way to let mediocre board VRMs take less heat from customers. This way, AMD props up the practice of selling "8 phase" boards that are really 4, and so on.
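To make the VRM point concrete, here's a rough sketch; the 1.3 V core voltage and 90% conversion efficiency are my assumptions for illustration, not anyone's board spec:

```python
# Illustrative only: how package power maps onto per-phase VRM current.
# Vcore and efficiency below are assumed values for the sketch.
def per_phase_current(package_watts, true_phases, vcore=1.3):
    # Output-side current the phases share: I = P / Vcore.
    return package_watts / vcore / true_phases

def vrm_loss(package_watts, efficiency=0.90):
    # Heat dissipated in the VRM itself at the assumed efficiency.
    return package_watts * (1 / efficiency - 1)

for watts in (105, 180):
    for phases in (4, 8):
        print(f"{watts} W over {phases} true phases: "
              f"{per_phase_current(watts, phases):.1f} A/phase, "
              f"~{vrm_loss(watts):.0f} W of VRM heat")
```

By that crude math, a 180 W part carries roughly 70% more per-phase current and VRM heat than a 105 W part, which is exactly where a 4-phase board dressed up as 8 starts to struggle.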

However, all this said, I like performance-per-watt a lot because it makes cooling without a high level of noise easier. Regardless, though, quiet cooling and high performance are at odds. The enthusiast point of view typically favors the latter, unless that person has deep enough pockets to offset the higher heat with better dissipation hardware.

If you think it's weird to advocate for higher performance (higher TDP) over something very conservative like 105 watts, consider the explosion of AIOs in popularity (not their penchant for literally exploding, as happened with my H50). Dual fan AIOs are pretty standard these days for enthusiasts and their CPUs. Why shouldn't they be running cheaper air coolers instead? They're running dual and triple fan AIOs to gain performance.
We're not; we can OC, can we not?
How successful that is depends on design factors like power delivery robustness. This can be seen, for instance, with GPUs that can't clock any further because of a weak VRM and/or too few or too small power connectors. How easy it is also depends on various decisions. I think most anyone would rather have a CPU perform higher at stock so their overclocking effort is less work. There are also people who don't want to or who can't overclock (work computers).

If you release low-TDP parts across the board, then board makers can sell their fake/weak VRMs and enthusiasts will be stuck paying an extra premium for the few top-end boards that actually have good power delivery. If your CPUs are more demanding out of the box, though, there is less room for fakery.

There is also the potential issue of the number of PCB layers. I have read that 6 layers are really needed for PCIe 4.0, and yet Gigabyte is apparently going to be selling 4-layer X570 boards. Perhaps one reason for the 105 watt TDP is to make it possible to cut down on the number of layers needed; I don't know if there is a connection.
 
@RichF that's a little over-reaching, isn't it? Sure, you've got people that don't care if their cards draw 500w and require exotic cooling, but you've got plenty of people who do. Remember how bad Fermi "Thermi" got slammed for being hot, power hungry and loud? How about the Radeon 2900XT?

All I'm saying is I don't see a point in either Intel or AMD releasing some crazy 300w CPU that requires water cooling out of the box. If you are an enthusiast, you buy the K series CPU (or HEDT model) and you do that yourself.
 
If you want a toasty CPU then I am guessing wait for the next Threadripper. Here they are wedging a unicorn 16/32 into the desktop AM4 platform. Looks like the 12-core will be the performance sweet spot. Also, that 65w 8-core seems like nice firepower in a great power envelope. I suspect the 3950X will be more of an enthusiast/hobbyist CPU.
 