
AMD Ryzen 12-Core, 24-Thread CPU Surges on SiSoftware Sandra

Raevenlord

News Editor
Joined
Aug 12, 2016
In an interesting development that lends some credence to reports of AMD's take on the HEDT market, it would seem that some Ryzen chips with 12 cores and 24 threads are making the rounds. Building an entire platform for a single processor would have been ludicrous; now, AMD seems to be readying a true competitor to Intel's X99 and its supposed successor, X299 (though AMD does have an advantage in naming, if its upcoming X399 platform really does ship under that name).





The CPU itself is an engineering sample, coded 2D2701A9UC9F4_32/27_N. Videocardz did a pretty good job of explaining what the nomenclature means, but for now, we do know this sample seems to be running at a 2.7 GHz base and 3.2 GHz boost clock (not too shabby for a 12-core part, but a little on the anemic side compared to previous reports of a 16-core chip from AMD that would run at a 3.1 GHz base and 3.6 GHz boost clock). What seems strange is the program's report on the available cache. 8x 8 MB is more than double what we would expect, considering that these 12-core parts probably make use of a die with 3 CCX's of 4 cores each, with 8 MB of L3 per CCX. So, 3 CCX's = 3x 8 MB, not 8x 8 MB, but this can probably be attributed to a software bug, considering the engineering-sample status of the chip.
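For a quick sanity check on the cache numbers above, here is a back-of-the-envelope sketch. It assumes 8 MB of L3 per Zen CCX (as on shipping Ryzen 7 parts); the function name is just for illustration:

```python
# Assumption: 8 MB of L3 per Zen CCX, as on shipping Ryzen 7 parts.
MB_PER_CCX = 8

def total_l3(ccx_count, mb_per_ccx=MB_PER_CCX):
    """Total L3 cache in MB for a chip with the given number of CCXs."""
    return ccx_count * mb_per_ccx

# A 12-core part could be 3 CCXs of 4 cores, or 4 CCXs of 3 cores:
print(total_l3(3))       # 24 MB
print(total_l3(4))       # 32 MB
# Sandra's reading of 8x 8 MB:
print(8 * MB_PER_CCX)    # 64 MB, double even the 4-CCX figure
```

Either plausible layout lands well short of the 64 MB Sandra reports, which is why a software misread is the most likely explanation.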

View at TechPowerUp Main Site
 
You might want to include a source link when you borrow their charts...
https://videocardz.com/67649/amd-ryzen-cpu-with-12-cores-and-24-threads-spotted

Strong, inappropriate wording there, when I mentioned their site in the article. The absence of a source link was no more than a forgetful moment. And you will likely be happy to know that I had already included the source before I even set eyes on your comment. Maybe fewer rocks on hand would have been nice.

Thank you, anyway. I'm sure that your words weren't meant to say what they mean.
 
Here's hoping they sell an 8-core part with higher clocks and more cache on this platform.
 
The CPU itself is an engineering sample, coded 2D2701A9UC9F4_32/27_N. Videocardz did a pretty good job of explaining what the nomenclature means, but for now, we do know this sample seems to be running at a 2.7 GHz base and 3.2 GHz boost clock (not too shabby for a 12-core part, but a little on the anemic side compared to previous reports of a 16-core chip from AMD that would run at a 3.1 GHz base and 3.6 GHz boost clock). What seems strange is the program's report on the available cache. 8x 8 MB is more than double what we would expect, considering that these 12-core parts probably make use of a die with 3 CCX's of 4 cores each, with 8 MB of L3 per CCX. So, 3 CCX's = 3x 8 MB, not 8x 8 MB, but this can probably be attributed to a software bug, considering the engineering-sample status of the chip.

Source: Videocardz

It will be 4 CCX's, each with 3 cores enabled, since this 12-core part is likely cut down from the 16-core part that's on the rumour mill.
 
It will be 4 CCX's, each with 3 cores enabled, since this 12-core part is likely cut down from the 16-core part that's on the rumour mill.

That's what it will be (no business sense in not harvesting the parts), but it still doesn't take into account such a difference in overall L3. The 16-core part will still have only 4 CCX's, ergo, 4x 8 MB L3. Where are the other 4x that Sandra seems to be reading?
 
That's what it will be (no business sense in not harvesting the parts), but it still doesn't take into account such a difference in overall L3. The 16-core part will still have only 4 CCX's, ergo, 4x 8 MB L3. Where are the other 4x that Sandra seems to be reading?

It is probably missing the entire second CPU.
 
Strong, inappropriate wording there, when I mentioned their site in the article. The absence of a source link was no more than a forgetful moment. And you will likely be happy to know that I had already included the source before I even set eyes on your comment. Maybe fewer rocks on hand would have been nice.

Thank you, anyway. I'm sure that your words weren't meant to say what they mean.

Actually, they were meant to say exactly what they say. Having been a tech writer/journalist myself for over a decade, it makes me furious when people post content without a link to the source; it takes so little time and effort to do it. I actually read through your article before posting my comment, and there was no link at that point. I also don't see what was inappropriate about what I said; it's such a little thing to do that so many people forget about. If this was something you discovered and a chart you put together, all credit to you, but in this case someone spent serious time doing the research and putting together that chart, and the least they can get is a link.
 
I would think they would make this 4 CCX's with 3 cores apiece. Easier, then, to make the 16-core variant.
 
I would think they would make this 4 CCX's with 3 cores apiece. Easier, then, to make the 16-core variant.

You know this as a matter of fact, or are you just making assumptions? If each core complex is four cores, wouldn't it make more sense to use three CCX's with four cores each, going by your logic?
 
Actually, they were meant to say exactly what they say. Having been a tech writer/journalist myself for over a decade, it makes me furious when people post content without a link to the source; it takes so little time and effort to do it. I actually read through your article before posting my comment, and there was no link at that point. I also don't see what was inappropriate about what I said; it's such a little thing to do that so many people forget about. If this was something you discovered and a chart you put together, all credit to you, but in this case someone spent serious time doing the research and putting together that chart, and the least they can get is a link.

And I understand and applaud that sentiment, and am totally in agreement. What was inappropriate is that you accuse someone of "stealing" when that isn't - at all - what happened. That's what's wrong. If that rubs you the wrong way like that, why not send a PM saying "Hey, did you forget the source link?" Or a post, asking "hey, you bananas, did you forget the source link?" That is helpful, and thoughtful, and respectful, whilst not marring someone's image with your own hasty, wrong, baseless, conclusions.

I always strive not to jump to conclusions - and above all, to not be unfair with baseless accusations. You didn't strive for those, and instead threw some rocks. I always re-read my posts after I put them up, and realized I didn't input the source link. I edited the piece with the appropriate, deserved, source link, before I ever set eyes on your comment. If you don't see why your words and the way you went on about this are wrong, well, I really don't have anything else to say.
 
And I understand and applaud that sentiment, and am totally in agreement. What was inappropriate is that you accuse someone of "stealing" when that isn't - at all - what happened. That's what's wrong. If that rubs you the wrong way like that, why not send a PM saying "Hey, did you forget the source link?" Or a post, asking "hey, you bananas, did you forget the source link?" That is helpful, and thoughtful, and respectful, whilst not marring someone's image with your own hasty, wrong, baseless, conclusions.

I always strive not to jump to conclusions - and above all, to not be unfair with baseless accusations. You didn't strive for those, and instead threw some rocks. I always re-read my posts after I put them up, and realized I didn't input the source link. I edited the piece with the appropriate, deserved, source link, before I ever set eyes on your comment. If you don't see why your words and the way you went on about this are wrong, well, I really don't have anything else to say.

Hence why I changed my wording to "borrowed", as it was poorly worded on my part when I posted it.
 
Warning: here comes the core-count race! :fear:
Good! The world needs to start using more cores and using them effectively. We've been sitting on dual cores too long.

Better software use of more cores, more performance out of same hardware. Intel or AMD fans, we will all benefit from this.

(This also reminds me of that time Qualcomm said that 8-core chips are pointless and stupid, only to launch an 8-core chip (4+4 in a big.LITTLE config) a year later.)
 
That's what it will be (no business sense in not harvesting the parts), but it still doesn't take into account such a difference in overall L3. The 16-core part will still have only 4 CCX's, ergo, 4x 8 MB L3. Where are the other 4x that Sandra seems to be reading?

Yeah sorry I'm posting from mobile so cutting your post down just to the ccx part was proving to be tricky.

I have no idea what's going on with the cache.
 
12 cores on a desktop platform. Very nice.
 
Good! The world needs to start using more cores and using them effectively. We've been sitting on dual cores too long.

Better software use of more cores, more performance out of same hardware. Intel or AMD fans, we will all benefit from this.

(This also reminds me of that time Qualcomm said that 8-core chips are pointless and stupid, only to launch an 8-core chip (4+4 in a big.LITTLE config) a year later.)

True. The fact that phones are already running on 8-10 cores makes me realize how badly desktops have been played by Intel's 4-core monopoly for the past few years.

Competition is indeed so important for any development.
 
True. The fact that phones are already running on 8-10 cores makes me realize how badly desktops have been played by Intel's 4-core monopoly for the past few years.

Competition is indeed so important for any development.

Phones are on 4 for power saving, 4 for performance, and iPhones are still extremely competitive with their dual core designs.

Phones really don't need a bazillion cores; it generally creates more problems than it solves.
 
Phones are on 4 for power saving, 4 for performance, and iPhones are still extremely competitive with their dual core designs.

Phones really don't need a bazillion cores; it generally creates more problems than it solves.

I can attest to that; I have a MediaTek deca-core CPU and it trips over itself trying to decide which set of cores to use, which ones to put in a ready state, and which ones to power down.

When it locks onto the performance cores it's fine, so video editing and prolonged CPU stress work great, but it's the general UI that really suffers.
 
Strong, inappropriate wording there, when I mentioned their site in the article. The absence of a source link was no more than a forgetful moment. And you will likely be happy to know that I had already included the source before I even set eyes on your comment. Maybe fewer rocks on hand would have been nice.

Thank you, anyway. I'm sure that your words weren't meant to say what they mean.

And I understand and applaud that sentiment, and am totally in agreement. What was inappropriate is that you accuse someone of "stealing" when that isn't - at all - what happened. That's what's wrong. If that rubs you the wrong way like that, why not send a PM saying "Hey, did you forget the source link?" Or a post, asking "hey, you bananas, did you forget the source link?" That is helpful, and thoughtful, and respectful, whilst not marring someone's image with your own hasty, wrong, baseless, conclusions.

I always strive not to jump to conclusions - and above all, to not be unfair with baseless accusations. You didn't strive for those, and instead threw some rocks. I always re-read my posts after I put them up, and realized I didn't input the source link. I edited the piece with the appropriate, deserved, source link, before I ever set eyes on your comment. If you don't see why your words and the way you went on about this are wrong, well, I really don't have anything else to say.

Don't you dare even make a hint of an error or you shall be stoned at dawn!

Nah, agreed, being unpleasant to a writer over perceived errors is completely uncalled for. If I notice an embarrassing typo or other gaffe, I'm more likely to send a PM than mention it in public, where it would have the effect of humiliation and unpleasantness.

I'd like to see what a 16 core, 32 thread Ryzen can do. That will certainly be interesting.
 
Yeah sorry I'm posting from mobile so cutting your post down just to the ccx part was proving to be tricky.

I have no idea what's going on with the cache.

It's most likely 4x 8MB, not 3x or 8x. SiSoft is probably reading it wrong...
 
Strong, inappropriate wording there, when I mentioned their site in the article. The absence of a source link was no more than a forgetful moment. And you will likely be happy to know that I had already included the source before I even set eyes on your comment. Maybe fewer rocks on hand would have been nice.

Thank you, anyway. I'm sure that your words weren't meant to say what they mean.

You have also forgotten a link here: https://www.techpowerup.com/231696/amds-elusive-polaris-12-makes-an-appearance-on-compubench and you ignored my hint to check if Polaris was LPP or LPE: https://www.techpowerup.com/forums/...igher-clocks-lower-power.231692/#post-3623708 (and you included the link to wccftech named BenchLife).

Please, take your time when writing news and always try to use multiple sources.
 
I don't think even the mainstream media does that nowadays.

Yes, that is probably true ... but that is also why doing a little bit of additional research provides a great opportunity to stand out from the crowd :)
 