• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

AMD Confirms Ryzen 9 7950X3D and 7900X3D Feature 3DV Cache on Only One of the Two Chiplets

My guess is the next X3D chips, in a year or two, will have cache on both CCDs. This is probably just a backup plan: if Intel comes out swinging again soon, AMD can add 3D cache to the other CCD and swing back to take the crown again.

lol. dumb, they should have just swung all the way and slam dunked.
That would put unnecessary pressure on AMD itself when it comes to Ryzen 8000. No need to narrow the performance gap needlessly when the competition can't keep up. This way, if Intel doesn't respond, Ryzen 8000 will look like it has a bigger performance gap.
 
No... games have been multithreaded for a decade now. Back in 2008, most games from Codemasters, Activision, etc. were already multithreaded, and from 2013/2014 on, almost every game has been multithreaded.
This comment reeks of ignorance. I'm guessing you're also one of the guys who swore up and down his Core 2 Quad was way faster in games than a Core 2 Duo, because muh corez.

First of all, in 2008 most games were NOT "multithreaded". Most games primarily used one core and offloaded some work onto a second. Dual-core Athlons and Core 2 Duos regularly outperformed quad-core Phenoms and Core 2 Quads. Go look back at some benchmarks and refresh your memory.

Today, again, "everything is multithreaded" is wrong. Not all games regularly push multiple cores, and of the (admittedly many) that do, few push past six. Usually only one or two cores are heavily loaded; the others are used, but nowhere near as heavily as cores 0-1. Games hardly take advantage of 8 cores, let alone 12-16.

There is no reason to have 3D cache on both chiplets; games do not use that many cores and likely will not for the foreseeable future. Games are not inherently multithreaded, and there is a limit to how much work you can split up.
 
This will be a nightmare for air coolers.
Why would that be the case? The 5800X3D is not that different from the 5800X as far as temps go (TPU shows 75°C vs 77°C). Seeing as these come with ECO mode on by default, it's more likely that temps will be lower vs. stock non-3DV CPUs.
 
Is this all just speculation or do we actually know this for a fact? It sounds like speculation to me given that there is no source. I'm kind of taking issue with how this is being written as if it were fact when it's really just a guess.
 
Second CCD looks like it will also have some silicon spacer on top to bring it to the same height as the cache die.
Temps will be interesting.

Edit: The first die with 3D cache gets thinned down to the original height, so a spacer shouldn't be needed.
 


GTA4 was the first game I remember that actually used a quad core, 2009


More than 8 threads are useless in gaming due to dependency chains
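The dependency-chain point is basically Amdahl's law. A tiny sketch makes it concrete (the 40% serial fraction below is a made-up illustrative number, not a measurement from any real game engine):

```python
# Amdahl's law: speedup(n) = 1 / (s + (1 - s) / n), where s is the
# fraction of each frame stuck on a serial dependency chain that
# can't be split across threads. The 0.4 is purely illustrative.
def amdahl_speedup(n_cores: int, serial_fraction: float) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

for n in (1, 2, 4, 8, 16):
    print(f"{n:2d} cores -> {amdahl_speedup(n, 0.4):.2f}x")
```

With 40% of the frame serialized, going from 8 to 16 cores buys only about 9% more throughput, which is why extra threads stop mattering long before core counts run out.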
 
I think the 3DV cache CCD will boost up to 5 GHz and the non-3DV CCD will boost like the normal chip, which makes the advertised boost a bit misleading. I don't think it will boost higher than 5 GHz in games; maybe 5.2 GHz? We will have to wait and see.

This is exactly what I was just thinking. It’s the only reason the boost on the 7800X3D would be so low with a single CCD.

Gaming performance should be relatively equal between these 3 SKUs then. The 7900/7950 3D variants will retain some of their better productivity performance with 1 CCD still being able to hit the non 3D variant boost clocks. So expect lower productivity performance unless any programs take advantage of the large cache.

At first glance it looks like there’s still a similar limitation to the Gen 1 3D design limiting clock speeds.
 
Why would that be the case? The 5800X3D is not that different from the 5800X as far as temps go (TPU shows 75°C vs 77°C). Seeing as these come with ECO mode on by default, it's more likely that temps will be lower vs. stock non-3DV CPUs.
Because one CCD will be hotter than the other?
 
At least we have AMD to thank for all of the innovation in desktop CPUs. From 2011 to 2018, Intel was just fine giving us the same quad-core chips for $800.

Unless you have stock in AMD, Intel, or NVIDIA, you want all of their products to be within 20% of each other; otherwise you get an Intel-and-NVIDIA situation.
 
Gaming CPU crown secured for AMD. And for much more sensible wattage.
 
No... games have been multithreaded for a decade now. Back in 2008, most games from Codemasters, Activision, etc. were already multithreaded, and from 2013/2014 on, almost every game has been multithreaded.
Irrelevant; most games still run a few threads that need sequential handling and are heavy on a single core. That's where the cache shines: feeding all that data quickly to the cores that need it.
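A crude way to see the "feeding data quickly" effect with nothing beyond the Python standard library: walk the same array sequentially and then in random order. The random walk defeats the hardware prefetcher and misses cache far more often. (Pure-Python timings carry lots of interpreter overhead, so treat this as a sketch of the idea, not a proper benchmark.)

```python
# Toy cache-locality demo: sum the same data sequentially vs. in a
# shuffled order. Same total work, very different memory behavior.
import random
import time

N = 2_000_000
data = list(range(N))

seq_order = list(range(N))
rand_order = seq_order[:]
random.shuffle(rand_order)

def timed_sum(order):
    start = time.perf_counter()
    total = 0
    for i in order:
        total += data[i]
    return total, time.perf_counter() - start

seq_total, seq_t = timed_sum(seq_order)
rand_total, rand_t = timed_sum(rand_order)
print(f"sequential: {seq_t:.3f}s  random: {rand_t:.3f}s")
```

On most machines the random walk is noticeably slower even through the interpreter; in a tight compiled game loop the gap is far larger, which is exactly the workload a big L3 helps.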
 
My only concern is how they are going to keep the right workload on the CCD with the 3DV cache.
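For what it's worth, on Linux you can pin a process to one CCD by hand. A minimal sketch using `os.sched_setaffinity`; the 0-15 range is an assumption about which logical CPUs map to the V-Cache CCD, so verify with `lscpu -e` on your own machine first:

```python
# Sketch: restrict the current process to one CCD (Linux only).
# ASSUMPTION: the V-Cache CCD is logical CPUs 0-15 (8 cores plus
# their SMT siblings). The real mapping varies; check `lscpu -e`.
import os

WANTED = set(range(16))

if hasattr(os, "sched_setaffinity"):  # Linux-specific API
    available = os.sched_getaffinity(0)
    # only ask for CPUs that actually exist on this machine
    target = (WANTED & available) or available
    os.sched_setaffinity(0, target)   # 0 = the current process
    print("pinned to CPUs:", sorted(os.sched_getaffinity(0)))
else:
    print("no sched_setaffinity here; not a Linux box")
```

The shell equivalent is `taskset -c 0-15 <game>`. Whether manual pinning beats whatever scheduling AMD ships is exactly what reviews will have to test.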
 
Intel still king in games
We will see shortly. I expect the 3D chips to be, on average, faster than Raptor Lake. This site compared the 13900k and 5800X3D when the 4090 came out, and the result was a narrow win for the 13900k.

 
We will see shortly. I expect the 3D chips to be, on average, faster than Raptor Lake. This site compared the 13900k and 5800X3D when the 4090 came out, and the result was a narrow win for the 13900k.

Because half the games tested are GPU-bound even on a 4090. You can check the games here


I know for a fact that my 13900K at stock with 7600 CL34 RAM is 15% faster than a maxed-out 12900K running 5.4 GHz all-core. I also know that said maxed-out 12900K is faster than the 5800X3D. Therefore, the difference between the 13900K and the 5800X3D can't be just 6% when fully CPU-bound.
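Taking those numbers at face value, the transitivity argument does check out. A quick sanity calculation (the margins are hypothetical, and the 5800X3D is normalized to 100 fps):

```python
# If a stock 13900K is 15% faster than a tuned 12900K, and the tuned
# 12900K is at least as fast as a 5800X3D, then the 13900K's fully
# CPU-bound lead over the 5800X3D must be at least 15%. A measured
# ~6% average therefore implies many tested games were GPU-limited.
X3D_FPS = 100.0                    # normalize the 5800X3D to 100 fps
for margin in (0.00, 0.05, 0.10):  # hypothetical 12900K lead over the X3D
    tuned_12900k = X3D_FPS * (1 + margin)
    stock_13900k = tuned_12900k * 1.15
    lead = stock_13900k / X3D_FPS - 1
    print(f"12900K +{margin:.0%} over X3D -> 13900K lead: {lead:.1%}")
```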
 
I can't really vote on the poll. I kind of like this because cache is used mostly for games, which don't use more than 8 cores anyway, and run better on a single CCD as well. But that makes the existence of dual-CCD versions questionable. Who are they for?
 
Because one CCD will be hotter than the other?
Unlikely. It looks like the 3DV cores will be running at a lower frequency than the normal ones (the reason the 7900X3D and 7950X3D have way higher boost clocks than the 7800X3D), so the temps should balance out.
 
Wish they could sell the 7800X3D without a stock cooler :(
Hardly a deal breaker...

Sounds like gaming performance is going to be all over the place with CPPC disabled: one thread on the CCD with the 3D cache, another on the CCD without it, and so on.
CPPC or bust: the CPU
I don't think it will work like that at all.
 
Unlikely. It looks like the 3DV cores will be running at a lower frequency than the normal ones (the reason the 7900X3D and 7950X3D have way higher boost clocks than the 7800X3D), so the temps should balance out.
I hope so.
 
I don't think it will work like that at all.
Agreed. We've had preferred cores/threads since Ryzen 3000 (I think) and Intel 11th gen. Windows knows which threads are meant for foreground tasks and which for background ones.
 
As long as AMD is not stupid with pricing these will sell like hotcakes. Even though that analogy is not what it once was.
 
What I don't understand is the logic behind the naming (as usual with AMD). We had the 5800X and 5800X3D, so this time they avoided that confusion by having a 7700X and a 7800X3D, which is fine. But then we have a 7900X and 7900X3D and a 7950X and 7950X3D, which is inconsistent. Also, what's with the rumored 10-core 7800X? Is it not a thing anymore?

Edit: IMO, it would be much simpler if we had xx00 numbers for normal CPUs, and xx50 for chips with the 3D cache. x600 could be 6-core, x700 8-core, x800 10 or 12-core and x900 16-core, and the xx50 would be the X3D.
 
Probably too rich for my blood. But happy they released it.
 