
AMD Ryzen 7000X3D Announced, Claims Total Dominance over Intel "Raptor Lake," Upcoming i9-13900KS Deterred

Third image footnote: "All prices are suggested online retailer price (SEP) in US dollars"

...yet no prices to be seen. What did I miss?
 
I notice on AMD's site, the 7950X3D says "Native PCIe® Lanes (Total/Usable) 28/23",
which is less than the 7950X's 28/24.

How impactful is this?
 
I am sure the R9 7950X3D will be the most successful processor in games.
 
I notice on AMD's site, the 7950X3D says "Native PCIe® Lanes (Total/Usable) 28/23",
which is less than the 7950X's 28/24.

How impactful is this?
I'm pretty sure that's a typo. The unusable lanes are for the chipset and there is no such thing as a PCIe x5 interface.
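Quick sanity check on the lane maths, with an assumed split of the usable lanes (my illustration, not AMD's published allocation):

```python
# Illustrative sketch (assumed split, not an official AMD breakdown):
# Raphael exposes 28 PCIe lanes; 4 of them form the chipset downlink,
# so 24 remain usable for the x16 slot and two x4 M.2 ports.
total_lanes = 28
chipset_link = 4                       # x4 link to the X670/B650 chipset
usable = total_lanes - chipset_link    # -> 24, matching the 7950X's 28/24 listing

slots = {"x16 PEG slot": 16, "M.2 #1": 4, "M.2 #2": 4}
assert sum(slots.values()) == usable   # 23 would leave an impossible x3/x5 link somewhere
print(usable, slots)
```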
 
Would love to see the 7950X3D with the 3D cache on the best CCD... imagine 3D cache with a 5.7 GHz single-thread boost: I'd disable the non-3D-cache CCD all day long, lol
 
Imagine a 5800X3D at 5 GHz on even a single core.
That's exactly what has me salivating, lol. I had a 5800X until very recently, and I thought it would be a while longer before the 7000X3D parts, so I bought a 5800X3D because I do a lot of VR flight simulator stuff and other things it benefits... now with this announcement I'm about to buy an AM5 setup and camp outside for a 7800X3D, haha.
 
Very reliable result, the 12700k has better lows than the 13900k, lol.
Exactly.
Because the 13900K throttles a lot harder, while the 12700K doesn't run into its TDP and thermal limits as often.

A CPU that never hits its throttle points (like when an all-core OC is used) performs consistently, but a CPU that relies on very drastic differences between base clock and boost clock is going to have those lows any time a limit is reached, be it thermal or PL1/PL2 related.
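If it helps, here's a toy model of what I mean; every number in it is made up, it's just to show why a chip that swings between boost and a power/thermal cap can post worse 1% lows than one that never hits its limits:

```python
# Toy model (all numbers hypothetical) of why a CPU that relies on big
# boost-vs-base clock swings shows worse 1% lows than one that never hits
# its limits, even when its average FPS looks higher.
def simulate(frames=1000, boost_fps=170.0, limited_fps=100.0,
             boost_frames=200, limited_frames=50):
    """Boost until the power/thermal budget runs out, then fall back, repeat."""
    cycle = [boost_fps] * boost_frames + [limited_fps] * limited_frames
    return [cycle[i % len(cycle)] for i in range(frames)]

def average(fps):
    return sum(fps) / len(fps)

def one_percent_low(fps):
    worst = sorted(fps)[: max(1, len(fps) // 100)]   # slowest 1% of frames
    return sum(worst) / len(worst)

steady = [150.0] * 1000        # e.g. an all-core OC that never throttles
bursty = simulate()            # boosts hard, then hits a PL1/PL2 or thermal cap

print(f"steady: avg {average(steady):.0f}, 1% low {one_percent_low(steady):.0f}")
print(f"bursty: avg {average(bursty):.0f}, 1% low {one_percent_low(bursty):.0f}")
```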
 
Exactly.
Because the 13900K throttles a lot harder, while the 12700K doesn't run into its TDP and thermal limits as often.

A CPU that never hits its throttle points (like when an all-core OC is used) performs consistently, but a CPU that relies on very drastic differences between base clock and boost clock is going to have those lows any time a limit is reached, be it thermal or PL1/PL2 related.
You think the 13900K throttles in Far Cry 6? LOL. Man, what the actual heck. This is just trolling; I don't believe for a second you believe what you just said. It barely draws 100 watts in Far Cry 6, but somehow it's throttling! There is not a single game in existence that makes the 13900K throttle due to either power or thermal limits.
 
I'm good; my 5800X3D will carry my 4090 just fine for the next few years, especially since I use DLDSR 2.25x to get my games rendered at 4K.

Waiting for Zen 5 3D V-Cache.
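For anyone wondering about the DLDSR bit: the 2.25x factor scales the total pixel count, so from a 1440p display (my assumption here) the internal render resolution lands exactly on 4K. A quick back-of-the-envelope check:

```python
# DLDSR factors scale the total pixel count, i.e. each axis by sqrt(factor).
# Assuming a 2560x1440 display (my assumption, not stated in the post),
# the 2.25x factor lands exactly on 3840x2160.
import math

def dldsr_resolution(width, height, factor=2.25):
    scale = math.sqrt(factor)                  # 1.5 per axis for the 2.25x preset
    return round(width * scale), round(height * scale)

print(dldsr_resolution(2560, 1440))            # (3840, 2160) -> "rendered at 4K"
print(dldsr_resolution(2560, 1440, 1.78))      # the other DLDSR preset, for comparison
```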
 
I'm good; my 5800X3D will carry my 4090 just fine for the next few years, especially since I use DLDSR 2.25x to get my games rendered at 4K.

Waiting for Zen 5 3D V-Cache.
I am anxiously awaiting reviews for them before I truly decide, but if I don't love what I see then I'll be doing the same as you (same system specs) and be more than happy.

I hope there's a decent little bump, though. We'll see...
 
I am anxiously awaiting reviews for them before I truly decide, but if I don't love what I see then I'll be doing the same as you (same system specs) and be more than happy.

I hope there's a decent little bump, though. We'll see...
I don't have a real reason to swap out my 5800X3D considering it's better than a 12900KS and I don't gain much since my games are 4K rendered, as I mentioned. Based on TechPowerUp's reviews of 53 games with the 4090 paired with a 5800X3D vs 13900K, the 13900K is on average 4.7% faster... hurray. And the platform cost of AM5 and these new X3D CPUs is not something I want to dabble in right now.

It will do just fine until Zen 5 3D V-Cache, which is when I'll do a complete system rebuild and get a 5090 or something, provided there's stock. If not, I'll keep my 4090.
 
I don't have a real reason to swap out my 5800X3D considering it's better than a 12900KS and I don't gain much since my games are 4K rendered, as I mentioned. Based on TechPowerUp's reviews of 53 games with the 4090 paired with a 5800X3D vs 13900K, the 13900K is on average 4.7% faster... hurray. And the platform cost of AM5 and these new X3D CPUs is not something I want to dabble in right now.

It will do just fine until Zen 5 3D V-Cache, which is when I'll do a complete system rebuild and get a 5090 or something, provided there's stock. If not, I'll keep my 4090.
The 3D isn't better than the 12900K or the KS. It just isn't.
 
The 3D isn't better than the 12900K or the KS. It just isn't.
That's why it tops it in most games and especially the games I play, while costing much less, drawing MUCH less power, and being able to do so with much crappier memory, right?

Of course it's not better in productivity, but this isn't a productivity rig. It's a gaming rig. I swapped the 5900X for this, and the little productivity I do, it handles more than fine. Now, as for games, the 12900K/KS would never have given me the same stutter-free experience in VR games and the fantastic 0.1% and 1% lows, which is what I'm after. Sorry, but 'it just won't'.

Edit: ah, you own a 12900K, def some buyer bias going on.
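Since the 0.1%/1% lows keep coming up: here's roughly how those figures get derived from a frame-time capture. The sample numbers are made up, and tools differ a bit in methodology, so treat it as a sketch:

```python
# How 1% / 0.1% lows are commonly computed from a frame-time log (in ms).
# The sample list is made up; in practice you'd feed in a frame-time dump
# from a capture tool.
def percentile_low(frame_times_ms, fraction):
    """Average FPS of the slowest `fraction` of frames (e.g. 0.01 for 1% lows)."""
    worst = sorted(frame_times_ms, reverse=True)   # largest frame times = worst frames
    n = max(1, int(len(worst) * fraction))
    return 1000.0 / (sum(worst[:n]) / n)           # mean frame time converted back to FPS

frame_times = [6.9] * 980 + [11.0] * 15 + [25.0] * 5   # mostly ~145 fps with a few spikes
print(f"1% low:   {percentile_low(frame_times, 0.01):.0f} fps")
print(f"0.1% low: {percentile_low(frame_times, 0.001):.0f} fps")
```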
 
Ok, I can't brain this one

7900X3D - what does the cache and core layout look like?

It's only one CCD with 64 MB of extra cache, right?

So is it an 8-core CCD with an extra 64 MB, in which case the other CCD is 4-core?
Or is it 2x 6-core CCDs (one with the extra 64 MB), in which case each core gets more cache and there's now no technical reason we can't have a 7600X3D?
 
That's why it tops it in most games and especially the games I play, while costing much less, drawing MUCH less power, and being able to do so with much crappier memory, right?

Of course it's not better in productivity, but this isn't a productivity rig. It's a gaming rig. I swapped the 5900X for this, and the little productivity I do, it handles more than fine. Now, as for games, the 12900K/KS would never have given me the same stutter-free experience in VR games and the fantastic 0.1% and 1% lows, which is what I'm after. Sorry, but 'it just won't'.

Edit: ah, you own a 12900K, def some buyer bias going on.
No, it's not. And... using your logic: ah, you own a 3D, def some buyer bias going on. Right? Or does it only apply to other people? :p

I don't own a 12900K anymore; I've got a 13900K.

Performance from this very site: the 3D is far down the list, nowhere near the 12900K.

 
Does it matter? He said the 3D is faster than the 12900KS, which is not the case. Does he play at 720p?
He didn't say it's faster, he said it's better, justifying that by the better lows it gets compared to either the 12900K or the 13900K, thus making his gameplay more fluid.
 
He didn't say it's faster, he said it's better, justifying that by the better lows it gets compared to either the 12900K or the 13900K, thus making his gameplay more fluid.
But it's not better, either at lows or averages or what have you. It's far behind, as per the review from this very site I just linked.
 
Does it matter? He said the 3D is faster than the 12900KS, which is not the case. Does he play at 720p?
I think your 5800X3D is not working properly.
 
I think your 5800X3D is not working properly.
Does yours? Go ahead then, upload a Cyberpunk 2077 run, Spider-Man, Spider-Man: Miles Morales, Far Cry 6, Valorant, CS:GO, choose your poison.
 
If the 7800X3D is as good as this article suggests, then Intel have a big problem.

I'm gonna seriously consider this for my 2700K upgrade once the reviews are out. Will be really nice to dodge the e-core bullet, if nothing else.
 
If the 7800X3D is as good as this article suggests, then Intel have a big problem.

I'm gonna seriously consider this for my 2700K upgrade once the reviews are out. Will be really nice to dodge the e-core bullet, if nothing else.

Mate, I've done 4 upgrades since the 2700K (> 4790K > 7700K > 9700K)... I'm also waiting to see how the 7800X3D plays out, although I'm not feeling the higher premiums for the platform/DDR5 swap. Got a couple of freely handed-down AM4s sitting around, with the 5800X3D as the backup upgrade.

Just curious, is the 2700K your daily game driver? Mine couldn't handle some of the newer titles around 10 years ago (maybe exaggerating, or closer to home).
 
They just announced, live, the best supercomputer chip ever made: 146 billion transistors. Most of the show was about innovation in healthcare and robotic surgery; Lisa Su also brought on a female astronaut to talk about how AMD has helped Artemis get to the Moon, other NASA relationships, etc.

Did you watch the live show at all, or just go based off tech threads?

A very small part of it was about gaming. AMD really did a great job tonight; Lisa Su was fantastic.
No, her green jacket was fantastic /s
 
Mate, I've done 4 upgrades since the 2700K (> 4790K > 7700K > 9700K)... I'm also waiting to see how the 7800X3D plays out, although I'm not feeling the higher premiums for the platform/DDR5 swap. Got a couple of freely handed-down AM4s sitting around, with the 5800X3D as the backup upgrade.

Just curious, is the 2700K your daily game driver? Mine couldn't handle some of the newer titles around 10 years ago (maybe exaggerating, or closer to home).
4 upgrades? It has been a while for me! :laugh:

Yes, the 2700K is my main PC; I don't have anything higher spec than this - see specs for full info. Note that I've upgraded just about everything other than the CPU, supporting components and case since I built it in 2011.

Well, on the desktop, it feels as snappy as ever. Seriously, no slowdown at all since I first built it, so evidently Microsoft hasn't made Windows any slower. Fantastic. I don't run any intensive apps that would really show up the lack of performance compared to a modern system.

Now, while I do have hundreds of games, I haven't played that many of them (Steam is very efficient at separating me from my money with special offers lol) or that often.

I ran Cyberpunk 2077 and got something like 15-25 fps even when dropping the screen res and details right down, so it's no good for that. In hindsight, I should have gotten my money back, nvm.

CoD: Modern Warfare (the newer one) runs quite well at 60-110 fps or so. It jumps around a lot, but with my newish G-SYNC monitor that hardly matters and it plays fine. Even before that, the experience was still good, though not great, especially if I set the screen refresh rate to 144 Hz with vsync off; it felt very responsive like that. Note that my 2080 Super easily plays this game at 4K. I don't have all the details maxed out though, regardless of resolution. I don't like motion blur, and ambient occlusion doesn't make that much visual difference, so I turn them both off, for example, but both really reduce the performance.

CoD: Modern Warfare II / Warzone 2.0 runs with rather less performance and can drop down into stuttery 40 fps territory, which is below what the G-SYNC compatibility will handle, but it's otherwise not too bad. It also tends to hitch a bit, but my console friends reported that too, so it's a game engine problem, not my CPU.

I've got CoD games running back generations and they all work fine. Only the latest one struggles to any degree.

I've run various other games which worked alright too, can't remember the details now. It's always possible to pick that one game that has a really big system load, or is badly optimised and runs poorly, but that can happen even with a modern CPU.

I have a feeling that this old rig, with its current spec, can actually game better than a modern gaming rig with a low end CPU. Haven't done tests of course, but it wouldn't surprise me.

Agreed, I don't like the greater expense for AMD either, so the devil will be in the details. I want to see what the real-world performance uplift will be compared to the 13700K I have my eye on before I consider my upgrade. Thing is, every time I think I'm finally gonna pull the trigger, the goal posts move! The real deadline here, of course, is that Windows 10 patch support is gonna end in 2025, so it's gonna happen for sure by then.

And finally, out of interest, here's the thread I started when I upgraded to my trusty 2700K all those years ago. It's proven to be a superb investment to last this long and still be going strong.

 