
AMD Ryzen 7 5800X3D

Nice processor if you want to game at 720p/1080p
Dude, read the review next time. The chart where it came out on top was 1440p: +0.6 percent versus the 12900K.

It has entered mass production, which means we will see the official launch in the next few weeks.

Nice uplift in Far Cry 5, up to 35% at 720p, but besides that, it's meh.
Totally irrelevant at 2160p today.
Read the review. It was clustered at the top of the charts at 4K too. All you're really saying is "NO CPU matters that much when playing at 4K," which is true. But the entire point of buying the 12900KS or 5800X3D is to play at high refresh rates: 4K at minimum settings, 1440p at 240 Hz, 1080p at 360 Hz, etc. Don't you think it's silly to say "no CPU matters at 4K" as a way to attack the best CPU? LOL
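For concreteness, here is a minimal sketch of the frame-time budgets those refresh rates imply (plain arithmetic, not review data):

# To sustain a given refresh rate, CPU + GPU must deliver
# each frame within 1000 / Hz milliseconds.
for hz in (60, 144, 240, 360):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")

# 60 Hz  -> 16.67 ms per frame
# 144 Hz -> 6.94 ms per frame
# 240 Hz -> 4.17 ms per frame
# 360 Hz -> 2.78 ms per frame

At 360 Hz the CPU has under 3 ms per frame, which is exactly where cache and memory latency start to dominate.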
 
Anyone know how this CPU performs in emulators like Dolphin, Cemu, Yuzu, and Ryujinx?

I wonder how it would end up with something like 3800 14-14-14-28 memory in games as well.
Yup, exactly. Look at the Alder Lake DDR5 review: it used low-CL DDR5-6000 from G.Skill. Not fair.
If AMD were tested with DDR4-4000 CL14, it would be a good duel.
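For reference, a minimal sketch of how those timings convert to first-word latency (CL 36 for the G.Skill DDR5-6000 kit is an assumption, not a figure from the review):

# First-word latency (ns) = CL / (MT/s / 2) * 1000,
# since DDR transfers twice per I/O clock.
def cas_latency_ns(mt_per_s: int, cl: int) -> float:
    return cl / (mt_per_s / 2) * 1000

for name, mts, cl in [("DDR4-4000 CL14", 4000, 14),
                      ("DDR5-6000 CL36", 6000, 36)]:
    print(f"{name}: {cas_latency_ns(mts, cl):.1f} ns")

# DDR4-4000 CL14: 7.0 ns
# DDR5-6000 CL36: 12.0 ns

By this measure, a tight DDR4 kit still holds a large raw-latency edge over fast DDR5, which is the point of the complaint.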
 
How in God's name can you possibly recommend this?
It's literally SLOWER than a standard 5800X in EVERYTHING except gaming, and it's BARELY any faster in games! What are you people smoking?
90 percent of buyers who buy an i7 or i9 are buying it for gaming, so your argument applies there too; by that logic, the only CPU that makes sense is the 12600K.

The entire reason Intel limits the cache on the i5 and i7 is to create a reason to buy the i9.

If you are not interested in high-refresh gaming and don't want the best fps, that's fine.

AMD was so far ahead of Intel's 10th and 11th gen precisely because the 5600X has the same cache as the 5800X and 5900X, so gaming never needed anything better than the 5600X. (Imagine a 12600K with the same amount of cache as the i9: NOW THAT WOULD BE THE BEST CPU EVER at that price.) Unlike with Intel, there's no reason to upgrade for cache; the 5600X was tops in game fps for so long because of that (and that's why it suddenly commanded higher prices).

(Did I make sense? Intel cuts the cache from i9 to i7 to i5 to make gaming worse on purpose, even though the i5 has more than enough cores. The numbers below show what I mean.)
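A minimal sketch of the L3 numbers behind that argument (total sizes are public specs; the per-core split is just for illustration):

# Total L3 (MB) and core count per SKU.
skus = {
    "i5-12600K": (20, 10),  # 6P+4E
    "i7-12700K": (25, 12),  # 8P+4E
    "i9-12900K": (30, 16),  # 8P+8E
    "5600X":     (32, 6),   # full 32 MB CCD
    "5800X":     (32, 8),   # full 32 MB CCD
    "5800X3D":   (96, 8),   # 32 MB + 64 MB stacked V-Cache
}
for name, (l3_mb, cores) in skus.items():
    print(f"{name}: {l3_mb} MB L3, {l3_mb / cores:.1f} MB/core")

Every Zen 3 part ships with the full 32 MB per CCD, while Intel trims L3 down the stack, which is exactly the segmentation described above.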

Yup, exactly. Look at the Alder Lake DDR5 review: it used low-CL DDR5-6000 from G.Skill. Not fair.
If AMD were tested with DDR4-4000 CL14, it would be a good duel.
I'd love a follow-up piece with the $130 Newegg DDR4-3600 C14 stuff.
 
The only reason to buy anything above six cores from either camp is high-refresh gaming.
For a lot of games that means low resolutions or lower settings (or future, unreleased GPUs).

If you do nothing but game, get a 12400F or 5600X and stop wasting money.

If you intend to game at 144 Hz+, this review shows you which CPUs can do that, provided your GPU can keep up.
 
Yup, exactly. Look at the Alder Lake DDR5 review: it used low-CL DDR5-6000 from G.Skill. Not fair.
If AMD were tested with DDR4-4000 CL14, it would be a good duel.
DDR4-3800 vs DDR5-5200.
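A minimal sketch of the raw-bandwidth side of that matchup (dual-channel, 64 bits per channel assumed):

# Peak bandwidth (GB/s) = MT/s * 8 bytes per channel * channels / 1000
def peak_gbps(mt_per_s: int, channels: int = 2) -> float:
    return mt_per_s * 8 * channels / 1000

print(f"DDR4-3800: {peak_gbps(3800):.1f} GB/s")  # 60.8
print(f"DDR5-5200: {peak_gbps(5200):.1f} GB/s")  # 83.2

DDR5 wins on raw bandwidth while tight DDR4 wins on latency (see the earlier latency sketch); games usually care more about the latter.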

 
I don't understand why everyone is so excited.
  • AMD is not the first company to ship a massive cache: Intel did that seven years ago with Broadwell, which featured a massive 128 MB L4 cache and showed crazy improvements in certain applications and games as well.
  • The 5800X3D is a single experimental CPU which rocks in some games but loses in others, and it also loses in general applications that don't require an enormous L3 cache, due to its decreased frequencies.
  • Very few people game at 720p or 1080p with their uber GPUs, and at 1440p and 4K most games are GPU-bound and don't care about your L3 cache.
  • RPL, according to leaked information, will feature increased IPC for both its P and E cores as well as significantly increased caches, which means Intel will swiftly catch up and overtake this CPU, and maybe even Zen 4.
  • It's too effing expensive for what it offers. No, it's not $100 more expensive than the 5800X: the 5800X has recently sold for as low as $350, which makes it a huge $200 difference. I got it wrong, sorry. Still, we're only talking about rare applications and gaming at quirky resolutions.
  • This is the last hurrah of AM4; there's no future upgrade path.
Kudos to AMD for this experiment. Rare AMD fans who game at lower resolutions and those who like to boast about gaming benchmarks must be happy.
 
I don't understand why everyone is so excited.
...
Kudos to AMD for this experiment. Rare AMD fans who game at lower resolutions and those who like to boast about gaming benchmarks must be happy.
$449 for a CPU that doesn't overclock, in order to game at 1080p; not to mention content creators won't give this CPU a second thought after looking at the rendering benches. This should be interesting, to say the least, when the 12700F is going for $310 and the 12700K/KF is going for $370 at the moment.
 
I think at that price point there should not have been a downgrade in productivity applications. This isn't the pre-Alder Lake era, when Intel was trailing badly in that area.

What we have is more of a proof-of-concept product, and I imagine the lessons learned here will be valuable in future Ryzen chips designed from the start with massive cache, and with the ability to power and cool it properly.
 
I don't understand why everyone is so excited.
...
Kudos to AMD for this experiment. Rare AMD fans who game at lower resolutions and those who like to boast about gaming benchmarks must be happy.
To me, this is like a warning shot by AMD, and also an attempt to claw back some of the "thunder" they lost to Alder Lake. The 5800X3D is too little and too late.

Too little, because you are not going to get a consistent performance improvement when it is just the cache that got increased. So if I were looking for an overhaul of my system, Alder Lake is clearly the better choice because the performance uplift is significant. I know people will say it uses a lot more power, but the truth is that unless you consistently load the CPU to its max, you won't see 200+ W of power consumption; in games at 1440p, I rarely see HWiNFO reporting more than 70 to 80 W. In fact, if one really wants a Ryzen 5000 series chip at this point in time, I feel the 5900X is actually better value if you have a use case that utilises the cores.

Too late, because it is going to end up like Rocket Lake: replaced by a more capable chip in the next 6 months or so.
 
I don't understand why everyone is so excited.
...
Kudos to AMD for this experiment. Rare AMD fans who game at lower resolutions and those who like to boast about gaming benchmarks must be happy.

This processor exists to tell investors that Zen 4 will beat Intel at gaming, and that 16-core Zen 4 will beat Intel's 8+8-core processors. It exists to disprove Gelsinger's claim that AMD's heyday is over.
 
This processor exists to tell investors that Zen 4 will beat Intel at gaming, and that 16-core Zen 4 will beat Intel's 8+8-core processors. It exists to disprove Gelsinger's claim that AMD's heyday is over.
Lest we forget, Raptor Lake is right around the corner, and those CPUs will work with LGA1700 600-series boards.
 
Lest we forget, Raptor Lake is right around the corner, and those CPUs will work with LGA1700 600-series boards.
And Raptor Lake is built on the existing Intel 7 node (10 nm), whereas Zen 4 is N5 (5 nm). If the 5800X3D is a ~10% gaming perf uplift over Zen 3, the expectation for Zen 4 will be set at 25% over Zen 3, which is about 10-15% above Alder Lake. Good luck to Intel trying to get there on an existing node.
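A minimal sketch of that arithmetic (the ~10% Alder Lake lead over Zen 3 is this post's rough framing, not a review figure):

# If Zen 4 lands 25% above Zen 3, and Alder Lake sits roughly
# 10% above Zen 3, Zen 4's lead over ADL is the ratio of the two.
zen4_over_zen3 = 1.25
adl_over_zen3 = 1.10
lead = (zen4_over_zen3 / adl_over_zen3 - 1) * 100
print(f"Zen 4 over ADL: {lead:.1f}%")  # -> 13.6%, inside the quoted 10-15% range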
 
This processor exists to tell investors that Zen 4 will beat Intel at gaming, and that 16-core Zen 4 will beat Intel's 8+8-core processors. It exists to disprove Gelsinger's claim that AMD's heyday is over.

I don't know where you pulled this from. This CPU has absolutely nothing to say about the performance of Zen 4 except that the latter will be faster. No one has even confirmed that desktop/mobile Zen 4 SKUs will have 3D V-Cache.

And Raptor Lake is built on the existing Intel 7 node (10 nm), whereas Zen 4 is N5 (5 nm). If the 5800X3D is a ~10% gaming perf uplift over Zen 3, the expectation for Zen 4 will be set at 25% over Zen 3, which is about 10-15% above Alder Lake. Good luck to Intel trying to get there on an existing node.

The node advantage will allow AMD to pack more transistors into the same power envelope; a smaller node doesn't let a CPU magically clock a lot higher. With very high confidence I can claim that ADL's 5.5 GHz will be unattainable for Zen 4 CPUs.
 
The node advantage will allow AMD to pack more transistors into the same power envelope; a smaller node doesn't let a CPU magically clock a lot higher. With very high confidence I can claim that ADL's 5.5 GHz will be unattainable for Zen 4 CPUs.

Please read our 12900KS review. Raptor Lake is built on that same node, where Intel plans to add even bigger P-cores and two more E-core clusters (8P+16E). The 12900KS is already burning up the block, and with thermal limits overridden, it crosses 100°C.

Also, the 5.5 GHz ADL clock-speed advantage is kinda pointless in the face of the 5800X3D's gaming perf, no?

I don't know where you pulled this from. This CPU has absolutely nothing to say about the performance of Zen 4 except that the latter will be faster. No one has even confirmed that desktop/mobile Zen 4 SKUs will have 3D V-Cache.
From this review, and the gaming performance uplift.
 
I really don't get all the haters. Are you being forced to buy this CPU?
Yes, it's very much a last hurrah for the AM4 socket, but why does this irk you?
No one has to buy it, and if it doesn't suit your needs, there are plenty of other options.
I'm glad I got a 5800X for the same price as the 5700X, but I'm sure some people will go for the 5800X3D as it suits their needs.

Is it the most amazing processor ever? No.
I would actually call this a retail tech demonstration by AMD.
It shows what the company is capable of, but it comes at a cost that is going to make most people look elsewhere, and that's fine.
 
It's not stated anywhere... I tested it: 1866 POSTs, 1900 doesn't POST, which probably means WHEA errors at 1866.
No wonder, if you tested with those semi-crap memory sticks. Try decent B-die sticks.
By the way, did you use dual-rank memory sticks?
 
Please read our 12900KS review. Raptor Lake is built on that same node, where Intel plans to add even bigger P-cores and two more E-core clusters (8P+16E). The 12900KS is already burning up the block, and with thermal limits overridden, it crosses 100°C.

Also, the 5.5 GHz ADL clock-speed advantage is kinda pointless in the face of the 5800X3D's gaming perf, no?


From this review, and the gaming performance uplift.
God, I'm tired of this crap. The 5800X3D is faster in a few selected games and loses in all the others. It's indistinguishable at 1440p and 4K.

We have discussed ADL power consumption ad nauseam already; I'm not going down that route ever again. Hint: it's not 300 W, and it's not 100°C. In actual games, ADL has been shown to be as power-efficient as Zen 3 CPUs.

Is this a discussion of the 5800X3D review or what?

Lastly, let me tell you how much I hate yellow headlines: "AMD Ryzen 7 5800X3D CPU Crushes Intel's Fastest Gaming Chip, The Core i9-12900K, In Gaming Benchmarks". There's so much pain and fallacy in it that it's just cringe-worthy. Out of two dozen tested titles, the 5800X3D runs faster in what, 3? 4? At resolutions most people couldn't care less about.

OH GOD, THIS IS A REVOLUTION, AS IF BROADWELL NEVER EXISTED. I'm done and out of this completely pointless discussion. AMD fans have collectively ejaculated; great! Your idol has reclaimed the performance crown under quite uncommon conditions that maybe 5,000 people in the world care about.
 
Very impressive. Not sure why they are stealth-launching this; it matches or beats the limited-edition, power-guzzling behemoth 12900KS for a much, much lower price.

Low stock maybe?

Or to keep scalpers from catching wind of the launch.
 
Uh, half of 450 is 225, not 375. How did you come up with the idea that the 12700F is 50% cheaper? It's not even close. Including platform costs, a 5800X3D and a 12700F will end up costing similar amounts. In essence, it's top-of-the-line gaming performance. For people who want that, it's a heck of a lot cheaper than a 12900K.
I didn't say 50% cheaper. I said the 5800X is 50% more expensive.
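A minimal sketch of the two readings of that math (prices are the ones quoted earlier in the thread):

x3d, i7f = 449, 310  # 5800X3D and 12700F street prices

# "50% cheaper" would mean the 12700F costs half the X3D's price:
print(f"Half of {x3d} is {x3d / 2:.1f}")  # 224.5, not 310

# "50% more expensive" reads the other way:
print(f"5800X3D premium over 12700F: {(x3d / i7f - 1) * 100:.0f}%")  # ~45%

So the X3D is roughly 45-50% more expensive than the 12700F, which is the claim being defended.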
 
Impressive in gaming, but I wish it at least supported a negative Curve Optimizer; that could have yielded up to 6% better multicore perf and would lower single-core temps.
 
I don't understand why everyone is so excited.
...
Kudos to AMD for this experiment. Rare AMD fans who game at lower resolutions and those who like to boast about gaming benchmarks must be happy.
Did you read the review?
It's winning at 1440p.
 
Impressive in gaming, but I wish it at least supported a negative Curve Optimizer; that could have yielded up to 6% better multicore perf and would lower single-core temps.
Curve Optimizer is a glorified per-core Load Line Calibration.
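For anyone unfamiliar with the feature, a minimal sketch of the idea (the mV-per-count value is approximate and board-dependent; the function is illustrative, not AMD's actual firmware logic):

# Curve Optimizer applies a per-core offset to the V/F curve;
# each negative count lowers the requested voltage slightly.
MV_PER_COUNT = 5  # roughly 3-5 mV per count (assumption)

def requested_voltage_mv(stock_mv: float, co_counts: int) -> float:
    """Voltage requested at a given V/F point after the CO offset."""
    return stock_mv + co_counts * MV_PER_COUNT

# A -20 offset at a 1250 mV boost point requests ~1150 mV; the boost
# algorithm turns that thermal headroom into higher sustained clocks.
print(requested_voltage_mv(1250, -20))  # -> 1150

Less voltage at the same frequency means less heat, which is where the claimed multicore gains and lower single-core temps would come from.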
 
0.6%? OMG.
And?
It still goes against everything you just said:

  • Very few people game at 720p or 1080p with their uber GPUs, and at 1440p and 4K most games are GPU-bound and don't care about your L3 cache.
 