
AMD Ryzen 7 5800X3D

Joined
Jun 14, 2020
Messages
2,678 (1.90/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Fastest gaming CPU under 350€/$?? Keep on dreaming :laugh:
Even if it were (it isn't), Intel frequently had the fastest gaming CPU at that price: the 7700 / 8700. But I guess this is AMD fleecing customers, as per usual.
 
Joined
Nov 20, 2021
Messages
84 (0.09/day)
Or your brother buys a 12700F with a new mobo with newer I/O, better features and longevity? I mean, the CPU + mobo is going to cost as much as the 3D on its own, lol. Of course the 3D will lead by 3-4% in 720p gaming, but it gets destroyed in everything else. AMD fleecing customers again.
Longevity? Two generations are not longevity. He can just wait for AM5 and get the X600 processor

???? Jesus with these stupid posts. Where did you see that it needs half the consumption for the same fps? The only time the 12900 consumes double is in Cinebench, and it actually gets double the score as well, so efficiency is the same. Why AMD fanboys can't stop spreading nonsense is beyond me.

Did you forget to look at this?

View attachment 243505


Even if it were (it isn't), Intel frequently had the fastest gaming CPU at that price: the 7700 / 8700. But I guess this is AMD fleecing customers, as per usual.
7700/8700??? This is 2022, not 2017. You need to let that go. Intel is not that guy anymore
 
Joined
Jun 14, 2020
Messages
2,678 (1.90/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Longevity? Two generations are not longevity. He can just wait for AM5 and get the X600 processor



Did you forget to look at this?

View attachment 243505


7700/8700??? This is 2022, not 2017. You need to let that go. Intel is not that guy anymore
Two generations are more than what AM4 offers right now.

Is 566 double 510? Oh, okay then.
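For context on the efficiency back-and-forth above: performance per watt is just score divided by package power, so "double the power" only washes out if the score really does double. A minimal sketch in C with purely hypothetical numbers (none of these are figures from the review):

```c
#include <stdio.h>

/* Hypothetical scores (points) and package power (watts); not measured data. */
int main(void) {
    double score_a = 24000.0, power_a = 240.0;  /* "CPU A": twice the power...      */
    double score_b = 15000.0, power_b = 120.0;  /* ...but not twice "CPU B"'s score */

    printf("CPU A: %.1f points/W\n", score_a / power_a);  /* 100.0 */
    printf("CPU B: %.1f points/W\n", score_b / power_b);  /* 125.0 */
    /* Efficiency is only "the same" when score and power scale by the same factor. */
    return 0;
}
```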
 
Joined
Jul 5, 2013
Messages
25,559 (6.48/day)
5600X3D would've been pretty interesting too.
Very likely, as long as the clocks were not dropped.
Where this chip would have killed it is on an APU.
Again, very likely. I can see this 3DCache making a big improvement for APUs.

This is AMD's first go with this technology. While the overall performance was not so impressive, it's a solid first go. As they refine it, it'll become something special.
 
Joined
Feb 23, 2019
Messages
5,630 (2.99/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3800 CL16
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
Very likely, as long as the clocks were not dropped.

Again, very likely. I can see this 3DCache making a big improvement for APUs.

This is AMD's first go with this technology. While the overall performance was not so impressive, it's a solid first go. As they refine it, it'll become something special.
Not if you count Milan-X.
 
Joined
Oct 27, 2021
Messages
50 (0.06/day)
Not if you count Milan-X.
He was talking about 3D V-Cache technology in general, which is exactly the same as found on the 5800X3D. AMD just repurposed those as "gaming" CPUs, which they do quite well. But Milan-X is where they shine, putting a beatdown on regular EPYC parts and Intel's 10 nm Xeons.
 
Joined
Jul 5, 2013
Messages
25,559 (6.48/day)
Not if you count Milan-X.
That's a fair point, but those are EPYC CPUs on a whole different platform and manufacturing system, which are not available to the general consumer. In the consumer/prosumer space, this is AMD's first go and while they have the experience with EPYC to go on, it's still a very different beast.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Just read the review and I'm not impressed. The drop in clocks hurt performance.

@ AMD,
You folks really needed to have a 5900X3D, with the clocks running at or close to those of the 5900X. Just saying...
That's a fair point, but those are EPYC CPUs on a whole different platform and manufacturing system, which are not available to the general consumer. In the consumer/prosumer space, this is AMD's first go and while they have the experience with EPYC to go on, it's still a very different beast.
What's weird is that this wasn't the behavior of Milan-X in the benchmarks that were done over at Phoronix, so I'm looking at this with a bit of skepticism. I'm wondering if the OS plays a role here when it comes to scheduling, since that can affect how data is evicted from cache. That's why I'd like to see some benchmarks with this chip on Linux, to check whether the trend is consistent with what we're seeing here, because it's not at all what I expected given Milan-X's performance uplift in Linux.
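A minimal sketch of the eviction effect being speculated about here, assuming a Linux/POSIX toolchain and an L3 of roughly 96 MB (both assumptions; the buffer size is a placeholder to tune per CPU): re-reading one buffer that fits in L3 stays fast, while alternating between two buffers that together exceed it forces constant eviction, much like two tasks trading places on the same cores.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define BUF_MB  64   /* assumed: one buffer fits a ~96 MB L3, two together do not */
#define PASSES  40

static double now_sec(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec * 1e-9;
}

static long long sum(const long *buf, size_t n) {
    long long s = 0;
    for (size_t i = 0; i < n; i++) s += buf[i];
    return s;
}

int main(void) {
    size_t n = (size_t)BUF_MB * 1024 * 1024 / sizeof(long);
    long *a = malloc(n * sizeof(long));
    long *b = malloc(n * sizeof(long));
    for (size_t i = 0; i < n; i++) { a[i] = (long)i; b[i] = (long)(n - i); }

    long long sink = 0;
    double t0 = now_sec();
    for (int p = 0; p < PASSES; p++)            /* one "task": same data every pass */
        sink += sum(a, n);
    double t1 = now_sec();
    for (int p = 0; p < PASSES; p++)            /* two "tasks" trading the cache */
        sink += sum((p & 1) ? b : a, n);
    double t2 = now_sec();

    printf("single working set: %.3f s, alternating sets: %.3f s (sink=%lld)\n",
           t1 - t0, t2 - t1, sink);
    free(a);
    free(b);
    return 0;
}
```

On a CPU with much less L3 both cases would spill to DRAM, so the buffer size has to be adapted to the part being tested.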
 
Joined
Jul 5, 2013
Messages
25,559 (6.48/day)
What's weird is that this wasn't the behavior of Milan-X in the benchmarks that were done over at Phoronix, so I'm looking at this with a bit of skepticism. I'm wondering if the OS plays a role here when it comes to scheduling, since that can affect how data is evicted from cache. That's why I'd like to see some benchmarks with this chip on Linux, to check whether the trend is consistent with what we're seeing here, because it's not at all what I expected given Milan-X's performance uplift.
Good point. Those results would be interesting to see.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Good point. Those results would be interesting to see.
It could also be that 96MB simply isn't enough. Milan-X has a whopping 768MB, so I could easily see cores switching context with data from a previous task still being resident in L3. It's really hard to say without more data.
 
Joined
Oct 27, 2021
Messages
50 (0.06/day)
What's weird is that this wasn't the behavior of Milan-X in the benchmarks that were done over at Phoronix, so I'm looking at this with a bit of skepticism
Are you really comparing real-world OS like Linux/Unix with Windows? Real Apps with Gaming/Benching apps?

It could also be that 96MB simply isn't enough. Milan-X has a whopping 768MB, so I could easily see cores switching context with data from a previous task still being resident in L3. It's really hard to say without more data.
The 5900X3D prototype with 192 MiB had the same performance uplift (15% average) in the same games as the 5800X3D. You just need to realize that gaming is really a niche segment of computing. Unless game developers start coding for 3D V-Cache, it will be like this.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Are you really comparing real-world OS like Linux/Unix with Windows? Real Apps with Gaming/Benching apps?
Well, the funny thing is that the improvement we saw in gaming is what I would have expected across the board, because that's what Milan-X demonstrated the vast majority of the time. So yes, I am looking squarely at Windows until I get more data. :)
 
Joined
Jul 5, 2013
Messages
25,559 (6.48/day)
It could also be that 96MB simply isn't enough. Milan-X has a whopping 768MB, so I could easily see cores switching context with data from a previous task still being resident in L3. It's really hard to say without more data.
Agreed, more analysis needs to be done.

I'd like to be clear: I'm not criticizing W1zzard's methodology. His testing methods are sound. However, he is limited by the tools available. As testing tools, programs and games that are aware of the additional cache, and optimized for it, become available, the benefit and scope of the 3DCache effect will become clearer.
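As a hedged illustration of what "aware of the additional cache" could mean in practice (this is a generic textbook technique, not anything from W1zzard's test suite or from a shipping game): blocking a traversal so the hot region fits in cache is exactly the kind of change that benefits from more L3.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N    4096
#define TILE 64          /* 64 x 64 doubles = 32 KiB per tile, comfortably cache-resident */

/* Naive transpose: the column-order writes stride across the whole 128 MB matrix. */
static void transpose_naive(double *dst, const double *src) {
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            dst[(size_t)j * N + i] = src[(size_t)i * N + j];
}

/* Cache-blocked transpose: processes one small tile at a time. */
static void transpose_tiled(double *dst, const double *src) {
    for (int ii = 0; ii < N; ii += TILE)
        for (int jj = 0; jj < N; jj += TILE)
            for (int i = ii; i < ii + TILE; i++)
                for (int j = jj; j < jj + TILE; j++)
                    dst[(size_t)j * N + i] = src[(size_t)i * N + j];
}

int main(void) {
    size_t elems = (size_t)N * N;
    double *src = malloc(elems * sizeof(double));
    double *dst = malloc(elems * sizeof(double));
    for (size_t i = 0; i < elems; i++) src[i] = (double)i;

    clock_t c0 = clock();
    transpose_naive(dst, src);
    clock_t c1 = clock();
    transpose_tiled(dst, src);
    clock_t c2 = clock();

    printf("naive: %.3f s, tiled: %.3f s (check: %f)\n",
           (double)(c1 - c0) / CLOCKS_PER_SEC,
           (double)(c2 - c1) / CLOCKS_PER_SEC, dst[N]);
    free(src);
    free(dst);
    return 0;
}
```

How much the tiled version gains depends on how much of the working set the L3 can hold, which is why cache-aware code is where a part like this should show its hand.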

Are you really comparing real-world OS like Linux/Unix with Windows?
Yes, it's a valid comparison. Not everyone uses Windows.

Well, the funny thing is that the improvement we saw in gaming is what I would have expected across the board, because that's what Milan-X demonstrated the vast majority of the time. So yes, I am looking squarely at Windows until I get more data. :)
This.
 
Joined
Oct 27, 2021
Messages
50 (0.06/day)
Well, the funny thing is that the improvement we saw in gaming is what I would have expected across the board, because that's what Milan-X demonstrated the vast majority of the time. So yes, I am looking squarely at Windows until I get more data. :)
Really? Windows users use benchmark apps like Geekbench and Cinebench for bragging rights. Scientists and engineers use Linux and Ansys/OpenFOAM to make money and save lives.

View attachment 243511




The truth is that 3D V-Cache is for HPC computational fluid dynamics simulations. The fact that it did so well in gaming is just a byproduct of AMD trying to push for ever more performance at the HPC level. Games and desktop apps just don't see that type of performance improvement.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Really? Windows users use benchmark apps like Geekbench and Cinebench for bragging rights. Scientists and engineers use Linux and Ansys/OpenFOAM to make money and save lives.

View attachment 243511
That's not the 5800X3D, but I think that kind of proves what I'm trying to say. I'm expecting more of an uplift than what I'm seeing in this review, given Milan-X's performance uplift. This is why I'm skeptical that it's the chip and suspect it might be the OS. Once again, we need some Linux numbers for this chip to confirm that suspicion, because right now it's just a theory with the information we have.
 
Joined
Nov 8, 2015
Messages
30 (0.01/day)
System Name Big Chief
Processor Intel i7-980x@4.2Ghz
Motherboard Gigabyte UD3R v2
Cooling Noctua NH-U12P
Memory 24GB Kingston DDR3 1600Mhz CL10
Video Card(s) ASUS Strix GTX970
Storage 3xSamsung Evo840 500GB RAID-0
Display(s) 3xDell 2407WFP
Power Supply Corsair TX650
IDK about the UK, but you can get a 6000 CL36 kit here for $360 -- in fact, that's what I'm running now, and it's faster than my old DDR4 kit: 32 GB of 4133 in 4x single-rank B-die.

View attachment 243426

Prices aren't that different here anymore and haven't been for a while.
Fair enough. That said, I think it would be fascinating to know whether the V-Cache eliminates the need for 3800 CL14 tuned RAM. I've also just seen some rumours about successful 5800X3D BCLK overclocking.

I think he's referring more to price against realistic alternatives; obviously that doesn't include the 12900K.

The 12700K, for example, is over 10% faster in CPU tests, only 2-3% slower in gaming, has 13th-gen support, and is $100-125 cheaper. The 5700/5800X is only like 6-9% slower in gaming and can be had for 25-30% less.
Sure, but don't forget we're talking about an AMD chip that is a literal drop-in upgrade vs an entire new build…
 
Joined
Jun 10, 2014
Messages
2,901 (0.80/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
What's weird is that this wasn't the behavior of Milan-X in the benchmarks that were done over at Phoronix, so I'm looking at this with a bit of skepticism. I'm wondering if the OS plays a role here when it comes to scheduling, since that can affect how data is evicted from cache. That's why I'd like to see some benchmarks with this chip on Linux, to check whether the trend is consistent with what we're seeing here, because it's not at all what I expected given Milan-X's performance uplift in Linux.
Only to the extent that having more running in the background, or using more cores, will "pollute" the L3 more.
Even aggressive scheduling happens far too slowly to keep up with the rate of data flowing through the caches. Considering the L3 is an LRU spillover cache for anything evicted from L2 (in all the cores), it will be overwritten at an incredible rate. And effectively, the more data an algorithm is churning through, the less effective the L3 will be.
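A rough sketch of that last point (the sizes and the 96 MB figure are assumptions for a 5800X3D-class part, and results will vary with prefetchers and compiler flags): streaming the same total volume of data through progressively larger working sets shows effective throughput dropping once the set no longer fits in L3.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static double now_sec(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec * 1e-9;
}

int main(void) {
    const size_t sizes_mb[] = {4, 16, 64, 256};     /* below and above a ~96 MB L3 */
    const size_t total_bytes = (size_t)4 << 30;     /* ~4 GB of reads per data point */

    for (size_t s = 0; s < sizeof sizes_mb / sizeof sizes_mb[0]; s++) {
        size_t n = sizes_mb[s] * ((size_t)1 << 20) / sizeof(long);
        long *buf = malloc(n * sizeof(long));
        for (size_t i = 0; i < n; i++) buf[i] = (long)i;

        size_t passes = total_bytes / (n * sizeof(long));
        long long sink = 0;
        double t0 = now_sec();
        for (size_t p = 0; p < passes; p++)
            for (size_t i = 0; i < n; i++) sink += buf[i];
        double t1 = now_sec();

        double gb = (double)passes * n * sizeof(long) / 1e9;
        printf("%4zu MB working set: %6.1f GB/s (sink=%lld)\n",
               sizes_mb[s], gb / (t1 - t0), sink);
        free(buf);
    }
    return 0;
}
```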
 
Joined
Nov 13, 2007
Messages
10,232 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/ 5.5, 4.8Ghz Ring 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Fair enough. That said, I think it would be fascinating to know whether the V-Cache eliminates the need for 3800 CL14 tuned RAM. I've also just seen some rumours about successful 5800X3D BCLK overclocking.


Sure, but don't forget we're talking about an AMD chip that is a literal drop-in upgrade vs an entire new build…

Yeah, so I would guess that it does to a great extent -- having more cache compensates for latency quite effectively in many latency-sensitive workflows and games (e.g. a 5800X with tuned 3800 CL14 performs roughly on par with a 5800X3D on 3200 CL14 in those workflows), so gobs of 3D cache + DDR5 (which, as we know, trades latency for bandwidth) really seems like a smart mix for Zen 4 with early-gen DDR5.
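A minimal sketch of why that tends to hold (a generic pointer-chase latency test, not W1zzard's methodology; the sizes assume a ~96 MB L3, and run-to-run noise applies): each load depends on the previous one, so once the chain spills out of L3, every hop pays close to full DRAM latency, which is exactly the cost that tuned low-latency RAM tries to shave.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static volatile size_t g_sink;   /* keeps the chase loop from being optimized away */

static double now_sec(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec * 1e-9;
}

/* Build a single random cycle (Sattolo's algorithm) and time one dependent hop. */
static double chase_ns_per_hop(size_t n, size_t hops) {
    size_t *next = malloc(n * sizeof(size_t));
    for (size_t i = 0; i < n; i++) next[i] = i;
    for (size_t i = n - 1; i > 0; i--) {         /* Sattolo: j strictly below i */
        size_t j = (size_t)rand() % i;
        size_t t = next[i]; next[i] = next[j]; next[j] = t;
    }

    size_t idx = 0;
    double t0 = now_sec();
    for (size_t h = 0; h < hops; h++) idx = next[idx];
    double t1 = now_sec();

    g_sink = idx;
    free(next);
    return (t1 - t0) * 1e9 / (double)hops;
}

int main(void) {
    const size_t sizes_mb[] = {8, 32, 128};      /* inside vs outside a ~96 MB L3 */
    for (size_t s = 0; s < sizeof sizes_mb / sizeof sizes_mb[0]; s++) {
        size_t n = sizes_mb[s] * ((size_t)1 << 20) / sizeof(size_t);
        printf("%4zu MB chain: %.1f ns per hop\n",
               sizes_mb[s], chase_ns_per_hop(n, 20000000));
    }
    return 0;
}
```

The memory tuning mostly changes the cost of the hops that miss L3, while a bigger L3 changes how many hops miss in the first place.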
 
Joined
Oct 27, 2021
Messages
50 (0.06/day)
I would like to point out that AMD's intent with 3D V-Cache is to gain market share in the most profitable segment of computing: to provide generational improvements exceeding 30% (up to 80% in many cases) in HPC. That 3D-stacked L3 does so well in gaming is just a byproduct of that.


Also, the 5900X3D prototype with 192 MiB had the same performance uplift (15% average) in the same games as the 5800X3D. You just need to realize that gaming is really a niche segment of computing. Unless game developers start coding for 3D V-Cache, it will be like this. The 5900X3D or 5950X3D would have been great at HPC tasks, but would provide no advantage over the 5800X3D in games.
 

Pastuch

New Member
Joined
Jan 14, 2022
Messages
25 (0.03/day)
Two generations are more than what AM4 offers right now.

Is 566 double 510? Oh, okay then.
I can't help but defend AM4, and X570 in particular. I started with a 3600X, then a 5600X, and next is a 5800X3D, all on the same board. It's been the most reliable motherboard I've ever had, and the BIOS updates just keep coming. I'll only buy the 5800X3D once I see Warzone benches; I want 280+ fps SO bad.
 
Joined
Apr 21, 2010
Messages
562 (0.11/day)
System Name Home PC
Processor Ryzen 5900X
Motherboard Asus Prime X370 Pro
Cooling Thermaltake Contac Silent 12
Memory 2x8gb F4-3200C16-8GVKB - 2x16gb F4-3200C16-16GVK
Video Card(s) XFX RX480 GTR
Storage Samsung SSD Evo 120GB -WD SN580 1TB - Toshiba 2TB HDWT720 - 1TB GIGABYTE GP-GSTFS31100TNTD
Display(s) Cooler Master GA271 and AoC 931wx (19in, 1680x1050)
Case Green Magnum Evo
Power Supply Green 650UK Plus
Mouse Green GM602-RGB ( copy of Aula F810 )
Keyboard Old 12 years FOCUS FK-8100
This CPU is only good if you already own an AM4 motherboard; otherwise, for a new PC you need ADL or AM5.
 

Vunnie

New Member
Joined
Apr 8, 2021
Messages
3 (0.00/day)
Don't know why everyone is so positive. $450 for a CPU slower than the 5800X??
 

logicisntforyou

New Member
Joined
Nov 3, 2020
Messages
9 (0.01/day)
$449 for a CPU that doesn't overclock, in order to game at 1080p, not to mention content creators won't give this CPU a second thought after looking at the rendering benches. This should be interesting, to say the least, when the 12700F is going for $310 and the 12700K/KF is going for $370 atm.
Why? Most content creators have a 2-3 PC setup for streaming and recording, and that's besides the fact that most of them don't actually edit their own videos and have someone do it for them. If they're gaming content creators playing any competitive game, all they care about is frames.
 
Joined
Dec 30, 2010
Messages
2,098 (0.43/day)
That's a fair point, but those are EPYC CPUs on a whole different platform and manufacturing system, which are not available to the general consumer. In the consumer/prosumer space, this is AMD's first go and while they have the experience with EPYC to go on, it's still a very different beast.

The 5800X3D is just a single CCD; the main benefit of that alone is that there's no cross-CCD latency when cores have to switch or read/write each other's data.

The extra cache stacked on top (96 MB of L3 in total) would work wonders on any CPU really, and this cache experiment had already been tested with EPYC, with good results.

Basically, you've got a CPU that is clocked lower and uses half the power, yet with 3D V-Cache it can compete with Intel's highest offerings, which need to run at 5 to 5.5 GHz and consume a truckload of power to do it.

If AMD manages to build a separate voltage rail into its next 3D-cache CPU, we would be able to overclock it. But this CPU is extremely limited in terms of tweaking or overclocking. Still a nice trick.
 