
Are you using an AMD Ryzen X3D CPU with 3D V-Cache?

  • Yes, Zen 3 (5000X3D): 4,330 votes (13.6%)
  • Yes, Zen 4 (7000X3D): 3,038 votes (9.5%)
  • Running a classic Ryzen: 14,775 votes (46.2%)
  • Running Intel: 9,807 votes (30.7%)
  • Total voters: 31,950
  • Poll closed.
5800X3D here. Upgraded from a 1700 non-X a few months ago, same board, same Windows install, same RAM. (32GB B-die, 3200MHz, C14)

Updated BIOS all the way to latest beforehand (ASRock X370 Professional Gaming), and it still booted and ran my 1700 fine. Swapped chips and things just worked! It's been an enormous upgrade in everything I do. Awesome platform.
 
5800X3D and 6800XT on 1440p 165Hz. Plenty of performance and the 1% lows are insane.
The palm of my hand itches for a 4090; if PowerColor had made a Red Devil version of the 4090, I wouldn't have been able to resist it.
 
I'm running a 9700k and 2070 super. It's still good enough for me right now. My buddy did get a new PC and I convinced him to shell out for the 7800x3d. If I felt the upgrade itch, I would also run a 7800x3d.
 
Currently running a 5800X3D after upgrading from a 3600X. Didn't think I'd notice anything, but games just seem much smoother in general.
 
I asked because the title makes it sound like there are X3D CPUs without 3D V-Cache.
Well, you were right, regardless. :D
I'm running a 9700k and 2070 super. It's still good enough for me right now. My buddy did get a new PC and I convinced him to shell out for the 7800x3d. If I felt the upgrade itch, I would also run a 7800x3d.
Your 9700K should be more than enough at 1440p with your 2070 Super. That's a damn good gaming CPU. :D
It's part of my collection. I have a lot of hardware.
Oh good! I was afraid that I was the only one who collects hardware! I'm not alone in my insanity! :roll:
It's not enough for ultra settings in CP2077 even at 1080p!

Just kidding; it's literally a handful of games that don't run exactly sweetly on this GPU. The 7700K is what ruins that gentleman's UX the most.
Yeah, I was gonna say... the 1080 Ti is about on par with the 5700 XT except that it has more VRAM. That's probably the greatest card that nVidia has ever made.
I'm currently stuck on Intel because I got a premium-tier LGA1200 motherboard, and swapping to either AM4 (too little uplift) or AM5 (extremely expensive where I live) makes negative sense. LGA1700 makes very little sense, too. I'm mostly bottlenecked by my GPU anyway; the RX 6700 XT is not a 3440x1440@165 GPU, after all.
Hey, if it works, it works, eh? I always say that the perfect gaming PC has a video card that is fast and a CPU that is "fast enough" and for your purposes, the 7700K is definitely fast enough. ;)
I'll just get myself something decent (an i7-11700 most likely) and call it a day. My current EngiSample CPU is running 8c8t at 4 GHz and it's also degrading, so it's not ideal.
Honestly, you'd probably be just fine with an i5. An i5 is a much better value than an i7 for gaming.

Probably not worth it at this time. I built him a decent gaming rig and all he plays is Roblox anyways. I could have simply got him a Dell. /holdsheadinshame
But you didn't and that's what matters most! ;)

5800X3D and 6800XT on 1440p 165Hz. Plenty of performance and the 1% lows are insane.
The palm of my hand itches for a 4090; if PowerColor had made a Red Devil version of the 4090, I wouldn't have been able to resist it.
That's exactly the configuration that I had and I was 100% happy with it. However, I heard whispers of a new upcoming GPU shortage, not because of mining but because AMD, Intel and nVidia are planning to dedicate a huge amount of their fab allocation to making AI chips. I ignored the warnings the last two times that it happened and, while I managed to escape the first one unscathed, the second one really screwed me hard. This time, I decided to get ahead of things and I grabbed an open-box ASRock Radeon RX 7900 XTX Phantom Gaming OC from newegg.ca for $1,143CAD ($845USD). The thing was cheaper than an RX 7900 XT and I just couldn't say no to it. I'll probably end up homeless because of it. :roll:

5800X3D here. Upgraded from a 1700 non-X a few months ago, same board, same Windows install, same RAM. (32GB B-die, 3200MHz, C14)

Updated BIOS all the way to latest beforehand (ASRock X370 Professional Gaming), and it still booted and ran my 1700 fine. Swapped chips and things just worked! It's been an enormous upgrade in everything I do. Awesome platform.
The R7-1700 was my first Ryzen CPU. I initially wanted an R5-1600X but Canada Computers had an open-box R7-1700 that was only like $30CAD more than the 6-core variant. I couldn't say no to that and it was the perfect upgrade from my FX-8350 (which incidentally is still being used in my mother's HTPC).
 
Honestly, you'd probably be just fine with an i5. An i5 is a much better value than an i7 for gaming.
I also use this PC for work sometimes, and that demands as many cores as possible. It only happens once or twice a year, but still. Two additional cores for a negligible $70 is not bad.
 
I also use this PC for work sometimes, and that demands as many cores as possible. It only happens once or twice a year, but still. Two additional cores for a negligible $70 is not bad.
No, not bad at all. :D
 
Yeah, 5800X3D
I was using a 1600AF before and, Jesus Christ, the difference is enormous in every task, gaming especially. I've never had stuttering issues since.
Besides the hot temps, I love it very much!
 
5800X3D
Very nice. It helps with MS FS2020, which is the only time it really gets tested. I used to have the 3600, which was a pretty good chip and ran quite a lot cooler, but it lacked oomph!
 
I recently went from 5800x to 5950x, I do other things besides gaming and the extra cores have helped a lot.
 
Ended up with a 7800X3D today anyway, as a coworker decided he should move over to laptop only... his loss is my gain.
 
I've still got a 2600X, but I'll be buying a new mobo, a 7800X3D, 32GB of DDR5, an M.2 drive and a 7900XTX in September or October. I'm reusing my current board, RAM and RTX 2060, and buying a 5600 CPU, to rebuild my daughter's PC that she spilled noodles into last year.
 
I upgraded to a 5800X3D from a 3700X last December. Same motherboard, and cooler. Performance is nice.


Vanilla Ryzen 7000 is nice too.


And the somewhat old Comet Lake laptop is chugging along.
 
My friends, 5800X3D user here. In the last two days I had four reboots out of nowhere, and I did a lot of searching to find out what was happening. The CPU cooler was working perfectly, and I also checked the thermal paste, but everything was fine. Finally I found out that my motherboard, a ROG Strix X570-F Gaming, in combination with the latest version of the chipset driver (5.08.02.027), increased the CPU voltage to 1.4 V without my changing anything in the BIOS. Luckily the failsafe worked, with a CPU Over Temperature Error message. I informed AMD about the problem as well as its solution: going back to the previous driver. Now I have no problem.
 
5800X3D here, paired with a 4080 at 3440x1440. Amazing chip for sim racing, etc., and I'm going to hold on until late-gen AM5.

Currently running a 5800X3D after upgrading from a 3600X. Didn't think I'd notice anything, but games just seem much smoother in general.
That 3D V-Cache does wonders for 1% lows; it helps drive the smoother experience.
 
Still sticking to my Ryzen 3600 non-X. It did come to mind that if I were to upgrade, I'd seek out the 5800X3D, because of the single CCD plus giant cache, but if it weren't available I'd go straight for the 5950X as a final upgrade for my rig, until it became too slow and a further upgrade was absolutely necessary.

Alas, the current (almost eternal at this point) economic situation in Argentina makes it a very low priority when the 3600 is still good enough for my day-to-day.
 
My friends, 5800X3D user here. In the last two days I had four reboots out of nowhere, and I did a lot of searching to find out what was happening. The CPU cooler was working perfectly, and I also checked the thermal paste, but everything was fine. Finally I found out that my motherboard, a ROG Strix X570-F Gaming, in combination with the latest version of the chipset driver (5.08.02.027), increased the CPU voltage to 1.4 V without my changing anything in the BIOS. Luckily the failsafe worked, with a CPU Over Temperature Error message. I informed AMD about the problem as well as its solution: going back to the previous driver. Now I have no problem.
Yeah, ASUS has been garbage as of late, probably because a lot of people just blindly buy ASUS without remembering that other names like ASRock, Biostar and Gigabyte also exist. When I worked at Tiger Direct, I saw just how overpriced ASUS motherboards were (even then), and thus I've never owned an ASUS motherboard.

The last five motherboard brands that I've owned have been, in order, ECS/Elitegroup (AM2+), Gigabyte (AM3+), ASRock (X370), ASRock (X570) and Biostar (A320). They all work just fine and they were a fraction of what a comparable ASUS board would've cost. That ECS board is especially impressive because I needed a cheap board to replace my MSi K9A2 Platinum (which died unexpectedly and I learned just what a-holes work at MSi support) and that board still functions flawlessly today.

Don't be concerned with brand when buying a motherboard. Always remember that noobs buy by brand while experts buy by spec. An ASUS ROG Strix motherboard may look pretty but it won't give you better gaming performance than an ASRock Phantom Gaming, it'll just cost a crapload more.
 
Upgraded from 9900K that ran 5.15GHz 0AVX and 4000-14-14-14-28-2T DDR4.
[CPU-Z screenshot: 7800X3D with DDR5-6400, AVX2-512 stable in PerfTest 10]
 
Intel 13th-gen is perfectly fine.... for people who have their roof covered with solar panels.
And live in cold climates.
I've got some Intel parts coming to play around with soon, and I'm not looking forward to using them in summer.

The difference is night and day. Once you get used to the buttery smooth 1% lows, you just can't go back. X3D chips are a real game changer.
That's exactly it.
Using FPS caps to keep the GPU under 100% usage is the next part, so you get perfectly flat frametimes that simply never vary, and it's hard to ever go back to anything else.
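For anyone curious about the mechanics, here's a minimal Python sketch of what a frame limiter does (a toy illustration with made-up numbers, not how RTSS or any in-game limiter is actually implemented): any frame that finishes early sleeps out the rest of its time slot, so frametimes come out flat as long as the hardware can render faster than the cap.

import time

CAP_FPS = 120                  # assumed cap, safely below a 165 Hz refresh rate
FRAME_BUDGET = 1.0 / CAP_FPS   # ~8.33 ms per frame

def render_frame():
    time.sleep(0.005)          # stand-in for real render work (~5 ms, under budget)

deadline = time.perf_counter()
for _ in range(10):
    start = time.perf_counter()
    render_frame()
    deadline += FRAME_BUDGET
    spare = deadline - time.perf_counter()
    if spare > 0:
        time.sleep(spare)      # idle out the leftover budget, keeping the GPU under 100%
    print(f"frametime: {(time.perf_counter() - start) * 1000:.2f} ms")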

Yikes... I spent years repairing and fixing those computers and I'm not a fan. However, if that was all you had, I get it and I would get one too. They would not have been my go-tos.
Those of us growing up with family PCs learned very quickly that parents would buy utter trash because they didn't feel like spending the mental energy on researching the product and would rather 'trust the salesperson'. We also learned how to tweak the crap out of those PCs to stop them being so utterly terrible.

Upgraded from 9900K that ran 5.15GHz 0AVX and 4000-14-14-14-28-2T DDR4.
[CPU-Z screenshot: 7800X3D with DDR5-6400, AVX2-512 stable in PerfTest 10]
Check your PBO settings/Eco modes. I did some testing in another thread on a 7950X and got these values:

7950X Eco modes (all-core clock + temps, then render times; lower is better):
65W TDP / 88W PPT (4050MHz, 40.3c): 8.0s
105W TDP / 158W PPT (4800MHz, 62c): 6.4s (25% faster)
170W TDP / 262W PPT (5100MHz, 95c): 6.0s (33% faster than 65W / 6.6% faster than 105W)
(In this case, there'd be a sweet-spot value around the 200W-220W mark, with <80c temps and a 1-3% performance loss.)

You're at 95W on a CPU meant to cap at 88W. Set a lower PPT limit and/or use an undervolt curve and you'll find the threshold where the temps just plummet.
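If you want to double-check the math, here's a quick Python sanity check using the render times straight from the table above ("faster" meaning throughput, i.e. base time divided by new time):

times = {"65W TDP / 88W PPT": 8.0, "105W TDP / 158W PPT": 6.4, "170W TDP / 262W PPT": 6.0}
base = times["65W TDP / 88W PPT"]
for mode, t in times.items():
    print(f"{mode}: {t}s -> {(base / t - 1) * 100:.1f}% faster than 65W")
# 8.0/6.4 = 1.25 -> 25% faster; 8.0/6.0 = 1.33 -> 33% faster.
# But 6.4/6.0 is only ~1.067: the jump from 105W to 170W buys roughly 6.7% more speed
# for about 100W more power, which is why the sweet spot sits well below 262W PPT.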
 
Check your PBO settings/Eco modes. I did some testing in another thread on a 7950X and got these values:

7950X Eco modes (all-core clock + temps, then render times; lower is better):
65W TDP / 88W PPT (4050MHz, 40.3c): 8.0s
105W TDP / 158W PPT (4800MHz, 62c): 6.4s (25% faster)
170W TDP / 262W PPT (5100MHz, 95c): 6.0s (33% faster than 65W / 6.6% faster than 105W)
(In this case, there'd be a sweet-spot value around the 200W-220W mark, with <80c temps and a 1-3% performance loss.)

You're at 95W on a CPU meant to cap at 88W. Set a lower PPT limit and/or use an undervolt curve and you'll find the threshold where the temps just plummet.
This is why I love my 7900X3D. It never pulls more than 105 watts and is set at a maximum of 1.25 volts. As a result, even putting the CPU in Eco Mode only loses 300 MHz off the boost clock, and all cores will still run over 5 GHz.
 
This is why I love my 7900X3D. It never pulls more than 105 watts and is set at a maximum of 1.25 volts. As a result, even putting the CPU in Eco Mode only loses 300 MHz off the boost clock, and all cores will still run over 5 GHz.
I consider "Eco-Mode" to be the true default mode. AMD set the default to max OC because that's what Intel did to try to make their CPUs look better than they really are in review benchmarks. If AMD hadn't done it, their CPUs would appear far slower than Intel's even though that's just not the case. It's getting ridiculous these days.
 
And live in cold climates.
I've got some Intel parts coming to play around with soon, and I'm not looking forward to using them in summer.


That's exactly it.
Using FPS caps to keep the GPU under 100% usage is the next part, so you get perfectly flat frametimes that simply never vary, and it's hard to ever go back to anything else.


Those of us growing up with family PCs learned very quickly that parents would buy utter trash because they didn't feel like spending the mental energy on researching the product and would rather 'trust the salesperson'. We also learned how to tweak the crap out of those PCs to stop them being so utterly terrible.


Check your PBO settings/Eco modes. I did some testing in another thread on a 7950X and got these values:

7950X Eco modes (all-core clock + temps, then render times; lower is better):
65W TDP / 88W PPT (4050MHz, 40.3c): 8.0s
105W TDP / 158W PPT (4800MHz, 62c): 6.4s (25% faster)
170W TDP / 262W PPT (5100MHz, 95c): 6.0s (33% faster than 65W / 6.6% faster than 105W)
(In this case, there'd be a sweet-spot value around the 200W-220W mark, with <80c temps and a 1-3% performance loss.)

You're at 95W on a CPU meant to cap at 88W. Set a lower PPT limit and/or use an undervolt curve and you'll find the threshold where the temps just plummet.
Thanks for the suggestion, but with custom water cooling, reducing TDP sounds counter-intuitive to my goals of achieving higher performance. I just meant to point out that the Prime sieve algorithm in PerfTest 10.2 is an anomaly in terms of pushing temperatures.
 
I consider "Eco-Mode" to be the true default mode. AMD set the default to max OC because that's what Intel did to try to make their CPUs look better than they really are in review benchmarks. If AMD hadn't done it, their CPUs would appear far slower than Intel's even though that's just not the case. It's getting ridiculous these days.
100%. For gamers, Eco Mode is the place to be. For tweakers, you then find a way to undervolt and get as much performance as possible within that eco range.
Thanks for the suggestion, but with custom water cooling, reducing TDP sounds counter-intuitive to my goals of achieving higher performance. I just meant to point out that the Prime sieve algorithm in PerfTest 10.2 is an anomaly in terms of pushing temperatures.
There's a reason my 5800X3D is 4% faster than TechPowerUp's review system while running cooler (I've got a 3090 pre-heating my coolant; long term, I'd love to put the CPU back on air): using less power and limiting those values is a big part of why.
Logic can be misleading if you're missing information; it's the biggest enemy of beginners.

Horrible, HORRIBLE rushed Paint image: would you rather have the spiky clocks and performance that get you a win in 'maximum' performance but worse lows, or a flat value above the average of the spiky result?
[Rough sketch: spiky clocks winning on 'maximum' performance vs. a flat clock line sitting above the spiky average]


With modern hardware, the boost methods are *designed* to work within limits. Think of the BIOS limits as a way to cap performance at 95%, whereas the hardware limits in the CPU/GPU might briefly drop it far lower. On Ryzen they call it clock stretching: cores will idle for a few milliseconds (they can change states every single ms), so you could get 1% more peak performance, only to have the CPU forced to sleep 5% of the time, harming overall performance. Synthetic testing doesn't always catch this sort of thing, as gaming runs the whole PC hard at once, producing more heat over long time periods.
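As a back-of-the-envelope version of that trade-off, in Python (toy numbers lifted from the description above, not measurements):

# Give up ~1% of peak clock to run 100% of the time, versus chasing full peak
# clock while clock stretching forces ~5% idle time.
steady    = 0.99 * 1.00   # 99% clock, 100% duty cycle
stretched = 1.00 * 0.95   # 100% clock, but asleep 5% of the time
print(f"steady:    {steady:.3f}")     # 0.990
print(f"stretched: {stretched:.3f}")  # 0.950 -> ~4% slower overall despite the higher peak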


77c/121W PPT (107W CPU, rest is from SoC running 64GB of ram)




As an example of how this pays off, here are screenshots from me playing Borderlands 2 at 4K 144Hz (in Vulkan).

120FPS cap with +2 frames from the render queue
[Frametime graph screenshot]

Perfectly flat frametimes. Zero variation. Nothing ever hits any hardware throttles (CPU, GPU, VRMs, etc.), so it never fully downclocks.
 
R5 7600, upgraded from an R5 5600X. The CPU + B650 + 32GB of DDR5-4800 CL40 cost me only 270 Euro, thanks to second-hand deals and a damaged board whose socket pins I needed to bend back. :)
Around a 20% uplift even with crappy memory.
 
Intel 13th-gen is perfectly fine.... for people who have their roof covered with solar panels. ;)

And live in cold climates.
I've got some Intel parts coming to play around with soon, and I'm not looking forward to using them in summer.
Today I finished chapter 3 of WoT exclusively on the 12500 + iGPU system (the game is only installed there). The 13500 behaves similarly.
Duration: 3 months and 2 days
Average daily gaming: ~ one hour
Maximum consumption in game: <60W (wattmeter)
As the wattmeter indicates 35W on YouTube at 1080p, 55W at 4K@60FPS and 24W at idle, I'm afraid it's not enough to warm me in the coming winter. I'm also afraid that this style of using the computer (light games, web, office, news, forums and pauses between them, everything on the iGPU) leads consumption into such a low range that it's almost impossible for a Ryzen to match in the same tasks. In reality, a Ryzen's consumption is much higher.

It also has reserves for heavier loads, but that's another discussion. So don't generalize; cheap irony doesn't do you any good.

P.S. I use this system for gaming because it is sufficient. The more powerful system doesn't add anything to my skill; I didn't see an increase in my in-game efficiency using the 13500 + 3070 Ti.
 