
Final AM4 Upgrade

I'd keep the 5600X which is fine for gaming and put everything into GPU.

4070 Ti Super etc.

Eventually move to 1440p.
I had a 5600X for $200 USD before the 5700X3D ($147 USD) I have now. With a 3070 at 1440p you can't feel much difference except the 1% lows, which got better (RDR2, CP2077). Not sure the 5600X can drive any NV GPU above 3080 performance level without a bottleneck. AMD GPUs seem more suited to the 5600X from what I've seen in videos on YouTube.

But looking at the forums here, most of the issues are with AMD cards, for games and drivers. I never had anything newer than an RX 5600 XT, but I was tired of debugging issues instead of playing and jumped back to NV.
 
Yeah, peeps in the US tend to forget this is an international forum, where there's a hella pricing premium on hardware, especially Intel and nVidia components. :(

A 5600 or 5600X would still be fine for gaming, though if you wanna hold on to your AM4 platform a little longer, a 5x00X3D chip is the way to go. I 'upgraded' my 5900X to a 5700X3D (couldn't justify the price/performance difference) in my gaming rig because, well, it's my main gaming rig, and gaming performance is a priority. My 5900X went into a 2nd build (with a spare RX 6900XT, Enermax MAXREVO 1500W PSU, SOLDAM XR-1 case, and 32GB RAM I had, after upgrading both the CPU and GPU in my main rig) for some ripping and other productivity work, in addition to gaming of course.

I had a 5600X for $200 USD before the 5700X3D ($147 USD) I have now. With a 3070 at 1440p you can't feel much difference except the 1% lows, which got better (RDR2, CP2077). Not sure the 5600X can drive any NV GPU above 3080 performance level without a bottleneck. AMD GPUs seem more suited to the 5600X from what I've seen in videos on YouTube.

But looking at the forums here, most of the issues are with AMD cards, for games and drivers. I never had anything newer than an RX 5600 XT, but I was tired of debugging issues instead of playing and jumped back to NV.
Are you saying that the R5 5600X would most likely bottleneck an RTX 3080 but not, say, an RX 6800XT/RX 6900XT? Serious?
 
Are there any examples comparing the 40 series to RDNA 3?
I'd suggest checking out reviews here on TPU, especially in newer games, and decide for yourself which one is worth the price.
 
Yeah, peeps in the US tend to forget this is an international forum, where there's a hella pricing premium on hardware, especially Intel and nVidia components. :(

A 5600 or 5600X would still be fine for gaming, though if you wanna hold on to your AM4 platform a little longer, a 5x00X3D chip is the way to go. I 'upgraded' my 5900X to a 5700X3D (couldn't justify the price/performance difference) in my gaming rig because, well, it's my main gaming rig, and gaming performance is a priority. My 5900X went into a 2nd build (with a spare RX 6900XT, Enermax MAXREVO 1500W PSU, SOLDAM XR-1 case, and 32GB RAM I had, after upgrading both the CPU and GPU in my main rig) for some ripping and other productivity work, in addition to gaming of course.


Are you saying that the R5 5600X would most likely bottleneck an RTX 3080 but not, say, an RX 6800XT/RX 6900XT? Serious?
The 5600X's CPU usage is 45-50% in some titles with Nvidia hardware, which would mean ~100% without SMT, and the GPU drops below 97% utilization. Combined with a 6950XT/7900XT, it's not a case of GPU under-utilization.

Here's one example of 4080 usage: [video]

3080 Ti: [video]

And with the 7900XT: [video]
 
I'd suggest checking out reviews here on TPU, especially in newer games, and decide for yourself which one is worth the price.
I meant the encoder
I'm fine with 7700 XT performance
 
I meant the encoder
I'm fine with 7700 XT performance
Oh that... I guess you could look for YouTube videos, although I'd take everything with a huge pinch of salt because of YouTube's compression.
 
Oh that... I guess you could look for YouTube videos, although I'd take everything with a huge pinch of salt because of YouTube's compression.
couldn't find any :<
 
SMT doesn't cut CPU usage in half.

The last time I disabled SMT/HT was with an i7 4790, and there it was the case. Will check later with the 5700X3D, thanks for the tip.
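If anyone wants to sanity-check how load is actually spread across logical cores instead of trusting a single aggregate usage number, here's a rough Python sketch (Linux only, stdlib only; the function names are just mine) that samples /proc/stat twice:

```python
# Sketch: per-logical-core utilization from /proc/stat (Linux only).
# Useful for checking whether "CPU usage" really doubles with SMT off,
# or whether load just redistributes across fewer threads.
import time

def read_cpu_times():
    """Return {cpu_name: (busy, total)} jiffies from /proc/stat."""
    stats = {}
    with open("/proc/stat") as f:
        for line in f:
            # per-core lines look like "cpu0 ...", skip the aggregate "cpu" line
            if line.startswith("cpu") and line[3].isdigit():
                name, *vals = line.split()
                vals = [int(v) for v in vals]
                idle = vals[3] + vals[4]          # idle + iowait
                total = sum(vals)
                stats[name] = (total - idle, total)
    return stats

def per_core_usage(interval=0.5):
    """Percent busy per logical core over `interval` seconds."""
    a = read_cpu_times()
    time.sleep(interval)
    b = read_cpu_times()
    usage = {}
    for cpu in a:
        busy = b[cpu][0] - a[cpu][0]
        total = b[cpu][1] - a[cpu][1]
        usage[cpu] = 100.0 * busy / total if total else 0.0
    return usage

if __name__ == "__main__":
    for cpu, pct in sorted(per_core_usage().items()):
        print(f"{cpu}: {pct:.1f}%")
```

With SMT on, a game that pins 8 threads on a 16-thread chip shows ~50% aggregate even though those 8 cores are flat out; this makes that visible per core.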
 
The last time I disabled SMT/HT was with an i7 4790, and there it was the case.
HT/SMT has never been that efficient in Core or Ryzen CPUs IMO; if it were, Intel would NEVER have dropped it.

What you saw could be a consequence of something else, I dunno.
 
4070 Ti Super etc.
A plain 4070S will be mighty fine for both 1080p and 1440p (DLSS P helps a lot), unless you crave perfection in image quality. The OP seems to never play anything stupidly heavy, so it might be overkill.

The difference might be invested in a monitor upgrade, or maybe some other peripheral. New headset perhaps..?
 
The 3D cache is good to have if you also have a high-end GPU and you're chasing ultra high frame rates, but it's not necessary by any means. As long as you're GPU limited, you won't feel much, if any difference.
It's not just for "ultra high FPS", it also does wonders for 1% lows and stuttering.
5 years minimum
If you're gonna keep it, and can get an 8-core X3D chip cheap, then do it. You won't regret it.
 
Not gonna lie, at stock the 5600X is doing alright against the X3D.

It can do better.

Is he running 2R or 1R memory? I think that helps too.
 
HT/SMT has never been that efficient in Core or Ryzen CPUs IMO; if it were, Intel would NEVER have dropped it.

What you saw could be a consequence of something else, I dunno.
Red Dead Redemption 2
CPU 5700X3D CO -25
GPU 3070

[screenshot attachment]


There is something very wrong with SMT for me: I get more FPS with SMT off than on, but CPU usage is more than double with SMT off. Even the GPU draws more power.
Have to install/test more games.
 
Red Dead Redemption 2
CPU 5700X3D CO -25
GPU 3070

[screenshot attachment]

There is something very wrong with SMT for me: I get more FPS with SMT off than on, but CPU usage is more than double with SMT off. Even the GPU draws more power.
Have to install/test more games.
GPU drawing more power is good. It means it's being utilised more, it spends less time waiting for data from the CPU.
 
Guess I'll ask here but
if I were to go 1440p, what 240 Hz monitors are there that I should get?
I want BFI because of how well it was done on the XG2431
 
I love my 5800X3D, but I moved to it from a 3700X, and I play some CPU-heavy titles too. Going from Zen 3 to Zen 3 X3D is a bit harder to recommend unless you know you play CPU-heavy titles and are looking to extend AM4 as long as possible, and even then I'd still only do it if it's cheap enough since you're already using a 5600X.

I'd definitely recommend the 5700X3D over the 5800X3D instead though. While it's 5% to 10% slower on average than the 5800X3D (this shrinks as resolution increases, so it might be closer to 3% to 5% in practice), pricing favors it way too much. In the US, the 5700X3D is $190 and the 5800X3D is like... between $380 and its original MSRP of $450, so it's pretty much not worth buying that one anymore.

If you plan on going to 1440p high refresh, you would have more reason to potentially skip the CPU upgrade and get a better graphics card instead.

Also, AMD's video encoding is fine, as far as I know? From everything I've seen, all three brands (so this includes Intel) seem to be pretty comparable when it comes to AV1 and HEVC. There is a difference when it comes to H.264, where AMD does do worse, but why use that when AV1 is available? Unless you're streaming to Twitch (or has Twitch started allowing streaming in AV1 yet?), you probably shouldn't need to. There are plenty of features where AMD is well behind nVidia (such as ray tracing, upscaling, and power efficiency), but I'm not sure encoding is really one of them anymore.

(Not saying to get an AMD graphics card or anything, just saying that a lot of the information floating around regarding encoding seems to be just old narrative persisting.)
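Just to put rough numbers on the price/performance argument above, a quick illustrative sketch (the prices and the perf-gap midpoint come from the ballpark figures in this post, not from measurements):

```python
# Illustrative value math for 5700X3D vs 5800X3D (US pricing quoted above,
# ~5-10% average gaming perf gap -> ~7.5% midpoint; treat as ballpark only).
prices = {"5700X3D": 190, "5800X3D": 380}
perf = {"5700X3D": 0.925, "5800X3D": 1.0}

# relative performance per dollar spent
value = {cpu: perf[cpu] / prices[cpu] for cpu in prices}
for cpu, v in sorted(value.items(), key=lambda kv: -kv[1]):
    print(f"{cpu}: {v * 100:.3f} perf per $100")
```

Even at the pessimistic end of the perf gap, the 5700X3D delivers nearly twice the performance per dollar, which is the whole argument for it.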
 
I love my 5800X3D, but I moved to it from a 3700X, and I play some CPU-heavy titles too. Going from Zen 3 to Zen 3 X3D is a bit harder to recommend unless you know you play CPU-heavy titles and are looking to extend AM4 as long as possible, and even then I'd still only do it if it's cheap enough since you're already using a 5600X.

I'd definitely recommend the 5700X3D over the 5800X3D instead though. While it's 5% to 10% slower on average than the 5800X3D (this shrinks as resolution increases, so it might be closer to 3% to 5% in practice), pricing favors it way too much. In the US, the 5700X3D is $190 and the 5800X3D is like... between $380 and its original MSRP of $450, so it's pretty much not worth buying that one anymore.

If you plan on going to 1440p high refresh, you would have more reason to potentially skip the CPU upgrade and get a better graphics card instead.

Also, AMD's video encoding is fine, as far as I know? From everything I've seen, all three brands (so this includes Intel) seem to be pretty comparable when it comes to AV1 and HEVC. There is a difference when it comes to H.264, where AMD does do worse, but why use that when AV1 is available? Unless you're streaming to Twitch (or has Twitch started allowing streaming in AV1 yet?), you probably shouldn't need to. There are plenty of features where AMD is well behind nVidia (such as ray tracing, upscaling, and power efficiency), but I'm not sure encoding is really one of them anymore.

(Not saying to get an AMD graphics card or anything, just saying that a lot of the information floating around regarding encoding seems to be just old narrative persisting.)
I know, it's just that I feel pretty disappointed with my 5700 XT
it seems to struggle despite using H.264 and H.265
as for AV1? I just plan to post on YouTube
 
I have a healthy market on Facebook Marketplace; these CPUs consistently sell for $200 used
Is there even buyer protection on there?

eBay and Amazon have it

I'd suggest checking out reviews here on TPU, especially in newer games, and decide for yourself which one is worth the price.
Good Old Gamer and RATech on YouTube
 
I know, it's just that I feel pretty disappointed with my 5700 XT
it seems to struggle despite using H.264 and H.265
as for AV1? I just plan to post on YouTube
AMD was/is well behind nVidia and Intel when it came to H.264/AVC, so your poorer experience with that on Radeon makes sense.

With AV1, they should all be pretty similar for the most part, and I'd recommend using AV1 where available since it should generally give a better file size/quality ratio. I only use it seldom, for recording videos (and don't notice any performance loss) and the occasional YouTube upload, and I've been much happier with it than with my previous process.
 
AMD was/is well behind nVidia and Intel when it came to H.264/AVC, so your poorer experience with that on Radeon makes sense.

With AV1, they should all be pretty similar for the most part, and I'd recommend using AV1 where available since it should generally give a better file size/quality ratio. I only use it seldom, for recording videos (and don't notice any performance loss) and the occasional YouTube upload, and I've been much happier with it than with my previous process.
Does YouTube support AV1 encoding?
If so, I'm pretty much sold on going 7800 XT-7900 GRE, since I'm happy with the performance improvements since RDNA 1
 
Does YouTube support AV1 encoding?
If so, I'm pretty much sold on going 7800 XT-7900 GRE, since I'm happy with the performance improvements since RDNA 1
I know that YouTube supports AV1 for streaming, unlike Twitch. I assume they support it for standard video too, but I'm not sure.
 
I know that YouTube supports AV1 for streaming, unlike Twitch. I assume they support it for standard video too, but I'm not sure.
Did some digging around:
you have to be popular, or they'll re-encode it to H.264
but AV1 converted to H.264 would still be better than native H.264 on Radeon, right?
 
Does youtube support AV1 encoding?
If so im pretty much sold on going 7800 xt-7900 GRE since im happy with the performance improvements since RDNA 1
They do for streaming as far as I know, but I've never done streaming so I can't verify to what extent this is true.

It's possible they do their own re-encoding on their end after upload regardless? I don't know. I just know I can use the AV1 files that I get with no performance loss while recording, and I get small sizes, good quality, and fast uploads, and it seems like the HD resolutions are processed fast, but I don't know if that means it directly takes the AV1 files or if it's just faster to process because I'm feeding YouTube much smaller files to begin with.

If this re-encoding happens on their end regardless (not sure if it does or doesn't), then what GPU brand you have isn't going to be relevant anyway. But even if it does happen, I've been happy with my own experiences. Note that if you do this at a larger scale/for a living/whatnot, definitely do your research and look into it and don't just take someone's word for it. But if it's just small time or whatever, yeah, dealing with/uploading AV1 is fantastic with YouTube in my experience, whether they re-encode it or not.
 