
Upgraded to 5800X3D and RX 7900 XTX, nothing but problems, mediocre performance

Are you still getting sound cutouts?

Can you enable CSM in the BIOS (it will disable reBAR) and then try to run the benches again?
Haven't noticed any sound cutouts, but I'm also using another display right now, so that doesn't mean much. I'll mess with CSM on/off reBAR on/off later, just getting Windows updates and stuff like that right now.

Honestly, if I can get performance where it should be there are ways I can change up connections and get around the sound issue. I'd rather just run everything through the TV, it's a lot simpler that way (mostly for guests, I don't care if there are 10 remotes) but it's not the end of the world.
 
It's not that - it's that CSM / reBAR is bugged on some Gigabyte boards with certain hardware - it's been a bug forever. The sound dropouts are a very common symptom of it.

Enabling PCIe 3.0 instead of 4.0, or enabling CSM, could resolve it.
Also, sweet ITX build :toast: (once it's working per spec)
 
Hey, sorry you're not feeling like you're getting the results you want. I have the 5800X3D as well but had zero issues, and this is coming from drop-in upgrades from a Ryzen 3950X, to a 5900X, to the X3D.

First, could you run CB23 and tell me your CPU results for multi core? Mine always scored anywhere from 14.9 to 15.1k with run-to-run variance after the -30 all-core undervolt.

As for games, what do you get with that setup in the CP77 benchmark on RT Ultra settings using FSR Quality? The 7900 XTX should be close to my 3080 Ti. I usually get 75 fps average at 1440p with DLSS Quality.

I also highly recommend the FFXIV Endwalker benchmark as a consistent game bench that exposes CPU/RAM bottlenecks as well as top-end GPU. Run it in 1440p for a comparable score. I usually score around 30.5k there with my setup. For reference, the score was only 25k on the 5900X and 20k on the 3950X, all with the rest of the setup unchanged. Just trying to help isolate your subpar performance findings to see what it could be.

Including my Time Spy score as well, just as another reference comparison, mainly for the CPU. Your XTX will hands down outperform on the GPU side.
[Screenshot: Time Spy result at 1905 MHz / 900 mV]
 
[Screenshots: Cyberpunk 2077 benchmark results at 4K path tracing native, 4K ultra with FSR, and 1440p ultra with FSR]

Not terrible, but not what I expected/hoped for. Raytracing does matter to me, so again, guess I should have gone Nvidia.
 
What generation are your HDMI cables? That could be the culprit. The other two GPUs you mentioned do not support that. GPU cable specs are more important than ever and often overlooked. One thing you can do is feel the cable while gaming: if it feels hot, then you have an issue.
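As a rough illustration of why the cable spec matters (a back-of-the-envelope sketch, not exact HDMI math; real links add blanking and encoding overhead beyond this naive figure):

```python
# Naive estimate of uncompressed video bandwidth vs. HDMI cable ratings.
# Illustrative only: real HDMI signalling adds overhead beyond this rough figure.

def video_bitrate_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking_factor=1.2):
    """Approximate bitrate in Gbit/s, padding ~20% for blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel * blanking_factor / 1e9

print(f"4K60  8-bit RGB: ~{video_bitrate_gbps(3840, 2160, 60):.1f} Gbit/s")   # ~14 Gbit/s
print(f"4K120 8-bit RGB: ~{video_bitrate_gbps(3840, 2160, 120):.1f} Gbit/s")  # ~29 Gbit/s
# An HDMI 2.0-era (Premium High Speed) cable is rated for about 18 Gbit/s, while
# HDMI 2.1 Ultra High Speed cables are rated for 48 Gbit/s, so a 4K120 signal to
# a TV generally needs the newer cable even before HDR/10-bit is considered.
```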
 
Not terrible, but not what I expected/hoped for. Raytracing does matter to me, so again, guess I should have gone Nvidia.

Yeah, in that game specifically Nvidia destroys AMD when RT is on. You really need a 4090 at 4K, and even then DLSS Quality on top, although I would hate to be stuck with FSR to get OK performance regardless.

Really the whole reason I skipped RDNA3 this generation, although Nvidia makes you pay a heavy price for the luxury. When I purchased my 4090, a decent 7900 XTX was $1,100 USD; now it's not hard to get one for $950 USD, which makes the price gap hurt quite a bit more.

I'm not saying you should or shouldn't swap, but if you do, just go 4090; the 4080 is likely to disappoint you as well.

I am kinda surprised that the 1440p RT performance is lower than the 4K performance on the 4090 though, damn. Maybe your card is underperforming a bit.
 
I got the 7900XTX for like $820 after getting a $90 partial refund because Amazon never sent me the Starfield Premium code it was supposed to come with, so yeah, the thought of spending $400+ more for a used 4090 after selling this is very hard to swallow.

Also, still not seeing an improvement in bench numbers / hitting the average.

AMD Radeon RX 7900 XTX video card benchmark result - AMD Ryzen 7 5800X3D,Gigabyte Technology Co., Ltd. X570 I AORUS PRO WIFI (3dmark.com)

Don't know what needs to be tweaked, but I'm quickly getting frustrated again.

Also no idea why it says hardware monitoring was disabled -- it's not.
 
RT is always going to be a challenge but you should be getting amazing 4k raster performance and the 5800X3D shouldn't be holding you back. Kinda surprised you're having so many issues.
 
Not terrible, but not what I expected/hoped for. Raytracing does matter to me, so again, guess I should have gone Nvidia.

I hate to be that guy, but how much did you research before buying? If RT matters at all, especially at high resolutions, AMD is not the way to go, and if you just want to set every game to ultra settings with RT on, the only choice is a 4090, or a 4080 if you want to do more fiddling with settings.

(edit: when it comes to RT even the 4090 doesn't really do set and forget, meaning native)
 
Maybe I'm in the minority, but I wouldn't like 4080 RT performance at 4K; I'd be targeting 1440p with that card if RT was important.
 
@faye It seems that you're scoring around 1,000 points less than @Itz_Vexx on the 3DMark Time Spy CPU test. Checking other results, most people get 12,000 and above (top scores 13k+) on the CPU score, so something could be holding your CPU back, maybe heat or something else, and that is most likely why your GPU result and overall score are lower too.
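Just to put that gap in rough numbers (a quick sketch; the 11,000 figure is an assumption based on "around 1,000 points less" than the typical 12,000 floor mentioned above):

```python
# Rough percentage deficit of the reported Time Spy CPU score vs. typical 5800X3D results.
your_cpu_score = 11_000                        # assumption: ~1,000 points under the usual floor
typical_low, typical_high = 12_000, 13_000     # range most 5800X3D results seem to land in

low = (typical_low - your_cpu_score) / typical_low * 100
high = (typical_high - your_cpu_score) / typical_high * 100
print(f"roughly {low:.0f}-{high:.0f}% below comparable results")
# ~8-15% is well outside run-to-run noise, which points at throttling or a BIOS/power issue.
```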
 
@faye have you tried setting curve optimizer to undervolt your 5800X3D? While I think your motherboard VRM should handle it just fine, if something changes when benchmarking (for the better, let's hope) it could give us all a pointer.
 
You said you updated your BIOS, but did you install all the latest audio and chipset drivers for your motherboard after reinstalling Windows? Sometimes Windows can install things that are incorrect for your hardware, causing a variety of problems. I'd try to go with the latest from the motherboard vendor (or straight from AMD) and see how those perform.

Also, if you go into your BIOS and look at the PCIe settings, does it correctly identify and run your GPU at the right gen/speed? I've seen a few older motherboards not detect GPUs that launched a couple years after they did and run them at Gen2 speeds instead of gen3/4. You can run GPU-z and click the little "?" button next to the Bus Interface box to check it too (that button will run a load on the GPU to force it to run at max, otherwise it cycles down in speed when not loaded). Anyway, sometimes you can fix it by manually setting that in the BIOS instead of leaving it on "Auto". It would also be good to see that it's using x16 lanes instead of x8 for some reason.
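If you'd rather script that check than eyeball it, here's a minimal sketch; it assumes a Linux environment with lspci available (and may need root to show the link-status fields). On Windows, GPU-Z's Bus Interface readout as described above is the easier route.

```python
# Minimal sketch (Linux): read the negotiated PCIe link speed/width for display adapters.
# LnkCap is what the device/slot supports; LnkSta is what was actually negotiated.
import re
import subprocess

out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
for block in out.split("\n\n"):
    if "VGA compatible controller" in block or "Display controller" in block:
        for line in block.splitlines():
            if "LnkCap:" in line or "LnkSta:" in line:
                m = re.search(r"Speed ([\d.]+GT/s).*?Width (x\d+)", line)
                if m:
                    print(line.split(":")[0].strip(), "->", m.group(1), m.group(2))
```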
 

From the very limited data here, your 7900 XTX seems to be performing adequately in Time Spy. Your 5800X3D, however, is underperforming, and the most likely reason is the temps. You need to keep it below 75C for the boost to take full effect. Here's my result with a reference card at stock:

[Screenshot: Time Spy result, stock reference card]


We need meaningful data to narrow down the source of the problem. Cyberpunk benchmark does not include any statistics relevant to the CPU. As has been said, try running Cinebench with HWinfo monitoring temperatures in the background, or a game benchmark that includes CPU performance.

And no, simply buying a 4090 isn't going to be a silver bullet.
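If it helps to have a log to post rather than watching temperatures live, here's a minimal sketch of a load/temperature logger; it assumes Linux with the psutil package ("k10temp" is the usual Ryzen sensor name there), which is an assumption on my part. On Windows, HWiNFO's built-in CSV logging does the same job.

```python
# Log CPU load and temperature once per second while a benchmark runs.
# Assumes Linux + psutil; sensor names vary by platform ("k10temp" is typical for Ryzen).
import csv
import time

import psutil

with open("cpu_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "cpu_load_pct", "temp_c"])
    start = time.time()
    for _ in range(600):                              # ~10 minutes of samples
        load = psutil.cpu_percent(interval=1)         # blocks ~1 s per sample
        temps = psutil.sensors_temperatures().get("k10temp", [])
        temp = temps[0].current if temps else float("nan")
        writer.writerow([round(time.time() - start), load, temp])
```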
 
Also, this is not really a suggestion, but if you wanted to go to higher-performing memory there are only some expensive options out there, like this G.Skill Neo kit. It gets you some tighter timings, and I believe it's B-die (which was the best DDR4), but others can probably confirm. I don't know if it'll be noticeably faster for how much more expensive it is though, and as others have pointed out, there's likely a bigger problem here. Getting slightly better memory is not going to make that large of a difference, especially with your CPU.

I think temperatures are a good place to look more as well. Your average temps might be OK, but in really high-load areas it may actually be hitting a temperature threshold that causes throttling, which you'll notice as sudden drops in performance. What cooler are you using, and what thermal paste? While the power of the 5800X3D is quite low, it has a very high power density, which means you need as good a quality paste as possible and usually end up with an oversized cooler to keep up. The 5800X had the same issue and was one of the hottest chips of that generation, even though its power was very reasonable, because it was putting all of it through one dense CCD. For paste, I'd recommend KPx, Prolimatech PK-3 Nano, or Kryonaut Extreme (the pink version).
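To put that power-density point in rough numbers (ballpark published figures, not measurements; the die area and power draw below are my approximations):

```python
# Back-of-the-envelope power density: the 5800X3D pushes nearly all of its package
# power through a single ~81 mm^2 CCD (minus what the I/O die uses), while a
# 5900X/5950X spreads a similar load across two CCDs.
ccd_area_mm2 = 81        # approx. Zen 3 CCD die area
package_power_w = 110    # roughly what a 5800X3D draws in an all-core load

print(f"one CCD:  ~{package_power_w / ccd_area_mm2:.2f} W/mm^2")
print(f"two CCDs: ~{package_power_w / (2 * ccd_area_mm2):.2f} W/mm^2")
# The stacked cache die also adds thermal resistance, which is why a ~110 W chip
# can still be hard for a small AIO to keep under 80 C.
```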
 
I ran one on my other rig with a 7900: Time Spy and Time Spy Extreme, for kicks. It's completely stock, btw.


 
Have you tried using windows 10?

11 is a work in progress
 
I hate to be that guy, but how much did you research before buying? If RT matters at all, especially at high resolutions, AMD is not the way to go, and if you just want to set every game to ultra settings with RT on, the only choice is a 4090, or a 4080 if you want to do more fiddling with settings.

(edit: when it comes to RT even the 4090 doesn't really do set and forget, meaning native)
Not enough, clearly. Saw a good deal on the 7900 XTX, read a couple reviews, raster performance looked similar to a 4090, or at least better than a 4080, for $500+ less. I thought I'd miss RT less than I do when it's off, that's my biggest mistake in all this so far. I'm used to being able to turn everything to max after spending this much on an upgrade, I guess that's just not the way the world works any more.

Have you tried using windows 10?

11 is a work in progress
Having just installed and updated 11 I don't love the idea of going through all that with 10, but I may end up going that route.

Also, this is not really a suggestion, but if you wanted to go to higher performing memory there are only some expensive options out there like this g.skill neo kit. gets you some tighter timings, I believe it's b-die (which was the best DDR-4) but others can probably confirm. I don't know if it'll be noticeably faster for how much more expensive it is though, and as others have pointed out, there's likely a bigger problem here. Getting slightly better memory is not going to make that large of a difference, especially with your CPU.

I think temperatures are a good place to look more as well. Your average temps might be ok, but in really high-load areas it may actually be hitting a temperature threshold that causes throttling, which you'll notice as sudden drops in performance. What cooler are you using, and what thermal paste? While the power of the 5800X3D is quite low, it has a very high power density, which means you need as good quality paste as possible and usually end up with an oversized cooler to keep up with them. The 5800x model had the same issue and was one of the hottest chips of that generation, even though power was very reasonable, because it was putting all the power through one dense CCD. For paste, I'd recommend KPx, Prolimatech PK-3 nano, or Kryonaut Extreme (pink version).
The chip is hitting 80C under load, so yes, it's not boosting consistently. I've got a cheap 240mm AIO on it right now, though reading a few 240mm AIO shootouts it seems to do okay. I'll pull it, redo the paste in case I got a bad mount, and try undervolting. A 240mm AIO is really my only option with the A4-H2O case.

You said you updated your BIOS, but did you install all the latest audio and chipset drivers for your motherboard after reinstalling Windows? Sometimes Windows can install things that are incorrect for your hardware, causing a variety of problems. I'd try to go with the latest from the motherboard vendor (or straight from AMD) and see how those perform.

Also, if you go into your BIOS and look at the PCIe settings, does it correctly identify and run your GPU at the right gen/speed? I've seen a few older motherboards not detect GPUs that launched a couple years after they did and run them at Gen2 speeds instead of gen3/4. You can run GPU-z and click the little "?" button next to the Bus Interface box to check it too (that button will run a load on the GPU to force it to run at max, otherwise it cycles down in speed when not loaded). Anyway, sometimes you can fix it by manually setting that in the BIOS instead of leaving it on "Auto". It would also be good to see that it's using x16 lanes instead of x8 for some reason.
[Screenshot attached]


And yes, I installed the X570 driver package.
 
If you want, disable SMT and your temps will drop to around 40C.

That might even fix your GPU performance problems.
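If you do try that, here's a quick sanity check that SMT really is off after the reboot (a small sketch; using the psutil package here is my assumption, not something from the thread):

```python
# With SMT on, the OS sees twice as many logical CPUs as physical cores
# (16 vs. 8 on a 5800X3D); after disabling SMT the two counts should match.
import psutil

logical = psutil.cpu_count(logical=True)
physical = psutil.cpu_count(logical=False)
print(f"logical CPUs: {logical}, physical cores: {physical}")
print("SMT appears", "enabled" if logical and physical and logical > physical else "disabled")
```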
 
Lots of people have been having trouble with temps on X3Ds on AIOs, and even blocks generally do not produce amazing results either. So many threads in the past year alone, still no consistent explainer for it. If you are using water (including blocks), make sure things are mounted very tight.

Unfortunately in the A4-H2O you don't really have much of a choice other than water. The 55mm air-cooler limit is a deal breaker for running any 8-core Ryzen properly; even 70mm coolers struggle with the cooler-running 5700X and 5700G in confined spaces. A 5800X3D can be power-limited and set to -30 and you would still hit 80C very early.

80C in Cinebench R23 is about expected for a stock 5800X3D, but 80C is generally not acceptable in most games (with maybe the exception of some CPU-heavy games that pin one thread abnormally heavily, and even then it's warm). 80C is not acceptable in Timespy - the CPU test portion is super short and not that demanding.

[Screenshot: Time Spy result with a 4070 Ti (23k / 13k)]

At least 12k is the threshold you want to hit, bare minimum. AGESA versions 1.2.0.7 and later may reduce 5800X3D clocks and performance by messing with the V-F curves. More so than temps being an issue, I would lean towards your specific board/BIOS just being bad for performance. Not exactly a "bug" per se, just how some boards are. There is a lot of board-to-board variance in power/temps/clocks even with the same CPU sample.

Try some different BIOSes if you are open to the idea. I'd go with the 1206 BIOS since it seems to generally be better than anything after it; 1207, 1208 and 120A are all performance-wise the same. 1205 and earlier are not compatible with the X3D (locked to 3.4GHz).
 
Had a Ryzen 5 3600 and RTX 2080 in the Gigabyte X570 I Aorus Pro WiFi mITX board for years, got a 7900 XTX and wasn't happy with performance so I got a 5800X3D. RAM (Gskill Ripjaws DDR4-3600 CL19) had been occasionally flakey for a while so that was replaced with Corsair Vengeance DDR4-3600 CL16. At this point the only thing in the system more than a year old is the motherboard, and it seems to work fine. The chipset fan has been replaced with a Noctua.

BIOS fully updated
Windows 11 Pro fresh install, fully updated, including C/.net/direct X/etc
reBAR/above 4GB addressing enabled in BIOS
AMD X570 drivers installed
AMD Adrenalin 23.9.1 installed (23.9.2 works but control panel will not open)
AMD Ryzen Master installed, just for monitoring

Temps could be better but aren't worrying; the A4-H2O is a small case with somewhat limited airflow. Nothing is hitting above mid 70s C under load.

System is stable but doesn't perform like I just spent $1500 on it, and I'm starting to regret not saving up another $1000 for a RTX 4090 and 7800X3D. A number of people told me I'd be crazy to not just grab the 5800X3D, and looking at benchmarks that seemed to be the simple, easy, cheaper way forward... But it's been a stressful mess, and I'm not seeing the same numbers others are getting with very similar setups.

Microsoft Flight Sim is a good example of this - there are many benchmarks showing the 5800X3D and 7900XTX getting right around 60 FPS average at 4K, ultra settings, DX11 renderer. I do sometimes see that high, even higher, on internal plane view or at very high altitude, but getting reasonably close to any city will drop it into the 40s. That's just... Pathetic. I expect far more from this kind of high end hardware, and I don't think that's unreasonable? So either the benches are all done poorly (unlikely) or there's something wrong with my setup.

I also regularly bench 5-10% below average for "same hardware" in 3DMark. While I don't put a lot of stock in these numbers I think it's telling that I've always seen equal or just above the average with the 3600 and 2080, everything running stock and otherwise equal.

Oh - I went into this knowing AMD hadn't caught up with raytracing, but god it's bad. I expected to be able to do more than with the 2080 and yeah, not really. Between that and software issues - sound cuts out randomly over HDMI, the latest driver package breaks the Adrenalin control panel on a fresh Windows install - I'm really regretting giving AMD another chance after like 15 years of Nvidia cards.

Apart from reinstalling Windows AGAIN (which feels like it's inevitable at this point) any suggestions? I'm at the end of my rope.
Try this: Start menu, Run, use the command "msconfig", click OK.
Try this: Start menu, Run, type "msconfig" and click OK. When the window appears, go to the Boot tab, then press Advanced options, and when that opens see whether the "Number of processors" box is greyed out; open it, select all cores, and reboot.
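After that reboot, a quick way to confirm Windows is actually exposing every thread (a tiny sketch using only the standard library; the "16" assumes a 5800X3D with SMT on):

```python
# Verify the OS sees all logical processors after the msconfig change.
# A 5800X3D should report 16 (8 cores x 2 threads with SMT enabled).
import os

count = os.cpu_count()
print(f"logical processors visible to the OS: {count}")
if count is not None and count < 16:
    print("fewer than expected - re-check msconfig > Boot > Advanced options > Number of processors")
```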
 
Other than re-pasting, can you fit a Deepcool LS520? Reviews seem to indicate that's the current king of the hill for 240mm AIOs, often even beating out the long-running champion Arctic Liquid Freezer II (which has usually won because of its thick radiator design and the offset bracket specifically for Ryzen). I know two people who used the Arctic Liquid Freezer II 240mm on their 5800X to tame it successfully. If the LS520 is beating it and is thinner (so a better chance of fitting in your case), I'd take a look at it. If you don't want the RGB, the LT520 version seems to perform almost as well.
 
Try this: Start menu, Run, use the command "msconfig", click OK.

Try this: Start menu, Run, type "msconfig" and click OK. When the window appears, go to the Boot tab, then press Advanced options, and when that opens see whether the "Number of processors" box is greyed out; open it, select all cores, and reboot.

He reinstalled Windows (twice).

Honestly it just sounds like a less-than-ideal BIOS plus some classic X3D stacked-die thermal throttling (at under 80W, lol) - and the 7900 XTX dumping 400W through the rad probably doesn't help things.

Grab a Dan Cases C4-SFX and slap a Thermalright air cooler intaking from the back on that puppy -- dump the GPU heat out the sides.
[Image: case layout]


This is basically the same layout as the case I have now, and I don't hear it or have to think about the thermals. A bit larger but totally worth the lack of faffing to get it going.
 
I would like to note that Cyberpunk has a problem with 8-core CPUs, and it doesn't run very well with path tracing on Radeon GPUs; I would try upgrading to an RTX 4090.
I wonder if Radeon will update your drivers to increase performance in current games, but they prefer to focus on Starfield and partner games.
 