
Upgraded to 5800X3D and RX 7900 XTX, nothing but problems, mediocre performance

This must be what getting old feels like.
Yeah. I feel you there. The 4090 is a great card, but it's expensive, and every other card this gen is even more overpriced for what you get. The only thing that makes the 4090's pricing look reasonable is comparing it to the 3000 series... which launched during a pandemic-generated manufacturing shortage and a massive boom in demand from WFH and more home-based hobbies (i.e. gaming), on top of a cryptocurrency craze. So those prices were insane, but at least there was some "supply and demand" explanation. Even though none of those conditions still exist, Nvidia then based their 4000-series pricing on those prices, because if people paid those prices once, Nvidia is going to assume they'll pay them again. So yeah... here we are. Unfortunately, AI demand is going to consume huge amounts of their silicon and gamers are going to play second fiddle again, so we'll have more bad pricing for years to come. My hopes for the 5000 series (and AMD's 8000 series) are low to the point where, even though I was going to skip this generation, when a 4090 FE showed up at MSRP at Best Buy a couple of months ago, I caved and bought one.
 
I think it's something related to your motherboard. I had some lower-than-average performance when I swapped my 6900 XT and Ryzen 5900X for a 7900 XTX and 5800X3D, and all it required was reseating the video card, as it was running at x2 instead of x8 (my motherboard is an X470, so no PCIe Gen 4). Using a case with better fans also stopped the throttling (X3D CPUs, due to their stacked cache, will run hotter than average but won't throttle even when tortured). So check your GPU usage and power; a power-starved GPU, e.g. one fed through a daisy-chained Y cable, can also cause issues. Right now my 5800X3D and 7900 XTX are performing great!
 
I guess in the end I'm just angry this is the best AMD can or is willing to do, and that Nvidia has the gall to charge what they do for what is, at the end of the day, the only high-end option currently available.

This must be what getting old feels like.

Very true. The $900-$1100 range is really disappointing this time around. Not that the XTX or the 4080 are bad cards, it's just that they're going to start showing their age sooner rather than later.
 
Very true. The $900-$1100 range is really disappointing this time around. Not that the XTX or the 4080 are bad cards, it's just that they're going to start showing their age sooner rather than later.

I was really hoping AMD would get to within 20% in heavy RT games like Cyberpunk and The Witcher 3 next-gen, but there is still something like an 80-100% gap from AMD's best to Nvidia's best; that's basically two generations' worth of performance in some games. Pricing really sucks across the board. I still feel the 7900 XTX is fine for those who don't care about RT; I really like its raster performance. The crappy thing is AMD seems to be abandoning the high end next generation, so anyone who wants a generational improvement will still be stuck with Nvidia, and I imagine pricing is only going to be worse.
 
I think it's something related to your motherboard. I had some lower-than-average performance when I swapped my 6900 XT and Ryzen 5900X for a 7900 XTX and 5800X3D, and all it required was reseating the video card, as it was running at x2 instead of x8 (my motherboard is an X470, so no PCIe Gen 4). Using a case with better fans also stopped the throttling (X3D CPUs, due to their stacked cache, will run hotter than average but won't throttle even when tortured). So check your GPU usage and power; a power-starved GPU, e.g. one fed through a daisy-chained Y cable, can also cause issues. Right now my 5800X3D and 7900 XTX are performing great!
I wish it were the motherboard, but at this point I'm confident it's not. The card is running at PCIe 4.0 x16, and the CPU isn't boosting as high as it possibly could, but even in Cinebench 2024 it's like 3% off ideal boost clocks. Worst case I'm 10% off where I could be... and if the card were 20% faster than it currently is, I'd still consider it a mediocre, overpriced product.

I expect far more for this kind of money, and have never been this unimpressed after jumping two full generations (and arguably, a card tier on top of that) with an upgrade. As far as I'm concerned, the 7900 XTX is a $600 card at best. At $900+ it's robbery.
 
... with the understanding that they won't be able to turn on anything beyond the most basic raytracing stuff.
Doesn't the 7900xtx more or less match the RTX 3090 on ray tracing? If ray tracing is this important to you then the only option is the 4090. I am not sure even that will meet your incredibly high standards.
 
Doesn't the 7900xtx more or less match the RTX 3090 on ray tracing? If ray tracing is this important to you then the only option is the 4090. I am not sure even that will meet your incredibly high standards.

If you average it out across a bunch of games that only use 1-2 effects, sure, it's in the ballpark of a 3090, but in heavy RT games like Cyberpunk it performs more like a 4070, and at something like 1440p the 4090 is like 100% faster. Add in DLSS looking better than FSR and the difference becomes larger.
 
The thing is there's nothing to get rid of, apart from like, unnecessary services I guess? It's not like I installed a bunch of unnecessary garbage. Really hoping things calm down once Windows is done indexing or whatever.
Disable VBS
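(Not from this thread, just a quick hedged sketch of one way to check the usual toggle, assuming Windows and Python; msinfo32's "Virtualization-based security" line is the authoritative view, and the DeviceGuard registry key below is the common on/off switch.)

import winreg

# Read the DeviceGuard value commonly used to enable/disable VBS.
# 0 = disabled, 1 = enabled; if it's absent, Windows decides by default.
key_path = r"SYSTEM\CurrentControlSet\Control\DeviceGuard"
try:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
        value, _ = winreg.QueryValueEx(key, "EnableVirtualizationBasedSecurity")
        print("VBS toggle:", value)
except FileNotFoundError:
    print("No explicit toggle set; check msinfo32 -> 'Virtualization-based security'")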

If you average it out across a bunch of games that only use 1-2 effects, sure, it's in the ballpark of a 3090, but in heavy RT games like Cyberpunk it performs more like a 4070, and at something like 1440p the 4090 is like 100% faster. Add in DLSS looking better than FSR and the difference becomes larger.

This is so accurate. For certain games like Atomic Heart, Hogwarts Legacy, and Cyberpunk, the DLSS 2-3.5 / RT combo is just too good, and the 4070 will beat the experience of the 7900 XTX by a mile, especially since DLSS Balanced usually looks better than FSR 2 Quality in some of those. For other stuff the 7900 XTX stomps all over Nvidia (Starfield, etc.).

It's really hard, when you don't have a linear upgrade path (not all games are +50% or whatever), to feel like you're getting your money's worth, unless you're playing a game that heavily favors AMD. If you're trying to run Cyberpunk and compare it to your old 2080 with DLSS, AMD is going to feel terrible.
 
Don't take it personally, but you know you should have chosen an Nvidia product before all this "drama" and fighting with the 7900 XTX.
It's just too bad for you, but now you know what to do
[GIF: Keep Going, Big Brother]
 
Don't take it personally, but you know you should have chosen an Nvidia product before all this "drama" and fighting with the 7900 XTX.
It's just too bad for you, but now you know what to do
[GIF: Keep Going, Big Brother]

The hard thing for a lot of consumers is that you just don't know how much you will miss certain features. You can look at bar graphs all day long, but until you get the hardware in your hands and have time to use it personally, it's really hard to make the best decision, especially when the alternative 4080/4090 is substantially more expensive.
 
The hard thing for a lot of consumers is that you just don't know how much you will miss certain features. You can look at bar graphs all day long, but until you get the hardware in your hands and have time to use it personally, it's really hard to make the best decision, especially when the alternative 4080/4090 is substantially more expensive.

I blame reviewers, honestly - they're so focused on making sure all the settings are bone stock and we're comparing raster to raster, or RT to RT, that they don't even talk about the features.

I've not seen a review that was like "these are the real settings a gamer would use in these games, and this one has settings that provide better IQ and FPS, while this one doesn't". It's always "well, if you run native raster at 4K Ultra, the 7900 XTX gets a 40 fps slideshow, which is 100% faster than the 20 fps slideshow the 4070 Ti gets, so therefore the 7900 XTX is double the speed in this title".

The graphs are misleading.
 
Disable VBS



This is so accurate. For certain games like Atomic Heart, Hogwarts Legacy, and Cyberpunk, the DLSS 2-3.5 / RT combo is just too good, and the 4070 will beat the experience of the 7900 XTX by a mile, especially since DLSS Balanced usually looks better than FSR 2 Quality in some of those. For other stuff the 7900 XTX stomps all over Nvidia (Starfield, etc.).

It's really hard, when you don't have a linear upgrade path (not all games are +50% or whatever), to feel like you're getting your money's worth, unless you're playing a game that heavily favors AMD. If you're trying to run Cyberpunk and compare it to your old 2080 with DLSS, AMD is going to feel terrible.
It's already disabled, maybe because I'm running an Insider Preview build.
 
I blame reviewers, honestly - they're so focused on making sure all the settings are bone stock and we're comparing raster to raster, or RT to RT, that they don't even talk about the features.

I've not seen a review that was like "these are the real settings a gamer would use in these games, and this one has settings that provide better IQ and FPS, while this one doesn't". It's always "well, if you run native raster at 4K Ultra, the 7900 XTX gets a 40 fps slideshow, which is 100% faster than the 20 fps slideshow the 4070 Ti gets, so therefore the 7900 XTX is double the speed in this title".

The graphs are misleading.

I don't blame reviewers because their job is already hard enough trying to cater to consumers that already have their personal biases as it is.

A good example: even though I think DLSS is superior to FSR 99.9% of the time, I'm glad it isn't used in reviews, because I think that is something the consumer has to decide for themselves, and it should always be a bonus on top, not a crutch for developers' optimization.

The RT thing is hard as well, because I'd be willing to bet 50%+ of PC gamers don't use it, and there are still so many games shipped with terrible implementations, like the RE4 remake, where the crap cube maps look better.

They really are doing the best they can and are in a no-win situation.

Even the coverage of 8GB cards on TPU I don't agree with, but I can fully understand the point they're trying to make: in the here and now, 99% of games are fine.

This all makes it really hard for the consumer as well with pricing being so terrible.

Lucky for me, I already had a 3080 Ti, so I know full well how a 7900 XTX generally performs in RT, because in most games they are similar. Having had that GPU for nearly two years prior probably made me more disappointed than most in how RDNA3 handles RT.
 
And if the card were 20% faster than it currently is, I'd still consider it a mediocre, overpriced product.

Then you're going to absolutely love the RTX 4090:

[attached chart: relative performance]


20% more performance for at least 75% more money.
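Napkin math on that claim (my numbers, assuming roughly $950 street for the XTX and $1,650+ for a 4090): 1.20x the performance for 1.75x the money works out to 1.20 / 1.75 ≈ 0.69, i.e. about 30% less performance per dollar.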
Good luck with your next purchase :oops:
 

There's a lot to read here, so sorry if I missed something, but I skimmed through and I'm hearing that this is an ITX build, that CPU temps are around 80C and under in what looked like a continuous Cinebench run, and that frame rates are coming in around 40 fps in busy city areas of Flight Simulator.

That sure sounds like a CPU bottleneck, but from the reviews it sounds like the 5800X3D should reach 90C before throttling. I don't see any information here to help observe the bottleneck or the temps while gaming, though. What does Task Manager show during the lower-performance parts of a game? Are all the cores busy? What does it show for the GPU? What temperatures on the CPU and GPU during poor game performance? What about the clock speeds of both? I don't remember how to see all this information, but it sounds like MSI Afterburner might be useful. I'm not looking for screenshots; I'm just imagining that if you saw this information, you could know right away whether the CPU or GPU is the culprit, and whether either is clocking lower than it should.

If they're perfectly balanced (very unlikely especially in more than one game), one CPU core should be 100% utilized and the GPU should be as well. But if the GPU is always 80% utilized and there's always a CPU core somewhere that is 100% utilized, then the CPU is the bottleneck. But my guess is the GPU is struggling to stay cool and it's the culprit.
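Not OP's tooling, just a rough sketch of that pegged-core-vs-starved-GPU check in Python (CPU side via psutil; the GPU read is a hypothetical placeholder, since on a Radeon you'd pull it from the Adrenalin overlay, GPU-Z, or HWiNFO logging rather than one universal API):

import psutil  # pip install psutil

def read_gpu_util() -> float:
    """Hypothetical stand-in: return GPU busy % from whatever logger you use."""
    raise NotImplementedError

while True:
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # % per logical core
    hottest = max(per_core)
    gpu = read_gpu_util()
    if hottest > 95 and gpu < 85:
        print(f"Likely CPU-bound: a core at {hottest:.0f}% while the GPU sits at {gpu:.0f}%")
    elif gpu > 95:
        print(f"GPU-bound: GPU at {gpu:.0f}%")
    else:
        print(f"No clear bottleneck: core max {hottest:.0f}%, GPU {gpu:.0f}%")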
 
This must be what getting old feels like.

I'm just going to say that you indeed did not research enough. Yeah, the computer market sucks (apart from storage, I guess), as does everything else. The only way to win is not to play.
I get it, in the end I guess my expectations are just a bit too high, but the thing is that the RTX 2080 was already handling most of what I play at 1440p at similar settings.

I'll probably end up trading the 7900 XTX + cash for a 4090 in a month or two, once I get a couple more paychecks in and some other hobby stuff sells to offset. The 7900 XTX is a major disappointment - yes, faster than the 2080, but often not enough to matter... Or enough to be impressive, at least in modern titles. As of right now I would not recommend it to anyone but the most hardcore AMD loyalist, or those for whom ~$900 is the absolute top of their budget, with the understanding that they won't be able to turn on anything beyond the most basic raytracing stuff.

Return the 7900xtx, keep the 2080 and get the 4090 instead. It'll be cheaper.

Then you're going to absolutely love the RTX 4090:

[attached chart: relative performance]

20% more performance for at least 75% more money.
Good luck with your next purchase :oops:

He only looks at RT performance.
 
If you average it out across a bunch of games that only use 1-2 effects, sure, it's in the ballpark of a 3090, but in heavy RT games like Cyberpunk it performs more like a 4070, and at something like 1440p the 4090 is like 100% faster. Add in DLSS looking better than FSR and the difference becomes larger.
I took a look at the new Cyberpunk performance analysis. The 7900 XTX is comparable to the 3080 in ray tracing. That would have been pretty good last generation. Nobody is buying AMD for ray tracing though. Well, maybe @faye, but they have learned from their mistake.

[TPU chart: Cyberpunk RT performance, 2560×1440]

Personally, the 4090 is barely sufficient for ray tracing without DLSS. I won't care until the x60-equivalent card has 2x the 4090's performance in ray tracing. In the meantime I am very happy with my AMD GPU.
 
I took a look at the new Cyberpunk performance analysis. The 7900 XTX is comparable to the 3080 in ray tracing. That would have been pretty good last generation. Nobody is buying AMD for ray tracing though. Well, maybe @faye, but they have learned from their mistake.

[TPU chart: Cyberpunk RT performance, 2560×1440]

Personally, the 4090 is barely sufficient for ray tracing without DLSS. I won't care until the x60-equivalent card has 2x the 4090's performance in ray tracing. In the meantime I am very happy with my AMD GPU.

100% agree. RT is still pretty niche, but if it's important to someone, in games like CP2077 Nvidia is really the only option; their cards take a lot less tweaking to get good RT performance in games like this. My over-two-year-old 3080 Ti actually outperforms a 7900 XTX in RT in this game, and honestly I wouldn't be happy with that performance either. But again, until people actually use the hardware, it's really hard to know what an OK level of performance is for you. Maybe they'll get a 4090 and still be disappointed, maybe not, but that doesn't change the fact that the 7900 XTX isn't providing the oomph they thought it would. Everyone makes mistakes.

There is nothing wrong with being either happy or unhappy with a card's particular performance in one's own use case; only they can decide if something meets their expectations.

I'm sure 99.9% of 7900 XTX owners love the card, but I know I wouldn't be happy with its performance. It's a lot easier to make that call when I already have a card that performs similarly to it in RT.

Your 6750 XT is a fantastic GPU. I have a 6700 XT sitting in a PC I need to install Windows on. It's definitely the best value right now if you can get it for ~$300; no reason not to be happy with it.
 
until people actually use the hardware, it's really hard to know what an OK level of performance is for you
Ain't that the truth. I upgraded to a 7800X3D. It provided a 15-50% FPS boost over the 5800X3D in my primary game. It still isn't enough. I expect to buy an 8800X3D on day one.
 
The VRM is probably overheating. I had that problem with the MSI MPG Gaming Edge WiFi; the VRM heatsink got hot enough to burn you, and that was with only a 5600X. After switching to an MSI MEG Ace, no more overheating VRM, and the CPU ran cooler too, though I also switched from a single GPU to dual RTX 2080 Tis and then upgraded to the 5800X3D. The 5800X3D is nice, but it barely helps on dual cards; I'm still CPU-bottlenecked. I need more clock speed and IPC at the same time.
You should look in HWMonitor to see what the VRM temps are.
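If you'd rather log it than eyeball it, here's a minimal sketch, assuming OpenHardwareMonitor (or the LibreHardwareMonitor fork) is running so its WMI sensor tree is available; HWMonitor itself doesn't expose an API, so this is a stand-in for the same readings:

import wmi  # pip install wmi

# Walk the sensor tree OpenHardwareMonitor publishes over WMI and print
# every temperature it reports (VRM/MOS sensors included, if the board
# exposes them).
w = wmi.WMI(namespace=r"root\OpenHardwareMonitor")
for sensor in w.Sensor():
    if sensor.SensorType == "Temperature":
        print(f"{sensor.Name}: {sensor.Value:.1f} C")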
 
The VRM is probably overheating. I had that problem with the MSI MPG Gaming Edge WiFi; the VRM heatsink got hot enough to burn you, and that was with only a 5600X. After switching to an MSI MEG Ace, no more overheating VRM, and the CPU ran cooler too, though I also switched from a single GPU to dual RTX 2080 Tis and then upgraded to the 5800X3D. The 5800X3D is nice, but it barely helps on dual cards; I'm still CPU-bottlenecked. I need more clock speed and IPC at the same time.
You should look in HWMonitor to see what the VRM temps are.

Doubtful. The MSI X570 boards were trash compared to the Gigabyte ones in a similar price range in the VRM department; the X570 I Aorus Elite Pro WiFi has an 8-phase direct VRM with 70 A Infineon TDA21472 power stages. The MSI boards basically had B450-level VRMs below the $299 Unify.

I believe it also has a backplate to help soak away some of the heat.
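For scale, some napkin math (my assumptions: the stock 142 W PPT of a 5800X3D and roughly 1.2 V Vcore): 8 phases × 70 A = 560 A of rated capacity, while the CPU tops out around 142 W ÷ 1.2 V ≈ 118 A, so that VRM should be loafing even in a cramped case.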
 
Doubtful. The MSI X570 boards were trash compared to the Gigabyte ones in a similar price range in the VRM department; the X570 I Aorus Elite Pro WiFi has an 8-phase direct VRM with 70 A Infineon TDA21472 power stages. The MSI boards basically had B450-level VRMs below the $299 Unify.

I believe it also has a backplate to help soak away some of the heat.
It does. Also has the AIO pulling air over the board and a Noctua 40mm in place of the original chipset fan. I'll double-check, but I don't think VRM temps are a problem.
 
Doubtful. The MSI X570 boards were trash compared to the Gigabyte ones in a similar price range in the VRM department; the X570 I Aorus Elite Pro WiFi has an 8-phase direct VRM with 70 A Infineon TDA21472 power stages. The MSI boards basically had B450-level VRMs below the $299 Unify.

I believe it also has a backplate to help soak away some of the heat.
Boards are usually tested on an open-air bench. We're talking about an SFF case, which doesn't have the benefit of free-flowing air. Lastly, Gigabyte's quality has been questionable the last couple of years.
 
Boards are usually tested on an open-air bench. We're talking about an SFF case, which doesn't have the benefit of free-flowing air. Lastly, Gigabyte's quality has been questionable the last couple of years.
Video cards? Alright, I'll bite.
GB-branded motherboards? Maybe.
Aorus-branded motherboards? No. Just no.
 
Video cards? Alright, I'll bite.
GB-branded motherboards? Maybe.
Aorus-branded motherboards? No. Just no.

Honestly, they've all been guilty of crappy boards, Asus/Gigabyte/ASRock/MSI; people just have brand goggles for whatever brand they prefer. Case in point: someone buys a crap MSI board and thinks "maybe I should get a much more expensive MSI board, maybe that will fix the problem". Unfortunately, those are the buying habits of most people.

Funny thing is, at launch every board on this list was cheaper than the Edge...

[screenshot: X570 board pricing list]

But Gigabyte makes questionable boards... Thankfully MSI has redeemed themselves since, first with the launch of the Tomahawk and later with the B550/S line. Their X670 boards are also mostly great.

That's off-topic though; all I said was that it's doubtful it's the issue. Impossible? No.
 