
RTX 4070 or RX 7800XT if I have CPU Bottleneck?

That's an interesting perspective. It doesn't bear out in actual use, though. Ray tracing enabled does not equate to ultra settings. Ray tracing is a setting in and of itself and can be adjusted independently of any other setting.
It can be, but I wouldn't touch it unless I can run ultra settings comfortably, because it doesn't add as much to the experience as several other settings (e.g. textures, shadows, model detail, etc.) do. I treat it as a kind of ultra-above-ultra graphics option. But each to their own.
 
The 7800 XT is 5% faster than the 4070 at 1080p, so go for that. The model you cited can also be overclocked nicely, based on the TPU review: https://www.techpowerup.com/review/sapphire-radeon-rx-7800-xt-nitro/32.html
Lol, 5%. For 50 W more power. No DLSS. According to recent reviews it's more like 2%, too.

Now look at RT, where even the 7900XT is slower, and the 4070 is 40% faster.

UE5 games use this by default and it often can't be turned off, even "old" UE5 like Stalker 2 (old as in it's using an old version and started development many years ago).

[Chart: relative RT performance, 1920x1080]
 
Lol, 5%. For 50 W more power. No DLSS.

Now look at RT, where even the 7900XT is slower, and the 4070 is 40% faster.

UE5 games use this by default and it often can't be turned off, even "old" UE5 like Stalker 2.

I don't believe percentage comparisons help OP a lot. When I'm looking for a new GPU, I need raw performance values, not comparisons. (200% of 10 FPS is still only 20 FPS - a crude example)

Then you are doing yourself a disservice and missing out on what would otherwise be an exceptional experience.
Nah, I'm talking from experience. RT isn't enough of an improvement in 99.9% of games to warrant the performance hit. Slightly better shadows, a bit more realistic reflections. Nothing to write home about, imo.

Certainly not something I'd consider a priority on a tight budget (running an i7-6700 in 2024 suggests a tight budget).
 
I don't believe percentage comparisons help OP a lot. When I'm looking for a new GPU, I need raw performance values, not comparisons. (200% of 10 FPS is still only 20 FPS - a crude example)


Nah, I'm talking from experience. RT isn't enough of an improvement in 99.9% of games to warrant the performance hit. Slightly better shadows, a bit more realistic reflections. Nothing to write home about, imo.

Certainly not something I'd consider a priority on a tight budget (running an i7-6700 in 2024 suggests a tight budget).
Sure.

60 vs 42 FPS.

It's not even close, especially when you consider that, like I said, many new games can't turn RT off. Star Wars Outlaws and Stalker 2 are two that come to mind without looking it up.

Then go ahead and use upscaling, which is also the default in many new games, and suddenly you're at a point where DLSS Balanced looks better than FSR Quality, pushing that gap even further.

[Chart: Cyberpunk 2077 RT performance, 1920x1080]

Also, it's a 2% raster difference, not 5% like someone said, with the 7800 XT using 50 W more to achieve that.

[Chart: relative raster performance, 1920x1080]
 
Upgrade your CPU platform. There's no point running a modern GPU with a 9-year-old CPU; modern games are CPU-heavy (Stalker 2, Cyberpunk, Wukong ...) and you can't run them on that old 4-core CPU. You need at least a decent 6-core CPU with those GPUs, at least a Ryzen 5600X, but a 7600X is preferred -> consider upgrading your PC to the AM5 platform.
What matters most in games is single-threaded (ST) performance. That's common knowledge.
 
An Nvidia card isn't recommended for heavily CPU-limited rigs because of driver overhead. This should be considered first.
 
Lol, 5%. For 50 W more power. No DLSS. According to recent reviews it's more like 2%, too.

Now look at RT, where even the 7900XT is slower, and the 4070 is 40% faster.

UE5 games use this by default and it often can't be turned off, even "old" UE5 like Stalker 2 (old as in it's using an old version and started development many years ago).

RT is a marketing gimmick that most people don't use; it makes very little difference visually for a huge cost in performance. It can be disabled in the graphics menu of all games.
 
RT is a marketing gimmick that most people don't use; it makes very little difference visually for a huge cost in performance. It can be disabled in the graphics menu of all games.
You see, if you actually played all games, you'd quickly realise this isn't true. I've already given examples.

Instead of being a gimmick it's actually the default lighting system of modern engines, and in many cases cannot be turned off.

Considering that, I'd take 40% RT performance, 10% more efficiency and DLSS over 2% raster, and consider anyone who wouldn't to be insane.
 
What matters most in games is single-threaded (ST) performance. That's common knowledge.
The i7-6700 isn't great on that front anymore, either, unfortunately.
 
RT is a marketing gimmick that most people don't use
That is total hogwash. RT is a system seller and most people who can use it do use it. There are exceptions but they are not the majority..

The i7-6700 isn't great on that front anymore, either, unfortunately.
But it's not terrible either. The OP can get by with it until they upgrade.

Considering that, I'd take 40% RT performance, more efficiency and DLSS over 2% raster
I would agree with you there..
and consider anyone who wouldn't to be insane.
..but I would not go as far as to say that.
 
That is total hogwash. RT is a system seller and most people who can use it do use it. There are exceptions but they are not the majority..


But it's not terrible either. The OP can get by with it until they upgrade.


I would agree with you there..

..but I would not go as far as that.
Yes, it's a working marketing gimmick; it still makes very little difference visually for a huge cost in performance. There's a reason companies invest so much in marketing: it works.

Lol, 5%. For 50 W more power. No DLSS. According to recent reviews it's more like 2%, too.

Now look at RT, where even the 7900XT is slower, and the 4070 is 40% faster.

UE5 games use this by default and it often can't be turned off, even "old" UE5 like Stalker 2 (old as in it's using an old version and started development many years ago).


That's not how percentages work. Why are you using a GPU at a completely different performance tier as the baseline for a comparison between the RTX 4070 and RX 7800 XT? You can't just subtract the values and expect it to make sense; you need to divide them. 190 / 151 = 1.258, so this data set shows the RTX 4070 being about 26% faster than the RX 7800 XT in ray-traced games, nowhere near 40%.

In the RX 7800 XT review, the RTX 4070 is only about 16% faster than the RX 7800 XT in games that use RT at 1080p. The percentage difference depends heavily on the set of games being tested.
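To make the baseline math concrete, here's a quick sketch in plain Python. The 190 and 151 relative-performance figures are the ones from the chart above; everything else is just illustration:

```python
# Relative-performance values read off the TPU chart
# (100% = the review's baseline card, not either of these two).
rtx_4070 = 190    # percent
rx_7800_xt = 151  # percent

# Wrong: subtracting gives 39 percentage points, which is NOT a 39% speedup,
# because both numbers are scaled against a third, faster card.
points_apart = rtx_4070 - rx_7800_xt

# Right: divide to get the actual ratio between the two cards.
ratio = rtx_4070 / rx_7800_xt  # ~1.258

print(f"{points_apart} percentage points apart")
print(f"RTX 4070 is {(ratio - 1) * 100:.0f}% faster")  # -> 26% faster
```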

[Chart: RT performance at 1080p, from the RX 7800 XT review]


Is the RTX 4070 still better than the RX 7800 XT? Honestly, probably yes, but it's much closer than you imply, and the RX 7800 XT can be the better deal because it's often significantly cheaper.
 
Nah, I'm talking from experience. RT isn't enough of an improvement in 99.9% of games to warrant the performance hit. Slightly better shadows, a bit more realistic reflections. Nothing to write home about, imo.
We'd have to know whether OP actually cares about RT, and what types of games he's playing in general, to say which would be the better card imo. We can make a reasonable assumption, from the precedent games like Indiana Jones & TGC are setting, that RT will become the lighting standard (and we can assume this will primarily affect single-player or story-driven games, where immersion matters).

Ultimately, if we knew what OP actually plays and what types of games he enjoys, this would be a night-and-day debate. But we don't. :confused: That's just my semi-uneducated opinion though.


Yes, it's a working marketing gimmick; it still makes very little difference visually for a huge cost in performance. There's a reason companies invest so much in marketing: it works.
Depends heavily on the game. If we're talking something like RE4: Remake? I'd definitely agree. But in most of the newer games I've seen, it's definitely more than 'very little'.

We're not in the 20-series days anymore, when ray tracing was an overmarketed and underdelivering feature. It's actually viable now, and continuing to grow. I don't blame you for thinking the way you do, but it's just not that simple anymore imo.
 
Why are you using a GPU at a completely different performance tier as the baseline for a comparison between the RTX 4070 and RX 7800 XT?
Most recent review (most recent drivers and game set) that has both of these cards. It's not complicated or a conspiracy.
 
Yes, it's a working marketing gimmick; it still makes very little difference visually for a huge cost in performance.
No, it's an effective feature set that many, many people actually want. Your silly nay-saying does not reality make..
 
Most recent review (most recent drivers and game set) that has both of these cards. It's not complicated or a conspiracy.
I'm criticising your math, not your choice of data set.

190% is not 40% more than 151%; it's 26% more.
 
Yes, it's a working marketing gimmick; it still makes very little difference visually for a huge cost in performance. There's a reason companies invest so much in marketing: it works.
No, it's an effective feature set that many, many people actually want. Your silly nay-saying does not reality make..
I have to side with @Shakallia there, although I'd call it a matter of opinion.
 
I have to side with @Shakallia there, although I'd call it a matter of opinion.
Does this matter of opinion change the fact that most new games use an engine that has RT on by default? I wonder how long it'll be until TPU review data sets can't find current-gen games that can turn RT off, at which point RT testing becomes the default performance testing.
 
Does this matter of opinion change the fact that most new games use an engine that has RT on by default? I wonder how long it'll be until TPU review data sets can't find current-gen games that can turn RT off, at which point RT testing becomes the default performance testing.
We're lucky that this default RT is usually light enough not to hurt performance on AMD significantly more than on Nvidia. When this changes, we'll have a different discussion.
 
We're lucky that this default RT is usually light enough not to hurt performance on AMD significantly more than on Nvidia. When this changes, we'll have a different discussion.

Stalker 2 uses software Lumen (hardware Lumen, which yields better results for cards with good RT hardware, isn't patched in yet), and it has the 7800 XT at 70 FPS and the 4070 at 80 FPS, with the latter beating the 7900 GRE at 75 FPS, a tier above the 7800 XT.

Losing to cards a tier below them is basically the best-case scenario for AMD GPUs when any form of RT is used. Heavy RT? More like two tiers. Luckily, in this case the 7900 GRE and the 4070 are similarly priced.

[Chart: Stalker 2 performance, 1080p]
 
the 7800 XT at 70 FPS and the 4070 at 80 FPS
Personally, I wouldn't call that a massive difference.

On a different note to OP: How about waiting for Blackwell and RDNA 4 instead of rushing to buy something now? Even if they turn out to be disappointing, at least they could drive prices down a bit.
 
You see, if you actually played all games, you'd quickly realise this isn't true. I've already given examples.

Instead of being a gimmick it's actually the default lighting system of modern engines, and in many cases cannot be turned off.

Considering that, I'd take 40% RT performance, 10% more efficiency and DLSS over 2% raster, and consider anyone who wouldn't to be insane.
RT can be disabled in the menu of 90% of games. For Stalker 2, you can disable it via the config file. Stalker 2 is also very poorly optimized and arguably still in a beta state, so they may yet add a menu option to disable it, considering how performance-intensive it is for the little benefit it gives in such a poorly optimized game.
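For what it's worth, UE5 games generally read console-variable overrides from Engine.ini, and edits along these lines are what people typically use to turn Lumen off. This is a sketch only, untested in Stalker 2 specifically; the config folder name is per-game, and which variables a given game actually honours varies:

```ini
; Engine.ini override, usually under %LOCALAPPDATA%\<GameName>\Saved\Config\Windows\
; (back the file up first)
[SystemSettings]
; 0 = no dynamic global illumination (disables Lumen GI), 1 = Lumen
r.DynamicGlobalIlluminationMethod=0
; 2 = screen-space reflections instead of Lumen reflections
r.ReflectionMethod=2
```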
 
Personally, I wouldn't call that a massive difference.

On a different note to OP: How about waiting for Blackwell and RDNA 4 instead of rushing to buy something now? Even if they turn out to be disappointing, at least they could drive prices down a bit.
It doesn't need to be a massive difference. You're implying that the default RT-on state of new games isn't important because that implementation is lighter than other examples, like CP2077, where it's 40 vs 60 FPS, with 60 being what most would consider the playable minimum.

The fact remains that you can get a card that's, at worst, still more than 10% faster in this case, and up to 30-40% faster in other cases, is 10% more efficient, and has things like DLSS. Or you can get something slower, less efficient, and less "future proof" (no matter how much VRAM is brought up) for the same price.
 
On a different note to OP: How about waiting for Blackwell and RDNA 4 instead of rushing to buy something now? Even if they turn out to be disappointing, at least they could drive prices down a bit.
That could be a good idea, as long as they don't need a card ASAP (i.e. if their current card still works).

RT can be disabled in the menu of 90% of games.
Just because you can does not mean you should.
 