
Real talk 4070 vs 6950 xt

iROH

New Member
Joined
Apr 16, 2023
Messages
10 (0.01/day)
So I have an i7-8700K with 32 GB of DDR4-3200 RAM and a 650 W power supply.
650 W is more or less enough for the 6950 XT, and let's not talk about the CPU bottleneck.
I know some people have preferences and many things can be said about which one is better, the 4070 or the 6950 XT, but from a technological perspective, if I'm only playing at 1440p, which one is the better piece of tech?
 
Which games do you play? Do they have ray tracing and support DLSS 3? Is your electricity expensive?

All those factors decide which is the better option.

In your case, since you already have a 6950XT, there's no point sidegrading to a 4070.
 
I don't have a 6950 XT. As for ray tracing, I mean... it's only really usable with DLSS or frame gen. I have to try how frame gen feels first, but that wasn't my question.
I asked which one is the better piece of technology for every possible usage scenario.
 
The 6950XT is better in raster and the 4070 is better in ray tracing, with much better power consumption. There's no absolute better for every scenario.
 
I mean, the 4070 has better upscaling/RT/efficiency. The 6950XT is better in raw rasterization and has 4 extra GB of VRAM.

You really just have to decide which strengths are more important to you.

Best from a technology perspective would lean 4070 but that doesn't matter a whole lot if the 6950XT is nearly 20% faster in a game you want to play.

The main issue, though, is your 650W power supply. The 6950XT isn't as bad as the 6900XT for transient spikes, but it can still spike in the neighborhood of 450W, so I personally wouldn't pair it with a 650W power supply. And that's just the reference model; the partner OC models will be worse in transients.

I don't feel either card is worth 600 USD for a midrange product; they're both still relatively expensive and have too many downsides.
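The transient-spike worry above is easy to sanity-check with rough arithmetic. A minimal sketch (Python); the CPU draw and the ~450 W GPU spike figure are assumptions for illustration, not measurements:

```python
# Rough PSU headroom check with illustrative numbers, not measurements.
# Assumed figures (CPU draw, GPU transient spike) are ballpark estimates
# for an i7-8700K + reference 6950 XT; check reviews for your exact parts.

def psu_headroom(psu_watts, cpu_watts, gpu_transient_watts, rest_watts=75):
    """Return remaining headroom (can be negative) during a GPU transient."""
    peak_draw = cpu_watts + gpu_transient_watts + rest_watts
    return psu_watts - peak_draw

# Reference 6950 XT can reportedly spike to ~450 W (per the post above)
print(psu_headroom(650, cpu_watts=120, gpu_transient_watts=450))  # -> 5
print(psu_headroom(850, cpu_watts=120, gpu_transient_watts=450))  # -> 205
```

With a 650 W unit there is essentially no headroom left during a spike, which is why posters in this thread suggest 750-850 W instead.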
 
I believe you're going to need a higher-wattage PSU. I had to change my Seasonic 750W Platinum as it would crash sometimes, even more so when tabbing in and out of games.

I'd like to know where you got the idea that 650W would be enough.
 
I had a G.Skill 850W Platinum when I first got my 3080Ti/6900XT, and my computer would warn me at startup about a voltage overload and then shut off. The same G.Skill 850W Platinum was fine with my 2080 Super.

You will likely also need to upgrade your PSU with either card.
 
Thank you for the answers. I've come to the same conclusion: as a piece of tech I feel the 4070 is better, but then I see 20% more FPS on the 6950 XT... If I consider buying a 6950 XT, that means I have to buy a new PSU as well, am I right? But the 4070 will be fine, because I used a 3070 before with this 650 W PSU and the system barely went up to 450 W, and the 3070 needs more wattage than the 4070.
 

Buying an inefficient previous-gen card is kind of a bad idea anyway.

Both Nvidia and AMD spend more effort improving current-gen GPUs, so over time the current gen gains performance relative to the previous gen.

For example, comparing the latest review to an old one: the 3090 is 42% faster than the 2080 Ti now, versus 34% two years ago.
[Charts: average FPS at 2560×1440, latest review vs. review from two years ago]
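Those "X% faster" figures are just ratios of average fps, so the shift over time can be reproduced with a one-liner. The fps values below are made up to illustrate the arithmetic, not taken from any specific TPU review:

```python
# How an "X% faster" gap widens as drivers and the game test suite change.
# The fps numbers are illustrative placeholders, not real review data.

def percent_faster(fps_new, fps_old):
    """Relative speedup of fps_new over fps_old, in percent."""
    return (fps_new / fps_old - 1) * 100

# Two years ago: 3090 at 134 fps vs 2080 Ti at 100 fps -> ~34% faster
print(round(percent_faster(134, 100)))  # 34
# Latest review: 142 fps vs the same 100 fps baseline -> ~42% faster
print(round(percent_faster(142, 100)))  # 42
```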
 

While I don't disagree when we're talking flagships vs. previous-gen flagships, if you look at the 3070, which offered a similar amount of performance to the 2080 Ti, it has also regressed in comparison to the 3090.
 

Don't forget AMD's driver performance uplifts, while Nvidia has basically none... Most likely it's the same or slightly better at 1440p with AMD, as the 6700 XT is now better than the RTX 3070 at 1440p. I'd say it depends on price; both GPUs mentioned in the topic have the power to run 1440p and 4K.

If the OP wants fake resolution and ray tracing, then go Nvidia. For raw performance and seeing the game the way the developer intended players to see it, AMD.
 
So I have an i7-8700K with 32 GB of DDR4-3200 RAM and a 650 W power supply.
650 W is more or less enough for the 6950 XT, and let's not talk about the CPU bottleneck.
I know some people have preferences and many things can be said about which one is better, the 4070 or the 6950 XT, but from a technological perspective, if I'm only playing at 1440p, which one is the better piece of tech?

Somehow I feel as though you already have your mind made up deep down, and just want to see how much of a flamefest TPU can achieve.
  • If you don't care for RT and DLSS at all, or at least not enough to influence your decision, then go 6950XT with all its discounts as of late (prices have been looking yummy).
  • If you are building serious SFF where physical fitment and GPU heat output will be a problem, then certain compact 4070s would be the only choice.
  • If you frequently play certain games that absolutely plummet in performance or have issues on RDNA, then don't get a 6950XT.
  • If you frequently play certain games that underperform on Nvidia (ie. MW2 and Warzone), then don't get a 4070.
You can sit here, watch videos and speculate all you want for weeks, but in the end the only way to know for sure is to take the plunge. Nothing that can be found online could prepare me for the surprises of the 7900XT for example, and that's just how it is; conversely, you could read plenty about 90W idle but then find that you are unaffected personally. No one will know exactly how performance or usability will be on your specific setup.

I will say, though, 650W might be cutting it a little close. Probably not as much of a problem as a 6900XT would be, but you might want to make a PSU upgrade that will last you well into the future.
 
For current-gen tech, RT will never justify choosing it over pure raster performance, especially not with a ~20% difference.
The real advantage the 4070 has is DLSS, IF you want to play certain games that you know support it well.
But on the other hand, the 6950 comes with 16GB of VRAM ^^
 
Well, FSR is getting there, and while it may never surpass DLSS, it will be supported by more games simply because AMD is driving the consoles now!
 
As a 6950XT owner..

Yes, you need a bigger PSU. A quality one at that.

FSR is simply amazing for games when your card can't keep up, or when you want less power draw. I'm sure Nvidia's DLSS is also very good, but I haven't had the chance to play with it.

More VRAM = gooder. Raw rasterization, as others have said, is also very nice with this card.

Either way, unless you really care about the VRAM, or there is a huge sale, look at the features both sides give: RTX, upscaling, encoding, etc.
 
The only game I've ever seen FSR2 work well in (via an unofficial mod, ironically) is A Plague Tale: Requiem, which was fortunate since that game was heavy AF.
Meanwhile, DLSS has quite a few such titles under its belt, which is why I said it's an advantage IF one is looking to play those specifically.
Generally I'd simply avoid upscaling, which is also why pure raster performance is king in deciding what to buy; ultimately, whoever runs games better at native res wins, imnsho.
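For context on what upscaling actually trades away: quality presets render internally at a fraction of the output resolution and reconstruct the rest. A quick sketch (Python); the per-axis scale factors are the commonly cited DLSS/FSR 2 presets and can differ per game:

```python
# Internal render resolution for common upscaler presets.
# The 0.667 / 0.58 / 0.5 per-axis scale factors are the commonly cited
# DLSS/FSR 2 preset values; individual games may use different ratios.

def render_res(out_w, out_h, scale):
    """Internal render resolution for a given output size and per-axis scale."""
    return round(out_w * scale), round(out_h * scale)

for name, s in [("Quality", 0.667), ("Balanced", 0.58), ("Performance", 0.5)]:
    print(name, render_res(2560, 1440, s))
```

At 1440p output, Quality mode renders at roughly 1708×960, which is part of why upscaling artefacts are easier to spot at 1440p than at 4K output.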
 
So I have an i7-8700K with 32 GB of DDR4-3200 RAM and a 650 W power supply.
650 W is more or less enough for the 6950 XT, and let's not talk about the CPU bottleneck.
I know some people have preferences and many things can be said about which one is better, the 4070 or the 6950 XT, but from a technological perspective, if I'm only playing at 1440p, which one is the better piece of tech?
650W isn't enough for the 6950XT; the recommended supply starts at 850W. It can spike, which leaves the rig unstable.

The 7900XT can run safely at 650W. That's where I went... with a 750W PSU only the 6800XT was in reach, so I put the PSU cash into more GPU instead...

Either way, I would stay away from a $650 12GB card in 2023. I'm already seeing 12GB allocated since yesterday's upgrade ;) The 4070 will leave you wanting in two years. It uses less power, but the build quality also isn't 'high end', and it will be too weak for proper RT, too.

Fair word of warning... if you upgrade into this territory on an 8700K, the new bottleneck might be the CPU :) TW: Warhammer 3 already clearly showed me this.
 
The only game I've ever seen FSR2 work well in (via an unofficial mod, ironically) is A Plague Tale: Requiem, which was fortunate since that game was heavy AF.
Meanwhile, DLSS has quite a few such titles under its belt, which is why I said it's an advantage IF one is looking to play those specifically.
Generally I'd simply avoid upscaling, which is also why pure raster performance is king in deciding what to buy; ultimately, whoever runs games better at native res wins, imnsho.
In Cyberpunk 2077, Far Cry 6, Elite Dangerous, etc., FSR works fantastically. There are quite a few games that already support some kind of upscaling, and I've seen much more support on the AMD side than on Nvidia's.
 
What monitor are you currently using? Is it a G-Sync or FreeSync monitor?
 
Just remember that with Nvidia you need a top-tier CPU to get max performance out of their GPUs, since they use a software scheduler for DX12 and that causes massive driver overhead. AMD's scheduler allows old CPUs to get almost max performance out of their GPUs.
 
I disagree about CP2077. I haven't tried it in the other two, because they run very well on my PC at native anyway, but I can immediately see upscaling artefacts in CP2077 if I turn FSR on.
I would lower quality settings before using upscaling in a game.
That being said, the same might well apply to DLSS in many titles; I just don't have the experience with it.
When I said FSR works really well in Plague Tale, I literally meant that I can't tell it's on ^^
 

Generally speaking, DLSS is superior, especially at 1440p, so if upscaling is an important feature to someone, then Nvidia is the better option.

Neither technology is overly good at 1440p, but FSR does seem to degrade more at lower resolutions than DLSS. IMO they are both only usable at 4K in their quality modes.

I like that FSR is open, but AMD hasn't improved it to the point where I would be willing to use it over DLSS.

Separately, DLSS 3, as much as I like it, isn't good enough to be a selling point. As much as I'd bet Nvidia will improve it substantially over time, there's no guarantee you won't have to buy a 50- or 60-series card to get that improvement. Nvidia is Nvidia, after all.
 

DLSS 3 is a selling point in this tier of GPUs.

When you buy a 4080/4090, you expect to play everything natively or with DLSS 2; DLSS 3 is good for tech demos like CP Overdrive.
But in the 4070 and lower tiers, having DLSS 3 is critical. It's one click away from making a game playable.
This tier doesn't always have the luxury of tweaking some settings to make a game playable.
If a console port is completely broken, DLSS 3 can save your a$$.

No 6000-series card can compete with the 4070; they lack critical features. A $600 card should play all AAA titles, some with a little tweaking, and obviously with RT on when possible.
If the 7900XT were $650, it would be a winner, but at the moment it's priced for other galaxies.
The 7000 series are impressively performant cards all around, even with RT on. It's sad that AMD doesn't make them attractive by lowering prices.
 

I would agree if the tech didn't increase latency. Although the added smoothness can help, the massive hit to input latency at lower base framerates makes it unusable to me if your GPU isn't strong enough to maintain a decent framerate.

You have to already be getting 70-90 FPS for it to be usable to me, due to the latency hit.

It also still has bad UI artifacts in the majority of games that support it, making an even bigger case against it as a selling point. It's also useless for MP games.

The technology is awesome and I'm super impressed with it overall, but it still needs a decent amount of improvement for me to consider it a key selling point.
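The latency complaint above can be illustrated with a toy model. Assumption for illustration only: interpolation holds back roughly one real frame, so added input lag is about one base frametime; real-world overhead varies per game and Reflex changes the picture:

```python
# Toy model of why frame generation feels worse at low base framerates.
# Simplifying assumption: interpolation holds back one real frame, so
# input latency grows by ~one base frametime; actual overhead varies.

def framegen(base_fps):
    """Return (presented fps, approx. added input latency in ms)."""
    base_frametime_ms = 1000 / base_fps
    presented_fps = base_fps * 2       # one generated frame per real frame
    added_latency_ms = base_frametime_ms
    return presented_fps, added_latency_ms

print(framegen(40))  # (80, 25.0)  -> smooth-looking but ~25 ms extra lag
print(framegen(80))  # (160, 12.5) -> far smaller latency penalty
```

At a 40 fps base the doubled output looks smooth but costs ~25 ms of extra lag; at an 80 fps base the penalty is half that, which lines up with the 70-90 fps threshold mentioned above.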
 