
Best GPU for Warzone

What's up, peeps.
What's the best GPU that can run Warzone at 1080p with a combination of low/medium settings at 144 FPS, sustainable for 4 years?

Specs:
i5-13400F
ASUS Prime B660M-K D4
2×8 GB DDR5 5200 MHz
EVGA SuperNOVA 750 GT 80+ Gold
 
RX 6650 XT or RX 7600
 
Although it is heavier on the CPU, this is a pretty good reference to go by. Your CPU is likely going to be a bigger determining factor than the GPU in hitting 144 Hz at all times in Warzone, although you will still need quite a bit of GPU grunt as well.

4 years is a long time to maintain 144 Hz with any GPU; even high-end ones might struggle at that point.


[attached screenshot: Capture.PNG]
 
Has Warzone updated its graphics since its release? I'm not sure, I don't play that game.
 
I'll take subjective questions for 800, Alex.
 
Although it is heavier on the CPU, this is a pretty good reference to go by. Your CPU is likely going to be a bigger determining factor than the GPU in hitting 144 Hz at all times in Warzone, although you will still need quite a bit of GPU grunt as well.

4 years is a long time to maintain 144 Hz with any GPU; even high-end ones might struggle at that point.


He did request a mix of low and medium settings (competitive settings), which is very doable with any x60 card from either manufacturer at 1080p. The safest bet would be the 6700 XT or RTX 3070, as those would guarantee 144 Hz at 1080p even at high settings.
 
How would the 4060 Ti scale up here?
It offers similar performance to a 6750 XT, plus DLSS 3.0 for the future.
Also, it's cheaper than a 3070.
 
How would the 4060 Ti scale up here?
It offers similar performance to a 6750 XT, plus DLSS 3.0 for the future.
Also, it's cheaper than a 3070.

As long as you stick with 1080p and medium-ish settings, that should be fine, assuming Warzone 3 or whatever replaces Warzone 2 isn't substantially heavier.

DLSS 3 sucks for MP gaming, but DLSS 2 is fine, and I personally feel it is much better than FSR 2 at 1080p image-quality-wise. Neither is great at that resolution, though.
 
As of a year ago, an RX 6600 non-XT was enough for 160 FPS at 1080p on all-low settings. My wild guess is that if you go for an RX 6700/RTX 3060 Ti or higher, you have your framerates secured.

A 4060 Ti will, give or take, provide you a stable 140 to 180 FPS there, as long as your settings don't go above medium-high.
 
How would the 4060 Ti scale up here?
It offers similar performance to a 6750 XT, plus DLSS 3.0 for the future.
Also, it's cheaper than a 3070.

DLSS 3.0 introduces input lag, something you definitely want to avoid in a multiplayer game. 1080p is not a great resolution for the tech anyway; both DLSS and DLSS 3.0 have reduced quality below 1440p. It won't help in the future either: the 4060 Ti critically lacks memory bandwidth, and its 8 GB of VRAM is barely enough for games now, neither of which DLSS or frame generation alleviates.

The 4060 Ti has a cost per frame higher than even AMD's last gen flagship part:

[attached chart: cost-per-frame comparison]


Suffice it to say, you should not be paying more per frame than for a much higher-end card.
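If it helps to see how that chart's metric works, cost per frame is just the card's price divided by its average FPS. Here's a minimal sketch with made-up placeholder cards and numbers (not figures from the chart above):

```python
# Cost per frame = purchase price / average FPS in your target game.
# The card names, prices, and FPS figures below are hypothetical placeholders,
# not benchmark results.
cards = {
    "Card A (newer, pricier)": {"price_usd": 400, "avg_fps": 120},
    "Card B (older, cheaper)": {"price_usd": 330, "avg_fps": 125},
}

for name, c in cards.items():
    cost_per_frame = c["price_usd"] / c["avg_fps"]
    print(f"{name}: ${cost_per_frame:.2f} per frame")
```

A lower number means you're paying less for each frame of performance, which is why a pricier card can still lose this comparison to a cheaper one that isn't much slower.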

I'd personally recommend the 6700 XT with its 12 GB of VRAM. The 7600 is also a good choice, but with only 8 GB of VRAM. It really depends on what you play; if you don't plan on playing any demanding games, you should be able to get by with only 8 GB, even, say, 4 years from now.
 
Can Nvidia Reflex help alleviate some of DLSS 3.0's input lag, though?

No, the hit is too large. For example, 200 FPS with frame gen is worse than 120 Hz native... You can also use Reflex with native rendering, which makes native rendering's input-latency lead even larger. Now, if you want to play something like CP2077 with ultra RT, sure, it's fine, but not for MP gaming.
 
Specifically just for Warzone? Strange request, but AMD cards perform significantly better in Warzone than their Nvidia counterparts.

Anything above an RX 6700 will be sufficient.
 
Can Nvidia Reflex help alleviate some of DLSS 3.0's input lag, though?

DLSS 3.0 already requires that Nvidia Reflex be enabled, because otherwise the input lag would be horrid.

It has worse input lag than whatever your base frame rate was. For example, if you have a base frame rate of 60 FPS and you enable DLSS 3.0, you'll be getting 120 FPS, but your input lag will be similar to what you'd get at 50 FPS. DLSS 3 inserts a frame between the currently displayed frame and the next frame generated by the GPU, which means two things: 1) it has to hold the next frame until it's done generating its fake frame, adding latency; 2) the generated frame is an interpolation between the current frame and the next frame, so it's inherently more latent than what your GPU could already produce by itself. Instead of simply displaying the next frame, it first has to display the already-latent generated frame.
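To put rough numbers on that, here's a toy timeline (my own simplification with assumed values, not anything measured) of when real and interpolated frames reach the screen at a 60 FPS base:

```python
# Toy model of frame interpolation ("frame generation") latency.
# Assumptions (mine): real frames render every 16.7 ms (60 FPS base),
# building an interpolated frame takes ~3 ms, and each real frame is held
# back half an output interval so the interpolated frame can be shown first.
RENDER_MS = 1000.0 / 60   # time between real rendered frames
GEN_MS = 3.0              # assumed time to build one interpolated frame

def framegen_timeline(n_real_frames):
    events = []
    for i in range(1, n_real_frames + 1):
        done = i * RENDER_MS  # real frame i finishes rendering
        if i == 1:
            events.append((done, "real frame 1"))
            continue
        # The interpolated frame between real i-1 and real i can only be built
        # once real frame i exists, so real frame i is held back...
        events.append((done + GEN_MS, f"interpolated frame {i-1}.5"))
        # ...and is finally shown roughly half an output interval later.
        events.append((done + GEN_MS + RENDER_MS / 2, f"real frame {i}"))
    return events

for t, label in framegen_timeline(3):
    print(f"{t:6.1f} ms  ->  {label}")
# Natively, real frame 2 would appear at ~33.3 ms and frame 3 at ~50.0 ms;
# here they show up noticeably later, which is the extra input lag described above.
```

Reflex trims the render queue, which is why it's forced on with frame generation, but it can't remove the hold-back that interpolation itself requires.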
 
But I'm trying to avoid AMD GPUs since I've had shitty experiences with my old RX 470, 580 and 5700,
so I'm afraid the driver issues and BSODs might still be present in newer GPUs.
 
Honestly, the 4070 would be the cheapest card from the current generation that Nvidia makes that isn't crap. The 4060 Ti is very meh for 400 USD, and a 4060 might not last 4 years even at medium-ish settings.
 
But I'm trying to avoid AMD GPUs since I've had shitty experiences with my old RX 470, 580 and 5700,
so I'm afraid the driver issues and BSODs might still be present in newer GPUs.

Well, unfortunately, at this price point that means you'll have to accept that Nvidia is giving you less than any prior xx60 Ti card while also charging more.
 
They're releasing constant updates, so they could improve framerates in time.
You can get close to 140 FPS with DLSS 2 enabled on an RTX 4060. It's not a future-proof card, though; those 8 GB will not age well.
The 16 GB variant of the 4060 Ti would be worth considering if the pricing were (a lot!) better...
 
The CPU is probably going to be the limiting factor with Warzone. Is that all you play? VRAM would be something to consider if you think you might end up playing different games. 8 GB isn't a lot; even now it's the bare minimum. In 4 years, it will be even worse.
 
The CPU is probably going to be the limiting factor with Warzone. Is that all you play? VRAM would be something to consider if you think you might end up playing different games. 8 GB isn't a lot; even now it's the bare minimum. In 4 years, it will be even worse.
I also play Valorant and CS:GO, but they don't require a demanding PC to run.
Also, I'll run Warzone at low settings, so maybe 8 GB will survive?
 
I also play Valorant and CS:GO, but they don't require a demanding PC to run.
Also, I'll run Warzone at low settings, so maybe 8 GB will survive?
If you don't mind low settings and you're on 1080p, I'm sure 8 GB will be fine. Personally, I would rather have something older with more VRAM if I were on a budget (...which I am, but I bought on credit >.>), but hey, you do you, man. The advantages of a newer GPU are longer driver support, plus DLSS 3 and a lower power bill, so there are definitely upsides to that plan.
 
DLSS 3.0 already requires that Nvidia Reflex be enabled, because otherwise the input lag would be horrid.

It has worse input lag than whatever your base frame rate was. For example, if you have a base frame rate of 60 FPS and you enable DLSS 3.0, you'll be getting 120 FPS, but your input lag will be similar to what you'd get at 50 FPS. DLSS 3 inserts a frame between the currently displayed frame and the next frame generated by the GPU, which means two things: 1) it has to hold the next frame until it's done generating its fake frame, adding latency; 2) the generated frame is an interpolation between the current frame and the next frame, so it's inherently more latent than what your GPU could already produce by itself. Instead of simply displaying the next frame, it first has to display the already-latent generated frame.

This sounds very similar to what triple buffering does, no?

To OP: Do you want to buy new, or do you not mind buying used?

AMD driver problems again.

What problems did you get?
 
This sounds very similar to what triple buffering does, no?

To OP: Do you want to buy new, or do you not mind buying used?



What problems did you get?
BSODs, PSODs, OSODs, driver timeouts, the driver resetting itself on its own, the PC not waking up from sleep, stuttering, game crashes and sometimes artifacts.
Also, I don't mind buying used, but it might've been an ex-mining GPU?
 