
What graphical settings do you lower or disable for better performance?

Curious if anyone knows: does MFAA apply only to full-screen AA, or does it also extend to transparency multisampling AA? I haven't actually tested the performance impact to check one way or the other. It seems like it could be either scenario, so I'm wondering which is accurate.
 
Probably the best compromise is using DLSS 3, if you can.
I reduce everything from 'ultra' to high or medium, mostly shadows.
 
There's also the lazy approach: lowering everything from Ultra to High, or from High to Medium. It works wonders when you just want to play the game without tinkering in the options menu for an extended amount of time.
 
Former laptop user here. A trick I was really fond of was using this to try and get better (or more stable) frames: https://github.com/doitsujin/dxvk

Setup is pretty straightforward: drop the DLLs that match the game's architecture and DirectX version into the game directory, and the game will then have its DX calls translated to Vulkan, which might help you eke out some more frames.
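For illustration, here's a rough sketch of that setup step in Python (the paths are made up, and the per-API DLL mapping is my assumption based on how DXVK release archives are laid out):

```python
# Hypothetical helper (paths and game are made up) automating the manual step
# above: copy the DXVK DLLs for the game's DirectX version and architecture
# next to the game's executable. The per-API DLL list is my assumption based
# on how DXVK release archives are laid out (x32/ and x64/ folders).
import shutil
from pathlib import Path

DXVK_DLLS = {
    "dx9": ["d3d9.dll"],
    "dx10": ["d3d10core.dll", "dxgi.dll"],
    "dx11": ["d3d11.dll", "dxgi.dll"],
}

def install_dxvk(dxvk_dir: Path, game_dir: Path, api: str, arch: str = "x64") -> None:
    """Copy the DXVK DLLs for `api` ('dx9', 'dx10' or 'dx11') into `game_dir`."""
    for dll in DXVK_DLLS[api]:
        src = dxvk_dir / arch / dll
        shutil.copy2(src, game_dir / dll)
        print(f"copied {src} -> {game_dir / dll}")

# Example (hypothetical paths): a 64-bit DX11 game.
# install_dxvk(Path(r"C:\tools\dxvk-2.3"), Path(r"C:\Games\SomeGame"), "dx11")
```

The nice part is that reverting is just deleting the copied DLLs from the game directory.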

There are a LOT of caveats:
- I have only tested this on AMD GPUs (which I'm to understand are better at Vulkan than Nvidia?).
- It won't work for all games.
- It will make some games worse.
- It will make some games unstable.
- There will be some stutter on first run as it builds shader caches.
- It only supports DX9-11. You can kind of make it work for DX12 by using VKD3D-Proton (https://github.com/HansKristian-Work/vkd3d-proton) and poaching the dxgi.dll from DXVK, but it's very flaky.

Based on my quick bit of research, your card should support the latest Vulkan extensions, so you should be good to go. But for Nvidia specifics you'll have to ask the more experienced Team Green people on the forum for help.

Good luck!
 
The quality looks terrible once you play at any resolution lower than your native one.
That depends on the scaling capabilities of your monitor. Some displays are very poor at it, though in general display scaling provides better results than GPU/driver scaling. Also consider the screen size: while 1440p scaled on a 27" monitor would look OK to most people, the same probably couldn't be said of 1080p scaled to 32". And even this becomes largely irrelevant when playing the game on a TV from your couch. Again, acceptable image quality is a personal preference.

So, honestly, I would rather uninstall the game and wait until I buy better hardware than play at a lower resolution.
That's OK, I've always done the same. I'd rather enjoy the game in full detail on proper hardware than have to lower every setting just for the sake of playing it. Yet I understand that other people's experiences may differ. If you're simply having fun while playing the game, it's all good.

Curious if anyone knows: does MFAA apply only to full-screen AA, or does it also extend to transparency multisampling AA?
MFAA works on top of MSAA or TXAA by varying the sample pattern from frame to frame and resolving the result with a temporal filter, so two samples per frame can approximate a four-sample result. As such, it would not get applied to transparencies, unlike traditional FSAA/SSAA. However, I don't have the Nvidia hardware to test it myself.
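To make the "varying pattern plus temporal filter" part concrete, here's a toy sketch (the offsets and coverage function are entirely made up; this is not actual driver behavior):

```python
# Toy illustration of the idea, NOT actual driver behavior: MFAA alternates
# the MSAA sample pattern each frame, then a temporal filter blends frames so
# two samples per frame approximate a four-sample result.
EVEN_OFFSETS = [(-0.25, -0.25), (0.25, 0.25)]   # hypothetical pattern, frames 0, 2, ...
ODD_OFFSETS  = [(-0.25, 0.25), (0.25, -0.25)]   # hypothetical pattern, frames 1, 3, ...

def coverage(x: float, y: float) -> float:
    """Stand-in for rasterizer coverage: a triangle edge along x + y = 0."""
    return 1.0 if x + y < 0.0 else 0.0

def mfaa_pixel(px: float, py: float, frame: int, prev: float | None) -> float:
    offsets = EVEN_OFFSETS if frame % 2 == 0 else ODD_OFFSETS
    current = sum(coverage(px + dx, py + dy) for dx, dy in offsets) / len(offsets)
    # Temporal filter: blend with last frame's resolved value.
    return current if prev is None else 0.5 * (current + prev)

r0 = mfaa_pixel(0.0, 0.0, 0, None)   # 2 samples this frame
r1 = mfaa_pixel(0.0, 0.0, 1, r0)     # 2 more samples, blended
print(r0, r1)                        # 0.5, then 0.25 (the true 4-sample average)
```

Blending two 2-sample frames with alternating patterns lands on the same value as a single 4-sample frame at this pixel, which is the whole trick: near-4x edge quality at roughly 2x cost.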
 
That's OK, I've always done the same. I'd rather enjoy the game in full detail on proper hardware than have to lower every setting just for the sake of playing it.
That's how I used to think for a long time, but modern games show less and less visual difference among different quality settings, so I'm not quite sure anymore.
 
I'd rather just lower the resolution and keep everything else on ultra.
I want to see how far they push the details.
 
That's a no-go for me...
There's nothing we two can agree on. Lol

 
It really depends on the game and what GPU you have. Obviously games that support RT and DLSS are going to have more settings, many of which affect performance. And it's not always true that the settings which are said to be typical resource hogs are so in ALL games.

This is why I feel it's best to educate yourself, not just on settings impact in general, but on a per game basis as to which settings are most demanding. I find the best way to do this is by viewing professional tech analysis videos of games, where they show the difference in look of each setting, and the FPS difference.

Some of these guides will also recommend settings based on what general level of hardware you have, usually in low, medium, and high-end categories. Overall, I consistently find the best such guides are done by Alex Battaglia of Digital Foundry; they're pretty much my go-to source these days for recommended optimized settings for demanding games.
 
The GTX 980 Ti is a Maxwell architecture card, Nvidia's earliest DirectX 12 implementation.

To clear the myths and hypotheticals:

This is a 9-year-old graphics architecture. It will not support Resizable BAR (or AMD's trendy marketing name for it, SAM), it will not support DLSS (it does not have any tensor cores), it does not support real-time ray tracing (it does not support DirectX 12 Ultimate, nor does it have any ray accelerator engines), and it will generally not perform well in modern titles for the aforementioned reasons.

It is also on its last stretch of driver support. Nvidia supports their graphics architectures for 8 years, with development realities usually extending that by a year or so until they start work on a new driver branch and drop the baggage. Maxwell is unlikely to keep receiving new updates beyond this year (2023).

The answer you actually needed is already in the second post, and Solaris' order of lowering things is a pretty good idea.

That said, you should already look at an upgrade in earnest. Once your GPU loses driver support, things tend to get stale fast.
 
I can usually hit the performance I want, but when I cannot, it's usually in this order.

If it's 60 fps, I drop to 30 (yes, I can play at 30 just fine most of the time).
Reduce fog density; it's not often an option in games, but when it is, it can make a surprisingly big difference. I've never been a fan of excessive fog anyway.
Reduce resolution scaling (it might be called different things, but it's basically rendering at a slightly lower resolution and upscaling; see the quick math below).
Reduce shadow quality.
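Quick math on the resolution-scale point, since GPU shading cost grows roughly with the number of pixels rendered (my own arithmetic, not from any particular game):

```python
# Back-of-the-envelope numbers (mine, not from any game): GPU shading cost
# scales roughly with pixel count, so a small render-scale drop saves a lot.
def render_cost(width: int, height: int, scale: float):
    """Internal render resolution and its pixel count relative to native."""
    rw, rh = int(width * scale), int(height * scale)
    return rw, rh, (rw * rh) / (width * height)

for s in (1.0, 0.9, 0.8, 0.7):
    rw, rh, ratio = render_cost(2560, 1440, s)
    print(f"{s:.0%} scale -> {rw}x{rh}, {ratio:.0%} of native pixels")
# 90% scale already shades only ~81% of native pixels; 80% shades ~64%.
```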

At the bottom of the list are things I would only do out of desperation, to fix major issues:

Lowering texture quality.
Lowering draw distance.

By far the most common issue I come across in games now, and one that's typically hard or impossible to fix via game settings, is stuttering.
 
Depending on the game, I set the settings to medium, which in modern games in my eyes doesn't look too different from high or ultra settings. Then I disable volumetric lighting, because (in my opinion) real life doesn't look like that anyway.
 
MSAA, HBAO, shadows, and I usually lower the resolution too. Finished Cyberpunk 2077 at 1024x768 capped to 75 FPS; honestly way better than 1920x1440 at 25 FPS.
 
SSAO. It doesn't do much (in most games) and really tanks performance.
 
Sadly, my GTX 1660 Ti is equivalent to or faster than my GTX 980, at least in GTA V…


Turn off (or downgrade) high-res shadows (GTA V), then ultra grass, then ultra reflections, then draw distance.

Shadows are a frame-rate killer (and RT is largely a shadow thing).

On a PS5, CP2077 looks "realistic" in RT mode but "plastic" in 4K mode (though at 60 fps; the RT is what makes the shadows look "real").
But I digress…

Learn to love playing at 1080p or 1440p, and forget about 4K (I spent too much money chasing 4K resolution).
 
Finished Cyberpunk 2077 at 1024x768 capped to 75 FPS; honestly way better than 1920x1440 at 25 FPS.
That does remind me: when should you frame-limit your game? I've always wondered why some games have that option.

I usually play with the frame limit off, but if I'm hitting a consistent 60+ FPS, would there be any benefit to turning on a frame limiter? If so, should you do it in-game or via a third-party program?

The only reason I can see for a frame limit is to get a no-input-delay, no-screen-tearing kind of setup, but in that case I guess I would need to limit the frames to 144, as that's my monitor's refresh rate.

Another thing that hadn't crossed my mind until now: since I'm still on a Haswell CPU, what settings should I lower or disable for CPU-bound games? Does it even matter if you change anything in the video options (as they mostly affect the GPU)?
 
That does remind me: when should you frame-limit your game? I've always wondered why some games have that option.

I usually play with the frame limit off, but if I'm hitting a consistent 60+ FPS, would there be any benefit to turning on a frame limiter? If so, should you do it in-game or via a third-party program?

The only reason I can see for a frame limit is to get a no-input-delay, no-screen-tearing kind of setup, but in that case I guess I would need to limit the frames to 144, as that's my monitor's refresh rate.

Another thing that hadn't crossed my mind until now: since I'm still on a Haswell CPU, what settings should I lower or disable for CPU-bound games? Does it even matter if you change anything in the video options (as they mostly affect the GPU)?
The reason I limit frame rates is because I have a 60 Hz monitor, which means it cannot physically display more than 60 frames per second. Anything over it is wasted power and unnecessary heat and fan noise. Not to mention, it can also reduce screen tearing.
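For the curious, a frame limiter is conceptually just this; a minimal sketch, not how any specific game or RTSS actually implements it (real limiters typically use hybrid busy-waiting for more precise frame pacing than a plain sleep can give):

```python
# Minimal sketch of what an fps cap does conceptually: after each frame,
# sleep out the remainder of the frame budget so the loop never runs
# faster than the target rate.
import time

def run_capped(target_fps: float, render_frame, seconds: float = 1.0) -> None:
    """Run `render_frame` in a loop, never faster than `target_fps`."""
    frame_budget = 1.0 / target_fps          # e.g. ~16.7 ms at 60 fps
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()
        # Sleep off whatever remains of this frame's time slice.
        leftover = frame_budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)

# Example: cap a trivial "frame" at 60 fps for one second.
run_capped(60.0, lambda: None)
```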
 
The reason I limit frame rates is because I have a 60 Hz monitor, which means it cannot physically display more than 60 frames per second. Anything over it is wasted power and unnecessary heat and fan noise. Not to mention, it can also reduce screen tearing.

Yup, I've been doing the same for years for the same reasons: I limit my fps at the driver level to my monitor's refresh rate, and if that doesn't work in some games, I cap it in the game's settings or try RTSS.

It really depends on the game and what GPU you have. Obviously games that support RT and DLSS are going to have more settings, many of which affect performance. And it's not always true that the settings which are said to be typical resource hogs are so in ALL games.

This is why I feel it's best to educate yourself, not just on settings impact in general, but on a per game basis as to which settings are most demanding. I find the best way to do this is by viewing professional tech analysis videos of games, where they show the difference in look of each setting, and the FPS difference.

Some of these guides will also recommend settings based on what general level of hardware you have, usually in low, medium, and high-end categories. Overall, I consistently find the best such guides are done by Alex Battaglia of Digital Foundry; they're pretty much my go-to source these days for recommended optimized settings for demanding games.

Those in-depth optimized settings videos/guides are pretty cool, especially when a game has tons of settings that barely look different to my eyes.
I used DF's optimized Cyberpunk settings, tweaked a bit to my personal preference, and it was all good (I had RT cranked up a bit higher, and that's it).

The non-performance-related settings I instantly get rid of are motion blur, chromatic aberration, film grain, and depth of field, depending on the game and implementation.
Resolution I cannot lower, since anything other than native looks ass on my 21:9 display; upscalers like DLSS work well enough, but only on their highest quality setting.
Textures I prefer to keep maxed out, or at most one step below max.
 
I usually play with the frame limit off, but if I'm hitting a consistent 60+ FPS, would there be any benefit to turning on a frame limiter? If so, should you do it in-game or via a third-party program?
Most games have this setting, and it's usually on by default: VSync.
Or you can use the Nvidia Control Panel to cap the fps.


The only reason I can see for a frame limit is to get a no-input-delay, no-screen-tearing kind of setup, but in that case I guess I would need to limit the frames to 144, as that's my monitor's refresh rate.

I don't care if they make 300 Hz monitors. Films have run at roughly 24-30 fps since Charlie Chaplin's day, so I see no reason to have a 144-frame whirligig in front of my eyes.

Another thing that hadn't crossed my mind until now: since I'm still on a Haswell CPU, what settings should I lower or disable for CPU-bound games? Does it even matter if you change anything in the video options (as they mostly affect the GPU)?
To ease the load on your CPU, you can lower tessellation, draw distance, shadow quality, anti-aliasing (MLAA), particles, motion blur, and shader quality.
Also, always keep your GPU drivers updated, and make sure your CPU is not thermally throttling.
 
If I am violating some rule for posting a link, I apologize, but these videos DO explain how these settings affect performance in different games. Even if you disagree, it's good information with examples.

 
For me it really depends on the individual games.
For example, in Planetside 2 I play with shadows off anyway, because in that game's engine they somehow rely on the CPU, in an already CPU-heavy game with bad multithreading. In Cities Skylines I'm seeing sub-20 fps in a big city no matter the settings, so I might as well keep them maxed out.

For other games it really depends: sometimes shadows or draw distance (lowering both gradually; it doesn't immediately have to go to the minimum).
Depending on how big the visual/performance tradeoff is, things like depth of field, ambient occlusion, and obviously particle effects. In general, particles tend to be heavy hitters and kick in exactly in those situations where you want decent performance. Explosions are nice and all, but when something goes off, you're most likely in a spot where frame drops hit extra hard.

For some titles where you can set unit limits, those are always nice for the CPU. Things like NPC and vehicle density in games like Witcher 3 or GTA, or generally open world games with lots of people running around.

I'm quite sensitive to flickering and easily notice when things aren't smooth, so I try to keep AF at 16:1 and at least some form of AA running. I'd rather play with internal resolution sliders to render fewer 3D pixels while keeping the interface crisp. Destiny 2 is a good example where, even with AA, vegetation is just a blurry, flickering mess, so with enough GPU headroom I would totally run it above native res for some nice SSAA.

Now, for limiting fps: there are benefits to running as high an fps as possible, like lower input latency, even beyond the refresh rate of your monitor. But I prefer a stable frame limiter over vsync; many engines (especially Source, but others as well) get noticeable input lag with vsync, so much so that I notice the difference without even looking at the settings. Tearing doesn't trouble me as much as the feeling that my mouse is dragging through jelly.
 