
What are your GPU settings for running games and benchmarks?

Do you run your games & benchmarks with AA on Full, Off or somewhere in between?


  • Total voters: 68
Amendment to my max settings, which RCoon’s post reminded me of. Motion blur always gets turned off if it’s present in the menu.
 
Literally 7% of people, currently, voted they do not run AA. Of those respondents I wonder if it is due to performance or a simple preference like the OP...

Edit: 6.7% now... I wonder why the ONE (other) person who does this, does it. :p
 
Literally 7% of people, currently, voted they do not run AA. Of those respondents I wonder if it is due to performance or a simple preference like the OP...
Yeah it changed a little bit.

Edit: 6.7% now... I wonder why the ONE (other) person who does this, does it. :p
Actually, that was Frick:
Always off. In the games that actually run well with it on max I don't see enough difference to make it worth the performance drop.
 
Benchmarks = maximum; games = depends on the game (probably low). :)
 
Depends on the application. I always tweak settings to get 100+ FPS solid if that is achievable. If it isn't, I settle for 60 and max out on IQ. With my current system 95% of all games hit the 100+ mark, and that is also my metric for upgrades. If a large number of frequently played stuff goes under that, I get itchy.

In NVCP most settings are application controlled, except for AF, which is always 16x. AA is best handled on a per-game basis, and NVCP is a last resort. I play most games with Vsync OFF and frame capped, or using Fast Sync for those that hit 100+ FPS. Some games also really like adaptive Vsync and suffer little or no noticeable input lag penalty doing so. When there is a slight input lag penalty, I notice it and take steps to avoid it; that's a higher priority to me than having the highest quality.

AA is one of the first victims when I can't hit my FPS target. Things like motion blur, DoF and vignetting are always OFF; I've rarely found a situation where they enhance the visuals, as they mostly serve to hide poor/low-quality assets and low FPS. Chromatic aberration is per-game, same with screen-space reflections (example: in TW: Warhammer, and most Total War games, it incurs a massive performance hit for invisible IQ gains). If I have GPU grunt to spare, I'll also use in-game (not DSR) resolution scaling > 100% over AA. TSSAA with a mild sharpening pass really is the ideal balance in terms of performance vs. IQ.
 
i just voted for the AA off option.. just to make the numbers up.. :)

trog
 
I just got a 2080 Ti, so yeah, everything is maxed. I game at 3440x1440, and with very few titles supporting SLI I needed an upgrade.

Motion Blur gets turned off, not for performance but because I find it ugly.
Film grain is off as well, also because it's ugly.
 
Depends on the game. Only two settings are always the same across all games: Motion Blur (always off) and resolution (full-screen 1920x1080).

Everything else, most of the time, high/max quality, with exceptions. For example, No Man's Sky and Assassin's Creed: Origins are really intensive (mostly on CPU, it seems), so I keep quality as low as possible. Anti-Aliasing is always on, though I may choose to keep it low if the extra FPS are worth it.

Any game using Valve's Source Engine, everything turned to the max. Motion Blur and that film grain thing Left 4 Dead uses, off.

Metro 2033 and Last Light used to be hard on my old R7 260X; after upgrading to the RX 580 I set everything to max, except for SSAA (off).

Tomb Raider series, almost everything to the max, except for that PureHair thing, which looks weird.

In general, if I can get at least 40-50 FPS without stuttering and without making the computer sound like a wind tunnel, that's enough for me.
 
I tend to max everything even on my pretty budget laptop. Its GTX 1060 handles most games perfectly fine at 1080P.
 
I tend to max everything even on my pretty budget laptop. Its GTX 1060 handles most games perfectly fine at 1080P.
As it should. I'm curious, though: what budget laptop has a GTX 1060 in it?
(Sorry for the late response, the TPU servers didn't send a notification)
 
As it should. I'm curious, though: what budget laptop has a GTX 1060 in it?
(Sorry for the late response, the TPU servers didn't send a notification)


Dell G3. I paid $799.99 for it at Microcenter. I'd consider that budget, anyway, for having a 1060 at least. It is, technically, a 1060 Max-Q, but it's only about 11% slower than a proper 1060. I have used Afterburner and actually made it outperform stock 1060s by overclocking it by 175 MHz, but I don't consider the extra heat to be worth the marginal performance increase.
 
Don't know why benchmarks would be included, as you should run them at the settings they are required to be run at for points.

As for games: 4K with every single setting excluding ambient occlusion turned all the way up. I don't like the motion blur thing it tries to do; it doesn't make me sick or anything weird, it just makes clean, good-looking games ugly. This includes running 16x AA at 4K on a pixel-dense 23.8" LCD. I am a bit of a snob about quality.
 
Don't know why benchmarks would be included, as you should run them at the settings they are required to be run at for points.

As for games: 4K with every single setting excluding ambient occlusion turned all the way up. I don't like the motion blur thing it tries to do; it doesn't make me sick or anything weird, it just makes clean, good-looking games ugly. This includes running 16x AA at 4K on a pixel-dense 23.8" LCD. I am a bit of a snob about quality.
Pretty much this. I bench at max/defaults and adjust for my actual gaming (no blur ever, for example), minimum 2xAA, etc.
 
Don't know why benchmarks would be included, as you should run them at the settings they are required to be run at for points.
That depends on your goal. When I run benchmarks it's at the system's native screen resolution with customized settings, to get a good idea of system performance. Systems are compared to themselves and to a standard established in-house. Systems are rarely (read: almost never) compared to benchmarks online.
I don't like the motion blur thing it tries to do; it doesn't make me sick or anything weird, it just makes clean, good-looking games ugly.
Could not agree with you more on this one.
This includes running 16x AA at 4K on a pixel-dense 23.8" LCD. I am a bit of a snob about quality.
At 4K, AA does almost nothing for you, so why suffer the processing overhead? Challenge: turn AA off and give your games an objective try. Don't change anything else, just the AA. Then tell us what you think.
 
i remember playing the original quake and struggling to get 12 fps.. 16 fps seemed wonderful.. he he he..

if i remember correctly this was at 640 x 480.. :) not a lot else to say for me except its all in the mind and for some nothing will ever be good enough..

trog
 
i remember playing the original quake and struggling to get 12 fps.. 16 fps seemed wonderful.. he he he..

if i remember correctly this was at 640 x 480..
Oh that takes me back.. I remember resolving that problem by dropping to 400x300 to get FPS above 60. Totally worth it.
 
I run every game I play on maximum settings, as high as they can go.

I mean, why not?
 
I always set up my games so that I'm GPU bottlenecked; I can't stand the microstutter when the CPU is the bottleneck. That's why I always run my GPU at max utilization and close to max OC too. I use a lot of AA, even though I have a 24" 1440p display. In FPS games I tend to prefer SMAA/temporal AA so that it doesn't cannibalize my FPS, but for other games I use heavy TXAA/resolution scaling. I actually bought this 24" 1440p monitor because I couldn't stand the aliasing on 27" 1440p.
 
At 4K, AA does almost nothing for you, so why suffer the processing overhead? Challenge: turn AA off and give your games an objective try. Don't change anything else, just the AA. Then tell us what you think.

I have tried. I've played them superscaled to 8K as well, and depending on the game there is a noticeable difference. Obviously in games like Borderlands, which don't offer textures at those resolutions, I saw zero change, but some do.
 
Using the system in my signature, and I play with everything maxed out / ultra. I get very respectable FPS because I am using a 720p monitor, running at 1280x720... or 1280x800 if the game supports it, since it's basically a Panasonic 43" 16:10 monitor.
 
Well, I bench to test stability and temperatures most of the time, not to win pissing contests, so whatever hits my hardware in a realistic way is usually what I go with. I only care about the numbers if there is a problem, most of the time.

As far as playing games goes, some games I'll turn it up more than others. It depends on what I can get away with to keep the game smooth at 4K. Sometimes that means little to no AA, and sometimes that means cranking it up.
 