
Do you really need to play on ultra settings in games?

Depends on what "Ultra" brings. For example, running Minecraft with something like OptiFine, which adds breathtaking lighting to the game, yes; but in a game where "Ultra" just means the grass has 50% more pixels, then I don't care.
Also depends on the type of game: if I'm playing a competitive game, then FPS is the highest priority, whereas in a single-player or turn-based game I don't mind sacrificing frames for looks, as long as the frames stay within reason.
 
Custom. Always custom. I behead motion blur and depth of field first chance I get, so the game automatically changes the preset from Ultra to Custom for me. Then vignette, film grain and any other post-processing crap; yes, that means blurry, shitty TAA. Either no AA, or no AA with supersampling.

Also, I turn shadows down to medium, since I can't for the life of me notice the difference.
+1 to this. Too much damn post-processing doesn't make it look better, it makes it look worse/blurrier while reducing framerate. Film grain and motion blur are the two things I disable the most.
 
Yep same, agree with both of you.
 
Most of the time, going from High to Ultra doesn't bring any worthwhile visual upgrade while sacrificing performance by a large margin. Hardware Unboxed's game optimization videos are a perfect example.
By tinkering with settings you tend to gain 20-30% more fps while maintaining virtually the same image quality; the rough math below shows what that buys you in frame time.
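Just to put rough numbers on that 20-30% figure (the percentages are only the ballpark from the post above, not measurements, and the 60 fps baseline is an assumed example):

# Rough frame-time math for the "20-30% more fps from tuning" ballpark above.
# The 60 fps baseline is an assumption for illustration, not a benchmark.
base_fps = 60
for gain in (0.20, 0.30):
    tuned_fps = base_fps * (1 + gain)
    saved_ms = 1000 / base_fps - 1000 / tuned_fps
    print(f"{base_fps} fps -> {tuned_fps:.0f} fps, about {saved_ms:.1f} ms shaved off each frame")

A couple of milliseconds per frame doesn't sound like much, but it can be the difference between holding a refresh-rate cap and missing it.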
 
When I was playing on my Ryzen 5 2600 (OC'd to 4 GHz) / 1660 Ti rig, I usually started at High and then adjusted settings as needed to get the most fps and the smoothest gameplay possible.

IMO, it depends on your PC's specs (CPU, GPU, RAM) as to whether you need to use custom settings.
If your PC can run a game well beyond your optimal fps at Ultra, I just don't see any reason to spend the time tweaking settings.
 
I make similar changes in most games; motion blur makes me queasy.
 
I find the best combo of settings that looks good but maintains 60 fps. I cap FPS to 60 to match my monitor, and I run the highest resolution that accomplishes that goal. Usually the sliders end up above Medium but below Ultra, and I can't tell the difference most of the time.
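For anyone curious what a 60 fps cap boils down to, here's a minimal sketch in Python; render_frame() is a hypothetical stand-in for the game's actual work, not any engine's real API:

import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60 Hz

def render_frame():
    # Hypothetical placeholder for simulation + rendering work.
    time.sleep(0.005)  # pretend the frame takes 5 ms

for _ in range(300):  # run a few seconds' worth of frames
    start = time.perf_counter()
    render_frame()
    # Sleep off whatever is left of the budget so we never outrun the 60 Hz monitor.
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)

Real games usually pair a cap like this with vsync or a driver-level limiter, but the idea is the same: spend the leftover frame time idle instead of rendering frames the monitor can't show.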
 
Ultra or bust.
My system cost $1900 when I built it...
If I wanted less I would stick to my consoles...
So yeah Ultra or bust.
Currently my gfx card and monitor are perfectly matched at 2K... Unfortunately, that means I'll need to upgrade both when it's time, but I still get Ultra for what I play (CoD MW).
 
I mean, turning down a setting like motion blur because you don't like its effect is one thing... but you are clearly an ultra-type gamer. ;)

and by Ultra I mean the absolute maximum settings.
But Ultra doesn't mean that. Ultra is just a canned preset in the game from the devs. Often there are still some IQ settings that can be improved upon to have truly 'maxed out' settings.
 
Forza Horizon 4 / Gears 5 / RDR2 are all good examples; I manually max them out... except water simulation in RDR2, which I keep one tick below the max setting.
 
When possible, but preferably 100+ fps, depending on the game.
If I can get a steady 144 fps with some graphics settings turned down, I do so.
 
Yes.

That's the goal, right? That is a reason people use a PC instead of a console: the higher IQ.

The only reason not to use Ultra settings is if you are not reaching your target fps or you play games competitively. Sure, some settings aren't terribly noticeable between Ultra and High... but yes, I go Ultra whenever I can.

It could be one of your goals. I don't agree that Ultra is the be-all and end-all reason for PC gaming, though the fact that you can run Ultra is a big plus.

Far more important, to me at least, are the degrees of freedom: running legacy stuff / infinite backwards compatibility, tweakability, modding, the input devices you can use and the gameplay they enable, and a far bigger library of games. Gaming is about the content, much more so than the picture quality.

Agreed on cutting the post-processing, too. I almost never run straight-up Ultra if the game offers a wide range of graphics options. Some stuff definitely gets turned off, and some games get tweaked to run at 120 fps.
 
I mostly play online first-person shooters where frame rate is king, so I tend to turn most settings down. This is also due in part to the limitations of what my current rig can do. Can't wait for an upgrade...
 
Yep, all the bells and whistles for me.

If it's a bit laggy, I'll just drop the res.
 
That's some unique graphics card you've got there: "GTX 170Ti" :D
 
Probably trying to type too fast. Thanks for picking up on that!
 
I have only low, all off, min resolution, 60 Hz.

Oh, my mistake. It's 2019 and I forgot how much I've loved the way modern games have looked these last few years, and it keeps getting better and better. So my answer is: if possible, and it's not competitive gaming, then always Ultra at 1440p @ 144 Hz, enjoying not only the game but also every detail in it.
 
On some games, when it gets too realistic, I'd get queasy... I had to play in short bursts till my body adjusted. As not all games are equal, custom is the way to go for optimal performance on any system... I'll take the highest settings I can get with G-Sync turned off and ULMB on, and play in the 100-120 Hz range. If I can't get that high, I'll revert to G-Sync with the monitor set back to 165 Hz.
 
We could also ask whether we need mods for games (graphical or otherwise), or even new games built on existing games.

Yes, I will always try for the highest image quality, with the trade-off between quality and performance depending on the game. I will overclock hardware.
 
Nah. I play Mirror's Edge Catalyst on Hyper settings, they're above Ultra. :cool:
 
I will never understand the "Ultra or go console" attitude. Such an elitist stance, and I really do not agree with it. Since when did gaming become all about pretty graphics?
The main reason I game on PC is the choice of games. RTS games suck on consoles (the few that are available), there are no MOBAs on consoles, and there are very few MMOs on consoles.

Also, mouse+keyboard > controller, any day (for me at least).

PS: I always game on "custom", as I rarely update my computer, so it's almost never strong enough to handle High/Ultra at an acceptable FPS.
 
Almost every video card upgrade I've bought was from a game being laggy due to "Needs moar Power!!"

I added a 7970 to make textures stop popping in Rage, lol.

I have to admit, I don't think the RX 480 was an upgrade over the 2x 7970 CrossFire setup, but it did drop 300 W of heat from the case.
The fan on this 480 doesn't even come on until I load a game.

It will play Crysis with everything all the way up, at 1920x1200. :)
 