
One setting that makes antialiasing look much more like it's supposed to on NVIDIA Maxwell and newer chips

After a lot of testing with different cards, I can say there's only one setting that makes AA on newer NVIDIA chips look like it should (where it otherwise didn't look or work correctly),
and no, it isn't forcing or setting AA through the control panel; you can leave everything at factory default.
Open nv_profile_inspector, go to Antialiasing - Gamma Correction, and set it to ON_IF_FOS (0x00000001 - "feasible or sensible" is my guess :) ). After that there should be no more need for any exotic SS modes or flag setting.
Now, I usually do force 16xAF, 4x (8x) AA, and 4x (8x) Transparency SuperSampling through the NV control panel, but I have a lot of older games where there's no other way to get AA.
Please test it and report your findings.
 
higher resolution sometimes works better than AA.
 
One final change: since ON_IF_FOS (0x00000001) isn't good enough for all programs, the combination with ON_ALWAYS (the default, 0x00000002) is,
which results in 0x00000003, or AA_MODE_GAMMACORRECTION_MASK as listed in NVAPI's gamma correction enumeration.
Paste the 0x00000003 value for AA - Gamma correction into nv_profile_inspector's "Global profile", and Maxwell (GTX 9xx) and newer cards will always produce a properly AA-ed (multisampled) image.
Thank you and goodbye. :-)
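For reference, those values match the AA_MODE_GAMMACORRECTION names in NVAPI's public driver-settings enumeration (the "Gamma Correction Enumeration" mentioned above), and 0x00000003 is simply the bitwise OR of the two "on" flags. A minimal sketch:

```python
# AA_MODE_GAMMACORRECTION values as named in NVAPI's driver-settings
# enumeration (assuming the SDK naming hasn't changed):
AA_MODE_GAMMACORRECTION_OFF       = 0x00000000
AA_MODE_GAMMACORRECTION_ON_IF_FOS = 0x00000001
AA_MODE_GAMMACORRECTION_ON_ALWAYS = 0x00000002  # driver default
AA_MODE_GAMMACORRECTION_MASK      = 0x00000003

# 0x3 is just both "on" bits set at once:
combined = AA_MODE_GAMMACORRECTION_ON_IF_FOS | AA_MODE_GAMMACORRECTION_ON_ALWAYS
assert combined == AA_MODE_GAMMACORRECTION_MASK
print(hex(combined))  # 0x3
```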
Thank you and goodbye. :-)
 
higher resolution sometimes works better than AA.
That is true most of the time. I run at 4K, and some games still require AA. Many don't, though!
 
Well, of course today we have really large resolutions, DSR, and similar supersampling, but I grew up with graphics cards, so to speak: from a Cirrus Logic VLB card in my 486 DX2-66, to an ATI 3D Charger (2 MB) with a Voodoo1 (4 MB) in my Cyrix 200+, and onward. I remember how nice everything looked with 4xS on my GeForce4 Ti 4200, but even better with my Radeon 9800 PRO at 4xAA/8xAF; now that was a real jump in image quality. Can anyone say Smoooooothvision? :)
Anyhow, AA still means a lot to me; one never forgets how it's supposed to look from the Radeon 9800 PRO era, and even the HD 6950 - the best one yet - but the GTXs can finally match it now and more, with 8xAA and 8x Transparency SS. But enough rambling. :D
 
If I read correctly, this enables transparency antialiasing, which is necessary for subpixel AA.
 
This setting only turns on proper gamma correction (where it's used), so games look right on these particular NVIDIA GPUs. Whether you use AA from the NV control panel or in-game is your choice.
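For anyone wondering what gamma-correct AA actually changes: a gamma-correct MSAA resolve averages the coverage samples in linear light instead of directly on the gamma-encoded values, which otherwise darkens edge pixels. A rough sketch using a plain 2.2 power curve (an approximation of sRGB, not the exact piecewise transfer function):

```python
GAMMA = 2.2

def naive_resolve(samples):
    # Average the gamma-encoded sample values directly (not gamma-correct).
    return sum(samples) / len(samples)

def gamma_correct_resolve(samples):
    # Decode to linear light, average, then re-encode for display.
    linear = [(s / 255) ** GAMMA for s in samples]
    avg = sum(linear) / len(linear)
    return 255 * avg ** (1 / GAMMA)

# An edge pixel half covered by white geometry over a black background:
samples = [255, 255, 0, 0]
print(round(naive_resolve(samples)))          # 128 - looks too dark
print(round(gamma_correct_resolve(samples)))  # 186 - perceptually half covered
```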
 
What games have you seen this effect on?
 
All kinds of games, OpenGL and D3D. With multisampling, gamma correction on Maxwell and newer GPUs is not done properly unless the 0x00000003 mode is used.
 
That's a really vague and unhelpful answer, dude.

AA gamma correction is on by default in modern NVIDIA drivers, too - this might be an issue with you overriding games' settings with your forced ones.
 
That's a really vague and unhelpful answer dude

AA gamma correction is on by default in modern Nvidia drivers, too - this might be an issue with you over-riding games settings with your forced ones
Dude, that's your belief, but I don't care, and by the way, forcing AA has nothing to do with it (I tested it). I found the thing that solves NV AA (though late), without any flag hunting or editing endless profiles, and even published it. As I said: test it and report findings (0x00000002 vs 0x00000003 where applicable - obviously not for titles that can't use multisampling).
 
Tried this, but I honestly can't tell a difference.

Can you post an example of what was supposed to be improved? Maybe I'm missing something here :D
 
Odd. :D
Well, of course I mean I see a difference in just about every game, and now AA looks the same as the ATI AA I enjoyed for so many years.
A really obvious example: Unreal (the original) in OpenGL with the oldunreal forum driver (I'm ut405player there). My OpenGL config, with an update, is posted there too.
Load map isvkran32, go through the big red door with the turbo-lift up, and you're at those green steps. AA there always looked jagged (and that's with 16xAF, 8xAA, 8xTSS), but now it's really smooth.
Unreal_GFGTX980LP4_gamma-correction_0x00000003_step.png

It used to look like this (with NV_CP gamma):
Unreal_0x00000002_step.png

While I now see there isn't such a difference in the screenshots I've taken, I surely see it while playing. I guess my monitor is at fault, I don't know. But it works for me, and I've shared it. :)

Crazy - I've now tested another monitor, and got the same bad results on screen but not in the screenshot (which comes from the card's RAM).
I guess DVI-D ain't digital enough for these new tech wonders (and the GTX 980 ain't exactly new); what a messed-up world!
0x00000003 is the solution for older monitors, then. :ohwell:
 
It certainly could be something with your monitor, if you've altered its default sharpness values
 
I have a Samsung 2253BW; it's old, I know, but it worked with my Radeons, while with the GTX 980 it's just not cooperating. I wonder why not?
And I didn't really alter anything, just lowered the brightness and contrast a bit on the monitor itself; it doesn't even have sharpness settings... I did install the factory color profile, and it's really the same with or without it.
 
Dude, that's your belief, but I don't care and by the way forcing AA has nothing to do with it (tested it). I found the thing that solves NV_AA (though late) without any flag hunting and editing endless profiles and even published it. As I said - test it and report findings (0x00000002 vs 0x00000003 where applicable - obviously not for titles that can't use multisampling).
Then why don't you provide some solid examples of what you've tried this out with, instead of just saying "all kinds of games"? I suspect people are asking because they want to validate your claims, which they can't do if they don't know your testing process and methodology, assuming you have one.
 
Indeed. It would appear the monitor is to blame, because nothing can be seen in the screenshots; I would have to test a DisplayPort monitor to see if the issue persists.
For now, that 0x3 is a bandage, a hack to fix what appears to be a whole other issue... I'm quite surprised, really; and there I was, so certain 0x3 was the cure. :laugh:
 
Those Samsungs aren't exactly known for color accuracy, but certainly for oversaturation and other tweaks. I've had them in front of me years ago. Compared to a decent IPS, it's night and day.

I have Samsung 2253BW it's old I know, but it worked with my Radeons but with GTX980 it's just not cooperating. I wonder why not?
And I didn't alter anything really, just lowered the brightness and contrast some on the monitor itself, it doesn't even have sharpness settings... I did install factory color profile and it's really the same with or without it.

You might want to go through the NVCP color settings first, before you start tweaking individual things outside of it. "Not cooperating" is, quite simply, nonsensical. Back to stock settings and go from there. Color profiles don't do much, if anything, in sRGB; the color space is quite limited.

Gamma should be neutral on the GPU side first; then you tweak the monitor however possible to hit your target gamma of 2.1~2.3 depending on preference, ideally 2.2. You can then check the color response on this site:


If results are still poor after this, you can tweak the gamma curve on the GPU side. But that's always going to crush something along the way, or create some banding, etc. Perfect accuracy simply requires a neutral curve across all greyscale values and colors, and those SyncMasters are not going to offer it.
 
I have checked those LCD monitor test pages a few times, before and now, and everything looks OK there; I think I need to buy another monitor.
Would a Dell P2217 be good for now, compared to this Samsung garbage? There are a few used ones available... I'm quite perplexed; here I thought I had a decent monitor.
 
I have checked those LCD monitor test pages a few times before and now and everything looks ok there, I think I need to buy another monitor.
Would a Dell P2217 be good for now compared to this Samsung garb? There are a few used ones available... I'm quite perplexed and here I thought I had a decent monitor.

If it's about accuracy and regular 60 fps gaming, get an LG IPS at the bottom of the stack. FWIW, IPS is the best option anyway; VA at the lower end is hit or miss, with a lot of potential misses. At high refresh rates this changes to an advantage for VA, because bang-for-buck heavily leans in favor of VA compared to high-refresh IPS.

Is 22 inch enough for you? The mainstream bog-standard diagonal for 1080p is 24 inch. A new IPS should be around 100~150 bucks, and it's certainly preferable to another used-up ancient LCD. Uniformity and color accuracy do suffer over time as panels are used; it can be quite a bad experience diving into that second-hand.

And even a new TN, which could offer some more features within budget, is going to be miles better than past-gen smaller-diagonal panels.
 
Thank you so much.
I'll be using gamma correction 0x00000003 for now, together with "DitherAlgo6"=dword:00000003 and "DitherAlgo8"=dword:00000003 (at the active driver's registry location; you find where that is via HKLM\HARDWARE\DEVICEMAP\VIDEO, under Device\Video0). It just looks so much better; despite all attempts, nothing beats it.
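Written out as a .reg fragment, this is roughly what the dither tweak amounts to. The key path below is only a placeholder: the display-adapter class GUID is the standard one, but the 0000 subkey index varies per machine, so resolve the actual key via the Device\Video0 value under HKLM\HARDWARE\DEVICEMAP\VIDEO as described above:

```
Windows Registry Editor Version 5.00

; Placeholder path - substitute the key that Device\Video0 actually points to
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"DitherAlgo6"=dword:00000003
"DitherAlgo8"=dword:00000003
```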
But yeah, I'll look at new 24" IPS monitors, although 22" at 1680x1050 is plenty good enough for me.
 
New finding (it might just be my monitor, though) - I now have a method to show that the first image, with Gamma_Correction_MASK (0x00000003), in my post above looks better than the second image, with NVIDIA's standard gamma correction (0x00000002):
Open both in any image viewer that can increase gamma to 2.2 (under Image -> Color corrections in IrfanView). The first image, with Gamma_Correction_MASK, will remain anti-aliased, while
the second image, with the NV CP-standard gamma correction (0x00000002), will become jagged, as if not anti-aliased at all (and both were made with 8x multisampling).
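That viewer adjustment can be reproduced on raw pixel values: a viewer's "gamma 2.2" control applies out = 255·(in/255)^(1/2.2). A small sketch (the input values are illustrative edge blends, not taken from the screenshots) showing how much each blend value shifts under the boost:

```python
def viewer_gamma_boost(value, gamma=2.2):
    # What an image viewer's "increase gamma to 2.2" control does
    # to an 8-bit value: out = 255 * (in / 255) ** (1 / gamma)
    return 255 * (value / 255) ** (1 / gamma)

# Two hypothetical edge-blend values: 128 (samples averaged in gamma
# space) and 186 (samples averaged in linear light, then re-encoded).
for blend in (128, 186):
    print(blend, "->", round(viewer_gamma_boost(blend)))
# prints: 128 -> 186
#         186 -> 221
```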
 
the second image with nvidia CP-standard color correction (0x00000002) will become jagged
You cannot have alpha antialiasing (transparency antialiasing) if your gradients don't work.

It will still calculate, but it will nullify with the wrong coefficient.
 
Except those surfaces aren't alpha!
All surfaces are alpha if you are using alpha to resolve blending in the viewing perspective. Alpha textures are applied during texture sampling to do antialiasing by default; that is how the rasterizer does its thing. There is no separate AA hardware - they are just Z blenders and texture samplers.
 