
NVIDIA DLAA Anti-Aliasing

In my experience, the flickering and shimmering issue still exists. I've seen it in games like Marvel's Avengers and Control, and that was only recently. Perhaps the issue is less pronounced in newer titles that support DLSS, but it is very apparent to me.

Yeah, the flickering is reduced to some extent with DLSS, and sometimes DLSS can fully suppress it, like in CP2077.
 
I just looked at the 4K screens and TAA looks better (less Vaseline effect) than DLAA.
If NVIDIA can tweak DLAA a little more, it will probably be a little better than TAA, but not by much.
 
Sim racing VR games need this badly. Other VR games too, probably.

Bring it :)

AMS2, ACC, AC, rFactor 2, iRacing
 
Regarding blur: even if you scaled a photo perfectly by analog means, like microfilm, some blur always occurs because of the limited source.
In video output some people are okay with flickering parts of the picture, but not me.
I'm okay with a little bit of blur, because it feels more natural to me... my eyes are not as good as they were when I was 15, so advancing blurriness is nothing new to me.
 
Pretty much how it always was: either you sacrifice anti-aliasing for more sharpness (the sharpness-filter-at-100%, "you don't need AA at 4K" type of guy), or you sacrifice sharpness for a more stable image (the SGSSAA and SSAA type of guy), and DLSS/DLAA are the most effective at suppressing shimmering and aliasing so far. What I would really want to see is DLSS/DLAA implemented in a game with traditional MSAA where SGSSAA works well; that would be a comparison between the two best solutions to date, and you could maybe also throw some 2.25x DSR into the mix. It could help some people understand how good DLSS is, because aliasing and shimmering suppression with DLSS Quality would be quite close to MSAA+SGSSAA and sometimes even better, only with performance massively increased instead of halved or worse.

What I would also like to see in comparisons is native res with DLAA compared against supersampled res with DLSS Quality, so for example 1440p DLAA vs 4K DSR DLSS Quality
 
Damned if I can see any difference between DLAA & TAA at 4K.

...apart from the framerate dropping.

Move along, nothing to see here.

Make it full screen and look especially at the trees' inner texture and the building door in the back. It could just be sharpening and my silly brain saying "sharper is better", but TAA is noticeably sharper than DLAA and thus looks a tad better to me.
 
@maxus24 to quote you - "Compared to TAA and DLSS, DLAA is clearly producing the best image quality"

No, it isn't. I zoomed in on each of the very cool side-by-side comparators and TAA was always the sharpest and had better FPS than DLAA. So what am I missing here, fella...?
 
Call me strange, but I've been upscaling Elder Scrolls Online since closed beta and it looks exactly the same, honestly.

Also, all the AA methods in use here create blur/washed-out effects that reduce the definition of colors. For understandable reasons, but it's there.

I'll take a no-AA upscale for this game any day before I'll add blur to the image... The fact is this engine isn't producing high amounts of detail to begin with, so you can AA all you want, but it will only add blur and muddy details further. The high res source simply won't offer more.

So yay, your tree branches and wires won't 'flicker' (it's flickering now... lol, I think it's just pixel density and you sitting too close to your screen, especially if you're running more than 1080p), but now you get them in blurry glory alongside everything else. To me that leaves the impression of 'creating problems to use solutions'. I'm more of a native-res 1440p guy myself, combined with the recommended view distance for the monitor size. No hassle, no need to stack 1160991 technologies on top to produce a marginally better image with inherent problems.

Totally pointless exercise for polishing turds. I suppose it makes you feel better about your tensor cores, or something :D

Native render is fine these days and native res is high enough...
 
But is your FPS the same using both methods?

Fun fact: in ESO you're always limited by your CPU before your GPU, and by the amount of network-related assets on screen. The reviewer even points this out in one or two situations, being CPU limited, and that's in an unpopulated area.

More people and your FPS takes a nosedive. Entered an instance like a dungeon? You get maximum FPS. The game runs on a toaster otherwise.
 
That's one of the best things about DLSS in my experience. On quality modes it serves as a great AA method that stops flickering and staircase effects in games, and it improves performance on top.
 
Yeah, DLSS with Sharpen+ pretty much provides the best image quality IMO: no flickering, and the sharpness adjustment makes it look just right in every scenario, with +50% FPS to boot.
 
TAA is sharper and both DLSS and DLAA have more blur. Look at the pattern on the wall on the right of the image.

Edit: I mean at 4K.
I noticed that too... funnily enough, if I needed AA I would go with TAA, but at 1440p and above I hardly find any AA (or DL-whatever) needed.
 
The same scene that is shown in the screenshots at 4K with zero AA - 144 FPS.
1080p and 1440p stay at 180 FPS because of a CPU bottleneck in that particular scene.
So we're talking about 111 FPS to render a scene with DLAA when the same scene runs at 144 FPS without DLAA?
In other words, DLAA robs 23% of the performance, when Nvidia are claiming it would be "free" because DLAA runs on Tensor cores that aren't otherwise used by games, not through the shaders that are busy processing the frame...

Maybe I'm missing something here, but a 23% performance hit is certainly not "free".

FXAA is free.
TXAA is cheap.
TAA is moderately expensive, but worth it in most cases.
DLAA at 23% is expensive enough that you could just use DSR to get more detail instead.

What am I missing here? Perhaps Nvidia are right that DLAA has no shader cost because it runs on Tensor cores, but the tensor cores themselves are so shit that they're holding up the rendering pipeline?
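
For anyone checking the numbers, here's a quick back-of-the-envelope sketch of where that 23% figure comes from, using only the 144 FPS and 111 FPS framerates quoted above (variable names are just for illustration):

```python
# Back-of-the-envelope check of the figures quoted above:
# 144 FPS without DLAA vs. 111 FPS with DLAA (same scene, 4K).
fps_no_aa = 144.0
fps_dlaa = 111.0

frame_time_no_aa = 1000.0 / fps_no_aa              # ~6.9 ms per frame
frame_time_dlaa = 1000.0 / fps_dlaa                # ~9.0 ms per frame

fps_drop = 1.0 - fps_dlaa / fps_no_aa              # ~0.23 -> roughly a 23% FPS drop
dlaa_cost_ms = frame_time_dlaa - frame_time_no_aa  # ~2.1 ms added per frame

print(f"FPS drop: {fps_drop:.1%}, implied per-frame cost: {dlaa_cost_ms:.1f} ms")
```

So the 23% figure checks out, and the implied fixed cost works out to roughly 2.1 ms per frame.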
 
You are missing the following:

The DLAA versus TAA image quality difference is not great enough to justify a 23% performance hit.

Aside from the above, this is a horrible game to use for the launch of this.
 
It takes ~2.1 ms for the tensor cores to output a 4K image, which means the relative cost of DLAA is lower if you target 60 FPS.

Let's say that without AA a single frame at 60 FPS takes 16.6 ms; DLAA would add ~2.1 ms of latency and reduce that to roughly 53 FPS (18.7 ms per frame), an ~11% FPS drop.
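
A minimal sketch of that arithmetic, assuming the DLAA pass is a roughly fixed ~2.1 ms per frame (the helper function below is just for illustration):

```python
# Assumed (approximate) fixed per-frame cost of the DLAA pass, in milliseconds.
DLAA_COST_MS = 2.1

def fps_with_dlaa(base_fps: float) -> float:
    """FPS after adding a fixed DLAA cost to each frame's render time."""
    frame_time_ms = 1000.0 / base_fps + DLAA_COST_MS
    return 1000.0 / frame_time_ms

for base in (60.0, 144.0):
    new_fps = fps_with_dlaa(base)
    drop = 1.0 - new_fps / base
    print(f"{base:.0f} FPS -> {new_fps:.0f} FPS with DLAA ({drop:.0%} drop)")
```

The same fixed cost reproduces both numbers in the thread: 60 FPS drops to ~53 FPS (~11%), while 144 FPS drops to ~111 FPS (~23%), which is why the relative hit looks much worse at high framerates.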

Given that 4K DLSS Quality maintains excellent detail, anti-aliasing, and texture-flickering suppression, I would use 4K DLSS Quality with a sharpen filter to get nice image quality and a high framerate. DLAA is more suited to 1080p and 1440p.

111 / 123 FPS = 90%, which means DLAA costs only about 10% FPS but has much better image stability.
 
Well, looking at the sample images at 4K: TAA seems sharper and compromises textures less than DLAA.
At 4K I don't use DLSS (even in Quality mode) because of how heavily it blurs the image compared to native res without AA or with TAA.

However, if beyond static screenshots DLAA can take care of artifacting, shimmer, pixel crawl, etc., it may be worth a consideration. One can't make calls on this from static images alone.
To me DLSS is a giant step backwards just to justify sales of RTX cards and the performance impact of ray tracing. Some of us are at 4K and 8K because we want greater resolution (at 8K I don't even use AA) and do not wish to render at a lower resolution than native (that kind of defeats the purpose of having 4K or 8K monitors).

So maybe not in this one title, but DLAA may flourish into an alternative to TAA. Let's hope and see, fingers crossed.
 
Nvidia needs more purposes to justify its extra hardware and die size against the competition. That's what we are missing here. It was never about raw perf or high quality; it's about funding RTX. Realistically this is such a marginal change that it's just that: "different".

And proprietary. Shareholders and Huang figured out two gens ago that it was becoming the end of profit for GPUs, as raw perf is sufficient to push content at very high res, the point where you enter the realm of 'beyond desktop'. High refresh and 4K got slaughtered rather quickly.

They needed new hurdles to waste clock cycles on. A simple effect wasn't going to cut it, but RT is so difficult it will last for decades AND it creates another perf metric alongside raster perf. Two things to upsell instead of one...
 
So just like DLSS, this is just another Nvidia AA solution, but without the upsampling built into it. I guess that's nice for small studios that can't come up with their own solution? Hopefully more solutions will slowly make their way onto the PC, since they've been in use for a while now in AAA console titles (and some of them are perhaps even better than Nvidia's).
 
Upscaled low-resolution image it is. Better to render at a higher resolution for a small screen: clearer, sharper image.
 
Hot take: there's no pure "gaming GPU" anymore at Nvidia; I believe GeForce is a prosumer brand now. Phones have been using their machine-learning hardware for photo and video operations for a while, and those kinds of things are only just making their debut in the PC space with Adobe Photoshop and the like. Nvidia has been trying really hard to sell their RTX laptops as creative tools, a market with many Apple aficionados, so I really doubt we'll see them getting rid of it in their GPUs and letting Apple (and soon Intel) say stuff like "you won't find machine-learning acceleration in competing devices".

Even beyond gaming, Microsoft seemed pretty happy about adding on-device machine learning and AI to Windows, but the general public doesn't seem to care :D
AI Platform for Windows Developers - Windows Developer Blog

 
Every time I see this comparison tool I think, "Why not just let us compare everything with everything?" Like now: what has better quality overall, "1080p + DLAA" (178 FPS), "4K + DLSS Performance" (187 FPS), or "1440p + DLSS Quality" (187 FPS)? Because if "4K / 1440p + DLSS / TAA" looks better and has more frames, then "1080p + DLAA" becomes totally useless, right?
 
Why not just let us compare everything with everything?
Hm, yes. Agreed. Comparing 2 at a time from an image is not that useful.
Also: movement looks a lot different using these features, so a still image is often not enough to judge. :rolleyes:
 