Tuesday, September 21st 2021

NVIDIA Prepares to Deliver Deep Learning Anti-Aliasing Technology for Improved Visuals, Coming first to The Elder Scrolls Online

Some time ago, NVIDIA launched its Deep Learning Super Sampling (DLSS) technology to deliver AI-enhanced image upscaling to your favorite AAA titles. It uses proprietary algorithms developed by NVIDIA and relies on the computational power of the Tensor cores found in GeForce graphics cards. In the early days of DLSS, NVIDIA talked about an additional technology called DLSS 2X, which was supposed to be based on the same underlying techniques as DLSS but applied at native resolution to improve image quality, rather than to upscale. That technology got its official name today: Deep Learning Anti-Aliasing, or DLAA for short.

DLAA uses the same underlying technology as DLSS and aims to bring NVIDIA's AI-based anti-aliasing to video games. It runs on the Tensor cores found in GeForce graphics cards and promises much better visual quality without sacrificing performance, since the work happens on dedicated cores. The technology will reportedly be offered alongside DLSS and other anti-aliasing options in in-game settings. The first game to support it is The Elder Scrolls Online, which now has it on the public test server, with general availability to follow later on.
Source: via Tom's Hardware

50 Comments on NVIDIA Prepares to Deliver Deep Learning Anti-Aliasing Technology for Improved Visuals, Coming first to The Elder Scrolls Online

#1
Crackong
Doesn't DLSS already have AA effects built in?
Why do the same thing twice?

Or is it a cut-down version of DLSS that focuses on edges only, so it is way easier to implement?
Posted on Reply
#2
napata
Finally, after 3 years, they released it. Better late than never. Hopefully they implement it in every game that has DLSS. As someone who didn't jump on the 4K bandwagon, this is the most exciting part of DLSS, and I was hyped for it when they revealed it ages ago. I've been doing it myself with a combination of DSR & DLSS, but I assume a native solution is going to work better.

I'm not really a fan of upscaling, as you're basically downgrading a ton of settings that scale with internal res. While that isn't a problem at 4K, where the internal res is high enough even when upscaling, it's certainly noticeable at lower resolutions, especially the RT effects. No upscaling technique can upscale effects, so you're always stuck with low-res effects if you use it.
CrackongDoesn't DLSS already have AA effects built in?
Why do the same thing twice?

Or is it a cut-down version of DLSS that focuses on edges only, so it is way easier to implement?
If it works like they initially announced it, it's just DLSS with a render scale of 100% or higher.
Posted on Reply
#3
wolf
Performance Enthusiast
Very interested to see this in action with my own eyes: the excellent AA component of DLSS as a standalone option on native-res rendering, for when you don't need the performance boost. I wonder what parts of the DLSS pipeline remain intact and how the fine details will look.
Posted on Reply
#4
Space Lynx
Astronaut
napataIf it works like they initially announced it, it's just DLSS with a render scale of 100% or higher.
does this mean DLSS 2.0 will make more sense for 1080p gamers who want to render higher? currently DLSS 2.0 only makes sense for 1440p and 4K gamers, from what I understand. so will this tech make it worthwhile for 1080p users wanting to scale up to higher resolutions?

@nguyen can you explain? it's getting confusing as **** LOL
Posted on Reply
#5
VulkanBros
But how many games support DLSS? I only have 2 games (Metro Exodus and Chernobylite) out of maybe 50 games.
Posted on Reply
#6
Vayra86
Still tied to Nvidia-only Tensor core support, still just as useless for long-term implementation.

Good steps, zero strides. It's still going to be interesting to see how this pans out beyond the initial wave of support.
VulkanBrosBut how many games support DLSS? I only have 2 games (Metro Exodus and Chernobylite) out of maybe 50 games.
This. We've been here a dozen times before. Nothing proprietary lasts longer than a few generations beyond its introduction. Even Gsync, which doesn't even require dev support, didn't last and got effectively replaced by something universal.

The real thing is going to happen when all three GPU families have 'cores' to do the work and can somehow share in the implementation/support.
Posted on Reply
#7
P4-630
Vayra86Even Gsync
I'm the proud owner of a monitor with a genuine G-Sync chip. :D
Posted on Reply
#8
londiste
VulkanBrosBut how many games support DLSS? I only have 2 games (Metro Exodus and Chernobylite) out of maybe 50 games.
With the additions in the latest driver, 100+
lynx29does this mean DLSS 2.0 will make more sense for 1080p gamers who want to render higher? currently DLSS 2.0 only makes sense for 1440p and 4K gamers, from what I understand. so will this tech make it worthwhile for 1080p users wanting to scale up to higher resolutions?
They seem to be doing the reasonable thing and not calling this DLSS. The initial DLSS 2X name for this application did not make all that much sense, and DLAA seems a good enough compromise. While the underlying technology is still the same, the application is different: DLAA does anti-aliasing with no upscaling, runs at the same native resolution, and its only purpose is smoothing the edges (which, based on DLSS + DSR, should get a pretty good result).
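To make that concrete, here is a minimal sketch of the announced behavior; render_scene and dlss_reconstruct are hypothetical stand-ins, since NVIDIA's actual pipeline is closed source:

```python
# Hypothetical sketch: DLAA as "DLSS at 100% render scale".
# render_scene() and dlss_reconstruct() are illustrative stand-ins,
# not real NVIDIA APIs.

def render_scene(width, height):
    """Stand-in for the game's renderer at a given internal resolution."""
    return {"res": (width, height)}

def dlss_reconstruct(frame, output_res):
    """Stand-in for the temporal AI reconstruction pass (Tensor cores)."""
    return {"res": output_res, "antialiased": True}

output_res = (3840, 2160)

# DLSS Quality mode: render at ~67% per axis, reconstruct up to output res.
dlss_frame = dlss_reconstruct(render_scene(2560, 1440), output_res)

# DLAA: render at the full output res and run the same pass purely as AA.
dlaa_frame = dlss_reconstruct(render_scene(*output_res), output_res)
```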
Posted on Reply
#9
nguyen
lynx29does this mean DLSS 2.0 will make more sense for 1080p gamers who want to render higher? currently DLSS 2.0 only makes sense for 1440p and 4K gamers, from what I understand. so will this tech make it worthwhile for 1080p users wanting to scale up to higher resolutions?

@nguyen can you explain? it's getting confusing as **** LOL
Well, you can try DSR 2.25x on your 1080p screen and see that anti-aliasing is better resolved. DLAA is doing the same thing here, but with lower overhead (~10% FPS vs 50% FPS or more with DSR).

With TAA being so dominant in today's AAA games, DLAA can easily replace TAA because it's better and reduces the annoying flickering you can see in games like the new Deathloop.
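To put rough numbers on the DSR overhead mentioned above (the FPS figures are the claim from this comment; only the pixel arithmetic below is exact):

```python
# Back-of-the-envelope pixel cost of DSR 2.25x on a 1080p screen.
# DSR factors scale the total pixel count, i.e. 1.5x per axis for 2.25x.

native = (1920, 1080)
factor = 2.25
per_axis = factor ** 0.5  # 1.5x per axis

dsr_res = (round(native[0] * per_axis), round(native[1] * per_axis))
print(dsr_res)  # (2880, 1620)

pixel_ratio = (dsr_res[0] * dsr_res[1]) / (native[0] * native[1])
print(pixel_ratio)  # 2.25x the shading work, hence the 50%+ FPS hit

# DLAA keeps the native pixel count and only adds the reconstruction
# pass, which is why its claimed overhead is closer to ~10%.
```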
Posted on Reply
#10
Bomby569
I must say I was an AMD man, then they completely destroyed my trust with the 5700 XT. Honestly, I was never even considering changing teams because of the freesync vs gsync thing. Now I'm completely sold on the green team's advantages. DLSS is amazing, keep them coming.
Posted on Reply
#11
b4psm4m
AFAIK, if you want gsync ultimate then you still need a fully certified display.

I recently wanted to replace my Asus MG278Q 144 Hz gsync compatible (listed as compatible on nvidia's website) with an iiyama G-Master Red Eagle GB2770QSU-B1 27" WQHD IPS FreeSync Premium Pro; however, the monitor is not listed on nvidia's site as gsync compatible and I did not manage to find (in my albeit limited research) anyone who had got the monitor to work with gsync. Additionally, you cannot use a gsync compatible display with HDR; you need a fully certified screen to do that.

It should be an interesting piece of tech, even if its main purpose turns out to be poking the competition into doing something similar.

Even if we think of DLSS and Gsync as proprietary/locked-down technologies, I doubt AMD freesync/FSR would exist if the nvidia options did not (certainly their time to market would have been considerably longer). Whether you use it or not, where would hardware-accelerated raytracing be without nvidia?

As an observation, nvidia always seem to set the standard and then others follow. Maybe this scenario will improve with Intel entering the game?
Posted on Reply
#12
londiste
nguyenWell, you can try DSR 2.25x on your 1080p screen and see that anti-aliasing is better resolved. DLAA is doing the same thing here, but with lower overhead (~10% FPS vs 50% FPS or more with DSR).

With TAA being so dominant in today's AAA games, DLAA can easily replace TAA because it's better and reduces the annoying flickering you can see in games like the new Deathloop.
TAA is a bit of an interesting case, as the implementation has similarities to DLSS in its use of temporal information and overall similar source material. The results of TAA when it comes to anti-aliasing are a mixed bag though; it tends to blur the image considerably.

FXAA, MLAA and other postprocessing AA methods are usually notably worse than better methods like MSAA, which has fallen out of favor due to its excessive performance hit with today's rendering methods. If DLAA can slot in between in terms of performance hit, with quality similar to or better than MSAA, it will do just fine.
Posted on Reply
#13
Bomby569
b4psm4mAFAIK, if you want gsync ultimate then you still need a fully certified display.

I recently wanted to replace my Asus MG278Q 144 Hz gsync compatible (listed as compatible on nvidia's website) with an iiyama G-Master Red Eagle GB2770QSU-B1 27" WQHD IPS FreeSync Premium Pro; however, the monitor is not listed on nvidia's site as gsync compatible and I did not manage to find (in my albeit limited research) anyone who had got the monitor to work with gsync. Additionally, you cannot use a gsync compatible display with HDR; you need a fully certified screen to do that.

It should be an interesting piece of tech, even if its main purpose turns out to be poking the competition into doing something similar.

Even if we think of DLSS and Gsync as proprietary/locked-down technologies, I doubt AMD freesync/FSR would exist if the nvidia options did not (certainly their time to market would have been considerably longer). As an observation, nvidia always seem to set the standard and then others follow. Maybe this scenario will improve with Intel entering the game?
Mine is not certified and works great. Try searching for some reviews/feedback.
I have no interest in HDR, for now anyway.
Posted on Reply
#14
napata
londisteTAA is a bit of an interesting case, as the implementation has similarities to DLSS in its use of temporal information and overall similar source material. The results of TAA when it comes to anti-aliasing are a mixed bag though; it tends to blur the image considerably.

FXAA, MLAA and other postprocessing AA methods are usually notably worse than better methods like MSAA, which has fallen out of favor due to its excessive performance hit with today's rendering methods. If DLAA can slot in between in terms of performance hit, with quality similar to or better than MSAA, it will do just fine.
Aliasing is more than just jaggies though. FXAA, MLAA and honestly most AA methods don't do anything for pixel crawling, shader aliasing, shimmering, etc. They only address jaggies, and in current games they're just bad AA options.

Back in the day, graphics were simple, with almost no foliage & geometry. You could get away with not running a temporal AA method, but nowadays that's just not really an option. You're left with either TAA, downsampling, or a supersampling-based AA method like SGSSAA.
Posted on Reply
#15
ne6togadno
b4psm4myou cannot use a gsync compatible display with HDR,
HDR works just fine on my gsync compatible monitor with games that support it (NMS, to name one).
Posted on Reply
#16
spnidel
more proprietary brand-based tech segregation
boooooooo
Posted on Reply
#17
_Flare
The cheapest new card that can use this tech costs 460 euros at the moment in Germany: the RTX 2060 6GB, which launched in January 2019.
Posted on Reply
#18
Chrispy_
The problem with anything from Nvidia that starts with 'DL' is that by default it won't work.

DLSS, DLAA - nonfunctional and unsupported until a title has been 'compiled', or whatever it is they do back at Nvidia by throwing it at their AI deep-learning supercomputer to train it on that game.

Launch, when a game is at its most popular and performs worst, is when DLSS and DLAA are most needed. The very nature of requiring time for the game to get popular enough for Nvidia to take notice, feed it into Deep Thought or whatever, and then have it spit out a profile that sits and waits for the next bi-monthly game-ready driver to come around is TOO DAMNED SLOW. People have likely moved on to a newer title by then.
Posted on Reply
#19
Bomby569
Chrispy_The problem with anything from Nvidia that starts with 'DL' is that by default it won't work.

DLSS, DLAA - nonfunctional and unsupported until a title has been 'compiled', or whatever it is they do back at Nvidia by throwing it at their AI deep-learning supercomputer to train it on that game.

Launch, when a game is at its most popular and performs worst, is when DLSS and DLAA are most needed. The very nature of requiring time for the game to get popular enough for Nvidia to take notice, feed it into Deep Thought or whatever, and then have it spit out a profile that sits and waits for the next bi-monthly game-ready driver to come around is TOO DAMNED SLOW. People have likely moved on to a newer title by then.
The devs can implement DLSS themselves; they don't need nvidia to do it. Also, 2.0 changed the way it works; there's no longer such a dependency on the AI supercomputers at nvidia headquarters. That's what I understood, anyway.

You'll probably have launch-day DLSS in games that are nvidia-branded and not in the ones that are AMD-branded. The bi-monthly thing you mention is for older titles.
Posted on Reply
#20
Ravenas
Bomby569I must say I was an AMD man, then they completely destroyed my trust with the 5700 XT. Honestly, I was never even considering changing teams because of the freesync vs gsync thing. Now I'm completely sold on the green team's advantages. DLSS is amazing, keep them coming.
The marketing of proprietary DLSS features, which can only be implemented using Tensor cores, is being countered by open-source FSR, which works on any modern graphics card from any OEM. The same thing applies to GSync / Freesync. Before you go flaming, oh you're just an AMD fanboy, I've owned (2x) 3090 and I own a GSync monitor.
Posted on Reply
#21
nguyen
RavenasThe marketing of proprietary DLSS features, which can only be implemented using Tensor cores, is being countered by open-source FSR, which works on any modern graphics card from any OEM. The same thing applies to GSync / Freesync. Before you go flaming, oh you're just an AMD fanboy, I've owned (2x) 3090 and I own a GSync monitor.
Yeah, sure, flipping those 3090s for profit and then complaining :rolleyes:.
Whatever suits your narrative, man. I hope those 6900 XTs earn you money too.

But of course we are talking about gaming here, where Nvidia's tech is much, much more interesting than some Lanczos derivative.
Posted on Reply
#22
londiste
RavenasThe marketing of proprietary DLSS features, which can only be implemented using Tensor cores, is being countered by open-source FSR, which works on any modern graphics card from any OEM. The same thing applies to GSync / Freesync. Before you go flaming, oh you're just an AMD fanboy, I've owned (2x) 3090 and I own a GSync monitor.
FSR is markedly inferior. It might get a 2.0, but improvements there are quite likely to follow what DLSS and XeSS are currently doing.
Wide support/adoption is a quality in itself, but visually there is a difference.
Posted on Reply
#23
nguyen
londisteFSR is markedly inferior. It might get a 2.0, but improvements there are quite likely to follow what DLSS and XeSS are currently doing.
Wide support/adoption is a quality in itself, but visually there is a difference.
FSR 1.0 can even reduce image quality below the internal resolution it's running at; for example, 4K FSR Quality looks worse than plain upscaled 1440p.

Here is how Computerbase.de described FSR in Deathloop:
All in all, even FSR on "Ultra Quality" clearly looks worse in Ultra HD than the native resolution - and that is still the best-case scenario. Even compared with a native resolution of the same pixel count (Ultra HD + FSR on "Quality" versus WQHD), FSR simply doesn't manage to look better; the opposite is actually the case. The lower quality levels only look worse. This also applies to FSR at lower resolutions.
So yeah, just free crap no one uses.
Posted on Reply
#24
ZoneDymo
The problem with DLSS is... like so many projects from big N, it's just temporary; something will come along that works on everything, some universal tech that will just replace it.
And then you will be looking back at those silly DLSS games from 10 years ago which no current GPU actually supports anymore.
Posted on Reply
#25
badboy87
How come they don't implement this in a more graphics-heavy game? ESO uses a heavily modified Morrowind engine, so of course this new AA tech won't completely destroy your FPS. Sneaky tactics as usual from Nvidia. "sigh"
Posted on Reply