Monday, February 18th 2019

NVIDIA: Image Quality for DLSS in Metro Exodus to Be Improved in Further Updates, and the Nature of the Beast

NVIDIA, in a blog post/Q&A on its DLSS technology, promised implementation and image quality improvements for its Metro Exodus rendition of the technology. If you'll remember, AMD recently advocated other, non-proprietary anti-aliasing technologies such as TAA and SMAA for achieving the desired quality across resolutions, saying that DLSS introduces "(...) image artefacts caused by the upscaling and harsh sharpening." In its blog post, NVIDIA dissects DLSS and its implementation, also clarifying some lingering questions about the technology and its resolution limitations that some of us here at TPU had already wondered about.

The blog post describes some of the limitations of DLSS technology, and why exactly image quality issues might be popping up here and there in titles. As we knew from NVIDIA's initial RTX press briefing, DLSS basically works on top of an NVIDIA neural network. Titled NGX, it processes millions of frames from a single game at varying resolutions with DLSS, and compares them to a given "ground truth image" - the highest quality possible output sans any shenanigans, generated from pure raw processing power. The objective is to train the network towards generating this image without the performance cost. This DLSS model is then made available for download via NVIDIA's client and runs locally on your RTX graphics card, which is why DLSS image quality can be improved over time. This also helps explain why closed implementations of the technology, such as 3DMark's Port Royal benchmark, show such incredible image quality compared to, say, Metro Exodus - there is a very, very limited number of frames that the neural network needs to process towards achieving the best image quality.
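To make that concrete, here is a minimal sketch of the training idea in PyTorch. This is purely illustrative: the toy network, L1 loss and random stand-in data are our assumptions, since NVIDIA's actual NGX architecture and data pipeline aren't public.

```python
# Conceptual sketch of DLSS-style training (hypothetical; not NVIDIA's NGX code).
# A small upscaling network learns to reproduce high-quality "ground truth"
# frames from lower-resolution rendered input.
import torch
import torch.nn as nn

class ToyUpscaler(nn.Module):
    """Stand-in 2x upscaler; the real DLSS network is unpublished."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1),  # 4 channels per pixel = one 2x2 block
            nn.PixelShuffle(2),                  # rearrange channels into 2x resolution
        )

    def forward(self, x):
        return self.body(x)

model = ToyUpscaler()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()  # per-pixel distance from the ground truth

# Random tensors standing in for (low-res render, ground-truth frame) pairs.
low_res = torch.rand(4, 3, 270, 480)
ground_truth = torch.rand(4, 3, 540, 960)

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(low_res), ground_truth)  # how far from "perfect"?
    loss.backward()
    optimizer.step()
```

The per-game, per-resolution training NVIDIA describes is essentially this loop run at data-center scale, over millions of real game frames instead of random tensors - which is exactly why a short, fixed benchmark sequence like Port Royal trains up so much faster than a sprawling game.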
Forumites: This is an Editorial

The nature of DLSS means the network needs to be trained for every conceivable resolution, since different rendering resolutions require different processing for the image to resemble the ground truth we're looking for. This is the reason for Metro Exodus' limits on DLSS - it's likely that NVIDIA didn't actually choose not to enable it at 1080p with RTX off; it was just a case of there not being enough rendering time on its NGX cluster, before launch, to cover all of the most popular resolutions, with and without RTX, across its three available settings. So NVIDIA coupled both settings, to allow the greatest image quality and performance improvements for those gamers who want to use RTX effects, and didn't train the network for non-RTX scenarios.
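Viewed that way, the Metro Exodus restrictions fall out naturally. Here's a hypothetical sketch of the gating logic - the keys, file names and function are all our assumptions, not NVIDIA's implementation:

```python
# Hypothetical illustration: DLSS as a lookup of per-combination trained models.
TRAINED_MODELS = {
    # (width, height, rtx_enabled) -> shipped network weights (names invented)
    (3840, 2160, True): "metro_dlss_4k_rtx.bin",
    (2560, 1440, True): "metro_dlss_1440p_rtx.bin",
    (1920, 1080, True): "metro_dlss_1080p_rtx.bin",
    # No RTX-off entries: those combinations weren't trained in time for launch.
}

def dlss_available(width: int, height: int, rtx_enabled: bool) -> bool:
    """The option can only be offered where a trained model exists."""
    return (width, height, rtx_enabled) in TRAINED_MODELS

print(dlss_available(3840, 2160, True))   # True
print(dlss_available(1920, 1080, False))  # False -> option greyed out in-game
```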
This brings with it a whole lot of questions - how long exactly does NVIDIA's neural network take to train on an entire game's worth of DLSS integration? With linear titles, this is likely a great technology - but apply it to an open-world setting (oh hey, like Metro Exodus) and this seems like an incredibly daunting task. NVIDIA had this to say in its blog post:
For Metro Exodus, we've got an update coming that improves DLSS sharpness and overall image quality across all resolutions that didn't make it into day of launch. We're also training DLSS on a larger cross section of the game, and once these updates are ready you will see another increase in quality. Lastly, we are looking into a few other reported issues, such as with HDR, and will update as soon as we have fixes.
So this not only speaks to NVIDIA recognizing that DLSS image quality isn't at the level it's supposed to be (which implies it can actually degrade image quality, giving further credence to AMD's remarks on the matter), but also confirms that they're constantly working on improving DLSS' performance and image quality - and more interestingly, that this is something they can always change, server-side. I'd question the sustainability of DLSS' usage, though; the number of DLSS-enabled games is low enough as it is - and yet NVIDIA seems to be having difficulty keeping up even when it comes to AAA releases. Imagine if DLSS picked up like NVIDIA would like it to (or would they?) and expanded to most launched games. Looking at what we know, I don't even think that scenario of support would be possible - NVIDIA's neural network would be bottlenecked by all the processing time required for these games, their different rendering resolutions and RTX settings.
DLSS really is a very interesting technology that empowers every user's RTX graphics card with the power of the cloud, as NVIDIA said it would. However, there are certainly some quirks that require more processing time than they've been given, and there are limits to how much processing power NVIDIA can and will dedicate to each title. That the network needs to be trained again and again and again for every new title out there works well in a controlled, NVIDIA-fed games development environment, but that's not the real world - especially not with an AMD-led console market. I'd question DLSS' longevity and success on these factors alone, whilst praising its technology and forward-thinking design immensely.
Source: NVIDIA

112 Comments on NVIDIA: Image Quality for DLSS in Metro Exodus to Be Improved in Further Updates, and the Nature of the Beast

#76
Aquinus
Resident Wat-man
EarthDogRight again. Nobody is saying it will be perfect. But they are saying it will get better.
There is an upper bound to how good it can get, though; that's my point. That's the nature of ML. Accuracy will always be the bane of DLSS because it relies on ML.
#77
EarthDog
Hey hey! 3-3!!!! :)
EarthDogI was simply trying to convey that, over time, things should improve (....) Time will tell how much improvement we will see.
#78
Vayra86
EarthDogRight again. Nobody is saying it will be perfect. But they are saying it will get better.
Imagine if they'd say it would get worse over time... :rolleyes:

There is one certainty: it will not be predictable, and thus not quite as consistent as many people would want or expect. Keep in mind we're just talking about an alternative to AA, but it's being sold as a 'performance improvement with unnoticeable IQ loss'. Well, it's as unnoticeable as a 192kbps MP3 is lossy - sounds like absolute shit.

There is just no way this investment will ever pay off, neither for gamers nor for Nvidia. It is exactly as Aquinus says: you've got the hardware, why not utilize it. Which once again supports my stance on RTX: afterthought material. I'm not sure how many big fat warning signs people need to realize that...
#79
lexluthermiester
Vayra86Well, it's as unnoticeable as a 192kbps MP3 is lossy - sounds like absolute shit.
Here's the flaw in your logic. Your example depends on several things.
1. That you use the correct encoder and not one sloppily put together by nitwits.
2. That the audio being encoded has enough dynamic range to need more than 192kbps.
3. That the listener can actually hear the difference between 192kbps and 256kbps or 320kbps.
See, if you use a crap encoder it doesn't matter what bitrate you use, because it's going to sound like crap anyway. If you use a good encoder but will be playing the music back on a low-quality device or in a car, the encoder/bitrate still won't matter. And if the music is hard rock or dialogue, there will be very little benefit to using much over 192kbps, because the differences will be hard to make out. If you're encoding high-dynamic-range audio (such as classical music), will be playing it back on high-quality equipment, and have a good encoder, then yes, you'll hear the difference between 192kbps and 320kbps.
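For anyone who'd rather test this than argue it, here's one way to produce the same track at several bitrates for a blind A/B comparison - a sketch that assumes ffmpeg with the libmp3lame encoder is installed, with "source.wav" as a placeholder for your own lossless source:

```python
# Encode one lossless source at several MP3 bitrates for blind listening tests.
# Assumes ffmpeg (with libmp3lame) is on PATH; "source.wav" is a placeholder.
import subprocess

for bitrate in ("192k", "256k", "320k"):
    subprocess.run(
        ["ffmpeg", "-y", "-i", "source.wav",
         "-codec:a", "libmp3lame", "-b:a", bitrate,
         f"test_{bitrate}.mp3"],
        check=True,  # raise if the encode fails
    )
```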
Vayra86There is just no way this investment will ever pay off, neither for gamers nor for Nvidia. (...) Which once again supports my stance on RTX: afterthought material.
My point is: the reason your logic concerning RTX is flawed and incorrect is that you fail to see the points that count and matter. RTRT is going to replace the currently used imperfect and unrealistic lighting/rendering methods because ray-tracing is far more realistic. It's also got decades of proven use and learning behind it. The current lighting methods have reached the limit of what they can do and how far they can be taken before they literally become a form of ray-tracing. RTRT is the natural and logical direction development needs to go. The nay-saying you and others are continuing to regurgitate falls flat on its face because it's based on feelings instead of fact- and merit-based logic. RTRT is the future, the writing is on the wall. NVidia knows it, AMD knows it, Intel knows it, why don't you?
#80
Vayra86
lexluthermiesterHere's the flaw in your logic. Your example depends on several things. (...) RTRT is the future, the writing is on the wall. NVidia knows it, AMD knows it, Intel knows it, why don't you?
Ah, the merit-based logic argument.

My logic is the market, and so far it's not moving, despite Nvidia's constant push. There is no buzz. People don't give a shit. And neither do I. Games look fine, and games with RTRT are absolutely not objectively better looking; in fact, in terms of playability they are occasionally objectively worse. AMD knows it? Nah, AMD is waiting for the dust to settle - a very wise choice, albeit one out of pure necessity. Don't mistake a push from the industry for popularity. There have been many innovations that simply didn't get picked up and are now, at the very best, a niche, if they even still exist. You can look at VR as a recent example.

I understand your stance and I recognize it; it's the same enthusiasm as we saw with VR. 'This is the future of gaming' people said. Both VR and RTRT are technologies that require a very high penetration rate to actually gain traction, because the initial expense is high and the competition (regular games) is extremely strong and can make a competitive product at a much lower price.

Don't mistake 'points that count and matter' - to you - as points that are applicable to everyone. The market determines what technologies live or die, and we all represent an equal, tiny portion of that market.
#81
Xzibit
EarthDogRight, yep. I get it. Makes sense to use the hardware they put on the card.

I was simply trying to convey that, over time, things should improve. 3DMark is clearly a best case scenario due to its static and limited FPS in the benchmark, but I don't feel it is misleading/FUD/PR stunt. Time will tell how much improvement we will see.
I still think there is something else to it. People have posted Port Royal comparisons on Reddit (ew). The original non-TAA, non-DLSS textures are much sharper and have greater detail, but somehow the modeling and the fit look off - over-grainy, and something is up with the intersections of models and textures that benefits when it's blurry, i.e. when TAA or DLSS is applied. Without TAA or DLSS, the edges and intersections seem reflective.

The vaseline effect is there, along with the loss of texture detail the games are experiencing in FF, BFV and Metro.
#82
EarthDog
Something else like what?

If I hadn't had a 9900K die on me and spent today testing through 3 boards and endless configs... I would test it.
#83
Xzibit
EarthDogSomething else like what?
Aside from the vaseline effect and texture loss - I would call those the norm for how DLSS behaves.

The something else is (unless I'm missing the explanation) a kind of edge reflectivity and grain you don't see in any of the games. It's like the RT roughness setting is highlighted on the edges, causing them to be reflective. Maybe that's the intent, or a by-product of RT object roughness reflections, but it just looks off.
#84
EarthDog
No clue, bud. All I know is that TPU wouldn't do anything intentionally, as was suggested earlier.
#85
Xzibit
EarthDogNo clue, bud. All I know is that TPU wouldn't do anything intentionally, as was suggested earlier.
Not saying they did. They only showed the promo stuff, TAA vs. DLSS. No one's done much of a comparison outside of that.
#86
64K
Exodus received a patch today. One thing it addressed is DLSS fixes and improvements to sharpness. I don't have the game, so I can't say whether it made much of a difference yet.
#87
jabbadap
Well, check this thread on ResetEra; there are some comparison pics after the patch. The vaseline effect is gone, so at least it's more usable now. They still have to fix HDR and some nasty water shimmering.
#88
Xzibit
jabbadapWell, check this thread on ResetEra; there are some comparison pics after the patch. The vaseline effect is gone, so at least it's more usable now. They still have to fix HDR and some nasty water shimmering.
The softness is still there. Maybe not as much as before (I'd have to see a pre- and post-patch comparison).
#89
lexluthermiester
Vayra86AMD knows it? Nah, AMD is waiting for the dust to settle - a very wise choice, albeit one out of pure necessity.
Or the reason Navi is taking so long is that they are working to add RTRT to it.
Vayra86I understand your stance and I recognize it; it's the same enthusiasm as we saw with VR.
No it isn't. Not by a long shot. RTRT is something that can improve a gaming experience right away, and without cumbersome headgear. VR might someday gain traction if they can make it less cumbersome to use and make a "killer app" for it. Raytracing is already a well-used and proven technology, and now it can be done in real time. RTRT is still in its "growing pains" stage and will continue to improve.
Vayra86'This is the future of gaming' very few people said.
Fixed that for ya.
Vayra86Both VR and RTRT are technologies that require a very high penetration rate to actually gain traction
ALL new tech requires market adoption to succeed. Business 101. Hardly a startling revelation.
Vayra86Don't mistake 'points that count and matter' - to you - as points that are applicable to everyone.
Ah, but they are. Every time you go to the movies or watch TV, in one form or another, the good SFX you see are ray-traced. The points that count and matter already affect everyone, whether they know it or not. Now it has come to gaming, in real time. Arguing against it is a waste of time and energy. The efforts made by you and others nay-saying RTRT are tantamount to trying to stop a tsunami with a foam cup. You can whine and complain all you want, but you can not and will not stop it.

Raytracing is the past, present and future. RTRT is the present and future. Why? Because it mimics nature and looks great when done right. And for similar reasons, DLSS is going to be the future of pixel-edge blending. Anti-aliasing was good for its time, but its time has passed.
#90
Vayra86
lexluthermiesterOr the reason Navi is taking so long is that they are working to add RTRT to it. (...) Raytracing is the past, present and future. RTRT is the present and future.
It's really simple: seeing is believing. Not seeing all that much yet. VR had a very similar problem: content & killer apps.
jabbadapWell, check this thread on ResetEra; there are some comparison pics after the patch. The vaseline effect is gone, so at least it's more usable now. They still have to fix HDR and some nasty water shimmering.
It's a sharpening filter and creates different artifacts. Not really much to write home about...
#91
lexluthermiester
Vayra86It's really simple: seeing is believing.
Yes, and what we've seen so far is impressive and cool.
Vayra86Not seeing all that much yet.
Gotta give it time. The market didn't see benefits from Intel's MMX and SSE tech for over a year. AMD's 3DNow! took about the same time. Developers need time to utilize the new tech, to refine it and make it shine. Cars don't instantly clean, buff and wax themselves. It takes time and hard work, even when you have tools. The nay-sayers need to get off their "instant gratification" high horse and be patient.
#92
Vayra86
lexluthermiesterYes, and what we've seen so far is impressive and cool. (...) The nay-sayers need to get off their "instant gratification" high horse and be patient.
That is just it. I'm very patient, and I think it's too early. That is why I am not buying into Turing.
XzibitI still think there is something else to it. People have posted Port Royal comparisons on Reddit (ew). (...) The vaseline effect is there, along with the loss of texture detail the games are experiencing in FF, BFV and Metro.
Wow, I can even notice the difference at this picture size...
#93
Steevo
lexluthermiesterYes, and what we've seen so far is impressive and cool. (...) The nay-sayers need to get off their "instant gratification" high horse and be patient.
Shitty performance that can be matched with better visuals at 60-70% rendering is impressive and cool? My "instant gratification" didn't happen for other tech like AF until we had perfect AF, which now costs essentially nothing because it was slowly integrated into the hardware pipeline with perfect angle-independent performance. This class of Turing card will never offer that same level of "AA" by nature of it being new and highly untested, and low market adoption makes it the PhysX card of the year. It's the equal of a houseboat: sure, it's a house, it's a boat, but it does neither very well, and all together someone is all wet.
#94
Aquinus
Resident Wat-man
Vayra86There is just no way this investment will ever pay off, neither for gamers nor for Nvidia.
It matters for things like game consoles and mobile gaming, because you want to squeeze every bit of performance out of it just to claim things like "runs games at 4K". The reality is that it's just "smart" resolution scaling, using the tensor cores to scale it more realistically.

I want to make it clear that I don't think that this is bad, it's just disingenuous to call it in any way anti-aliasing, because it's not. It's really just smart scaling. It's the opposite of nVidia's DSR.
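To put "smart scaling" into numbers - assuming the commonly reported internal render resolution of 1440p for DLSS at a 4K output, a figure NVIDIA can vary per title:

```python
# Back-of-envelope pixel arithmetic behind DLSS-as-upscaling (reported figures).
output_pixels = 3840 * 2160    # the 4K frame the user sees: 8,294,400 px
internal_pixels = 2560 * 1440  # reported internal render at "4K DLSS": 3,686,400 px
print(f"Fraction actually rendered: {internal_pixels / output_pixels:.0%}")  # ~44%
# DSR is the mirror image: render above native resolution, then downsample.
```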
#95
Vayra86
AquinusIt matters for things like game consoles and mobile gaming, because you want to squeeze every bit of performance out of it just to claim things like "runs games at 4K". (...) It's really just smart scaling. It's the opposite of nVidia's DSR.
Fully agreed.
#96
lexluthermiester
SteevoShitty performance that can be matched with better visuals at 60-70% rendering is impressive and cool?
Have you actually seen the effect in question? 80-90 FPS is not "shitty". BF5 is not that great a game, but it does show what can be done, to great effect, with raytracing. Maybe you're not seeing the subtleties?
SteevoMy "instant gratification" didn't happen for other tech like AF until we had perfect AF, which now costs essentially nothing because it was slowly integrated into the hardware pipeline with perfect angle-independent performance
But that illustrates perfectly the point I was making: AF took time to perfect. That doesn't mean early implementations were perfect, but they were good enough to show the benefit on offer.
Steevothis class of Turing card will never offer that same level of "AA" by nature of it being new
That is not logical as stated.
Steevoand low market adoption makes it the PhysX card of the year
Except that 2080 Tis are still selling out and the RTX series is one of the fastest-selling lines of GPUs in history. And "PhysX" functionality is literally built into every GPU NVidia makes, making it one of the most adopted feature sets in the world.
#97
Xzibit
lexluthermiesterHave you actually seen the effect in question? (...) Except that 2080 Tis are still selling out and the RTX series is one of the fastest-selling lines of GPUs in history.
Did we miss something you saw? Just asking, since NVIDIA's gaming segment was down 46% since the intro of RTX. Unless we all missed something in their financial report.

The CEO is telling us it will take three times longer to sell cards, and here you're saying they are the fastest sellers in the company's GPU history.
#98
lexluthermiester
XzibitThe CEO is telling us it will take three times longer to sell cards, and here you're saying they are the fastest sellers in the company's GPU history.
Can you remember a time when cards were sold out consistently for a 6-month period? That's never happened before, IIRC, even with the GTX 1000-series cards. I have done more upgrades to RTX this generation than I've ever done before, with the exception of the release of the Radeon 9700. Now maybe, just maybe, it's something weird going on in my part of the country. I doubt it, but maybe.
#99
Xzibit
lexluthermiesterCan you remember a time when cards were sold out consistently for a 6-month period? (...)
Well, the CEO shouldn't lie unless he wants to get sued by the SEC and possibly be booted out of his own company. It's hard to believe someone who makes a statement contradicting what we just heard in a financial report.
#100
lexluthermiester
XzibitWell, the CEO shouldn't lie unless he wants to get sued by the SEC and possibly be booted out of his own company. (...)
Has the thought occurred to you that both could be true?