
Editorial: NVIDIA DLSS and its Surprising Resolution Limitations

Now the milked RTX owners need to do the maths on which monitors to buy if they want to use DLSS. :D
 
Eh, pretentious nonsense.
 
@VSG are you able to determine if it is the driver imposing the limitation, the RTX API, or the game itself? If the game itself has all of this extra code baked in, that is very concerning. For example, what happens 20 years from now with new cards on old games? It breaks the decades-old paradigm of putting the options in the hands of the players. That doesn't sit right with me.
 
@VSG are you able to determine if it is the driver imposing the limitation, the RTX API, or the game itself? If the game itself has all of this extra code baked in, that is very concerning. For example, what happens 20 years from now with new cards on old games? It breaks the decades-old paradigm of putting the options in the hands of the players. That doesn't sit right with me.
Considering this differs between Metro and BFV, it's either in the game or in the game profile in the driver.
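For illustration, here's roughly what a baked-in gate could look like if it lives game-side - everything below (names, table entries, thresholds) is invented, and the same check could just as well sit in the driver's per-game profile:

```cpp
// Hypothetical sketch only: a baked-in (GPU, resolution) whitelist gating the
// DLSS toggle. Table entries and names are invented; the real mechanism could
// just as well sit in the driver's per-game profile.
#include <cstdint>
#include <iostream>
#include <string_view>

struct DlssRule {
    const char* gpu;    // marketing name
    uint32_t minWidth;  // lowest output width DLSS is offered at
};

// Made-up table mirroring the observed pattern: the faster the card, the
// higher the resolution has to be before DLSS shows up.
constexpr DlssRule kRules[] = {
    { "RTX 2060",    1920 },  // offered from 1080p up
    { "RTX 2080",    2560 },  // from 1440p up
    { "RTX 2080 Ti", 3840 },  // 4K only
};

bool dlssAvailable(std::string_view gpu, uint32_t width) {
    for (const auto& r : kRules)
        if (gpu == r.gpu)
            return width >= r.minWidth;
    return false;  // unknown GPU: hide the option
}

int main() {
    std::cout << std::boolalpha
              << dlssAvailable("RTX 2080 Ti", 1920) << '\n'   // false
              << dlssAvailable("RTX 2080 Ti", 3840) << '\n';  // true
}
```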
 
such that if a GPU is not being challenged enough, DLSS is not going to be made available [...]
It would thus be fair to extrapolate why the RTX 2080 Ti does not get to enjoy DLSS at lower resolutions, where perhaps it is not being taxed as hard.

The concept of buying the most powerful card money can buy (at $1200+) and finding out it can't run games with the proudly advertised proprietary premium flagship features (that even a 2060 can), because... it's just too powerful for that (?!)... is just mind-boggling...

Typical buyer (at least inside the mind of nVidia):
- "This new tech sounds awesome! No way I gonna miss out on this bandwagon. Just to be safe I'll buy a more expensive model, so I can enjoy this new stuff and not have to worry about performance hits, resolutions or whatever. Unlimited power!"
Yep, bamboozled.
 
One thing I just thought of... FFXV supports DLSS across all resolutions and gets high fps with a 2080 Ti at 1080p... so... is it really an fps limitation?? Can't say I buy that, considering...
 
One thing I just thought of... FFXV supports DLSS across all resolutions and gets high fps with a 2080 Ti at 1080p... so... is it really an fps limitation?? Can't say I buy that, considering...

No, it's a budget limitation. Remember, it's a deep learning process. Nvidia isn't putting whole farms on DLSS, just the bare minimum they need to get acceptable results. Why do you think it's a slow trickle of DLSS-enabled games? Why do you think it's completely inconsistent in its implementation? Nvidia is finding ways to push this feature without massively exceeding the budget available for it, while providing somewhat acceptable results, AND not showing too much of what is going on under the hood (@Steevo 's very plausible explanation of latency issues).
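To put toy numbers on that budget argument - every figure below is invented, only the scaling is the point: training cost grows with games × resolutions, so pruning the support matrix is the obvious saving:

```cpp
// Back-of-the-envelope sketch of the budget argument. All numbers are made up;
// only the scaling matters: cost grows with games x resolutions trained.
#include <iostream>

int main() {
    const int games       = 25;   // hypothetical DLSS pipeline of titles
    const int resolutions = 3;    // 1080p / 1440p / 4K
    const int gpuHours    = 500;  // invented training cost per combination
    const double rate     = 2.0;  // invented cloud price per GPU-hour, USD

    const int full   = games * resolutions;
    const int pruned = full - games;  // drop one resolution tier per title
    std::cout << "Full matrix:   $" << full   * gpuHours * rate << '\n';
    std::cout << "Pruned matrix: $" << pruned * gpuHours * rate << '\n';
    // Cutting one resolution per game trims a third of the bill without
    // touching the combinations buyers are most likely to actually use.
}
```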

This is the core of my RTX-hate. Cost. DLSS has a very shaky business case, and RTX even more so - both for the industry and for Nvidia itself. It's a massive risk for everyone involved, and for all the work this deep learning and brute forcing is supposed to 'save', other work is created that keeps Nvidia busy and developers struggling. And the net profit? I don't know... looking at those Metro screenshots I sure as hell don't prefer the RTX world they show, and the benefit of DLSS is far too situational.
 
looking at those Metro screenshots I sure as hell don't prefer the RTX world they show, and the benefit of DLSS is far too situational.
Remember that Metro was built with some other AO tech. They did not have RTX until rather late in the development process. What the lighting looks like at this point is not really a technical discussion but an art one - did level/art designers know what the level would look like with the RTX AO when they created it?

Physically and logically the screenshots with RTX do look more correct, but that has no direct bearing on looking better or being more playable. As you said, some of these screenshots - especially of closed-off indoor areas - are too dark. If you think about it, it makes perfect sense. But it does not make the area good to go through in the game if you cannot see anything.
 
Remember that Metro was built with some other AO tech. They did not have RTX until rather late in the development process. What the lighting looks like at this point is not really a technical discussion but an art one - did level/art designers know what the level would look like with the RTX AO when they created it?

Physically and logically the screenshots with RTX do look more correct, but that has no direct bearing on looking better or being more playable. As you said, some of these screenshots - especially of closed-off indoor areas - are too dark. If you think about it, it makes perfect sense. But it does not make the area good to go through in the game if you cannot see anything.

So what's next, an RT future with carefully placed 'soap opera' scenes you walk through, perfectly lit like a movie set? ;) I can totally see it happening and being sold as 'realism' :D

The more I see of this technology and its effects, the more convinced I get that it really serves a niche, if that, at the very best. It's unusable in competitive gaming, it's not very practical in any dark (immersive?) gameplay setting - which happens to be the vast majority of them - and it doesn't play well with existing lighting systems either.

Dead end is dead...
 
So what's next, an RT future with carefully placed 'soap opera' scenes you walk through, perfectly lit like a movie set? ;) I can totally see it happening and being sold as 'realism' :D
What do you mean? Game levels pretty much are a movie set. Set pieces are perfectly placed, lighting is tuned to look its best and the moment you meddle with something in there, it will look out of whack.
 
What do you mean? Game levels pretty much are a movie set. Set pieces are perfectly placed, lighting is tuned to look its best and the moment you meddle with something in there, it will look out of whack.

You said it. Lighting is tuned. RT depends not on tuning but on the actual light sources and their intensity. You can't tune much without tuning it everywhere in the same way.
 
You said it. Lighting is tuned. RT depends not on tuning but on the actual light sources and their intensity. You can't tune much without tuning it everywhere in the same way.
That is why I said it is a level design/art question. You simply add light sources where they are needed.
 
You said it. Lighting is tuned. RT depends not on tuning but on the actual light sources and their intensity. You can't tune much without tuning it everywhere in the same way.
Actually, you can tune it. You can tune the number, intensity and placement of individual light sources.
There is one light source that can't have its placement changed and that's the Sun (global illumination). Incidentally, global illumination is exactly what Metro uses RTX for. As cards become more powerful, they'll be able to handle both global and point light sources. But even then you'll still be able to claim rasterization looks better to you because... well... there's no metric for that.
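A minimal sketch of that tuning idea (my own illustration, not from any shipping engine; all names and values are invented) - under RT the artist's knobs are the physical light parameters themselves, so a too-dark room gets fixed by adjusting lights there instead of repainting a baked lightmap:

```cpp
// Illustrative only: per-light parameters a level designer can tune, which a
// ray/path tracer consumes directly. All names and values are invented.
#include <vector>

struct PointLight {
    float position[3];  // placement - movable per scene
    float color[3];     // tint
    float intensity;    // radiant power - the main tuning knob
};

struct Scene {
    std::vector<PointLight> lights;
};

int main() {
    Scene s;
    s.lights.push_back({{0.f, 3.f, 0.f}, {1.f, 0.9f, 0.8f}, 40.f});  // warm ceiling lamp
    s.lights[0].intensity *= 1.5f;  // designer's fix for a too-dark corner
    // The sun/global illumination stays where it is; point lights like the
    // one above are the per-room levers the posts describe.
}
```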
 
Actually, you can tune it. You can tune the number, intensity and placement of individual light sources.
There is one light source that can't have its placement changed and that's the Sun (global illumination). Incidentally, global illumination is exactly what Metro uses RTX for. As cards become more powerful, they'll be able to handle both global and point light sources. But even then you'll still be able to claim rasterization looks better to you because... well... there's no metric for that.

Seeing is believing - what I'm seeing today is nowhere even close to a benefit to image quality. Or do you think this is a leap forward? Either in BFV or Metro? And 'as cards become more powerful' - you're already looking at a >700mm² GPU for the current performance. Good luck with that, with one or two node shrinks to go.

Honestly, if it objectively looks better I'm converted, but so far the only instance of that has been a tech demo. And only ONE of them at that; the rest wasn't impressive at all, using low-poly models or simple shapes just to show it can be done in real time.
 
Seeing is believing - what I'm seeing today is nowhere even close to a benefit to image quality. Or do you think this is a leap forward? Either in BFV or Metro?
Have not seen Metro beyond screenshots. In BFV, DXR reflections are a benefit to image quality. Situational and ill-placed in a multiplayer shooter due to the performance hit, but a definite benefit.
 
So, if FF XV has the ability to run DLSS at lower res and high FPS... why not an AAA title like BF V? It makes no sense at all that this is a monetary or FPS limitation considering what we see around us. As time goes on and we see more implementations (remember, the only things scheduled to come out in 2018 were SOTR and BFV... we will see more as the year goes on, according to the lists published in 9-2018), we'll be able to see how this shakes out.

But yeah, interesting editorial... thought-provoking. :)
 
Good luck with that, with one or two node shrinks to go.

I think next gen will tell us where RTX is heading. Someone, I think @notb, pointed out in a previous thread that we are averaging about 7 DX12 titles per year and it has been declining since 2016. Is it too complicated? I don't think so, as it was supposed to bring console-level development to PC. Is DX12 too expensive to develop for? Likely. Add to that now having to reimagine your lighting environment, and there will definitely be a curve. That's not to mention the cards aren't really powerful enough to take advantage of what RTX was meant for. Both of these should be mitigated with another year of working with it and maybe another gen of more powerful RTX.

I keep harping on it, but all the virtues of DX12 we were told were coming to games have not happened yet, so I don't hold my breath for these. Of the new tech introduced, I think DLSS has the best chance to be successful as it gets tuned.

So, if FF XV has the ability to run DLSS at lower res and high FPS... why not an AAA title like BF V?

I don't understand what the benefit would be, honestly. If you are already at high frames, what do you need it for? At this point, it lowers image quality to the point where you could tweak your settings to achieve the same effect.

Would it be simply to say that I run 'ultra' vs saying I run 'mostly high'?
 
So, if FF XV has the ability to run DLSS at lower res and high FPS... why not an AAA title like BF V? It makes no sense at all that this is a monetary or FPS limitation considering what we see around us. As time goes on and we see more implementations (remember, the only things scheduled to come out in 2018 were SOTR and BFV... we will see more as the year goes on, according to the lists published in 9-2018).

But yeah, interesting editorial... thought-provoking. :)

FFXV was the first one to get the treatment, so they went all the way. BFV was an RTX 'launch' title, and Metro Exodus is neither of those and has the lowest degree of support thus far (and the longest dev cycle since the RTX launch!). I don't see how this defeats the budget argument (yet - as you say, time will tell us more).

Another aspect I think we might overlook is the engine itself. I can totally understand that some engines are more rigid in their resolution scaling and support/options/features. Case in point, Metro Exodus: it can only run fullscreen at specific resolutions locked to the desktop res.

I think next gen will tell us where RTX is heading. Someone, I think @notb, pointed out in a previous thread that we are averaging about 7 DX12 titles per year and it has been declining since 2016. Is it too complicated? I don't think so, as it was supposed to bring console-level development to PC. Is DX12 too expensive to develop for? Likely. Add to that now having to reimagine your lighting environment, and there will definitely be a curve. That's not to mention the cards aren't really powerful enough to take advantage of what RTX was meant for. Both of these should be mitigated with another year of working with it and maybe another gen of more powerful RTX.

I keep harping on it, but all the virtues of DX12 we were told were coming to games have not happened yet, so I don't hold my breath for these. Of the new tech introduced, I think DLSS has the best chance to be successful as it gets tuned.

Spot on. It's all about money in the end. That is also why I think there is a cost aspect to DLSS, and adoption rate ties into that in the most direct way for Nvidia. And it also supports my belief that this tech won't take off under DXR. The lacking adoption of DX12 has everything to do with workforce and money, too, and not with complexity. If it's viable, complexity doesn't matter, because all that is, is additional time (and time is money). Coding games for two radically different render techniques concurrently is another major time sink.
 
What happens when you cap the frame rate or force higher AA to keep the frame rate lower? Is the use of DLSS predetermined? Also, how does DSR play into any of this? It honestly just seems and feels as if there is more to this than Nvidia lets on. It also makes the marketing of DLSS feel like a serious case of snake oil or bait and switch. I can start to see why it's basically been not much more than a tech demo: if it were better and easier to use and implement in beneficial ways, we'd probably be seeing more widespread use of it by now rather than 2 or 3 AAA studios. Sure it's new, but even among AAA studios it's scarcely been demonstrated to date, and they'd have had earlier access than the public to its inner workings and how to get ready to implement it as well.

This really lowers the appeal of it and limits its usage, especially for the weaker cards, though it's limiting for the stronger cards as well. It's like they are trying to tie it strictly to ultra-high resolution and/or RTRT only. The thing is, I don't think people who bought the cards with DLSS in mind bought them with that handicap in the back of their heads; this is almost akin to the 3.5 GB issue if it's as bad as it sounds.
 
I don't understand what the benefit would be, honestly. If you are already at high frames, what do you need it for? At this point, it lowers image quality to the point where you could tweak your settings to achieve the same effect.

Would it be simply to say that I run 'ultra' vs saying I run 'mostly high'?
It looks like the cutoff for BF V, give or take a few FPS, is around 60-70 FPS, no? I wouldn't call that high FPS. Some want 75/120/144/165 FPS/Hz. Many who do sacrifice IQ anyway to get there... so this is simply another option?
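A cutoff like that is at least arithmetically plausible if DLSS carries a roughly fixed per-frame upscaling cost. A toy model, with both constants invented purely for illustration:

```cpp
// Toy model of an fps cutoff. Both constants are invented; the shape of the
// result is the point: a fixed per-frame cost matters more the shorter the frame.
#include <initializer_list>
#include <iostream>

int main() {
    const double dlssCostMs = 5.0;   // hypothetical fixed upscaling cost per frame
    const double shadingCut = 0.30;  // hypothetical 30% shading-time saving

    for (double fps : {40.0, 60.0, 90.0, 144.0}) {
        double nativeMs = 1000.0 / fps;
        double dlssMs   = nativeMs * (1.0 - shadingCut) + dlssCostMs;
        std::cout << fps << " fps native -> " << 1000.0 / dlssMs
                  << " fps with DLSS\n";
    }
    // With these numbers the break-even lands at exactly 60 fps: below it DLSS
    // adds frames, above it the fixed cost turns the feature into a loss.
}
```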
 
I keep harping on it, but all the virtues of DX12 we were told were coming to games have not happened yet, so I don't hold my breath for these.
To get the benefits from DX12 (or Vulkan) you need damn good developers. And, due to the lower-level API, you are required to do a lot of the legwork yourself, which includes writing (pieces of) rendering paths that are better for GPUs of one vendor or another. Or alternatively, writing several. This adds to the time and cost of developing a DX12 engine or game. There is really a limited number of examples out there where DX12 is a complete benefit.

Sniper Elite 4, Shadow of the Tomb Raider. With some concessions, Rise of the Tomb Raider and Hitman with the latest patches and drivers.
There are some DX12-only games with good performance, like the Forzas or Gears of War 4, but we have no other APIs in them to compare against.
Metro Exodus seems to be one of the titles where DX12 at least does not hurt, which is a good thing (as weird as that may sound).
I am probably missing 1-2 games here but not really more than that.

As an AMD card user, there are a bunch of games where you can get a benefit from using DX12, but not from all DX12 games. The Division, Deus Ex: Mankind Divided, the Battlefields.
As an Nvidia card user, you can generally stick to DX11 for the best results. With very few exceptions.
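To make the 'legwork' concrete, a minimal sketch (my own illustration, not from any engine) of the very first step of a vendor-specific path - reading the PCI vendor ID off the DXGI adapter and branching on it; real engines go much further, down to separate shader variants and barrier strategies:

```cpp
// Minimal sketch of picking a vendor-tuned render path from the DXGI adapter's
// PCI vendor ID. Win32/D3D only; error handling and the actual paths are omitted.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    if (factory->EnumAdapters(0, &adapter) == S_OK) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        switch (desc.VendorId) {          // PCI vendor IDs
            case 0x10DE: std::puts("NVIDIA path");  break;
            case 0x1002: std::puts("AMD path");     break;
            case 0x8086: std::puts("Intel path");   break;
            default:     std::puts("generic path"); break;
        }
        adapter->Release();
    }
    factory->Release();
}
```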
 
RTX really means, Real Time Xtra complications.
 
It looks like the cutoff for BF V, give or take a few FPS, is around 60-70 FPS, no? I wouldn't call that high FPS. Some want 75/120/144/165 FPS/Hz. Many who do sacrifice IQ anyway to get there... so this is simply another option?

Agreed, I think the restrictions are currently because NV is doing the learning where it will make the most impact.

To get the benefits from DX12 (or Vulkan) you need damn good developers. And, due to the lower-level API, you are required to do a lot of the legwork yourself, which includes writing (pieces of) rendering paths that are better for GPUs of one vendor or another. Or alternatively, writing several. This adds to the time and cost of developing a DX12 engine or game. There is really a limited number of examples out there where DX12 is a complete benefit.

Sniper Elite 4, Shadow of the Tomb Raider. With some concessions, Rise of the Tomb Raider and Hitman with the latest patches and drivers.
There are some DX12-only games with good performance, like the Forzas or Gears of War 4, but we have no other APIs in them to compare against.
Metro Exodus seems to be one of the titles where DX12 at least does not hurt, which is a good thing (as weird as that may sound).
I am probably missing 1-2 games here but not really more than that.

As an AMD card user, there are a bunch of games where you can get a benefit from using DX12, but not from all DX12 games. The Division, Deus Ex: Mankind Divided, the Battlefields.
As an Nvidia card user, you can generally stick to DX11 for the best results. With very few exceptions.

So considering everything you just wrote, what does that say for the likely implementation of RTX going forward? Your last sentence spells it out best.
 
Spot on. It's all about money in the end. That is also why I think there is a cost aspect to DLSS, and adoption rate ties into that in the most direct way for Nvidia. And it also supports my belief that this tech won't take off under DXR. The lacking adoption of DX12 has everything to do with workforce and money, too, and not with complexity. If it's viable, complexity doesn't matter, because all that is, is additional time (and time is money).
There's also another way to look at this problem.
DX12 hasn't really offered anything interesting for the consumer. A few fps more is not enough. And yes, it's harder to code and makes game development more expensive.
RTRT could be the thing that DX12 needs to become an actual next standard.
 
Seeing is believing - what I'm seeing today is nowhere even close to a benefit to image quality. Or do you think this is a leap forward? Either in BFV or Metro? And 'as cards become more powerful' - you're already looking at a >700mm² GPU for the current performance. Good luck with that, with one or two node shrinks to go.
It's not only the fab node (though that is a big part of the equation). With the data collected from Turing, it's possible to fine-tune the hardware resources (e.g. the ratio of CUDA cores : tensor cores : RT cores, or something like that).
 