
NVIDIA Updates Cyberpunk 2077, Minecraft RTX, and 4 More Games with DLSS

I swear the DLSS and RT conversation is so reminiscent of the PhysX conversation back when some Batman and COD games had it, but where is PhysX now?
 
In games like Control or Death Stranding, 4K DLSS looks a lot better than 4K native: better definition, better detail in textures, and so on

IMO that's not correct. Control with DLSS is a blurry mess: artifacts, blurry textures (look at the signs). It's not anywhere near a good-looking game at native res by any means. Take a look at Gears 5 with ultra textures, for example. That's a proper, modern game graphically speaking, which looks stunning without RT/DLSS features and runs at a solid 100 fps at 1440p ultra on an RTX 2080.

But don't get me wrong, I think DLSS is a great feature to have on hand if you have an underpowered GPU for the latest and greatest games, allowing you to run, for example, 1440p High with DLSS instead of native 1080p. In that situation, DLSS 1440p will look a lot better than running 1080p native on a 1440p monitor. It's also great if you want more fps at the same resolution, but please don't spread the word that DLSS is the holy grail of game graphics, because it's not. A well-developed game will always look better at native resolution than with any upscaling method.

We are in the infancy of RT. I think it's the future, but right now we don't have the GPU power to make it viable at decent framerates without DLSS, and sometimes even with DLSS enabled the performance is not there. We also need devs to make clever use of RT, because in several games the effects are so subtle and the performance hit so high that RT is only included to tick a box on the game's spec sheet.
 
Welp, then just game at 1440p and claim it's better than 4k, dude... :D

I'm not claiming anything - it just legitimately doesn't look any different to me. On the AMD side, I actually think Virtual Super Resolution + RIS is amazing for games that can't hit a specific FPS target at 4K.

Free FPS with no perceptible IQ loss is kind of awesome.
 
Not sure if you understand: 4K DLSS is still 4K, all the pixels are there.
Normal upscaling is like stretching a JPEG image to make it bigger; it will look blurry. DLSS is like stretching a PDF file; it will look sharp no matter how much you stretch it, because all the missing detail is filled in (reconstructed).

Another benefit of DLSS is that monitors only deliver their best image quality at their native resolution; running a non-native resolution will make the image look blurry. Compare IQ between a 4K monitor running upscaled 1440p and a native 1440p screen: the 1440p screen will look better.
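
To illustrate just the "stretching a JPEG" half of that comparison, here is a minimal Python sketch of plain nearest-neighbour upscaling. It shows only the naive baseline DLSS is being contrasted with; DLSS itself is a trained reconstruction pass fed with motion vectors, not anything this simple.

# Naive "stretch the JPEG" upscaling: no new detail is created, existing pixels
# are just repeated. This is the baseline DLSS reconstruction is compared against.
def nearest_neighbour_upscale(img, out_w, out_h):
    """img: list of rows of pixel values at the source resolution."""
    src_h, src_w = len(img), len(img[0])
    out = []
    for y in range(out_h):
        src_y = y * src_h // out_h  # map each output row back to a source row
        out.append([img[src_y][x * src_w // out_w] for x in range(out_w)])
    return out

# e.g. stretching a 2560x1440 frame to 3840x2160 reuses every source pixel
# about 2.25 times on average, which is where the softness/blockiness comes from.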

Not quite right on the first part, but I agree with the rest

4K with DLSS is a 4K 2D output from a scaled/stretched 1440p (I believe) 3D render.
It's like loading a 1080p video on a 4K screen: sure, all the pixels are lit up and in use, but some are just using repeated information. The key to DLSS is that it's tweaked to look better than typical examples of this.

I'm all for this tech, as it allows us to have a good desktop resolution without the gaming performance hit... but it'll never be quite as good as native resolution (we seem to agree on that)

Some people may well like DLSS and say it looks better depending on what they see - some people want 32x TAA so not a single jaggy is in sight, while others are offended by the blurry edges it would cause.
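
Some rough numbers behind the "scaled/stretched 1440p" point above, assuming the commonly cited internal resolution for DLSS Quality at a 4K output (individual games can tune this):

# Pixels actually shaded per frame vs. pixels displayed (assumed figures):
native_4k      = 3840 * 2160   # 8,294,400 pixels at native 4K
internal_1440p = 2560 * 1440   # 3,686,400 pixels rendered internally
print(internal_1440p / native_4k)  # ~0.44, so over half of the 4K output
                                   # is reconstructed rather than rendered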
 
In my opinion, DLSS is a game changer purely from the standpoint that it's able to run a game faster at a given "resolution", and in some cases with sharper textures. In the case of Cyberpunk, do note that Performance mode may mean an internal 1080p or 720p, and I don't believe upscaling from there will give you an image similar to 4K. It is a clever software trick, but it is not a miracle; something has to give. Most people generally compare Quality mode, which is likely an upscale from 1440p to 4K, so I think the quality loss is much easier to compensate for than when upscaling from 1080p or 720p.

In any case, Cyberpunk just revealed how underpowered the current generation of GPUs is when it comes to using multiple ray tracing techniques at the same time. Previous AAA RT titles mostly confined RT to shadows, lighting, or reflections, so the performance penalty was still acceptable. Ultimately, gamers will need to decide whether they want to play Cyberpunk at higher resolution and FPS or with RT on.
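
To put rough numbers on the Quality/Performance mode figures mentioned above, here is a small sketch of how the DLSS 2.0 mode names are commonly reported to map to internal render resolutions; treat the scale factors as approximations, since they can be tuned per title:

# Commonly reported DLSS 2.0 per-axis render scales (approximate):
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58,
              "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# At a 3840x2160 output: Quality -> 2560x1440, Performance -> 1920x1080,
# Ultra Performance -> 1280x720, matching the 1440p/1080p/720p figures above.
print(internal_resolution(3840, 2160, "Quality"))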
 
I want to game at 165Hz, so RT is going in the bin
 
Not quite right on the first part, but I agree with the rest

4K with DLSS is a 4K 2D output from a scaled/stretched 1440p (I believe) 3D render.
It's like loading a 1080p video on a 4K screen: sure, all the pixels are lit up and in use, but some are just using repeated information. The key to DLSS is that it's tweaked to look better than typical examples of this.

I'm all for this tech, as it allows us to have a good desktop resolution without the gaming performance hit... but it'll never be quite as good as native resolution (we seem to agree on that)

Some people may well like DLSS and say it looks better depending on what they see - some people want 32x TAA so not a single jaggy is in sight, while others are offended by the blurry edges it would cause.

Now that really depends on whether you are using any AA technique at native resolution.
_Native res without AA - jaggies; in some games they can be quite distracting even on high-PPI screens.
_Native res with MSAA - highest IQ but low FPS.
_Native res with TAA - no jaggies but lower IQ.
_Native res with DLSS - IQ between TAA and MSAA with a massive performance improvement, though there is the occasional visual anomaly.

With the performance boost that comes with DLSS 2.0, though, just enable it in any game that supports it. Not using it is like not using Autopilot when you drive a Tesla :D.
 
I just see it as: DLSS with max settings vs DLSS off with some lower settings

Pick what pleases you... but seeing all these games barely manage 60 FPS is laughable, I want my 144+ damnit
 
I just see it as: DLSS with max settings vs DLSS off with some lower settings

Pick what pleases you... but seeing all these games barely manage 60 FPS is laughable, I want my 144+ damnit

DLSS OFF with a lot of lower settings :D.
Well, nothing's stopping you from breaking the game's graphics to get your dose of 144+ though; try DLSS Ultra Performance :roll: with everything on Low.
 
DLSS OFF with a lot of lower settings :D.
Well, nothing's stopping you from breaking the game's graphics to get your dose of 144+ though; try DLSS Ultra Performance :roll: with everything on Low.

Depends whether my 3080 arrives before the game finishes downloading... on my 1080 I'll be expecting low to medium from what I've been seeing
 
Depends whether my 3080 arrives before the game finishes downloading... on my 1080 I'll be expecting low to medium from what I've been seeing

Well, my laptop with a 2070 Super Max-Q should perform around GTX 1080 level, so I should be able to play at 1080p Ultra with DLSS Quality :D, just waiting for the download to finish now.
 
I'm not seeing a difference here... what did DLSS 2 do to it?

On a zoomed-in shot like that, there's maybe a tiny color difference in the blood in the hair?

It's not even the exact same screenshot; any kind of in-game lighting change could have altered the picture already. It's just medi01 being medi01, and DLSS is really the last bastion to whine about this gen.

I think the technical and performance advantage of DLSS is apparent. The problem is implementation. I'm hoping this will go the G-Sync way, with a widely available version broadly applied. Because again, we can explain this in two ways: either Nvidia is doing it right but only for itself, or AMD is late to the party. It's clear as day we want this everywhere, and if both red and green are serious about RT, they will NEED it.

Depends whether my 3080 arrives before the game finishes downloading... on my 1080 I'll be expecting low to medium from what I've been seeing

Yep.... I'm scared tbh :p But it seems like we can still squeeze acceptable perf from it. Only just.
 
it just legitimately doesn't look any different to me.
Oh, please...
And perhaps you legitimately should just lower resolution to 1440p.... ;)

It's not even the exact same screenshot,
It is the screenshot being used to illustrate how good DLSS is... Good to know it's "not the same screenshot" once issues that TAA is known for pop up... :D
 
Oh, please...
And perhaps you legitimately should just lower resolution to 1440p.... ;)
Why would I do that when I can run 1440p upscaled to 4K?

Plus, can you even tell the difference on that 40" 1080p? Or does everything look like a vaseline smear? ;)


It's not even the exact same screenshot; any kind of in-game lighting change could have altered the picture already. It's just medi01 being medi01, and DLSS is really the last bastion to whine about this gen.

I think the technical and performance advantage of DLSS is apparent. The problem is implementation. I'm hoping this will go the G-Sync way, with a widely available version broadly applied. Because again, we can explain this in two ways: either Nvidia is doing it right but only for itself, or AMD is late to the party. It's clear as day we want this everywhere, and if both red and green are serious about RT, they will NEED it.


The weird thing is AMD isn't THAT late to the party - you can absolutely use super resolution and then Radeon Image Sharpening to get almost the same effect; it's just not marketed at all, and not packaged up nicely with a toggle like DLSS. The one advantage it does have is that you don't need to drop to 1440p: you can drop to something like 1800p, let the GPU upscale and sharpen, and get an easy 15% fps boost with minimal quality drop.

DLSS is awesome right now -- going to be playing some Cyberpunk on my 4K TV, and there is no way the 2080 Ti would make it a fun time without DLSS, even with normal rasterization, and upgrading the card is not even an option since there are no cards to buy lol.
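
The 15% figure for the 1800p trick mentioned above lines up with simple pixel-count math, assuming frame time scales roughly with the number of shaded pixels (optimistic, but close enough for pixel-bound games):

# Rough pixel-count math for rendering at 1800p instead of native 2160p:
px_2160p = 3840 * 2160   # 8,294,400 pixels
px_1800p = 3200 * 1800   # 5,760,000 pixels
print(px_1800p / px_2160p)   # ~0.69 -> roughly 30% fewer pixels to shade,
                             # so a 15%+ fps gain is plausible even before
                             # accounting for work that doesn't scale with pixels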
 
Why would I do that when I can run 1440p upscaled to 4K?

Plus, can you even tell the difference on that 40" 1080p? Or does everything look like a vaseline smear? ;)

The weird thing is AMD isn't THAT late to the party - you can absolutely use super resolution and then Radeon Image Sharpening to get almost the same effect; it's just not marketed at all, and not packaged up nicely with a toggle like DLSS. The one advantage it does have is that you don't need to drop to 1440p: you can drop to something like 1800p, let the GPU upscale and sharpen, and get an easy 15% fps boost with minimal quality drop.

DLSS is awesome right now -- going to be playing some Cyberpunk on my 4K TV, and there is no way the 2080 Ti would make it a fun time without DLSS, even with normal rasterization, and upgrading the card is not even an option since there are no cards to buy lol.

I'm enjoying AMD's tech right now on my good old 1080 and it works a charm. Yes, it's not as sharp as native, but in motion it's really quite fine and the perf boost is tremendous. Posted a comparison shot in the CBP topic. And no, it's not the exact same pic either :D

It is the screenshot being used to illustrate how good DLSS is... Good to know it's "not the same screenshot" once issues that TAA is known for pop up... :D

Hey, bingo! TAA... is not DLSS.
I don't care what the pics are used for in anyone's mind; all I see is two shots that are not identical - just a frame with different content, taken from a slightly adjusted angle, or captured a millisecond earlier or later. You can see this because the position of many objects is different.

At that point all bets are off when looking at still imagery. But we can still see at first glance (!) that the images are virtually identical, to the degree that you need to actually pause and look carefully to find differences.

You're quite the expert at making mountains out of molehills, aren't you?
 
Made a Death Stranding clip for people who think native resolution (without AA) looks better than DLSS (Quality); this was recorded on my laptop.


Basically DLSS 2.0 preserves all the detail but gets rid of all the shimmering caused by aliased edges (grass, rocks, buildings, etc.), while boosting performance by 25%.
YouTube compression makes it pretty hard to tell the IQ difference though :roll:
 
Because many things, your monitor among them, can upscale it to 4K, oh, and without blurring things or a performance penalty... :)))


Amazing is how certain brains work.... :D


Sure, John...
And, dear god, that green reality distortion field is some scary shit.

Oh well, this guy can explain better than I can why DLSS is better than native without AA:

Watch at the 7:35 mark; it's the same situation as with Death Stranding, too much shimmering happens at native res without AA.
Better IQ + a 40% perf uplift. Cyberpunk 2077 really punishes non-RTX owners with its DLSS implementation there :roll:
 
As long as the end result is comparable, who cares. Work smarter, not harder. Don't get too hung up on how things were done in the past.

Again, I'm not; I'm talking about it not being fair to compare 4K performance when one of the cards is not actually running 4K.
Again, if you compared performance at 1080p but then said that the one running 1080p DLSS'ed up to 4K looks better, that would be fine; that is still a fair comparison.

I mean, it's not even the same: DLSS introduces visual artifacts, dithering, etc. It's not the same quality as just running at that actual resolution; it's a mitigating technique to make up for how badly RT impacts performance.
In the future, with much more advanced GPUs, nobody is going to run DLSS
 
Again, I'm not; I'm talking about it not being fair to compare 4K performance when one of the cards is not actually running 4K.
Again, if you compared performance at 1080p but then said that the one running 1080p DLSS'ed up to 4K looks better, that would be fine; that is still a fair comparison.

I mean, it's not even the same: DLSS introduces visual artifacts, dithering, etc. It's not the same quality as just running at that actual resolution; it's a mitigating technique to make up for how badly RT impacts performance.
In the future, with much more advanced GPUs, nobody is going to run DLSS

Who knows; every DLSS implementation has been getting better and better, just look at the Cyberpunk 2077 clip I provided above. Cyberpunk 2077 definitely requires DLSS to get a good framerate at 1440p and above, with or without RT, on any GPU.

As I have said many times now, playing at native resolution without AA looks pretty bad; the shimmering on objects is very noticeable when you are moving. TAA is an acceptable form of AA because it removes all that shimmering, while DLSS is an even better form of AA than TAA.
 
Who knows; every DLSS implementation has been getting better and better, just look at the Cyberpunk 2077 clip I provided above. Cyberpunk 2077 definitely requires DLSS to get a good framerate at 1440p and above, with or without RT, on any GPU.

As I have said many times now, playing at native resolution without AA looks pretty bad; the shimmering on objects is very noticeable when you are moving. TAA is an acceptable form of AA because it removes all that shimmering, while DLSS is an even better form of AA than TAA.
That is simply not true. I am having none of those issues playing at 1440p and getting mostly mid-70s framerates, but the kicker is my monitor's FreeSync range is 40 to 144.
 
That is simply not true. I am having none of those issues playing at 1440p and getting mostly mid-70s framerates, but the kicker is my monitor's FreeSync range is 40 to 144.

Yeah, sure, 70 fps in a first-person game is not what I would call a good framerate
 
Who knows; every DLSS implementation has been getting better and better, just look at the Cyberpunk 2077 clip I provided above. Cyberpunk 2077 definitely requires DLSS to get a good framerate at 1440p and above, with or without RT, on any GPU.

As I have said many times now, playing at native resolution without AA looks pretty bad; the shimmering on objects is very noticeable when you are moving. TAA is an acceptable form of AA because it removes all that shimmering, while DLSS is an even better form of AA than TAA.

Again, you are flipping it: 1080p obviously does not look as good as 1080p reconstructed/upscaled to 4K,
but then 1080p reconstructed/upscaled to 4K does not look as good as just running at actual 4K.

That is kind of the point. It's a bit like DSR or VSR, except less taxing, because you don't actually run an "internal" res of 4K while being displayed on a 1080p screen, for example.

DLSS is a mitigating solution, one to deal with (read: make up for) the requirements of RT, but in the future, if hardware is fast enough, you would get much better image quality by running at native res with proper anti-aliasing (although that seems a bit of a thing of the past for some reason... where have my SMAA and SSAA gone? Why is it all this crappy post-processing nonsense?)
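
A quick sketch of the internal-render-cost difference behind that DSR/VSR comparison, assuming shading work scales with the number of internally rendered pixels and using the assumed 0.5x per-axis scale for DLSS Performance:

# DSR/VSR supersampling vs. DLSS upscaling, measured in shaded pixels:
panel_1080p = 1920 * 1080                # pixels on a 1080p display
dsr_4k      = 3840 * 2160                # DSR/VSR: render 4K, downscale to 1080p
dlss_perf4k = (3840 // 2) * (2160 // 2)  # DLSS Performance targeting a 4K output
print(dsr_4k / panel_1080p)      # 4.0 -> four times the shading work of plain 1080p
print(dlss_perf4k / panel_1080p) # 1.0 -> same raster cost as 1080p; the extra
                                 # detail comes from reconstruction, not rendering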
 