
NVIDIA Updates Cyberpunk 2077, Minecraft RTX, and 4 More Games with DLSS

AleksandarK

News Editor
Staff member
NVIDIA's Deep Learning Super Sampling (DLSS) technology offloads part of the sampling work in games to the Tensor Cores, dedicated AI processors present on all GeForce RTX cards, from the prior Turing generation through the new Ampere lineup. NVIDIA promises that enabling DLSS can deliver a performance boost of 40% or even more. Today, the company announced that DLSS support is coming to Cyberpunk 2077, Minecraft RTX, Mount & Blade II: Bannerlord, CRSED: F.O.A.D., Scavengers, and Moonlight Blade. With these additions, NVIDIA's DLSS technology is now present in a total of 32 titles, which is no small feat for a new technology.

Below, you can see the company-provided charts showing DLSS performance in the new titles, except Cyberpunk 2077.
Update: The Cyberpunk 2077 performance numbers have since leaked (thanks to kayjay010101 on the TechPowerUp Forums), and you can check them out as well.



View at TechPowerUp Main Site
 
The thing that would give me the biggest performance boost: being able to actually buy one of these RTX 3000 series cards at MSRP.
 
The Cyberpunk slides exist and were leaked yesterday.
[Leaked slides: Cyberpunk 2077 official PC performance benchmarks with ray tracing and DLSS on RTX 3090, RTX 3080, RTX 3070, and RTX 3060 Ti]
 
The narrative they are trying to push here is kinda BS though.

4K DLSS is... not 4K. 4K is 4K; 4K DLSS is whatever the actual render resolution is, then upscaled.

But "upscaled" has gotten a bit of a bad rep, so let's go to a language war and rename it... and flip it. Instead of saying 1080p with DLSS (mentioning the actual render resolution, with upscaling to, for example, 4K), they flip it and say it's 4K upscaled and don't mention what the actual resolution is.

Sadly we find reviewers jumping on board with this, and it's just not right. You can't compare "4K" with DLSS against a card running ACTUAL 4K; both would have to be rendering at actual 4K, and DLSS could then upscale to 8K or whatever, I don't care, but both should be rendering at actual 4K to compare.

It is pretty cool tech though, no doubt.
 
Looks like sweet scaling with DLSS Quality Mode, though only the 3090 can be called "playable" with Ultra RT at 1440p.
I wonder whether the performance boost from DLSS remains the same without RT; that's the more interesting topic.
 
The narrative they are trying to push here is kinda BS though.

4K DLSS is... not 4K. 4K is 4K; 4K DLSS is whatever the actual render resolution is, then upscaled.
If both 4K native and 4K upscaled look functionally identical - or the upscale even looks better than native sometimes, in the case of DLSS Quality mode - then who actually cares? The "4K" just refers to the output resolution. Many things are not at native res; would you still not consider them 4K? For example, running a game at 4K but with textures that are just 2048x2048: would you consider that "really" 2K?

4K DLSS is by all means 4K. The output image is 4K. It has 4K levels of detail; it looks 4K. It just happens to inject artificial detail where it's missing from the internal render resolution. But the output image is indistinguishable from native in most cases, especially if you're not comparing pixel by pixel.
I still don't understand being an "anti-DLSSer". It's fantastic tech that is pretty much just free FPS. As long as you keep it to Quality or Balanced mode, that is; Performance mode has an internal res too low to produce a decent image.

Quality mode renders at roughly 67% of the output resolution per axis (so 4K is upscaled from 2560x1440)
Balanced mode renders at 58% per axis (so 4K is upscaled from about 2227x1253)
Performance mode renders at 50% per axis (so 4K is upscaled from 1920x1080)
Ultra Performance mode renders at 33% per axis (so 4K is upscaled from 1280x720, yuck)
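
For anyone who wants the arithmetic spelled out, here is a minimal sketch of it (my own illustration, assuming the commonly cited per-axis scale factors for the DLSS 2.0 modes, not anything queried from NVIDIA's SDK):

# Illustrative sketch of the internal-resolution arithmetic above; the per-axis
# scale factors are the commonly cited DLSS 2.0 values, hard-coded here as assumptions.
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~67% per axis
    "Balanced": 0.58,            # ~58% per axis
    "Performance": 0.50,         # 50% per axis
    "Ultra Performance": 1 / 3,  # ~33% per axis
}

def internal_resolution(out_w, out_h, mode):
    """Approximate internal render resolution that DLSS reconstructs the output from."""
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    share = (w * h) / (3840 * 2160) * 100
    print(f"{mode:17s}: {w}x{h} rendered (~{share:.0f}% of the output pixels)")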
 
Oooo wee, set everything to ultra and enjoy sub 60 fps in 4K. This is why you buy a $1500+ GPU!
 
Oooo wee, set everything to ultra and enjoy sub 60 fps in 4K. This is why you buy a $1500+ GPU!

I bought the Titan X Maxwell just to enjoy Witcher 3 at 1440p 60fps, not even maxed Hairworks :roll:
 
Oooo wee, set everything to ultra and enjoy sub 60 fps in 4K. This is why you buy a $1500+ GPU!
Much better than 22 fps.
 
I bought the Titan X Maxwell just to enjoy Witcher 3 at 1440p 60fps, not even maxed Hairworks :roll:
Yeah, it's almost as if bleeding edge is hard to run because it's literally the best we can make or something. Who'd a thunk it?
 
Where are the ones that bought 2x 8800 Ultra to run Crysis?
 
Because Minecraft with RTX is where it's at.
 
The narrative they are trying to push here is kinda BS though.

4K DLSS is... not 4K. 4K is 4K; 4K DLSS is whatever the actual render resolution is, then upscaled.
In games like Control or Death Stranding, 4K DLSS looks a lot better than 4K native: better definition, better detail in textures, and so on
 
If both 4K native and 4K upscaled look functionally identical - or the upscale even looks better than native sometimes, in the case of DLSS Quality mode - then who actually cares? The "4K" just refers to the output resolution.

4K DLSS is by all means 4K. The output image is 4K.

Resolution refers to the number of unique pixels produced; that's it, that is all.

In games like Control or Death Stranding, 4K DLSS looks a lot better than 4K native: better definition, better detail in textures, and so on

Not the discussion being had. It feels bad that I have to say this just to make sure we don't get very emotional here: I think DLSS is a great upscaling/reconstruction tech; that is NOT the issue.
The issue is comparing one GPU running a game at ACTUAL 4K against another at 1080p that was reconstructed to a quality level of 4K, and then pretending that that is a fair comparison. It just isn't.

You can, however, say for example that X and Y run the same at 1080p, but that Y looks way better and sharper because of DLSS.
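
To put that in numbers (an illustrative calculation, not figures from NVIDIA or the article): native 4K shades 3840x2160 = 8,294,400 pixels per frame, while DLSS Performance mode shades 1920x1080 = 2,073,600 pixels and reconstructs the rest, so the two cards are doing roughly a 4:1 difference in per-frame shading work.

native_4k = 3840 * 2160           # 8,294,400 pixels shaded per frame at actual 4K
dlss_internal = 1920 * 1080       # 2,073,600 pixels shaded, then reconstructed to a 4K output
print(native_4k / dlss_internal)  # 4.0 -> the DLSS card shades about 4x fewer pixels per frame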
 
Oooo wee, set everything to ultra and enjoy sub 60 fps in 4K. This is why you buy a $1500+ GPU!

You can trust Nvidia to deliver a crippling effect for your new GPU. Last CDPR release they had Hairworks. I still struggle to see why they even created that; you never even noticed it being on.

A few months ago, Cyberpunk supposedly ran on a toaster. Now we need a 3090. Thanks, Nv! But we can cheer because DLSS turned 25 fps into 40, so that's 'playable' ;)

But I'm not buying that nonsense. Just avoid RT and enjoy the game, nuff said - and THIS is why you don't buy into RT/DLSS yet. It's a marketing dream you're stepping into, not reality. It requires and deserves a fat middle finger from us. There is no baseline we can look at, so this is entirely Nvidia's reality we're seeing, and they tweak it according to the cards they want to sell. Or pretend to sell.
 
Looks like sweet scaling with DLSS Quality Mode

Rendering 4K at 1440p (less than half the pixels) improves framerate, who would have thought... :D

In games like Control or Death Stranding, 4K DLSS looks a lot better than 4K native
Bullshit.
DLSS 2 is not a "true NN" (1.0 was, and failed) but a TAA derivative (link below), and like any TAA derivative it:
1) Improves lines
2) Adds blur
3) Wipes out fine details
4) All of that gets even worse when quick movement is involved

Examine screenshots of pretty much anything with fine detail on it (not that face again, please) and you'll see it improves lines (and is great at it) but adds blur and loses detail. It's the best anti-aliasing on the market, available in a handful of games and incompatible with VRR. It is also NV only, as usual with that brand.

Pay attention to two areas here: the bush above the dude's head, and the long grass on the right (this is a screenshot from a site hyping this crap as "better than native"):

1607524110632.png


Lines get improved, the grass rejoices, but note what happened to the bush above the dude's head:

1607524133093.png



it was this:

1607524151481.png


And this is what DLSS 2 did to it:

1607524163120.png



Note that TAA derivatives wiping out details, adding blur, and struggling with quickly moving/small objects is a very well-known problem (and that DLSS 2 is based on TAA is not up for debate)



And, yes, antialiasing does make things look better, but it is in no way a
 
In games like Control or Death Stranding, 4K DLSS looks a lot better than 4K native: better definition, better detail in textures, and so on

It is wise to realize and constantly remind yourself of the fact that Nvidia needed extensive time spent on their render farms to get it to that point. Most DLSS 'enabled' games don't get that boost. It's a big black box for us, and Nvidia controls all the knobs.

Not good from a customer POV, and even if the tech itself can be 'enabled' easily, the net gain is highly variable in quality, available settings, and actual performance gain. You're left to the green mercy, ergo the most commercially viable titles, for a limited period of time.

If we do this upscale business, make it industry-wide, unified, and available like all those methods of AA. We shouldn't accept anything else IMO.
 
It is wise to realize and constantly remind yourself of the fact that Nvidia needed extensive time spent on their render farms to get it to that point.
It is wise, at this point, to realize the differences between 1.0 and 2.0 and stop spreading FUD.
1.0 was per-game training at farms; 2.0 is not.
 
Resolution refers to the number of unique pixels produced; that's it, that is all.

The issue is comparing one GPU running a game at ACTUAL 4K against another at 1080p that was reconstructed to a quality level of 4K, and then pretending that that is a fair comparison. It just isn't.

Not sure if you understand: 4K DLSS is still 4K, all the pixel counts are there.
Normal upscaling is like stretching a JPEG image to make it bigger; it will look blurry. DLSS is like stretching a PDF file; it will look sharp no matter how much you stretch it, because all the missing details are filled in (reconstructed).

Another benefit of DLSS is that monitors only deliver their best image quality at their native resolution; running a non-native resolution makes the image look blurry. Compare a 4K monitor displaying upscaled 1440p against a native 1440p screen, and the 1440p screen will look better.
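
To illustrate the "normal upscaling" half of that analogy (a toy example of my own, not anything DLSS-related): a plain bilinear upscale only averages existing pixels, so it cannot create new detail, which is where the blur comes from.

# Minimal bilinear upscaling sketch: every output pixel is a weighted average of
# neighbouring input pixels, so no new detail is invented - hence the blur.
def bilinear_upscale(img, new_w, new_h):
    old_h, old_w = len(img), len(img[0])
    out = []
    for y in range(new_h):
        fy = y * (old_h - 1) / (new_h - 1) if new_h > 1 else 0.0
        y0 = int(fy)
        ty = fy - y0
        y1 = min(y0 + 1, old_h - 1)
        row = []
        for x in range(new_w):
            fx = x * (old_w - 1) / (new_w - 1) if new_w > 1 else 0.0
            x0 = int(fx)
            tx = fx - x0
            x1 = min(x0 + 1, old_w - 1)
            top = img[y0][x0] * (1 - tx) + img[y0][x1] * tx
            bottom = img[y1][x0] * (1 - tx) + img[y1][x1] * tx
            row.append(top * (1 - ty) + bottom * ty)
        out.append(row)
    return out

tiny = [[0, 255], [255, 0]]                  # a 2x2 "checkerboard" of brightness values
for row in bilinear_upscale(tiny, 4, 4):
    print([round(v) for v in row])           # only in-between values, no sharp edge recovered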
 
Not sure if you understand: 4K DLSS is still 4K, all the pixel counts are there.
Normal upscaling is like stretching a JPEG image to make it bigger; it will look blurry. DLSS is like stretching a PDF file; it will look sharp no matter how much you stretch it, because all the missing details are filled in (reconstructed).

Yes... and it has to be filled in because the information isn't actually there, because it isn't actually running the game at that resolution. That is the entire point.
 
Yes... and it has to be filled in because the information isn't actually there, because it isn't actually running the game at that resolution. That is the entire point.

As long as the end result is comparable, who cares? Work smarter, not harder. Don't get too hung up on how things were done in the past.
 
Oooo wee, set everything to ultra and enjoy sub 60 fps in 4K. This is why you buy a $1500+ GPU!
The 2560x1440 slides don’t look too good either.
 
Rendering 4K at 1440p (less than half the pixels) improves framerate, who would have thought... :D

Lines get improved, the grass rejoices, but note what happened to the bush above the dude's head:

View attachment 178844

it was this:

View attachment 178845

And this is what DLSS 2 did to it:

View attachment 178846

I'm not seeing a difference here... what did DLSS 2 do to it?

On a zoomed-in shot like that, there's a tiny color difference with the blood in the hair, maybe?
 
The 2560x1440 slides don’t look too good either.
Thing is, at max this uses almost every RT feature possible right now:
RT Shadows
RT Reflections
RT Ambient Occlusion
RT Direct Lighting
RT Global Illumination

So yeah, on ultra this is pretty much modern-day Crysis.
 