
Cyberpunk 2077 Gets DLSS 4 Support With Patch 2.21. All RTX Users Can Try the New Transformers DLSS Model

A great looking game just got a small update to look better

Nvidia users: Nice
Radeon users: hurrdurrr if the game looked so real, why does it need to look real-er
 
Just dropping the new nvngx DLLs into the 2.20 install fixed a lot of the flickering! The benchmark looked beautiful too.
 
A great looking game just got a small update to look better

Nvidia users: Nice
Radeon users: hurrdurrr if the game looked so real, why does it need to look real-er
Nvidia users keep revisiting this to see what's there for them, which is understandable, but are they really playing games, you might wonder ;) I totally get it: if you spend $1,500 on a feature set, you want to see it being worth a damn.

I moved on ages ago; the game ain't that interesting for that long. After all, I play games.
 
A great looking game just got a small update to look better

Nvidia users: Nice
Radeon users: hurrdurrr if the game looked so real, why does it need to look real-er
What about Intel users? Don't leave them out, even though there's like 6 of them in total. :laugh:
 
I would appreciate a similar patch for the RTX version of The Witcher 3.
 
the base framerate is still 20
52 years from now, at this tempo, it'll be possible with 8R12B PT (today, the game uses 2R2B) at 5120x2160 at the very least. I would've enjoyed looking at this stuff if it actually ran at a positive number of frames per second.
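As a rough sanity check on that ray budget, here is a back-of-the-envelope sketch in Python. It assumes the usual reading of 2R2B as 2 rays per pixel with 2 bounces and that cost scales linearly with rays times bounces; both are simplifications, not anything the game or NVIDIA publishes.

```python
# Back-of-the-envelope for the jump from "2R2B" to "8R12B" at 5120x2160.
# Assumes 2R2B = 2 rays per pixel with 2 bounces, and that cost scales
# linearly with rays times bounces; both are simplifications.
import math

pixels = 5120 * 2160

def ray_bounces_per_frame(rays_per_pixel, bounces):
    return pixels * rays_per_pixel * bounces

today  = ray_bounces_per_frame(2, 2)     # the game's current path tracing budget
target = ray_bounces_per_frame(8, 12)    # the wished-for budget above

ratio = target / today
print(f"today : {today:,} ray-bounces per frame")
print(f"target: {target:,} ray-bounces per frame ({ratio:.0f}x the work)")
print(f"doublings of RT throughput needed: {math.log2(ratio):.1f}")
```

Whether those four-to-five doublings take a decade or half a century depends entirely on how fast you think per-ray throughput actually improves, which is the whole joke.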
 
I would appreciate a similar patch for the RTX version of The Witcher 3.
For upscaling, you can copy over the new "nvngx_dlss.dll" file from Cyberpunk 2077's latest patch and activate DLSS preset J (the Transformer model) with something like Nvidia Profile Inspector, and it will work (rough sketch of the copy step below).

A quick Google search will help you find the file and probably instructions that are 100 times easier to understand than what I posted. :(
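If it helps, the copy step really is just that: back up the DLL The Witcher 3 shipped with and drop in the newer one. A rough Python sketch follows; the paths are examples for a default Steam install and will differ on GOG or custom library locations, and forcing preset J still has to be done separately in Nvidia Profile Inspector.

```python
# Rough sketch of the DLL swap described above. The paths are examples for a
# default Steam install; adjust them to wherever your copies actually live.
# This does NOT force DLSS preset J; that still needs Nvidia Profile Inspector.
from pathlib import Path
import shutil

CP2077_DLL   = Path(r"C:\Program Files (x86)\Steam\steamapps\common\Cyberpunk 2077\bin\x64\nvngx_dlss.dll")
WITCHER3_BIN = Path(r"C:\Program Files (x86)\Steam\steamapps\common\The Witcher 3\bin\x64_dx12")

target = WITCHER3_BIN / "nvngx_dlss.dll"

# Keep the DLL the game shipped with so the swap is easy to undo.
if target.exists():
    shutil.copy2(target, target.parent / (target.name + ".bak"))

shutil.copy2(CP2077_DLL, target)
print(f"Copied {CP2077_DLL} -> {target}")
```

If the game misbehaves afterwards, restoring the .bak file undoes the swap.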
 
After all, I play games.
Well, it isn't one or the other, you know; we can revisit the 'tech demo' game, get excited about the improvements, then quite easily transfer them into other games to reap the rewards. And then go on to play those games, or indeed any others.

I certainly don't consider the interest in technology and testing the improvements, and playing games, to be mutually exclusive, and they need not be framed as such.
 
we can revisit the 'tech demo' game, get excited about the improvements, then quite easily transfer them into other games to reap the rewards.
Or we can revisit the "tech demo" game, get a good giggle about several NPCs T-posing like no tomorrow, then quite easily forget what we were about to achieve in the first place and consequently destroy like half a million enemies just for fun.
 
So wait, if the RT in Cyberpunk was SOOOO realistic, why does it need a 7 GB update and patch, and a new card with a totally different RT function, to look realistic?

What are these lighting stability issues that are a problem?

Please go back to bed, grandpa; we're talking about DLSS here, not RT. I don't even know why you think a game with RT or DLSS should never need to be patched. Don't forget to take your meds too.
 
??????????????

The ray tracing is accurate; the update is for the AI upscaling, ray reconstruction, and frame generation components, to make them look sharper and perform better. It's an update to the AI models, not the lighting system.

Also, the 50 series has ray tracing cores that are better at handling objects with complex geometry, so presumably the game needs an update to let the 50 series take advantage of that.

Although, I guess I shouldn't expect a 7900 XTX owner to understand how ray tracing and AI works.
What a burn...:fear:
 
They call this marketing, I think. At this point Cyberpunk should be so polished it'll buy you a trip to Warsaw when you fire it up, but nope. Still a half broken game rife with small issues. A poster child for RT that is apparently never done, and never truly great seeing as it needs continuous patching.

The ray tracing is accurate. But ray reconstruction needs an update to make them 'sharper and perform better'. Meanwhile, there are still artifacts in the image.

What the fuck are you smoking? Do you understand how it works? I don't think you do. It's called a business model. Not real advancements: Nvidia just created a playground that is fully proprietary. It's a black box.
Ray tracing is accurate. You can test this by turning on full path tracing and turning all upscaling off. You'll get a fully path traced image, where every single pixel is ray traced. Or actually, I don't think you can, because AMD can't make ray tracing hardware to save their life, despite having 7 years since Microsoft put it into DirectX and made it an open standard.

Again, you obviously have no idea how any of this actually works. Ray reconstruction is an AI algorithm that lets the GPU trace fewer rays (improving performance) and uses a specialized algorithm to fill in the gaps. An updated algorithm means the result looks closer to the fully path-traced image. There are artifacts because it's still AI-generated, but there are fewer now, because the algorithm is better.
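For anyone wondering what "trace fewer rays and fill in the gaps" looks like in practice, here is a toy sketch in Python (NumPy only). The fake scene, the noise level, and the box filter are all invented for illustration; this is not NVIDIA's ray reconstruction, which is proprietary.

```python
# Toy illustration of "trace fewer rays, then fill in the gaps". This is NOT
# NVIDIA's ray reconstruction; it's just a Monte Carlo estimate with a small
# sample budget plus a crude reconstruction filter.
import numpy as np

rng = np.random.default_rng(0)

def render(truth, rays_per_pixel):
    """Estimate each pixel by averaging a handful of noisy 'ray' samples."""
    samples = truth[..., None] + rng.normal(0.0, 0.5, size=(*truth.shape, rays_per_pixel))
    return samples.mean(axis=-1)

def box_denoise(img, radius=1):
    """Crude stand-in for a reconstruction filter: average each pixel with its neighbours."""
    acc = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / (2 * radius + 1) ** 2

# A simple "scene": a bright disc on a dark background.
y, x = np.mgrid[0:64, 0:64]
scene = (((x - 32) ** 2 + (y - 32) ** 2) < 15 ** 2).astype(float)

low_budget  = render(scene, rays_per_pixel=2)    # grainy, like a small ray budget
high_budget = render(scene, rays_per_pixel=64)   # much cleaner, much more expensive
filtered    = box_denoise(low_budget)            # cheap render + reconstruction

for name, img in [("2 rays/px", low_budget), ("64 rays/px", high_budget), ("2 rays/px + filter", filtered)]:
    rms = np.sqrt(((img - scene) ** 2).mean())
    print(f"{name:>18}: RMS error vs ground truth = {rms:.3f}")
```

The real thing uses a trained network and temporal data instead of a box blur, but the trade-off is the same: fewer rays per pixel, more reliance on reconstruction, and the quality of that reconstruction is exactly what this patch updates.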
 

One of my favorite channels on YouTube; they did a really good job explaining how path tracing and ray tracing algorithms work in games and movies.

It is definitely not just some marketing mumbo jumbo.
Yeah, this is what gets me whenever people start complaining about ray tracing and how "the performance just isn't worth it." It's not an Nvidia marketing scam; it's been a thing since Cars (2006). Remember how great that looked? It still holds up today: the opening sequence with McQueen rolling out of the truck, the reflections in his hood, the multi-sided lighting of the cars on the track; it's simply incredible. It's just so computationally intensive that it used to take server farms of thousands of networked CPUs years to render ray-traced animated movies or CGI. Even baked lighting in games is just ray tracing done ahead of time by the developer.

Now, Moore's Law and engineering have finally advanced to the point where it can be done in real time, and it absolutely is worth it. It looks better, and it makes game development easier; that's why id Software updated their engine to require hardware ray tracing for Indiana Jones and Doom. Instead of putting hundreds of man-hours into lighting a scene, debugging, and testing all the contingencies and stupid things players might do, they can just set up the lighting source and be confident the lighting will be correct in all situations.
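For reference, the per-pixel operation all of this revolves around can be written down in a few lines. A minimal toy sketch follows (one sphere, one point light, Lambert shading); the scene and numbers are invented for illustration, and this is nobody's engine code.

```python
# Minimal toy ray tracer: one sphere, one point light, Lambert shading.
# Offline renderers and real-time RT hardware both revolve around this same
# intersection-plus-shading step; everything here is purely illustrative.
import numpy as np

def ray_sphere(origin, direction, center, radius):
    """Nearest positive hit distance of a unit-length ray against a sphere, or None."""
    oc = origin - center
    b = 2.0 * np.dot(direction, oc)
    c = np.dot(oc, oc) - radius ** 2
    disc = b * b - 4.0 * c          # a == 1 because direction is normalized
    if disc < 0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 0 else None

W = H = 32
camera = np.array([0.0, 0.0, 0.0])
sphere_center, sphere_radius = np.array([0.0, 0.0, 3.0]), 1.0
light_pos = np.array([2.0, 2.0, 0.0])

image = np.zeros((H, W))
for j in range(H):
    for i in range(W):
        # Primary ray through pixel (i, j) on an image plane at z = 1.
        px = (i + 0.5) / W * 2.0 - 1.0
        py = 1.0 - (j + 0.5) / H * 2.0
        d = np.array([px, py, 1.0])
        d /= np.linalg.norm(d)
        t = ray_sphere(camera, d, sphere_center, sphere_radius)
        if t is not None:
            hit = camera + t * d
            normal = (hit - sphere_center) / sphere_radius
            to_light = light_pos - hit
            to_light /= np.linalg.norm(to_light)
            # Lambert's cosine law: brightness comes from geometry and the
            # light's position alone; nothing is hand-tuned or baked.
            image[j, i] = max(float(np.dot(normal, to_light)), 0.0)

# Crude ASCII dump: the bright side of the sphere faces the light.
chars = " .:-=+*#%@"
for row in image:
    print("".join(chars[int(v * (len(chars) - 1))] for v in row))
```

Hardware RT cores accelerate exactly this kind of ray/geometry test against far more complex scenes (via BVH traversal); the shading rule itself doesn't change, which is why "place the light and it just looks right" holds.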
 
Mine was 3.3 GB :confused: (on GOG), but I don't have Phantom Liberty; waiting till it isn't being tweaked anymore for the "definitive experience".
What do you mean? The gameplay is finished, and it's good. They've turned support over to another company for bug fixes and maintenance like this.
 
Mine was 3.3 GB :confused: (on GOG), but I don't have Phantom Liberty; waiting till it isn't being tweaked anymore for the "definitive experience".

I have Phantom Liberty so I guess that's the difference.

Also, you should buy it. Perfection is iterative.
 
Ray tracing is accurate. You can test this by turning on full path tracing and turning all upscaling off. You'll get a fully path traced image, where every single pixel is ray traced. Or actually, I don't think you can, because AMD can't make ray tracing hardware to save their life, despite having 7 years since Microsoft put it into DirectX and made it an open standard.

Again, you obviously have no idea how any of this actually works. Ray reconstruction is an AI algorithm that lets the GPU trace fewer rays (improving performance) and uses a specialized algorithm to fill in the gaps. An updated algorithm means the result looks closer to the fully path-traced image. There are artifacts because it's still AI-generated, but there are fewer now, because the algorithm is better.
Thanks for telling me I'm wrong and then, just one paragraph later, telling me I'm right.

Also, I can run path tracing just fine on a 7900XT. It adds a light grain, which again, isn't accuracy, but noise. Additionally, don't be silly, I have access to Nvidia cards whenever I want ;)

Now, you correctly stated that ray tracing has been around for a long time already, but baked in. That's exactly where our paths divide on how I view the potential use of RT versus how people feel who think it needs to be done in real time. In the real-time implementation you are still not getting perfect accuracy, but you do lose a shitload of performance on it. Is it really that much better than a well-baked implementation? I've experienced numerous games that really just don't improve with RT-based lighting, because the baked lighting is already very well done. And that is a show of skill, of developer talent. The only thing real-time RT really enables is lowering the bar for less talented developers to make something similar, and YOU pay for the talent and time they didn't invest, with a huge performance hit and more expensive GPUs.
 
Also, I can run path tracing just fine on a 7900XT. It adds a light grain, which again, isn't accuracy, but noise.

[attached screenshot]


The council has determined that was a lie

Now, you correctly stated that ray tracing has been around for a long time already, but baked in. That's exactly where our paths divide on how I view the potential use of RT versus how people feel who think it needs to be done in real time. In the real-time implementation you are still not getting perfect accuracy, but you do lose a shitload of performance on it. Is it really that much better than a well-baked implementation? I've experienced numerous games that really just don't improve with RT-based lighting, because the baked lighting is already very well done. And that is a show of skill, of developer talent. The only thing real-time RT really enables is lowering the bar for less talented developers to make something similar, and YOU pay for the talent and time they didn't invest, with a huge performance hit and more expensive GPUs.

Actually, I think it's better if a talented developer can spend 30 minutes setting up light sources in a scene and rely on ray tracing for it to look correct, instead of spending 100 hours making sure the baked lighting will work perfectly and every object in the scene won't mess up the lighting. Less time on details like lighting means more time to work on other aspects of the game. I'm sure we can both agree that it doesn't matter how good a game looks if the gameplay is bad or, worse, boring. Just look at all the UE5 slop coming out recently. Bad games will be bad games; ray tracing doesn't solve that. What ray tracing DOES do is allow talented developers to spend more time on other parts of the game. You may gnash your teeth as your 7900 XT fades into oblivion, but this is obviously where the industry as a whole is heading: UE5 Lumen, id Software updating their engine to require ray tracing hardware, etc.
 
[attached screenshot]

The council has determined that was a lie



Actually, I think it's better if a talented developer can spend 30 minutes setting up light sources in a scene and rely on ray tracing for it to look correct, instead of spending 100 hours making sure the baked lighting will work perfectly and every object in the scene won't mess up the lighting. Less time on details like lighting means more time to work on other aspects of the game. I'm sure we can both agree that it doesn't matter how good a game looks if the gameplay is bad or, worse, boring. Just look at all the UE5 slop coming out recently. Bad games will be bad games; ray tracing doesn't solve that. What ray tracing DOES do is allow talented developers to spend more time on other parts of the game. You may gnash your teeth as your 7900 XT fades into oblivion, but this is obviously where the industry as a whole is heading: UE5 Lumen, id Software updating their engine to require ray tracing hardware, etc.
You do realize that UE5 slop is developers doing exactly what you just said, right? Cutting costs with the improved workflows offered by the engine. It's not a path to better games. Good games are made by good devs. RT isn't a player in this story, and any costs that aren't incurred simply aren't in the budget at all.
 
[attached screenshot]

The council has determined that was a lie



Actually, I think it's better if a talented developer can spend 30 minutes setting up light sources in a scene and rely on ray tracing for it to look correct, instead of spending 100 hours making sure the baked lighting will work perfectly and every object in the scene won't mess up the lighting. Less time on details like lighting means more time to work on other aspects of the game. I'm sure we can both agree that it doesn't matter how good a game looks if the gameplay is bad or, worse, boring. Just look at all the UE5 slop coming out recently. Bad games will be bad games; ray tracing doesn't solve that. What ray tracing DOES do is allow talented developers to spend more time on other parts of the game. You may gnash your teeth as your 7900 XT fades into oblivion, but this is obviously where the industry as a whole is heading: UE5 Lumen, id Software updating their engine to require ray tracing hardware, etc.
Don’t own the game.

But to be fair, 39 FPS is quite the 1440p number that I'm sure you're fine playing at, right? Without turning down any settings or using any upscaling tech? Right?

Please go back to bed, grandpa; we're talking about DLSS here, not RT. I don't even know why you think a game with RT or DLSS should never need to be patched. Don't forget to take your meds too.

Not a grandpa yet (that I know of) but probably soon enough. Gonna be 45 this year. That being said, I played games before AF and AA were decent and the games had to be good. A lot of women look good with a lot of makeup on too… doesn’t mean I brought them home.
 
Does DLSS 4 need a specific driver, or does it just work with the DLL update?
 