
NVIDIA DLSS Transformer Cuts VRAM Usage by 20%

I'm still trying to figure out why I would want to run DLSS at all. The quantity of NV's black screen of death increases dramatically, and image quality is most certainly worse than native. I get more than acceptable framerates out of native res so can someone help me understand why any of this matters? I have to be missing some key data point.
 
The reason would be that it's better than simply running a lower native resolution for performance reasons. The stability issue is likely on your end or game-specific, possibly something the developers need to iron out in their implementation.
 
This seems like a good thing from NVIDIA. It shows that NVIDIA does actually care about the budget market.

The quantity of NV's black screen of death increases dramatically
That is complete moose muffins. Just stop with that nonsense.
 
I'm still trying to figure out why I would want to run DLSS at all. The quantity of NV's black screen of death increases dramatically, and image quality is most certainly worse than native. I get more than acceptable framerates out of native res so can someone help me understand why any of this matters? I have to be missing some key data point.
Because it’s literally the best modern AA method available. I say “modern” because MSAA is useless for the most part with current day engines and SSAA is untenable due to performance costs. Even if you don’t need the additional performance of the upscaling tech (which in a lot of cases people do) DLAA alone by itself is basically as good as it gets in terms of AA, hands down. What even is “native” these days? Shitty TAA? Or a completely broken non-AA image that modern engines made with TAA in mind produce?
 
Because it’s literally the best modern AA method available. I say “modern” because MSAA is useless for the most part with current day engines and SSAA is untenable due to performance costs.
Why do you all keep saying that first part when it's not true?
There are game engines out there that use deferred rendering with MSAA?

I'll admit the SSAA part is true. It's also somewhat true for SMAA & MSAA. That's why I want something like the lowest card, an RTX 6030, to match an RTX 5090, and to do it without frame generation or DLSS enabled.
 
There are game engines out there that use deferred rendering with MSAA?
Yes, there are. And there MSAA fails to resolve like half of the image in a satisfactory manner simply because of its nature. It doesn’t work well. Or, if you MAKE it work well you are looking at near-SSAA frametime costs. It simply isn’t worth it.
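For anyone wondering where that cost comes from: in a deferred renderer, every G-buffer target has to store all MSAA samples per pixel, so memory and bandwidth scale linearly with the sample count. Here's a rough back-of-the-envelope sketch; the G-buffer layout in it is just an illustrative assumption, not any specific engine's.

```python
# Rough sketch: G-buffer memory with and without MSAA in a deferred renderer.
# The layout below (4 colour targets + depth) is a made-up but typical example,
# not any particular engine's actual G-buffer.

WIDTH, HEIGHT = 3840, 2160          # 4K output
BYTES_PER_TARGET = [8, 8, 8, 4, 4]  # e.g. three RGBA16F targets, one RGBA8, D32 depth

def gbuffer_mib(samples: int) -> float:
    """Every G-buffer target has to store all MSAA samples for every pixel."""
    total_bytes = sum(BYTES_PER_TARGET) * WIDTH * HEIGHT * samples
    return total_bytes / (1024 ** 2)

for samples in (1, 4, 8):
    print(f"{samples}x MSAA: ~{gbuffer_mib(samples):.0f} MiB of G-buffer")

# 1x: ~253 MiB, 4x: ~1013 MiB, 8x: ~2025 MiB. Memory and bandwidth grow
# linearly with sample count, which is why 4x/8x MSAA on a fat G-buffer
# starts to look a lot like supersampling in cost.
```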

That's why I want something like the lowest card, an RTX 6030, to match an RTX 5090, and to do it without frame generation or DLSS enabled.
I too enjoy science fiction.
 
Yes, there are. And there MSAA fails to resolve like half of the image in a satisfactory manner simply because of its nature. It doesn’t work well. Or, if you MAKE it work well you are looking at near-SSAA frametime costs. It simply isn’t worth it.

Still better than any TAA in motion.
 
I'm still trying to figure out why I would want to run DLSS at all.
Because squeezing more frames out of older hardware or slower hardware helps a lot of people?
The quantity of NV's black screen of death increases dramatically, and image quality is most certainly worse than native.
The black screens have nothing to do with DLSS; NVIDIA's drivers for Blackwell are still a mess. Image quality with DLSS4 is pretty good; in both still images and moving footage it is quite hard to guess which is which.
I get more than acceptable framerates out of native res so can someone help me understand why any of this matters? I have to be missing some key data point.
I think the key data you are missing is "not everyone is you or thinks like you". If you have a 4090 running 1080p, yeah, you'll probably never need DLSS. But if someone is running a 4060 or 4070 at 2K and wants to hit ultra or high settings while dabbling in RT, then DLSS makes unplayable games playable. This is also true for anyone still on RTX 2000 hardware, where DLSS may give them one more generation of use out of older hardware.

Because it’s literally the best modern AA method available. I say “modern” because MSAA is useless for the most part with current day engines and SSAA is untenable due to performance costs. Even if you don’t need the additional performance of the upscaling tech (which in a lot of cases people do) DLAA alone by itself is basically as good as it gets in terms of AA, hands down. What even is “native” these days? Shitty TAA? Or a completely broken non-AA image that modern engines made with TAA in mind produce?
MSAA is still perfectly usable and SSAA isn't impossible to run. IDK what bizarro universe you come from, but I'm glad I don't live there. Sounds like hell, where you're forced to use TAA everywhere.

I don't get the whining about any alleged lack of VRAM. If a certain NVIDIA card does not have as much VRAM as you need then... I don't know... don't buy it, I guess? Or is anyone holding a gun to your head and forcing you to buy a card with what you perceive to be too little VRAM?

Alternatively, buy an AMD card since they seem to be giving away VRAM for free in spades, right?

Life could be so simple but no... here we go with another completely unnecessary whinefest :D .
Did it ever occur to you that the presence of 8GB cards artificially raises the price of 16GB cards via pointless market segmentation, thus costing us ALL more money?

I know you REALLY want to call people whiners, but use your noggin for 5 seconds here.
 
I'm still trying to figure out why I would want to run DLSS at all. The quantity of NV's black screen of death increases dramatically, and image quality is most certainly worse than native. I get more than acceptable framerates out of native res so can someone help me understand why any of this matters? I have to be missing some key data point.

Who told you this? This has never been the case, like, ever. DLSS doesn't cause BSODs, and of course the image loses sharpness, it's no longer running at the native image, unless it is. I took these screenshots for another thread, but I'll post it again here, you'd be hard pressed to tell both of these apart on most monitors, btw. It's DLSS at performance (25% resolution scale) vs. DLAA, both with 2x FG enabled.

[Screenshot: 1080p.jpg — DLSS Performance]

[Screenshot: dlaa.jpg — DLAA]
 
The lighting is clearly borked on bottom image, but otherwise looks pretty close. The top looks much more natural since the lighting didn't get tossed in the dumpster. That color palette looks better than Skyrim's bleak n dull one for the record.
 
The lighting is clearly borked on bottom image, but otherwise looks pretty close. The top looks much more natural since the lighting didn't get tossed in the dumpster. That color palette looks better than Skyrim's bleak n dull one for the record.

The lighting is a bit different because the weather started to change, but the bottom image is DLAA preset K, transformer model applied to native resolution image. Oblivion Remastered does look better than Skyrim, yeah.
 
This update directly addresses the needs of gamers running on 8 GB or lower graphics cards by trimming VRAM usage by 20%.

No it doesn't.

 
I'm still trying to figure out why I would want to run DLSS at all. The quantity of NV's black screen of death increases dramatically, and image quality is most certainly worse than native. I get more than acceptable framerates out of native res so can someone help me understand why any of this matters? I have to be missing some key data point.
I've yet to have a single BSOD in the past two and a half years with my 3060 Ti using DLSS, and now zero crash issues in the two weeks or so since I got my 5070, running DLSS/DLAA and pretty much all of the NVIDIA tech it has.
DLSS with the Transformer model is literally better than TAA in pretty much every game I've tried so far, and yes, even in motion, not just in still screenshots, while still giving me a decent performance boost and lowering my system's power draw, which is something I personally care about.
To me, native gaming is just about dead unless it's an old game that doesn't support it. DLSS/DLAA also fixes a lot of the texture/geometry flickering that a lot of newer games seem to have; some might not notice it, but it's something I'm very sensitive to and annoyed by once I notice it.
 
No it doesn't.

Yes, it does. Your lack of understanding doesn't prevent it from working as intended.
EDIT: Now come on, let's see your snarky comeback..

The lighting is clearly borked on bottom image, but otherwise looks pretty close.
I'm not seeing it. Whatever difference you're seeing isn't significant enough to have any serious complaint about, nor will it interfere with gameplay or enjoyment of same.
 
Yes, it does. Your lack of understanding doesn't prevent it from working as intended.

Most modern games are developed targeting consoles first, with 16 GB of unified RAM, and are thus optimized for 12~14 GB of actually usable VRAM.
While most game engines can dynamically load/offload textures to adapt to less VRAM,
12 GB is the bare minimum from 2025 onwards.

8 GB / 0.8 = 10 GB.
You can search for the general sentiment on the VRAM limitations of the 10 GB 3080 in 2025.


Your 'lack of understanding' won't make 8 GB magically work.
It is just not enough.
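Spelling out the arithmetic both sides keep arguing over, under two different readings of the 20% figure (the DLSS allocation size used below is a purely hypothetical placeholder, not a measured value):

```python
# Two readings of "cuts VRAM usage by 20%" for an 8 GB card.
# Reading A (optimistic): the 20% applies to the whole VRAM budget.
# Reading B: the 20% only applies to DLSS's own allocation.
# The 300 MB figure below is a made-up placeholder, not a measured value.

card_vram_gb = 8.0

# Reading A: if everything shrank by 20%, 8 GB would hold what used to need 10 GB.
effective_gb = card_vram_gb / 0.8
print(f"Reading A: 8 GB behaves like {effective_gb:.1f} GB")   # 10.0 GB

# Reading B: only the DLSS model/buffers shrink by 20%.
dlss_alloc_mb = 300.0                       # hypothetical allocation size
saved_mb = dlss_alloc_mb * 0.20
print(f"Reading B: roughly {saved_mb:.0f} MB freed")            # ~60 MB on this guess
```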
 
The lighting is a bit different because the weather started to change, but the bottom image is DLAA preset K, transformer model applied to native resolution image. Oblivion Remastered does look better than Skyrim, yeah.

Certainly explains why the bottom looks worse: lacking that lighting, it comparatively ends up looking worse in general. If you zoom in, though, the bottom actually has cleaner, crisper details, but it isn't as diverse in color since the lighting changed for the worse. Unfortunately, the bottom highlights what I hate most about game graphics in general: it looks flat and unnatural. Nuanced lighting is still something games don't do a great job of; even animation is pretty flat and unnatural looking. The transformer model on the bottom could be breaking some post-process effects, unless that happens on the top as well with DLSS. It's hard to blame the bottom's look on in-game weather when it looks bright and sunny in both skies; that's not exactly dark and gloomy. The lighting and shading on the bottom got completely tossed out; that's not some nuanced change due to an overcast weather event. It might or might not be due to DLAA, but either way it looks bad comparatively. The other details are crisper, with some of the lines and edges having more pop to them. In game it would look even worse between the two versus a static comparison. It's next to impossible to overlook the lighting and shading differences.

That's actually one area where lower resolution can look better in some cases: if you layer in more post-processing and have higher average frame rates, it'll look more natural even if it is less detailed in terms of resolution. It's a mixed bag, though, depending on the resolution and the amount of compute you can throw at added post-processing versus just using a higher native resolution without piling on post-processing. The bottom image is like: highlights, midtones, and shadows? Sorry, traveler, I've not heard these words before; would you be interested in sampling a sweet roll for a mere 4 silver coins? I used to be an adventurer like you, but then I grew older and tired, so I became a sweet roll cook.
 
Who told you this? This has never been the case, like, ever. DLSS doesn't cause BSODs, and of course the image loses sharpness, it's no longer running at the native image, unless it is. I took these screenshots for another thread, but I'll post it again here, you'd be hard pressed to tell both of these apart on most monitors, btw. It's DLSS at performance (25% resolution scale) vs. DLAA, both with 2x FG enabled.


I can see a pretty gross difference in image quality when I play with DLSS4 vs Native (no MSAA) at 4K in some games. I don't have any of my recordings handy, but if I get back to my machines, I'll post them. Most of the issues are during motion. Certain scenes even with low motion show some localized blurring, and what I'd call a smearing effect. There are some blurred halos at times. To be fair, there are games where the difference is negligible or not noticeable to the human eye.

As for the black screens, no one told me anything. Not sure why or if there was a bone to pick? I have multiple 5-series (5080s and 5090s) and a couple AMD cards (pre-9k) sitting around. The black screens are not Windows BSODs, but truly black screens. Some result in a hang, while some result in a spinning down of the card before recovering live. The behavior doesn't exist on the AMD cards or on the 5-series cards if DLSS is off. Someone else commented it is likely driver/game combinations. I tend to agree.

Regardless, what I've posted is the limit of my knowledge here. It sounds like DLSS may not provide any advantage for me if I'm happy with my framerates at the resolution and settings I play at. Thanks for the responses.
 
Most modern games are developed targeting consoles first, with 16 GB of unified RAM, and are thus optimized for 12~14 GB of actually usable VRAM.
That's an assumption on your part. You have no real way of knowing what the divide is for each game. It's a variable that CAN NOT be known without the devs of each game declaring that information. Therefore your statement has little to no merit.
Your 'lack of understanding' won't make 8 GB magically work.
It is just not enough.
You're the one making assumptions. I'm gonna side with the idea that, because it's being worked on and they're talking about it and releasing the info publicly, they know better how their software works.

So again, your failure to understand doesn't mean anything to anyone else.
 
I've yet to have a single BSOD in the past two and a half years with my 3060 Ti using DLSS, and now zero crash issues in the two weeks or so since I got my 5070, running DLSS/DLAA and pretty much all of the NVIDIA tech it has.
DLSS with the Transformer model is literally better than TAA in pretty much every game I've tried so far, and yes, even in motion, not just in still screenshots, while still giving me a decent performance boost and lowering my system's power draw, which is something I personally care about.
To me, native gaming is just about dead unless it's an old game that doesn't support it. DLSS/DLAA also fixes a lot of the texture/geometry flickering that a lot of newer games seem to have; some might not notice it, but it's something I'm very sensitive to and annoyed by once I notice it.
Great response. I responded to another comment about the crashes. For the visuals it comes down to the individual. Since getting to 3840×1600 and then 4K, a native image just looks better to me. To be fair, framerates are acceptable in almost all cases. In a couple of years, I might be singing a different tune if framerates drop off too much in future titles. Artifacts do bother me, and I can see them with upscaling.
 
I'm seeing some fuzziness on the fur of the bear, but not seeing the lighting issue you mentioned earlier. What am I missing, or is it just so small a difference that it doesn't come off well?

That's because I slightly edited the better starting image, the one whose lighting wasn't borked to begin with, and improved it. I can still improve the one that had worse lighting as well; it's just starting from a weaker point in the first place.

Here's the DLAA with edits. It helps for certain, but I still don't like it as much overall; some aspects are better in terms of crispness, but the lighting and shading wasn't as good to begin with, so while it's better than it was before the edits, it's still not as good as the DLSS was with edits, given that one had better lighting to start with and was further enhanced. It's a case of more natural lighting and shading versus more refined details. Lighting and shading help a lot with general depth perception; they bring you into the scene a lot more. One feels a lot more like you could reach out and touch the stuff within the scene, while the other feels flat, a bit fake, and not quite as 3D in terms of perception and depth. Basically, one does a better job of tricking the eye into thinking it's a real-life environment, even though it's still very much a fake 3D world.
[Screenshot: F-T-L-dlaa-v4.jpg — edited DLAA]
 
That's an assumption on your part. You have no real way of knowing what the divide is for each game. It's a variable that CAN NOT be known without the devs of each game declaring that information. Therefore your statement has little to no merit.

You're the one making assumptions. I'm gonna side with the idea that, because it's being worked on and they're talking about it and releasing the info publicly, they know better how their software works.

So again, your failure to understand doesn't mean anything to anyone else.

8 GB / 0.8 is just 10 GB.
Check the 3080 VRAM issue for more context.

It is a fact that many users have had complaints about the 3080's VRAM since 2021.

You can accuse me of making assumptions or anything else; it is meaningless.
Just a simple Google search and people quickly realize 10 GB is not enough.
There is no point debating simple facts.

You are just ignoring them and refusing to understand the reality.
 
As for the black screens, no one told me anything. Not sure why or if there was a bone to pick? I have multiple 5-series (5080s and 5090s) and a couple AMD cards (pre-9k) sitting around. The black screens are not Windows BSODs, but truly black screens. Some result in a hang, while some result in a spinning down of the card before recovering live. The behavior doesn't exist on the AMD cards or on the 5-series cards if DLSS is off. Someone else commented it is likely driver/game combinations. I tend to agree.

Regardless, what I've posted is the limit of my knowledge here. It sounds like DLSS may not provide any advantage for me if I'm happy with my framerates at the resolution and settings I play at. Thanks for the responses.

This might have been the DP 1.4 black screen bug resolved with 576.80, unless it's been going on pre-50 series too. I never had anything, but I use HDMI since I use an OLED TV as my monitor. Though, unsure if that caused TDRs (which is the behavior you described and I highlighted). Still, odd. It's not exactly a normal thing. DLSS lowers the overall power consumption but the card is still susceptible to transient spikes, it might be that your PSU hasn't been too happy with that load pattern. Hard to tell.


That's because I slightly edited the better starting image, the one whose lighting wasn't borked to begin with, and improved it. I can still improve the one that had worse lighting as well; it's just starting from a weaker point in the first place.

Here's the DLAA with edits. It helps for certain, but I still don't like it as much overall; some aspects are better in terms of crispness, but the lighting and shading wasn't as good to begin with, so while it's better than it was before the edits, it's still not as good as the DLSS was with edits, given that one had better lighting to start with and was further enhanced. It's a case of more natural lighting and shading versus more refined details.

I mean, that image in itself exceeded the forum's 16 MB limit for image attachments, so I had to convert it to JPG (albeit still at the 100% quality setting) to post. It's not a perfect representation of what the image comes off as in its purest form, although you can see where the DLSS blur tends to show if you're really careful - bear fur, some of the foliage; it handles grass surprisingly well. But this is feeding 25% of the pixel count, so three-fourths of the image is basically being inferred here. I think it's a decent result, one you'd find more than acceptable on a small monitor or laptop panel. I'm satisfied with the overall result even on a large 4K panel and with an eye for detail; I just have a problem when DLSS is flaunted as a fix for crap optimization (and sadly Oblivion is such a case).
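For context on the "25% of the pixel count" point: DLSS Performance renders at 50% of the output resolution per axis, so a quarter of the pixels, and the rest is reconstructed. Here's a quick sketch using the commonly quoted per-axis scale factors; games can override these, so treat them as the usual defaults rather than guarantees.

```python
# Quick check of the "25% of the pixel count" point: DLSS Performance renders
# at 50% of the output resolution per axis, so 0.5 * 0.5 = 25% of the pixels.
# Scale factors below are the commonly quoted per-axis values for each mode.

OUT_W, OUT_H = 3840, 2160   # 4K output

modes = {
    "DLAA":              1.000,
    "Quality":           0.667,
    "Balanced":          0.580,
    "Performance":       0.500,
    "Ultra Performance": 0.333,
}

for name, scale in modes.items():
    w, h = round(OUT_W * scale), round(OUT_H * scale)
    rendered = (w * h) / (OUT_W * OUT_H)
    print(f"{name:17s} {w}x{h}  ~{rendered:5.1%} rendered, ~{1 - rendered:5.1%} inferred")

# Performance at 4K comes out to 1920x1080: 25% of the pixels rendered,
# 75% of the final image reconstructed by the upscaler.
```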
 
This might have been the DP 1.4 black screen bug resolved with 576.80, unless it's been going on pre-50 series too. I never had anything, but I use HDMI since I use an OLED TV as my monitor. Though, unsure if that caused TDRs (which is the behavior you described and I highlighted). Still, odd. It's not exactly a normal thing. DLSS lowers the overall power consumption but the card is still susceptible to transient spikes, it might be that your PSU hasn't been too happy with that load pattern. Hard to tell.




I mean, that image in itself exceeded the forum's 16 MB limit for image attachments, so I had to convert it to JPG (albeit still at the 100% quality setting) to post. It's not a perfect representation of what the image comes off as in its purest form, although you can see where the DLSS blur tends to show if you're really careful - bear fur, some of the foliage; it handles grass surprisingly well. But this is feeding 25% of the pixel count, so three-fourths of the image is basically being inferred here. I think it's a decent result, one you'd find more than acceptable on a small monitor or laptop panel. I'm satisfied with the overall result even on a large 4K panel and with an eye for detail; I just have a problem when DLSS is flaunted as a fix for crap optimization (and sadly Oblivion is such a case).

I wouldn't personally use the one where the lighting and shading is thrown out the window. That's not a weather change or some minor compression anomaly from how the file got saved for upload to the forum, because if it were, it would be consistent across both images. It doesn't look like an in-game weather change causing it either; it's not simply that the scene is a bit lighter or darker because it got more cloudy, and actually, looking at both, the sky looks nearly identical. It literally broke the lighting and shading: the fall-off gradient from light to dark got stripped away and disposed of somehow in one versus the other. I'm not saying it didn't have some positives in spite of that, but overall I dislike it comparatively, for some of the reasons I touched upon. Really, neither option is perfect, and there are key differences between the two.

The DLSS shot seemed to be the one with the more ideal lighting and shading, while the DLAA shot had a bit more detail that you could notice in terms of crispness, but the latter's lighting was broken, so I just didn't like it as much; I felt it was harder to ignore where it was worse as opposed to better. To me it's far more obvious where it looked very clearly worse, as opposed to subtly better in some nuanced details while lacking sorely in other key places.
 