
NVIDIA DLSS Transformer Cuts VRAM Usage by 20%

AleksandarK

News Editor
Staff member
Joined
Aug 19, 2017
Messages
3,189 (1.11/day)
NVIDIA has officially released its DLSS Transformer model as part of the 310.3.0 SDK, combining advanced upscaling with a substantial reduction in video memory requirements. This update directly addresses the needs of gamers running graphics cards with 8 GB of VRAM or less by trimming VRAM usage by 20%. Unlike previous DLSS versions that relied on convolutional neural networks to infer missing pixels, the Transformer approach evaluates the relationships among all pixels in a frame and applies that understanding across multiple frames. Despite the increased sophistication of this method, NVIDIA's engineers have refined their memory management routines to maintain lean resource demands and deliver sharper, more consistent visuals.

Tested at regular resolutions, the improvements are striking: running DLSS at 1080p now consumes just 87.8 MB of VRAM, down from 106.9 MB in the prior SDK release, while similar reductions of around 20% apply at 1440p, 4K and even 8K. For those using GPUs with limited memory, these savings translate into smoother performance and the ability to enable richer graphics features without compromising image quality. Game developers and engine partners can expect to integrate the DLSS Transformer model into their titles and tools over the coming months, with early tests already highlighting crisper edges, more stable frame rates and consistently high upscaling performance.
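As a quick sanity check on the 1080p figures quoted above (a sketch; the MB values are the ones reported in the article, and the exact ratio works out to roughly 18%, which the article rounds to 20%):

```python
# Reported DLSS VRAM usage at 1080p (per the article).
cnn_mb = 106.9          # prior SDK, CNN model
transformer_mb = 87.8   # SDK 310.3.0, Transformer model

reduction = (cnn_mb - transformer_mb) / cnn_mb * 100
print(f"VRAM reduction at 1080p: {reduction:.1f}%")  # prints "VRAM reduction at 1080p: 17.9%"
```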



View at TechPowerUp Main Site | Source
 
We are meant not to question NV; we should buy more 50-series. Repeat after me: "the more you buy, the more you save"! /s
 
How about the Fake Frame generator? That still requires a huge amount of VRAM.
 
Solid move if you don't want to give your customers more VRAM while asking for more and more money.
 
Looks like this year Nvidia is fine-tuning its AI stuff... DLSS, frame gen... all to run better while lowering VRAM across the board...
How much does it help? Hope we can get some sort of benchmark going.
 
How much does it help? Hope we can get some sort of benchmark going.
Not much. There's very little NV can do to make their 8 GB cards magically outperform hypothetical 10 GB counterparts through driver and feature optimisations. Video games and tools are written by third parties who, more often than not, disregard NV's advancements entirely.

Of course an NV GPU with X GB VRAM will run out of VRAM later than an AMD GPU with the exact same amount because of better overflow handling and overall superior VRAM management (+ higher bandwidth if we compare 9070 VS 5070; 9060 VS 5060) but the difference isn't massive, 10 percent at the very most if we only look at the worst case scenarios for AMD.

3 gigabyte VRAM modules were and still are a must for 5060 series. 12 GB woulda made all the sense in the world... but alas.
 
Not much. There's very little NV can do to make their 8 GB cards magically outperform hypothetical 10 GB counterparts through driver and feature optimisations. Video games and tools are written by third parties who, more often than not, disregard NV's advancements entirely.

Of course an NV GPU with X GB VRAM will run out of VRAM later than an AMD GPU with the exact same amount because of better overflow handling and overall superior VRAM management (+ higher bandwidth if we compare 9070 VS 5070; 9060 VS 5060) but the difference isn't massive, 10 percent at the very most if we only look at the worst case scenarios for AMD.

3 gigabyte VRAM modules were and still are a must for 5060 series. 12 GB woulda made all the sense in the world... but alas.
Go watch videos on YT of AMD vs Nvidia side by side. You will notice Nvidia cards usually use around 10% less VRAM overall. So to say they can do very little, well, they can.
 
Very good. The DLSS 4 transformer model is fantastic, but I'm skipping the RTX 50 series; it's a joke. Hopefully the RTX 60 series gets TSMC N3 or better, and 16 GB VRAM at minimum for the 6070 tier, so it's actually an upgrade and not just a TDP increase.
 
Of course an NV GPU with X GB VRAM will run out of VRAM later than an AMD GPU with the exact same amount because of better overflow handling and overall superior VRAM management (+ higher bandwidth if we compare 9070 VS 5070; 9060 VS 5060) but the difference isn't massive, 10 percent at the very most if we only look at the worst case scenarios for AMD.
Don't know about Nvidia's superiority in what you mentioned, but I do know the RX 9060 XT has PCIe 5.0 x16 while the 5060 (Ti) has only PCIe 5.0 x8. This difference shows when VRAM is not enough.
3 gigabyte VRAM modules were and still are a must for 5060 series. 12 GB woulda made all the sense in the world... but alas.
On desktop you still have the option of buying the 16 GB version, at least with the 5060 Ti.
In my opinion the worst situation is in laptops; that's where we especially need 24 Gbit modules.
 
Come on, 8 GiB VRAM is enough. /Sarcasm.

That article sounds like more compression. I also experimented with file system compression; 20% is quite a usual rate for low-CPU file compression.
Edit: The common thing is the 20% compression. You may check the title: NVIDIA DLSS Transformer Cuts VRAM Usage by 20%.

That feature will be paid for with GPU performance, or with system DRAM and CPU performance. There is no free stuff.
 
DLSS Transformer is literally the first time that I would say NV has an undisputed “killer feature”. It's genuinely amazing tech; even Performance presets look very good with it. It's finally arrived at the “basically free performance” level DLSS was touted as for all these years, and it's nice that NV keeps optimizing it. I absolutely don't condone releasing 8 GB cards in 2025, but the market is what it is, unfortunately.

That article sounds like more compression. I also experimented with file system compression; 20% is quite a usual rate for low-CPU file compression.
Nothing in common whatsoever, but do go off.
 
I might be wrong, but doesn't DLSS Transformer also lower performance by ~10%?
 
DLSS Transformer is literally the first time that I would say NV has an undisputed “killer feature”. It's genuinely amazing tech; even Performance presets look very good with it. It's finally arrived at the “basically free performance” level DLSS was touted as for all these years, and it's nice that NV keeps optimizing it.
Yup, to me DLSS Transformer is a game changer, and it was the main deciding factor in going with an Nvidia card instead of an AMD card when I upgraded recently.
Yes, FSR 4 is good, but the support is just lacking. Nowadays I use DLSS in pretty much every game that supports it, and I will continue to do so, cause yeah, at this point it's like free performance, and at least I can get rid of that crappy TAA that most games seem to have.
 
I might be wrong, but doesn't DLSS Transformer also lower performance by ~10%?

Nowhere near 10%, even on older GPUs. It does take longer than the CNN model to infer, but worst case scenario you're looking at 3-5% if it's on a low end Ampere or Turing GPU with low tensor performance. TPU's showcase of the technology found 3% on the RTX 3060. Should be nearly indistinguishable (0-2%) on higher end previous generation GPUs or on Ada and Blackwell architecture ones, even on lower end models.
 
Nowhere near 10%, even on older GPUs. It does take longer than the CNN model to infer, but worst case scenario you're looking at 3-5% if it's on a low end Ampere or Turing GPU with low tensor performance. TPU's showcase of the technology found 3% on the RTX 3060. Should be nearly indistinguishable (0-2%) on higher end previous generation GPUs or on Ada and Blackwell architecture ones, even on lower end models.
Yeah, it was already more than fine on my previous 3060 Ti. I did notice a slight performance hit, but IMO it was still worth it over CNN Quality; heck, Transformer Balanced looks about the same as, if not better than, CNN Quality, at least in the games where I've tested them.
 
Yeah, it was already more than fine on my previous 3060 Ti. I did notice a slight performance hit, but IMO it was still worth it over CNN Quality; heck, Transformer Balanced looks about the same as, if not better than, CNN Quality, at least in the games where I've tested them.

Yup, the best case scenario IMO, other than people looking for max image quality by using DLAA with preset K, would be laptops with limited VRAM (4-6 GB): the transformer model is enough of a visual upgrade that you can safely drop from Balanced to Performance, gaining fps or lowering power consumption while retaining good image quality. That's where it'll win the most IMO, and I expect Switch 2 games will use this tech a lot.
 
Yup, the best case scenario IMO, other than people looking for max image quality by using DLAA with preset K, would be laptops with limited VRAM (4-6 GB): the transformer model is enough of a visual upgrade that you can safely drop from Balanced to Performance, gaining fps or lowering power consumption while retaining good image quality. That's where it'll win the most IMO, and I expect Switch 2 games will use this tech a lot.
DLAA was a bit too much for my 3060 Ti, especially with preset K, but that does indeed look the best; it's a clear upgrade over native, so for anyone who can handle it, that's the way to go.
If that tech is in a handheld, then that should be really nice for sure.
Currently I can do PT Cyberpunk with DLSS Transformer Balanced + Ray Reconstruction and a mix of high and ultra settings on top, and that way it's a solid 60+ FPS, which is a nice base for frame gen if I want to use it; the game plays and looks really damn good. 'Most likely how I'm gonna replay the game with Phantom Liberty once I get to it :laugh: '
 
DLAA was a bit too much for my 3060 Ti, especially with preset K, but that does indeed look the best; it's a clear upgrade over native, so for anyone who can handle it, that's the way to go.
If that tech is in a handheld, then that should be really nice for sure.
Currently I can do PT Cyberpunk with DLSS Transformer Balanced + Ray Reconstruction and a mix of high and ultra settings on top, and that way it's a solid 60+ FPS, which is a nice base for frame gen if I want to use it; the game plays and looks really damn good. 'Most likely how I'm gonna replay the game with Phantom Liberty once I get to it :laugh: '

Your new 5070 will do great there, that's for sure. And you can always enjoy most games with DLAA preset K included with it, as long as you can replace the DLSS DLL; the new DLLs will work in any game that supports DLSS 3. The only one I'm aware of where you cannot upgrade the DLL is Honkai Star Rail, because it'll fail the file integrity check, rendering the game unplayable. It has DLSS 3.7.10 though, so it still looks pretty good. It's unlikely miHoYo will keep it updated, though, considering they're as lazy as it gets with the PC versions of their games. I love their games, but their PC versions clearly aren't their highest priority.
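For anyone new to it, the DLL swap can be sketched as a tiny script. This is purely illustrative: `swap_dlss_dll` and the file layout are assumptions, not an NVIDIA tool, and the usual caveat about integrity-checked games applies.

```python
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir: str, new_dll: str) -> Path:
    """Back up a game's bundled nvngx_dlss.dll, then copy a newer one over it.

    Hypothetical helper for illustration only; real install paths vary,
    and some games integrity-check their files and will refuse to launch
    after a swap.
    """
    target = Path(game_dir) / "nvngx_dlss.dll"
    backup = target.with_name(target.name + ".bak")
    if target.exists() and not backup.exists():
        shutil.copy2(target, backup)  # keep the original for easy rollback
    shutil.copy2(new_dll, target)     # drop in the newer DLL
    return backup
```

Where a game is supported, overriding the preset through the Nvidia App is the safer route, since it avoids touching game files at all.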
 
I don't get the whining about the alleged lack of VRAM. If a certain nVidia card does not have as much VRAM as you need, then... I don't know... don't buy it, I guess? Or is anyone holding a gun to your head, forcing you to buy a card with what you perceive to be too little VRAM?

Alternatively, buy an AMD card since they seem to be giving away VRAM for free in spades, right?

Life could be so simple but no... here we go with another completely unnecessary whinefest :D .
 
Go watch videos on YT of AMD vs Nvidia side by side. You will notice Nvidia cards usually use around 10% less VRAM overall. So to say they can do very little, well, they can.
This is exactly what I said, so I don't know why you posted it in an arguing manner...
I do know the RX 9060 XT has PCIe 5.0 x16 while the 5060 (Ti) has only PCIe 5.0 x8
For 8 GB models it actually matters, but not THAT significantly. Sure, a cut-down NV card is worse in this regard, but not drastically. For 16 GB models, however, running on PCIe 4.0/5.0 guarantees you'll see no meaningful impact on either GPU. Maybe 1 to 3 percent here and there will be lost, but is it really a big deal?
On desktop you still have the option of buying the 16 GB version, at least with the 5060 Ti.
Sure, but it's ridiculously expensive. And I've yet to figure out why anyone in their right mind would seriously consider buying a laptop for gaming.
 
Your new 5070 will do great there, that's for sure. And you can always enjoy most games with DLAA preset K included with it, as long as you can replace the DLSS DLL; the new DLLs will work in any game that supports DLSS 3. The only one I'm aware of where you cannot upgrade the DLL is Honkai Star Rail, because it'll fail the file integrity check, rendering the game unplayable. It has DLSS 3.7.10 though, so it still looks pretty good. It's unlikely miHoYo will keep it updated, though, considering they're as lazy as it gets with the PC versions of their games. I love their games, but their PC versions clearly aren't their highest priority.
Currently I'm playing Wuthering Waves and its latest update with RT on max, with the latest preset (K) set in the App, and it seems to work pretty well; the game has a bottleneck somewhere else anyway, so I have the settings cranked up, since it's not my GPU that's limiting it. 'Supposedly it's an issue with this update, since it was fine in the other regions..'
Lately I don't do DLL swaps and only use the Nvidia App, as long as the game is supported there.
 