
Lossless scaling

Processor Ryzen 7 5800X3D
Motherboard MSI Pro B550M-VC Wifi
Cooling Thermalright Peerless Assassin 120 SE
Memory 2x16GB G.Skill RipJaws DDR4-3600 CL16
Video Card(s) Asus DUAL OC RTX 4070 Super
Storage 4TB NVME, 2TB SATA SSD, 4TB SATA HDD
Display(s) Dell S2722DGM 27" Curved VA 1440p 165hz
Case Fractal Design Pop Air Mini
Power Supply Corsair RMe 750W 80+ Gold
Mouse Logitech G502 Hero
Keyboard GMMK TKL RGB Black
VR HMD Oculus Quest 2
What's the deal with Lossless Scaling? Does it actually deliver on its promises? Is it worth using if you have a capable modern GPU and there's a native upscaling and frame gen solution in the game you're playing?
 
The trouble with Lossless Scaling is that it's behind a paywall, and it's technically DRM too.
I was wondering why reviewers haven't compared it against other upscalers so far, and this may be the main reason.
Features such as DLSS, FSR, and XeSS are technically free as long as the hardware you buy supports them.

My opinion is that those features weren't "technically free" either, as the prices of cards have increased substantially with their inclusion, sometimes to more than triple or quadruple their original tier pricing. There's no doubt that Nvidia running AI training endlessly for six years straight for DLSS contributed to the rising cost of their cards. The power consumed by that usage isn't free.

The dual-GPU frame gen thing looks interesting, since you can have an iGPU run the algorithm and don't need two of the same card.
 
What's the deal with Lossless Scaling? Does it actually deliver on its promises? Is it worth using if you have a capable modern GPU and there's a native upscaling and frame gen solution in the game you're playing?

It works, but don't expect non-native implementations of DLSS/FSR to work as well as native ones for the most part.

Better yet, just use Magpie. It's free and more flexible than Lossless Scaling.
 
I have Lossless Scaling on Steam. It requires a lot of math to get it to work right: you have to know your resolution, then what scaling to use, etc. I honestly find it more annoying than useful. I still haven't found a good use case for it; I wasted 5 bucks on it, but oh well.
 
I use Lossless Scaling for frame gen in Cyberpunk. It works well for its intended purpose, and for me it works better than DLSS frame gen: less blurry. It works very well with few artifacts when the framerate is above 30 fps. I tried frame gen on my laptop with the iGPU (Intel UHD on an i5-11400H) doing the frame gen while the dGPU (RTX 3050) ran the game. It didn't work as well as I'd hoped.
 
I used it back when driver-based integer scaling was not an option for my video card due to a greedy vendor. The developer has since added a tonne more features. It's worth every penny, in my opinion.
 
What's the deal with Lossless Scaling? Does it actually deliver on its promises? Is it worth using if you have a capable modern GPU and there's a native upscaling and frame gen solution in the game you're playing?

It works, but at the same time it doesn't work very well. Based on my experience using it for a short period on my old RTX 3090, the performance hit was a lot higher compared to hardware-based upscaling like DLSS. The image quality is also not as good as DLSS 3, which is to be expected. Maybe it would compare more favorably with FSR 2 or 3.

I didn't bother to try the frame gen because I could just use FSR 3 frame gen instead.
 
I was using Lossless Scaling as a means to view YouTube at 3K when the source was only 1080p :D. You can scale any window, and for older games, or games that run at low res, this is quite fun. I've upscaled Baba Is You so that it has almost no borders, and the textures are nicer to the eye.

Upscaled: photo_2025-04-11_07-29-09 (2).jpg
Not upscaled: photo_2025-04-11_07-29-09.jpg
I'm thinking of using it as a scaler for old TV shows that are in 720p or 1080p.
 
"Lossless scaling" sounds like an oxymoron to me...
 
It refers to the Integer Scaling they started out with.
I got that; I still think it doesn't make much sense. "Lossless" was used in the context of data compression because the operation was meant to be reversed. Scaling changes the data for end consumption. In cases where content is scaled as a means of compression, all scaling would be lossy.
 
I did try it a couple of times, but it seemed more trouble than it's worth, since native just works better right now.
 
I got that; I still think it doesn't make much sense. "Lossless" was used in the context of data compression because the operation was meant to be reversed. Scaling changes the data for end consumption. In cases where content is scaled as a means of compression, all scaling would be lossy.

Integer Scaling takes each pixel from the input and multiplies it. At 2x scaling, each source pixel is replaced by a 2x2 square with the same color. Not a single bit of source data is lost or compressed, and the process is reversible. Contrast with commonly used bilinear interpolation scaling which is lossy and causes blur.
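To make that concrete, here's a minimal Python sketch of integer (nearest-neighbour) scaling on a grid of pixel values, showing that the original image is recovered exactly from the upscaled one. The function names are my own illustration, not any real API.

```python
def integer_upscale(pixels, factor):
    # Each source pixel becomes a factor x factor block of the same value,
    # so no source data is altered or discarded.
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in pixels
        for _ in range(factor)
    ]

def integer_downscale(pixels, factor):
    # Exact reversal: sample one pixel from each factor x factor block.
    return [row[::factor] for row in pixels[::factor]]

src = [[1, 2], [3, 4]]
up = integer_upscale(src, 2)
# up == [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
assert integer_downscale(up, 2) == src  # lossless round trip
```

Bilinear scaling, by contrast, replaces each output pixel with a weighted average of neighbours, so the round trip generally doesn't reproduce the source exactly.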
 
"Blur" is not "loss." Both bilinear and nearest-neighbour upscaled images would still contain the original data. Reversal maths would be complicated in bilinear case (not by much, if the original grid structure was simply subdivided), no doubt. But, and this was is the point here: Reversal is not required or considered here in the first place!
 
"Blur" is not "loss." Both bilinear and nearest-neighbour upscaled images would still contain the original data. Reversal maths would be complicated in bilinear case (not by much, if the original grid structure was simply subdivided), no doubt. But, and this was is the point here: Reversal is not required or considered here in the first place!

I hope you're making sense to someone because you're certainly not making any sense to me. Integer scaling will remain lossless regardless of how you feel about it. This application was named accurately based on that fact. We're done here.
 
For the upscaling part, Lossless Scaling is not really that impressive; it's not really better than the driver-level upscaling from AMD and Nvidia. But it is easily applicable to pretty much any game and video source, so it has its uses.

For frame generation, Lossless Scaling actually does something neither AMD nor Nvidia has thought of yet, and I think it's brilliant. Dare I say, it will be the way forward for FG in the coming years. That something is called Adaptive Frame Generation.
Instead of sticking with the usual fixed 2x, 3x, 4x, or 10x frame multipliers, Adaptive mode sticks to a given target framerate: you basically input your monitor's refresh rate and you always get that many frames per second, so you always get as much of the motion-fluidity benefit of FG as your monitor can show. This is better than locking a fixed mode to your monitor's refresh rate, because you don't lose fluidity in a heavier scene your card can't handle, and you don't have to put up with increased input lag when a scene is much easier to run, since fixed modes lock you to a certain base in-game framerate. It's also just much easier to use, since you don't need to hunt for a good framerate/quality setting to get the best out of FG.
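As a rough sketch of the scheduling problem an adaptive mode has to solve (my own illustration; the names and logic are not LS's actual implementation): given a base framerate and a target refresh rate, decide how many frames to synthesize after each rendered frame, spreading the fractional remainder evenly.

```python
def generation_plan(base_fps, target_fps):
    # How many frames to synthesize after each of the base_fps rendered
    # frames so total output hits target_fps; the fractional remainder
    # is spread evenly across the second, Bresenham-style.
    extra_total = max(target_fps - base_fps, 0)
    plan, acc = [], 0
    for _ in range(base_fps):
        acc += extra_total
        n, acc = divmod(acc, base_fps)
        plan.append(n)
    return plan

plan = generation_plan(60, 144)
assert sum(plan) == 84  # 60 rendered + 84 generated = 144 frames/s
```

At a 60 fps base and a 144 Hz target the ratio is a non-integer 2.4x, so the plan mixes one and two generated frames per rendered frame; a fixed 2x or 3x mode would have to undershoot or overshoot the display instead.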
LS also has a Flow Scale setting for its motion prediction, which you can lower to reduce FG overhead and gain motion fluidity. I'd say the whole experience is pretty close to AFMF2, so it's actually quite enjoyable.
Of course, in-game FG has its motion-vector advantages, but LS can work in any game on any card. You don't even need a discrete card; it runs on integrated graphics too.

So all in all, I think the LS dev is doing something great. Though it can never actually beat natively supported DLSS, FSR, or XeSS, it's more accessible than they are and actually has the potential to push FG tech toward better implementations.
 
Personally, with a big library of games, I have used it on titles that were locked to 30 or 60 fps on PC. I'm now able to play those same games frame-genned to 180 Hz with extreme fluidity, no corruption, and minimal input lag. It really feels great.
 
I like it for making videos smoother, but I prefer AFMF2 as an FG alternative in games. It's simpler and feels more responsive to me. Could be placebo.
 
"Blur" is not "loss." Both bilinear and nearest-neighbour upscaled images would still contain the original data. Reversal maths would be complicated in bilinear case (not by much, if the original grid structure was simply subdivided), no doubt. But, and this was is the point here: Reversal is not required or considered here in the first place!
I know you are correct, but you have to remember the target audience here is largely gamers, and they really don't care how it works as long as it does.

I don't mean that as an insult to gamers, by the way. We all get there at some point. It's fine to want to kick back and not care how the pixels occurred on your screen.
 
What's the deal with Lossless Scaling? Does it actually deliver on its promises? Is it worth using if you have a capable modern GPU and there's a native upscaling and frame gen solution in the game you're playing?
I have used Lossless Scaling ever since the X3 mode was added, and since July 2024 I've used it in dual-GPU mode. It's made a massive difference: I don't drop base FPS, so I get better input latency and quality than running FSR/AFMF FG (no DLSS, since I use an RTX 3060; my RTX 3060 + RX 6600M combination cost me $325, a banger of a combo for what it offers), while keeping a lot of customization. The scaling functions look okay, but most people want it for the frame generation. You can even pair it with some processors' integrated graphics and lower the flow scale; this let my RTX 3050 laptop run frame generation on the integrated graphics and max out the 144 Hz display much more easily in modern games.
 