
NVIDIA Beats Intel to Integer Scaling, but not on all Cards

Heh, I was just tapping out of the argument, no reason to leave the discussion.

Anyway, the benefit to the blocky pixel look is that text is sharper and easier to read than it would be if it were interpolated, so it's more than just games and emulation.

This is especially relevant on high-DPI displays such as 4K TVs and monitors, because the alternative is Windows DPI scaling, which is still very much a mixed bag in terms of quality and consistency: only about half of the Windows 10 interface, and even fewer applications, actually DPI-scale gracefully. Even if DPI scaling looks better in isolation, the jarring contrast between one application rendering text at native 4K with subpixel AA (ClearType) alongside another running at 1080p with bilinear filtering looks far worse than if the whole image were scaled using the same method.

Integer scaling faces one additional issue on the desktop, and that's subpixel AA.

With a typical RGB stripe, the effective horizontal resolution is tripled by treating the individual colour channels as edges - look up ClearType if you're not already familiar with it - and integer scaling breaks that, because the physical RGB stripe doesn't integer-scale along with the image.

Integer-scaling 1080p with subpixel AA onto a 4K monitor would only work properly if the physical stripe were RRGGBB across each pair of horizontal pixels instead of RGBRGB. The easy option is to just disable subpixel AA, but a smarter one would be for the graphics driver to be aware of both the subpixel AA and the physical stripe layout, so it could remap the rendered RGB subpixel output onto the native RGBRGB physical pixels (using 2x integer scaling as the example).
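For what it's worth, the integer-scaling half of this is trivial. Here's a minimal nearest-neighbour sketch in Python (NumPy and the 1080p-to-4K shapes are just my illustration, not anything from a driver):

Code:
import numpy as np

def integer_scale(image, factor=2):
    # Nearest-neighbour integer upscale: every source pixel becomes a
    # factor x factor block of identical pixels, so no new colour values
    # are invented and edges stay perfectly sharp.
    # image is an (H, W, 3) array of RGB values.
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

# 1080p framebuffer -> 4K output (2x in each dimension)
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = integer_scale(frame_1080p, factor=2)  # shape (2160, 3840, 3)

After the 2x step, each logical pixel spans two physical RGB triplets horizontally, which is exactly where ClearType's assumption of one triplet per pixel breaks down.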

Test_nn.png
Test_hq3x.png


The ZSNES HQX option (above) would seem to be really well-suited to desktop UIs - certainly better than integer scaling (which some prefer to the default bilinear scaling).
If a GPU driver gave me an HQX option, I'd switch to it immediately - I'm just not sure what the GPU overhead would be, but you save so much by dropping from 4K rendering to 1080p rendering that I don't think it would matter on any GPU.
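I don't have the real HQX kernels to hand - HQ2x/HQ3x compare neighbours in YUV space and drive big lookup tables of blend rules - but the much simpler Scale2x/EPX rule gives the flavour of how these pattern-based scalers keep edges instead of smearing them. A rough Python sketch (function name and plain nested lists are just for illustration):

Code:
def scale2x(src):
    # Scale2x / EPX: each pixel expands to a 2x2 block, and a corner copies
    # a neighbour only when two adjacent neighbours agree, which preserves
    # diagonal edges instead of leaving blocky staircases.
    h, w = len(src), len(src[0])
    out = [[None] * (w * 2) for _ in range(h * 2)]
    for y in range(h):
        for x in range(w):
            p = src[y][x]
            a = src[y - 1][x] if y > 0 else p      # above
            b = src[y][x + 1] if x < w - 1 else p  # right
            c = src[y][x - 1] if x > 0 else p      # left
            d = src[y + 1][x] if y < h - 1 else p  # below
            out[2 * y][2 * x]         = a if (c == a and c != d and a != b) else p
            out[2 * y][2 * x + 1]     = b if (a == b and a != c and b != d) else p
            out[2 * y + 1][2 * x]     = c if (d == c and d != b and c != a) else p
            out[2 * y + 1][2 * x + 1] = d if (b == d and b != a and d != c) else p
    return out

HQX goes further by blending rather than just copying, which is why it handles gradients and anti-aliased text so much better, but the neighbourhood-pattern idea is the same.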
 
I CAN'T LIVE without integer scaling.
 
Why in the heck is this only enabled on Turing cards? Need to figure out a way around this.
 
Why in the heck is this only enabled on Turing cards? Need to figure out a way around this.
Based on how it's done, it should be trivial to enable on all the cards NVIDIA is still making drivers for.

Test_nn.png
Test_hq3x.png


The ZSNES HQX option (above) would seem to be really well-suited to desktop UIs - certainly better than integer scaling (which some prefer to the default bilinear scaling).
If a GPU driver gave me an HQX option, I'd switch to it immediately - I'm just not sure what the GPU overhead would be, but you save so much by dropping from 4K rendering to 1080p rendering that I don't think it would matter on any GPU.
I have to say, I still like the smoothed-out version. To me it just looks better.
 
Yep, there are loads of pixel-scaling methods, but I showed ZSNES's HQX filter above because I feel it's the best-suited filter for mixed graphics and text (i.e., games and applications).

In the absence of scaling options, we've had to make do with blurry interpolation for 20 years. Integer scaling is an improvement for people like me who don't like the blurring, but the real issue is the lack of any options at all - we used to have all of these options!

In the late '90s, as LCD displays started replacing CRTs, blurry interpolation filtering was mandatory because the hardware display scalers used simple bicubic resampling to handle non-integer scaling factors like 1024x768 on a 1280x1024 LCD screen. It was a one-size-fits-all solution that did the job adequately for the lowest price, requiring only a single fixed-function chip. Twenty years later, we're still tolerating the lame results of that lowest-common-denominator fixed-function scaling, despite the fact that GPUs now have a myriad of optimisations for edge detection, antialiasing, filtering, post-processing and other image-improving features.
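For context, that interpolation-based resampling is conceptually very simple. Here's a bilinear version (rather than bicubic, to keep it short) sketched in Python with NumPy - the function name is mine and this just shows the idea, not what any particular scaler chip does:

Code:
import numpy as np

def bilinear_resize(image, out_h, out_w):
    # For every output pixel, find the fractional source position and blend
    # the four nearest source pixels by distance. Works for any scaling
    # factor, but smears every hard edge in the process.
    in_h, in_w = image.shape[:2]
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1)
    x1 = np.minimum(x0 + 1, in_w - 1)
    fy = (ys - y0)[:, None, None]
    fx = (xs - x0)[None, :, None]
    f = image.astype(np.float32)
    top    = f[y0][:, x0] * (1 - fx) + f[y0][:, x1] * fx
    bottom = f[y1][:, x0] * (1 - fx) + f[y1][:, x1] * fx
    return ((1 - fy) * top + fy * bottom).astype(image.dtype)

# e.g. the classic 1024x768 desktop stretched onto a 1280x1024 panel
frame = np.zeros((768, 1024, 3), dtype=np.uint8)
stretched = bilinear_resize(frame, 1024, 1280)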

Test_nn.png
I prefer this integer-scaled image (currently Turing only)

3sWnFGD.png
over this usual blurry display-scaled or GPU-scaled image

Test_hq3x.png
only because nobody is currently offering this as an option.

The technology to make scaling better has existed for 20 years, yet Intel and Nvidia are willy-waving over the ability to turn off the bad filtering rather than giving us better filtering. Am I the only one who sees this as a dumb move - regression rather than progress?

I would like to see an options list like this; I mean, we already have a myriad of AA options for supersampling - why not apply the techniques developed over the last two decades to subsampling? There's FXAA, SMAA, Temporal AA (TXAA) - and probably more I've forgotten about - which are post-process filters that can be applied to any image. Games already apply these filters to upscaled images; that's what happens when you reduce the 'resolution scale' slider in conjunction with FXAA or TXAA etc.
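To be concrete about that pipeline: render below native resolution, upscale, then run a screen-space pass over the result. A very rough Python/NumPy sketch of the idea - this is not FXAA or SMAA, just a contrast-weighted neighbour blend standing in for whatever post filter a driver might run, and all the names are mine:

Code:
import numpy as np

def crude_edge_smooth(img):
    # Stand-in for a real post-AA pass: blend each pixel towards the average
    # of its four neighbours, but only where local luma contrast is high.
    # Assumes 8-bit RGB input.
    f = img.astype(np.float32)
    luma = f @ np.array([0.299, 0.587, 0.114], dtype=np.float32)
    pad = np.pad(f, ((1, 1), (1, 1), (0, 0)), mode="edge")
    avg = (pad[:-2, 1:-1] + pad[2:, 1:-1] + pad[1:-1, :-2] + pad[1:-1, 2:]) / 4
    lpad = np.pad(luma, 1, mode="edge")
    contrast = np.abs(lpad[:-2, 1:-1] - lpad[2:, 1:-1]) + \
               np.abs(lpad[1:-1, :-2] - lpad[1:-1, 2:])
    weight = np.clip(contrast / 255.0, 0.0, 1.0)[..., None]
    return (f * (1 - weight) + avg * weight).astype(img.dtype)

def upscale_then_filter(low_res_frame, scale=2):
    # What the 'resolution scale' slider effectively does: render small,
    # upscale to native, then let a post-process pass clean up the edges.
    up = np.repeat(np.repeat(low_res_frame, scale, axis=0), scale, axis=1)
    return crude_edge_smooth(up)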

It can't be too hard to enable that feature outside of games, can it? It already happens if I use a VR headset to view my Windows desktop and webpages, so it's not a technical challenge; it just needs someone at Intel/AMD/Nvidia to expose it in the driver control panel for use outside of gaming rather than as a gaming-only option.
 