
NVIDIA Beats Intel to Integer Scaling, but not on all Cards

Joined
Feb 20, 2019
Messages
281 (1.06/day)
System Name PowerEdge R730 DRS Cluster
Processor 4x Xeon E5-2698 v3
Cooling Many heckin screamy bois
Memory 480GB ECC DDR4-2133
Video Card(s) Matrox G200eR2
Storage SD Card. Yep, really no other local storage.
Display(s) It's probably a couple of boring Dell Ultrasharps and a sacrificial laptop.
Case 39U 6-rack server room with HVAC and 44kVA UPS
Mouse Maybe
Keyboard Yes!
Software ESXi 6.5 U3
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Heh, I was just tapping out of the argument; no reason to leave the discussion.

Anyway, the benefit of the blocky pixel look is that text is sharper and easier to read than it would be if it were interpolated, so it's useful for more than just games and emulation.

This is especially relevant on high-DPI displays such as 4K TVs and monitors, because the alternative is Windows DPI scaling, which is still very much a mixed bag in terms of quality and consistency - only about half of the Windows 10 interface, and even fewer applications, actually DPI-scale gracefully. Even where DPI scaling looks better in isolation, the jarring contrast between one application rendering text at native 4K with subpixel AA (ClearType) alongside another running at 1080p with bilinear filtering looks far worse than if the whole image were scaled using the same method.
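If you want to see the difference for yourself, here's a quick Pillow sketch (the filenames are placeholders) that upscales a 1080p capture to 4K both ways - nearest-neighbour is exactly integer scaling when the factor is a whole number, and bilinear is what display scalers typically give you:

```python
# Compare integer (nearest-neighbour) scaling vs bilinear upscaling.
from PIL import Image

src = Image.open("desktop_1080p.png")   # assumed 1920x1080 capture
size_4k = (src.width * 2, src.height * 2)

# Integer scaling: every source pixel becomes an exact 2x2 block - no blur.
src.resize(size_4k, Image.NEAREST).save("integer_scaled.png")

# Bilinear: single-pixel text edges get smeared across neighbours.
src.resize(size_4k, Image.BILINEAR).save("bilinear_scaled.png")
```

Zoom into any text in the two outputs and the difference is immediately obvious.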

Integer scaling faces one additional issue on the desktop, and that's subpixel AA.

With a typical RGB stripe, the horizontal resolution is effectively tripled by using the colour channels as extra edge samples - look up ClearType if you're not already familiar with it - and integer scaling fails there because the physical RGB stripe doesn't integer-scale along with the image.

Integer-scaling 1080p on a 4K monitor would only play nicely with subpixel AA if the physical stripe went RRGGBB across each pair of horizontal pixels instead of RGBRGB. The easy option is to just disable subpixel AA, but a smarter option would be for the graphics driver to be aware of both the subpixel AA and the physical stripe layout, so that it could remap the rendered subpixel AA output onto the native RGBRGB physical subpixels (using 2x integer scaling as the example here).
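Here's a rough sketch of the remap I mean - just my naive take, assuming an RGB-stripe panel and near-greyscale ClearType text, so each channel value approximates coverage at its subpixel's horizontal position:

```python
import numpy as np

def subpixel_aware_2x(frame: np.ndarray) -> np.ndarray:
    """(H, W, 3) uint8 frame -> (2H, 2W, 3), preserving subpixel detail."""
    h, w, _ = frame.shape
    # Flatten each row into a 3W-long stream of subpixel coverage samples.
    stream = frame.reshape(h, w * 3)
    # Integer-scale at the subpixel level: duplicate every sample once,
    # giving the 6W physical subpixels of the 2W destination pixels.
    stream_2x = np.repeat(stream, 2, axis=1)
    # Regroup into destination pixels: each channel now takes whichever
    # coverage sample lands on its physical subpixel's position.
    row_2x = stream_2x.reshape(h, w * 2, 3)
    # Plain duplication vertically (no subpixel structure in that axis).
    return np.repeat(row_2x, 2, axis=0)
```

In other words: integer scaling at the subpixel level instead of the pixel level, so the ClearType detail lands near the physical positions it was computed for rather than being duplicated wholesale.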



[image: ZSNES HQX upscaling example]

The ZSNES HQX option (above) would seem to be really well-suited to desktop UIs - certainly better than integer scaling (which some prefer to the default bilinear scaling).
If a GPU driver gave me an HQX option, I'd switch to it immediately - I'm just not sure what the GPU overhead would be, but you save so much by dropping from 4K rendering to 1080p rendering that I don't think it would matter on any GPU.
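For anyone curious what these filters actually do: hq2x itself is a big pattern-matching lookup table, but its much simpler cousin Scale2x (a.k.a. EPX/AdvMAME2x) fits in a few lines and shows the general idea - infer diagonal edges from the four neighbours instead of blindly duplicating pixels. A numpy sketch (mine, not ZSNES code; single-channel for simplicity):

```python
import numpy as np

def scale2x(img: np.ndarray) -> np.ndarray:
    """(H, W) image -> (2H, 2W) using the Scale2x/EPX rules."""
    h, w = img.shape
    p = np.pad(img, 1, mode="edge")   # so every pixel has 4 neighbours
    E = p[1:-1, 1:-1]                 # centre
    B = p[:-2, 1:-1]                  # up
    H_ = p[2:, 1:-1]                  # down
    D = p[1:-1, :-2]                  # left
    F = p[1:-1, 2:]                   # right
    out = np.empty((h * 2, w * 2), dtype=img.dtype)
    keep = (B != H_) & (D != F)       # only act where a diagonal is likely
    out[0::2, 0::2] = np.where(keep & (D == B), D, E)   # top-left
    out[0::2, 1::2] = np.where(keep & (B == F), F, E)   # top-right
    out[1::2, 0::2] = np.where(keep & (D == H_), D, E)  # bottom-left
    out[1::2, 1::2] = np.where(keep & (H_ == F), F, E)  # bottom-right
    return out
```

The hqx family applies the same neighbour-pattern idea with a far bigger rule set plus colour blending, which is why it looks so much smoother.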
 

las

Joined
Nov 14, 2012
Messages
773 (0.30/day)
Processor i9-9900K @ 5.2 GHz
Motherboard AsRock Z390 Extreme
Cooling Custom Water
Memory 32GB DDR4 4000/CL15
Video Card(s) 1080 Ti @ 2 GHz
Storage 970 Evo Plus 1TB + 64TB NAS
Display(s) Asus PG279Q 27" 1440p/IPS/165 Hz/Gsync + LG C7V 65" 2160p/OLED/HDR
Case Fractal Define Mini C
Audio Device(s) Asus Essence STX w/ Upgraded Op-Amps
Power Supply EVGA SuperNOVA 850 G2
Mouse Logitech G Pro Wireless
Keyboard Logitech G610 / MX Red + O Rings
Software Windows 10 Pro x64
I CAN'T LIVE without integer scaling.
 
Joined
Jul 19, 2006
Messages
42,964 (8.83/day)
Processor i7 8700K
Motherboard Asus Maximus Hero X WiFi
Cooling Water
Memory 32GB G.Skill 3200Mhz CL14
Video Card(s) GTX 1080
Storage SSD's
Display(s) Nixeus EDG27
Case Thermaltake Core X5
Audio Device(s) SoundBlaster Zx
Power Supply Corsair H1000i
Mouse Finalmouse Pro
Keyboard Razer BlackWidow Tournament Ed.
Why in the heck is this only enabled on Turing cards? Need to figure out a way around this.
 
Joined
Jul 5, 2013
Messages
7,457 (3.21/day)
Why in the heck is this only enabled on Turing cards? Need to figure out a way around this.
Based on how it's done, it should be trivial to enable on every card NVIDIA is still making drivers for.



[image: ZSNES HQX upscaling example]

The ZSNES HQX option (above) would seem to be really well-suited to desktop UIs - certainly better than integer scaling (which some prefer to the default bilinear scaling).
If a GPU driver gave me an HQX option, I'd switch to it immediately - I'm just not sure what the GPU overhead would be, but you save so much by dropping from 4K rendering to 1080p rendering that I don't think it would matter on any GPU.
I have to say, I still like the smoothed-out version. To me it just looks better.
 
Joined
Feb 20, 2019
Messages
281 (1.06/day)
System Name PowerEdge R730 DRS Cluster
Processor 4x Xeon E5-2698 v3
Cooling Many heckin screamy bois
Memory 480GB ECC DDR4-2133
Video Card(s) Matrox G200eR2
Storage SD Card. Yep, really no other local storage.
Display(s) It's probably a couple of boring Dell Ultrasharps and a sacrificial laptop.
Case 39U 6-rack server room with HVAC and 44kVA UPS
Mouse Maybe
Keyboard Yes!
Software ESXi 6.5 U3
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Yep, there are loads of pixel scaling methods, but I showed ZSNES's HQX filter above because I feel it's the best-suited filter for mixed graphics and text (i.e., games and applications).

In the absence of scaling options, we've had to make do with blurry interpolation for 20 years. Integer scaling is an improvement for people like me who don't like the blur, but the real issue is the lack of any options at all; we used to have all these options!

In the late '90s, as LCDs started replacing CRTs, blurry interpolation was mandatory because the hardware display scalers used simple bicubic resampling to handle non-integer scaling factors - 1024x768 on a 1280x1024 panel is 1.25x horizontally and 1.33x vertically, so source pixels can't map onto whole physical pixels. It was a one-size-fits-all solution that did the job adequately at the lowest price, requiring only a single fixed-function chip. Twenty years later, we're still tolerating the lame results of this lowest-common-denominator fixed-function scaling, despite GPUs now having a myriad of optimisations for edge detection, antialiasing, filtering, post-processing, and other image-improving features.

I prefer this integer-scaled image (currently Turing only)

[image: integer-scaled screenshot]

over this usual blurry display-scaled or GPU-scaled image

[image: blurry bilinear-scaled screenshot]

only because nobody currently offers anything better as an option.

The technology to make scaling better has existed for 20 years, yet Intel and Nvidia are willy-waving over the ability to turn off the bad filtering rather than giving us better filtering. Am I the only one who sees this as a dumb move - regression rather than progress?

I would like to see an options list like this; I mean, we already have a myriad of AA options for supersampling - why not apply the techniques developed over the last two decades to subsampling too? FXAA, SMAA, temporal AA (TXAA) - and probably more I've forgotten - are post-process filters that can be applied to any image. Games already apply them to upscaled images; that's what happens when you reduce the 'resolution scale' slider in conjunction with FXAA or TXAA.
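As a toy illustration of that pipeline (nothing like a production FXAA pass, just the general shape): integer-upscale the frame, then blend along detected luminance steps:

```python
import numpy as np

def integer_upscale(frame: np.ndarray, k: int) -> np.ndarray:
    """(H, W, 3) -> (kH, kW, 3) by pixel duplication."""
    return np.repeat(np.repeat(frame, k, axis=0), k, axis=1)

def edge_blend(frame: np.ndarray, threshold: float = 24.0) -> np.ndarray:
    """Soften hard stair-steps by averaging across strong luminance edges."""
    f = frame.astype(np.float32)
    luma = f @ np.array([0.299, 0.587, 0.114], dtype=np.float32)
    edge = np.abs(np.diff(luma, axis=1)) > threshold  # horizontal steps
    out = f.copy()
    out[:, :-1][edge] = (f[:, :-1][edge] + f[:, 1:][edge]) / 2
    return out.astype(np.uint8)

# e.g. frame_4k = edge_blend(integer_upscale(frame_1080p, 2))
```

A real FXAA/SMAA pass runs on the GPU and is much cleverer about edge direction and blend weights, but the point stands: the post-filter is independent of how the image was produced, so there's no reason it couldn't run on a scaled desktop.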

It can't be too hard to enable that outside of games, can it? It already happens when I use a VR headset to view my Windows desktop and web pages, so it's not a technical challenge; it just needs someone at Intel/AMD/Nvidia to expose it in the driver control panel for desktop use rather than keeping it gaming-only.
 