
Intel adds Integer Scaling support to their Graphics lineup

W1zzard

Intel's Lisa Pearce today announced on Twitter that the company has listened to user feedback from Reddit and will add nearest-neighbor integer scaling to its future graphics chips. Integer scaling is the holy grail for gamers using console emulators, because it lets them simply double, triple, or quadruple existing pixels without the loss of sharpness inherent in traditional upscaling algorithms like bilinear or bicubic. This approach also avoids the ringing artifacts that come with other, more advanced scaling methods.
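For readers wondering what that means in practice: nearest-neighbor integer scaling simply repeats each source pixel a whole number of times in each direction, so no new colors are ever invented. A minimal sketch in Python, assuming a numpy image array - the helper below is our illustration, not Intel's driver code:

```python
import numpy as np

def integer_upscale(image: np.ndarray, factor: int) -> np.ndarray:
    """Repeat every pixel `factor` times along height and width."""
    scaled = np.repeat(image, factor, axis=0)  # duplicate rows
    return np.repeat(scaled, factor, axis=1)   # duplicate columns

# A 2x2 "image" doubled to 4x4 - every pixel becomes a sharp 2x2 block.
src = np.array([[1, 2],
                [3, 4]])
print(integer_upscale(src, 2))
```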

In her Twitter video, Lisa explained that this feature will only be available on upcoming Gen 11 graphics and beyond - previous GPUs lack the hardware required to implement integer scaling. In terms of timeline, she mentioned that it will be part of the driver "around end of August", which also puts some constraints on the launch date of Gen 11 - based on that statement, it seems to be coming sooner rather than later.





It is unclear at this time whether the scaling method is truly "integer" or simply "nearest neighbor". While "integer scaling" is nearest neighbor at its core, i.e. it picks the closest pixel color and does no interpolation, the difference is that "integer scaling" uses only whole-number scale factors. For example, Zelda: Breath of the Wild runs at 900p natively, which would require a 2.4x scaling factor to fill a 4K (2160p) display. Integer scaling would instead use a factor of 2x, resulting in an 1800p image with black borders on top and bottom - this is what gamers want. The nearest-neighbor image would not have the black bars, but some pixels would be tripled while others are doubled to achieve the overall 2.4x factor, resulting in uneven pixel sizes and a sub-optimal presentation.
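The arithmetic behind that example is simple enough to show directly. A rough sketch of both factors for a 900p image on a 2160-line panel, in plain Python and purely illustrative:

```python
native, panel = 900, 2160         # source lines, display lines

int_factor = panel // native      # 2 -> the "integer scaling" factor
scaled = native * int_factor      # 1800 lines of actual image
border = (panel - scaled) // 2    # 180 black lines top and bottom

nn_factor = panel / native        # 2.4 -> plain nearest neighbor fills
                                  # the panel, but pixels come out
                                  # unevenly doubled/tripled

print(int_factor, scaled, border, nn_factor)  # 2 1800 180 2.4
```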

Update Jul 13: Intel has posted an extensive FAQ on their website, which outlines the details of their Integer Scaling implementation, and we can confirm that it is done correctly - the screenshots clearly show black borders all around the upscaled image, which is exactly what you would expect from scaling with integer factors. Intel provides two modes, called "NN" (Nearest Neighbor) and "IS" (Integer Scaling).

Will Intel implement pure integer scaling with borders?

Yes, the driver being released in late August will provide users with the option to force integer scaling. The IS option will restrict scaling of game images to the greatest possible integer multiplier. The remaining screen area will be occupied by a black border, as mentioned earlier.
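Putting the pieces together, the "IS" mode described above boils down to something like the following sketch, assuming a numpy (height, width, channels) frame. The function name and structure are our illustration, not Intel's driver code:

```python
import numpy as np

def scale_with_borders(frame: np.ndarray, panel_h: int, panel_w: int) -> np.ndarray:
    """Upscale by the greatest integer multiplier that fits, center the
    result, and leave the remaining screen area black."""
    h, w = frame.shape[:2]
    factor = min(panel_h // h, panel_w // w)    # greatest integer multiplier
    scaled = frame.repeat(factor, axis=0).repeat(factor, axis=1)
    canvas = np.zeros((panel_h, panel_w) + frame.shape[2:], dtype=frame.dtype)
    top = (panel_h - scaled.shape[0]) // 2      # border sizes; the rest of
    left = (panel_w - scaled.shape[1]) // 2     # the canvas stays black
    canvas[top:top + scaled.shape[0], left:left + scaled.shape[1]] = scaled
    return canvas

# A 1600x900 frame on a 3840x2160 panel -> 2x scale, 180-line bars top/bottom
frame = np.zeros((900, 1600, 3), dtype=np.uint8)
print(scale_with_borders(frame, 2160, 3840).shape)  # (2160, 3840, 3)
```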



View at TechPowerUp Main Site
 
This is pretty cool - finally an end to blurry upscaling.

From INTEL no less. Damn
 
Haha, time to make an update - she is probably 10-15 years younger in her profile picture.

It just undermines her credibility as a person, when working a high-profile job, not to represent her real self.

 
Oh it is Lisa......wait it is Intel.
 
Thank you!
AMD and NVIDIA, you're next!
 
I find it funny this has been blocked by lack of hardware support. Of all the scaling methods, this is the one that barely makes an impact if you implement it in software only.
 
Last edited:
I find it funny this has been blocked by lack of hardware support. Of all the scaling methods, this is the one that barely makes an impact if you implement it in software only.
Yeah I don't buy the "implementing it on older generations would be a hack" argument either
 
I actually prefer Bilinear and Trilinear "scaling" filters as they give a softer blending effect. I've never been a fan of the sharp-edge pixel look. But I digress...
I believe this is catering to those that prefer the original, blocky look. It can also be a bonus when eyesight starts failing us ;)
 
That only works if you have multiple mipmapped images, which isn't the case here.
Good point.
I believe this is catering to those that prefer the original, blocky look.
Thing is, we never had that BITD. Because of the way TV CRTs worked, there was always a smoothing/blurring/blending effect caused by the way the electron gun's beams scanned through the color mask and produced an image. Anyone who says they remember the "blocky" look is kinda fooling themselves, because it just never happened that way. I'm not saying at all that it's wrong to prefer that look, just that we never had it back then because of the physical limitations of the display technology of the time.

So this Integer Scaling thing, while useful to some, isn't all that authentic for many applications.
 
Good point.

Thing is, we never had that BITD. Because of the way TV CRTs worked, there was always a smoothing/blurring/blending effect caused by the way the electron gun's beams scanned through the color mask and produced an image. Anyone who says they remember the "blocky" look is kinda fooling themselves, because it just never happened that way. I'm not saying at all that it's wrong to prefer that look, just that we never had it back then because of the physical limitations of the display technology of the time.

So this Integer Scaling thing, while useful to some, isn't all that authentic for many applications.
It's not entirely authentic, but it's as close to the original as possible.
When instead of one pixel you display 9 or 16, any interpolation method will be way softer than the original.
Like you said, you prefer a softer look. It's just that others prefer it the other way around (imagine that :D )
 
When instead of one pixel you display 9 or 16, any interpolation method will be way softer than the original.
It's a bit more complicated than that, but yes.

Like you said, you prefer a softer look. It's just that others prefer it the other way around (imagine that :D )
I understand this, of course. Just saying that it's not authentic, due to the way the display technology of the time produced images, whether watching TV broadcasts, VHS/LaserDisc movies, video games or even early computers.
 
I just want Intel to release a dedicated post-processing card based around FPGAs, where you can easily toggle and reconfigure the hardware to maximize performance, a bit like UAD DSPs for audio. Just have it post-process an incoming HDMI/DisplayPort signal and output it to the display without adding more than 1-4 ms of latency. I'd be happy if it could do a lot of post-process effects like ReShade without negatively impacting performance in the process. The one thing Intel really needs to push and emphasize is taking advantage of its FPGA tech for both its CPUs and GPUs; bundling in a little of it on both could go a long way. Eventually some of the FPGA duties could be moved to fixed-function ASIC hardware, but FPGAs are flexible, and having a bit of that can be nice.
 
Definitely a nice feature to have, now leak some juicy info about your dGPU please
 
I don't want to see Lisa Pearce on an 8K TV, I want to see her in real life. She's looking nice.
 
Haha, time to make an update - she is probably 10-15 years younger in her profile picture.

It just undermines her credibility as a person, when working a high-profile job, not to represent her real self.
Both pictures are on her Twitter page. What do her looks matter? She is paid for her brain.
 
Yeah, now if only I could run CEMU on Intel graphics.
 