Saturday, July 13th 2019

Intel adds Integer Scaling support to their Graphics lineup

Intel's Lisa Pearce today announced on Twitter that the company has listened to user feedback from Reddit and will add nearest-neighbor integer scaling to its future graphics chips. Integer scaling is the holy grail for gamers using console emulators, because it gives them the ability to simply double, triple, or quadruple existing pixels without the loss of sharpness inherent in traditional upscaling algorithms like bilinear or bicubic. This approach also avoids the ringing artifacts that come with other, more advanced scaling methods.

In her Twitter video, Lisa explained that this feature will only be available on upcoming Gen 11 graphics and beyond - previous GPUs lack the hardware required to implement integer scaling. In terms of timeline, she mentioned that this will be part of the driver "around end of August", which also puts some constraints on the launch date of Gen 11 - based on that statement, it seems to be coming sooner rather than later.

It is unclear at this time whether the scaling method is truly "integer" or simply "nearest neighbor". While "integer scaling" is nearest neighbor at its core, i.e. it picks the closest pixel color and does no interpolation, the difference is that "integer scaling" uses only whole-number scale factors. For example, Zelda: Breath of the Wild runs at 900p natively, which would require a 2.4x scaling factor to fill a 4K display. Integer scaling would instead use a factor of 2x, resulting in an 1800p image with black borders around it - this is what gamers want. A plain nearest-neighbor image would not have the black bars, but to reach the 2.4x factor some pixels would be tripled while others are only doubled, resulting in a sub-optimal presentation with uneven pixel sizes.
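The scale-factor arithmetic above can be sketched in a few lines of Python (a hypothetical helper for illustration, not Intel's driver code): pick the largest whole-number factor that still fits the display, then center the scaled image inside black borders.

```python
def integer_scale(src_w, src_h, dst_w, dst_h):
    """Largest integer factor that fits the target display, plus the
    leftover black borders. A sketch of the idea, not driver logic."""
    factor = min(dst_w // src_w, dst_h // src_h)
    if factor < 1:
        raise ValueError("display is smaller than the source image")
    out_w, out_h = src_w * factor, src_h * factor
    border_x = (dst_w - out_w) // 2  # left/right border width, in pixels
    border_y = (dst_h - out_h) // 2  # top/bottom border height, in pixels
    return factor, out_w, out_h, border_x, border_y

# 900p (1600x900) on a 4K (3840x2160) display:
print(integer_scale(1600, 900, 3840, 2160))  # (2, 3200, 1800, 320, 180)
```

Note that 720p content is the lucky case: 1280x720 scales by exactly 3x to 3840x2160 with no borders at all.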

Update Jul 13: Intel has posted an extensive FAQ on their website, which outlines the details of their Integer Scaling implementation, and we can confirm that it is done correctly - the screenshots clearly show black borders all around the upscaled image, which is exactly what you would expect for scaling with integer scale factors. Intel does provide two modes, called "NN" (Nearest Neighbor) and "IS" (Integer Scaling).
Will Intel implement pure integer scaling with borders?

Yes, the driver being released in late August will provide users with the option to force integer scaling. The IS option will restrict scaling of game images to the greatest possible integer multiplier. The remaining screen area will be occupied by a black border, as mentioned earlier.
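Scaling by "the greatest possible integer multiplier" boils down to plain pixel replication: every source pixel becomes a factor-by-factor block of identical pixels, so no new colors are invented and edges stay perfectly sharp. A toy Python sketch (lists of pixel values standing in for an image; not Intel's implementation):

```python
def replicate(pixels, factor):
    """Upscale a 2D grid of pixel values by repeating each pixel into
    a factor x factor block (nearest neighbor restricted to an integer
    factor). Toy illustration only."""
    out = []
    for row in pixels:
        scaled_row = [p for p in row for _ in range(factor)]  # widen each row
        out.extend([scaled_row[:] for _ in range(factor)])    # repeat it vertically
    return out

tiny = [[0, 1],
        [1, 0]]
print(replicate(tiny, 2))
# [[0, 0, 1, 1], [0, 0, 1, 1], [1, 1, 0, 0], [1, 1, 0, 0]]
```

Because every output pixel is an exact copy of a source pixel, the result is "pixel perfect" - which is also why non-integer factors cannot be handled this cleanly.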
Sources: Twitter, FAQ on Intel Website

56 Comments on Intel adds Integer Scaling support to their Graphics lineup

#26
Recus
I checked Zelda Breath of the Wild Cemu 4k videos and they don't look blurry at all. Also you can always use ESRGAN.
#27
koblongata
Pixel perfect scaling not only increases clarity but contrast as well, this is very exciting.

I mean if your graphics card cannot drive 4K smoothly (I don't know, even the top of the line ones can struggle), it's gonna look much much worse than one that can scale 1080P to 4K perfectly.

Even 1080P movie contents will benefit from it. It's like once you see it you cannot go back kind of difference.
#28
bug
koblongata said:
Pixel perfect scaling not only increases clarity but contrast as well, this is very exciting.

I mean if your graphics card cannot drive 4K smoothly (I don't know, even the top of the line ones can struggle), it's gonna look much much worse than one that can scale 1080P to 4K perfectly.

Even 1080P movie contents will benefit from it. It's like once you see it you cannot go back kind of difference.
You can let the monitor handle the scaling though. I'd hope there are at least some monitors out there that can scale without interpolating.
#29
koblongata
bug said:
You can let the monitor handle the scaling though. I'd hope there are at least some monitors out there that can scale without interpolating.
Hey... Even a high end home theater receiver costing thousands of dollars cannot offer upscaling better than true "pixel perfect" integer upscaling...

The thing is in order to be pixel perfect, the signal has to be formatted correctly right from the source (in this case the video card).

It isn't as simple as you guessed buddy... the difference is literally night and day, it will make 1080P games look clear and vibrant like 4K games on a 4K TV, or else there won't be so many users requesting it so much for YEARS.
#30
bug
koblongata said:
Hey... Even a high end home theater receiver costing thousands of dollars cannot offer upscaling better than true "pixel perfect" integer upscaling...

The thing is in order to be pixel perfect, the signal has to be formatted correctly right from the source (in this case the video card).

It isn't as simple as you guessed buddy... the difference is literally night and day, it will make 1080P games look clear and vibrant like 4K games on a 4K TV, or else there won't be so many users requesting it so much for YEARS.
Yeah, I'm sure the signal needs to be nearly magical in order for the monitor to take a color and fill a 2x2 or 4x4 matrix with it :wtf:
#31
RoutedScripter
Good, I use dolphin-emu a lot and can relate, but I don't have an iGPU on my CPU.
#32
koblongata
bug said:
Yeah, I'm sure the signal needs to be nearly magical in order for the monitor to take a color and fill a 2x2 or 4x4 matrix with it :wtf:
There is no display or receiver or anything I am aware of that can do integer upscaling... Yeah, I guess it must be really hard or something :wtf:

Anyhow, I think it is better done within the video card though, so you can output your 1080P games/videos in pixel perfect 4K signal to any 4K TV, old or new.
#33
bug
koblongata said:
There is no display or receiver or anything I am aware of that can do integer upscaling... Yeah, I guess it must be really hard or something :wtf:

Anyhow, I think it is better done within the video card though, so you can output your 1080P games/videos in pixel perfect 4K signal to any 4K TV, old or new.
Well, if there's no display that can do that, then my assumption was wrong. That doesn't imply integer upscaling is hard though.
#34
ripsteakjaw
lexluthermiester said:
Good point.

Thing is, we never had that BITD. Because of the way TV CRT's worked, there was always the smoothing/blurring/blending effect caused by the way the electron gun beams scanned through the color masks and produced an image. Anyone who says they remember the "blocky" look is kinda fooling themselves because it just never happened that way. I'm not saying at all that it's wrong to prefer that look, just that we never had it back then because of the physical limitations of the display technology of the time.

So this Integer Scaling thing, while useful to some, isn't all that authentic for many applications.
The smeared-filter effect you prefer isn't any more "authentic." Playing a game on a CRT looks nothing like it. You can, however, apply CRT shaders to the pixelated look and get an authentic-looking picture, but not with that bilinear SuperEagle crap.
#35
lexluthermiester
ripsteakjaw said:
but not with that bilinear supereagle crap.
I didn't say anything about "SuperEagle". Not a fan of that filter. Please don't quote me with words I didn't say in the conversation.
#36
koblongata
bug said:
Well, if there's no display that can do that, then my assumption was wrong. That doesn't imply integer upscaling is hard though.
That's what really puzzles me too: if it is easy, why hasn't anyone done it? People have been asking for it for more than 10 years, since LCDs first came out.
Maybe streaming over HDMI isn't as lossless as we think, so you need more specific instructions for pixel-perfect alignment?
Or maybe it's just a consensus of the industry (display, media transport, video card makers, etc.) so that they can keep forcing people to upgrade even if not everyone needs it?
I really don't know, but I do know I really want Integer Scaling.
#37
RainingTacco
Finally I can buy a 4K monitor and not have to worry about playing at 1080p with a blurry image.
#38
bug
RainingTacco said:
Finally I can buy a 4K monitor and not have to worry about playing at 1080p with a blurry image.
Only if you play titles that Intel's IGPs can push :(
#39
Turmania
I think my eyes are failing me, or age is starting to show, as I don't see any difference between those two pictures.
#40
bug
Turmania said:
I think my eyes are failing me, or age is starting to show, as I don't see any difference between those two pictures.
Impossible. One picture shows the scaling methods in action side by side, while the other shows Ms. Pearce.
#41
lexluthermiester
I find it interesting that a little more than half of the people who voted in the poll said that they find this useful. I didn't realize that so many people even understood what Integer Scaling was let alone had an interest in it.
#42
erocker
lexluthermiester said:
I find it interesting that a little more than half of the people who voted in the poll said that they find this useful. I didn't realize that so many people even understood what Integer Scaling was let alone had an interest in it.
Not sure why you'd think a lot of people don't know what this is or whether they want it. Even if they didn't know before, the picture showing what it is is obviously of interest to most.
#43
TheDeeGee
Suddenly it feels good to still use a 16:10 Monitor =]
#44
RainingTacco
bug said:
Only if you play titles that Intel's IGPs can push :(
Yeah, I didn't read that it's for IGPs only. Oh well, I hope they will also include this technology in their standalone GPUs that are coming in 2020[?]
#45
lexluthermiester
erocker said:
Not sure why you wouldn't think a lot of people don't know what this is or if they want it? Even if they didn't know before, the picture showing what it is, is obviously of interest to most.
You say that like you think it's common knowledge. Not sure I'd agree with that. Seeing a picture only shows the result and W1zzard explained some of the details of what the effect is. However, most people do not understand how & why it actually works and thus why it can only be included in the latest drivers for the latest hardware. One would think that such a function could be retrofitted in driver updates to previous gen hardware, but it's not that simple. However, integer scaling can be done in software if the time is taken to implement it. Just takes much more processing time.
#46
Mescalamba
Yup, I'm positive Intel is gonna be the next good thing for GPUs.

Or at least I really hope so, because I'm a bit fed up with AMD and NVIDIA having what is nearly a cartel deal.
#47
bug
RainingTacco said:
Yeah, I didn't read that it's for IGPs only. Oh well, I hope they will also include this technology in their standalone GPUs that are coming in 2020[?]
It will come to their discrete cards, that's almost certain. I was just saying, for the time being, Intel's only graphics accelerators are their IGP.
#48
Shihabyooo
Wait, a data-generation algorithm that doesn't involve AI! What is this, 1995?

It's a nice option to have around, but I'm not sure it'll see much use. And I really don't see Nvidia and AMD needing it, considering how they can be either overpowered for applications that benefit from this or counter-productive (with AA) to have it with the ones used on their cards.

bug said:
Of all the scaling methods, this is the one that barely makes an impact if you implement it in software only.
Which is why I'm wondering why hardware support for it should be news to console emulators, many already catering to the pixel-perfect crowd (assuming I'm not misunderstanding the original post :| ).
#49
lexluthermiester
Shihabyooo said:
And I really don't see Nvidia and AMD needing it, considering how they can be either overpowered for applications that benefit from this or counter-productive (with AA) to have it with the ones used on their cards.
That, and the fact that their GPUs can effortlessly do it in software.
#50
Minus Infinity
bug said:
It's not entirely authentic, but it's as close to the original as possible.
When instead of one pixel you display 9 or 16, any interpolation method will be way softer than the original.
Like you said, you prefer a softer look. It's just that others prefer it the other way around (imagine that :D )
Try Topaz Gigapixel AI for upscaling; you will be amazed at what it can do. Obviously this tech is years away from use in graphics cards, but you can achieve amazing results.