
Intel adds Integer Scaling support to their Graphics lineup

I checked some 4K videos of Zelda: Breath of the Wild running in Cemu and they don't look blurry at all. Also, you can always use ESRGAN.
 
Pixel-perfect scaling not only increases clarity but contrast as well; this is very exciting.

I mean, if your graphics card cannot drive 4K smoothly (even top-of-the-line ones can struggle), it's gonna look much, much worse than one that can scale 1080p to 4K perfectly.

Even 1080p movie content will benefit from it. It's a once-you-see-it-you-cannot-go-back kind of difference.
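For anyone wondering why 1080p maps so cleanly onto a 4K panel: the ratio is exactly 2x on each axis, so every source pixel becomes a clean 2x2 block with no blending needed. A quick sanity check of that arithmetic (a sketch in Python, purely for illustration):

Code:
# 4K UHD (3840x2160) is exactly twice 1080p (1920x1080) in each dimension,
# so integer scaling needs no interpolation at all.
src_w, src_h = 1920, 1080
dst_w, dst_h = 3840, 2160

scale_x = dst_w / src_w   # 2.0
scale_y = dst_h / src_h   # 2.0

# "Pixel perfect" only works when both factors are whole numbers.
print(scale_x, scale_y, scale_x.is_integer() and scale_y.is_integer())
# -> 2.0 2.0 True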
 
You can let the monitor handle the scaling though. I'd hope there are at least some monitors out there that can scale without interpolating.
 

Hey... even a high-end home theater receiver costing thousands of dollars cannot offer upscaling better than true "pixel perfect" integer upscaling...

The thing is, in order to be pixel perfect, the signal has to be formatted correctly right from the source (in this case the video card).

It isn't as simple as you guessed, buddy... the difference is literally night and day. It will make 1080p games look as clear and vibrant as 4K games on a 4K TV; otherwise there wouldn't have been so many users requesting it for YEARS.
 
Yeah, I'm sure the signal needs to be nearly magical in order for the monitor to take a color and fill a 2x2 or 4x4 matrix with it :wtf:
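That block-fill idea really is the whole algorithm: every source pixel just gets repeated into an N x N square. A minimal sketch, assuming NumPy is available (the function and variable names are placeholders, not anything from Intel's driver):

Code:
import numpy as np

def integer_scale(image: np.ndarray, factor: int) -> np.ndarray:
    """Replicate every pixel into a factor-by-factor block (no interpolation)."""
    # Repeat rows, then columns; the channel axis (if any) is left untouched.
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

# Example: a 1080p RGB frame scaled 2x to fill a 4K panel.
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = integer_scale(frame_1080p, 2)
print(frame_4k.shape)  # (2160, 3840, 3)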
 
Good, I use dolphin-emu a lot and can relate, but my CPU doesn't have an iGPU.
 
Yeah, I'm sure the signal needs to be nearly magical in order for the monitor to take a color and fill a 2x2 or 4x4 matrix with it :wtf:

There is no display or receiver or anything else I am aware of that can do integer upscaling... Yeah, I guess it must be really hard or something :wtf:

Anyhow, I think it is better done within the video card, so you can output your 1080p games/videos as a pixel-perfect 4K signal to any 4K TV, old or new.
 
Well, if there's no display that can do that, then my assumption was wrong. That doesn't imply integer upscaling is hard though.
 
Good point.

Thing is, we never had that BITD. Because of the way TV CRTs worked, there was always a smoothing/blurring/blending effect caused by the way the electron gun beams scanned through the color mask and produced an image. Anyone who says they remember the "blocky" look is kind of fooling themselves, because it just never happened that way. I'm not saying it's wrong to prefer that look, just that we never had it back then because of the physical limitations of the display technology of the time.

So this Integer Scaling thing, while useful to some, isn't all that authentic for many applications.

The effect you prefer, of smeared filters, isn't any more "authentic." Playing a game on a CRT looks nothing like it. You can, however, apply CRT shaders to the pixelated look and get an authentic-looking picture, but not with that bilinear/SuperEagle crap.
 
Well, if there's no display that can do that, then my assumption was wrong. That doesn't imply integer upscaling is hard though.

That's what really puzzles me too: if it is easy, why hasn't anyone done it? People have been asking for it for more than 10 years, ever since LCDs first came out.
Maybe the signal over HDMI isn't as lossless as we think, so you need more specific instructions for pixel-perfect alignment?
Or maybe it's just an industry consensus (display, media transport, video card makers, etc.) so that they can keep forcing people to upgrade even if not everyone needs it?
I really don't know; I just know I really want Integer Scaling.
 
Finally I can buy a 4K monitor and not have to worry about playing at 1080p with a blurry image.
Only if you play titles that Intel's IGPs can push :(
 
I think my eyes are failing me, or age is starting to show, as I don't see any difference between those two pictures.
 
Impossible. One picture shows the scaling methods in action side by side, while the other shows Ms. Pearce.
 
I find it interesting that a little more than half of the people who voted in the poll said they find this useful. I didn't realize that so many people even understood what Integer Scaling was, let alone had an interest in it.
 

Not sure why you'd assume a lot of people don't know what this is or don't want it. Even if they didn't know before, the picture showing what it is is obviously of interest to most.
 
Suddenly it feels good to still use a 16:10 Monitor =]
 
Not sure why you'd assume a lot of people don't know what this is or don't want it. Even if they didn't know before, the picture showing what it is is obviously of interest to most.
You say that like you think it's common knowledge. Not sure I'd agree with that. Seeing a picture only shows the result, and W1zzard explained some of the details of what the effect is. However, most people do not understand how and why it actually works, and thus why it can only be included in the latest drivers for the latest hardware. One would think that such a function could be retrofitted into previous-gen hardware through driver updates, but it's not that simple. Integer scaling can be done in software if the time is taken to implement it; it just takes much more processing time.
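On the software route: one common way to get the same nearest-neighbor result on the CPU is a plain resize with interpolation turned off. A minimal sketch, assuming Pillow is installed (the file names are placeholders):

Code:
from PIL import Image

factor = 2  # 1080p -> 4K UHD is an exact 2x
img = Image.open("screenshot_1080p.png")  # placeholder input file
upscaled = img.resize((img.width * factor, img.height * factor),
                      resample=Image.NEAREST)  # nearest neighbor = integer scaling
upscaled.save("screenshot_4k.png")

Doing this per frame on the CPU is of course far slower than having the display pipeline do it for free, which is presumably why people want it in the driver or hardware.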
 
Yup, I'm positive Intel is gonna be the next good thing for GPUs.

Or at least I really hope so, because I'm a bit fed up with AMD and NVIDIA running what's nearly a cartel.
 
Yeah, I didn't read that it's for IGPs only. Oh well, I hope they will also include this technology in their standalone GPUs that are coming in 2020[?]
It will come to their discrete cards, that's almost certain. I was just saying that, for the time being, Intel's only graphics accelerators are their IGPs.
 
Wait, a data-generation algorithm that doesn't involve AI! What is this, 1995?

It's a nice option to have around, but I'm not sure it'll see much use. And I really don't see Nvidia and AMD needing it, considering their cards are either overpowered for the applications that benefit from this, or it would be counter-productive (with AA) for the ones typically used on their cards.

Of all the scaling methods, this is the one with barely any performance impact even if you implement it in software only.

Which is why I'm wondering why hardware support for it should be news to console emulators, many of which already cater to the pixel-perfect crowd (assuming I'm not misunderstanding the original post :| ).
 
That, and the fact that their GPUs can effortlessly do it in software.
 