
NVIDIA Beats Intel to Integer Scaling, but not on all Cards

OR if you are so good, why not code the support yourself?

Sure, let me just fork nouveau, and have documented apis to raise the boost clocks...

oh right, that's literally an nvidia "trade secret" making what you just stated nearly impossible. Sounds fun.

Might be something to do with Turing's ability to do concurrent INT and FP math... I would not be surprised if Volta cards could support this as well.

Integer scaling is not hard. For god's sake man, the cpu could do it.
 
Sure, let me just fork nouveau, and have documented apis to raise the boost clocks...

oh right, that's literally an nvidia "trade secret" making what you just stated nearly impossible. Sounds fun.



Integer scaling is not hard. For god's sake man, the cpu could do it.
That's why it is so easily applied to software right?

Oh wait, it actually ISN'T that easy; that's why GPU makers implementing it is a big deal.

Again, if it isn't that hard, why don't you write a simple program to do this automatically and make big bucks? And what do boost clocks have to do with integer scaling? I mean, Nvidia does provide an API to mess with GPU clock rates; that's what the likes of Precision X and Afterburner hook into.
 
That's why it is so easily applied to software right?

It is, all the time, in things like emulators, yes.

Intercepting display output can't just be done in an app, however.
 
I just want to run my 4K HDTV at 1080p without blurring everything because of interpolation.

TV's own scaler = interpolation blur
Nvidia's GPU scaler = interpolation blur.

There's no way that disabling interpolation requires Turing; Nvidia are just being d*ckholes because they're first.
 
Integer scaling is not hard. For god's sake man, the cpu could do it.
Exactly. I'm still wondering why this needs to be done in hardware at all? It can be done in software perfectly and at very little processing penalty...

There's no way that disabling interpolation requires Turing; Nvidia are just being d*ckholes because they're first.
Fanboy alert, and you'd be wrong. NVidia was the first to implement it, but they didn't decide on it or announce it first.
 
Fanboy alert
I'm a fanboy for criticizing Nvidia? I criticize who I want - deal with it.
I'm criticising you right now for even bringing it up; it adds nothing of value to the discussion.

and you'd be wrong. NVidia was the first to implement it, but they didn't decide on it or announce it first
Why does Intel's announcement make me wrong? You're wrong! Intel products can't scale yet and the Turing card I have in my machine is already doing it.
That's first, by any definition - except yours, apparently.... :kookoo:
 
I'm a fanboy for criticizing Nvidia? I criticize who I want - deal with it.
I'm criticising you right now for even bringing it up; it adds nothing of value to the discussion.
You offered criticism in a way that was clearly and directly insulting (calling them "d*ckholes") to the company in question, for doing little more than being first at something positive. That is "fanboying" by definition.
Why does Intel's announcement make me wrong? You're wrong! Intel products can't scale yet and the Turing card I have in my machine is already doing it.
That's first, by any definition - except yours, apparently.... :kookoo:
Context seems to be a lost concept on you. Intel announced it first, but because NVidia has a much more expansive and refined GPU hardware set they got it to market before Intel finished their implementation. AMD could likely do the same, if they cared.
 
I called them d*ckholes because their reason for locking it to Turing is a blatant lie in an arrogant attempt to sell more Turing cards when it's obviously a simple software operation. Even you agree on that last point.

I was using software integer scaling in ZSNES and KGen emulators over 20 years ago. It's an incredibly simple post-processing filter, and emulators of the late '90s even allowed you to write your own custom scaling filters; making an integer scaler was a single short line of code, often used as the first step in other more complex filters. Intel may have announced it, but they sure as hell weren't first either. It's a concept that existed 25 years ago and fell out of mainstream use.
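To give an idea of how short it is, here's a rough sketch of that kind of per-pixel copy in C++. The function name and RGBA buffer layout are mine, purely for illustration, not any particular emulator's code:

```cpp
#include <cstdint>
#include <vector>

// Nearest-neighbour integer scaling of a 32-bit RGBA framebuffer.
// Each source pixel is repeated 'factor' times horizontally and
// vertically -- no interpolation, no blur. Assumes factor >= 1.
std::vector<uint32_t> integer_scale(const std::vector<uint32_t>& src,
                                    int srcW, int srcH, int factor)
{
    std::vector<uint32_t> dst(static_cast<size_t>(srcW) * factor * srcH * factor);
    for (int y = 0; y < srcH * factor; ++y)
        for (int x = 0; x < srcW * factor; ++x)
            // the "single short line": pick the source pixel by integer division
            dst[static_cast<size_t>(y) * srcW * factor + x] =
                src[static_cast<size_t>(y / factor) * srcW + x / factor];
    return dst;
}
```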

You should also look up the definition of the word fanboyism if you're going to start throwing it around, though - because trust me, you're using it wrong.
 
Haha man, you always hit it right on the spot.


Yep, only for Navi, and somehow I didn't hear any outcry from the hypocrites in the red camp. Does that make AMD kind of a douche? By the red fans' standards, I guess yes.

It's all double standards, my friend. I must confess I didn't think the gamer community was desperate for image-sharpening tech... but I guess Navi also needs to set itself apart from the mighty future-proof tech that is Vega.
 
That's why it is so easily applied to software right?

It kind of is.

http://tanalin.com/en/projects/integer-scaler/ (freeware 500kb .exe & very small memory footprint ~ 2MiB)

https://store.steampowered.com/app/993090/Lossless_Scaling/ (didn't buy, it's 3 bucks, and for that it allows you to choose scale factor 1x,2x,3x,4x,... - also you can apply anti-aliasing on the losslessly scaled image)

These are just two I know of. There's possibly some other freeware or open-source stuff floating around on GitHub that achieves exactly the same thing.
It's not some kind of miracle software.
Anyone who wants this functionality because they're an eccentric old-school gamer can already use it without much effort.
It's all factitious commotion.
 
There was nothing wrong with ATI's tessellator. The problem was DirectX being unable to work with it.

Hmm, I don't know... AMD has improved tessellation performance these last years on Polaris and Vega, but their cards still struggle.
 
but their cards still struggle.
Struggle is a relative term. In the upper mid-tier range, Radeon dominates on performance/cost ratio. Where they struggle is the top tier of performance.

Back on topic: I've been doing some reading on the way that Polaris, Vega and Navi do scaling in general, and it seems like it would be a trivial effort to include integer scaling in the driver. In fact, the same can be said for NVidia all the way back to Tesla (GTX 200 series). Why it hasn't been implemented thus far on both sides could simply boil down to a lack of demand, coupled with the minimal resource cost of doing that function in software, given how well known the coding is, how efficient it is, and how relatively simple it is.
 
NVIDIA's integer upscaling feature has been added only to its "Turing" architecture GPUs (both RTX 20-series and GTX 16-series), but not on older generations. NVIDIA explains that this is thanks to a "hardware-accelerated programmable scaling filter" that was introduced with "Turing."
Suuuuuuuuuuuuure it is... :laugh: :laugh: :laugh: :laugh: :laugh:
 


I like Integer scaling so far!
 
That was a software implementation. Integer scaling can be done in software all day long. Doing it on hardware is a very different story.

Who cares whether it's done in software or hardware? It's a near-zero cost operation that will add imperceptible overhead. Speaking of double standards, RTX features were added to 10-series cards in software, despite the fact that they suck at it. IMO that's the honest way to do things - giving people on older cards who like RTX features a chance to see them and genuine incentive to upgrade to Turing that can hardware-accelerate those features if they like what they see.

As for the definition of fanboy:
  • Fan = loyal to a team/brand, zealous, irrationally devoted.
  • Boy = child, immature
I attacked your preferred team/brand for making a BS excuse and you start throwing childish insults to defend them for no rational reason? I think I've found the fanboy, by definition.

As I mentioned earlier, I criticise who I please - Nvidia, Intel, and AMD alike, because BS needs to be called out rather than excused or endorsed - regardless of who it is. Anyway, I'm out. I don't really have anything more to add and this isn't valuable to the scaler discussion.
 
Integer scaling is not hard. For god's sake man, the cpu could do it.
If all you want is simple "integer scaling", that is already fully supported in basically every GPU since the 90s, and is simply achieved by rendering a low-resolution framebuffer to a full resolution framebuffer with nearest neighbor filtering enabled. The only thing new here is having this as a driver level feature without requiring application level support. (implementing this in applications is trivial though)
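For example, in OpenGL the whole trick is a single blit with GL_NEAREST instead of GL_LINEAR. A minimal sketch, assuming a GL 3.0+ context and the low-resolution FBO already exist; lowResFbo and the sizes are placeholder names for this example:

```cpp
#include <GL/glew.h>  // any loader exposing GL 3.0+ entry points will do

// Present a low-resolution framebuffer on the full-resolution default
// framebuffer with nearest-neighbour filtering (no interpolation blur).
// Context creation, glewInit() and FBO setup are assumed to happen elsewhere.
void present_nearest(GLuint lowResFbo,
                     int srcW, int srcH,   // e.g. 1920 x 1080
                     int dstW, int dstH)   // e.g. 3840 x 2160
{
    glBindFramebuffer(GL_READ_FRAMEBUFFER, lowResFbo);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);  // default (window) framebuffer

    // When dstW/dstH are exact multiples of srcW/srcH this is integer
    // scaling; GL_NEAREST is what keeps the pixels sharp.
    glBlitFramebuffer(0, 0, srcW, srcH,
                      0, 0, dstW, dstH,
                      GL_COLOR_BUFFER_BIT, GL_NEAREST);
}
```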

I would question whether this type of feature belongs in a driver though, not because I don't see the usefulness of this feature, but because I fear this will be yet another poorly maintained driver feature, or that other features will suffer instead. I think drivers already contain way too much that doesn't belong there, and all the manufacturers struggle with maintaining the feature set they already provide, especially AMD, who to this date doesn't offer reliable OpenGL support.

Secondly, I would question whether integer scaling is the right solution to begin with. While multiplying pixel sizes to display them sharply on modern LCD/OLED displays is better than blurring, it will make them more blocky than they ever were on CRTs. Various emulators, console replicas and external scalers (like the Framemeister) already offer more options than just integer scaling, like the shape of the pixels, spacing between scanlines, etc. So if the goal of this integer scaling is to run old games and replicate the "CRT look", then perhaps we need an even better solution than this?
 
I just want it to render 1080p perfectly on my theoretical 4k display, as an example.
 
I just want it to render 1080p perfectly on my theoretical 4k display, as an example.
When using a desktop computer, the scaling is done on the monitor/TV end. This does of course mean that it's up to the monitor to decide how to scale, and whether to give you any options about it, which some monitors do.

As I've mentioned, doing this on the driver side does give some benefits, but it will also require a robust interface to control it, even though the basic integer scaling is super simple to implement.
Applications can of course do whatever they want.
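To illustrate the kind of decision a driver-side scaler has to make (pick the largest whole-number factor that fits the panel, then centre the result), here's a small self-contained sketch; the names and the example numbers are just mine:

```cpp
#include <algorithm>
#include <cstdio>

// The geometry decision behind driver-level integer scaling: largest
// whole-number factor that fits the panel, result centred (letterboxed).
struct ScaledRect { int factor, x, y, w, h; };

ScaledRect fit_integer(int srcW, int srcH, int panelW, int panelH)
{
    int factor = std::max(1, std::min(panelW / srcW, panelH / srcH));
    int w = srcW * factor, h = srcH * factor;
    return { factor, (panelW - w) / 2, (panelH - h) / 2, w, h };
}

int main()
{
    // 1920x1080 on a 3840x2160 panel: factor 2, fills the screen exactly.
    ScaledRect r = fit_integer(1920, 1080, 3840, 2160);
    std::printf("factor=%d offset=(%d,%d) size=%dx%d\n",
                r.factor, r.x, r.y, r.w, r.h);

    // 640x480 on a 1920x1080 panel would give factor 2 (1280x960),
    // centred with black borders around it.
    return 0;
}
```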
 
When using a desktop computer, the scaling is done on the monitor/TV end.

Not if the driver does it first.

A lot of TVs/monitors get it really, really wrong.
 
Not if the driver does it first.

A lot of TVs/monitors get it really, really wrong.
It's hard to name a single TV or monitor that gets it right.

In fairness to R-T-B, I like the idea of a better-than-bilinear option too.
Back in the early emulation days, a popular filter for 2x scaling (so appropriate for 1080p on a 4K display) was called 2xSaI - shown on the right compared to integer scaling on the left.

[Attached image: 2xsai.png - integer scaling (left) vs. 2xSaI (right)]
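2xSaI itself is a fairly involved blend, but for a flavour of what those "smarter" 2x filters do, here's a sketch of the even simpler EPX/Scale2x rule (not 2xSaI, just the general idea of looking at neighbours instead of blindly repeating pixels):

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// EPX / Scale2x: each source pixel becomes a 2x2 block; a corner copies a
// neighbouring colour instead of the centre when two adjacent neighbours
// match, which smooths diagonals without blurring. dst must hold (2*w)*(2*h).
void scale2x(const std::vector<uint32_t>& src, int w, int h,
             std::vector<uint32_t>& dst)
{
    auto at = [&](int x, int y) {
        x = std::min(std::max(x, 0), w - 1);   // clamp at the borders
        y = std::min(std::max(y, 0), h - 1);
        return src[static_cast<size_t>(y) * w + x];
    };
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            uint32_t e = at(x, y);
            uint32_t a = at(x, y - 1), b = at(x + 1, y);   // up, right
            uint32_t c = at(x - 1, y), d = at(x, y + 1);   // left, down
            uint32_t e0 = (c == a && c != d && a != b) ? a : e;  // top-left
            uint32_t e1 = (a == b && a != c && b != d) ? b : e;  // top-right
            uint32_t e2 = (d == c && d != b && c != a) ? c : e;  // bottom-left
            uint32_t e3 = (b == d && b != a && d != c) ? d : e;  // bottom-right
            size_t o = static_cast<size_t>(y) * 2 * (w * 2) + x * 2;
            dst[o] = e0;          dst[o + 1] = e1;
            dst[o + w * 2] = e2;  dst[o + w * 2 + 1] = e3;
        }
}
```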
 
Speaking of double standards, RTX features were added to 10-series cards in software, despite the fact that they suck at it. IMO that's the honest way to do things - giving people on older cards who like RTX features a chance to see them and genuine incentive to upgrade to Turing that can hardware-accelerate those features if they like what they see.
That had nothing to do with the topic at hand.
As for the definition of fanboy:
  • Fan = loyal to a team/brand, zealous, irrationally devoted.
  • Boy = child, immature
Yup, you got it. Nailed it perfectly!
I attacked your preferred team/brand for making a BS excuse
You're making assumptions. Stop doing that.
and you start throwing childish insults to defend them for no rational reason? I think I've found the fanboy, by definition.
Let's review;
Turing; Nvidia are just being d*ckholes because they're first.
You were saying?
As I mentioned earlier, I criticise who I please - Nvidia, Intel, and AMD alike, because BS needs to be called out rather than excused or endorsed - regardless of who it is.
Typical fallback argument.
Anyway, I'm out. I don't really have anything more to add and this isn't valuable to the scaling discussion.
Good idea.
It's hard to name a single TV or monitor that gets it right.
Thought you were leaving? Never mind, you've actually contributed to the conversation in a way that's constructive.
In fairness to R-T-B, I like the idea of a better-than-bilinear option too.
Back in the early emulation days, a popular filter for 2x scaling (so appropriate for 1080p on a 4K display) was called 2xSaI - shown on the right compared to integer scaling on the left.
To be fair, I don't understand how anyone can prefer the blocky pixel look over a filter that makes the image look much more natural. The one on the right looks much better IMHO. Super 2xSaI took that a step further and improved image quality.
 