
NVIDIA Beats Intel to Integer Scaling, but not on all Cards

btarunr

Editor & Senior Moderator
NVIDIA, with its GeForce 436.02 Gamescom-special drivers, among several new features and performance updates, introduced integer scaling, a resolution upscaling algorithm that scales up extremely low-resolution visuals into eye-pleasing blocky, pixellated lines by multiplying pixels in a "nearest-neighbor" pattern without changing their color. This is in contrast to bilinear upscaling, which blurs the image by attempting to add detail where none exists, altering the colors of the multiplied pixels.
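To illustrate the difference, here is a minimal Python/NumPy sketch of what integer scaling amounts to; the function name and test image below are our own illustrative stand-ins, not the driver's actual implementation, which runs in hardware at the display stage:

import numpy as np

def integer_scale(image, factor):
    # Integer scaling: replicate every source pixel into a factor-by-factor
    # block, leaving its color untouched. This is nearest-neighbor scaling
    # restricted to whole-number factors, which keeps pixel edges crisp.
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

# A 2x2 RGB test image scaled 3x becomes 6x6. No new colors are invented,
# unlike bilinear upscaling, which blends neighboring pixels together.
src = np.array([[[255, 0, 0], [0, 255, 0]],
                [[0, 0, 255], [255, 255, 255]]], dtype=np.uint8)
dst = integer_scale(src, 3)
assert dst.shape == (6, 6, 3)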

Intel originally announced an integer upscaler this June, exclusive to the company's new Gen11 graphics architecture, since older generations of its iGPUs "lack the hardware requirements" to pull it off. Intel's driver updates that add integer scaling are set to arrive toward the end of this month, and even when they do, only a tiny fraction of Intel hardware will actually benefit from the feature (notebooks and tablets that use "Ice Lake" processors).



NVIDIA's integer upscaling feature has been added only to its "Turing" architecture GPUs (both RTX 20-series and GTX 16-series), not to older generations. NVIDIA explains that this is thanks to a "hardware-accelerated programmable scaling filter" introduced with "Turing." Why is this a big deal? The gaming community has a newfound love for retro games from the '80s and '90s, with the growing popularity of emulators and of old games being played through DOSBox. Many small indie game studios are responding to this craze with hundreds of new titles adopting a neo-retro pixellated aesthetic (e.g., "Fez").

When scaled up to today's high-resolution displays, many of these games look washed out thanks to bilinear upscaling by Windows. This calls for something like integer upscaling. We're just surprised that NVIDIA and Intel aren't implementing integer upscaling on unsupported GPUs by leveraging programmable shaders. Shader-based upscaling technologies aren't new: the home-theater community is heavily invested in the development of MadVR, a custom video renderer that packs several shader-based upscaling algorithms, all of which need nothing more than Direct3D 9.0c-compliant programmable shaders.
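The per-pixel work such a shader would need to do is trivial. The sketch below is a hypothetical CPU stand-in in Python (not NVIDIA's, Intel's, or MadVR's actual code) showing the point-sampled texture fetch a pixel shader would perform:

import numpy as np

def nearest_neighbor_sample(src, out_h, out_w):
    # For every output pixel, map its coordinate back to the source with
    # integer division and copy that texel unmodified: the same math a
    # point-sampling pixel shader would run once per fragment.
    h, w, c = src.shape
    dst = np.empty((out_h, out_w, c), dtype=src.dtype)
    for y in range(out_h):
        for x in range(out_w):
            dst[y, x] = src[y * h // out_h, x * w // out_w]
    return dst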

Update 12:57 UTC: The NVIDIA 436.02 drivers have now been released and are available for download.

View at TechPowerUp Main Site
 
I still remember when Manuel@Nvidia said that they didn't have plans to do this.
This must be because Intel and AMD plan to support integer scaling.
 
Nicking selling points from Intel and AMD with the drop of a single driver..... ouch.
 
No BS, just action; I like this. Intel already promised to deliver this. I haven't seen any mention of AMD planning to support integer scaling
 
So they HW locked this to Turing. Good one.
 
So they HW locked this to Turing. Good one.
Leather Jackets are expensive.

On the other hand, maybe it was a good idea to skip the 20XX series.

Cuz with Turing you get RTX and Integer Scaling for the price of one.
 
It only took 10 years of begging... and being locked to Turing... and we're celebrating this why, exactly??

Eh, compared to getting nothing? How about celebrating progress? Or, if you're so good, why not code the support yourself?

So much entitlement.
 
I like the fuzzy-looking one better, personally. If you squint your eyes, it looks even better.

I hope this is a troll and not you genuinely being blind? :roll:
 
I haven't seen any mention of AMD planning to support integer scaling

Isn't this similar tech to "Radeon Image Sharpening"?!

Great news, don't get me wrong, but once again I feel like Nvidia has been sandbagging this and many other features until the competition offers them. And once again locking it to Turing, still trying to raise those low RTX sales...
Is it illegal?! No, but it makes them kind of douches. :rolleyes:
 
It only took 10 years of begging... and being locked to Turing... and we're celebrating this why, exactly??

Because Nvidia was 1st
 
So they HW locked this to Turing. Good one.
I bet this will probably be made available to older cards in later driver revisions, just like Fast Sync and RTRT were. Gotta put a smile on Turing owners' faces first. I see no reason why this wouldn't be possible on older cards when a software solution like Lossless Scaling has been available for quite some time now and works pretty well too.
 
Isn't this similar tech to "Radeon Image Sharpening"?!

Great news, don't get me wrong, but once again I feel like Nvidia has been sandbagging this and many other features until the competition offers them. And once again locking it to Turing, still trying to raise those low RTX sales...
Is it illegal?! No, but it makes them kind of douches. :rolleyes:


Nope. Not even within a million miles. This is faaaar away from that shader filter gimmick on Radeon cards.

 
That thing you call a gimmick works better than DLSS.


It's integer scaling we're talking about here, so let's get back on topic since trom89 derailed us


So when is AMD bringing integer scaling support?
 
Isn't this similar tech to "Radeon Image Sharpening"?!

Great news, don't get me wrong, but once again I feel like Nvidia has been sandbagging this and many other features until the competition offers them. And once again locking it to Turing, still trying to raise those low RTX sales...
Is it illegal?! No, but it makes them kind of douches. :rolleyes:

Doesn't RIS require a Navi card to work?
 
Does that make AMD kind of douchy? By the red fans' standards, I guess yes.

Yes, it does. Slowing the release of new tech for leverage and/or money is always douchy in my view.
And no, there is no red or green team; tech improvement is always great for the consumer.
 
Because Nvidia was 1st
ATI was first with tessellation; that doesn't mean they did it right or best. Then again, Nvidia fucked up tessellation in games just to fuck with the competition.

A niche application on niche hardware. Intel still dominates the video adapter space by install base.
 
'Bout damn time! Now people won't hesitate to buy higher-resolution displays than their GPU can handle. Bring out them 8K displays!
 