Sunday, February 11th 2024

Microsoft to Standardize AI Super-resolution for Games and Windows UWP Apps

A Windows 11 Insider build for the upcoming "24H2" release of Windows exposes a new graphics setting called "automatic super resolution," or ASR, as Microsoft intends to call it. Its caption reads "use AI to make supported games play more smoothly with enhanced details." The toggle is designed to work not just with games, but also with Windows UWP apps. PhantomOfEarth reports that the feature is Microsoft's own super resolution model, distinct from DLSS, FSR, or XeSS. It is exposed as a feature on machines with an NPU or AI accelerator compatible with Microsoft's APIs. Apparently, the upscaler component of ASR leverages AI to reconstruct details. Since the caption reads "supported games," ASR may not work universally. It remains to be seen how its image quality compares to that of DLSS, FSR, or XeSS.
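For context on what an AI upscaler improves upon: conventional (non-AI) upscaling only interpolates between existing pixels, which is why it cannot "reconstruct" detail the way a trained model can. Below is a minimal, illustrative sketch of that conventional baseline, bilinear interpolation, in plain Python; the function name and toy image are ours for illustration, not anything from ASR itself.

```python
def bilinear_upscale(img, scale):
    """Upscale a 2D grayscale image (list of lists) by an integer factor
    using bilinear interpolation -- the conventional baseline that AI
    super-resolution models aim to beat by reconstructing detail."""
    h, w = len(img), len(img[0])
    out_h, out_w = h * scale, w * scale
    out = [[0.0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            # Map the output pixel back into source coordinates.
            sy = min(y / scale, h - 1)
            sx = min(x / scale, w - 1)
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            # Blend the four nearest source pixels.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bot * fy
    return out

# A 2x2 gradient upscaled 2x: new pixels are averages of their neighbors,
# so no new detail is created -- only smoothed in-between values.
small = [[0.0, 100.0],
         [100.0, 200.0]]
big = bilinear_upscale(small, 2)
```

An AI super-resolution model replaces the fixed blending above with learned inference, which is why it needs an NPU or other accelerator to run cheaply alongside a game.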

Besides games, ASR is mainly being designed to enhance the quality of video captured by webcams for collaboration apps such as Teams, Skype, and Camera—all three of which are UWP apps. Windows 11 23H2 already leverages AI on PCs with NPUs to offer several webcam filters, background manipulation, audio noise filtering, and other enhancements. ASR would attempt to enhance their resolution as well. 720p remains the mainstream webcam resolution, with some premium notebooks integrating 1080p.
Sources: PhantomofEarth (Twitter), VideoCardz

17 Comments on Microsoft to Standardize AI Super-resolution for Games and Windows UWP Apps

#1
john_
Features using the AI capabilities of the CPUs could become a reason for boosting sales of new computers and for people to upgrade to Windows 11 from Windows 10.
#2
Firedrops
That... is not what the word "standardize" means...
#3
natr0n
This is like the beginning of PhysX add-on cards, but for AI now. I'm getting a feeling about it.
#4
Tartaros
natr0n: This is like the beginning of PhysX add-on cards, but for AI now. I'm getting a feeling about it.
Relying on existing GPUs first and then on future CPUs, I imagine. I was wondering if at some point AI processing was going to be in all GPUs no matter the price point, but it looks like it won't.
#5
Camm
I was wondering why we weren't seeing much push for FSR/XeSS integration with the NPUs showing up in laptop parts lately; guess I missed Microsoft pushing it as an integration into Windows.

Good to see a standard; hopefully it might push DLSS and its vendor lock-in out of the market, although the elephant in the room is what happens to Linux in this scenario.
#6
Jism
Tartaros: Relying on existing GPUs first and then on future CPUs, I imagine. I was wondering if at some point AI processing was going to be in all GPUs no matter the price point, but it looks like it won't.
Intel and AMD are betting on selling CPUs with some form of "AI" thing in there. It's just a specialised part within the chip that can do some basic "AI" calculations.

The whole AI thing can be stolen - just give us performance cores, not that energy-efficient nonsense. I can switch the power-saving tab on my Windows install just fine myself.
#7
Ferrum Master
Let me fix it
"with some premium notebooks integrating 1080p. Ditching that crap altogether."
#8
Broken Processor
If they really want to help PC gamers, maybe they should sort out Game Bar, or even better, nuke it.
#9
TheDeeGee
When are they going to standardize the taskbar and add back the "Small Taskbar" feature like we had in Windows 7/8/10?
#10
theouto
Let's hope it's a good implementation AND off by default.
#11
SOAREVERSOR
Tartaros: Relying on existing GPUs first and then on future CPUs, I imagine. I was wondering if at some point AI processing was going to be in all GPUs no matter the price point, but it looks like it won't.
Most computers out in the wild are laptops with IGPs; getting an AI coprocessor onto all of these gets you the hardware base to get moving with something like this.
#12
ThrashZone
Hi,
What happened to the people saying you really can't tell the difference between low and high resolution?
Now it's super resolution or bust :fear:
I generally turn it off display-wise, because the higher the resolution, the smaller everything is.
Image quality wise yeah bring it :cool:
#13
Beermotor
Some technical information, since the original Twitter poster didn't really dig into it:
What do you guys think?
#14
Ware
natr0n: This is like the beginning of PhysX add-on cards, but for AI now. I'm getting a feeling about it.
No. We've had GNA built in since 10th gen. Only people working with AI used it so nobody cared. Now you're getting a coprocessor that consumers can actually use.
Camm: Good to see a standard...what happens to Linux in this scenario.
Yes, good to see a standard, since AMD has an AI coprocessor too.
This upscaling tech is obviously a Windows thing, but the NPU is supported in Linux, so there's no reason it couldn't be used in a similar way.
#15
A&P211
Tartaros: Relying on existing GPUs first and then on future CPUs, I imagine. I was wondering if at some point AI processing was going to be in all GPUs no matter the price point, but it looks like it won't.
All future AI processing will be done on specialized ASIC machines, kinda like mining is done.
#16
Ferrum Master
A&P211: All future AI processing will be done on specialized ASIC machines kinda like mining is done.
Glorified names for a matrix-multiplier controller... Some tech journalist could at least connect the patterns: why AVX-512 was designed, what it was meant to mitigate, why it is being killed off, and why it really doesn't have a place in a generic x86-64 CPU.
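For readers wondering what these "matrix multiplier" accelerators actually compute: at their core, NPUs are engines for dense matrix multiplication plus the surrounding data movement. A toy sketch in plain Python of the operation that dominates neural-network inference (the numbers here are made up for illustration):

```python
def matmul(a, b):
    """Naive dense matrix multiply -- the core operation that NPUs and
    other "AI accelerators" implement in fixed-function hardware."""
    rows, inner, cols = len(a), len(b), len(b[0])
    assert len(a[0]) == inner, "inner dimensions must match"
    return [[sum(a[i][k] * b[k][j] for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]

# A single neural-network layer is essentially this: an activation
# vector (1x3) multiplied by a weight matrix (3x2).
act = [[1.0, 2.0, 3.0]]
w = [[0.1, 0.2],
     [0.3, 0.4],
     [0.5, 0.6]]
out = matmul(act, w)   # approximately [[2.2, 2.8]]
```

A dedicated accelerator does exactly this, but over thousands of values per cycle and at low power, which is the whole pitch over running it on CPU cores.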
#17
Minus Infinity
SOAREVERSOR: Most computers out in the wild are laptops with IGPs; getting an AI coprocessor onto all of these gets you the hardware base to get moving with something like this.
Modern iGPUs in Hawk Point and Meteor Lake are still way faster than the NPU for AI, and if you have a modern discrete GPU you are way faster again. NPU performance will have to increase dramatically; their main benefit at the moment is efficiency, when performance isn't important.