
Microsoft Paves the Way for Industry-Wide Adoption of Variable Rate Shading

Raevenlord

News Editor
Microsoft today announced, via a devblog post, its push for industry-wide adoption of Variable Rate Shading, in search of the performance needed to support the pixel counts and pixel quality of future games. The post opens with what is likely the foremost question on the mind of any discerning user who hears of a performance-improving technique: "does it degrade image quality?". And the answer, it seems, is no: there are no discernible image quality differences between the Variable Rate Shading portion of the image and the fully rendered one. I'll give you the option to judge with your own eyes, though: analyze the image below and cast your vote in the poll.

As resolution increases, so does the amount of work any given GPU has to do to generate a single frame - and when you compare the workload of rendering a 30 FPS, 1080p game to that of a 60 FPS, 4K one, it stands to reason that ways of squeezing the most performance out of a given process are at a premium. That's particularly true in the console space, where cost concerns dictate more mainstream-equivalent hardware, which in turn requires creative ways of bridging the gap between the desired image quality and the available rendering time for each frame.





We've already spoken at length about Variable Rate Shading, both regarding its NVIDIA Turing debut and the AMD patent application that aims to implement a similar feature. In this case, it's Microsoft bringing Variable Rate Shading to DX12, allowing developers to easily take advantage of the feature in their games. According to Microsoft, integrating VRS through DX12 should take developers no more than a few days of work, while freeing up some 14% of performance that can be spent on other rendering efforts, such as higher resolution, a higher target frame rate, or other, more relevant image quality improvements.



Microsoft details three ways for developers to integrate the technology into their rendering engines: per draw; within a draw, using a screen-space image; or within a draw, per primitive. This should let developers mix and match the implementation that best fits their engine. At the time of the announcement, Microsoft said that Playground Games and Turn 10 (Forza), Ubisoft, Massive Entertainment (The Division, Avatar), 343 Industries (Halo), Stardock, Io Interactive, Activision, Epic Games and Unity had all signaled their intention of adding VRS to their game engines and upcoming games. That most of these are affiliated with Microsoft's push isn't surprising: remember what we said earlier, that the console development space (and VR) is where these technologies are needed most.

PS: The left half of the image is the fully rendered one; the right half is the VRS-powered one, rendered with a 14% performance increase.

View at TechPowerUp Main Site
 
This is just a temporary solution until next-gen GPUs come, but it will help a little, I guess.
 
This is just a temporary solution until next-gen GPUs come, but it will help a little, I guess.
Not really. If it eases the load on the GPU by 20-30%, you can use that to enable better shadows, AA, AF or whatever, no matter how fast your card becomes. We're already seeing rage at any new monitor launch that isn't a 144 Hz panel; I suspect that when we're all playing at 4K@240 Hz, people will still want more. Next-gen GPUs will do little to mitigate that.
 
As the generational increases in hardware performance shrink, software optimization matters more and more. This seems like the next step after primitive discard for increasing performance.

Reads like something the Stardock guys would develop.
 
Nice tech, Turing already supports it and it's clear AMD are going to play follow the leader.

Win win.
 
I hope no one with your logic is in charge of developing games...

Truth is, the majority of game devs make optimisations all the time, so your logic is flawed.
 
But isn’t that what the intended troll was about? Sorry to spoil your party. :rolleyes:

Like I said, damn, you're easily upset; at least your girl Vya has your back.
 
Doing this sort of selective shading makes sense in VR, where you can't even see the periphery very well, but I'm worried that we're going down a bad path with checkerboard rendering, TAA, DLSS, and variable rate shading. If you roll all of this tech together just to say it's 4K 60 FPS, but it doesn't even look as good as upscaled 1440p, what's the point?

I'll have to look at some more comparisons on my gaming TV, because I can't tell crap from a compressed jpeg.
 
Doing this sort of selective shading makes sense in VR, where you can't even see the periphery very well, but I'm worried that we're going down a bad path with checkerboard rendering, TAA, DLSS, and variable rate shading. If you roll all of this tech together just to say it's 4K 60 FPS, but it doesn't even look as good as upscaled 1440p, what's the point?

I'll have to look at some more comparisons on my gaming TV, because I can't tell crap from a compressed jpeg.
I’m with you. I’m hesitant about anything that reduces my IQ, so I’m going to have to see much more of this in detail before I start jumping up and down and spitting nickels.
 
If this tech is something that can take a crap monitor and make it look like a million bucks, I'm all for it. My biggest gripe is that game developers will do what they always do: take shortcuts and ruin it for the folks who can't afford big 4K gaming TVs.
 
Like I said, damn, you're easily upset; at least your girl Vya has your back.
Hey Fluffy, keep your trolling to the bedroom, mkay?
You are boring, repetitive and unimaginative...that crap belongs in your bedroom, not here!
 
Hey Fluffy, keep your trolling to the bedroom, mkay?
You are boring, repetitive and unimaginative...that crap belongs in your bedroom, not here!

You stink of Vaseline too.
 
Might be time for at least a couple of you to find something else to do that isn't trashing the forums. Keep the personal jabs and BS to yourselves. Keep it on topic and within our guidelines or move along.
 
- A guy says something that isn't optimization oriented.
- I say that I hope people with that logic won't be in charge of developing a game
- and you say:
Truth is, the majority of game devs make optimisations all the time, so your logic is flawed.
- so my response is... eh?
 