
UL Benchmarks Outs 3DMark Feature Test for Variable-Rate Shading Tier-2

Let us all jump to "it surely is always indistinguishable in all games" based on one static screenshot, shall we?

Try it on your Navi card. Oh wait. :P
 
The whole "devs have to implement it" part is really the risky bit. If it lives at the engine level and the 'heatmap' generation can be automated and tweaked in a simple, transparent way, then all is well; but if it forces devs to do per-game, per-sequence optimizations, that can go to shit fast. Then it'd just be DLSS all over again.
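For what it's worth, the automated version doesn't have to be complicated. Here's a rough sketch (mine, not 3DMark's or any engine's) of how an engine could build a Tier 2 shading-rate heatmap from a per-tile motion metric; motionMagnitude and the thresholds are made-up placeholders, while the D3D12_SHADING_RATE values are the real DirectX 12 constants:

Code:
// Hypothetical sketch: auto-generating a Tier 2 shading-rate "heatmap".
// motionMagnitude is an assumed per-tile metric in [0, 1]; only the
// D3D12_SHADING_RATE constants come from the real D3D12 API.
#include <d3d12.h>
#include <cstdint>
#include <vector>

std::vector<uint8_t> BuildShadingRateHeatmap(const float* motionMagnitude,
                                             unsigned tilesX, unsigned tilesY)
{
    std::vector<uint8_t> rates(tilesX * tilesY);
    for (unsigned i = 0; i < tilesX * tilesY; ++i) {
        // Fast-moving tiles are blurred anyway, so shade them coarsely;
        // static tiles keep full 1x1 quality.
        if (motionMagnitude[i] > 0.5f)
            rates[i] = D3D12_SHADING_RATE_2X2;
        else if (motionMagnitude[i] > 0.2f)
            rates[i] = D3D12_SHADING_RATE_2X1;
        else
            rates[i] = D3D12_SHADING_RATE_1X1;
    }
    // The engine would upload this into a DXGI_FORMAT_R8_UINT texture and
    // bind it via ID3D12GraphicsCommandList5::RSSetShadingRateImage().
    return rates;
}

The point being: a heuristic like this can live entirely in the engine and never needs per-game content work.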
As I was running this new VRS benchmark, I also went back to the Port Royal DLSS test and was surprised; or rather, I'd forgotten what a big performance bump it gave and how little quality loss there was. It's not that it's cheating in any way; it's just that so much effort went into manually optimising the DLSS implementation for that single minute-long, scripted scene that it isn't a realistic representation of what a time-pressured game dev will do for dynamic content. Real-game DLSS is bad enough that I'd rather just run at a lower resolution or turn down the graphics options slightly to get higher framerates.

And that's why I have my doubts about the effectiveness and validity of the 3DMark VRS benchmark. This carefully tuned, best-case result looks fantastic, but game devs are just going to tick the box to enable the feature in the engine, and without significant tweaking it'll either hurt image quality or offer minimal performance improvement.
 
The VRS tests in 3DMark are not "maximum possible" scenarios; by default they try to mimic a realistic game scenario and stay as close to "invisible" as possible. The goal for the Tier 2 test was that the difference wouldn't be visible in motion unless you specifically knew what to look for.

Are the gains bigger than they most likely will be in games? I don't think so.

Note that the Tier 1 test is much more lightweight (and runs at 1080p vs 4K in Tier 2) to accommodate the lower end of the hardware spectrum, i.e. the Ice Lake iGPU, and because Tier 1 is a simpler version of the tech, the quality difference there is more apparent.
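For the curious, the API gap between the tiers is what drives that quality difference. A minimal sketch, assuming a cmdList of type ID3D12GraphicsCommandList5* and an already-created shadingRateImage (both names are mine; the calls and constants are real D3D12):

Code:
// Tier 1: one shading rate for an entire draw call, e.g. coarsening a
// distant terrain pass (NULL combiners = default passthrough behaviour).
cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);

// Tier 2: per-tile rates from a DXGI_FORMAT_R8_UINT screen-space image
// (tile size is reported by the hardware, commonly 16x16), combined with
// the per-draw rate; MAX here means "take the coarser of the two".
D3D12_SHADING_RATE_COMBINER combiners[2] = {
    D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,  // per-draw vs per-primitive
    D3D12_SHADING_RATE_COMBINER_MAX           // result vs the rate image
};
cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
cmdList->RSSetShadingRateImage(shadingRateImage);

Tier 1 can only coarsen whole objects or passes, which is why its quality cost is more visible; Tier 2 can target just the screen regions where coarser shading won't be noticed.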

If you want maximum possible, there are settings in the Interactive Mode. You can also save screenshots from it for closer analysis and observe how framerate changes as you trade off quality for performance.

VRS is (eventually) going to be a major feature that actually allows "8K gaming" one day without completely hilarious amounts of GPU power. With that many pixels it is outright stupid to shade everything at full resolution unless you are gaming on a 72" screen or something. Think about a hypothetical 34" ultrawide gaming monitor with roughly 8K horizontal resolution (well, four times the pixels of current 3440x1440 ultrawides, so 6880x2880): the pixels are so small that you have to be smart about where you spend your pixel-processing power, or you just end up wasting most of it on detail that no one can see without a magnifying glass.
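For the record, the pixel math checks out; a trivial check:

Code:
#include <cstdio>

int main() {
    const long long ultrawide = 3440LL * 1440;  // 4,953,600 pixels
    const long long hypo8k    = 6880LL * 2880;  // 19,814,400 pixels
    printf("ratio: %lldx\n", hypo8k / ultrawide);  // prints: ratio: 4x
    return 0;
}

Even shading half of those ~20 million pixels at 2x2 instead of 1x1 would cut millions of pixel-shader invocations per frame, which is the whole point.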

Yes, it'll be years before such screens are around and common, but underlying tech like this is always years ahead of practical use. This is also an interesting feature because it is completely up to game developers and artists - with smart defaults they can do some really impressive stuff that you can't really tell apart from "real full resolution".

And yes, widespread adoption in games will take years. The fact that this is DX12-only already means it'll be another year or two before most games could even use it (due to Win7 users who cling to their dying OS).
 
Lol I'm sure a certain green company paid them some cash to have this feature implemented and released just in time for the new AMD GPU releases.
 

Nope.

In fact, Microsoft is the main driver behind VRS tech, and it is part of DirectX. NVIDIA just happens to be the first to support it in hardware (together with Intel with the Ice Lake iGPU). Also, I would be very, very surprised if AMD suddenly stopped supporting new DirectX features and skipped VRS.
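Since it's a DirectX feature, any app can simply ask the API what the GPU supports, and the day AMD ships VRS-capable hardware the same query will light up. A minimal sketch using the real D3D12 feature query (error handling trimmed, 'device' assumed already created):

Code:
#include <d3d12.h>

D3D12_VARIABLE_SHADING_RATE_TIER QueryVrsTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &opts, sizeof(opts))))
        return D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED;
    // As of this thread: TIER_2 on NVIDIA Turing, TIER_1 on Intel Ice Lake
    // iGPUs, NOT_SUPPORTED everywhere else (including current AMD cards).
    return opts.VariableShadingRateTier;
}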
 