Thursday, December 5th 2019

UL Benchmarks Outs 3DMark Feature Test for Variable-Rate Shading Tier-2

UL Benchmarks today announced an update to 3DMark that expands the Variable-Rate Shading (VRS) feature test with support for VRS Tier 2. A component of DirectX 12, VRS Tier 1 is supported by the NVIDIA "Turing" and Intel Gen11 (Ice Lake iGPU) graphics architectures; VRS Tier 2 is currently supported only by NVIDIA "Turing" GPUs. Tier 2 adds further performance optimizations, such as lowering the shading rate for areas of the scene with low contrast against their surroundings (think areas under shadow), yielding performance gains. The 3DMark VRS test runs in two passes: pass 1 with VRS off, to provide a point of reference; and pass 2 with VRS on, to measure the performance gained. The update with the VRS Tier 2 test applies to the Advanced and Professional editions of 3DMark.
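For those curious how an application finds out which VRS tier a GPU exposes, the check is a standard D3D12 feature-support query. A minimal C++ sketch (assuming a valid ID3D12Device and a Windows 10 SDK recent enough to define D3D12_FEATURE_D3D12_OPTIONS6, i.e. version 1903 or later):

#include <d3d12.h>

// Returns the Variable-Rate Shading tier exposed by the device:
// 0 = no VRS, 1 = per-draw rates only, 2 = adds per-primitive rates
// and a screen-space shading-rate image.
int QueryVrsTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS6, &options, sizeof(options))))
        return 0; // older runtime: run the VRS-off reference pass only

    switch (options.VariableShadingRateTier)
    {
    case D3D12_VARIABLE_SHADING_RATE_TIER_2: return 2; // e.g. "Turing"
    case D3D12_VARIABLE_SHADING_RATE_TIER_1: return 1; // e.g. Gen11 iGPU
    default:                                 return 0;
    }
}

This is presumably the kind of check 3DMark performs to decide whether the Tier 2 test can run at all on a given card.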

DOWNLOAD: 3DMark v2.11.6846

58 Comments on UL Benchmarks Outs 3DMark Feature Test for Variable-Rate Shading Tier-2

#51
cucker tarlson
medi01: Let us all jump to "it surely is always indistinguishable in all games" based on one static screenshot, shall we?
You're way late; this was already used in the last two Wolfenstein games.
Fluffmeister: Try it on your Navi card. Oh wait. :p
Feliz Navi-dead :laugh:
#52
Chrispy_
Vayra86: The whole "devs have to implement it" part is really the risky bit. If that happens at the engine level and the 'heatmap' generation can be automated and tweaked in a simple, transparent way, then all is well; but if it forces devs to do per-game, per-sequence optimizations, that can go to shit, fast. Then it'd just be DLSS all over again.
As I was running this new VRS benchmark, I also went back to the Port Royal DLSS test and was surprised by (or rather, had forgotten) what a big performance bump it gave and how little quality loss there was. It's not that it's cheating in any way; it's just that so much effort was put into manually optimising the DLSS implementation for that single minute-long, scripted scene that it isn't a realistic representation of what a time-pressured game dev will do for dynamic content. Real-game DLSS is so bad that I'd rather run at a lower resolution or turn down the graphics options slightly to get higher framerates.

And that's why I have my doubts about the effectiveness and validity of the 3DMark VRS benchmark. This carefully tuned, best-case result looks fantastic, but game devs are just going to tick the box to enable the feature in the engine, and without significant tweaking it'll either hurt image quality or offer minimal performance improvement.
#53
FM_Jarnis
The VRS tests in 3DMark are not "maximum possible" scenarios; by default they actually try to mimic a realistic game scenario and stay as close to "invisible" as possible. The goal for the Tier 2 test was that the difference should not be visible in motion unless you knew specifically what to look for.

Are the gains bigger than they most likely will be in games? I don't think so.

Note that the Tier 1 test is much more lightweight (and runs at 1080p vs. 4K for Tier 2) to accommodate the lower end of the hardware spectrum, i.e. the Ice Lake iGPU; and because it uses a simpler version of the tech, the quality difference there is more apparent.
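For those wondering what that difference looks like at the API level, here is a rough C++ sketch against the D3D12 headers (not 3DMark's actual code; creating the R8_UINT rate image and checking tier support are omitted):

#include <d3d12.h>

// Rough sketch of binding shading rates before drawing the scene.
// On Tier 2, 'rateImage' is a small R8_UINT texture with one texel per
// ShadingRateImageTileSize x ShadingRateImageTileSize pixel tile.
void BindShadingRate(ID3D12GraphicsCommandList5* cl, ID3D12Resource* rateImage)
{
    if (rateImage == nullptr)
    {
        // Tier 1: a single coarse rate applies to the whole draw call,
        // which is why the quality cost is easier to spot.
        cl->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
        return;
    }

    // Tier 2: the base rate stays 1x1 and the screen-space image
    // overrides it per tile, so only low-contrast regions get coarsened.
    D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH, // keep the per-draw rate
        D3D12_SHADING_RATE_COMBINER_OVERRIDE     // let the rate image win
    };
    cl->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
    cl->RSSetShadingRateImage(rateImage);
}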

If you want the maximum possible gains, there are settings in the Interactive Mode. You can also save screenshots from it for closer analysis, and observe how the framerate changes as you trade off quality for performance.

VRS is going to be (eventually) a major feature that actually allows "8K gaming" one day without completely hilarious amounts of GPU power. With that many pixels it is outright stupid to shade everything at full resolution unless you are gaming on a 72" screen or something. Think about a hypothetical 34" ultrawide gaming monitor that approaches 8K horizontal resolution (well, four times the pixels of current 3440x1440 ultrawides, so 6880x2880): the pixels are so small that you have to be smart about where you spend your pixel-processing power, or you just end up wasting most of it on detail that no one can see without a magnifying glass.
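Some back-of-the-envelope numbers for that hypothetical panel (a quick C++ sketch; the 50% coarse-tile share is an illustrative assumption, not a figure from the test):

#include <cstdio>

int main()
{
    // Today's 3440x1440 ultrawide vs. the hypothetical 6880x2880 panel.
    const long long now  = 3440LL * 1440;  //  4,953,600 pixels
    const long long next = 6880LL * 2880;  // 19,814,400 pixels, 4x as many
    printf("pixels: %lld -> %lld\n", now, next);

    // Assume VRS shades half the screen at a 2x2 rate, i.e. one
    // invocation per four pixels in those tiles.
    const double invocations = 0.5 * next + 0.5 * next / 4.0;
    printf("invocations with VRS: %.0f (%.1f%% of full rate)\n",
           invocations, 100.0 * invocations / next);  // 12,384,000 (62.5%)
    return 0;
}

Even under that modest assumption, over a third of the shading work disappears with no change in output resolution.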

Yes, it'll be years before such screens are around and common, but underlying tech like this is always years ahead of practical use. This is also an interesting feature because it is completely up to game developers and artists - with smart defaults they can do some really impressive stuff that you can't really tell apart from "real full resolution".

And yes, widespread adoption in games will take years. The fact that this is DX12-only already means it'll be another year or two before most games could even use it (thanks to Win7 users clinging to their dying OS).
#54
evolucion8
Vayra86: This is pretty impressive. Bring it on.
And then a gen later AMD copies it.
The most clueless post of the topic.
#55
Vayra86
evolucion8: The most clueless post of the topic.
Thanks for contributing!
#56
Penev91
Lol I'm sure a certain green company paid them some cash to have this feature implemented and released just in time for the new AMD GPU releases.
#58
FM_Jarnis
Penev91: Lol I'm sure a certain green company paid them some cash to have this feature implemented and released just in time for the new AMD GPU releases.
Nope.

In fact, Microsoft is the main driver behind VRS tech, and it is part of DirectX. NVIDIA just happens to be the first to support it in hardware (together with Intel, with the Ice Lake iGPU). Also, I would be very, very surprised if AMD suddenly stopped supporting all DirectX features and skipped VRS.