Upscaling lowers VRAM usage, because the game is rendered at a lower internal resolution; only frame generation (FG) increases VRAM usage.
That said, usage is still somewhat higher than rendering natively at that lower resolution without DLSS, since the upscaler needs buffers of its own.
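To make the first point concrete, here is a small sketch of the internal render resolutions behind the common DLSS modes. The per-axis scale factors below are the widely cited defaults; actual values can vary per game, so treat this as an approximation:

```python
# Approximate per-axis scale factors for common DLSS modes
# (widely cited defaults; games can override these).
SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for a DLSS mode."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K {mode}: renders internally at {w}x{h}")
```

So 4K Quality actually renders at 2560x1440, and 4K Performance at 1920x1080, which is why the resolution-dependent part of VRAM usage drops the moment you enable upscaling.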
Your own tests also show that the difference in VRAM usage between 720p and 1080p is small, and that something else matters more, so it is not clear why you are now trying to make resolution the decisive factor, and always decisive.
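A back-of-envelope estimate shows why the 720p-to-1080p delta is small: only the render targets and similar buffers scale with resolution. The G-buffer layout below (8 targets at 8 bytes per pixel) is an assumption for illustration; real engines differ, but the scaling argument holds either way:

```python
# Back-of-envelope estimate of resolution-dependent VRAM (render
# targets only). The "8 targets x 8 bytes/pixel" layout is an assumed
# illustrative G-buffer, not any specific engine's configuration.
def render_target_mb(width: int, height: int, targets: int = 8,
                     bytes_per_pixel: int = 8) -> float:
    """Approximate MB used by the resolution-dependent buffers."""
    return width * height * targets * bytes_per_pixel / 2**20

mb_720p = render_target_mb(1280, 720)    # ~56 MB
mb_1080p = render_target_mb(1920, 1080)  # ~127 MB
print(f"720p -> 1080p delta: {mb_1080p - mb_720p:.0f} MB")
```

The delta is on the order of tens of megabytes, while texture pools are measured in gigabytes, which is exactly why texture quality, not resolution, dominates VRAM usage.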
The 3060 12 GB is in a situation similar to the B580's. You do have more VRAM, which lets you run bigger workloads, but the GPU's horsepower is so limited that you will never get decent FPS in any of them, so you'll never run them in real life. That means you'll use upscaling at those resolutions, which brings VRAM usage back down.
Unlike the B580, the 3060 has MUCH less raw GPU performance, which amplifies the problem. For the DLSS Transformer model specifically, the performance hit is much smaller on Blackwell than on older architectures, so on the 3060 it will cost you even more FPS, and the extra VRAM won't help you one bit. The 3060 is $220; I'd still pick the 5050 at $250 if I had to choose between those two cards.
Tests of GPUs in this class should not be done at maxed-out settings. You seem to ignore that the setting that most affects VRAM is texture quality, especially once you exclude RT and other features unsuitable for this price range.
One thing is clear: with the 3060 you can still keep that setting higher.
How demanding a game is overall is not necessarily tied to its VRAM usage; that depends primarily on asset quality, and, again, with the 3060 you can afford better assets.
There are relatively light games that use more than 8 GB of VRAM, just as there are heavy games that may never reach it, so it is not clear what rule you are trying to impose here; it reads like a bottleneck invented as a handbook argument to defend a small amount of VRAM...
Playing with settings, there are also cases where you can aim for 1440p with DLSS, and when you do that and VRAM stays under the 12 GB threshold, the result will be better, because it is predictable that if the upscaler starts from better textures, the output will also be better.
Asset quality is one of the parameters that even "laymen" notice immediately; as I said, it has little influence on how heavy a game is overall, but it does need VRAM.
Is Stellar Blade a heavy game? No. Is Hogwarts Legacy without RT? Not really. So why treat as an axiom something that is not one?
You are effectively calling W1z a liar. You need to stop. Your shtick is coming off as full-on mindless drivel.
I think he meant that they are not enough to show the whole picture. If so, I agree. You can't capture real VRAM behavior with ephemeral five-minute tests, among other things because they don't account for the occasional abnormal stutter, which may barely affect the overall average while being very annoying in practice.
That's without even considering the cases in which textures fail to load, or load late. In those cases, the average frame rate matters even less.