Those are the key points in your post, and neither is a guarantee. What'll happen is that dynamic resolution will force an Nvidia GPU into lower detail levels sooner and faster than the similarly performing AMD GPU. Consoles DO have access to 10+ GB of VRAM though - potentially 11-13GB for graphics.
I agree with you and most others when it comes to how we used to approach VRAM. But what has been is not what tomorrow looks like. We're looking at higher bandwidth usage and a preference for fast access to storage, ESPECIALLY because there is much more going over that bus now that allocation is even more dynamic than it used to be. Already we notice how interconnects matter wrt latency, for example. Frame times are already better in numerous games on an RDNA2 card - look at TPU reviews. It's not just the FPS; frame delivery also benefits from not swapping and re-allocating. And that's right away, post release. It won't be getting worse on the AMD side of things, only better. Ampere can similarly still improve... but there is much less wiggle room.
Something's gonna give. 10GB means a heavier load on swaps, which results in one of two things: lower detail level, or lower performance. The only real basis anyone has for saying '10GB is enough' is that you believe Nvidia on its green eyes that it will be. But Nvidia is not leading the game industry at the start of a new console gen. We all know this - if you look at the past, every new console gen was a major influence for a new performance level in games. Nvidia pre-empted the last console release with a very solid GTX 970 (and lo and behold... it also had a so-so memory system!), but even so, you don't want to run a 4GB GPU for the last crop of PS4 games, do you? You want 6 or 8 at least.
So if you're going to look back, don't look with rose-tinted glasses, but be honest and apply the principle everywhere. Games DO exceed the last console gen's launch GPU right now, they have been for several years, and it's not even exclusive to the highest resolution. The logical conclusion here is that the 3080's 10GB will be obsolete long before this console gen is over. And then, when you're being honest with yourself... consider whether that is acceptable or not. The trend is clear: Nvidia has been cutting back on VRAM per performance tier while steadily increasing the price points, the competitor is not, and the competitor is defining the general direction of game/port development. Just put two and two together.
It's all crystal-ball guesswork... but this is my educated guess.
Who keeps a GPU for 8 years tho? That's an entire console generation. Most people who do that keep playing the same games, like WoW etc.
Most PC gamers that play new AAA games will be upgrading AT LEAST 2 times in a console generation - for me, more like 4 times. Every 2 years is what I do.
A high amount of VRAM will not save you, because the GPU is not getting any faster; you will still be looking at low fps in the end, forcing you to lower image quality, and the VRAM requirement drops as a result, making the extra VRAM pointless again.
That's why you don't go all-out on VRAM if the GPU is not absolute high-end to begin with; it's better to upgrade more often than to try to futureproof.
Never be more than 2 generations behind if you want proper driver support from Nvidia/AMD or even game devs (who mostly test with the newest and last gen).
Nvidia and AMD will focus on the newest arch first, then "last gen"; older archs *might* get support, might not - you will see wonky performance and issues, which is what people with older GPUs often experience in new games.
Take the 390X for example: it had 8GB, and today it can barely do 1080p maxed. The GPU is way too dated even though the VRAM is fine - the extra memory still does not save performance. Then look at the 3070 with 8GB too; it performs like a 2080 Ti even at 4K:
https://www.techpowerup.com/review/msi-geforce-rtx-3090-suprim-x/33.html
A friend of mine bought a 380X solely because of the VRAM. He thought he could use the card for 5+ years, but in the last few years he has experienced flicker and weird glitches in tons of new games + bad performance in most games released after 2018 - very bad in some of them (unplayable), shadow bugs and even crashing (but the card does 3DMark looping for hours, meaning it's the games/drivers).
Meanwhile the Fury X released with 4GB and AMD claimed this was more than enough, yet it aged like milk because of the 4GB. It can still do 1080p "fine" tho; the problem is that it was a 1440p-4K card. The 980 Ti still does 1440p DECENT in most games using medium settings, while the Fury X can barely do low.
6GB in 2020 is still decent for 1080p, 8GB for 1440p, and 10GB for 4K, and you should be fine for a few years; if not, lower some settings and enjoy anyway - who cares if you play a game at 95% IQ instead of 100%?