I kind of enjoy reading this back and forth.
What gets me is the YouTube videos. Earlier today I watched one claiming that previous generations (3000 series) were crippled because they don't have enough VRAM. I then looked at some of their testing...and the cards on the block aren't exactly consumer choices. I then looked at Steam as a bellwether, and only recently has the 1660 been dethroned by the 3050...a card that is already not great but was one of the lowest priced (and thus most affordable) during the pandemic. Heck, even today sites like Newegg are flogging 1000 series GPUs...in the age of the 4000 series.
As far as my opinion, the usual caveats apply:
1080p is the standard resolution for most gaming.
60 Hz is still the standard most people game at, judging by the most cost-efficient monitors.
Games are getting bigger, sloppier, and more resource intensive. This drives up requirements generally without driving up appreciable fun.
Most people game for fun, and not to measure their system performances.
Stating all of the above, I believe that barring the need to play the latest poorly optimized port trash, 8 GB is enough for entry level. I believe that anything above that is probably more future-resistant.
Taking a tangent for one moment:
3060 - 12 GB
3060ti - 8 GB
3070 - 8 GB
3070ti - 8 GB
3080 - 10 GB
3080 - 12 GB
3080ti - 12 GB
There's plenty of reason to argue that the 3060ti and 3060 having faster, but less, VRAM is fine. Where I have issues is that Nvidia is basically adding 2 GB every other generation to their "price friendly" options, and their higher end seems to be growing rapidly...but pricing is uniformly going up. That is to say, the 1070 had an MSRP of $379 and the 3060ti has a $399 tag; the card two generations older is only about 20% slower while having the same VRAM. Meanwhile, the trend of each generation moving down one peg should put the 1070 at 3050 level, but the 1070 is 7% faster...with 8 GB of VRAM...
That's a bit goofy...but let me follow up with a question. If we know the 970 was crippled by slow VRAM years ago, the 1060 3GB badly underperformed the 6GB, and VRAM is one of the cheapest components to upgrade, then why don't we see more variants? Well, the 4070ti. Three variants with varying VRAM, and they also came with crippled core counts, which let laymen read the results as less VRAM = worse without realizing it was actually less VRAM + fewer cores = worse.
Tangent aside, I do have a second answer. When opting for the 3080, 3080 12GB, or the 3080ti, I chose the 12GB version. For me the choice was simple. In 2019 the 5700 XT was well under its $349 MSRP. It came with 8 GB, and was replaced by the 6700 XT with 12 GB. Extrapolating, if Nvidia pulled the same generational mid-range increase we'd see a 3070 at 12 GB and a 4070 at 16 GB (ignoring the 2000 series as a refresh). It kind of boggles my mind that AMD is doubling down on VRAM at all levels while Nvidia is betting on upscaling for its cheaper parts and feeding the beast VRAM on the high end, but effectively we're discussing all of this because of a fraction of a fraction of all games...
I think that if you want the best visuals then 8 GB is too little, and if you want to focus on games more than graphics then 8 GB is enough. We'll continue to see useless increases in requirements to drive new hardware sales, because only a small minority of people game at high resolution or high refresh rates...neither of which is really going to be a thing until sub-$200 cards can support them.
Reinforcing the above, the newest consoles can theoretically run 4K. They don't run high refresh rates. Their 4K is...not with all of the bells and whistles. Most people still game on consoles or mobile...so the market is aiming for that. Both Nvidia and AMD are making sure 4K and high refresh don't happen by pricing people out of the market, with what used to be a $400 card (1070ti) now at $800 (4070ti). That makes the equation less about a required minimum and more about a point of affordability, when formerly mid-high tier cards are priced at almost twice a console without any other hardware accounted for. So, my response here is that the "minimum" is whatever you can reasonably afford. "Requirements" defined by greedy programs aren't really reasonable when you're playing 5+ year old games, because a $100 bill buys either an 8-hour AAA campaign, multiplayer, and DLC, or a dozen indie games that increase your playing time tenfold versus that one expensive option using the latest bells and whistles.