Honestly, this thread sort of turned into a bash-Nvidia/defend-my-Nvidia-card thread. Don't get me wrong, Nvidia deserves to be criticized, but people defending 8GB of VRAM on a $400-500 GPU in 2023 is comical at best.
You see this every time the bar moves up in PC gaming.
When 8-12 thread CPUs became much better for gaming than 4-thread ones. (There are other factors that matter, like cache, single-threaded performance, and clock speed, but those all tend to be higher on the higher-core-count parts, especially Intel CPUs.)
When 16GB of RAM started to outpace 8GB.
When an SSD became almost a requirement over an HDD.
And now, having more than 8GB of VRAM is better than having 8GB or less.
You saw the same people complaining when 4GB outpaced 2GB, then when 6GB outpaced 4GB, then when 8GB outpaced 6GB... you get the picture.
I don't get it. Assuming developers use it properly, more is better; it will never not be better.
It doesn't magically make any of the stuff mentioned above unusable; it's just likely to give you issues in certain scenarios and a lesser overall gaming experience. After all, regardless of how good a game is, if it stutters it completely breaks the immersion for most people.
Either way, for anyone who actually wants to game in 2023 and play every game, the target should be:
A modern 8-12 thread CPU with a decent amount of cache.
16GB of RAM, although 32GB isn't a bad idea.
10-12GB of VRAM.
An SSD.
For gamers targeting 1080p at 60-120Hz with medium/high settings, an 8GB GPU is fine, but at this point 8GB in 2023 should strictly be considered entry level.
If you don't agree, awesome: Nvidia will likely have an overpriced 4060 Ti with 8GB just for you. It will be called the "I hate progress" special edition and will likely age as well as the Intel i5-7600K...
So, I chalk this up to two things. One is history, and the other is frustration with an industry that doesn't see its consumers as valuable enough to invest in.
To the former, you may be aware of the limitations of a 32-bit OS. Why bring this up? Well, for years consoles and PCs were competing on 32-bit hardware. That put a hard ceiling on addressable memory, so your system having 4 GB was literally enough. There was a magic time when developers were limited by their actual resources, so they made compromises to make better games.
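To make that 4 GB ceiling concrete: a 32-bit pointer can address at most 2^32 distinct bytes, which works out to exactly 4 GiB. A trivial sanity check in Python (nothing game-specific, just the arithmetic):

```python
# A 32-bit pointer can distinguish 2**32 byte addresses.
addressable_bytes = 2 ** 32
print(addressable_bytes / 1024 ** 3)  # 4.0 -> a flat 4 GiB address-space ceiling
```

That ceiling is why a 32-bit-era machine with 4 GB really did have "enough" memory for its time.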
This isn't a set of thick rose-tinted goggles. Those games often made compromises that look idiotic to modern sensibilities. Try going back to System Shock 2 and telling me the game wasn't a steaming mess at times. That said, you had to optimize. You had to cut what didn't work. You had to be economical and choose what got done. For another example, take Fallout 3, which had an "open world" that loaded in tiles... and anyone who played it for any length of time probably had a run through the wasteland interrupted by an invisible loading screen or two... all so it could load the next brown-and-gray tileset. Again, not perfection, but definitely built around its limitations.
The reason this is a problem is that new games simply don't require that level of optimization. Look at Bethesda, who are still basically running Gamebryo as their engine. In Fallout 4 you've got the same loading-cell issues and the same problems with render distance, and the solution wasn't to improve the system. Their solution was basically to raise the hardware requirements or accept terrible performance (read: downtown Boston) and call it a day. "Throw more resources at it" rather than "optimize it so it works" is the new answer to the problem.
Now, Fallout 4 also touches on point two. Rarely is a PC port given the love required to be awesome. Horror stories abound, but the general assumption is that because of poor optimization you need a PC not equal to, but more powerful than, a console to get the same performance. I call some of that truth and some of it shenanigans. Consoles advertise 4K... without discussing that it's rendered in a tiled/checkerboarded mode. They tout 4K without noting the upscaling routines. The PC has 144+ Hz monitors to contend with... whereas 60 Hz is basically what most consoles target. All of this is fine, but it's opaque to consumers.
Let me be a little more concrete. The problem is that with known hardware, developers test features, enable whatever doesn't tank performance too much, and they're done. That's console optimization. PC optimization, by contrast, is a joke. What hardware do you assume is running? Which GPU drivers? How do a given CPU, RAM, and GPU interact? Can you really assume that someone running a Sandy Bridge CPU with a 3060 Ti gets roughly the same performance as a low-end Ryzen 5000 with a 1080 Ti? All of this is why, instead of "optimization," it's more accurate to call it a minimum feature listing plus a recommended and minimum hardware set.
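For a sense of what that "minimum feature listing plus hardware tiers" approach can look like in practice, here's a hypothetical sketch (the function name, VRAM thresholds, and preset names are my own illustration, not from any real engine): detected hardware just gets bucketed into a preset rather than tuned per configuration.

```python
# Hypothetical illustration only: bucket reported VRAM into quality presets
# instead of genuinely optimizing for each CPU/RAM/GPU combination.
def pick_preset(vram_gb: float) -> str:
    if vram_gb >= 12:
        return "ultra"   # headroom for high-res texture pools
    if vram_gb >= 10:
        return "high"
    if vram_gb >= 8:
        return "medium"  # today's 8GB cards land here, not at the top
    return "low"

if __name__ == "__main__":
    for card, vram in [("8GB card", 8), ("12GB card", 12), ("6GB card", 6)]:
        print(f"{card}: {pick_preset(vram)}")
```

Crude, but that bucketing is roughly the level of per-hardware "tuning" many ports actually ship with.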
This may sound like a contradiction. Earlier I railed against poor coding... and here I'm talking about not optimizing things at all. Marrying the two requires you to, for a moment, pretend to be a code monkey. How do you solve the impossible problem of optimizing code for the unknown? The only real option is to make the code run well enough on a test system, push it out at launch, and when it breaks, collect feedback and work on fixes for reported issues. Bad coding is often a sign of both lazy practices and genuinely intricate hardware.
Once you marry these two ideas, it's not difficult to see why we fail, and why companies aren't willing to truly optimize for the myriad of configurations that can theoretically be out in the wild.
Now, your final point... you're welcome to your opinions. I have nothing worth saying.