I think 8GB is cutting it close in 2023, but I also see some damn stupid considerations in that video:
- gaming at ultra is stupid, that's been proven more than enough; any video that shows a card struggling at ultra settings is idiotic
- gaming at very high settings at 60 FPS or even lower, like I've seen in that video, is also stupid when you can lower the settings for better frame rates, a better experience overall, and hardly notice the difference
- the 6800 is on average 10% faster than the 3070, so drawing conclusions based on VRAM alone is also stupid, not to mention certain games prefer team red or green; none of that is taken into account
Funny thing is, the 8GB I have now has enabled ultra textures regardless and still does, even if I cut down on some super performance-hogging post effects. Even in a game like TW Wh3, where I get sub-50 FPS, I can still run maximum quality textures, or near it, while dropping some other settings like SSR to keep the thing nicely playable. It still looks great for the performance it has. On a 7-year-old card, at a native 3440x1440. Everything I play today still works stutter free. Butter smooth. Even at 40 FPS, perfectly playable. So yes, the GPU is pegged at 100%, but the experience is rock solid and the detail is close to top end.
VRAM enables you to actually maintain high fidelity/detail on the basic assets and high LOD settings on geometry, while not costing a lot of your GPU core. In that sense it's a LOT more than just textures. It's the base detail of the whole picture, the thing every other post effect builds on. It's also draw distance, which is huge for immersion in open world games.
You've got this completely backwards, as do all those others who defend "I'll dial down some textures" because they lack VRAM and say it's fine. What KILLS performance on any GPU is realtime processing and post processing: it adds latency to every frame regardless of the texture detail you've got. High texture detail, on the other hand, adds little to no latency, because it's already loaded into VRAM well before it's used.
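To put some rough numbers on why textures are a VRAM cost rather than a per-frame compute cost, here's a back-of-envelope sketch. It assumes uncompressed RGBA8 (4 bytes per texel) and a full mipmap chain (which adds roughly a third on top of the base level); real games use block compression, so actual footprints are several times smaller, but the scaling is the same:

```python
# Rough VRAM footprint of a single texture at different resolutions.
# Assumes uncompressed RGBA8 (4 bytes/texel) and a full mip chain:
# the mip levels form the series 1 + 1/4 + 1/16 + ... which sums to 4/3.

def texture_vram_mib(width: int, height: int, bytes_per_texel: int = 4) -> float:
    base_bytes = width * height * bytes_per_texel
    with_mips = base_bytes * 4 / 3  # full mipmap chain overhead
    return with_mips / (1024 * 1024)

for res in (1024, 2048, 4096):
    print(f"{res}x{res}: ~{texture_vram_mib(res, res):.1f} MiB")
# 1024x1024: ~5.3 MiB
# 2048x2048: ~21.3 MiB
# 4096x4096: ~85.3 MiB
```

The point: quadrupling texture resolution quadruples memory, but once those bytes are resident in VRAM, sampling them costs the shader essentially the same per frame. A post effect, by contrast, runs its full-screen pass on every single frame.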
People should stop these idiotic coping mechanisms and see things for what they are. History is full of these examples and this has never been different. If you lack either core/thread count or RAM on your CPU/board, or VRAM on your GPU, you're cutting its life expectancy short. It's that simple. Bad balance is bad balance, and it means you're not getting the best out of your hardware.
For my next GPU it's going to be very simple. 3x the perf of the 1080? I want about 3x the VRAM. That 24GB 7900XTX is looking mighty fine, covering that principle perfectly. The 7900XT and 6800XT are also along that scale with 20~16GB. But 12GB on a 4070 Ti is ridiculous and I'm staying far, far away. Given the cost of today's GPUs, I think 5-6 years of life expectancy at top-dog gaming is what we should be aiming for, most definitely. That keeps gaming affordable. If I add up the cost of the 1080, minus its resale at 150~200 EUR, plus buying a 7900XTX, I'd end up at about 1.1K EUR for a total of 12 years of gaming. That's less than 100 EUR a year. Having the perfect VRAM balance makes that happen.
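For what it's worth, the cost-per-year claim above checks out on the stated figures (~1.1K EUR net across both cards, ~12 years of gaming):

```python
# Cost-per-year check using the figures from the post:
# ~1.1K EUR net spend (1080 minus resale, plus a 7900XTX) over ~12 years.
total_eur = 1100
years = 12
per_year = total_eur / years
print(f"~{per_year:.0f} EUR/year")  # just under the 100 EUR/year mark
```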
Several members here have pointed out that VRAM usage and VRAM needed aren't always the same and that's true. You will know when the game needs more VRAM than you have because it will resort to using System RAM which is way, way slower.
You'll know you're missing hardware resources when the game stutters. That applies to a lack of any kind of memory - or any kind of hiccup in feeding that memory, which means a lack of CPU cycles for that specific frame, or not having the data available in time. All of this points at system RAM and VRAM and their bandwidth. All of my gaming rigs had stuttery experiences until I stopped buying midrange shite GPUs with crappy Nvidia memory subsystems.
Back when GPUs were jumping between much lower memory capacities (256~512MB > 1GB/2GB), the urge to upgrade was apparent much faster. Those were the days when you'd turn 180 degrees in a shooter and could easily be served massive stutter as textures were swapped in - and if you didn't stutter, you'd have pop-in of assets and textures all the time... Remember Far Cry 3... VRAM in GPUs actually started to become great around the early Maxwell days, when we realized 3GB was a massive boon over 2GB; 4GB midrange followed soon after - heck, even the 970 with its crippled 4GB was in a miles better place than anything Kepler based. 6GB lasted super long - you could say it still does on low-end 1060s today - and 8GB similarly was built to last when it was new. To compare to today and still be looking at 8GB on much more powerful GPUs, saying it's all fine... is a strange thing to see.