
Is 8 GB VRAM the minimum target for 2023 gaming?

Is 8 GB VRAM the minimum entry for gaming in 2023 and onwards?

  • Yes

    Votes: 69 56.6%
  • No

    Votes: 53 43.4%

  • Total voters
    122
  • Poll closed.
Too bad that 7900 XT didn't come with 8 GB, you could have saved some money on that wasted 12 extra GB it has... Speaking of which, it's a shame my 4090 didn't come with 8 GB, not sure what I'm going to do with this useless extra 16 GB it has...

I guess I'll just have to get into Stable Diffusion so it doesn't go to waste...
Just because 8 GB is perfectly fine on a certain card with certain games at certain settings, it doesn't mean it's also fine on a different card with different games and/or different settings.

Like I said earlier, I'm happy to play many games on my HTPC with a 4 GB 6500 XT in it. Where the card runs out of VRAM it also runs out of GPU power, so technically, it isn't constrained just by its 4 GB VRAM, but by a combination of things. Having 2 or 3 times the amount wouldn't help it much, if at all. I imagine it's a similar situation with an 8 GB Vega64.

Having only 8 GB on a 7900 XT would be just as stupid as having 12 or 16 GB or more on a 6500 XT.
 
Just because 8 GB is perfectly fine on a certain card with certain games at certain settings, it doesn't mean it's also fine on a different card with different games and/or different settings.

Like I said earlier, I'm happy to play many games on my HTPC with a 4 GB 6500 XT in it. Where the card runs out of VRAM it also runs out of GPU power, so technically, it isn't constrained just by its 4 GB VRAM, but by a combination of things. Having 2 or 3 times the amount wouldn't help it much, if at all. I imagine it's a similar situation with an 8 GB Vega64.

Having only 8 GB on a 7900 XT would be just as stupid as having 12 or 16 GB or more on a 6500 XT.

I was being sarcastic due to the mental gymnastics people are doing over 8 GB cards... 16 GB on the 6500 XT would be funny, although it should never have come in a 4 GB variant...
 
I was being sarcastic due to the mental gymnastics people are doing over 8 GB cards... 16 GB on the 6500 XT would be funny, although it should never have come in a 4 GB variant...
As an owner, I think 6 GB would have been fine, but 8 GB on a card of its class is overkill. The engineers at AMD probably knew this, and chose to take the Nvidia route and give the card the lower amount of the two options that its memory controller supports. It's not the end of the world, though, because like I mentioned, the card runs out of GPU juice as well as VRAM in 99% of the cases where it struggles. Having 8 GB wouldn't help it much.
 
Several members here have pointed out that VRAM usage and VRAM needed aren't always the same and that's true. You will know when the game needs more VRAM than you have because it will resort to using System RAM which is way, way slower.
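If you want to see that happen on your own rig rather than trust an overlay, a rough sketch with NVIDIA's NVML bindings will do. This is just an example, assuming an NVIDIA card and the nvidia-ml-py package; it reports whole-GPU dedicated memory, not what a single game has allocated:

Code:
    import time
    import pynvml  # pip install nvidia-ml-py

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    try:
        while True:
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
            used_gb = mem.used / 1024**3
            total_gb = mem.total / 1024**3
            # When dedicated VRAM sits near its limit, fresh allocations are
            # likely spilling into shared system memory - the slow path.
            warn = "  <- near capacity, spillover likely" if mem.used / mem.total > 0.95 else ""
            print(f"VRAM {used_gb:.1f}/{total_gb:.1f} GB{warn}")
            time.sleep(1)
    finally:
        pynvml.nvmlShutdown()

Watch it while the game runs: if the number stays pinned at the card's limit while frame times get spiky, that's the spillover in action.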
 
Several members here have pointed out that VRAM usage and VRAM needed aren't always the same and that's true. You will know when the game needs more VRAM than you have because it will resort to using System RAM which is way, way slower.
I don't care what the others say about you, my Valkyrian feline friend, you know your stuff.


Learn that the vast majority of games only use the VRAM they need. Games like CoD are the exception, not the rule.


I mean @AusWolf beat me to it but it's like mic drop time

Ah, using "usage" is just to make it easier to understand for less technical people, because "allocation" is such a big word
 
Several members here have pointed out that VRAM usage and VRAM needed aren't always the same and that's true. You will know when the game needs more VRAM than you have because it will resort to using System RAM which is way, way slower.

Yep, although VRAM spilling over into main system memory doesn't always mean performance will be bad. Some game engines have the ability to swap textures back and forth asynchronously so that you don't see a performance hit. The downside is that there may be visibly blurry textures, like in Hogwarts Legacy, for example, when running high texture quality on an 8 GB video card. There are also instances where data spilled over into main system memory just isn't that important to the game's overall performance, so having a certain amount sitting in main system memory might not be a big deal in some games. I do believe devs have done a good job ensuring good performance on 8 GB cards given how long that's remained the mainstream standard, but it's about time we moved on. 8 GB should be the minimum by now, given it was in $240 graphics cards 7 years ago. That's a long time in the PC space.
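Roughly, the idea behind that kind of streaming is a fixed VRAM budget plus an asynchronous loader that demotes textures to lower mips when the budget is exceeded, instead of stalling the frame. A toy sketch of the concept, purely illustrative (the names are made up, no real engine exposes it like this):

Code:
    from dataclasses import dataclass

    VRAM_BUDGET_MB = 8 * 1024  # pretend we're on an 8 GB card

    @dataclass
    class Texture:
        name: str
        mip: int        # 0 = full resolution, higher = blurrier
        size_mb: float  # footprint at the current mip level

        def drop_mip(self):
            # Each mip level takes roughly a quarter of the previous one's memory.
            self.mip += 1
            self.size_mb /= 4

    def fit_into_budget(textures):
        """Demote the biggest textures until the working set fits,
        instead of blocking the render thread on a hard limit."""
        while sum(t.size_mb for t in textures) > VRAM_BUDGET_MB:
            biggest = max(textures, key=lambda t: t.size_mb)
            biggest.drop_mip()  # the texture stays usable, just blurrier
        return textures

The render thread never waits; the cost shows up as blurry bricks like the Hogwarts example instead of a stutter.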
 
Tbh I thought the stutters I had in FH5 were CPU-related, but I swapped it out for a single-CCD CPU and the stutters are still there :D

In 2023 I don't know what I would buy, but 8 GB is a bit on the thin side. And honestly, the extra 2 GB on the 3080 made no sense to me, because 2 GB is feck all really. 16 should probably be the minimum...
Well, with nVidia allotting 16 GB of VRAM to the RTX 4080, I seriously doubt you'd see an equal amount of VRAM on lower-end cards (vis-a-vis the RTX 4080). Like the RTX 3060 Ti/3070/3070 Ti, nVidia is gonna limit the amount of VRAM on the RTX 4070/4070 Ti to 12 GB. Unless nVidia does another outlier like the odd RTX 3060 12 GB by arming its RTX 4060 with 16 GB of VRAM.

nVidia knows full well that with more and more games chewing up VRAM like mad, these cards really ought to have more VRAM. Without it, they'll have a shorter period of usefulness at max settings + RT at up to 1440p (which they are quite capable of), and like RTX 3060 Ti/3070/3070 Ti owners, their owners will have to upgrade sooner rather than later. Obviously, 16 GB of VRAM would last longer than 12 GB in the long run at certain resolutions and in-game settings, that's a fact.
 
I've had my card for 2 years, and it's been good to me for the most part. I have a hard time wanting to support Nvidia, and even AMD.
 
Yes we should. Reviews only use the term "VRAM usage" to make it somewhat understandable for not so tech-oriented readers. VRAM reservation, or VRAM allocation would mean nothing to them.

You can see games that use 10+ GB on a card that has more max out 8 GB on an 8 GB card, but still run fine. I remember trying RE: Village demo on my 2070 after some people here in the forum suggested that it runs like crap with Ultra graphics on an 8 GB card. It ran just fine.

See:
Oh, allocation... well, I am not one of the less technical people... but I do tend to take things at face value... so, calling it "usage" just misdirected me instead of saying what it is for what it is...

Corrected the post, although the argument of Skyrim maxing out VRAM still holds, even if it's "anecdotal": being allocated 11.2 GB... instead of the vanilla 1.8 GB :p The allocation still changes with resolution, textures and effects (RT), but it doesn't use all that's available, unless there isn't enough VRAM.
 
I think 8 GB is cutting it close in 2023, but I also see some damn stupid considerations in that video:
- Gaming at ultra is stupid, that has been proven more than enough; any video that shows a card struggling at ultra settings is idiotic.
- Gaming at very high at 60 fps or even lower, like I've seen in that video, is also stupid when you can lower the settings for better frame rates and a better experience overall, and can hardly notice the difference.
- The 6800 is on average 10% faster than the 3070, so drawing conclusions based on VRAM alone is also stupid, not to mention certain games prefer team red or green; none of that is taken into account.

Funny thing is, the 8 GB I have now has enabled ultra textures regardless and still does, even if I cut down on some super performance-hogging post effects. Even in a game like TW Wh3, I get sub-50 FPS, but I can still run maximum quality textures, or near it, while I drop down some other settings like SSR to keep the thing nicely playable. It still looks great for the performance it has then. On a 7 y/o card, at a native 3440x1440. Everything I play today still works stutter-free. Like butter smooth. Even at 40 FPS, perfectly playable. So yes, the GPU is pegged to 100%, but the experience is rock solid and detail close to top end.

VRAM enables you to actually maintain high fidelity / detail on the basic assets and high LOD settings on geometry, while not costing a lot out of your GPU core. In that sense it's a LOT more than just textures. It's the base detail of the whole picture, the thing every other post effect builds up on. It's also draw distance, which is huge for immersion in open-world games.

You've got this completely backwards, as do all those others who defend 'I'll dial down some textures' because they lack VRAM and say it's fine. What KILLS performance on any GPU is real-time processing and post-processing: it adds latency to every frame regardless of the texture detail you've got. High texture detail, on the other hand, adds little or no latency, because it's already loaded into VRAM way before it's used.

People should stop these idiotic coping mechanisms and see things for what they are. History is full of these examples and this has never been different. If you lack either core/thread count or RAM on your CPU/board, or if you lack VRAM on your GPU, you're cutting its life expectancy short. It's that simple. Bad balance is bad balance and it means you're not getting the best out of your hardware.

For my next GPU it's going to be very simple. 3x the perf of the 1080? I want about 3x the VRAM. That 24 GB 7900 XTX is looking mighty fine, covering that principle perfectly. The 7900 XT and 6800 XT are also along that scale with 20~16 GB. But 12 GB on a 4070 Ti is ridiculous and I'm staying far, far away. Given the cost of today's GPUs, I think 5-6 years of life expectancy at top-dog gaming is what we should be aiming for, most definitely. That keeps gaming affordable. If I add up the cost of the 1080 (minus resale at 150~200 EUR) and then buying a 7900 XTX, I'd end up at about 1.1K EUR for a total of 12 years of gaming. That's less than 100 EUR a year. Having the perfect VRAM balance makes that happen.
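Back-of-the-envelope, using only the figures above:

Code:
    # Quick check of the per-year claim (EUR figures from the paragraph above).
    total_spend = 1100   # ~1.1K EUR across both cards, net of resale
    years = 12           # two cards, roughly 6 years each
    print(f"{total_spend / years:.0f} EUR/year")  # ~92 EUR/year, under 100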

Several members here have pointed out that VRAM usage and VRAM needed aren't always the same and that's true. You will know when the game needs more VRAM than you have because it will resort to using System RAM which is way, way slower.
You will know you're missing hardware resources when the game stutters, that applies to a lack of any kind of memory - or any kind of hiccups in feeding that memory, which means lack of CPU cycles for that specific frame, or not having the data available in time. All of this points at system RAM and VRAM and its bandwidth. All of my rigs for gaming had stuttery experiences until I stopped buying midrange shite GPUs with crappy Nvidia memory subsystems.

Back when GPUs were dabbling between much lower memory capacities (256~512 MB > 1 GB/2 GB), the urge to upgrade was apparent much faster. Those were the days when you'd turn 180 degrees in a shooter and could easily be served massive stutter as textures had to be swapped - and if you didn't stutter, you'd have pop-in of assets and textures all the time... Remember Far Cry 3... VRAM in GPUs actually started to become great around the early Maxwell days, when we realized 3 GB was a massive boon over 2; 4 GB midrange followed soon after - heck, even the 970 with its crippled 4 GB was in a miles better place than anything Kepler-based. 6 GB lasted super long, you could say it still does on low-end 1060s today, and 8 GB similarly was built to last when it was new. To compare to today and still be looking at 8 GB on much more powerful GPUs, saying it's all fine... is a strange thing to see.
 
Non-3D games use less electricity and work on any CPU.
Shadows, lighting, ray tracing, high resolution and anti-aliasing eat all the FPS.
 
You use shared VRAM even if your on-board VRAM isn't full. And please stop calling everyone you disagree with a troll.
The point he's making and stressing is that even with sufficient VRAM, you're already using system RAM to swap assets. At some point, something's gonna give, and the thing that gives is your frame times, which means stutter.

The same relation applies between bandwidth and VRAM. Nvidia is using cache to alleviate bandwidth constraints, but that combo along with low VRAM means they're taxing data swapping heavily and depending heavily on developer TLC to keep the whole affair smooth. And failing at it, as we can see in the results already. Developers haven't got time for that, just the same way mGPU on DX12 died a swift death when it got pushed to 'the developer'. That push basically ended SLI fingers and Crossfire within the space of two GPU generations. Economic realities don't lie. Time is money, and devs will always prefer to focus their time on their game, not on the bullshit surrounding it.

It's not rocket science, honestly, and it never has been, and this silly story of 'muh usage is not muh allocation' needs to stahp. Allocation IS usage, because it means you get to swap less, and therefore you're mitigating frame time variation, keeping that variability focused on the GPU core's capabilities, not its memory subsystems. I could dig up some TPU reviews to show actual examples, including confirmation from the W1zz himself. We had at least one in the Ampere days already on 10 GB. Those exceptions make the rule here, and time will prove it.
 
Correlation is not causation. Bad game engine design or implementation can be causing this. Like I said, I keep seeing examples of shitty ports.

Just trying a completely different card from another brand, with different drivers and more power, is not a scientific method at all; it's just anecdotal clickbait for the masses. I'm referring to that HU video.
I sort of agree. Hogwarts and The Last of Us are prime examples of not bad, but very unoptimized game engines. UE4 is partly to blame, but after a few patches performance increased a lot, so bad optimization was clearly the issue.
 
I sort of agree. Hogwarts and The Last of Us are prime examples of not bad, but very unoptimized game engines. UE4 is partly to blame, but after a few patches performance increased a lot, so bad optimization was clearly the issue.
Don't mistake overall performance improvements for the ones we're talking about here. Those improvements applied to every card, not just the ones with low VRAM.
And shitty ports... yeah! But does it matter? Don't we all just want to play those shitty ports regardless? They got ported because the content itself is top notch, after all, and this has been happening since the PS2/PS3 days, including them being shitty and unoptimized, with some of them never getting optimized.
 
The RTX 3070 has issues all the way down to 1080p medium settings, so how low are 3070 owners supposed to set their expectations? The 6800 can play those games at much higher texture settings while maintaining a good frame rate, something the 3070 could have done had Nvidia simply included more VRAM.

"Gaming at ultra is stupid" and "gaming at very high" is both an opinion and not reflective of what was shown in the video. First, the video demonstrates performance at ultra and high, not very high. Second, the RX 6800 achieves 119 FPS on high settings and 97 FPS with ultra settings, while the 3070 gets 71 FPS on high settings with a whopping 37 FPS 1% lows. As stated before, the 3070 could have good performance here if it had more VRAM. Third, it should go without saying that not everyone is going to agree with you that gaming on ultra and high settings is stupid. That's an opinion, and a large portion of 3070 owners and PC gamers in general are likely to disagree.

It is honestly amazing the number of people willing to perform mental gymnastics to somehow contort this lack of VRAM into a non-issue; it's akin to Apple telling its customers they are holding their phones wrong. No, a customer uses their product as they wish; you aren't going to dictate what is and isn't proper use. If people feel like they got ripped off with their 3070, there is some very valid reasoning behind that which cannot be dismissed because you think they are stupid because their opinions don't align with the Nvidia hivemind.
Unless you use RT and ultra textures, the 3070 is sufficient for most games, especially using DLSS. One issue is that certain games use too much VRAM for no good reason. The Last of Us and Hogwarts are prime examples where simple ini tweaks can reduce VRAM usage by a massive amount without image quality dropping. Running Hogwarts stock at 1080p at high-ish settings, it uses 7.5 GB of VRAM; if I set the framebuffer to 3072 MB in the ini file, I see no reduction in texture quality, but usage drops to 5.5 GB. Going to 2048 MB did not affect textures either on my setup. At 4K you would probably get issues unless running DLSS.
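For reference, if the knob in question is UE4's texture streaming pool (I'm assuming that's what the "framebuffer" setting maps to; the exact variable can differ per game), the tweak would go into the game's Engine.ini along these lines:

Code:
    ; Engine.ini - value is in MB; lowering it shrinks the texture streaming pool
    [SystemSettings]
    r.Streaming.PoolSize=3072

Back up the file first and check texture quality yourself, since the sweet spot varies per game and per card.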

Don't mistake overall performance improvements for the ones we're talking about here. Those improvements applied to every card, not just the ones with low VRAM.
And shitty ports... yeah! But does it matter? Don't we all just want to play those shitty ports regardless? They got ported because the content itself is top notch, after all, and this has been happening since the PS2/PS3 days, including them being shitty and unoptimized, with some of them never getting optimized.
Yeah, I agree with you. My main point is that developers push titles that have technical issues and fix them in the weeks that follow. VRAM requirements for both Hogwarts and LOS have dropped after a few patches. Impatience for fast profits seems to be the biggest issue.
 
This is from an older build without motion blur, AA and RT:

FH5 is not exactly the pinnacle of graphical complexity, it's a racing game where the most that is being rendered is a road and a bunch of cars, not exactly anywhere close to being a stress test for VRAM.

I am currently playing Returnal and it's regularly going over 14GB in VRAM usage and that's at 1440p. I don't know why this discussion is not over yet, games are regularly going over 8GB and sometimes over 12GB as well, I don't get the denial.
 
Yeah, I agree with you. My main point is that developers push titles that have technical issues and fix them in the weeks that follow. VRAM requirements for both Hogwarts and LOS have dropped after a few patches. Impatience for fast profits seems to be the biggest issue.
Oh yeah, but I think we have to be realistic too, that ain't gonna change, and I'm always keen to point out the developer POV on this as well. How 'fair' is it that some GPU manufacturers release products that force developers into extra time and money sinks? That's why the console mainstream is so important to look at when you buy a GPU. What it has is what games will generally just use, or want to use.
 
I just watched a video of a 6 GB VRAM RTX 4060 laptop at 1080p, and the reviewer was showing how blurry the bricks looked in Hogwarts Legacy. Even if you set it to ultra settings, the VRAM is just capped regardless of what settings you pick, and it did look quite terrible.

That is probably the future, so yeah, 12 GB should probably be the minimum for a new purchase.
Wow, I wonder how far it goes. Kinda disappointing that it doesn't inform the player that the selected graphics settings are beyond what the hardware supports and quality will be lowered, though.
 
Well, with nVidia allotting 16 GB of VRAM to the RTX 4080, I seriously doubt you'd see an equal amount of VRAM on lower-end cards (vis-a-vis the RTX 4080). Like the RTX 3060 Ti/3070/3070 Ti, nVidia is gonna limit the amount of VRAM on the RTX 4070/4070 Ti to 12 GB. Unless nVidia does another outlier like the odd RTX 3060 12 GB by arming its RTX 4060 with 16 GB of VRAM.

nVidia knows full well that with more and more games chewing up VRAM like mad, these cards really ought to have more VRAM. Without it, they'll have a shorter period of usefulness at max settings + RT at up to 1440p (which they are quite capable of), and like RTX 3060 Ti/3070/3070 Ti owners, their owners will have to upgrade sooner rather than later. Obviously, 16 GB of VRAM would last longer than 12 GB in the long run at certain resolutions and in-game settings, that's a fact.

Exactly - Nvidia knows this full well. In fact, it's a calculated move as always on their part - planned obsolescence.
Nvidia doesn't want you to use a GPU for 5+ years... you are supposed to buy a new one every gen.
 
Exactly - Nvidia knows this full well. In fact, it's a calculated move as always on their part - planned obsolescence.
Nvidia doesn't want you to use a GPU for 5+ years... you are supposed to buy a new one every gen.
You could always lower the settings...

All these YouTubers test on "ultra" or "maximum" or whatever instead of "high". Usually the maximum settings are completely unoptimized because that's the "base" for other settings; it's especially the case with textures, but plenty of other settings too. They are just left as an option for future PCs, or for people with too much money who buy literally top-end hardware.

Tests on "high" would be way more sensible, show actual requirements, and allow proper comparison of cards.
 
You could always lower the settings...

All these YouTubers test on "ultra" or "maximum" or whatever instead of "high". Usually the maximum settings are completely unoptimized because that's the "base" for other settings; it's especially the case with textures, but plenty of other settings too. They are just left as an option for future PCs, or for people with too much money who buy literally top-end hardware.

Tests on "high" would be way more sensible, show actual requirements, and allow proper comparison of cards.

No, it is not especially the case with textures - in fact, it's the exact opposite. The max texture setting is what the console uses, and lower settings on PC use lower-resolution textures that more often than not look really bad. The only exception to this is HD texture packs added to games...
 
With 80% of PC gamers having 8 GB or less, they'd better target those builds to have a playable, stutter-free, reasonably good-looking experience.

The rate things are evolving, anyone with an 8 GB or smaller card should be willing to sacrifice settings to get a great experience, which is what everyone but the 1% with top-end hardware does all the time anyway, one way or another. I also reject the notion that texture quality is the be-all and end-all of image quality. The number of games I've played lately where textures are next to the last thing I notice, at least for me, makes it not much of an issue.
 
With 80% of PC gamers having 8 GB or less, they'd better target those builds to have a playable, stutter-free, reasonably good-looking experience.

They target console specs and make it work on PC afterwards.
 