
Is 8GB VRAM the minimum target for 2023 gaming?

Is 8GB VRAM the minimum entry for gaming in 2023 and onwards?

  • Yes

    Votes: 69 56.6%
  • No

    Votes: 53 43.4%

  • Total voters
    122
  • Poll closed.
Yep, many games dynamically adjust quality now, something many reviewers don't pick up on (instead they just test framerate and consider it all good if performance doesn't drop).

I didn't know about this until just now with that laptop 4060 video, and your comment. Absolutely wild. Budget gamers are getting fucked. lol
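
If anyone wants to check this on their own machine, here's a minimal sketch that just logs VRAM usage over time using NVML via the nvidia-ml-py package (the polling interval and log format are arbitrary choices of mine). Watching usage pin at the card's ceiling while a game quietly drops texture quality is exactly the kind of thing framerate-only testing misses:

```python
# Log GPU memory usage over time so VRAM pressure can be correlated
# with moments where a game silently drops texture quality.
# Requires: pip install nvidia-ml-py  (imported as pynvml)
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # bytes: .total/.used/.free
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"{time.strftime('%H:%M:%S')}  VRAM: {used_gb:.2f} / {total_gb:.2f} GiB")
        time.sleep(1.0)  # 1 s polling interval, tune as needed
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Note this shows allocation, not strictly what the game "needs", but a card sitting at its capacity limit while stuttering or dropping quality is still telling.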
 
RAM speed changes
RAM type changes
GPU chip changes

Limiting the "minimum" to just VRAM, as opposed to overall performance, is a bad way to compare GPUs. After all, the RX 480 8GB, the GTX 1070, and the RTX 4060 Ti are all wildly different cards despite sharing the same memory capacity.
I agree, but only up to a point.
When memory capacity starts limiting a GPU from a playable to a non-playable experience (one that would be playable if only the memory capacity were higher), that's where I draw the line on "bad way to compare".

Think about it like this:
You can always use double-sided memory chip placement, or GDDR6X instead of GDDR6, to get around a "hard limit" on capacity (example: 12GB 192-bit GDDR6X vs. 8GB 256-bit GDDR6, or 16GB 256-bit G6 vs. 12GB 192-bit G6X). Judging by the margins NV makes, that's a non-issue on the cost side; see the sketch below for how bus width and chip density set capacity.
Having multiple times the performance of earlier GPUs while keeping the same memory capacity is suboptimal in the long run.
Think about it: the 6GB Titan Black is a 2013/early-2014 card, and the 8GB 390/390X are from the same general period.
How much faster is the RTX 3070 vs. the OG Titan, and how many of Ampere's features require additional VRAM?
Either get more VRAM, or get used to lower graphics settings (or resolution) in newer* games.
*assuming future games don't get better at the image quality vs. VRAM usage trade-off
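
To make the capacity math concrete: each GDDR6/G6X chip hangs off a 32-bit slice of the bus, so bus width fixes the chip count, and chip density (plus clamshell/double-sided placement, which doubles the chip count) fixes the total. A rough sketch, function names mine and purely illustrative:

```python
# Back-of-the-envelope GDDR capacity/bandwidth from bus width.
# Each GDDR6/GDDR6X chip occupies a 32-bit slice of the memory bus.

def vram_capacity_gb(bus_width_bits: int, chip_density_gb: int,
                     clamshell: bool = False) -> int:
    """Total VRAM: one chip per 32-bit slice, doubled in clamshell mode."""
    chips = bus_width_bits // 32
    if clamshell:
        chips *= 2  # double-sided placement: two chips share one 32-bit slice
    return chips * chip_density_gb

def bandwidth_gbps(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Peak bandwidth in GB/s = bus width in bytes * per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gtps

# The configurations from the post above:
print(vram_capacity_gb(192, 2))        # 12 GB (192-bit, 2GB chips)
print(vram_capacity_gb(256, 1))        # 8 GB  (256-bit, 1GB chips)
print(vram_capacity_gb(256, 2))        # 16 GB (256-bit, 2GB chips)
print(vram_capacity_gb(128, 2, True))  # 16 GB (128-bit clamshell)
print(bandwidth_gbps(256, 14))         # 448 GB/s, e.g. 14 Gbps GDDR6 on 256-bit
```

The point being: capacity is a board design choice, not something welded to the GPU die.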
 
It totally depends on your use case; I'm equally happy with a 5600 XT 6GB at 1080p as with a 6800 XT 16GB at 4K.
 
It totally depends on your use case; I'm equally happy with a 5600 XT 6GB at 1080p as with a 6800 XT 16GB at 4K.

It's always use-case based; technically you can get away with a Pentium II with 512 MB of RAM if all you play is visual novels.

What the OP likely meant, and what reviewers typically mean when they say things like "2023 gaming" or "acceptable performance in modern games", is that you can play new AAA games at decent quality settings at a decent frame rate without frequent stuttering. In other words, a smooth experience.

The trend we are seeing is that even at 1080p with DLSS enabled (so technically 720p), 8GB is not enough. Heck, even lowering quality settings was not enough to get rid of the stuttering for the 3070 in HWUB's new video:


8GB looks to be the bare minimum for new AAA titles, but you are going to want more VRAM if you want to play these newer AAA titles without significant compromise.
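
For context on the "(so technically 720p)" aside: DLSS renders internally at a fraction of the output resolution per axis and then upscales. A quick sketch using the commonly cited per-axis scale factors (assumed here, not taken from the video):

```python
# Internal render resolution for common DLSS modes at a given output.
# Scale factors are per-axis; these are the commonly cited defaults.
DLSS_SCALE = {
    "Quality": 2 / 3,      # 1080p output -> 720p internal
    "Balanced": 0.58,
    "Performance": 1 / 2,  # 1080p output -> 540p internal
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(render_resolution(1920, 1080, "Quality"))  # (1280, 720)
```

So an 8GB card stuttering at "1080p with DLSS Quality" is really stuttering at a 720p internal render, which makes the VRAM shortfall look even worse.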
 
I would say no; probably 12GB, especially if you are spending more than 350 USD.

"Need" is very subjective because everyone games differently and plays different games. I just think it's more a case of: depending on what you are spending, you should expect a certain amount of VRAM. Me saying a card needs 16GB is just as wrong as someone else saying it needs 8GB...

150-300 USD: 8GB
350-600 USD: 12GB
700-1200 USD: 16GB
1500+ USD: at least 24GB

I would say these are the minimums people should expect at each price bracket in 2023, but that's just my 2 cents; everyone's needs are different.
 
8GB is the bare minimum, I will say. Not something I would choose myself, though. With only 8GB you will not have a good time in games; future games will only demand more VRAM.

I would not go for less than 12GB of VRAM, especially if the plan is to keep the GPU for at least a few years.

The low end of the 8GB VRAM minimum showed, I think, with the RTX 3050.

8GB I would only go for if I couldn't afford anything else. Even then, I would rather go for a cheaper used card with more VRAM than a new card with less.

I had an RTX 3080 with 10GB of VRAM. It had too little VRAM already at launch, and I suffered for it.

12GB of VRAM is what people should aim for, as long as they can afford it of course. I could never settle for, or be happy/satisfied with, 8GB of VRAM.
 
I would say no; probably 12GB, especially if you are spending more than 350 USD.

"Need" is very subjective because everyone games differently and plays different games. I just think it's more a case of: depending on what you are spending, you should expect a certain amount of VRAM. Me saying a card needs 16GB is just as wrong as someone else saying it needs 8GB...

150-300 USD: 8GB
350-600 USD: 12GB
700-1200 USD: 16GB
1500+ USD: at least 24GB

I would say these are the minimums people should expect at each price bracket in 2023, but that's just my 2 cents; everyone's needs are different.
Actually you can get a 7900XTX for around $950 USD.
 
Actually you can get a 7900XTX for around $950 USD.

I know, but that has nothing to do with what I said. It would be in the green from a price perspective of how much VRAM a card should have given its cost... Even with 24GB, though, the card isn't fast enough at both raster and RT for me to personally care that it has 24GB, but that's a completely different topic.

I think the better comparison is the 4070 Ti vs. the 7900 XT: they both cost around 800, but to me the 4070 Ti has inadequate VRAM given its price. Others might disagree, and that's fine.

I was saying how much VRAM a card should have based on cost, not what people can currently get in the price brackets I specified. Again, it's just my 2 cents.
 
It's always use-case based; technically you can get away with a Pentium II with 512 MB of RAM if all you play is visual novels.

What the OP likely meant, and what reviewers typically mean when they say things like "2023 gaming" or "acceptable performance in modern games", is that you can play new AAA games at decent quality settings at a decent frame rate without frequent stuttering. In other words, a smooth experience.

The trend we are seeing is that even at 1080p with DLSS enabled (so technically 720p), 8GB is not enough. Heck, even lowering quality settings was not enough to get rid of the stuttering for the 3070 in HWUB's new video:


8GB looks to be the bare minimum for new AAA titles, but you are going to want more VRAM if you want to play these newer AAA titles without significant compromise.

My strong suspicion is it's a load of tosh to try and convince people to upgrade... Desperate times and all that. Samsung's profits down 96% in the last quarter, etc.

I used to play at 1080p on a 2GB GTX 960 without that many issues.

I probably can't be bothered to get hold of games that might need 8GB at 1080p.

Most AAA games that are also on the consoles aren't going to need that much, that's for sure.
 
I know, but that has nothing to do with what I said. It would be in the green from a price perspective of how much VRAM a card should have given its cost... Even with 24GB, though, the card isn't fast enough at both raster and RT for me to personally care that it has 24GB, but that's a completely different topic.

I think the better comparison is the 4070 Ti vs. the 7900 XT: they both cost around 800, but to me the 4070 Ti has inadequate VRAM given its price. Others might disagree, and that's fine.

I was saying how much VRAM a card should have based on cost, not what people can currently get in the price brackets I specified. Again, it's just my 2 cents.
Well, the game I play the most is TWWH3, and I bought a 6800 XT at launch. I enjoyed the card and I was satisfied. Then the 6500 XT launched, and I could not believe how much negative pressure was put on the card, so I bought one for myself. I had a system with a 5600G and placed it in that machine. The first thing I noticed was the clock speed: the 6500 XT easily OCed to 2977 MHz, and the memory had no issue going to 2400 MHz. Then I changed the CPU to a 5600 and noticed a bump in performance going to PCIe 4. Then I took the card apart and was astounded to see a chip that was smaller than the VRAM chips. Then I read about the 7000 series and how it was going to use chiplets.

So yes, I bought a 7900 XTX, and that card was uber sweet, but (don't drink beer and change a waterblock) I bricked it, though I was able to get a refund anyway. The next day I went back to the e-tailer and 7900 XT cards were available. Since I had had a Nitro card, I did a ton of research and got the Pulse for $400 less (it fit the waterblock too). In both cases taking it apart revealed a card with 6 of those chiplets (or 8?), and guess what? I do not have to drop the display to 1440p to enjoy my 144Hz 4K FreeSync panel. Seeing no slowdowns at all playing huge battles at 4K in TWWH3, I do not miss my 6800 XT. I will say, though, that since the chiplet design is new tech, we can expect (and have already seen) improved performance as time goes on. I saw someone crying about AMD not supporting the 6000 cards for 3 months (OMG) while they were ironing out all the perceived red flags that were all over the Internet for the 7000 series, at a time when there was absolutely nothing wrong with the drivers then available for all 6000 series cards.

I agree that the prices of GPUs are ridiculous, but I am in my 50s and have a career, so buying a card for $1299 Canadian to support a hobby that keeps my brain sharp is definitely worth it (I would rather pay $700). There is a video floating around (the very one above) that shows the 6800 blowing away the 3070 in new games. What do you think a 7900 XT would do? It might not seem like a lot, but the extra 4GB of VRAM means an extra 2GB of VRAM usage for most games, and with a fast chip like the 7900X3D (yes, it is faster than a 5800X3D in every way) you will have the smile that compelling hardware brings, like the first time we got one of those Korean 1440p monitors for $300 that was totally driven by GPU scaling, so everything was smooth and CrossFire was a joy.
 
Well, the game I play the most is TWWH3, and I bought a 6800 XT at launch. I enjoyed the card and I was satisfied. Then the 6500 XT launched, and I could not believe how much negative pressure was put on the card, so I bought one for myself. I had a system with a 5600G and placed it in that machine. The first thing I noticed was the clock speed: the 6500 XT easily OCed to 2977 MHz, and the memory had no issue going to 2400 MHz. Then I changed the CPU to a 5600 and noticed a bump in performance going to PCIe 4. Then I took the card apart and was astounded to see a chip that was smaller than the VRAM chips. Then I read about the 7000 series and how it was going to use chiplets.

So yes, I bought a 7900 XTX, and that card was uber sweet, but (don't drink beer and change a waterblock) I bricked it, though I was able to get a refund anyway. The next day I went back to the e-tailer and 7900 XT cards were available. Since I had had a Nitro card, I did a ton of research and got the Pulse for $400 less (it fit the waterblock too). In both cases taking it apart revealed a card with 6 of those chiplets (or 8?), and guess what? I do not have to drop the display to 1440p to enjoy my 144Hz 4K FreeSync panel. Seeing no slowdowns at all playing huge battles at 4K in TWWH3, I do not miss my 6800 XT. I will say, though, that since the chiplet design is new tech, we can expect (and have already seen) improved performance as time goes on. I saw someone crying about AMD not supporting the 6000 cards for 3 months (OMG) while they were ironing out all the perceived red flags that were all over the Internet for the 7000 series, at a time when there was absolutely nothing wrong with the drivers then available for all 6000 series cards.

I agree that the prices of GPUs are ridiculous, but I am in my 50s and have a career, so buying a card for $1299 Canadian to support a hobby that keeps my brain sharp is definitely worth it (I would rather pay $700). There is a video floating around (the very one above) that shows the 6800 blowing away the 3070 in new games. What do you think a 7900 XT would do? It might not seem like a lot, but the extra 4GB of VRAM means an extra 2GB of VRAM usage for most games, and with a fast chip like the 7900X3D (yes, it is faster than a 5800X3D in every way) you will have the smile that compelling hardware brings, like the first time we got one of those Korean 1440p monitors for $300 that was totally driven by GPU scaling, so everything was smooth and CrossFire was a joy.

Basically it seems we both agree 8GB is not enough in 2023... And yeah, I have a 4090, so I definitely don't mind investing in my hobby. At the same time, the 24GB of VRAM had almost zero bearing on my decision to buy one; it was the RT performance at 4K. If the GPU performed like a 4070 and had 24GB of VRAM, I still wouldn't buy it in 2023...
 
Basically it seems we both agree 8GB is not enough in 2023...

Of course, but I expect the next test HWUB does will be the 6600 vs. the 3070, and it will be interesting to see what happens.
 
Of course, but I expect the next test HWUB does will be the 6600 vs. the 3070, and it will be interesting to see what happens.

6700XT?

The 6600 has the same amount of VRAM and is much slower, so I fail to see how that would be relevant.
 
My strong suspicion is it's a load of tosh to try and convince people to upgrade... Desperate times and all that. Samsung's profits down 96% in the last quarter, etc.

I mean, you are making a subjective assertion against objective data without explaining your reasoning beyond a conspiracy theory or providing proof.

It's pretty crazy to assert that AMD and Nvidia are conspiring to boost Samsung's profits, and it makes even less sense when you consider that GDDR6X is Micron-only. On top of that, for video cards that don't use GDDR6X, Samsung is not the only supplier.

I used to play at 1080p on a 2GB GTX 960 without that many issues.

I used to play games at 1080p on a 5850 without many issues as well. The problem with the example statement I gave and the statement you made is that they leave out important context, like which games and when. "Without that many issues" is subjective as well, but at the very least it implies some issues were encountered. It's about as vague and subjective as a statement could be, and in turn it doesn't really contribute to the conversation.
 
8GB VRAM is dead on arrival in 2023.

It's not, if it's for an x50-class GPU.
The x60s have been capable of playing most AAA games, so it would be bad if they came with 8GB; 10GB, if possible, would be great.
The x70s are fine with 12GB. Their prices are not fine, though...

AMD's GPUs may suck at RT, but their monstrous VRAM capacity is very welcome.
They will run out of core speed before they run out of VRAM.
 
The users here are the vanguard of PC tech and a niche audience; I think many people would be surprised by how many gamers are on i7-7700K-class CPUs and GTX 1060 to 1660 6GB-class video cards.

Yeah, I've noticed this, not just here but on other tech forums/sites too, and I was also part of a Discord community with a budget mindset for almost a year (a budget hardware YT channel's server).
This was years ago when I had my 1600X/RX 570 system, and within that community it was considered a good setup back then, because most people there had worse as a daily driver (even though my system was already aging, IMO).

I would say that in my country even my 3060 Ti would be considered a luxury item for this kind of hobby, and I kind of felt bad for spending that amount of money on a GPU (my most expensive piece of hardware bought so far).

I grew up using budget to mid-range hardware, so I'm more than used to playing on such systems with no real issues to speak of.
As long as I'm not forced to play on all-low settings I'm okay; that's usually when I upgrade something, or the whole system, depending on what I'm playing and where the bigger issue is.

To be honest, I'm playing less nowadays even though I have a better system than before, so having to play on high/medium instead of ultra really won't bother me in new games when I do decide to play them (out of all the 2023 releases so far I've only played the Atomic Heart/Forspoken demo and the Diablo 4 beta, which I will be playing as my main game for a good while anyway).

I wouldn't mind having more VRAM, but it's not really an issue for me yet. I'm more curious about how the new Unreal Engine 5 games will deal with my card, say on ~high-ish settings at least.
 
AMD's GPUs may suck at RT, but their monstrous VRAM capacity is very welcome.
They will run out of core speed before they run out of VRAM.

Maybe I'm cynical, and although I'm happy that it's the case, I feel the only reason AMD GPUs come with more VRAM than their Nvidia counterparts is that if, say, the 7900 XT had 12GB, almost nobody would buy it over the 4070 Ti (although they did try to sell it, comically, for 900 USD...). Same with the 24GB vs. 16GB on the 7900 XTX vs. the 4080, but in that comparison the 4080 has adequate VRAM, at least for the next 4 years.
 
Maybe I'm cynical, and although I'm happy that it's the case, I feel the only reason AMD GPUs come with more VRAM than their Nvidia counterparts is that if, say, the 7900 XT had 12GB, almost nobody would buy it over the 4070 Ti (although they did try to sell it, comically, for 900 USD...). Same with the 24GB vs. 16GB on the 7900 XTX vs. the 4080, but in that comparison the 4080 has adequate VRAM, at least for the next 4 years.

I agree with you regarding the 7900s (but they don't suck at RT that much).
I was referring to the 6000 series, which had 12/16GB, and today that pays off.
 
I agree with you regarding the 7900s (but they don't suck at RT that much).
I was referring to the 6000 series, which had 12/16GB, and today that pays off.

For sure, that was a smart move on their part. I just don't know how much it matters; people still buy Nvidia over Radeon. It's been at least since the 680/7970 era that AMD has offered more VRAM at similar price points, and they still have very little mindshare and seemingly even less market share.

I agree they should be applauded, and had they decided to compete with the 4090, I might have purchased one of their cards.
 
I think if you're buying a new video card now and it's 8GB, you have to be OK with NOT playing at max settings at anything above 1080p in AAA titles going forward. That's totally fine for the majority of esports titles like Valorant, LoL, Dota 2, CS:GO, and probably Overwatch.

Both the Resident Evil 4 and The Last of Us Part I performance reviews on TPU have shown that 8GB isn't enough VRAM. In the past, 16GB cards seemed like they would only be necessary for 4K, but that's not the case anymore. Hardware Unboxed said it best: 8GB cards should be considered entry level at this point.
 