
Is 8 GB VRAM the minimum target for 2023 gaming?

Is 8 GB VRAM the minimum entry for gaming in 2023 and onwards?

  • Yes

    Votes: 69 56.6%
  • No

    Votes: 53 43.4%

  • Total voters
    122
  • Poll closed.
8 GB can be viable for many years to come, especially at 1080p using FSR/DLSS, but you must be willing to do *.ini tweaks in certain games to bring down usage. In Hogwarts I got FPS drops to 40-50 fps on my 3060 Ti, and VRAM usage was around 7500 MB; after the ini tweaks it is now 5500 MB, image quality looks identical, and the FPS is rock solid above 60 fps.
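To give an idea of what such tweaks look like: Hogwarts Legacy runs on Unreal Engine 4, and the usual lever is capping the texture streaming pool in Engine.ini. A minimal sketch, assuming the standard UE4 config location and cvar names - the exact values are my own guess, not necessarily what the poster above used:

%LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini

[SystemSettings]
; Cap the texture streaming pool at ~2 GB instead of letting it grow (value is an assumption - tune per card)
r.Streaming.PoolSize=2048
; Never let the pool claim more than the card's physical VRAM
r.Streaming.LimitPoolSizeToVRAM=1

The pool size is in MB; set it too low and textures visibly cycle in and out, so it's a trade-off rather than free VRAM.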
 
It's always use case based, technically you can get away with a Pentium II with 512 MB of RAM if all you play is visual novels.

What the OP likely meant, and what reviewers typically mean when they say things like "2023 gaming" or "acceptable performance for modern games", is that you can play new AAA games at decent quality settings at a decent frame rate without frequent stuttering. In other words, a smooth experience.

The trend we are seeing is that even at 1080p with DLSS enabled (so technically 720p), 8 GB is not enough. Heck, even lowering quality settings was not enough to get rid of the stuttering for the 3070 in HWUB's new video:


8 GB looks to be the bare minimum for new AAA titles, but you are going to want more VRAM if you want to play these newer titles without significant compromise.

I think 8 GB is cutting it close in 2023, but I also see some damn stupid considerations in that video:
- gaming at ultra is stupid, it has been more than proven; any video that shows that a card struggles with ultra settings is idiotic
- gaming at very high at 60 fps or even lower, like I've seen in that video, is also stupid when you can lower the settings for better frame rates, a better experience overall, and hardly notice the difference
- the 6800 is on average 10% faster than the 3070, so drawing conclusions based on VRAM alone is also stupid, not to mention certain games prefer team red or green; none of that is taken into account
 
I'm starting to regret buying my 3070 due to the VRAM, though I haven't encountered the games that require more VRAM.
 
I'm starting to regret buying my 3070 due to the VRAM, though I haven't encountered the games that require more VRAM.

Stay away from ultra settings (pointless anyway) and it should be fine for some time at 1440p. Same goes for playing games at 30 or 50 fps on high when you can just lower the settings a bit and have a better experience. Nothing new to gaming anyway.
I'm not happy with only 8 GB, having a very similar card, but it's not the end of the world either.
 
I personally wouldn't buy a GPU with any less than 12 GB, regardless of resolution. The card that becomes the 4070 really should have been the 4060, and should be the bare minimum if you intend to play AAA games for the next two years.

16 GB is the bare minimum for 1080p/1440p. But keep in mind that 4K gaming, i.e. a 4x increase in pixels, doesn't mean 24 GB is enough either. 32 GB is the magic number actually needed for future-proof 4K gaming.

Welcome to 2023. Well, at last.

I don't know about that... I haven't seen any games at 4K coming close to having an issue with 24 GB of VRAM.

8K is a different story though.
 
An x60-class card could never play AAA games for the next two years. Check any 960/1060/2060/3060...
The x70 class is for that (with a couple of exceptions, such as very badly optimised console ports or tech showcase games like Crysis)...
 
Stay away from ultra settings (pointless anyway) and it should be fine for some time at 1440p. Same goes for playing games at 30 or 50 fps on high when you can just lower the settings a bit and have a better experience. Nothing new to gaming anyway.
I'm not happy with only 8 GB, having a very similar card, but it's not the end of the world either.

Some ultra settings are pointless, but textures are one that isn't - you always want that cranked as high as possible... and that's exactly what you can't do with a limited amount of VRAM.
 
Stay away from ultra settings (pointless anyway) and it should be fine for some time at 1440p. Same goes for playing games at 30 or 50 fps on high when you can just lower the settings a bit and have a better experience. Nothing new to gaming anyway.
I'm not happy with only 8 GB, having a very similar card, but it's not the end of the world either.

Yeah, I also don't know what's so bad about lowering a thing or two or just playing on high / disabling RT in new games.
I guess it's just happening earlier now, also thanks to new releases being badly optimized for the most part.

An x60-class card could never play AAA games for the next two years. Check any 960/1060/2060/3060...
The x70 class is for that (with a couple of exceptions, such as very badly optimised console ports or tech showcase games like Crysis)...

Except that there are people, a lot of them actually, who still play AAA games on 1060/2060-tier cards. Sure, maybe at 1080p with lowered settings / FSR / DLSS enabled, but they still play those games and have fun like the rest.
There are a bunch of YT channels dedicated to budget gaming, and it's doable.
I'm like that myself, and it was never an issue; I just had to mind my settings and expectations.

For example:

1080p native medium, ~45-50+ fps on average, or around the same with high + FSR 2 Quality.
What's playable is of course subjective, but for most people playing with older hardware that is totally fine in a single-player game. 'myself included'
 
An x60-class card could never play AAA games for the next two years. Check any 960/1060/2060/3060...
The x70 class is for that (with a couple of exceptions, such as very badly optimised console ports or tech showcase games like Crysis)...

No offense, but that is an incredibly dumb post.

The 960 was the worst card ever made, but the 1060, 2060 and 3060 could / can play AAA games at 1080p just fine...
 
Yeah, I also don't know what's so bad about lowering a thing or two or just playing on high / disabling RT in new games.
I guess it's just happening earlier now, also thanks to new releases being badly optimized for the most part.



Except that there are people, a lot of them actually, who still play AAA games on 1060/2060-tier cards. Sure, maybe at 1080p with lowered settings / FSR / DLSS enabled, but they still play those games and have fun like the rest.
There are a bunch of YT channels dedicated to budget gaming, and it's doable.
I'm like that myself, and it was never an issue; I just had to mind my settings and expectations.

For example:

1080p native medium, ~45-50+ fps on average, or around the same with high + FSR 2 Quality.
What's playable is of course subjective, but for most people playing with older hardware that is totally fine in a single-player game. 'myself included'
Well said. Hell, I can play most games on my HTPC with a 4 GB 6500 XT in it, and my brother has just got a 2060 because the 2 GB on his 960 wasn't really cutting it anymore. The newest AAA games with ultra settings aren't the only option out there.
 
The x60 class can play most AAA titles with some (or many) tweaks.
So, 8 GB is enough for 1080p with DLSS, whether you like it or not.

The 4070 is a 12 GB card with 3080+ level performance and DLSS 3. How is that the bare minimum?
 
The x60 class can play most AAA titles with some (or many) tweaks.
So, 8 GB is enough for 1080p with DLSS, whether you like it or not.

The 4070 is a 12 GB card with 3080+ level performance and DLSS 3. How is that the bare minimum?

The 3060 is a 12 GB card...
 
The 3060 is a 12 GB card...
looool
I have to admit I didn't see that coming.

Anyway, although the 3060 is a 12 GB card, I consider 8 GB to be the minimum for an x60, or for any gaming GPU, in 2023.
 
looool
I have to admit I didn't see that coming.

Anyway, although the 3060 is a 12 GB card, I consider 8 GB to be the minimum for an x60, or for any gaming GPU, in 2023.

I'm sure you do...

The 3060 doesn't have any issues in current games thanks to its 12 GB of VRAM, unlike all the 8 GB cards, which have a ton of issues in a lot of recent games unless textures are lowered to potato levels.
 
Well said. Hell, I can play most games on my HTPC with a 4 GB 6500 XT in it, and my brother has just got a 2060 because the 2 GB on his 960 wasn't really cutting it anymore. The newest AAA games with ultra settings aren't the only option out there.

Yeah, I'm more than used to tweaking my settings and playing whatever games that way.
For example, when Borderlands 3 was released I had a 4 GB RX 570 with my current 2560x1080 monitor, and that was already a pretty big GPU bottleneck in that game, so I had to tweak it down to mid-high or so. 'game hardly looks better on higher settings anyway, currently playing it maxed out on my rig'
That did not stop me from having a blast and putting around ~800 hours into the game with my 570 until I upgraded to a 1070 and later to my 3060 Ti.

The only game I recall being completely unplayable at the time of release is HZD; that game just refused to work with my 4 GB card and had missing textures even on low. 'game/AMD issue, cause it was fine on my bro's GTX 970 on the same settings/res'
That game was the main reason for my 1070 upgrade, actually.


Obviously having more VRAM would be useful in 2023 'this point I'm not arguing', but it's not like I suddenly can't play anything new with my 8 GB card; I just have to deal with the fact that I can't play new games on max/ultra settings anymore, so it's back to tweaking like I always did.
 
Some ultra settings are pointless, but textures is one that is not - you always want that cranked as high as possible... and that's exactly what you can't with limited amounts of vram.

Give me a real-world example of a game I can't play decently at medium settings (I, like I believe most, prefer higher frame rates at medium before I ever use high or ultra at 30 or 60 fps) because of the textures? 1440p in my case.

Don't get me wrong, I'm not saying they don't exist, I have no idea, I just wanted to check for myself. I've never encountered one, but I don't play every single game possible.
 
Give me a real-world example of a game I can't play decently at medium settings (I, like I believe most, prefer higher frame rates at medium before I ever use high or ultra at 30 or 60 fps) because of the textures? 1440p in my case.

Don't get me wrong, I'm not saying they don't exist, I have no idea, I just wanted to check for myself. I've never encountered one, but I don't play every single game possible.

In most games you can play at medium while they still look OK - with one very crucial exception: textures. They usually look terrible when not maxed out, and texture quality has no performance impact as long as you have a sufficient amount of VRAM.

That's why you want to have at least 12 GB of VRAM, even at 1080p.

As for examples, there are many... Hogwarts, The Last of Us, The Callisto Protocol, Resident Evil 4... basically all recently released AAA titles. They all need texture quality lowered on 8 GB GPUs to play without stuttering due to VRAM swapping. And textures, as said, is the one setting you do not want to lower.
 
This thread will go on forever, because it entirely depends on the types of games you play.
Is 8 GB the minimum for a lot of titles, if you want to avoid stutter and slowdowns? 100% yes.


If you play lighter games or don't care about the odd slowdown, then no.

Give me a real-world example of a game I can't play decently at medium settings (I, like I believe most, prefer higher frame rates at medium before I ever use high or ultra at 30 or 60 fps) because of the textures? 1440p in my case.

Don't get me wrong, I'm not saying they don't exist, I have no idea, I just wanted to check for myself. I've never encountered one, but I don't play every single game possible.
You'll see those FPS values on average, but your 0.1% lows, for example, will be a mess, with erratic frame drops.
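For anyone wondering what those 0.1% lows actually measure: here is a rough Python sketch of how 1% / 0.1% low FPS figures are commonly derived from a frametime capture. The function name and the exact averaging method are my own assumptions - capture tools each have their own variant:

def fps_lows(frametimes_ms):
    # Average FPS over the whole capture.
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    # Sort frametimes from slowest to fastest and average only the worst 1% / 0.1%.
    slowest = sorted(frametimes_ms, reverse=True)
    def low(pct):
        worst = slowest[:max(1, int(n * pct))]
        return 1000.0 * len(worst) / sum(worst)
    return avg_fps, low(0.01), low(0.001)

# Example: a mostly smooth ~60 fps run (16.7 ms frames) with a handful of 60 ms hitches.
print(fps_lows([16.7] * 2000 + [60.0] * 5))
# The average barely moves, but the 0.1% low collapses - that's the stutter you feel.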
 
In most games you can play at medium while they still look OK - with one very crucial exception: textures. They usually look terrible when not maxed out, and texture quality has no performance impact as long as you have a sufficient amount of VRAM.

That's why you want to have at least 12 GB of VRAM, even at 1080p.

As for examples, there are many... Hogwarts, The Last of Us, The Callisto Protocol, Resident Evil 4... basically all recently released AAA titles. They all need texture quality lowered on 8 GB GPUs to play without stuttering due to VRAM swapping. And textures, as said, is the one setting you do not want to lower.

Of those games I only own TLoU, but I haven't played it yet because the game is a mess (waiting for updates to fix it), so I can't really conclude anything; no one can.
I also own Returnal, a recent AAA game - no issues, I can run high textures no problem.
I confess those are the most recent game releases I own. Forza 5, F1 22, Uncharted and Spider-Man are the most recent ones I've played, no problem there.

Aren't Hogwarts and Callisto very badly optimized games, like really bad? I'm sure you can't draw conclusions from those. I often see people mentioning these shit shows of bad ports as examples, and that makes no sense to me.

The Resident Evil games are all a mess; RE Village, which I own and played at launch, was a disgrace, and I assume this one is no different.
 
Of those games I only own TLoU, but I haven't played it yet because the game is a mess (waiting for updates to fix it), so I can't really conclude anything; no one can.
I also own Returnal, a recent AAA game - no issues, I can run high textures no problem.
I confess those are the most recent game releases I own. Forza 5, F1 22, Uncharted and Spider-Man are the most recent ones I've played, no problem there.

Aren't Hogwarts and Callisto very badly optimized games, like really bad? I'm sure you can't draw conclusions from those.

Callisto is very badly optimized in regard to everything. Hogwarts has bad CPU utilization, but isn't badly optimized otherwise - it's just demanding VRAM-wise with the highest quality textures :)
 
You'll see those FPS values on average, but your 0.1% lows, for example, will be a mess, with erratic frame drops.
Well, I agree with what he says here. It will get much worse, and it will be the norm going forward. Just accept it. It's just like when games suddenly needed a CPU with SSE support or similar - in the old days that was the norm.

 
Callisto is very badly optimized in regard to everything. Hogwarts has bad CPU utilization, but isn't badly optimized otherwise - it's just demanding VRAM-wise with the highest quality textures :)

But how do you know bad 1% and 0.1% lows are because of a lack of VRAM rather than a shitty port? Even a 1060 with 6 GB of VRAM can do Doom at Ultra Nightmare texture settings. I need another good game to draw those kinds of conclusions, not shitty ports.


Don't get me wrong, 8 GB on these cards is idiotic, but so are most shitty AAA ports these days.
 
Do you guys use an 8 GB card, or do you just look at reviews?
 
But how do you know bad 1% and 0.1% lows are because of a lack of VRAM rather than a shitty port? Even a 1060 with 6 GB of VRAM can do Doom at Ultra Nightmare texture settings. I need another good game to draw those kinds of conclusions, not shitty ports.


Don't get me wrong, 8 GB on these cards is idiotic, but so are most shitty AAA ports these days.

Because frametimes are fine once you lower texture quality, and thus VRAM usage... (there's a quick way to check that yourself sketched at the end of this post)

Do you guys use an 8 GB card, or do you just look at reviews?

If you are referring to me, I've tested it on my friend's 3060 Ti / 1080p setup.
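If anyone wants to run the same check on their own card, here is a minimal sketch that logs VRAM usage once per second while you flip texture quality in-game. It assumes an NVIDIA GPU with nvidia-smi on the PATH and a single card; it's a suggestion, not what was used for the test above:

import subprocess, time

# Sample dedicated VRAM usage via nvidia-smi once per second (single NVIDIA GPU assumed).
while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    used, total = (int(x) for x in out.splitlines()[0].split(","))
    print(f"{time.strftime('%H:%M:%S')}  {used} / {total} MiB")
    time.sleep(1)

Keep in mind this shows allocated VRAM rather than what the game strictly needs, but if the number sits pinned at the card's limit while the frametimes spike, the streaming pool is the likely culprit.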
 
In most games you can play at medium while they still look OK - with one very crucial exception: textures. They usually look terrible when not maxed out, and texture quality has no performance impact as long as you have a sufficient amount of VRAM.

That's why you want to have at least 12 GB of VRAM, even at 1080p.

As for examples, there are many... Hogwarts, The Last of Us, The Callisto Protocol, Resident Evil 4... basically all recently released AAA titles. They all need texture quality lowered on 8 GB GPUs to play without stuttering due to VRAM swapping. And textures, as said, is the one setting you do not want to lower.

Tbh, to my eyes, high and ultra textures look identical in newer games, or at least close enough that I don't notice the difference while playing.
That's coming from someone who uses HD texture mods for older games, because older games sometimes do have bad textures even on max. 'I played The Witcher 3 with an HD texture mod from Nexus on my RX 570 years ago with no issues, and it looked much better than what the game had'

In new games though, they are good enough on high imo.
I forgot that I had the Resi 4 demo installed, so I just played it now.
2560x1080 native with RT off, High (6 GB) textures, which is one step below max, and the rest of the settings maxed, with only volumetric lights/fog on medium. 'useless performance hog in most games, so I always turn that down'
That way the demo played with no issues, staying above 60 fps even in that village fight, with no real stutters to speak of either.

This thread will go on forever, because it entirely depends on the types of games you play.
Is 8 GB the minimum for a lot of titles, if you want to avoid stutter and slowdowns? 100% yes.


If you play lighter games or don't care about the odd slowdown, then no.


You'll see those FPS values on average, but your 0.1% lows, for example, will be a mess, with erratic frame drops.

Yeah, that sounds about right, so this is also my last post here; this just goes in circles because it depends a lot on personal preferences and whatnot.
For me personally, 0.1% lows don't matter - that's literally 0.1% of my playtime in the game, which doesn't bother me at all in a single-player game. Most games will have that occasional stutter no matter what you do or what system you play on. 'not even talking about badly optimized games here, just in general, especially in Unreal Engine games'
 