
Is 8 GB VRAM the minimum target for 2023 gaming?

Is 8 GB VRAM the minimum entry for gaming in 2023 and onwards?

  • Yes

    Votes: 69 56.6%
  • No

    Votes: 53 43.4%

  • Total voters
    122
  • Poll closed.
Status
Not open for further replies.
Custom settings are always superior to ultra or high settings ;) although most of the time it's ultra with AA turned off, for me :laugh: (well, mainly because AA isn't really changing anything above 1440p for me)

although that doesn't stop some of my games from almost maxing out vRAM :laugh: (i just ended a Skyrim session with 11.2 GB max vRAM usage :laugh: new high hit! before it was 9-10 GB, but i added some more eye candy :D and still a steady 60 fps)
 
Don't forget this was how it felt for most buying a 3080 lol

Hi,
Man, that dude is for sure the poster child for "your brain on a crack/meth mixture" :eek:

Consoles tell the truth specs-wise.
If you can't hit those on PC, save money and get one instead.
 
Custom settings are always superior to ultra or high settings ;) although most of the time it's ultra with AA turned off, for me :laugh: (well, mainly because AA isn't really changing anything above 1440p for me)

although that doesn't stop some of my games from almost maxing out vRAM :laugh: (i just ended a Skyrim session with 11.2 GB max vRAM usage :laugh: new high hit! before it was 9-10 GB, but i added some more eye candy :D and still a steady 60 fps)

If you've ever upgraded RAM, you know it doesn't work like that. The more you have, the more it uses, or even more exactly, the more it reserves, as you never really see the used VRAM.

I'm not saying the 8 GB cards don't max out, but those Skyrim sessions using 11.2 GB are anecdotal, again I have to use this word.
 
Shots fired!!!


Ball's in your court, Nvidia... :laugh:

Although AMD could help its case by not trying to charge $900 for a 7800XT masquerading as a 7900XT...

Talk to us about the actual 7700XT, AMD, I mean what they will actually call the 7800XT.
 
Even though numerous games are perfectly playable at 1080p with maxed-out details on an 8 GB card, newer ones require more video memory to run smoothly. An ever-increasing number of AAA titles can already overfill the 8 GB buffer at ultra:

9 GB VRAM @ 1080p
Forza Horizon 5
Watch Dogs Legion
Final Fantasy XV
Cyberpunk 2077 + RT
Halo Infinite + RT
Marvel's Spider-Man Remastered

10 GB VRAM @ 1080p
Resident Evil 4 + RT
Marvel's Midnight Suns + RT
Marvel's Spider-Man: Miles Morales

11 GB VRAM @ 1080p
The Last of Us
Forspoken

14 GB VRAM @ 1080p
Hogwarts Legacy + RT
Call of Duty: Warzone 2.0

Depending on how effectively the engine manages video memory, when it starts running out of VRAM we could see crashes, texture pop-in, very low resolution textures being used, or even textures missing entirely. In addition, there could be severe hitching/frame rate drops when entering new areas. These symptoms can be alleviated to a degree by lowering the in-game settings, but reducing texture quality tends to diminish the title's visual appeal. How much fidelity we're willing to trade for a more playable experience is obviously a personal matter.

While I'm sure that not all games will require more than 8 GB in 2023, I would not recommend buying an 8 GB card today, unless exclusively for older titles. I believe that 12 GB will be the sensible choice for AAA at 1080p in 2023.

In addition, many publishers disregard optimization altogether and keep putting out half-baked titles, yet people continue buying them. We can expect popular games to require more and more VRAM in the coming years -- the writing is on the wall.
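If anyone wants to sanity-check figures like these on their own card, here's a rough sketch of reading the driver-reported number with nvidia-smi (this assumes an NVIDIA GPU with nvidia-smi on the PATH; `parse_vram_csv` is just a hypothetical helper name). Keep in mind this reports allocated memory, not what the game strictly needs:

```python
import subprocess

def parse_vram_csv(line):
    """Parse one 'memory.used, memory.total' CSV line (values in MiB)."""
    used, total = (int(field.strip()) for field in line.split(","))
    return used, total

def query_vram():
    """Ask the NVIDIA driver for memory figures via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # One line per GPU; take the first card.
    return parse_vram_csv(out.splitlines()[0])

# Example (commented out, needs an actual NVIDIA GPU):
# used, total = query_vram()
# print(f"{used} MiB allocated of {total} MiB")
```

A tool like HWiNFO64 or the in-game overlay shows the same allocated figure, which is why numbers captured on bigger cards overstate what an 8 GB card actually needs.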
 
Last edited:

Do tell me more about that, please. Seriously, do tell me more, I can't wait.

Well, first off you called both Ultra AND High stupid, which even the video you linked contradicts. Second, that video presents a much more nuanced stance than you are implying. Third, HWUB shows the difference in the video I linked between what the 3070 is capable of and what the RX 6800 is capable of, and as they noted, the difference in quality is significant. The video you linked is an older video using older games and is less relevant than a newer video with newer games showing you what these cards can do today. This would be akin to me linking older HWUB videos to say "hey, having more than 8GB is dumb" despite having just seen a video proving otherwise. Grasping at straws, plain and simple.
 
Even though numerous games are perfectly playable at 1080p with maxed-out details on an 8 GB card, newer ones require more video memory to run smoothly. An ever-increasing number of AAA titles can already overfill the 8 GB buffer at ultra:

9 GB VRAM @ 1080p
Forza Horizon 5
Watch Dogs Legion
Final Fantasy XV
Cyberpunk 2077 + RT
Halo Infinite + RT
Marvel's Spider-Man Remastered

10 GB VRAM @ 1080p
Resident Evil 4 + RT
Marvel's Midnight Suns + RT
Marvel's Spider-Man: Miles Morales

11 GB VRAM @ 1080p
The Last of Us
Forspoken

14 GB VRAM @ 1080p
Hogwarts Legacy + RT
Call of Duty: Warzone 2.0

Depending on how effectively the engine manages video memory, when it starts running out of VRAM we could see crashes, very low resolution textures or textures missing entirely, or there could be severe hitching/frame rate drops. These symptoms can be alleviated to a degree by lowering the in-game settings, but reducing texture quality tends to diminish the title's visual appeal. How much fidelity we're willing to trade for a more playable experience is obviously a personal matter.

While I'm sure that not all games will require more than 8 GB in 2023, I would not recommend buying an 8 GB card now, unless exclusively for older titles. I believe that 12 GB will be the sensible choice for AAA at 1080p in 2023.

And although many publishers disregard optimization altogether and keep putting out half-baked titles, people continue buying them. We can expect popular games to require more and more VRAM in the coming years -- the writing is on the wall.
:shadedshu:
learn the difference between using vram and needing vram
 
Even though numerous games are perfectly playable at 1080p with maxed-out details on an 8 GB card, newer ones require more video memory to run smoothly. An ever-increasing number of AAA titles can already overfill the 8 GB buffer at ultra:

9 GB VRAM @ 1080p
Forza Horizon 5

I have over 272 hours on FH5, playing at 1440p max settings on an 8 GB card, and I never had any issue. People need to stop talking out of their ass; like someone said, if you don't have a card with 8 GB, stop talking nonsense. If you have more than 8 GB, the game will reserve more than 8 GB, that's it; it doesn't mean people can't play smoothly with 8 GB.

I even launched the game to screen capture the settings.
 
Last edited:
:shadedshu:
learn the difference between using vram and needing vram

Learn that the vast majority of games only use the VRAM they need. Games like CoD are the exception, not the rule.
 
Well first off you called both Ultra AND High stupid

I said very high, not high; go read again. And the video agrees with me, use high, it's in the damn clickbait title

More accurately, use whatever is needed to get decent framerates, unless it's lower than medium, unless it's really necessary. Isn't this the rule we've gone by in gaming for over 15 years now? Is this news to everyone here? I don't get it
 
I said very high, not high; go read again. And the video agrees with me, use high, it's in the damn clickbait title

There was no "very high" setting tested in the original video I linked as I pointed out earlier. You were complaining about their methodology when they didn't even test the setting you were complaining about.
 
I have over 272 hours on FH5, playing at 1440p max settings on an 8 GB card, and I never had any issue. People need to stop talking out of their ass; like someone said, if you don't have a card with 8 GB, stop talking nonsense. If you have more than 8 GB, the game will reserve more than 8 GB, that's it; it doesn't mean people can't play smoothly with 8 GB.

I even launched the game to screen capture the settings.

View attachment 291141

Speaking of talking out of your arse....

The game uses the amount of VRAM it needs; games can usually get away with using a couple of GB of "shared video memory" (aka using system memory as a cache) without causing stuttering.

That's why going a little over your VRAM limit doesn't cause massive stuttering. But the more you go over, the worse it becomes.
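That "a little over is fine, a lot over hurts" idea can be sketched as a toy model (the 2 GB tolerance and the verdict labels are made-up illustration, not measured data):

```python
def vram_spillover(working_set_gb, vram_gb):
    """GB of the game's working set that spills into shared system memory."""
    return max(0.0, working_set_gb - vram_gb)

def stutter_verdict(working_set_gb, vram_gb, tolerance_gb=2.0):
    """Rough bucketing: engines can often hide a couple of GB of spill
    in shared memory before hitching becomes obvious."""
    spill = vram_spillover(working_set_gb, vram_gb)
    if spill == 0.0:
        return "fits in VRAM"
    if spill <= tolerance_gb:
        return "minor spillover, likely fine"
    return "heavy spillover, expect hitching"
```

So a 9 GB working set on an 8 GB card lands in the "likely fine" bucket, while a 14 GB one does not, which lines up with the game lists earlier in the thread.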
 
There was no "very high" setting tested in the original video I linked as I pointed out earlier. You were complaining about their methodology when they didn't even test the setting you were complaining about.

Didn't what I said fully agree with their own video? What is your point?

Dumb is calling ultra dumb and then testing using it, I think anyway
 
I have over 272 hours on FH5, playing at 1440p max settings on an 8 GB card, and I never had any issue. People need to stop talking out of their ass; like someone said, if you don't have a card with 8 GB, stop talking nonsense. If you have more than 8 GB, the game will reserve more than 8 GB, that's it; it doesn't mean people can't play smoothly with 8 GB.

I even launched the game to screen capture the settings.

View attachment 291141

Ah, the good old "if it works for me in a single game, it applies across the board" logic. Using a completely subjective metric such as "it worked for me" to dismiss all who dare disagree, regardless of the sample size of one and the utter lack of any definition of that metric or any performance figures.
 
The game uses the amount of VRAM it needs

that's not how vram works at all.

Ah, the good old "if it works for me in a single game, it applies across the board" logic. Using a completely subjective metric such as "it worked for me" to dismiss all who dare disagree, regardless of the sample size of one and the utter lack of any definition of that metric or any performance figures.

It's not selective, he selected it; it's the game I play most and have installed right now, it's also the first example he gives, and it's wrong.
 
There was no "very high" setting tested in the original video I linked as I pointed out earlier. You were complaining about their methodology when they didn't even test the setting you were complaining about.
What you call different settings depends on the individual game. You're arguing about nothing while missing the point, which is the fact that you don't have to play every game at the highest settings to enjoy it.
 
that's not how vram works at all.



It's not selective, he selected it; it's the game I play most and have installed right now, it's also the first example he gives, and it's wrong.

If you care to enlighten yourself (doubt it, as you obviously just want to troll), you ought to download HWiNFO64 and have a look at shared video memory while playing games. In a lot of games (including FH5) it will be a couple of GB, exactly because it can't fit it all in your VRAM.
 
If you care to enlighten yourself (doubt it, as you obviously just want to troll), you ought to download HWiNFO64 and have a look at shared video memory while playing games. In a lot of games (including FH5) it will be a couple of GB, exactly because it can't fit it all in your VRAM.
You use shared VRAM even if your on-board VRAM isn't full. And please stop calling everyone you disagree with a troll.
 
If you've ever upgraded RAM, you know it doesn't work like that. The more you have, the more it uses, or even more exactly, the more it reserves, as you never really see the used VRAM.

I'm not saying the 8 GB cards don't max out, but those Skyrim sessions using 11.2 GB are anecdotal, again I have to use this word.
vRAM usage/allocation is a thing ... it depends mostly on resolution and textures (and RT) ...
btw, unmodded TES:V vRAM usage/allocation is ~1.8 GB, and it showed just that on both the 8 GB 1070 and the current 12 GB 6700 XT ...
 
Last edited:
oh ... wait, you said you don't see the used vRAM ... I guess we should rename vRAM usage to vRAM reserved, aka all your card has, then?
Yes, we should. Reviews only use the term "VRAM usage" to make it somewhat understandable for less tech-oriented readers. VRAM reservation, or VRAM allocation, would mean nothing to them.

You can see games that use 10+ GB on a card that has more, max out the 8 GB on an 8 GB card, and still run fine. I remember trying the RE: Village demo on my 2070 after some people here on the forum suggested that it runs like crap with Ultra graphics on an 8 GB card. It ran just fine.

See:
Ah, using "usage" is just to make it easier to understand for less technical people, because "allocation" is such a big word
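The usage-versus-allocation point can be made concrete with a toy texture cache: given a bigger VRAM budget it happily fills it with cached textures, but only a smaller hot set is actually needed each frame (the class name and all numbers here are hypothetical, for illustration only):

```python
class ToyTextureCache:
    """Opportunistic cache: fills up to the VRAM budget and only
    evicts cold textures when a smaller card forces it to."""

    def __init__(self, vram_budget_mb):
        self.budget = vram_budget_mb
        self.resident = {}  # texture id -> size in MiB

    def load(self, tex_id, size_mb, hot_set):
        # Evict cold (not currently needed) textures until the new one fits.
        while sum(self.resident.values()) + size_mb > self.budget:
            cold = next((t for t in self.resident if t not in hot_set), None)
            if cold is None:
                raise MemoryError("hot set alone exceeds the VRAM budget")
            del self.resident[cold]
        self.resident[tex_id] = size_mb

    def allocated_mb(self):
        return sum(self.resident.values())

# Same 10 textures, nothing pinned: a 12 GB card "uses" all 12000 MiB,
# while an 8 GB card simply caches less, and both keep running.
```

A monitoring tool reports the `allocated_mb()` figure in both cases, which is why the number seen on a 12 GB card says little about what an 8 GB card actually needs.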
 
Well, first off you called both Ultra AND High stupid, which even the video you linked contradicts. Second, that video presents a much more nuanced stance than you are implying. Third, HWUB shows the difference in the video I linked between what the 3070 is capable of and what the RX 6800 is capable of, and as they noted, the difference in quality is significant. The video you linked is an older video using older games and is less relevant than a newer video with newer games showing you what these cards can do today. This would be akin to me linking older HWUB videos to say "hey, having more than 8GB is dumb" despite having just seen a video proving otherwise. Grasping at straws, plain and simple.

Obviously anyone buying a midrange GPU from Nvidia, minus the 3060, doesn't care about high or ultra texture settings. They really want that conslow-like experience. This is really only an issue with Nvidia cards, and you know how people love to drink that green Kool-Aid and defend Nvidia at all costs. Don't get me wrong, AMD fanboys are just as bad, but at least they don't have to defend their 6700XT/6800s for not having enough VRAM in 2023......

It's not that hard, guys:

8 GB is what should come on entry-level cards.... For people who say cards under $200 are OK with 4 GB, it just blows my mind; we've had entry-level 8 GB cards for over half a decade.....
12 GB+ should come on midrange cards ($400+)
16 GB+ in the high end ($800+)

This shouldn't even be a debate in 2023.

The fact that consumers will be like "no, give me less VRAM on my $400+ GPU" hurts my brain..... Just to defend either their GPU purchase or Nvidia, because again, of the two primary GPU makers, they are the ones who equip their cards with inadequate VRAM.

Even my 3080 Ti has inadequate VRAM and was relegated to 1440p duties; with 12 GB it's firmly midrange now. Will it be OK for another 2 years at that resolution? Who knows, but I guess I can drop it to 1080p if not.....
 
learn the difference between using vram and needing vram
people need to stop talking out of their ass, like someone said if you don't have a card with 8gb stop talking nonsense

This is from an older build without motion blur, AA and RT:

forza5_dx12_max.jpg


Yeah, I agree. Some people here definitely need to learn and stop talking nonsense.
 
No, I just sent a day-one-bought Vega 64, still working fine, into retirement with its 8 GB. I've been running 8 GB cards for the last 8 years, the whole PS4 generation oddly, and now that, coincidentally, the PS5 and Xbox went beyond that, I have too.
And PC still has an overhead to pay.

No normie like me has a valid opinion on the upgrade above, because of course it was better, so I can't gauge the impact of the memory alone, but stutter- and level-load-chug-wise at 4K, I'm happy.
 
No, I just sent a day-one-bought Vega 64, still working fine, into retirement with its 8 GB. I've been running 8 GB cards for the last 8 years, the whole PS4 generation oddly, and now that, coincidentally, the PS5 and Xbox went beyond that, I have too.
And PC still has an overhead to pay.

Too bad that 7900XT didn't come with 8 GB, you could have saved some money on that wasted 12 extra GB it has.... Speaking of which, it's a shame my 4090 didn't come with 8 GB, not sure what I'm going to do with this useless extra 16 GB it has....

I guess I'll just have to get into Stable Diffusion so it doesn't go to waste....
 
Last edited: