
Is 8GB VRAM the minimum target for 2023 gaming?

Is 8GB VRAM the minimum entry for gaming in 2023 and onwards?

  • Yes

    Votes: 69 56.6%
  • No

    Votes: 53 43.4%

  • Total voters
    122
  • Poll closed.
My 3070 Ti might as well be a door holder.

/s
 
Wow this has to be the first 50/50 result.
 
Wow this has to be the first 50/50 result.

It just confirms what we already know from Steam: most people don't spend more than $350 on a GPU.

Shame they don't have that 8GB 4080 for $999 for all these peeps who think it's enough... I'd better keep quiet, I don't want to give Nvidia any ideas...
 
It would be interesting to revisit this thread 3 or 4 years from now. Remember the question was "2023 and onwards". Most gamers keep their card for 2 generations.
 
After seeing what Hogwarts Legacy used in VRAM at 1080p low texture settings with my RX 6650 XT OC, I sold it and got an RX 6750 XT OC to replace it. To me, 8GB is not enough for some of the newer games even at low settings at 1080p, so I would look at 10GB or 12GB minimum for VRAM if you do not want to buy another card in the next year.
 
I didn't say it was enough, I just said I was ok with it :D

I can max it out with ease, since I play at 4K.

I don't play all the newest games, and I play at 60 Hz, so that helps :D

Spending close to a G, let alone more than a G for a midrange GPU is not very appealing :)
 
True, my Steam Deck is only 60 fps, and I really am enjoying it more than I expected. I do wish it had FreeSync though, to increase the smoothness effect.
Funny, we do PC consulting for a living and my son prefers his Nintendo Switch to his 3080 PC setup.
 
This. The minimum is what a person needs for the minimum performance they want.

Also, hasn't this been posted countless times before?
Exactly my thoughts. My minimum isn't necessarily the same as your minimum. The resolutions, graphical settings and the games we play differ greatly. Thus, it is impossible to say what "the minimum" is, if there is such a thing.
 
Exactly my thoughts. My minimum isn't necessarily the same as your minimum. The resolutions, graphical settings and the games we play differ greatly. Thus, it is impossible to say what "the minimum" is, if there is such a thing.

Well said.
 
Well said.


I still don't think it's wrong for gamers to expect a minimum amount of VRAM based on what they are spending, though, and that's really the better way to look at VRAM amounts to begin with.

8GB on a $500 GPU was OK half a decade ago, but gamers should expect more in 2023. I also think 12GB on an $800+ GPU is pathetic... currently it seems to be only Nvidia that does this, though.
 
I still don't think it's wrong for gamers to expect a minimum amount of VRAM based on what they are spending, though, and that's really the better way to look at VRAM amounts to begin with.

8GB on a $500 GPU was OK half a decade ago, but gamers should expect more in 2023.
It still depends on the games you play. Any amount of VRAM is only as good as the game that uses it.

Edit: With that said, I also don't think one should necessarily look at 500 USD GPUs. It all depends on one's needs.
 
It still depends on the games you play. Any amount of VRAM is only as good as the game that uses it.

If you are spending $500 or more, you should be able to play any game you want at any settings you choose, at least at 1440p. You shouldn't be limited due to poor design choices by a manufacturer cheaping out on VRAM. We are already seeing this with the 3070/3070 Ti: in games that a 6800/6700 XT plays with no problem and without stutter, the Nvidia alternatives stutter hard. We will likely see the same with the 4070/4070 Ti in two years vs the 7900 XT/7800 XT, assuming AMD doesn't pull an Nvidia on lower-tier cards...

Also, people talk about A Plague Tale and how good it looks while not using a ton of VRAM, but guess what: turn on RT and the performance on the 3070 dies hard even at 1080p, not because it isn't strong enough, but because it doesn't have enough VRAM.
 
If you are spending $500 or more, you should be able to play any game you want at any settings you choose, at least at 1440p. You shouldn't be limited due to poor design choices by a manufacturer cheaping out on VRAM. We are already seeing this with the 3070/3070 Ti: in games that a 6800/6700 XT plays with no problem and without stutter, the Nvidia alternatives stutter hard. We will likely see the same with the 4070/4070 Ti in two years vs the 7900 XT/7800 XT, assuming AMD doesn't pull an Nvidia on lower-tier cards...

Also, people talk about A Plague Tale and how good it looks while not using a ton of VRAM, but guess what: turn on RT and the performance on the 3070 dies hard even at 1080p, not because it isn't strong enough, but because it doesn't have enough VRAM.
I don't disagree, but we're talking about a hypothetical minimum amount, not a comparison of 500 USD cards.
 
I don't disagree, but we're talking about a hypothetical minimum amount, not a comparison of 500 USD cards.

But a hypothetical minimum should change based on what someone is spending. A 4050 shouldn't come with the same amount of VRAM as the 4060 Ti, but guess what, it most likely will.

What people spend their own hard-earned money on is their own business, but people should expect more than what Nvidia is giving them in the midrange, and even in what they would like to call the high end with the 4070 Ti and probably even the vanilla 4070. Now, if people look at those cards and go "geez, they're a good deal for me", good for them.
 
But a hypothetical minimum should change based on what someone is spending
Exactly. Plus, based on what you play, at what resolution, etc. That's why I don't think there is a direct answer to the OP question.
 
Exactly. Plus, based on what you play, at what resolution, etc. That's why I don't think there is a direct answer to the OP question.

For sure.

I think the better question would be: what is the minimum amount for a low-end/mid-range/high-end card, separated by cost?

My one and only issue is when GPU makers purposefully give a card less VRAM than it should have; basically, when the only thing limiting the card is not its actual performance level but how much memory it has. They really should have made the 3070 Ti a 16GB card; it's not like they didn't know they could sell it for much higher than MSRP, and honestly it would still be an awesome card today if they had. That doesn't mean I'm saying games need 16GB of VRAM; I'm just saying that for when games might need more than eight, and due to the bus configuration, 16GB was the only option Nvidia had for more memory. Instead it's just an OK card that sometimes looks pretty bad compared to the AMD cards that competed in its general price bracket.
 
I kind of enjoy reading this back and forth.

What gets me is the YouTube videos. Earlier today I watched one claiming that previous generations (3000 series) were crippled because they don't have enough VRAM. I then look at some of their testing... and the cards on the block aren't exactly consumer choices. I then look at Steam as a bellwether, and only recently has the 1660 been dethroned by the 3050... a card that is already not great, but was one of the lowest-priced (and thus most affordable) during the pandemic. Heck, even today sites like Newegg are flogging 1000 series GPUs... in the age of the 4000 series.

As far as my opinion goes, the usual caveats apply:
1080p is the standard resolution for most gaming.
60 Hz is still the standard most people game at, based only on the most cost-efficient monitors.
Games are getting bigger, sloppier, and more resource-intensive. This drives up requirements generally without driving up appreciable fun.
Most people game for fun, and not to measure their system's performance.

Stating all of the above, I believe that, barring the need to play the latest poorly optimized port trash, 8 GB is enough for entry level. I believe that anything above that is probably more future-resistant.


Taking a tangent for one moment:
3060 - 12 GB
3060ti - 8 GB
3070 - 8 GB
3070ti - 8 GB
3080 - 10 GB
3080 - 12 GB
3080ti - 12 GB
There's plenty of reason to argue that the 3060 Ti and 3060 having faster, but less, VRAM is fine. Where I have issues is that Nvidia is basically adding 2 GB every other generation to their "price friendly" options, and their higher end seems to be growing rapidly... but pricing is uniformly going up. That is to say, a 1070 had an MSRP of $379, and the 3060 Ti has a $399 tag. Performance is about 20% lower on the card that's two generations older and has the same VRAM... whereas the trend of each generation moving down one peg should mean 1070 = 3050, but the 1070 is 7% faster... with 8GB of VRAM...
That's a bit goofy... but let me follow up with a question. If we know the 970 was crippled by slow VRAM years ago, the 1060 3GB was just silly, underperforming the 6GB, and VRAM is one of the cheapest components to add more of, then why don't we see more variants? Well, the 4070 Ti. Three variants with varying VRAM that also came with crippled core counts, which would have made laymen understand things as less VRAM = worse, without knowing that less VRAM + fewer cores = worse.


Tangent aside, I do have a second answer. When opting between the 3080, 3080 12GB, and 3080 Ti, I chose the 12GB version. For me the choice was simple. In 2019 the 5700 XT was well under the $349 MSRP. It came with 8GB, and was replaced by the 6700 XT with 12 GB. Extrapolating, if Nvidia pulled the same generational mid-range increase, we'd see a 3070 at 12 GB and a 4070 at 16 GB (ignoring the 2000 series as a refresh). It kind of boggles my mind that AMD is doubling down on VRAM at all levels, while Nvidia is betting on upscaling for cheaper parts and on feeding the beast VRAM at the high end, but effectively we're discussing all of this because of a fraction of a fraction of all games...
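
Spelling out that extrapolation as a quick back-of-the-envelope sketch (Python; the flat +4 GB per-generation step is just an assumption read off the 5700 XT → 6700 XT jump above, not anything Nvidia has stated):

```python
# Hypothetical projection: apply AMD's observed mid-range VRAM bump
# (5700 XT 8 GB -> 6700 XT 12 GB, i.e. +4 GB per generation) to Nvidia's x070 line.
step_gb = 12 - 8  # +4 GB per generation, taken from the AMD example above

base_gb = 8  # the 3070's actual VRAM
projected = {
    "3070": base_gb + step_gb,      # 12 GB if the bump had happened this generation
    "4070": base_gb + 2 * step_gb,  # 16 GB one generation later
}

for card, gb in projected.items():
    print(f"{card}: ~{gb} GB under this assumption")
```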
I think that if you want the best visuals then 8GB is too little; if you want to focus on games more than graphics, 8GB is enough; and we'll continue to see useless increases in requirements to drive new hardware sales, because there's only a small minority of people gaming at high resolutions or high refresh rates... neither of which is really going to be a thing until sub-$200 cards can support them.

Reinforcing the above, the newest consoles can theoretically run 4K. They don't run high refresh rates. Their 4K is... not with all of the bells and whistles. Most people still game on consoles or mobile... so the market is aiming for that. Both Nvidia and AMD are making sure 4K and high refresh don't happen by pricing people out of the market, with what used to be a $400 card (1070 Ti) now at $800 (4070 Ti)... which makes the equation less about a required minimum and more about a point of affordability, when formerly mid-high tier cards are priced at almost twice a console without any other hardware accounted for. So, my response here is that the "minimum" is whatever you can reasonably afford... "requirements" defined by greedy programs are not really reasonable when you're playing 5+ year-old games, because a $100 bill buys either an 8-hour AAA campaign, multiplayer, and DLC, or a dozen indie games that increase your playing time tenfold versus that one expensive option using the latest bells and whistles.
 
Had a disagreement with a fellow forum member in another forum; it was over Hardware Unboxed's very recent review of the RX 6800 (starting at 510 USD) vs the RTX 3070 (starting at 530 USD). It clearly showed the 8GB VRAM buffer on the latter to be a real issue with games at certain settings; even Hogwarts Legacy with the recent patch ran worse on the RTX 3070 vs the RX 6800, both in terms of smoothness and PQ.

I'd said that, based on their present prices, it was a fair comparison, which only seemed to annoy him. He'd argued that such comparisons must be made based on GPU tier and MSRP; I'd argued most would not care for this, and that present price points are the deciding factor as to which card should be compared to which. Anyway, IF I were to include the RTX 3070 Ti (starts at >600 USD) vs the RX 6800, it'd be even worse.
 
… They really should have made the 3070 Ti a 16GB card; it's not like they didn't know they could sell it for much higher than MSRP, and honestly it would still be an awesome card today if they had. That doesn't mean I'm saying games need 16GB of VRAM …

Not really, as by the time 16GB is filled the frame rate is dropping below 45, so I prefer to trade that off for 90 FPS at 8GB most of the time. I even play Fortnite on a 2GB card at 900p, at 40-60 FPS.
 
No, I meant the 6600.
Eh, why would Steve do that? Obviously the RTX 3070 is gonna win, but you'd be comparing a 510 USD card against a card that's not even half of that price! The only reason I can think of for such a comparison is to soothe bruised RTX 3070 owners' egos.
 
Eh, why would Steve do that? Obviously the RTX 3070 is gonna win, but you'd be comparing a 510 USD card against a card that's not even half of that price! The only reason I can think of for such a comparison is to soothe bruised RTX 3070 owners' egos.

Still super confused as to why that comparison is relevant; it would be like comparing a 6800 XT to a 4080 just because they both have 16GB of VRAM...

Not really, as by the time 16GB is filled the frame rate is dropping below 45, so I prefer to trade that off for 90 FPS at 8GB most of the time. I even play Fortnite on a 2GB card at 900p, at 40-60 FPS.

The saddest thing about the 3070 is that it's over 100% faster than your 980 Ti but only has 33% more VRAM, smh, and the 980 Ti is over half a decade older.
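
Just to put rough numbers on that (a quick sketch; the "over 100% faster" figure is the claim above, not a benchmark result I'm adding):

```python
# Rough check of the VRAM-vs-performance gap claimed above.
# The ~2x performance figure is the poster's claim, not a measured benchmark.
vram_980ti_gb = 6   # GTX 980 Ti
vram_3070_gb = 8    # RTX 3070
perf_ratio = 2.0    # "over 100% faster", assumed

vram_gain = (vram_3070_gb - vram_980ti_gb) / vram_980ti_gb
print(f"VRAM: +{vram_gain:.0%}, performance: +{perf_ratio - 1:.0%}")  # VRAM: +33%, performance: +100%
```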
 
Still super confused as to why that comparison is relevant; it would be like comparing a 6800 XT to a 4080 just because they both have 16GB of VRAM...



The saddest thing about the 3070 is that it's over 100% faster than your 980 Ti but only has 33% more VRAM, smh, and the 980 Ti is over half a decade older.
No, the relevance is price: the RX 6800 can be had for 510 USD, the RTX 3070 for 530 USD, and the RTX 3070 Ti for >600 USD. The RX 6800 is in about the same price bracket as the RTX 3070, hence the relevance. Pretty much the same argument I'd had in the other forum; the guy was saying that cards being compared should be in the same or similar tier, or based on MSRP.

I'd opined that I agreed with the tech sites making the comparison, that price is the deciding factor... not some notion about GPU tier (even when the cheaper and apparently perceived 'higher' tier card outperforms the more expensive 'lower' tiered one). Who cares about GPU tier IF the perceived higher-tier card is cheaper? What's of real importance is the price, and it's true now that the RX 6800 can be had for a little less than the RTX 3070.
 
Also, people talk about A Plague Tale and how good it looks while not using a ton of VRAM, but guess what: turn on RT and the performance on the 3070 dies hard even at 1080p, not because it isn't strong enough, but because it doesn't have enough VRAM.

To be fair, they only implemented RT shadows for now, and it's meh at best; at some points it's arguably worse looking than with RT off.
Plague, imo, is still one of the best 'recent-ish' looking games without a stupid amount of VRAM usage/requirement.

As much as I liked playing Cyberpunk with some RT 'ultra lights/reflections' on top of a mix of ultra/high settings + DLSS Quality, if it's the difference between perfectly playable and unplayable I will just turn RT off and call it a day. (I still have Plague installed, but I honestly can't tell a big difference with RT on or off; maybe when they add reflections at least.)
 
If I were buying something new, I would not touch anything that doesn't have at least 12 GB of VRAM; I consider that the bare minimum for a mid-range card. I would not pay more than $600 for a card that doesn't have at least 16 GB.
 
16GB is the bare minimum for 1080p/1440p. But keep in mind that 4K gaming, i.e. a 4x increase in pixels, doesn't mean 24GB is enough either. 32GB is the magic number actually needed for future-proof 4K gaming.
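
For reference, the "4x increase" refers to pixel count, which a quick check confirms (whether VRAM needs scale the same way is its own debate):

```python
# Quick check of the "4x increase": 4K has exactly 4x the pixels of 1080p.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_4k = 3840 * 2160      # 8,294,400
print(pixels_4k / pixels_1080p)  # 4.0
```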

Welcome to 2023. Well at last.
 