
Is 8GB VRAM the minimum target for 2023 gaming?

Is 8GB VRAM the minimum entry for gaming in 2023 and onwards?

  • Yes
    Votes: 69 (56.6%)
  • No
    Votes: 53 (43.4%)
  • Total voters: 122
  • Poll closed.
Steve from HUB did a great job making the video he set out to make. He intended from the get-go to showcase 8GB of VRAM falling over, and he achieved just that. I'm also not saying he doesn't have a point, but the video showcased this "I'll prove myself right by constructing the test to prove myself right" clear as day. Likewise it would be possible to construct the testing methodology and game samples to do the inverse and cripple the 6800XT and have the 3070 come off much better, but the 6800XT's weakness is well known at this point, and VRAM is the topic du jour.

No, he did not do a great job.

He made clickbait to make himself more money. It's like taking a Ferrari to the dunes and complaining that it doesn't work properly as a buggy. The 3070 should have more VRAM, but the video is idiotic clickbait.
 
Steve from HUB did a great job making the video he set out to make. He intended from the get-go to showcase 8GB of VRAM falling over, and he achieved just that. I'm also not saying he doesn't have a point, but the video showcased this "I'll prove myself right by constructing the test to prove myself right" clear as day. Likewise it would be possible to construct the testing methodology and game samples to do the inverse and cripple the 6800XT and have the 3070 come off much better, but the 6800XT's weakness is well known at this point, and VRAM is the topic du jour.
Indeed. He started with the conclusion already made, and adjusted the test to fit that conclusion.
Unfortunately, few people are calling this out, and I hope this doesn't spill over to other places and turn into some kind of panic.
I was annoyed because that was the first time I was disappointed by a video of theirs. I'm not saying Hardware Unboxed is flawless, but they are generally on the level and a force for good in the world. This was the first video that was... off.

No, he did not do a great job.

He made clickbait to make himself more money. It's like taking a Ferrari to the dunes and complaining that it doesn't work properly as a buggy. The 3070 should have more VRAM, but the video is idiotic clickbait.
That's exactly the point. His intention was to make a clickbait-y and (slightly) alarmist video, and he did just that and did it well, because people seem to eat it up. Mission accomplished.
 
No, he did not do a great job.

He made clickbait to make himself more money. It's like taking a Ferrari to the dunes and complaining that it doesn't work properly as a buggy. The 3070 should have more VRAM, but the video is idiotic clickbait.
You can spin it around either way by saying
1. "you should lower your settings, because Ultra doesn't give much extra over High, but High runs miles better on a card like the 3070" or
2. "the 3070 doesn't have enough VRAM because such an expensive / x70-series / whatever card should run everything on Ultra without any issues".
Which side you pick and how you word it is what journalism is these days. The rest is taken care of by herd mentality.

I haven't watched the video, by the way, but I will. I'm curious.
 
So 11 pages & very little substance, let me add my $2.0 :D
[Leonardo DiCaprio reaction GIF]
 
So 11 pages & very little substance, let me add my $2.0 :D
Personally, I think there was very little substance in the opening question itself, so it was only to be expected, I guess. ;)
 
No, he did not do a great job.

He made clickbait to make himself more money. It's like taking a Ferrari to the dunes and complaining that it doesn't work properly as a buggy. The 3070 should have more VRAM, but the video is idiotic clickbait.
I agree all of this is just 'subme clickme pls' for ad revenue... and yet it's not untrue either. Midrange Nvidia is going to be, and already is becoming, a bloodbath of poor balance.

People with a set of brains don't need these videos to put two and two together; that's the gist to me. If people needed these videos to get to 'an understanding' of what happens, they're sheep; they don't know anything and can be told anything. That's what we see around social media and communities at large: circlejerk behaviour, and oh boy, do we all feel unique in our opinions :D It's too much irony for my blood, honestly, I can't even... The fact that people don't see that (or do, but don't understand what it means) is mind-blowing to me.

The underlying principle here is that ad revenue thrives on a hype or two every day. If you click on it, you're the product, and you're giving your time and self away for free to make others money. Indeed, a fool and his money are parted, continuously :) And in a way, we 'want it' too; it's fun having those micro conflicts and the ensuing discussion, right? That's what we're doing right here, right now, too...

It's my avatar in a nutshell ;)
 
Here is my screenshot / 4K native Ultra

[screenshot: 4kultra.JPG]

Yeah, native 4k... except it is 1600x900...

Give us a link to that alleged native 4k image...
 
Yeah, native 4k... except it is 1600x900...

Give us a link to that alleged native 4k image...
The posted image is 1600x900. That doesn't mean that the game is as well. Just look at the small text in the overlay. C'mon, man...

I think it's always advisable to scale down your images before you upload them here. Not everyone on the forum has blazing fast fibre connections.
 
I agree all of this is just 'subme clickme pls' for ad revenue... and yet it's not untrue either. Midrange Nvidia is going to be, and already is becoming, a bloodbath of poor balance.

The premise is right, sure, we all agree on that; Nvidia is simply wrong on this.
But the way he did it was a mess, on purpose I would say. GN already explained, and so should he (if he doesn't know, he probably shouldn't be making videos about this), that what you see is reserved VRAM, not used VRAM, and that if you compare against a card with more VRAM, it will reserve more simply because more is available, not because it's actually used. And using Ultra to say the card can't run the games is also not the best way to do this, when he himself said Ultra is useless.
Create internet drama to get views.
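
For anyone who wants to see the allocated-vs-actually-used distinction on their own machine, here's a minimal sketch using NVIDIA's NVML bindings (assuming an NVIDIA card and the nvidia-ml-py package; GPU index 0 is also an assumption). Note that even this only reports memory allocated on the device, not the working set a game actually touches each frame:

```python
# Minimal sketch: query VRAM figures via NVML (pip install nvidia-ml-py).
# The "used" number is memory allocated/reserved on the GPU, NOT proof
# that a game actively needs all of it every frame.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU (assumption)
info = pynvml.nvmlDeviceGetMemoryInfo(handle)

print(f"total: {info.total / 2**30:.2f} GiB")
print(f"used (allocated, not necessarily touched): {info.used / 2**30:.2f} GiB")
print(f"free:  {info.free / 2**30:.2f} GiB")

pynvml.nvmlShutdown()
```

Run the same game on a 16GB card and an 8GB card and the "used" figure will often differ for exactly the reason above: engines happily reserve more when more is available.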
 
You can spin it around either way by saying
1. "you should lower your settings, because Ultra doesn't give much extra over High, but High runs miles better on a card like the 3070" or
2. "the 3070 doesn't have enough VRAM because such an expensive / x70-series / whatever card should run everything on Ultra without any issues".
Which side you pick and how you word it is what journalism is these days. The rest is taken care of by herd mentality.

I haven't watched the video, by the way, but I will. I'm curious.
Those nitwits didn't even watch the video.

1) That's Tim, not Steve; Steve typically has the full beard.
2) It's an opinion piece, which he clearly states.
3) The whole premise is that if you dial down settings to Very High rather than Ultra, you get better performance (duh) for little to no visual quality loss in most cases (duh), and then he shows videos of several games, allowing the person watching to judge for themselves. At no point does he state it's the only way to play video games, just one that allows better performance if your video card is struggling to hit a preferred FPS.
4) These idiot fanboys are just pissed at the title of the video and believe ultra-settings PC gaming is the only way to play a video game.
 
Indeed. He started with the conclusion already made, and adjusted the test to fit that conclusion.
"Oh no, the man with an idea set out to prove that idea and succeeded"
"How dishonest"
This is some "I don't trust scientists because they're paid to prove the science" tier argument.
 
I daily drive the system in my sig/specs: 4K, max settings, 120Hz. Only in some really heavy modern games, like the FFVII remake, do I lower the res to 1440p to get better frame rates. This is with an RX 6900 XT.

I just took my RX 5700 XT (which is what the RX 6900 XT replaced as the daily driver) and put it in a new SFF build for the living room. Obviously these cards are quite a ways apart in terms of capabilities. Thanks to FSR and RSR, I can play any game on the living room SFF system that I can on the main system in the computer room. Though in many games I play, I don't need to lower anything. If needed and FSR is offered, I leave the res at 4K and lower FSR quality until I get 60+ FPS. If FSR is not offered in game, I use RSR and lower the res to either 1440p or 1080p... but there is also a huge difference between me sitting 3ft away from a screen (computer room) and sitting 12ft+ away from a display (living room). I cannot even tell the difference from that large a distance. My "old" RX 5700 XT sure still feels fun to play games on... I'm not complaining. I wouldn't call it the "minimum" needed, as I am sure there are people who would be perfectly OK with less, and this whole debate is incredibly subjective and dependent on the individual, on what games they play and what settings they prefer.
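
As a rough guide to why dropping an FSR quality step frees up so much headroom, here's a small sketch of the internal render resolution each mode implies (per-axis scale factors as published in AMD's public FSR documentation; the rounding is mine, and individual games may clamp slightly differently):

```python
# Approximate internal render resolution for each FSR quality mode.
# Per-axis scale factors per AMD's public FSR docs; treat as approximate.
SCALE = {
    "Ultra Quality": 1.3,      # FSR 1.0 only
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,  # FSR 2.x
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = SCALE[mode]
    return round(out_w / s), round(out_h / s)

for mode in SCALE:
    w, h = render_resolution(3840, 2160, mode)  # 4K output target
    print(f"{mode:>17}: {w}x{h}")
```

At 4K output, Quality mode renders internally at 2560x1440 and Performance at 1920x1080, which lines up with the 1440p/1080p fallbacks mentioned above for RSR.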

My point is sort of... "no, it's only the minimum target if you X and want Y and also need Z", and all those variables depend on the user. This debate is missing those critical variables. If the original question had been worded more like "Is 8GB enough to play current and upcoming AAA titles that fully utilize current hardware's features?" then the answer would be... yes... because for those types of games I don't think you will get away with 4GB or less of VRAM... maybe 6GB... maybe.

Otherwise, the minimum entry to game is just "does the game run", and lots of cards with less than 8GB can still run lots of newer games.
 
"Oh no, the man with an idea set out to prove that idea and succeeded"
"How dishonest"
This is some "I don't trust scientists because they're paid to prove the science" tier argument.
There is a difference between "I want to test if 8GB of VRAM is sufficient for gaming" and "I want to prove that 8GB is insufficient for gaming".
In the first case, you have a theory and perform tests to check if your theory is correct.
In the second case, you have a pre-established conclusion that you want to achieve, which may involve performing the tests in such a way as to support your conclusion.

And yes, it is, at least to an extent, dishonest. Steve deliberately tested both cards with Ray Tracing enabled. No person familiar with PC building purchases the RTX 3070 or RX 6800 and expects smooth Ray Tracing performance across all games. Thus, testing with Ray Tracing enabled is unrealistic for these cards. Ray Tracing was enabled basically to exaggerate the issue so Steve could point to it and say "See! I told you!".
The scientific method is "I want to perform tests to see if I am correct". What Steve did was "I know I am correct, and I'm going to do the tests in a way that supports my position".

And for the record, I don't dislike Hardware Unboxed, quite the opposite. I wouldn't be supporting them on Patreon if I didn't like their content overall.
 
And yes, it is, at least to an extent, dishonest. Steve deliberately tested both cards with Ray Tracing enabled. No person familiar with PC building purchases the RTX 3070 or RX 6800 and expects smooth Ray Tracing performance across all games. Thus, testing with Ray Tracing enabled is unrealistic for these cards.

The selling point of a 3070 over a 6800 is supposed to be the ray tracing; remove that, and you're left with what?

If enabling a feature causes issues because of insufficient memory, then the issue is the insufficient memory, not the fact that you have ray tracing on. Isn't that obvious?
 
I think 8GB makes for a good 1080p card. That was the resolution I was at when I got my 3070 Ti a couple of years ago. All games I played were maxed out, and for that the card is a monster. Now I am at 3840x2160 and I can't play everything at full detail, but for me it's OK. My next card will be whatever comes after Ada.
 
The selling point of a 3070 over a 6800 is supposed to be the ray tracing; remove that, and you're left with what?
Who's removing anything, lol? Manage your settings and expectations on a per-setup basis. A 6800 XT is easy to shame too if we choose unrealistic scenarios. Every game that came before now, and the ones to come where the 3070 does well, screw those? Crap ports with bloated VRAM requirements may be an increasingly disappointing norm, but that doesn't undo every single other game, RT included, making up the vast majority of all games, that the 3070 can play at very acceptable settings overall.

Nobody is running a game at 2fps at HUB's forced, gimped settings chosen to make a point and going, oh well, this thing is tech garbage *slides 3070 into trash*.

Yes, 8GB is a limiting factor. The vast majority of cards have limiting factors. Does it make a 3070 useless? Hell no.
 
Yes, 8GB is a limiting factor. The vast majority of cards have limiting factors. Does it make a 3070 useless? Hell no.
If you claim the testing is unrealistic because ray tracing is enabled, then you're implying one should expect a 3070 user not to use ray tracing as a result, because, you know, it's unrealistic. But then the 6800 XT is considerably faster than a 3070 in raster, so going by that logic, yes, the 3070 would become effectively useless, because the one scenario where it's supposed to have an advantage is actually too "unrealistic", apparently. Mind you, Nvidia has been hammering RT performance into everyone's head as a selling point for years.

Crap ports with bloated VRAM requirements may be an increasingly disappointing norm
You know what I consider crappy ports that have disappointingly become the norm as well? Games where RT causes performance to drop off a cliff into a chasm. But I am sure you'd disagree with me, because ray tracing is amazing and people should just manage their expectations.

Games are what they are, they are now using a lot more VRAM than a couple of years ago, they are not crappy ports because of that.
 
If you claim the testing is unrealistic because ray tracing is enabled, then you're implying one should expect a 3070 user not to use ray tracing as a result, because, you know, it's unrealistic. But then the 6800 XT is considerably faster than a 3070 in raster, so going by that logic, yes, the 3070 would become effectively useless, because the one scenario where it's supposed to have an advantage is actually too "unrealistic", apparently. Mind you, Nvidia has been hammering RT performance into everyone's head as a selling point for years.
Yes, exactly!

The RTX 3070 is kind of in limbo when it comes to ray tracing performance. It isn't terrible but it isn't good enough to comfortably handle all games on the market at a playable framerate. It can handle some, but not all, which makes it a bad (or at the very least sub-optimal) choice for people who are serious about ray tracing.

The RTX 3050 also supports ray tracing. Does anyone expect that card to provide solid ray tracing performance? Should that card be judged based on its ray tracing performance? I mean, it supports it, so if it can't handle a modern AAA game with ray tracing enabled, then it has to be trash, right?
Just because a card is marketed on a certain feature doesn't mean it will perform it well. That's on NVIDIA's marketing, not on the card itself.

Let's look at motherboards for another example. Take your average H510/H610 or A520/A620 board. Does anyone realistically expect those boards to do perfectly well with an i9 or Ryzen 9 processor, respectively? Marketing departments can try to convince people as much as they want, but no PC builder worth their salt would pair such a motherboard with such a processor. Just because a high-tier processor is listed as supported does not make it a good idea to use it in a cheap-ass board. Of course, PC builders/enthusiasts generally don't complain about this because they understand reality.

It's called market segmentation: differently priced products for different needs. The RTX 3070 happens to simply not be particularly well suited for ray tracing (regardless of its VRAM capacity). Steve knows this but chose to make the card look bad to "prove" his point.
 
No person familiar with PC building purchases the RTX 3070 or RX 6800 and expects smooth Ray Tracing performance across all games.
Hilarious.
So now if you're not a PC expert, and you just listen to the marketing showing you this amazing Raytracing feature (only on Nvidia cards) that you just HAVE to buy for the low low price of $600, you get a fat joke of a card that'll croak under lack of VRAM in less than 2 years, and it's your fault for being none the wiser. And the only way to still use it is to turn off that RT that you especially picked the card for.

Your BS has gone from "I don't trust the scientists because they're paid" to "If you were an expert in the subject, you'd know".

The poor idiots who bought into the raytracing meme aren't experts. And they don't have to be. That's why we have reviewers. As reviewers, HWUB did warn against this probable problem, and got more than a little bit of a reaction for pointing out the future VRAM problems. As usual with the Nvidia drones, that earned them a pleasant stock of shit cake delivered all over the comments and social media. I'm sure that's going to be another non-argument of "bias" in your eyes, but pointing out a problem and getting shot at is actually not pleasant, and coming back later with a big video that acts as a big "I told you so" is actually very justified. And it gives them more credit when they're raising the same flag for the 12GB 4070s.

Edit: and also because I'm tired of hearing the same non-argument, no, lack of VRAM isn't a hard limit on the chip's power. It's being stingy. It's being cheap. VRAM isn't THAT expensive. AMD puts perfectly acceptable GDDR6 in cards that are much cheaper than Nvidia's overpriced crap. It's not a price problem, it's Nvidia being cheap while demanding ultra premium prices. For a 3070 Ti particularly, you ask for $600 and then deliver something that's actually powerful, but coupled with so little VRAM that it'll end up gimped? It's like buying a Mercedes with all the interior done in the cheapest plastic to save 200 bucks. It's like taking your girl out for dinner and you tell her to walk 1km for the big restaurant so that you can save on gas. It's so cheap that it's insulting.

HELLLLL, not even an hour after I write this, check out this "expert that should've known better":
I seriously feel bad for this poor schmuck. He'll learn about the issues when it's too late and ofc all the drones are in the comments going "VrAM IsSSueS LOLLLLL".
 
You need at least 12GB VRAM to play games smoothly now
Not true. My 3080 10GB will rock and roll at 2K above 90 FPS (DLSS off) in FS 2020, MW2, Control, Cyberpunk, and Dead Space. No issues except bad code or ports.
 
Not true. My 3080 10GB will rock and roll at 2K above 90 FPS (DLSS off) in FS 2020, MW2, Control, Cyberpunk, and Dead Space. No issues except bad code or ports.

Yeah, all games that require more than 10GB VRAM with maxed-out settings are just bad code... *eye roll*
 
Not true. My 3080 10GB will rock and roll at 2K above 90 FPS (DLSS off) in FS 2020, MW2, Control, Cyberpunk, and Dead Space. No issues except bad code or ports.
What we mean isn't that you can't play with 10GB. What we mean is that you shouldn't buy a GPU with less than 12GB today, since its longevity will be nothing.
And your card will probably start having some minor (at first) VRAM issues by 2024.
I expect that a 12GB GPU will reach its limits and start choking a bit before its lifecycle even ends, which is more than the 3070s can say...
 
I find Hardware Unboxed to be just a tad alarmist about the VRAM thing. It is their third video (that I recall) where they talk about this.
Just like they were a tad "alarmist" with the 1060 3GB. People criticized them for calling out the VRAM on that card too, but look how poorly those comments aged. It is not alarmist to point to objective data collected and say "hey guys, there's a potential trend forming in new AAA titles wherein the 3070's performance has really taken a dive due to its lack of VRAM".

I consider The Last of Us the only game where the RTX 3070 struggles due to lack of VRAM, at least according to that video.
All other games either ran fine on the RTX 3070 or had Ray Tracing enabled, which I'll call unfair and/or impractical. Neither the RTX 3070 nor the RX 6800 is a good card to get if you are serious about Ray Tracing, regardless of how much VRAM they have, so enabling it exacerbates the issue artificially.

Yes, the issue exists. Yes, 8GB of VRAM is indeed slowly becoming insufficient for people who want to play the latest AAA games. Yes, NVIDIA is stingy with its VRAM.
My point is that the entire video makes the issue appear worse than it is.

Texture swapping was an issue with or without RT.

100%, if the 3070 had more VRAM, RT performance would be a lot more respectable. No idea where you are getting the notion that more VRAM wouldn't help, when we know for a fact that the lack of VRAM is what's tanking performance, not the GPU core itself.

Reviewers and Nvidia sold the 3070 as a capable RT part. A lot of people bought the 3070 over the 6700 XT / 6800 because of the RT. It essentially losing the ability to do so reasonably well in such a short period of time would understandably be a problem for those individuals and the reviews that recommended the card for RT. People expect a certain number of years out of their video cards, especially as prices have increased.
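
To put rough numbers on why running out of VRAM tanks performance the way it does: the 3070's GDDR6 delivers roughly 448 GB/s, while fetching spilled assets from system RAM over PCIe 4.0 x16 tops out around 32 GB/s in theory, less in practice. A back-of-the-envelope sketch (the 1 GiB overflow figure is purely an illustrative assumption, not a measured value):

```python
# Back-of-envelope: frame-time cost of assets spilling out of VRAM.
# Bandwidth figures are approximate spec-sheet numbers; the overflow
# size is an assumption for illustration only.
VRAM_BW = 448e9   # RTX 3070 GDDR6 bandwidth, bytes/s (approx.)
PCIE_BW = 32e9    # PCIe 4.0 x16 theoretical peak, bytes/s (approx.)

overflow = 1 * 2**30  # assume 1 GiB of assets live in system RAM

t_vram = overflow / VRAM_BW * 1000  # ms, if the data had fit in VRAM
t_pcie = overflow / PCIE_BW * 1000  # ms, pulled across the bus instead

print(f"in VRAM:   {t_vram:.1f} ms")  # ~2.4 ms, trivial at 60 fps
print(f"over PCIe: {t_pcie:.1f} ms")  # ~33.6 ms, two whole 60 fps frames
```

Even touching a fraction of that overflow per frame blows through a 16.7 ms frame budget, which is why the failure mode shows up as stutter and poor 1% lows rather than a gentle drop in average FPS.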
 
Stay away from ultra settings, they're pointless anyway, and it should be fine for some time at 1440p. No point playing games at 30 or 50 fps when you can just lower the settings a bit and have a better experience. Nothing new to gaming anyway.
I'm not happy with only 8GB, having a very similar card, but it's not the end of the world either.

This is a matter of personal preference. I don't think ultra settings are pointless, as in some games they add a decent amount of ambience.

My wife enjoys the simplicity that medium settings bring to a lot of newer games. I, on the other hand, like to AT LEAST keep shadowing, lighting, and texture options as high as I can take them. I'd rather play games at 60fps as close to maxed out as I can at 1440p than play at 90fps at medium. One thing I won't turn on, though, is RT, because the hit you take from that is not worth it to me. I can totally understand that if you're into esports, framerate is king. For me, it's not. I like a good mixture of framerate and visuals.

I've often wondered if my age and experiences with early PC gaming impact my opinion on this. When I got into PC gaming, the 12MB Voodoo2 was king, and every generation offered a massive increase in visual fidelity. I sometimes think I'm still in that mindset.

EDIT: What I took away from the video, and it seems to me the conclusion he came to, whether he admitted it or not, was that 8GB is going to struggle in future games, so it may be worth looking at the AMD offerings, as at least you won't run into what the 3070 ran into in that video.

Also, with new technology coming out, like your CPU being able to access your VRAM directly, it seems like this VRAM discussion will only become more important.
 
Hilarious.
So now if you're not a PC expert, and you just listen to the marketing showing you this amazing Raytracing feature (only on Nvidia cards) that you just HAVE to buy for the low low price of $600, you get a fat joke of a card that'll croak under lack of VRAM in less than 2 years, and it's your fault for being none the wiser. And the only way to still use it is to turn off that RT that you especially picked the card for.

Your BS has gone from "I don't trust the scientists because they're paid" to "If you were an expert in the subject, you'd know".

The poor idiots who bought into the raytracing meme aren't experts. And they don't have to be. That's why we have reviewers. As reviewers, HWUB did warn against this probable problem, and got more than a little bit of a reaction for pointing out the future VRAM problems. As usual with the Nvidia drones, that earned them a pleasant stock of shit cake delivered all over the comments and social media. I'm sure that's going to be another non-argument of "bias" in your eyes, but pointing out a problem and getting shot at is actually not pleasant, and coming back later with a big video that acts as a big "I told you so" is actually very justified. And it gives them more credit when they're raising the same flag for the 12GB 4070s.

Edit: and also because I'm tired of hearing the same non-argument, no, lack of VRAM isn't a hard limit on the chip's power. It's being stingy. It's being cheap. VRAM isn't THAT expensive. AMD puts perfectly acceptable GDDR6 in cards that are much cheaper than Nvidia's overpriced crap. It's not a price problem, it's Nvidia being cheap while demanding ultra premium prices. For a 3070 Ti particularly, you ask for $600 and then deliver something that's actually powerful, but coupled with so little VRAM that it'll end up gimped? It's like buying a Mercedes with all the interior done in the cheapest plastic to save 200 bucks. It's like taking your girl out for dinner and you tell her to walk 1km for the big restaurant so that you can save on gas. It's so cheap that it's insulting.

I seriously feel bad for this poor schmuck. He'll learn about the issues when it's too late and ofc all the drones are in the comments going "VrAM IsSSueS LOLLLLL".
I don't see a point in continuing the discussion if you can't even keep things civil and have to resort to name-calling.

Texture swapping was an issue with or without RT.

100%, if the 3070 had more VRAM, RT performance would be a lot more respectable. No idea where you are getting the notion that more VRAM wouldn't help, when we know for a fact that the lack of VRAM is what's tanking performance, not the GPU core itself.

Reviewers and Nvidia sold the 3070 as a capable RT part. A lot of people bought the 3070 over the 6700 XT / 6800 because of the RT. It essentially losing the ability to do so reasonably well in such a short period of time would understandably be a problem for those individuals and the reviews that recommended the card for RT. People expect a certain number of years out of their video cards, especially as prices have increased.
Texture swapping without ray tracing was an issue for basically only three of the games tested.

I agree that with more VRAM the RTX 3070 would have been much more competitive. My point was that Steve contradicted himself a bit. At the end of the video he says "this isn't to say that all 8GB graphics cards are now useless and/or obsolete, or that all the graphics cards released in the last two years should've had more than 8GB of VRAM", however, earlier he says just that: that the 3070 SHOULD have had more VRAM. Also, at the end he outright states that 8GB is no longer sufficient for high-end gaming, which perhaps could be construed as a contradiction of the "8GB cards are not useless/obsolete" statement.
He also claims definitively that there will be many more games by the year's end that will break 8GB cards. Now, he may indeed be right but he also may not be. Stating it as if it is guaranteed to happen is, in my book, at least a bit alarmist.

Also, perhaps he should've used the RTX 3070 Ti, as it is a closer competitor to the RX 6800. Granted, in games where 8GB of VRAM is insufficient, the performance won't be much different from the RTX 3070, so the latter can still "prove his point". Still, it would have been a slightly fairer comparison overall, in my opinion.
 